NOVALIDATE Constraints – No really … July 28, 2008

There have been a number of posts recently on the OTN database forum regarding the whole topic of NOVALIDATE constraints and the associated indexes, so I thought it might be worth going over a couple of interesting little quirks with all this.

A NOVALIDATE constraint is basically a constraint that can be enabled without Oracle checking the existing data to determine whether any rows currently violate the constraint.

This is useful if we know there's data that violates the constraint but we want to quickly put on a constraint to prevent further violations, with the intention of cleaning up any existing violations at some future point in time.

It’s also potentially useful if we know the data is clean and so want to prevent the potentially significant overheads of Oracle having to check all the data to ensure there are indeed no violations.
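To make this concrete, here's a minimal setup sketch. The original demo table isn't reproduced here, so the column definitions and sample data are my own assumptions, inferred from the INSERT statement that follows:

SQL> CREATE TABLE ziggy (id NUMBER, name VARCHAR2(30));

Table created.

SQL> INSERT INTO ziggy SELECT rownum, 'BOWIE' FROM dual CONNECT BY level <= 1000;

1000 rows created.

SQL> COMMIT;

Commit complete.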

Note that the ID column is populated with unique values. However, let’s now introduce a duplicate value, 42:

SQL> INSERT INTO ziggy VALUES (42, 'DUPLICATE');

1 row created.

SQL> COMMIT;

Commit complete.

OK, we now want to add a Primary Key to this table but because we suspect there might be some duplicate values which we intend to clean up at some future point in time, we want to create the constraint with NOVALIDATE:
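A sketch of what that attempt looks like (the constraint name is my own; in my tests this statement fails with an "ORA-02437: cannot validate ... primary key violated" style error, though the exact message may vary by release):

SQL> ALTER TABLE ziggy ADD CONSTRAINT ziggy_pk PRIMARY KEY(id) ENABLE NOVALIDATE;

The statement errors out despite the NOVALIDATE clause.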

We clearly stated we want to create a NOVALIDATE constraint, but Oracle appears to be ignoring this, validating the constraint regardless and generating an error because of the duplicate entry.

Why ?

Because by default Oracle will attempt to create a Unique index when creating a PK constraint. A Unique index MUST always contain unique values and so complains when it stumbles across our duplicate 42 ID value. The constraint is being effectively validated because the unique index will only be created providing there are indeed no duplicate values.

Not how I would have designed things but there you go …

However, if we either have an existing Non-Unique index which Oracle can use or we explicitly create a Non-Unique index, then we can proceed with creating the NOVALIDATE constraint as required:
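A sketch of the Non-Unique index approach (the index and constraint names are my own assumptions):

SQL> CREATE INDEX ziggy_id_i ON ziggy(id);

Index created.

SQL> ALTER TABLE ziggy ADD CONSTRAINT ziggy_pk PRIMARY KEY(id) USING INDEX ziggy_id_i ENABLE NOVALIDATE;

Table altered.

The constraint is now enabled and policed by the Non-Unique index, with the duplicate 42 rows still sitting in the table.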

Let's now do something that, based on our understanding, might appear a little odd: let's try to recreate the constraint in a NOVALIDATE state but with a Unique index. This of course should now work, as there are indeed no duplicates within the data:
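A sketch of that recreation (assuming we first delete the duplicate row introduced earlier and drop any previous constraint and index; all object names are my own assumptions):

SQL> DELETE FROM ziggy WHERE id = 42 AND name = 'DUPLICATE';

1 row deleted.

SQL> COMMIT;

Commit complete.

SQL> ALTER TABLE ziggy ADD CONSTRAINT ziggy_pk PRIMARY KEY(id) USING INDEX (CREATE UNIQUE INDEX ziggy_pk_i ON ziggy(id)) ENABLE NOVALIDATE;

Table altered.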

As expected, we have a constraint that’s policed by a Unique index that has not been validated.
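One way to confirm this is to check the VALIDATED column of USER_CONSTRAINTS (output sketched; column widths will differ):

SQL> SELECT constraint_name, validated FROM user_constraints WHERE table_name = 'ZIGGY';

CONSTRAINT_NAME      VALIDATED
-------------------- -------------
ZIGGY_PK             NOT VALIDATED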

This might appear a little odd, because the question you might well now ask is: why bother not validating a constraint that has effectively been validated anyway? The use of the Unique index guarantees there cannot possibly be any duplicate values, else the creation of the Unique index would have failed.

We effectively have a validated constraint which Oracle is still classifying as being not validated 🙂

Then again, maybe not …

More later.


I guess the following line needs to be corrected:
“Because by default Oracle will attempt to create a Non-Unique index when creating a PK constraint. A Unique index MUST always contain unique values and so complains when it stumbles across our duplicate 42 ID value.”

Shouldn’t this be “Because by default Oracle will attempt to create a Unique index when creating a PK constraint. “

Thanks Richard Foote. Oracle has lots of such minuscule problems, and this is one of them. One I came across in the last couple of weeks was the UNLIMITED TABLESPACE privilege available to any user who is granted the RESOURCE role, even though dba_sys_privs does not mention this privilege when GRANTEE = 'RESOURCE'. The worse part is that it's difficult to find this anywhere in Oracle's official documentation.

I am using Oracle Forms. I applied a NOT NULL check constraint with NOVALIDATE, but whenever new data gets updated it checks the old data and throws a not null error. Is there any solution for this?
Kindly guide.

Thanks Richard, this is very useful. I have one question though: can you please help me understand the use of declared constraints that are not enforced, e.g. DISABLE NOVALIDATE primary key or foreign key constraints?

Is it in any way used by the Oracle optimizer? There is no information readily available on the internet about this.

Constraints add overheads in terms of having to be checked, and also obviously prevent data from being loaded that violates such constraints.

The use case is that you want to temporarily disable a constraint and not have the overhead of checking that new data meets the disabled business rule. So for example, you may have a bunch of external data you want to load into a table and either you know there might be violations that you wish to initially ignore and fix later, or you know the data is perfectly valid and you want to speed up the load by not having the database check the validity of the new data.
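As a sketch of that workflow (the table and constraint names here are hypothetical):

SQL> ALTER TABLE sales DISABLE CONSTRAINT sales_amount_ck;

Table altered.

-- perform the bulk load here, with no constraint-checking overhead

SQL> ALTER TABLE sales ENABLE NOVALIDATE CONSTRAINT sales_amount_ck;

Table altered.

The ENABLE NOVALIDATE at the end reinstates the rule for new data without re-checking what was just loaded.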

But it's a temporary disablement, with the constraint to be enabled again later, which is why you haven't dropped it.