SQL Server DDL Triggers to Track All Database Changes

Problem

In a perfect world, only the DBA would have sa privileges, F5 would only ever be hit on purpose, every change would go through rigorous source control procedures, and we would have full backups of all databases every minute. Of course, in reality, we deal with much different circumstances, and we can find ourselves (or overhear someone else) saying, "Oops... how do I fix that?" One of the more common scenarios I've seen involves someone editing a stored procedure multiple times between backups or within some kind of cycle, and then wishing they had version (current - 1) available. It's not in the backup yet, so it can't be restored; and the user, of course, has closed his or her window without saving.

Solution

There are a lot of solutions to this issue, of course. They include tightening down server access, adopting a reliable source control system, and implementing a rigorous and well-documented deployment process. These things do not happen overnight, so in the meantime, DDL Triggers can provide a short-term fix that is both easy to implement and simple to manage. The approach is to take a snapshot of the current objects in the database, and then log all DDL changes from that point forward. With a well-managed log, you could easily see the state of an object at any point in time (assuming, of course, the objects are not encrypted).

So where do we start? First, I like to keep housekeeping items (monitoring, administration etc.) in their own database. This allows me to query things centrally and also to control growth separately. For this task, let's use a database called AuditDB:

CREATE DATABASE AuditDB;
GO

To keep things relatively simple, let's assume we are only interested in actions taken on stored procedures - create, alter, drop. We have a set of stored procedures already, and they are in a given state. We will need to capture that state, in addition to any changes that are made to them from that point forward. This way, we will always be able to get back to any state, including the original state.

In addition to the data specific to the actions taken on stored procedures, we can also think of several other pieces of information we would want to store about each event. For example:

database name

schema / object name

login information

host name / IP address (useful with SQL auth)

So here is the definition for a table to capture these events and the surrounding information about them:
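The original table script is not reproduced here, so the following is a minimal sketch consistent with the columns discussed above. The table name dbo.DDLEvents matches the references later in the article, but the individual column names and sizes are assumptions:

```sql
-- Sketch of the audit table (column names/sizes are assumptions).
USE AuditDB;
GO

CREATE TABLE dbo.DDLEvents
(
    EventDate    DATETIME      NOT NULL DEFAULT CURRENT_TIMESTAMP,
    EventType    NVARCHAR(100) NULL,  -- e.g. ALTER_PROCEDURE
    EventDDL     NVARCHAR(MAX) NULL,  -- the DDL statement text
    EventXML     XML           NULL,  -- raw EVENTDATA() payload
    DatabaseName NVARCHAR(255) NULL,
    SchemaName   NVARCHAR(255) NULL,
    ObjectName   NVARCHAR(255) NULL,
    HostName     VARCHAR(64)   NULL,
    IPAddress    VARCHAR(48)   NULL,
    ProgramName  NVARCHAR(255) NULL,
    LoginName    NVARCHAR(255) NULL
);
```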

Yes, we could keep the table skinnier and use [object_id] instead of schema/object name, also protecting us from resolution problems due to renames. However, often stored procedures are dropped and re-created, in which case the system will generate a new [object_id]. I also prefer to use the database name to make ad hoc queries (and script automation) against specific databases easier. You can choose which metadata to rely on; personally, I'll trade the space for readability and scriptability.

Now that the table exists, we can easily grab a snapshot of our existing stored procedure definitions, leaving out some of the irrelevant auditing data, as follows (replacing 'my name' with whatever you want to display for the initial rows):
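The snapshot statement itself was not shown; a sketch along these lines (assuming the dbo.DDLEvents columns described above) seeds the table with every current procedure definition:

```sql
-- Seed the audit table with the current definition of every procedure.
-- Replace 'my name' with whatever you want to display for the initial rows.
USE YourDatabase;
GO

INSERT AuditDB.dbo.DDLEvents
    (EventType, EventDDL, DatabaseName, SchemaName, ObjectName, LoginName)
SELECT
    N'Initial control',
    OBJECT_DEFINITION(p.[object_id]),   -- NULL for encrypted procedures
    DB_NAME(),
    OBJECT_SCHEMA_NAME(p.[object_id]),
    p.name,
    N'my name'
FROM sys.procedures AS p;
```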

Now we're ready to start capturing actual changes to these procedures as they happen. You can create a DDL Trigger with the following code, which will record pertinent data to the above table when changes are made to stored procedures:
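The trigger listing was omitted above; here is a sketch of what it would look like, built on EVENTDATA() and the assumed dbo.DDLEvents columns (it already uses TOP (1) for the client address, per an issue raised in the comments below):

```sql
USE YourDatabase;
GO

CREATE TRIGGER DDLTrigger_Sample
ON DATABASE
FOR CREATE_PROCEDURE, ALTER_PROCEDURE, DROP_PROCEDURE
AS
BEGIN
    SET NOCOUNT ON;

    DECLARE @EventData XML, @ip VARCHAR(48);
    SET @EventData = EVENTDATA();

    -- Note: reading this DMV requires VIEW SERVER STATE (see the comments).
    SET @ip = CONVERT(VARCHAR(48),
        (SELECT TOP (1) client_net_address
         FROM sys.dm_exec_connections
         WHERE session_id = @@SPID));

    INSERT AuditDB.dbo.DDLEvents
        (EventType, EventDDL, EventXML,
         DatabaseName, SchemaName, ObjectName,
         HostName, IPAddress, ProgramName, LoginName)
    SELECT
        @EventData.value('(/EVENT_INSTANCE/EventType)[1]', 'NVARCHAR(100)'),
        @EventData.value('(/EVENT_INSTANCE/TSQLCommand/CommandText)[1]', 'NVARCHAR(MAX)'),
        @EventData,
        DB_NAME(),
        @EventData.value('(/EVENT_INSTANCE/SchemaName)[1]', 'NVARCHAR(255)'),
        @EventData.value('(/EVENT_INSTANCE/ObjectName)[1]', 'NVARCHAR(255)'),
        HOST_NAME(),
        @ip,
        PROGRAM_NAME(),
        SUSER_SNAME();
END
GO
```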

Assuming your system is relatively quiet, all you should see is the change above. Now to go one step further, you can examine the differences between the initial object and its most recent state using a query like this:
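The comparison query was not shown; a sketch using ROW_NUMBER() to pair the oldest and newest rows per object might look like this (the aliases Original, OriginalCode, and NewestCode match the names used in the next paragraph):

```sql
;WITH Events AS
(
    SELECT
        DatabaseName, SchemaName, ObjectName, EventDDL, EventDate,
        -- rnAsc = 1 is the earliest row, rnDesc = 1 is the most recent
        rnAsc  = ROW_NUMBER() OVER (PARTITION BY DatabaseName, SchemaName, ObjectName
                                    ORDER BY EventDate ASC),
        rnDesc = ROW_NUMBER() OVER (PARTITION BY DatabaseName, SchemaName, ObjectName
                                    ORDER BY EventDate DESC)
    FROM AuditDB.dbo.DDLEvents
)
SELECT
    Original.DatabaseName,
    Original.SchemaName,
    Original.ObjectName,
    OriginalCode = Original.EventDDL,
    NewestCode   = Newest.EventDDL
FROM Events AS Original
INNER JOIN Events AS Newest
    ON  Original.DatabaseName = Newest.DatabaseName
    AND Original.SchemaName   = Newest.SchemaName
    AND Original.ObjectName   = Newest.ObjectName
WHERE Original.rnAsc = 1
  AND Newest.rnDesc  = 1
  AND Original.EventDate < Newest.EventDate; -- only objects that have changed
```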

If you are tracking down a specific object or an object in a specific schema, you could put additional WHERE clauses against Original.ObjectName or Original.SchemaName. From here, you can take the values for "OriginalCode" and "NewestCode" and put them through your favorite diff tool to see what changes there have been. And you can also change the query slightly to retrieve the latest version of any procedure, and the version that preceded it - I'll leave that as an exercise for the reader.

What the above does not capture are other peripheral changes that can happen to a stored procedure. For example, what about moving a procedure to a different schema? You can change the DDL Trigger above in the following way to capture the ALTER_SCHEMA event:
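The modified trigger was not shown; only the event list needs to change, so the adjustment is a sketch along these lines:

```sql
ALTER TRIGGER DDLTrigger_Sample
ON DATABASE
FOR CREATE_PROCEDURE, ALTER_PROCEDURE, DROP_PROCEDURE,
    ALTER_SCHEMA  -- fires for ALTER SCHEMA ... TRANSFER
AS
    -- body identical to the trigger described earlier
```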

And how about rename? Unfortunately in SQL Server 2005, DDL Triggers were unable to observe calls to sp_rename (or manual renames through Management Studio). In SQL Server 2008 and above, however, a rename can be captured with the aptly-named RENAME event:
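Again the listing was omitted; on SQL Server 2008 and above, the event list can simply be extended (the RENAME event does not exist in 2005):

```sql
ALTER TRIGGER DDLTrigger_Sample
ON DATABASE
FOR CREATE_PROCEDURE, ALTER_PROCEDURE, DROP_PROCEDURE,
    ALTER_SCHEMA, RENAME  -- RENAME requires SQL Server 2008+
AS
    -- body identical to the trigger described earlier
```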

(In a future tip, I'll demonstrate how to restrict these additional auditing rows to specific objects or object types, so that you're not capturing all kinds of irrelevant information about changes to objects other than stored procedures.)

Some other considerations:

You may want to put in a cleanup routine that gets rid of "noise" more than <n> days old (but still keeping the set of objects that are important to you).

To validate that your auditing process is capturing all changes, you can check modify_date in sys.procedures. Of course, this only works for procedures that haven't been dropped; it reflects only procedures that have been created, modified, renamed, or transferred to a different schema.

Security might be an issue, depending on what you want to accomplish. Allow me to elaborate:

DDL Triggers will not be transparent to users - first of all, they can see them in the Object Explorer tree, so it won't be a big secret that they are there and operational. They also appear in execution plans; if users have this option enabled when they create or modify objects in Management Studio, they will see the query plan for statements such as INSERT AuditDB.dbo.DDLEvents.

If you want to hide the definition of the DDL Trigger, you can encrypt it as follows:

USE YourDatabase;
GO
ALTER TRIGGER DDLTrigger_Sample
ON DATABASE
WITH ENCRYPTION
FOR -- ...

This way, when users want to see what the trigger is doing, they will right-click to generate a script, but the following is what will happen:

TITLE: Microsoft SQL Server Management Studio
------------------------------
Script failed for DatabaseDdlTrigger 'DDLTrigger_Sample'.
Property TextHeader is not available for DatabaseDdlTrigger
'[DDLTrigger_Sample]'. This property may not exist for this
object, or may not be retrievable due to insufficient access
rights. The text is encrypted.
...

But users with sufficient privileges can still disable the trigger, as described above. And you can't even capture this event, much less prevent it (which DDL Triggers are sometimes used for). For more information, see these Connect items:

So, assuming SQL Server 2008 or above, you could use an audit specification to capture DDL events as a backup (or instead). But, given all of this, if you have to go to these lengths to prevent people from circumventing your auditing capabilities, then maybe your problems are larger and not all that technical. I suspect that in most reasonable environments, you'll sleep fine at night simply locking down the audit table.

I hope this provides a decent starting point to protect your environment(s) with DDL Triggers. However, given the manual aspect of this approach as well as its limitations, it will likely be best to consider this a short-term plan, and look into more robust source control and recovery techniques in the longer term.

Next Steps

Get a copy of Mladen Prajdic's free SSMS Tools Pack add-in for Management Studio. If you found this article useful and relevant, chances are, the Query Execution History feature alone will save your bacon (or one of your co-workers') someday.

Thanks Aaron for the job suggestion, but the job would have to check periodically from SQL Agent whether any Notified = 0 rows exist. My concern is: is there any way we can make it part of the trigger, or some other way, just to send an email 2 or 3 minutes after the first row is created in the audit table? Appreciate your time and help.

@Krishna Of course, instead of having the trigger send you an e-mail, just add a column to the audit table like Notified bit NOT NULL DEFAULT (0). Then have a background job that checks for new rows where Notified = 0, assemble a single e-mail containing the data from all of those rows, and then mark them as Notified = 1.
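A sketch of that suggestion might look like the following (the Notified column is from the suggestion above; the Database Mail profile name and recipient address are placeholders):

```sql
-- One-time change: add the flag column described above.
ALTER TABLE AuditDB.dbo.DDLEvents
    ADD Notified BIT NOT NULL DEFAULT (0);
GO

-- Inside a SQL Server Agent job step that runs every few minutes:
DECLARE @body NVARCHAR(MAX) = N'';

SELECT @body = @body + CONVERT(NVARCHAR(20), EventDate, 120)
             + N'  ' + ISNULL(EventType, N'') + N'  '
             + ISNULL(ObjectName, N'') + CHAR(13) + CHAR(10)
FROM AuditDB.dbo.DDLEvents
WHERE Notified = 0;

IF @body > N''
BEGIN
    EXEC msdb.dbo.sp_send_dbmail
        @profile_name = N'DefaultProfile',   -- placeholder profile name
        @recipients   = N'dba@example.com',  -- placeholder address
        @subject      = N'DDL changes detected',
        @body         = @body;

    UPDATE AuditDB.dbo.DDLEvents
        SET Notified = 1
        WHERE Notified = 0;
END
```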

Beautiful article. I created a trigger on the audit table to send an email to me, but I am getting a number of emails just for one database creation, because a lot of ALTER DATABASE statements are involved. Is there any way I can get one email including all of those events, rather than sending one email for each ALTER or CREATE?

@Vikash - only if the Windows login name is being exposed to SQL Server (it may not be, because they're using SQL authentication). You can experiment with things like USER_NAME(), SUSER_NAME(), SUSER_SNAME(), CURRENT_USER, and ORIGINAL_LOGIN().

Thanks for this great post. I was wondering, is there any way to keep a record of the Windows login user name along with the IP address using a database-level DDL trigger? Currently this code saves the SQL Server user name, the machine via "HOST_NAME()", and the IP address from where the query was executed, but I want the Windows login name as well.

Hi, I have a question about the trigger event. Here is the case: I have a table with columns that have default values. When I add a new column using Design Mode in SQL Server, it triggers the ALTER TABLE event. I get the script from the EventData, but I found that the script drops the table first and re-creates it, and the re-created columns no longer have their default values. Is it possible for the trigger event to generate the script correctly, so that the table columns' default values won't be lost?

I've been using such a trigger for a while, but I thought it would be nice to add the IP address as you do, so I pasted that part into my trigger definition. Last night some jobs failed. Since the error said 'user does not have permission...', I checked whether guest had INSERT permission on the audit table, and since it did, I tried different ways to grant the necessary rights, but every ALTER statement kept failing.

Now I have found that the error was caused by the sys.dm_exec_connections DMV: a user who executes a DDL statement needs the VIEW SERVER STATE permission. I think that is a very important consequence. Do you advise granting this to everyone, maintaining a list of all users who need it, or do you have another strategy to prevent this trigger from breaking the system?
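Two ways to address the permission issue described above might be sketched as follows (whether granting to public is acceptable is a security trade-off for your environment):

```sql
-- Option 1: grant the permission broadly (simple, but weigh the exposure).
USE master;
GO
GRANT VIEW SERVER STATE TO [public];
GO

-- Option 2 (SQL Server 2008+): avoid the DMV inside the trigger entirely;
-- CONNECTIONPROPERTY reports on the current connection and does not
-- require VIEW SERVER STATE.
SELECT CONVERT(VARCHAR(48), CONNECTIONPROPERTY('client_net_address'));
```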

Are you asking if there is a generic way to write a trigger that will capture all DML activity regardless of table? No, there is nothing native that you could just turn on and call it a day.

You could add a trigger to every table you create, of course (it could even largely be handled automatically by a DDL trigger that responds to CREATE_TABLE events, but you'd have to have something watch for ALTERs as they could break the trigger code that gets generated). Or you could use features like Change Tracking, Change Data Capture, or Audit. You could even set up an Extended Events session or server-side trace to capture all DML statements, and mine that data, but that could be quite invasive.

I have been using this code on many systems (2008 R2) with all types of applications. I've had two issues. I commented above about using SELECT TOP 1 client_net_address, because some apps open multiple connections during installs and the query bombs. The other issue is that BigFix runs its installer under snapshot isolation. You may want to run the following to allow triggered events from such cases to write to the table without error.
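The statement the commenter refers to was not included; assuming the failure is error 3952 (a snapshot-isolation transaction touching a database where it isn't allowed), the fix would be along these lines:

```sql
-- Allow sessions running under snapshot isolation to write to AuditDB.
-- (Assumption: this is the statement the comment above alludes to.)
ALTER DATABASE AuditDB SET ALLOW_SNAPSHOT_ISOLATION ON;
```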

Hi Elief, did you add RENAME as one of the DDL events, as I described toward the end of the article? Are you on 2008 or better? I don't know of a simple answer in 2005, maybe the default trace or a custom server-side trace...

A few versions of this code have been pasted around the net over the years, and I have been using it for a couple of years myself. I did have issues with MS apps (SCOM) and at least one other app failing to upgrade while the trigger is enabled.

It appears that it is better to have the query say SELECT TOP 1 client_net_address, as some app installers do some connection trickery where that query comes back with multiple results.

I am thinking about implementing this on a mirrored server to copy the logins from one server to the other. Has anyone ever attempted anything like this? I am just wondering how I will keep track of changes between the servers better than we currently are doing.

Aaron! I was able to achieve the DDL logging for partitioned tables using the method you suggested. I have two schemas, dbo and Switch. I am capturing the events on the dbo schema and, after verifying the affected objects, executing the same DDL on the Switch schema by simply capturing the DDL and replacing the schema name with Switch.

However, this approach fails when a user issues statements without specifying the schema name. Any suggestions on how I can go about achieving this functionality?

Great post, Aaron. I am currently creating a similar DDL trigger on a database for partitioned tables. I need to check whether DDL is executed against partitioned tables and apply the same DDL to another table in a different schema in the same database.

Your post helped me in understanding how database-level triggers work. Is there a way you can guide me on how to apply the same for partitioned tables? Or, to be more precise, can I implement database triggers for DDL executed against specific tables?

@GB - while tracking changes across your environment using DDL triggers is definitely doable, IMHO what you really want is Event Notifications. It uses Service Broker and can track all the things the triggers can, some trace items, and a bunch of other things as well. I presented on this for the virtual DBA chapter of PASS a couple of months ago; it might be worth a watch/read: http://thebakingdba.blogspot.com/2012/10/master-of-all-i-survey-using-event.html and mms://passmedia.sqlpass.org/share/dba/MasterofAllISurvey_03272013.wmv. Jonathon Kehayias has also done some amazing stuff with EN; I'd really recommend his articles on it, starting with http://www.sqlservercentral.com/articles/Event+Notifications/68831/. Hope this helps someone!

Can anyone tell me how to create a table with a different prefix name and the same structure, upon creation of a table, using DDL triggers? I would also need modifications to the table structure to be reflected on the other table in the same way.

I really want to use a server-level trigger for all this. I've tried and tried, with different permissions, and it just doesn't seem to work. I've read all the articles I can find, and it truly seems that it is just not possible to create a server-level trigger to capture all DDL and security changes.

I have a question -- if, as you have mentioned in this article, I wanted to get the latest version of a stored procedure and the one preceding it, then how should I do it? A little commenting on the statements in the CTE would do the article a world of good. Anyway, great post aside from my issue :->

Another way the DDL trigger may not be transparent to users is if they do not have INSERT rights on the logging table. You can enable the guest user in AuditDB and GRANT INSERT ON DDLEvents TO public to avoid problems.
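The fix described in that comment can be scripted roughly as follows (using the guest user and public role exactly as the comment suggests):

```sql
USE AuditDB;
GO
GRANT CONNECT TO guest;                     -- enable the guest user in AuditDB
GRANT INSERT ON dbo.DDLEvents TO [public];  -- let any login's trigger firing insert
```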

I have had issues with TRANSACTION DOOMED IN TRIGGER errors on DDL changes if inconsistent settings are used for ANSI_PADDING. Using SET ANSI_PADDING ON in all CREATE and ALTER scripts is one solution.

Quick question: has anyone tried to do this from a centralized server? Meaning, I have one server that we use for monitoring/maintenance, and I want the triggers to feed the data into a centralized table. I'm having problems with the XML data type going through linked servers.