One way to offer a better experience to our application users is to anticipate likely mistakes they could make, such as when filling in a web form, and to provide them with a meaningful error message that allows them to continue quickly with their work. One way to do this in SQL Server is to use the sp_addmessage system stored procedure to add custom error messages to the master.sys.messages system catalog view. These messages can then be referenced in our error handling routines, using either RAISERROR or, preferably, TRY...CATCH with THROW.
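For example, assuming a custom message with ID 50001 has already been added via sp_addmessage, referencing it from an error handling routine might look like this (the message ID is illustrative):

```sql
BEGIN TRY
    -- RAISERROR looks up the message text for ID 50001 in sys.messages;
    -- 16 is the severity, 1 is the state
    RAISERROR (50001, 16, 1);
END TRY
BEGIN CATCH
    -- THROW re-raises the original error, preserving its number and message
    THROW;
END CATCH;
```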

To ensure consistent behavior, the administrator needs to "synchronize" the contents of sys.messages across all SQL Server instances. Unfortunately, there's no easy, built-in way to do this. I wrote an article on SQLServerCentral.com that showed one technique, using simple string concatenation to capture custom messages from one instance and then a cursor that called sp_addmessage to add them to sys.messages on another instance. However, it would still be a cumbersome and tedious task for an administrator to make sure that each message is added to all relevant SQL Server instances, and for all languages in which you need to display the messages.

Surely, there must be an easier way. SQL Data Compare is designed specifically to synchronize data between source and target tables. However, we need a workaround to make SQL Data Compare work when comparing system views, which involves creating a proxy view, and some INSTEAD OF triggers on the view, in the master database (a practice of which I generally disapprove).

This trick might be useful for small data sets and periodic synchronization, such as to ensure that high availability or disaster recovery instances that are not clustered have their custom error messages synched, but I wouldn't recommend it as a general-purpose technique for other types of tables and views.

Using SQL Data Compare With a "Proxy" View

SQL Data Compare will compare the source database (or scripts folder) to the target and generate a script to synchronize the data in the two. When you deploy the script to the target, it will INSERT any rows that exist in the source but not the target, DELETE any that are in the target but not the source, and UPDATE any that are in both, but different, to match what is in the source.

It's a perfect tool for synchronizing small volumes of static data, including reference data, seed data, test data, and custom error messages, which a developer or administrator needs to ensure are deployed to various downstream environments. However, the first problem we have is that the tool can't compare system views and tables, at least not directly.

The way around this is to create a "proxy" view of sys.messages, in the master database. You need to do this on both the source and target instances. This is a bit unorthodox, and I don't like creating objects in master. However, it will work, and credit goes to Sam Blackburn, one of my colleagues at Redgate Software, who came up with this as a workaround.
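A minimal sketch of such a proxy view might look like the following (the view name SysMessagesProxy is the one assumed throughout this article; the column list simply exposes the columns of sys.messages we want to compare):

```sql
USE master;
GO
-- Proxy view over sys.messages, so that SQL Data Compare
-- can treat the system catalog view as an ordinary user view
CREATE VIEW dbo.SysMessagesProxy
AS
SELECT message_id,
       language_id,
       severity,
       is_event_logged,
       [text]
FROM   sys.messages;
GO
```

Remember that this view must exist on both the source and target instances before setting up the comparison project.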

To try this out, first, let's add a test message on the source instance.
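For example (the message ID and text here are purely illustrative; user-defined message IDs must be 50001 or higher):

```sql
-- Add a sample user-defined message to sys.messages on the source instance
EXEC sp_addmessage
    @msgnum   = 50001,
    @severity = 16,
    @msgtext  = N'The value supplied for %s is not valid.',
    @lang     = 'us_english';
```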

Open SQL Data Compare. In this case, I'm comparing a SQL Server 2014 and SQL Server 2016 instance. On the Data Sources tab, set the source and target instances, with the Database as master in each case.

Figure 1

On the Options tab, under Mapping Behavior, make sure the Include views checkbox is ticked. Now click over to the Tables & Views tab, where you'll need to specify the Comparison Key for our SysMessagesProxy view. I'll use the message_id and language_id columns. Together, these produce a unique key for the messages.

Figure 2

I've also set a WHERE clause to ensure I'm only trying to compare user-defined (not system) error messages.
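Since user-defined message IDs always start at 50001, a filter along these lines (entered as the WHERE clause in the comparison project) should do the job:

```sql
message_id >= 50001
```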

Figure 3

We can now run the comparison. SQL Data Compare will generate a script to insert the row into the target view, but if you try to deploy it, it will fail, because direct inserts and updates are not allowed on system tables.

We need one more "workaround" step.

Using an INSTEAD OF Trigger on the Target View

We need to create three INSTEAD OF triggers on the SysMessagesProxy view, one each to handle INSERT, DELETE, and UPDATE actions. So, for example, an attempt to INSERT a row into the target view will fire the AddCustomMessage trigger, and using an embedded cursor, it will iterate through each message in the Inserted table. For each row, we first check to see if the message exists in sys.messages. If so, we move to the next row.

If the row does not exist, we get the name of the language, corresponding to the row's language ID, from sys.syslanguages. This is because the ID is stored in the table, but we need the name of the language for sp_addmessage. The trigger then executes sp_addmessage to insert the row into sys.messages and then fetches the next row, and so on.
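The logic described above might be sketched as follows for the INSERT case (the trigger and view names are those assumed in this article; the DELETE and UPDATE triggers follow the same pattern, calling sp_dropmessage and sp_addmessage with @replace as appropriate):

```sql
USE master;
GO
-- INSTEAD OF INSERT trigger: intercepts inserts against the proxy view
-- and routes each new row through sp_addmessage instead
CREATE TRIGGER dbo.AddCustomMessage
ON dbo.SysMessagesProxy
INSTEAD OF INSERT
AS
BEGIN
    SET NOCOUNT ON;

    DECLARE @message_id  INT,
            @language_id SMALLINT,
            @severity    TINYINT,
            @is_logged   BIT,
            @text        NVARCHAR(255),
            @lang_name   SYSNAME,
            @with_log    VARCHAR(5);

    DECLARE msg_cursor CURSOR LOCAL FAST_FORWARD FOR
        SELECT message_id, language_id, severity, is_event_logged, [text]
        FROM   Inserted;

    OPEN msg_cursor;
    FETCH NEXT FROM msg_cursor
        INTO @message_id, @language_id, @severity, @is_logged, @text;

    WHILE @@FETCH_STATUS = 0
    BEGIN
        -- Skip any message that already exists in sys.messages
        IF NOT EXISTS (SELECT 1
                       FROM   sys.messages
                       WHERE  message_id = @message_id
                       AND    language_id = @language_id)
        BEGIN
            -- sp_addmessage needs the language name, not the ID,
            -- so look it up in sys.syslanguages
            SELECT @lang_name = [name]
            FROM   sys.syslanguages
            WHERE  msglangid = @language_id;

            -- sp_addmessage expects 'TRUE'/'FALSE' for @with_log
            SET @with_log = CASE WHEN @is_logged = 1
                                 THEN 'TRUE' ELSE 'FALSE' END;

            EXEC sp_addmessage
                @msgnum   = @message_id,
                @severity = @severity,
                @msgtext  = @text,
                @lang     = @lang_name,
                @with_log = @with_log;
        END;

        FETCH NEXT FROM msg_cursor
            INTO @message_id, @language_id, @severity, @is_logged, @text;
    END;

    CLOSE msg_cursor;
    DEALLOCATE msg_cursor;
END;
GO
```

Note that sp_addmessage requires the us_english version of a message to exist before versions in other languages can be added, so messages should be processed with that ordering in mind.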

Now, we're ready to re-run the comparison project. I've added a few more custom messages to SysMessagesProxy on the source SQL Server 2014 instance, and one to the target SQL Server 2016 instance. Figure 4 shows the results of the comparison, using SQL Data Compare.

Figure 4

Click the Deploy button and choose Deploy using Data Compare. Upon clicking Next, you'll see the deployment script, which you can save for later review. I can see this has one DELETE and four INSERTs.

Figure 5

Click Deploy, and you'll see a confirmation dialog warning that you're about to make a permanent change to the target; once this is deployed, you can't undo the changes. If you haven't captured the previous state of the data, you won't be able to easily go back.

Now, with our triggers in place, the deployment will work. Assuming you had the Recompare after deployment box checked, SQL Data Compare will re-run the comparison, and you will see that the data is now in sync.

Figure 6

Summary

I've presented a trick to get SQL Data Compare to synchronize custom error messages, stored in the sys.messages system catalog view, across SQL Server instances. This can be a handy way of ensuring your DR and HA instances keep their messages in sync. That said, the idea of creating objects in master will not be appealing to most administrators. Also, a trigger running a cursor is very inefficient and could cause resource and performance issues if used for large datasets, so I would not recommend this technique for other types of tables.