In parts one and two of this series, I have been specifically focusing on the latest version of SQL Source Control by Red Gate Software. But I have been doing source-controlled SQL development for years, long before this product was available, and well before Microsoft came out with Database Projects for Visual Studio. “So, how does that work?” you may wonder. Well, let me share some of the details of how we do it where I work…

The key to this approach is that everything is done via Transact-SQL script files: either natively written T-SQL, or generated scripts. My preference is to write all my code by hand, which forces you to become better at your SQL syntax. But if you really prefer to use the Management Studio GUI to make database changes, you can still do that; you then use the Generate Scripts feature of the GUI to produce T-SQL scripts afterwards, and store those in your source control system. You can generate scripts for things like stored procedures and views by right-clicking on the database in the Object Explorer and choosing Tasks, Generate Scripts (see figure 1).

You can also do that for the CREATE scripts for tables, but that does not work when you have a table that is already in production and you need to make just a simple change, such as adding a new column or index. In this case, you can use the GUI to make the table changes, and then, instead of clicking the Save button, click the Generate Change Script button on the toolbar. Then, once you have saved the change script, go ahead and execute it on your development database to actually make the change. I believe it is important to actually execute the script rather than just click the Save button, because this is your first test that your change script works and that you didn't somehow lose a portion of the change.
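To make that concrete, here is a rough sketch of the kind of change script SSMS generates from the Generate Change Script button when you add a column in the table designer (the real generated script includes extra SET options and may rebuild the table); the table and column names here are hypothetical:

```sql
/* Hypothetical example of a generated change script:
   adds a nullable column to dbo.Customer inside a transaction. */
BEGIN TRANSACTION;
GO

ALTER TABLE dbo.Customer
    ADD NewColumn INT NULL;
GO

COMMIT TRANSACTION;
GO
```

After saving this script into your ChangeScripts folder, you would execute it against your development database as the first test that the change actually works.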

As you can imagine, all this generating of scripts can get tedious, and it is tempting to skip entirely. So again, I would encourage you to get in the habit of writing your own Transact-SQL code; then it is just a matter of remembering to save your work, just as you are in the habit of saving changes to a Word or Excel document before you exit the program.

So, now that you have all of these script files, what do you do with them? We organize ours into folders labeled ChangeScripts, Functions, Views, and StoredProcedures, and those folders are loaded into our source control system. ChangeScripts contains all of the table and index changes, and anything else that is essentially a one-time-only execution. Of course, you want to write your scripts with qualifying logic so that if a script were accidentally run more than once in a database, it would neither crash nor corrupt anything; but these scripts are really intended to be run only once per database.
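As an illustration of the qualifying logic I mean, here is a minimal sketch of a one-time change script guarded so that an accidental second run is harmless (the table, column, and index names are hypothetical):

```sql
/* Hypothetical example: a one-time change script with qualifying
   logic, so running it twice neither fails nor corrupts anything. */
IF NOT EXISTS (SELECT 1
               FROM sys.columns
               WHERE object_id = OBJECT_ID(N'dbo.Customer')
                 AND name = N'NewColumn')
BEGIN
    ALTER TABLE dbo.Customer
        ADD NewColumn INT NULL;
END;
GO

IF NOT EXISTS (SELECT 1
               FROM sys.indexes
               WHERE object_id = OBJECT_ID(N'dbo.Customer')
                 AND name = N'IX_Customer_NewColumn')
BEGIN
    CREATE INDEX IX_Customer_NewColumn
        ON dbo.Customer (NewColumn);
END;
GO
```

The IF NOT EXISTS guards mean the ALTER and CREATE statements are simply skipped on a database where the change has already been applied.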

Once you have your initial set of scripts loaded into source control, making changes, such as altering a stored procedure, becomes a simple matter of checking out your CREATE PROCEDURE* script, editing it in SSMS, saving the change, executing the script to effect the change in your database, and then checking the script back in to source control. Of course, this is where the lack of integration for source control systems within SSMS becomes an irritation, because it means that in addition to SSMS, I also have my source control client application running to do the check-out and check-in. And when you have 800+ procedures like we do, it can be quite tedious to locate the procedure I want to change in source control, check it out, then locate the script file in my working folder, open it in SSMS, make the change, save it, and then go back to source control to check it in. Granted, it is not nearly as burdensome as, say, losing your source code and having to rebuild it from memory, or losing the audit trail that good source control systems provide. It is worth the effort, and this is how I have been doing development for the last several years.

Remember that everything SQL Server Management Studio does in modifying your database can also be done in plain Transact-SQL code, and this is what you are storing. And now I have shown you how you can do it all without spending any extra money. You already have source control, or can get free, open-source source control systems (almost seems like an oxymoron, doesn't it?), and of course Management Studio comes with your SQL Server database engine software. So, whether you spend the money on tools to make it easier or not, you now have no excuse for not using source control with your SQL development.

* In our current model, the scripts for stored procedures and similar database objects are written with an IF EXISTS…DROP… at the top, followed by the CREATE PROCEDURE… section, followed in turn by a section that assigns permissions. This allows me to run the same script regardless of whether the procedure previously existed in the database. If the script were only an ALTER PROCEDURE, it would fail the first time that procedure was deployed to a database, unless you wrote other code to stub it in when it did not exist. There are a few different ways you could organize your scripts for deployment, each with its own trade-offs, but I think it is absolutely critical that, whichever way you organize things, you ensure that the same script is run throughout the deployment cycle, and do not allow customizations to creep in between TEST and PROD. If you do, then you have broken the integrity of your deployment process, because what you deployed to PROD was not exactly the same as what was tested in TEST, so you have effectively released untested code into PROD.
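A minimal sketch of that script layout might look like the following; the procedure name, body, and role name are all hypothetical stand-ins:

```sql
/* Sketch of the IF EXISTS...DROP / CREATE / permissions pattern
   described above. Names are hypothetical. */
IF EXISTS (SELECT 1
           FROM sys.objects
           WHERE object_id = OBJECT_ID(N'dbo.GetCustomer')
             AND type = N'P')
    DROP PROCEDURE dbo.GetCustomer;
GO

CREATE PROCEDURE dbo.GetCustomer
    @CustomerID INT
AS
BEGIN
    SET NOCOUNT ON;

    SELECT CustomerID, Name
    FROM dbo.Customer
    WHERE CustomerID = @CustomerID;
END;
GO

/* Re-assign permissions, since DROP removes any existing grants. */
GRANT EXECUTE ON dbo.GetCustomer TO AppUserRole;
GO
```

The GRANT section at the bottom is what makes the drop-and-recreate approach safe: without it, every redeployment would silently strip the permissions that had been granted on the old copy of the procedure.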

Comments on this entry:

>Of course, this is where the lack of integration for source control systems within SSMS becomes an irritation

We use VAULT and successfully check out/check in using SSMS without having to use the VAULT client; we create an SSMS project that includes the stored procs (or whatever), and then check-out/check-in works fine.

Andrew, you are correct that with SSMS Projects you can integrate with your source control system. I experimented with this a little bit, too. But as I mentioned in Part 1, and should have repeated here, Microsoft has deprecated SSMS Projects and is discouraging their use. BOL shows that Projects will be removed in a future version (see http://msdn.microsoft.com/en-us/library/ms173803(v=SQL.105).aspx). So, I didn't want to establish an SOP for my team that we already knew was going to have to change when we upgrade one of these days.

Firstly, thanks for the series. Great read. There's a bit of a discussion in our office over DB scripts and source control. Our DBA doesn't store any scripts in source control; he promotes data and schema changes to test environments and beyond via Red Gate's compare tools, or via scripts attached to custom SharePoint forms that track development. He says there's no real benefit, given the extra overhead of maintaining scripts in TFS, due to the fast rate of (small) change in our development. Databases are backed up nightly, with transaction log backups every 15 minutes, so developer catastrophes in development are reverted via a restore. While I disagree with this approach, I can see his point. What are your thoughts? Cheers.

Dan Bass, I think that what you don't get from a backup system, but will get from a source control system, is the ability to detect conflicts between two versions of code, and merge them if needed. You will also be able to know which changes were made, when, and by whom.

Sure, you can see what changed, and merge changes, using backups and Red Gate's compare tool, but there is overhead in that process.

You can try asking the DBA if he/she can automate SQL Compare to create change scripts from the database into a script folder. This is done via a command-line execution and can be scheduled. It will give you a change script of everything that has changed in the database since the last execution, and it can easily be integrated with a source control system. It will not, though, be able to tell you who is to blame for breaking the code, but you will know it happened.

Dan, sorry for the long lag time in responding to your comment. Rafi has a couple of good points. In addition to those, my experience has shown that it is wise to make changes in databases by way of scripts and to SAVE those scripts somewhere. I like using source control for that, but if you prefer to stick them in a SharePoint Document Library, or just in a file folder somewhere, that's fine. I have found that sometimes you don't know that a change had an undesired ripple effect until you are well past the point where doing a RESTORE would be sensible. It is good in that case to have a copy of the script that actually made the changes to help analyze and troubleshoot. You can't do that if you are always using the live SYNC function of Red Gate's tools.

The other reason we keep all our scripts in source control is that all of my developers work in their own sandboxes, so you need those scripts not just for deployment to QA/UAT/PROD environments, but also to keep the other developers in sync. Yes, this could be done with Red Gate's tools, but then you're synchronizing two potentially dirty development environments rather than updating from a clean repository (clean because only tested and actually desired changes are checked in).