
Currently, I am investigating ways to back up 4 production SQL databases (on 3 different servers/instances) more efficiently, and would like to know if anyone else has had the problems we are having and whether anyone has any recommendations.

Database 1: 80GB

Database 2: 74GB

Database 3: 5GB

Database 4: 30GB

Databases 1 & 2 are still on SQL Server 2005, with no plans to upgrade them right now.

Databases 3 & 4 are for new systems and will grow rapidly over the next several months. Currently mirrored.

The problem we are running into is that the backup maintenance tasks for Databases 1 & 2 take roughly 30 minutes each, and they run at the same time as several other environment backups that the Sys Eng team has scheduled.

After the backup, a scheduled task using 7-Zip runs to compress the backups. All of this is killing the CPU during that time frame, and if for whatever reason a job/task gets locked up, it affects the entire system.

Now, due to the business we are in, we have a very limited timeframe in which we can run these backup jobs, and these backups need to be done. We also need to start encrypting the backups after compression to further our HIPAA compliance... yep...

Another part of the problem we are aware of is that we have 20 other databases doing the same thing at the same time, which is killing the SAN; we believe most of those don't need these backup tasks at all.

So aside from that, which we are already working on, we are trying new methods of backing up the databases to take as much CPU load off the database servers as possible, and hopefully to back the databases up incrementally.

Any suggestions? Tools, methods? We've tested Red Gate's SQL Backup Pro, the SQLBackupAndFTP freeware, and SSIS, but none of them accomplished what we were hoping for (the ability to finish the task quickly and/or run on another server, taking the CPU load off the DB server).

Could you please add details about the current backup strategy? What is the recovery model of each database? What I'm trying to find out is: are backups being taken too frequently, and are the right backups being taken to meet the business requirements?
– Jon Seigel, Jul 11 '12 at 13:36

2 Answers

The first thing I generally recommend doing is introducing differential backups to the backup strategy. If the amount of data churn is moderate or low, this can work extremely well for saving space and time because you only back up changes rather than everything. This does add some complexity to restore scenarios, and you must make sure you know what COPY_ONLY backups are.
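As a rough sketch (database name, paths, and schedule here are placeholders, not your actual setup), a full/differential rotation looks like this:

```sql
-- Weekly full backup: this becomes the differential base
BACKUP DATABASE [Database1]
TO DISK = N'X:\Backups\Database1_full.bak'
WITH INIT;

-- Nightly differential: backs up only the extents changed since the last full
BACKUP DATABASE [Database1]
TO DISK = N'X:\Backups\Database1_diff.bak'
WITH DIFFERENTIAL, INIT;

-- Ad-hoc full backup that does NOT reset the differential base
BACKUP DATABASE [Database1]
TO DISK = N'X:\Backups\Database1_copyonly.bak'
WITH COPY_ONLY;
```

The restore sequence is then the most recent full backup, followed by the most recent differential (plus log backups if the database is in the FULL recovery model), which is the extra complexity mentioned above.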

The second thing is leveraging native backup compression. SQL Server 2008 Enterprise and SQL Server 2008 R2 Standard editions support backup compression out of the box -- no need for 3rd-party tools. If you aren't taking advantage of this, you're really missing out. It is off by default for backwards compatibility. You can turn it on server-wide by default in the Server Properties | Database Settings page, or per-backup using the WITH COMPRESSION option. This will still have a small CPU hit, but the cost will be amortized over the duration of the backup instead of all at once (assuming backups are I/O bound, which they usually are).
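For example (paths are placeholders), compression can be enabled instance-wide or per backup:

```sql
-- Enable compression as the server-wide default
EXEC sp_configure 'backup compression default', 1;
RECONFIGURE;

-- Or request it explicitly for a single backup
BACKUP DATABASE [Database1]
TO DISK = N'X:\Backups\Database1_full.bak'
WITH COMPRESSION, INIT;
```

Either way, the compressed file is typically a fraction of the uncompressed size, and the backup usually finishes faster because less data is written to disk.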

Some other random thoughts that may or may not apply to your environment: stagger the backups to reduce I/O contention; back up to multiple locations simultaneously; back up to LUN(s) dedicated to backups; tweak the backup buffer settings.
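A couple of those ideas in T-SQL form (drive letters and the tuning values are illustrative assumptions; test against your own hardware):

```sql
-- Stripe the backup across files on two different LUNs to spread the I/O,
-- and tweak the backup buffers to trade memory for throughput
BACKUP DATABASE [Database1]
TO DISK = N'X:\Backups\Database1_stripe1.bak',
   DISK = N'Y:\Backups\Database1_stripe2.bak'
WITH BUFFERCOUNT = 50,          -- number of I/O buffers used by the backup
     MAXTRANSFERSIZE = 4194304; -- 4 MB per transfer unit (the maximum)
```

Note that a striped backup needs all of the files present to restore; if you want a second independent copy instead, the MIRROR TO clause (Enterprise Edition) writes an identical backup to a second destination in the same operation.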

For the 2005 backups, what you can consider doing is limiting the amount of CPU used by the compression utility via Windows System Resource Manager (WSRM). Or, since the file has to be written somewhere before that step happens anyway, you could offload that process to another box.

Encrypting backups is not something I'm too familiar with, but it will depend on how you plan to encrypt the data and backups. If you use Transparent Data Encryption in SQL Server, you'll likely want to turn off backup compression, because the randomness of the encrypted data leaves the compressor almost nothing to gain. If you use a 3rd-party encryption strategy, see the previous point about using WSRM or offloading the process to a different box, because encryption will also require CPU.
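If TDE turns out to fit (note it requires Enterprise Edition, SQL Server 2008 or later, so it wouldn't apply to the 2005 instances), the setup is roughly the following sketch; all names, paths, and passwords are placeholders:

```sql
USE master;
CREATE MASTER KEY ENCRYPTION BY PASSWORD = 'placeholder-strong-password';
CREATE CERTIFICATE TDECert WITH SUBJECT = 'TDE certificate';

-- Back up the certificate immediately: without it, encrypted backups
-- cannot be restored on another server
BACKUP CERTIFICATE TDECert
TO FILE = N'X:\Keys\TDECert.cer'
WITH PRIVATE KEY (FILE = N'X:\Keys\TDECert.pvk',
                  ENCRYPTION BY PASSWORD = 'placeholder-strong-password');

USE [Database3];
CREATE DATABASE ENCRYPTION KEY
WITH ALGORITHM = AES_256
ENCRYPTION BY SERVER CERTIFICATE TDECert;

ALTER DATABASE [Database3] SET ENCRYPTION ON;
```

Once encryption is on, all subsequent backups of that database are encrypted automatically, which is why safeguarding the certificate backup is the critical step.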

You could use an ETL tool to dump them into a more generic format; it would add a dependency during restoration, but it would allow you to easily move most of the processing off the database host and onto a different machine. I personally use Pentaho's PDI for exactly this.