Case Study: Secure Log Shipping via SSL FTP

Today I’m putting up sort of an oddball solution I built a couple of months ago. We needed to provide a reporting copy of some production databases so analysts could do ad-hoc reporting. The trick was that we had to move the databases from a less secure location into a more secure one, with an untrusted domain boundary and a firewall in between. Log shipping to Standby Mode databases fit the bill from a business perspective, but doing it securely through the firewall was a bit of a stunt.

It’s worth noting that log shipping to standby isn’t always a good solution to this problem – because the users of the reporting database have to be “kicked off” each time a log backup is restored, you can’t really get close to real-time reporting with this design. In our case, however, day-old data is more than sufficient, so we just stage the log backup files during the day, and then restore them all at night to bring the standby databases up to some point in time a few hours behind production.

For this solution we settled on secure file transfer using FTP over SSL (properly called FTPS – not to be confused with SFTP, which runs over SSH). This makes an SSL connection to an FTP server, and then uses that secure channel to transmit the files with ordinary FTP commands. Other methods, like Secure Copy (SCP), could probably also be made to work. The trick was to track down the details of how to do an FTPS upload with PowerShell, and then to devise a way for the upload to push only new log backup files rather than repeatedly uploading the same ones. We also use “vanilla” log shipping on the system in question for disaster recovery, so we needed to tap into that existing set of files without disrupting it.

Overall, the system looked like this:

Our source SQL Server system already has an out-of-the-box log shipping setup, so there is an existing file share that contains a “sliding window” of log backup files using Microsoft’s standard conventions.

We have a Linux FTP server on the other side of a firewall from this source system. Windows file sharing, however, is blocked, so moving the log backup files with the usual method is not possible.

Behind that FTP server we have some automation that can move files across the network to the destination SQL server we’ll use for our reporting function.

Prerequisites

Before our SSL setup would work, we needed to have a certificate arrangement in place where the source servers would trust the FTP server. We have internal certificates for this type of function, so the first step was to export a root certificate for the FTP server and add that to the source servers’ cert stores. This is a requirement because, as will become clear below, we are using the stock .NET framework FTP client classes, and there isn’t a safe way to force those to trust a server without a valid cert setup.
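In practice this came down to a one-time import of the issuing root certificate into each source server’s Local Machine trusted root store, which the stock certutil tool can do. A minimal sketch – the certificate file name and path are placeholders for your own export:

```powershell
# Import the exported root certificate into the Local Machine
# "Trusted Root Certification Authorities" store (run elevated).
# "C:\certs\ftp-root-ca.cer" is a placeholder for the exported file.
certutil -addstore -f Root "C:\certs\ftp-root-ca.cer"
```

Once the root is trusted, the .NET FTP classes will validate the FTP server’s certificate automatically during the SSL handshake.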

Second, being a PowerShell rank amateur, I had to do some research into how exactly one would use it to invoke the FTP client classes. I found a few articles that were very helpful on this topic:

Provided there’s an appropriate cert/trust setup between the machines, automating FTPS doesn’t require any software purchase, because the .NET Framework has quite suitable classes built right in. Accessing those from PowerShell is pretty easy – the last article shows how to port some of the C# examples from the first two over to PowerShell.

Sending a File

Getting one file across is a good starting point. The basic process is encapsulated in the following function. Assuming a few variables are available to it – the FTP site address, username, password, and so on – the function opens a file and transmits it to a destination on the FTP site. Enabling SSL support is as simple as setting the .EnableSsl property on the FTP request object:
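My function looked roughly like the following sketch. The variable and function names here are illustrative, and error handling is trimmed for clarity:

```powershell
# Upload one local file to an FTP site over an SSL-protected connection.
# Assumes $ftpSite (e.g. "ftp://ftpserver.example.com"), $ftpUser, and
# $ftpPassword have been set by the caller; these names are placeholders.
function Send-FtpFile ($localPath, $remoteFileName)
{
    $uri = "$ftpSite/$remoteFileName"
    $request = [System.Net.FtpWebRequest]::Create($uri)
    $request.Method = [System.Net.WebRequestMethods+Ftp]::UploadFile
    $request.Credentials = New-Object System.Net.NetworkCredential($ftpUser, $ftpPassword)

    # The key line: require SSL on the control and data channels
    $request.EnableSsl = $true
    $request.UseBinary = $true

    # Read the file and push its bytes up the request stream
    $bytes = [System.IO.File]::ReadAllBytes($localPath)
    $request.ContentLength = $bytes.Length

    $stream = $request.GetRequestStream()
    $stream.Write($bytes, 0, $bytes.Length)
    $stream.Close()

    $response = $request.GetResponse()
    $response.Close()
}
```

If the server’s certificate isn’t trusted (see the prerequisites above), GetRequestStream() will throw a trust-failure exception here, which is exactly the behavior we want – there’s no silent fallback to an insecure channel.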

Storing Credentials

Because this is a fully automated process, and Windows Auth isn’t an option across the boundary, I also needed to store the FTP site user credentials in some secure fashion. PowerShell has a pretty neat pair of cmdlets that can take a string and store it in a file, encrypted with the current Windows user’s key. Those are, essentially:
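The pair in question is ConvertFrom-SecureString and ConvertTo-SecureString. Storing and later retrieving a password looks roughly like this (the file path and user name are placeholders):

```powershell
# One time, interactively: prompt for the FTP password and store it,
# encrypted under the current Windows user's key (DPAPI):
Read-Host "FTP password" -AsSecureString |
    ConvertFrom-SecureString |
    Set-Content "C:\Scripts\ftpPassword.txt"

# Later, in the unattended script, read it back. Only the same Windows
# account that encrypted the value can decrypt it:
$securePassword = Get-Content "C:\Scripts\ftpPassword.txt" | ConvertTo-SecureString

# The .NET FTP classes want a plain credential, so unwrap it:
$credential = New-Object System.Management.Automation.PSCredential("ftpUser", $securePassword)
$ftpPassword = $credential.GetNetworkCredential().Password
```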

Remember, though, that only the user who encrypts the value can decrypt it, because this method encrypts with the current Windows user’s key.

Now, armed with a method to upload files, and to keep the required credentials, the next problem was how to automate the process, making sure we didn’t repeatedly upload the same log backup files. The sending server would have no knowledge of the list of files on the destination server, so essentially we have to track which files were successfully transmitted in each pass of the upload script, and then send only the net-new files from a folder.

Tracking Sent Files

From the point of view of our source server, these log backup files go into a “black hole” – there’s no visibility to the destination server to see which files were sent already. So we wrapped the send function in some added code that uses a small text file to keep track of each file that was successfully transmitted. The basic idea is this, in pseudocode:

1. Set parameters such as file locations, FTP server URL, and credentials.

2. If a text file listing the log backup files that have already been sent exists, read it and load the file names into a hash table (this prevents uploading the same file twice).

3. Create an empty list in memory for new, pending files to send.

4. For each .TRN file currently present in the transaction log backup folder, matching some naming pattern:

   1. If the file is in the hash table from step 2, note that it was already sent.

   2. If the file is not in the hash table, add it to the list of pending files to send.

5. Replace the list from step 2, on disk, with the new list from step 4.1. This becomes the new sent-file list. (Performing this exchange automatically discards the names of files that have aged out of the backup folder, so there’s no need for a separate cleanup process.)

6. For each file on the list of pending files to send:

   1. Upload the file over FTPS (using the function above).

   2. After a successful upload, add the file’s name to the new sent-file list.
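Translated into PowerShell, the core of that bookkeeping looks something like this sketch – the paths are placeholders, and Send-FtpFile stands in for the upload function described earlier:

```powershell
$sentListPath = "C:\Scripts\sentFiles.txt"    # placeholder path
$backupFolder = "\\sourceserver\logship"      # placeholder path

# Read the previously-sent file names into a hash table, if the list exists.
$alreadySent = @{}
if (Test-Path $sentListPath) {
    Get-Content $sentListPath | ForEach-Object { $alreadySent[$_] = $true }
}

# Walk the backup folder, splitting files into "already sent" and "pending".
$newSentList = @()
$pending = @()
foreach ($file in Get-ChildItem $backupFolder -Filter "*.trn") {
    if ($alreadySent.ContainsKey($file.Name)) {
        $newSentList += $file.Name        # still on disk, already sent
    } else {
        $pending += $file.Name            # net-new: needs upload
    }
}

# Rewriting the list from only the files still on disk silently ages
# old names out, so no separate cleanup pass is needed.
$newSentList | Set-Content $sentListPath

# Upload each pending file, recording it only after a successful send.
foreach ($name in $pending) {
    Send-FtpFile (Join-Path $backupFolder $name) $name
    Add-Content $sentListPath $name
}
```

Because a name is appended only after its upload returns without error, a failed run simply leaves that file on the pending side for the next pass.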

The full text of the script, including these details, is at the end of this post. The script runs as a scheduled task under a service account. (Worth noting: when encrypting the FTP password for this process, you have to use Run As to start a PowerShell session as the service account, and encrypt the password under that account’s key, so that the scheduled task can decrypt it later.)

Half a Log-Shipping Config

So, with this process in place, we have files moving over the network and arriving at the destination server. The next piece is enabling automated restores on the reporting server. It’s not possible to use the log shipping GUI to set this up, because there’s no access to the source server. You can, however, call the log shipping stored procedures directly to set up just the receiving half of a stock log shipping config, like:
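A sketch of what that looks like, using the documented log shipping procedures on the secondary. All server, database, job, and path names here are placeholders for your own environment:

```sql
-- Run on the reporting (secondary) server; names and paths are placeholders.
DECLARE @copy_job_id uniqueidentifier,
        @restore_job_id uniqueidentifier,
        @secondary_id uniqueidentifier;

-- Register the primary and create the (disabled) copy and restore jobs.
EXEC msdb.dbo.sp_add_log_shipping_secondary_primary
    @primary_server = N'SourceServer',
    @primary_database = N'ProdDB',
    @backup_source_directory = N'\\reportserver\logship',
    @backup_destination_directory = N'\\reportserver\logship',
    @copy_job_name = N'LSCopy_ProdDB',
    @restore_job_name = N'LSRestore_ProdDB',
    @copy_job_id = @copy_job_id OUTPUT,
    @restore_job_id = @restore_job_id OUTPUT,
    @secondary_id = @secondary_id OUTPUT;

-- Attach the standby database to that config.
EXEC msdb.dbo.sp_add_log_shipping_secondary_database
    @secondary_database = N'ProdDB_Reporting',
    @primary_server = N'SourceServer',
    @primary_database = N'ProdDB',
    @restore_delay = 0,
    @restore_mode = 1,        -- 1 = STANDBY (readable between restores)
    @disconnect_users = 1;    -- kick off readers so restores can run
```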

A script like this creates only the destination half of the log shipping config, and makes disabled, unscheduled jobs for the restore and copy steps. From there, I enabled the restore job and scheduled it to run only at night (to catch the reporting databases up to production during off-hours). The FTP upload, together with the automation that moves files from the FTP server to the destination SQL Server, makes the copy job irrelevant, so I just left it disabled.

Last Details: Use the .WRK Extension, Agent Proxy issue

In testing this, I did find one bug in my crazy scheme: the destination server would try to open a .TRN file before that file had finished copying into the destination folder. To solve that, I mimicked what SQL Server’s own copy job does natively: we adjusted the automation between the FTP server and the destination server to deliver each file under a .WRK extension, then rename it back once delivery is complete, so that SQL Server only ever opens complete files.
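The rename dance itself is trivial – the point is just that the file never carries its .trn name while bytes are still in flight. In PowerShell terms (paths and variables illustrative):

```powershell
# Deliver under a temporary .wrk name, then rename once the copy is
# complete, so the restore job never opens a half-written .trn file.
$destFolder = "\\reportserver\logship"            # placeholder path
$workName   = Join-Path $destFolder ($name + ".wrk")

Copy-Item (Join-Path $stagingFolder $name) $workName
Rename-Item $workName $name
```

The rename is effectively atomic within the destination folder, so the restore job sees either no file or a complete one – never a partial copy.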

Finally, while I had originally intended to schedule the upload script with SQL Agent, there is apparently some conflict between SQL Agent proxy accounts and the PowerShell secure string methods. I had to fall back on the (less desirable) Windows Task Scheduler to get the password decryption to work as designed.

Final Script

Disclaimer: This is a crazy idea, and I am bad at PowerShell. If it helps you out, that’s great – but please read and understand the script, and test carefully before attempting to deploy it in your environment. Also, sadly, this is a PowerShell 1.0 script, and therefore lacks Try/Catch in favor of Trap. If you have PowerShell 2 as an option, it’s probably advisable to fix my questionable error handling: