How I Manage Lightroom’s Catalog Backups

May 15, 2013

I don’t usually do the whole computer tips thing; maybe I should do more of it. The impetus for this post was a friend of mine who had close to 100GB of Lightroom catalog backups—and that was less than a year of weekly backups—sitting on his computer, taking up precious space on yet another drive stretched to its limits.

Lightroom’s catalog backups are merely a copy of the currently open .lrcat catalog file. If your catalog is big, the backup will be as well. For example, my 30,000-image catalog takes up almost 700MB of disk space. Every week or so it gets backed up by Lightroom, so after 4 weeks there’s a total of 2.8GB of backed-up catalog files in addition to the 700MB active catalog sitting on my drive. Perhaps not significant for me, but if you have a 2GB catalog and you don’t keep on top of manually removing the backups, you can very quickly end up with a whole lot of space taken up by them.

Compression

I approach the size issue with the catalog and its backups in a couple of ways. To start with, I use Windows’ built-in NTFS compression to compress my catalogs and their backups. On average, I see about a 50% reduction in Lightroom catalog file sizes from NTFS compression.

I would point out that while it might seem similar to “zip” or “compressed” folders, NTFS (file system level) compression is not the same thing. While you can zip the backup catalogs manually, you cannot compress the main catalog that way and still be able to use it.

As I understand it, Mac OS also offers file system level compression for HFS+ formatted volumes. Unfortunately, from a cursory Google search there doesn’t appear to be an easy way to enable or disable it without dropping into the command line. Moreover, since I don’t have a Mac, I can’t walk through enabling it. Further, Apple recommends against using HFS+ compression for compatibility reasons. In short, you might be able to do something with file system level compression on a Mac, or not—I simply don’t know.

NTFS compression can be enabled through the Advanced button in the Folder Properties (right-click -> Properties) dialog.
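If you prefer the command line, the same thing can be done with Windows’ built-in compact.exe tool. The folder path below is just an example—point it at wherever your catalog actually lives:

```powershell
# Enable NTFS compression on the catalog folder and everything under it
# /c = compress, /s = recurse into subdirectories
compact /c /s:"C:\Users\Me\Pictures\Lightroom"

# Run it again without /c to report the compression status and on-disk ratio
compact /s:"C:\Users\Me\Pictures\Lightroom"
```

Note that compact works on the files as they exist now; files Lightroom creates later in that folder will also inherit the compression attribute from the folder.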

Getting Fancy with PowerShell and 7-Zip

NTFS compression is good, but not that good. My 696MB catalog compresses with NTFS compression to 296MB, but if I use 7-Zip to compress it into a 7z archive file, it shrinks down to a mere 24.6MB.
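For reference, that kind of archive can be produced from the command line with 7-Zip’s 7z.exe. The install path and file names here are examples, assuming a default 7-Zip install:

```powershell
# Compress a backup catalog into a .7z archive at maximum compression (-mx=9)
& "C:\Program Files\7-Zip\7z.exe" a -t7z -mx=9 "Backup.lrcat.7z" "Backup.lrcat"

# Test the archive's integrity before deleting the original
& "C:\Program Files\7-Zip\7z.exe" t "Backup.lrcat.7z"
```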

Just compressing the backups is good enough to save tons of space; throwing away old copies takes that even further. Unfortunately, carrying this out is a little more complicated than just turning on NTFS compression.

Normally, for scheduled-task kind of work I turn to bash on a Linux box; that’s also what you Mac users would want to use.

Initially I had envisioned a rather simple script that duplicated the command find "$lr_bak_path" -type d -mtime +30 -print -delete. In short, delete every backup directory that’s older than 30 days.

That was the initial thought; after playing around in PowerShell for a while, I came up with the script published later in the article. Instead of deleting based on age, it deletes based on the number of backups I want to keep. That is, it’ll keep the last 4 backups even if they’re 2 or 3 months old. It also compresses the backed-up catalogs into 7-Zip archives, so I get massive storage savings while retaining backups of my catalog.

One hurdle to running PowerShell scripts is that Microsoft configures PowerShell by default not to execute scripts at all. To fix this, you need to run Set-ExecutionPolicy from an administrator PowerShell prompt, as shown below.

PS C:\windows\system32> set-executionpolicy RemoteSigned

There are several policies you can pick from, but RemoteSigned is the most secure option that doesn’t also require you to sign your own scripts.

I also elected to use 7-Zip to compress my backups. The script checks that 7-Zip is installed; if you don’t have it, you can download it from 7-zip.org. Two key points: you’ll need to edit the $lrbackup_path line to point to where your Lightroom catalog backups actually are, and you may want to change the $backups_to_keep line to however many old backups you want to keep.

I automated the whole shooting match by running the script as a scheduled task through the Windows Task Scheduler. Getting the Task Scheduler to work is a bit tricky: you have to invoke PowerShell to run the script, not just the script itself. In the Action dialog, the program/script will be %SystemRoot%\system32\WindowsPowerShell\v1.0\powershell.exe. Then, to run the script, I found that setting the arguments to -NoProfile -File "%path_to_script" -ExecutionPolicy RemoteSigned worked for me.
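If you’d rather skip clicking through the Task Scheduler dialogs, the same task can be created from an elevated prompt with the built-in schtasks tool. The task name, schedule, and script path below are examples—substitute your own:

```powershell
# Create a weekly task (Sundays at 3 AM) that runs the cleanup script.
# Assumes the script path contains no spaces; quote-escaping gets messy otherwise.
schtasks /Create /TN "Lightroom Backup Cleanup" /SC WEEKLY /D SUN /ST 03:00 /TR "powershell.exe -NoProfile -ExecutionPolicy RemoteSigned -File C:\Scripts\Clean-LrBackups.ps1"
```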

If there’s demand for it, or I have some free time, I’ll look into putting together a video walk-through of the whole process. In the meantime, here’s the PowerShell script I’m running.

Note: This script requires PowerShell version 3 to be installed (it should install with the .NET Framework 4.5; if not, you can download it from Microsoft). Earlier PowerShell versions lack the -Directory switch for the Get-ChildItem cmdlet. You can tell what PowerShell version your computer has by opening a PowerShell prompt and typing $host.version.
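A minimal sketch of what such a script looks like is below—keep the newest few backups, 7-Zip them, and delete the rest. The variable names $lrbackup_path and $backups_to_keep match the ones described above; the 7-Zip install path and the exact folder layout are assumptions you should edit for your own setup:

```powershell
# Sketch of a Lightroom backup-pruning script (PowerShell 3+).
# $lrbackup_path, $backups_to_keep, and $7zip are placeholders -- edit them.
$lrbackup_path   = "C:\Users\Me\Pictures\Lightroom\Backups"
$backups_to_keep = 4
$7zip            = "C:\Program Files\7-Zip\7z.exe"

# Bail out if 7-Zip isn't installed where we expect it
if (-not (Test-Path $7zip)) {
    Write-Error "7-Zip not found at $7zip"
    exit 1
}

# Lightroom creates one dated folder per backup; sort them newest first
# (-Directory requires PowerShell 3)
$backups = Get-ChildItem $lrbackup_path -Directory |
           Sort-Object LastWriteTime -Descending

# Delete every backup folder beyond the newest $backups_to_keep
$backups | Select-Object -Skip $backups_to_keep |
    Remove-Item -Recurse -Force

# Compress any still-uncompressed .lrcat files in the kept backups,
# removing the original only if 7-Zip reports success
foreach ($dir in ($backups | Select-Object -First $backups_to_keep)) {
    Get-ChildItem $dir.FullName -Filter *.lrcat | ForEach-Object {
        $archive = "$($_.FullName).7z"
        if (-not (Test-Path $archive)) {
            & $7zip a -t7z -mx=9 $archive $_.FullName | Out-Null
            if ($LASTEXITCODE -eq 0) { Remove-Item $_.FullName }
        }
    }
}
```

Treat this as a starting point rather than a drop-in solution: test it against a copy of your backups folder before scheduling it.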

This entry was posted on Wednesday, May 15, 2013 at 12:00 and last updated on Tuesday, Jun 11, 2013 at 23:20 by Jason Franke. This post was filed in the category Digital Darkroom.
