Next I needed a way to take my site offline automatically, as starkos warns: "Be sure to take your site offline first, to avoid capturing an update in progress." This is good advice for the database backup, and I didn't want to hang around to do it manually at some off-peak hour.

I decided not to use the modules (which are probably a good fit for a shared hosting setup) and instead went with a combination of the aforementioned options.

As for server configuration, my server's crontab was already set up to run Drupal's cron.php, so I just added an extra cron entry to run a backup script. I didn't mind installing drush and s3cmd (instructions below).
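For reference, the crontab might end up looking something like this; the schedule, script name, and paths here are my own illustrative choices, not necessarily what you'll want:

```shell
# m h dom mon dow  command
# Existing entry: trigger Drupal's cron.php hourly (URL is hypothetical)
0 * * * * wget -q -O /dev/null http://mywebsite.com/cron.php
# New entry: run the backup wrapper nightly at 3 a.m. (name/path assumed)
0 3 * * * /bin/sh /var/www/mywebsite.com/sites/default/backup-wrapper.sh mywebsite.com
```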

Rather than edit the backup script provided by starkos (to keep it interchangeable) to include the relevant drush and s3cmd commands, I created a wrapper script that does the following:

1. Execute the drush command to take the site offline: drush vset site_offline 1 --yes

2. Run the database backup script by starkos, specifying a .mysql file outside of the web root

3. Create a tar archive of my site's files and the .mysql file:
   - specifying a .tar.gz file outside of the web root
   - adding the day of the week to the file name

4. Execute the drush command to bring the site back online: drush vset site_offline 0 --yes

My script also depends upon starkos' backup.sh, and resides next to it in my Drupal sites/default folder. This keeps things simple for the $1 parameter, which is passed through to starkos' script (and used to build the file name for the tar archive), and for drush, which likes to be run from within the Drupal directory. It also assumes your site is in /var/www.
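To illustrate the layout and the $1 parameter (file and directory names here are hypothetical):

```shell
# Assumed layout:
#   /var/www/mywebsite.com/sites/default/backup.sh          <- starkos' script
#   /var/www/mywebsite.com/sites/default/backup-wrapper.sh  <- the wrapper
#
# Invoked from within sites/default, with the site directory name as $1:
#   cd /var/www/mywebsite.com/sites/default
#   ./backup-wrapper.sh mywebsite.com
```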

Running this script from cron required a few changes, mostly hardcoding paths to binaries and config files. Because the script(s) need to be run from within the sites/default directory, I ended up with a lengthy cron command (replace mywebsite.com with your web site's directory name):