For backing up my wiki, I have a simple solution. Just put a script in the bin dir
(I actually put all my local scripts in a scripts/ dir so as not to mix them with the
official TWiki ones) named wiki-backup.tgz:

Interesting idea - it took me a while to realise that there's a script with a name ending in .tgz... It might be cleaner to use PATH_INFO to do this, i.e. use a script called backup and a URL like /bin/scripts/backup/wiki-tar-180402.tgz. This also lets you use a suitable name based on today's date.
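For illustration, here is a minimal sketch of such a CGI backup script. This is not Colas's actual script; the paths, function names, and the date-stamped filename are all my own assumptions, and it assumes a Unix host with tar on the PATH:

```shell
#!/bin/sh
# Sketch of a CGI backup script for a TWiki install (hypothetical paths).
# Streams a gzipped tarball of the data/ directory back to the browser,
# suggesting a filename based on today's date.

TWIKI_ROOT="${TWIKI_ROOT:-/var/www/twiki}"

wiki_backup_headers() {
    # CGI headers: mark the payload as a gzip archive and suggest a
    # date-stamped filename for the download
    printf 'Content-Type: application/x-gzip\r\n'
    printf 'Content-Disposition: attachment; filename="wiki-tar-%s.tgz"\r\n\r\n' \
        "$(date +%y%m%d)"
}

wiki_backup_archive() {
    # Stream the topic files as a compressed tar; add pub/ as well if
    # you want attachments included in the backup
    tar -C "$TWIKI_ROOT" -czf - data
}

# When run as a CGI script, emit the headers followed by the archive
if [ -d "$TWIKI_ROOT/data" ]; then
    wiki_backup_headers
    wiki_backup_archive
fi
```

Dropped into the bin (or scripts/) directory and made executable, fetching its URL should download a dated .tgz of the wiki content.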

I use rsync very successfully to maintain a mirror of our intranet TWiki, which contains mission-critical support information (which wouldn't be much use if the TWiki server crashed). I was even wondering whether to trigger the rsync when a topic is saved.

The rsync method is generally useful, and can also be used to allow a Distributed TWiki - this gives you a situation whereby you not only have backups (copied to physically different machines) but also get relatively simple failover.

One word of warning though - if you're using the --delete option and syncing often (highly recommended), then if the disk on the primary server becomes full you can lose edits, and that loss will be propagated to the rsync slaves. (Which means that also having a regime similar to Colas's is sensible.)
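As a concrete illustration of the rsync approach - the hostnames and paths here are made up, and --max-delete is just a standard rsync safeguard worth knowing about in light of the warning above:

```shell
#!/bin/sh
# Hypothetical mirror script, run from cron every 15 minutes, e.g.:
#   */15 * * * * /usr/local/bin/twiki-mirror
# --delete keeps the mirror an exact copy; --max-delete caps how many
# files a single run may remove, limiting the damage if the primary's
# disk fills up and topic files start disappearing.
rsync -az --delete --max-delete=50 \
    /var/www/twiki/data /var/www/twiki/pub \
    backup@mirror.example.com:/var/backups/twiki/
```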

An alternative to both approaches is to implement ReadWriteOfflineWiki - that way you end up with large numbers of backups, all of which can become the primary system at any time.

-- MS - 30 Jan 2004

After having a hosting service delete my entire TWiki installation early in 2002 (resulting in my losing months' worth of work), I became very interested in figuring out a way to back up my TWiki data and other key configuration files. However, since I was working over a dialup connection, it was also very important for me to minimize the size of my downloads. And because I'm bad about routine maintenance, I wanted the system to be as automatic as possible. After looking around for options, I found a couple of small scripts that, together, provided a very nice solution. They were:

BackerUpper - Creates compressed backup files with easy customization of which directories/files to include/exclude, frequency of backup, type of backup (full or incremental), etc. I set it up to do incremental backups daily and full backups on the first of the month.

SendEmail - Sends me the compressed backup archives and then deletes them from the server.
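I don't have BackerUpper to hand, but the full/incremental scheme it implements can be sketched with GNU tar's --listed-incremental snapshot feature. The function name and layout below are my own, not BackerUpper's:

```shell
#!/bin/sh
# Sketch of a full/incremental backup in the BackerUpper style, using
# GNU tar's snapshot file. "full" starts a fresh snapshot (so everything
# is archived); "incr" archives only files changed since the last run.
# Usage: wiki_backup full|incr <twiki_root> <backup_dir>
wiki_backup() {
    mode=$1
    root=$2
    dest=$3
    snar="$dest/backup.snar"
    if [ "$mode" = full ]; then
        # Removing the snapshot file makes the next tar run a full dump
        rm -f "$snar"
    fi
    tar --listed-incremental="$snar" \
        -czf "$dest/wiki-$mode-$(date +%Y%m%d%H%M%S).tgz" \
        -C "$root" data
}
```

Cron could then call `wiki_backup full ...` on the first of the month and `wiki_backup incr ...` daily, matching the schedule described above.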

I have a batch file that I invoke via cron that runs BackerUpper and then SendEmail. It also does all my other TWiki maintenance tasks like running mailnotify, deleting session files, etc., and sends me a nice email reporting on everything.
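The batch file itself isn't shown; a rough sketch of the session-cleanup part might look like the following. The paths, the working/tmp location, and the commented-out helper names are all my assumptions, not the actual setup:

```shell
#!/bin/sh
# Sketch of a nightly maintenance script in the spirit described above
# (paths and helper names are hypothetical). Run it from cron, e.g.:
#   0 3 * * * /home/twiki/scripts/twiki-maintenance.sh

TWIKI_ROOT="${TWIKI_ROOT:-/var/www/twiki}"

clean_sessions() {
    # Delete CGI session files that have not been touched for a week
    find "$TWIKI_ROOT/working/tmp" -name 'cgisess_*' -mtime +7 \
        -exec rm -f {} +
}

# run_backup      # e.g. BackerUpper, then SendEmail with the archives
# mailnotify      # TWiki's change-notification script
if [ -d "$TWIKI_ROOT/working/tmp" ]; then
    clean_sessions
fi
```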

Perhaps someone with more Unix knowledge could have fashioned a similar solution from scratch, but I found this setup relatively easy to put together, and it has been working like a charm for two years now. If anyone is interested in using this approach and needs some help, feel free to contact me and I'll give you the details of my setup. I've thought about packaging it as a TWikiAddOnProduct.

Benoit - I haven't published my backup solution yet on TWiki.org. It is available in "draft" form on my site at http://skyloom.com/Dev/HostedTWikiBackupAddOn. I welcome your trying it out and giving me suggestions for polishing it up a bit before releasing it "officially" here on TWiki.org.