If you run a firewall, working out the rules that allow
Amanda access can be something of a challenge at first. The Amanda server
contacts the clients on port 10080. At that point, the
Amanda client forks the amandad process and seeks random UDP ports on
the Amanda server. In turn, the server opens a couple of ports to
the client for the data and messages. If indexes are enabled, an
additional port, TCP 10082, also must be open on the server.

The
random ports can be addressed in multiple ways. Beginning with Amanda
version 2.4.2, a compile option,
--with-portrange=xxx,yyy,
directs Amanda to use the given port range for connections between
clients and servers.
The selected port range needs to be opened on both the
client and server ends of the firewall. If you plan on running amrecover
on the client end, you also should plan on opening TCP ports 10082 and
10083 on the server, in addition to UDP port 10080 on both the server
and client machines. All of these are in addition to the ports defined with
--with-portrange.
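
As an illustration, here is how the pieces might fit together. The 50000-50100 range and the iptables rules below are assumptions for the example, not values Amanda requires; amindexd and amidxtaped are the services amrecover talks to on TCP 10082 and 10083:

# build Amanda with a restricted port range (example range only)
./configure --with-portrange=50000,50100

# firewall rules on the Amanda server (iptables syntax; adapt to your firewall)
iptables -A INPUT -p udp --dport 10080 -j ACCEPT       # amandad requests
iptables -A INPUT -p tcp --dport 10082:10083 -j ACCEPT # amindexd/amidxtaped for amrecover
iptables -A INPUT -p tcp --dport 50000:50100 -j ACCEPT # data/message ports from --with-portrange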

Amanda has the ability to work with most tape devices and libraries.
The tricks are defining the
proper tape type in the amanda.conf file and selecting and defining the
proper changer scripts (Listing 3). A single tape drive is by far the
least complicated setup: simply define the drive and tape type
in the amanda.conf file and you are on your way.
If a tape library or stacker is available,
the configuration is a multiple-step process consisting of defining the
tape changer program in amanda.conf (chg-multi, chg-scsi and chg-zd-mtx
are three examples) and pointing amanda.conf at a changer
configuration file. With the Amanda server running on a Linux
platform, you need the sg (SCSI generic) module compiled into
the kernel. You also need the mtx program if you are running any of the
mtx-based changer scripts. Amanda looks for a valid header on each tape,
and without it amcheck and/or amdump fails. Use the
amlabel tool to label the tapes with names matching the regular expression
defined in the amanda.conf file, for example:

/usr/sbin/amlabel DailySet DailySet1
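
For reference, a tape type definition and the label regular expression in amanda.conf might look like the following sketch; the tapetype name and the capacity and speed numbers are placeholders, not measured values for any particular drive:

define tapetype EXAMPLE-DLT {
    comment "example DLT drive -- numbers are placeholders"
    length 35000 mbytes
    filemark 2000 kbytes
    speed 1500 kbytes
}
tapetype EXAMPLE-DLT
labelstr "^DailySet[0-9]*$"    # amlabel names must match this pattern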

Listing 3. The Defined Tape Changer in amanda.conf and
an Example CHANGER.conf File

# At most one changerfile entry must be defined; select the most
# appropriate one for your configuration. If you select man-changer,
# keep the first one; if you decide not to use a tape changer, you
# may comment them all out.
changerdev "/dev/sg1"
runtapes 1                  # number of tapes to be used in a single run of amdump
tpchanger "chg-zd-mtx"      # the tape-changer glue script
tapedev "/dev/nst1"         # the no-rewind tape device to be used
changerfile "/var/lib/amanda/CHANGER"

The CHANGER.conf file:

changerdev=/dev/nst1
havereader=1
offline_before_unload=1
poll_drive_ready=10
max_drive_wait=99
unloadpause=20
driveslot=0
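
With a changer in place, each tape in the library still needs its own label. A minimal sketch, assuming the DailySet configuration above and a library with at least three slots, might be:

/usr/sbin/amlabel DailySet DailySet1 slot 1
/usr/sbin/amlabel DailySet DailySet2 slot 2
/usr/sbin/amlabel DailySet DailySet3 slot 3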

When dealing with multiple computing groups or business departments,
you may determine that a separate backup scheme for each
group is required. This is
accomplished easily using Amanda with separate backup configurations. An
example of this would be two backup environments, one for each of two
projects being
funded independently and therefore requiring that you account for the time
and resources spent on each. Environment one
is labeled DailySet-BigFunds and environment two is labeled
DailySet-LilFunds. For the DailySet-BigFunds project, you have a large
tape library that houses ten tapes. DailySet-LilFunds provides you with
a single DLT IV drive whose tapes must be changed daily. To keep these
two projects separate but housed on the same backup
server, you would set up separate directories in /etc/amanda
as DailySet-BigFunds[number of runs] and DailySet-LilFunds[number of
runs]. Each of these directories would contain an amanda.conf file
and a disklist of the filesystems to be backed up. This pattern
allows mutually exclusive backups to be performed on separate tape
devices. If and when configuration
changes must be made, you need to make them only to the relevant files. This
is handy in an educational environment where you often have multiple
groups with varied hardware and backup media.
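
A hedged sketch of such a layout, using the two project names from the example above, might look like this; each configuration is then run by name, independently of the other:

/etc/amanda/DailySet-BigFunds/amanda.conf
/etc/amanda/DailySet-BigFunds/disklist
/etc/amanda/DailySet-LilFunds/amanda.conf
/etc/amanda/DailySet-LilFunds/disklist

/usr/sbin/amdump DailySet-BigFunds
/usr/sbin/amdump DailySet-LilFunds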

Game On!

Here we are with the software installed, the tape and library
devices working and the tapes all labeled, loaded and ready to go. So,
who do we call to begin our virtually hands-off backups? None other than
crontab, which helps make our backups as hands-off as
possible (Listing 4). It is recommended that your crontab contain two
entries for each Amanda run that will take place. The first entry should be
an amcheck run that verifies the tape drive, tape header and each Amanda
client, ensuring the backup operation can be carried out as planned. A
good idea is to have this report mailed to a system account that is checked
often enough that any problems can be corrected prior to run time. The
second entry per job in the crontab should be the amdump command itself,
which allows Amanda to begin the backups without user intervention.
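
A minimal sketch of these two entries, placed in the amanda user's crontab, might look like the following; the times and the DailySet configuration name are only examples:

# mail an amcheck report in the afternoon so problems can be fixed before run time
0 16 * * 1-5 /usr/sbin/amcheck -m DailySet
# start the actual backup run shortly after midnight
45 0 * * 2-6 /usr/sbin/amdump DailySet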

Although I prefer the command line for burning ISO images, when it comes to making data backups, I have an easier time with k3b. I recently tried backing up my mail directory, but I have so many e-mails (maildir, heavy mailing lists) that the k3b app locked up and I wasn't able to use it. It's probably about 5 GB in size, including subdirectories. I was attempting to span the mail directory across multiple CD-RWs for backup, but since it crashed...
Also, from what I understand, since it is about 5 GB, I'm assuming I have to make a 5 GB ISO image, split to fit on the 700 MB discs. Can this be done on the fly?

In other words, is there a way to backup a directory, say 5 GB in size, so that the directory is compressed to (example, approximate) about 2 GB using tar or other compression utility, then break that 2 GB tarred file into 3 files, 700 MB, 700 MB, 600 MB, and burn those files to 3 CDs, without needing 2 GB hard disk space to temporarily store the files?

2 GB may not sound like much with today's hard drive sizes, but if the above is possible, I'm also planning on backing up entire partitions, using multiple CD disks (I'm sticking with CDR/RW until the larger DVD drives, 50+GB, come out).

Manually creating tar files, then breaking them up to 700 MB each, then organizing the disks, and making sure there is enough room on partitions for all the data, starts turning into such a logistical hassle that it never gets done. I just want to type out a command, write the label for the cd, stick the cd in the drive, hit enter, have the cd eject, put the next cd in, hit enter again, and continue the process until done.

Is this a possibility? Anyone care to share commands used to tar and burn in the same command using pipes? Or should the commands be separated because of the risk of making coasters?
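
One way to handle the tar-and-split part without staging the whole archive first, assuming GNU tar and split (the Maildir path and file names below are only placeholders), is a pipeline like this; each 700MB piece still needs scratch space before you burn it with your usual CD-writing tool:

tar czf - /home/user/Maildir | split -b 700m - mail-backup.tar.gz.
# restore later by concatenating the pieces back together:
cat mail-backup.tar.gz.* | tar xzf -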

Mondo Rescue + Mindi - http://www.microwerks.net/~hugo/ - will compress the source(s) and create and burn multiple ISOs for CDs spanning the source. Besides the author's (Hugo's) site linked here, see "Bootable Restoration CDs with Mondo", Linux Journal, October 2003. It is meant primarily as a bare-metal recovery tool, though.

I've used mondoarchive on other distributions (RH 7.3, RH 8.0 and RH 9.0), but I haven't been able to make it work with Enterprise ES. I've looked all over the Net, but I cannot seem to find information about this: binaries, HOWTOs...nothing. It's almost as if they were swallowed by the earth.

cdbkup is an excellent tool for this. It will, on the fly, tar the data and split the tar files up onto individual CDs. It has companion tools to concatenate the files back together and untar them to restore. It only needs 650MB of space in /tmp during the backup procedure, as it works with one ISO at a time.

I use Mondo Rescue, as you can set it to back up directly to the CD-RW (it will prompt you to change CDs), or, if like me you want it done overnight, you can tell it to back up to ISO images of the size you specify and then burn the ISOs manually.


I have been wondering about a similar situation and
am beginning to lean towards a DVD solution. The reason is
that some DVD formats (DVD+RW and DVD-RAM) can be made
to act almost like a hard disk. If this works as advertised,
writing a compressed 4GB archive without intermediate ISO images is
easy.
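
For instance, if the DVD-RAM or DVD+RW disc is mounted with an ordinary filesystem (the /mnt/dvd mount point below is only a placeholder), the archive could be written straight to the medium:

tar czf /mnt/dvd/home-backup.tar.gz /home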

While the only backup tool I've used to date is Amanda, I'm prepared to abandon it for another tool with features more to my liking - still open-source, of course. Others must have felt the same way, given the number of different backup programs out there. In my search, I've come across these tools with no special support for backups across a network:

I have used Cobian Backup for some time on Windows with great success. It supports compressed backups, FTP, network backups, encrypted and password-locked archives, full/incremental/scheduled backups and more.
It has been freeware for a long time and now it is open source too: http://sourceforge.net/projects/cobianbackup

In response to the above, the closest free/open-source alternative to Amanda that I've found so far is Bacula. While most of the other systems seem to focus on small backups to hard drive or CD/DVD, Bacula is a real client/server backup system (like Amanda) that caters to users who back up a network of multiple clients/servers to tape (i.e., it has built-in tape/volume handling, scheduling and so on) or other storage systems. Given the problems Amanda had with backing up Win32 machines (when the Samba team changed a single line of output, my whole backup system fell apart) and with spanning backups across multiple tapes, I migrated to Bacula (which has native backup agents for Win32 and supports tape spanning) and never looked back.
