How To:Backup Databases Through Dumps (the old way)

From The Open Source Backup Wiki (Amanda, MySQL Backup, BackupPC)


You can back up a small database of any sort quite easily by simply dumping it to a static file before each Amanda run.

Databases generally change their data files constantly, and GNU Tar cannot make a consistent backup of such data files. However, it can back up a static file full of SQL or some other database-specific dump.

Note that with this method the full dump file is backed up on every run, even at incremental backup levels, so it is best suited to small databases.

Better Options

If you are using Postgres and have anything but the most trivial database, you should use continuous WAL archiving as described on How To:Use Amanda to Back Up PostgreSQL. This gets you real incremental dumps and easier recoveries.


Here is an example script for backing up the data of a PostgreSQL instance:

#!/bin/sh
# Prevent other users from reading the dumps
umask 037
# Remove the dump from the day before yesterday to keep the disk from filling up
rm -f /backups/PostgreSQL/postgresql-8.3-$(date --date="2 days ago" +%F).sql.gz
# Dump the contents of all databases in the cluster
/usr/lib/postgresql/8.3/bin/pg_dumpall --port=5434 | gzip > /backups/PostgreSQL/postgresql-8.3-$(date +%F).sql.gz
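The rotation above keeps two days of dumps on disk. A small sketch of the filenames involved (the directory and version prefix are taken from the example script):

```shell
#!/bin/sh
# Show which dump the script writes today and which one it removes
# (directory and prefix match the example script above).
BACKUP_DIR=/backups/PostgreSQL
TODAY=$(date +%F)                      # today's dump, about to be written
PURGE=$(date --date="2 days ago" +%F)  # dump old enough to delete
echo "writing:  $BACKUP_DIR/postgresql-8.3-$TODAY.sql.gz"
echo "removing: $BACKUP_DIR/postgresql-8.3-$PURGE.sql.gz"
```

The net effect is that yesterday's and today's dumps always remain on disk, so a recent dump is available locally even while the current run is still writing.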

Add a crontab entry to run this dump before amdump; either leave enough time between the two crontab entries, or chain them with &&. Also add the database's data directory (for example, /var/lib/postgresql/8.3/main) to Amanda's exclude list so the live data files are not backed up twice.
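One way to wire the two together, sketched as a root crontab entry (the script path and the Amanda configuration name are assumptions):

```
# Dump all databases at 01:00, then start the Amanda run only if the dump succeeded
0 1 * * * /usr/local/sbin/pg-dump-backup.sh && /usr/sbin/amdump DailySet1
```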
