How To:Backup Databases Through Dumps (the old way)

Latest revision as of 16:13, 6 January 2011

You can back up a small database of any sort quite easily by simply dumping it to a static file before each Amanda run.

Databases generally change their data files constantly, and GNU Tar cannot make a consistent backup of such data files. However, it can back up a static file full of SQL or some other database-specific dump.

Note that this method backs up a full-size dump file on every run, even if little has changed.

= Better Options =

If you are using Postgres and have anything but the most trivial database, you should use continuous WAL archiving as described on [[How To:Use Amanda to Back Up PostgreSQL]]. This gets you real incremental dumps and easier recoveries.

= Dumps =

Here is an example script for backing up the data of a PostgreSQL instance:

<pre>#!/bin/sh
# Prevent access to the data in the backup
umask 037
# Remove the backup from the day before yesterday to prevent disk space from filling up
/bin/rm -f /backups/PostgreSQL/postgresql-8.3-`date --date="2 days ago" +%F`.sql.gz 2>/dev/null
# Dump the contents of all databases
/usr/lib/postgresql/8.3/bin/pg_dumpall --port=5434 | /bin/gzip - > /backups/PostgreSQL/postgresql-8.3-`date +%F`.sql.gz</pre>
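To recover from such a dump, feed it back to <tt>psql</tt>. A minimal sketch, assuming the same 8.3 installation and port as above (the date in the filename is illustrative):

<pre>gunzip -c /backups/PostgreSQL/postgresql-8.3-2011-01-05.sql.gz | /usr/lib/postgresql/8.3/bin/psql --port=5434 postgres</pre>

Output from <tt>pg_dumpall</tt> is plain SQL that recreates the databases and reconnects to each one as it restores, so it is run through <tt>psql</tt> connected to an existing database such as <tt>postgres</tt>.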

Add a crontab entry to run this dump before amdump; either leave enough time between the two crontab entries, or run one after the other with <tt>&&</tt>. Add the database's data directory to the exclude list to prevent backing up data twice. For example: <tt>/var/lib/postgresql/8.3/main</tt>
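As a sketch, a single crontab line can chain the two steps; the script path and the Amanda configuration name <tt>DailySet1</tt> are placeholders for your own setup:

<pre># Run the dump at 00:30, then start the Amanda run only if the dump succeeded
30 0 * * * /usr/local/sbin/pg-dump-backup.sh && /usr/sbin/amdump DailySet1</pre>

Chaining with <tt>&&</tt> avoids guessing how long the dump takes and skips the Amanda run entirely if the dump fails, so you never back up a truncated dump file.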