One of the things I constantly stress is backups.  When writing web applications we are often so focused on the results we want from the application that we can easily forget the underlying infrastructure, including the database.  Your beautiful Ruby-on-Rails app is only as good as the underlying database it sits on.  Keep that database alive with good backups.

Today a friend of mine lost a lot of data due to a rogue maintenance script. He had good backup procedures in place, but they weren’t running when he needed them most. Make sure you have good backups in place, then check on them from time to time to make sure they are still running.

These details primarily cover MySQL, but you can do the same thing with PostgreSQL and SQL Server.

First, make sure you are running the binary log. As long as you aren’t storing hundreds of thousands of records daily, keeping at least a week’s worth of statements shouldn’t require too much disk space. Enable this feature of MySQL by adding the following line to the [mysqld] section of your my.cnf file (many installs have it turned on by default).

log-bin = /var/log/mysql/bin.log

This records every statement that modifies your data. The advantage is that if your database crashes, or data is lost for some other reason between backups, you can replay these logs to regenerate the data.

Now run a daily backup.  This can be run from cron on Linux or Scheduled Tasks on Windows.  It should call mysqldump with a timestamp and store the dump, compressed or uncompressed, on your server.

Add a backup user account with privileges to read your web application’s databases.

CREATE USER 'wp_backup'@'localhost' IDENTIFIED BY '****************';
GRANT SHOW DATABASES ON *.* TO 'wp_backup'@'localhost';
GRANT SELECT, LOCK TABLES, SHOW VIEW ON `webapp\_wp`.* TO 'wp_backup'@'localhost';

Set up a configuration file (my-backup.cnf) that stores all connection settings and is readable only by the backup user account.

[client]
host     = localhost
user     = wp_backup
password = ****************
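That file holds the password in plain text, so lock its permissions down before anything else reads it. A quick sketch (the path is wherever you keep my-backup.cnf):

```shell
# Restrict my-backup.cnf to its owner only; other local users
# should not be able to read the database password.
touch my-backup.cnf            # ensure the file exists for this sketch
chmod 600 my-backup.cnf        # owner read/write only
stat -c '%a' my-backup.cnf     # prints: 600
```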

Now add that to your daily backup script, which might look something like this:

#!/bin/bash
mysqldump --defaults-file=my-backup.cnf webapp_wp | gzip > webapp_wp.$(date +%F).sql.gz
# if this is the only database on the server, flush the binary logs so they stay in sync with the backup
# mysqladmin -u root -pmypwd flush-logs
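Daily dumps pile up, so you may also want to prune old ones from the same script. A minimal sketch; both the directory and the seven-day window are assumptions, so tune them to wherever your dumps land and how much retention you want:

```shell
#!/bin/bash
# Delete compressed dumps older than seven days.
# BACKUP_DIR is hypothetical -- point it at the directory
# your daily script writes its .sql.gz files into.
BACKUP_DIR=${BACKUP_DIR:-/var/backups/mysql}
find "$BACKUP_DIR" -name 'webapp_wp.*.sql.gz' -mtime +7 -delete
```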

Save it in /etc/cron.daily as something like webapp-dbbackup and make it executable by root. (On many distros, cron.daily skips filenames containing a dot, so leave off any .sh extension.)
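The install command copies the script and sets its mode in one step. On the server you would run, as root, `install -m 755 webapp-dbbackup /etc/cron.daily/webapp-dbbackup`; the sketch below does the same into scratch paths so it runs anywhere:

```shell
# install(1) copies a file and sets its permissions in one step.
src=$(mktemp)       # stand-in for the webapp-dbbackup script
dest=$(mktemp -d)   # stand-in for /etc/cron.daily
install -m 755 "$src" "$dest/webapp-dbbackup"
stat -c '%a' "$dest/webapp-dbbackup"   # prints: 755
```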

Then, if you need to recover a database, create it as you normally would and load the dumped data back in.

zcat webapp_wp.2009-02-25.sql.gz | mysql --defaults-file=my-backup.cnf webapp_wp
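Before piping a dump back in, it’s worth checking that the archive isn’t truncated or corrupted. gzip -t tests integrity without extracting anything, so a guarded restore might look like this (same filenames as above):

```shell
# gzip -t exits non-zero on a damaged archive, so the restore
# only runs when the dump is intact.
gzip -t webapp_wp.2009-02-25.sql.gz &&
  zcat webapp_wp.2009-02-25.sql.gz | mysql --defaults-file=my-backup.cnf webapp_wp
```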

You can then recover any data created since the dump from the binary logs you set up in the first step, using Russell Dyer’s excellent article on restoring point-in-time data.

If you don’t have direct access to your database, do manual backups at an interval you are comfortable with, using phpMyAdmin or your hosting control panel. Also run one before making any major updates to your application or blog.