Thursday, June 9th, 2005, 5:00 am
One-Click Backups
There are fast and automated ways of backing up *NIX servers. Backups should include databases as well as home directories, of which there can be many.
Compressed Archives
If backups can be downloaded via links (e.g. under cPanel), the browser’s default action for .gz files can be set to “save in home directory”. A little script can then time-stamp all the archives and put them ‘in storage’:
# Once back-ups have been put in the home directory, move them all
# to a time-stamped directory in the back-up area
cd /home/user/
mkdir `date +%Y-%m-%d`
# move home directories
mv backup* /home/user/backup/Home_Directories
# move databases to the new directory
mv *.gz `date +%Y-%m-%d`
# merge if required
cp -rf `date +%Y-%m-%d` /home/user/backup/Databases/Recent
# clean up (NOTE: mv cannot merge, hence the cp followed by rm)
rm -r `date +%Y-%m-%d`
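Since the date command is re-evaluated for every step above, a run that straddles midnight could scatter files across two directories. A small variation (just a sketch, not part of the original script) captures the time-stamp once:
#!/bin/sh
# capture the time-stamp once so every step refers to the same directory
STAMP=`date +%Y-%m-%d`
cd /home/user/
mkdir "$STAMP"
mv backup* /home/user/backup/Home_Directories
mv *.gz "$STAMP"
cp -rf "$STAMP" /home/user/backup/Databases/Recent
rm -r "$STAMP"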
FTP
If your webspace host offers FTP access alone, there is no simple solution, though command-line ftp might still be of use.
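For what it is worth, here is a sketch of a non-interactive session driven by the stock command-line ftp client; the host name, credentials and file name are all placeholders:
#!/bin/sh
# fetch an archive over plain FTP with no prompting
# -i turns off interactive prompts, -n turns off auto-login, -v is verbose
ftp -inv ftp.example.com <<END_FTP
user [FTP_username] [FTP_password]
binary
cd backup
get backup-latest.tar.gz
bye
END_FTP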
SCP
Copy files from the server to your local machine. In the form shown here, the command is invoked from the server’s side and pushes the files over to the backup machine; it resembles the following:
scp -r ~/public_html your_user_name@machine.domain.suffix:/home/user/backup
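Alternatively, the transfer can be pulled from the local machine; a sketch, assuming key-based SSH authentication so that no password prompt gets in the way (the host name is a placeholder):
# pull version, run on the local machine instead of the server
scp -r your_user_name@server.example.com:public_html /home/user/backup
The pull direction is often easier when the local machine sits behind a router and cannot accept incoming SSH connections.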
MySQL Dumps
Set up cron jobs on the server, e.g.:
0 23 * * 1,4 mysqldump --user=[DB_username] --password=[DB_password] [DB_name] > ~/tmp/mydatabase1.dump
0 23 * * 2,5 mysqldump --user=[DB_username] --password=[DB_password] [DB_name] > ~/tmp/mydatabase2.dump
0 23 * * 0,3,6 mysqldump --user=[DB_username] --password=[DB_password] [DB_name] > ~/tmp/mydatabase3.dump
The jobs above keep a rotating stack of three dumps, which can be used to roll back the data should something catastrophic happen. The dumps can then be fetched along with the home directories. The code can be extended so that the dumps are compressed and the destinations are remote.
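A sketch of that extension, with the same job piped through gzip and then pushed off-site with scp (the remote host and paths are placeholders, and key-based SSH authentication is assumed so the job can run unattended):
0 23 * * 1,4 mysqldump --user=[DB_username] --password=[DB_password] [DB_name] | gzip > ~/tmp/mydatabase1.dump.gz && scp ~/tmp/mydatabase1.dump.gz your_user_name@machine.domain.suffix:/home/user/backup/Databases/Recent/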