Tuesday, March 8th, 2005, 3:38 am
Cron-based Backup
Regardless of your operating system, it is good practice to put recently added or changed files in a single container (let us call it transfer). Files accumulate in transfer until the next backup cycle; once a backup of that container has been taken, the files can finally be moved on to their destined directories. The notes that follow describe a method for backing up transfer automatically, using a server or another hard drive. They also place some emphasis on backing up Web sites, which is a closely related task.
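As a rough sketch of that routine in shell terms (the directory and file names here are only placeholders for whatever layout you use):

mkdir -p ~/transfer                         # the holding container
cp ~/Documents/report.tex ~/transfer/       # park new or changed work there
# ... the nightly backup of ~/transfer runs (see the cron jobs below) ...
mv ~/transfer/report.tex ~/Documents/done/  # after the backup, move files on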
Note: instructions are Linux/Mac-specific, but can be adapted to Windows
Until recently, I used to download gzipped tar archives of all my sites from CPanel every morning. Some weeks ago I automated part of the process by fetching the more crucial (and frequently changing) pages and storing them remotely. It all works as follows:
Set up a set of batch scripts (let us call them dummy1, dummy2, and so on) which include the following commands:
cd /home/roy/Main/Transfer_Archives/Sites/Roy/
wget -r -l1 -t1 -N -np -erobots=off http://schestowitz.com/
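Assembled into a file, dummy1 might look roughly like this; the shebang line and the exit guard are my additions, while the directory and URL come from the commands above:

#!/bin/sh
# dummy1: mirror the top level of the site into the transfer area
cd /home/roy/Main/Transfer_Archives/Sites/Roy/ || exit 1
wget -r -l1 -t1 -N -np -erobots=off http://schestowitz.com/

Remember to make the script executable (chmod +x dummy1), otherwise cron will not be able to run it.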
If you are not sure what the wget command is doing, type man wget and read the documentation. Then set up cron jobs which include the following tasks (a note on installing the crontab entries follows the listing):
Get local copies of important Web pages (see above)
38 23 * * * /home/roy/Main/Transfer_Archives/Sites/dummy1
38 23 * * * /home/roy/Main/Transfer_Archives/Sites/dummy2
...
Compress all the pages
50 23 * * * tar czvf /home/roy/Main/Transfer_Archives/www-`date +\%Y-\%m-\%d`.tar.gz /home/roy/Main/Transfer_Archives/Sites
(The backslashes are needed because cron turns a bare % into a newline, which would otherwise break the command.)
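To confirm that the nightly archive was written properly, its contents can be listed by hand; the file name below simply follows the date-stamp pattern of the tar line above:

tar tzvf /home/roy/Main/Transfer_Archives/www-2005-03-08.tar.gz | head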
Make a copy of the compressed pages
58 23 * * * cp -rf /home/roy/Main/Transfer_Archives/ /home/server2/transfer/roy
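All of the entries above live in a single crontab; the standard way to add them, and to check afterwards that they are in place, is:

crontab -e
crontab -l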
The final cp job puts a copy on the SAN, just to be fully covered. Such scripts allow you to sleep while your sites and files are being backed up.
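One further thought: if the second machine is not mounted locally the way server2 is here, the same copy step could run over the network instead. A minimal sketch using rsync over ssh, assuming key-based login and a hypothetical host called backup.example.com:

58 23 * * * rsync -az -e ssh /home/roy/Main/Transfer_Archives/ roy@backup.example.com:/home/roy/transfer/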