Sunday, July 23rd, 2006, 9:11 am

Reliable Backup Mechanism


Today I would like to explain, at a relatively shallow level of depth, my most basic backup routines. I will concentrate on a somewhat simplified perspective, namely my current backup approach for local files, as opposed to the Web. The method is largely automated, owing to cron jobs (scheduler-driven). More details of the method were described in older blog items, which are worth a cursory look.

At present, I continue to seek and stick to a robust backup mechanism that is rather immune to human error, as well as to hardware failures. I take the ‘stacked backup’ approach (keeping several cumulative/progressive backups) and I always remain paranoid, so as to stay on the ‘safe side’ of things. I fear (and maybe even loathe) situations where I might lose data, as this costs a lot of time and can even lead to considerable emotional pain, especially in the case of irreversible loss. As a result, I have scripted all my backup routines. I can just set it all up and thereafter forget about it, so the frequency of backups can be increased without extra cost in time. In this blog post, I would like to share a few commands that I use, for whatever it’s worth. Here are bits referenced from the crontab file, as well as some corresponding and related scripts.
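
By way of illustration, a crontab entry along the following lines could trigger such a routine every night; the script path and schedule here are hypothetical, shown merely to demonstrate the principle.

# hypothetical crontab entry: run the backup script nightly at 2:30 am
30 2 * * * /home/roy/bin/nightly-backup.sh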

First of all, here is a command which takes all the settings files (those beginning with a dot) and puts them on the external storage medium, datestamped. It’s possible to go further and compress the archive (e.g. using gzip), but that makes the entire process much slower.

tar -cf /media/SEA_DISK/Home/Home-Settings/home-settings`date +%Y-%m-%d`.tar ~/.[0-z]*
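
Should the slowdown be tolerable, compression is just one flag away. Here is a sketch of the same command with gzip compression (note the -z flag and the .tar.gz suffix), which I do not use myself:

# same backup as above, gzip-compressed; smaller archives, but slower
tar -czf /media/SEA_DISK/Home/Home-Settings/home-settings`date +%Y-%m-%d`.tar.gz ~/.[0-z]*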

Here is a simple way of preparing a datestamp-named directory.

mkdir /media/SEA_DISK/Home/`date +%Y-%m-%d`

I then take all files to be backed up, slicing them into volumes of 1 gigabyte (the filesystem will not accept files that exceed 4 gigabytes in size).

tar -cf - /home/roy/Main/BU | split -b 1000m - /media/SEA_DISK/Home/`date +%Y-%m-%d`/Baine-`date +%Y-%m-%d`.tar
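
For completeness, restoring from these volumes is a matter of concatenating the slices back into a single stream for tar. A sketch, assuming the default split suffixes (aa, ab and so forth) and an example date:

# reassemble the volumes and unpack them under the current directory
cat /media/SEA_DISK/Home/2006-07-23/Baine-2006-07-23.tar* | tar -xf -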

Lastly, important files that change frequently are copied without any compression.

cp -r /home/roy/Desktop/ /home/roy/.kde/share/apps/kpilot /home/roy/Main/MyMemos \
/home/roy/Main/kpilot-syslog.html /media/SEA_DISK/Home/Misc_local # local

I prefer to send copies of these files off-site as well, just for the sake of redundancy.

konsole -e rsync -r /home/roy/Desktop /home/roy/.kde/share/apps/kpilot \
/home/roy/Main/MyMemos /home/roy/Main/kpilot-syslog.html \
/home/roy/public_html roy@baine.smb.man.ac.uk:/windows/BU/Sites/SCG # and remote

In the above, Konsole is just a convenient graphical-textual wrapper for these operations; it spews out status messages or flags errors, should they ever emerge (a rarity).
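
For these remote transfers to run unattended (from cron, for instance), password-less SSH authentication must be in place, so that no prompt ever blocks the job. This one-time setup is not covered above; a common way of achieving it with OpenSSH would be:

# generate a key pair, then append the public key to the remote host
ssh-keygen -t rsa
ssh-copy-id roy@baine.smb.man.ac.uk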

I use tape archives to retain nightly stacks. Every night I use rsync to replicate my main hard-drive. To avoid the accumulation of deprecated files, I create a fresh copy twice a week, using rm -rf followed by scp (it could be rsync as well, in principle); a storage unit whose total capacity is 0.3 terabytes keeps stacks of the files from before each rm -rf operation. Here are some bits of code which are hopefully self-explanatory.

konsole -e rsync -r roy@baine.smb.man.ac.uk:/home/roy/* /home/roy/Main/BU/ &

For a fresh copy of a remote home directory, begin by erasing the existing files.

rm -rf /home/roy/Main/BU/*

rm -rf /home/roy/Main/BU/.[0-z]*

Then, copy all files using a simple remote copy command.

konsole -e scp -r roy@baine.smb.man.ac.uk:/home/roy/* /home/roy/Main/BU/ &
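
Tied together, the twice-weekly ‘fresh copy’ could comfortably live in one small script. The following is merely a sketch combining the steps above (the script name is hypothetical), not a verbatim copy of my own:

#!/bin/sh
# fresh-copy.sh (hypothetical): wipe the local mirror, then fetch it anew
rm -rf /home/roy/Main/BU/*
rm -rf /home/roy/Main/BU/.[0-z]*
scp -r roy@baine.smb.man.ac.uk:/home/roy/* /home/roy/Main/BU/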

The stacked backups that are dated get deleted manually, and quite selectively so! One should permit reversal to older states of the filestore by leaving sensible time gaps between retained backups. This prevents backups from getting ‘contaminated’ too quickly. Important files are often replicated on file/Webspaces, so the most I can lose is often less than one day’s worth in the event of a physical hard-drive failure. The files are kept in 3 separate archives at 2 different sites in Manchester (home and the University; it used to be three sites before I left one of my jobs). All in all, I hope this inspired someone. If not, at least it can serve as a page I can refer friends to in case they seek something similar.
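
For anyone who would rather inspect the stacks before pruning them by hand, a couple of standard commands suffice; this is a suggestion rather than part of my own routine, which remains deliberately manual:

# list the datestamp-named directories in chronological order
ls -1d /media/SEA_DISK/Home/????-??-??

# show how much space each stack occupies
du -sh /media/SEA_DISK/Home/????-??-??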

More tips on *nix-oriented backup can be found in a recent article.
