Home Messages Index
[Date Prev][Date Next][Thread Prev][Thread Next]
Author IndexDate IndexThread Index

Re: More tales of a system twonk


On Sat, 11 Mar 2006 10:08:41 +0000,
 Roy Schestowitz <newsgroups@xxxxxxxxxxxxxxx> wrote:
> __/ [ Jim Richardson ] on Saturday 11 March 2006 09:23 \__
>> On Sat, 11 Mar 2006 06:37:44 +0000,
>>  Roy Schestowitz <newsgroups@xxxxxxxxxxxxxxx> wrote:
>>> __/ [ Larry Qualig ] on Saturday 11 March 2006 04:05 \__
>>>> Roy Schestowitz wrote:
>>>>> __/ [ The Ghost In The Machine ] on Friday 10 March 2006 23:00 \__
>>>> < large snip >
>>>>> Backing up, where appropriate, is an automatic process, so the owner can
>>>>> go play a favourite sport, while the other, less fortunate people
>>>>> stare at a GUI, inserting and ejecting CDs, or paying large sums of
>>>>> money to professionals with Knoppix.
>>>>> Speaking of the need to back up, the machine I'm currently using has
>>>>> been up and running without interruption for two and a half years. From
>>>>> the point of view of the OS alone, it is less prone to breakage than
>>>>> the counterparts I used in the past.
>>>> I've been very happy with my backup plan as of late. My Win2000 server
>>>> (recently upgraded to 2003) has had an Adaptec Sata RAID-1 config for a
>>>> few years now. This works fine for 'drive failure' cases but doesn't
>>>> help protect against data loss. If something goes terribly wrong both
>>>> mirrored drives will get borked.
>>>> Now that I have "Ubu" on-line with ample disk space to spare my problem
>>>> is solved. The Win2003 server has a job scheduled to automatically
>>>> back itself up onto Ubu. This happens every Monday morning at 4:30 AM.
>>>> (The house is usually pretty quiet around then.) The day before at 4:30
>>>> AM Ubu makes a backup copy of itself and drops it off on Bubba (the
>>>> Windows server).
>>>> So between the RAID-1 arrays and having each server use the other one
>>>> as backup I should be pretty well covered. Which of course means that
>>>> since I'm prepared for a data disaster one won't happen but that's okay
>>>> with me.
>>> For what it's worth, here is my super-complicated backup 'package'.
>>> On the receiving end:
>>> mkdir /home/roy/BU/Filestore/<machine_name>/Backup-`date +%Y-%m-%d`
>>> On the sending end:
>>> nice scp -r ~/.[0-z]* ~/* roy@<destination_address>:/home/roy/BU/Filestore/<machine_name>/Backup-`date +%Y-%m-%d`
>>> This keeps a _stack_ of backups, just in case of data loss (e.g. user
>>> erases a file by accident only to realise it months later).
>> Allow me to introduce you to a tool called rdiff-backup. Does exactly
>> that, over rsync, so you don't dedicate a lot more storage than the raw
>> files take.
> I am aware of rsync, but I have plenty of bandwidth and storage capacity to
> spare. It leads to simplicity at the expense of efficiency.
> Thanks for the advice, Jim. I'll consider improving. *smile*

It's a lot more than just rsync. It does a differential backup of a
directory, recursively: the latest versions of the files go into the
backup dir as plain files, and diffs against previous versions are kept
in a data directory inside the backup dir. You can easily get at the
old versions from the CLI, or with other tools.

It's written in Python on top of the rsync delta algorithm (librsync). Quite useful :)
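A minimal session, from memory (the directory names here are made up, and flags vary a little between versions, so check `man rdiff-backup` on your system):

```shell
# Back up /home/roy into /backup/roy. On later runs, only changes are
# transferred; old versions are kept as compressed reverse diffs.
rdiff-backup /home/roy /backup/roy

# The latest files sit in /backup/roy as plain files; the per-version
# diffs and metadata live under /backup/roy/rdiff-backup-data.
ls /backup/roy/rdiff-backup-data

# Restore a file as it was 10 days ago.
rdiff-backup -r 10D /backup/roy/some/file /tmp/file.restored
```

So browsing the most recent backup needs no special tools at all; rdiff-backup is only needed to reach back in time.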



Jim Richardson     http://www.eskimo.com/~warlock
	Don't get mad, get Linux
