-
08-14-2005, 07:26 AM #1 Disabled
- Join Date
- Aug 2004
- Location
- Zurich, Switzerland
- Posts
- 774
Tutorial: simple backup rotation scripts
I received several requests to post my backup scripts. Before I start, some background on their purpose: I wrote them as part of the "Enterprise Class Backup" system we offer for webhosting accounts. The following rotation is used:
1) Full account backups:
a) on-server (dedicated hard drive used only for storing backups)
- daily backups, 7 days rotation
- weekly backups, 4 weeks rotation
- the standard daily/weekly/monthly cPanel backups are also available on this drive (1 day/1 week/1 month rotation)
b) off-server (another server in the same datacenter)
- weekly backups, 12 weeks rotation
c) off-site (stored on DVDs in a Swiss bank vault)
- monthly backups, 24 months rotation
2) MySQL database backups:
a) on-server (dedicated hard drive used only for storing backups)
- hourly backups, 24 hours rotation
b) off-server (another server in the same datacenter)
- daily backups, 7 days rotation
c) off-site
- weekly backups, stored on the monthly DVDs
Step by step guide
First of all, I'll describe the environment needed to run these scripts 1:1. This is a cPanel server running GNU/Linux. cPanel itself is not necessary (you could use any other backups instead of the daily cPanel account backups, just modify the scripts), but I can't say which non-GNU/Linux OSes would work without changes. Since we're talking about simple shell commands, they might work on other *nix OSes like *BSD or Solaris, but I haven't tested it. You're welcome to post your findings.
a) on-server backups
You'll need a certain directory structure so the scripts work without modifications. The backup hard drive is mounted as /backup and has the following directories:
/backup/cpbackup : here are the cPanel backups. The three subdirs will be created automatically by cPanel/WHM.
/backup/rotation : this is where my scripts write their output. You need to create the following subdirs: hourly, daily, weekly, monthly. You don't need to create any directories under these subdirs; the scripts will take care of that.
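The one-time directory setup can be wrapped in a tiny helper; this is just a sketch (the helper name is mine, and the base path parameter is there so you can adapt it if your backup drive is mounted somewhere other than /backup):

```shell
# Sketch: create the rotation tree the scripts below expect.
# mkdir -p makes this safe to re-run.
make_rotation_dirs() {
    base=${1:-/backup}
    for d in hourly daily weekly monthly; do
        mkdir -p "$base/rotation/$d" || return 1
    done
}
```

For example, `make_rotation_dirs /backup` creates all four subdirs in one go.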
Let's start with the hourly rotation for MySQL backups. These aren't compressed, because their primary purpose is this: if a client suddenly calls you in tears saying they destroyed/corrupted their big and important DB 5 minutes ago, you can just copy their files from the backup location back to the /var/lib/mysql subdir and they'll be happy. This is my script in /etc/cron.hourly (file permissions 7xx):
Code:
#!/bin/sh
rm -rf /backup/rotation/hourly/$(date +"%H")
mkdir /backup/rotation/hourly/$(date +"%H")
cp -R /var/lib/mysql/* /backup/rotation/hourly/$(date +"%H")
The daily rotation runs a couple of hours after the nightly cPanel backup, copies the latest account backups into the directory that corresponds to the current weekday, and keeps them for 7 days before overwriting them:
Code:
#!/bin/sh
rm -rf /backup/rotation/daily/$(date +"%u")
mkdir /backup/rotation/daily/$(date +"%u")
cp -R /backup/cpbackup/daily/* /backup/rotation/daily/$(date +"%u")
The weekly rotation script does the same. Note that this one has no automatic delete mechanism (well, a 52-week one), because I prefer to monitor available disk space and, as long as there are sufficient reserves, keep the files longer than advertised (after all, the advertised SLA is in my eyes the minimum; more is always OK). The script is located in /etc/cron.weekly and needs to be chmodded to 7xx.
Code:
#!/bin/sh
rm -rf /backup/rotation/weekly/$(date +"%V")
mkdir /backup/rotation/weekly/$(date +"%V")
cp -R /backup/cpbackup/daily/* /backup/rotation/weekly/$(date +"%V")
The monthly script adds some new twists for yet another layer of security: there have been repeated reports of cPanel/WHM corrupting a certain account's backup, leaving all its backup files useless. The following (admittedly storage-wasting) addition makes sure that even in this worst case you still have the majority of the customer's data. While it requires you to set up the account from scratch, copy lots of things into it, and still do some steps manually (like setting up the MySQL DBs and users in cPanel), at least all website data, MySQL DBs, e-mails stored on the server, webstat outputs etc. are still available. The script goes into /etc/cron.monthly and has 7xx permissions.
Code:
#!/bin/sh
rm -rf /backup/rotation/monthly/$(date +"%m")
mkdir /backup/rotation/monthly/$(date +"%m")
cp -R /backup/cpbackup/daily /backup/rotation/monthly/$(date +"%m")
cp -R /var/lib/mysql /backup/rotation/monthly/$(date +"%m")
cp -R /home /backup/rotation/monthly/$(date +"%m")
b) off-server backups
These are stored on an external backup server to which I have only FTP access (thus no fancy stuff). The first is the daily MySQL backup, located in /etc/cron.daily (7xx as always):
Code:
#!/bin/sh
cd /backup/rotation
tar -zcf mysql_$(date +"%F").tar.gz /var/lib/mysql/*
ftp -in <<EOF
open 127.0.0.1
user yourusername yourpassword
bin
hash
prompt
put mysql_$(date +"%F").tar.gz
bye
EOF
rm -f mysql_$(date +"%F").tar.gz
Next is the weekly FTP backup. This is a rather crude job that takes a lot of CPU horsepower if you've got many accounts. It creates a giant .tar.gz of everything in the daily cPanel backup dir and pushes it to the FTP backup server. Since by now we're talking about disaster-recovery backups (which you shouldn't need in the normal case), I prefer having them in one big file instead of many small ones (if you'd rather have hundreds of small files in your FTP space every week, you can always use the mput command):
Code:
#!/bin/sh
cd /backup/rotation
tar -zcf cpbackup_$(date +"%F").tar.gz /backup/cpbackup/daily/*
ftp -in <<EOF
open 127.0.0.1
user yourusername yourpassword
bin
hash
prompt
put cpbackup_$(date +"%F").tar.gz
bye
EOF
rm -f cpbackup_$(date +"%F").tar.gz
There is no monthly FTP job because in its place is the monthly off-site backup job.
c) off-site backups
The off-site part is done manually. I first copy the contents of /backup/cpbackup/daily to a directory with FTP access, then use an FTP client to download the files to a local machine and shovel them onto a DVD. For the weekly MySQL part, I prefer to use the most recent state, so I create a .tar.gz directly in an FTP-accessible directory and download it.
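The manual staging described above can be sketched as a small helper; the function name and the staging path are my own examples, not part of the original setup:

```shell
# Sketch: stage the off-site files into an FTP-visible directory.
# All three paths are parameters; any defaults you pick are your own.
stage_offsite() {
    src=$1      # cPanel daily backup dir, e.g. /backup/cpbackup/daily
    sqlsrc=$2   # MySQL data dir, e.g. /var/lib/mysql
    stage=$3    # FTP-accessible staging dir
    mkdir -p "$stage"
    # copy the account backups for later FTP download
    cp -R "$src"/. "$stage"/
    # pack the current MySQL state into one archive
    tar -zcf "$stage/mysql_$(date +%F).tar.gz" \
        -C "$(dirname "$sqlsrc")" "$(basename "$sqlsrc")"
}
```

e.g. `stage_offsite /backup/cpbackup/daily /var/lib/mysql /home/ftpstage/offsite`; from there the files can be downloaded and burned to DVD as described.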
Notes
* You shouldn't even try to set up this system on an overloaded or low-end server.
* You cannot sell even remotely all the space on your primary hard drive (unless your backup drive is several times larger).
* I am aware that doing full backups every time is the most space-wasting kind of backup, but I prefer it over more sophisticated methods because with a differential or incremental backup system once you lose the starting point, the rest is so much dead bytes. With full backups, any single one stands on its own.
* Offering such a sophisticated solution on a per-client basis (like a paid addon) needs only small modifications: you just specify the account backup files and the subdirs in /var/lib/mysql that are to be part of the process. You could e.g. solve it by creating a dir where you first copy over the needed files from their original places, then run the backup scripts on that directory.
* Any questions/feedback is welcome.
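The per-client staging idea from the notes above can be sketched like this (the function name, directory layout arguments, and the acct_* database naming are illustrative assumptions, not part of the setup described above):

```shell
# Sketch: gather one account's backup file plus its MySQL subdirs into a
# per-client staging dir, then archive just that dir.
client_backup() {
    acct=$1     # account name, e.g. "user1"
    cpdir=$2    # where the cPanel backups live
    sqldir=$3   # the MySQL data dir
    stage=$4    # working/staging area
    rm -rf "$stage/$acct"
    mkdir -p "$stage/$acct"
    # the account's cPanel backup, if present
    [ -e "$cpdir/$acct.tar.gz" ] && cp "$cpdir/$acct.tar.gz" "$stage/$acct"/
    # only this client's databases (cPanel-style acct_* naming assumed)
    for db in "$sqldir/$acct"_*; do
        [ -d "$db" ] && cp -R "$db" "$stage/$acct"/
    done
    tar -zcf "$stage/$acct.tar.gz" -C "$stage" "$acct"
}
```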
-
08-14-2005, 07:30 AM #2 Aspiring Evangelist
- Join Date
- Jan 2005
- Posts
- 363
Very good.
Thanks for sharing.
Portugal Networks
Shared and Reseller cPanel Accounts. NEW! Windows 2003 with Plesk, ASP.NET 2.0 and MSSQL 2005.
-
08-14-2005, 07:41 AM #3 Web Hosting Master
- Join Date
- May 2003
- Location
- Florida
- Posts
- 902
RambOrc,
Great instructions. Thank you for providing this.
-
08-14-2005, 10:13 AM #4 Disabled
- Join Date
- Apr 2005
- Location
- Cochin
- Posts
- 2,452
hey good work man
-
08-18-2005, 10:04 PM #5 Web Hosting Master
- Join Date
- Sep 2004
- Location
- Fairborn, ohio
- Posts
- 923
Would anyone care to walk a newbie through this? Do I just stick those lines of code in a text file, and rename it cron.daily and whatnot? Yeah, I'm an idiot
• Imeanwebhosting.com - Shared cpanel hosting, 99.9% uptime.
• 10 min average ticket responses, softaculous, rvsitebuilder, and more!
• Reliable, affordable shared hosting. I Mean Web Hosting!
-
08-18-2005, 10:26 PM #6 Web Hosting Master
- Join Date
- May 2003
- Location
- Florida
- Posts
- 902
Originally posted by Bohica
Would anyone care to walk a newbie through this? Do I just stick those lines of code in a text file, and rename it cron.daily and whatnot? Yeah, I'm an idiot
1 - go to the /etc/cron.hourly directory
-- cd /etc/cron.hourly
2 - create a file called hourlySql.sh and put this code in it:
-- use the editor of your choice (like vi)
#!/bin/sh
rm -rf /backup/rotation/hourly/$(date +"%H")
mkdir /backup/rotation/hourly/$(date +"%H")
cp -R /var/lib/mysql/* /backup/rotation/hourly/$(date +"%H")
3 - make the script executable:
-- chmod 700 hourlySql.sh
4 - create a directory for the backup files. This script is coded to use:
/backup/rotation/hourly
You can create this by using:
mkdir -p /backup/rotation/hourly
Now do the same for the other scripts.
Hope I didn't miss a big step. Good luck.
-
08-19-2005, 12:58 PM #7 Web Hosting Master
- Join Date
- Sep 2004
- Location
- Fairborn, ohio
- Posts
- 923
Can I name the files anything I want as long as they're in the right folder? Like would thisismymysqlcron.sh work? (dumb example, but you get what I'm asking).
-
08-19-2005, 01:08 PM #8 Disabled
- Join Date
- Aug 2004
- Location
- Zurich, Switzerland
- Posts
- 774
We aren't talking about something dumb like Winblows, so you can name the files anything you want and don't even need to give them a file extension (mine don't have any; I usually don't use file extensions on GNU/Linux unless it's an Apache handler).
Actually, you don't even have to put them into /etc/cron.daily and the like; you can put them anywhere you want. The convenient thing about the predefined cron directories on Red Hat is just that you don't need to specify every command/script in the crontab manually; they'll automatically run at predefined times.
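For reference, scheduling the scripts from an arbitrary location would look something like this in the crontab (edit with crontab -e); the script paths and times here are made-up examples:

```
# hourly MySQL copy at minute 0
0 * * * *  /root/scripts/hourlySql.sh
# daily rotation at 04:30, a couple of hours after the cPanel nightly backup
30 4 * * * /root/scripts/daily-rotation.sh
# weekly rotation on Sunday mornings
0 6 * * 0  /root/scripts/weekly-rotation.sh
```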
-
08-19-2005, 01:35 PM #9 Web Hosting Master
- Join Date
- Sep 2004
- Location
- Fairborn, ohio
- Posts
- 923
That's pretty sweet! One more "newb" question.
The one that tars everything in daily and throws it in my FTP backup: will it overwrite the old file every time? I mean, I would like it to be only 1 rotation because my FTP backup space is only 50 GB. Just curious, thanks!
-
08-19-2005, 01:47 PM #10 Disabled
- Join Date
- Aug 2004
- Location
- Zurich, Switzerland
- Posts
- 774
Since the filename includes the date, of course it won't. But if you want it to overwrite the previous file every day, that's easy: just give the file a static name, leaving out the dynamic "date" part, i.e. instead of mysql_$(date +"%F").tar.gz name it something like mysql_daily.tar.gz. Make sure you change all 3 instances in the script.
-
08-19-2005, 02:20 PM #11 Web Hosting Master
- Join Date
- Sep 2004
- Location
- Fairborn, ohio
- Posts
- 923
Excellent. Sorry for being a newb.
Thanks RambOrc!
-
01-23-2006, 02:45 PM #12 Web Hosting Master
- Join Date
- Nov 2004
- Posts
- 675
So these scripts automatically make a full backup in cPanel?
Just imagine, if the world was like an online community. Where people help each other, just because they can.
-
01-24-2006, 06:32 AM #13 Disabled
- Join Date
- Aug 2004
- Location
- Zurich, Switzerland
- Posts
- 774
Nope. What they do is copy the cPanel-created full backup files to other locations so that they aren't overwritten the next time the backup runs.
BTW, I've modified the last two scripts in the meantime; they now clean up the old local backup files as they're supposed to, keeping only the newest archive on the server:
Code:
#!/bin/sh
cd /backup/rotation
rm -f mysql_*.tar.gz
cd /var/lib
tar -zcf /backup/rotation/mysql_$(date +"%F").tar.gz mysql
cd /backup/rotation
ftp -in <<EOF
open 127.0.0.1
user yourusername yourpassword
bin
hash
prompt
put mysql_$(date +"%F").tar.gz
bye
EOF
Code:
#!/bin/sh
cd /backup/rotation
rm -f cpbackup_*.tar.gz
cd /backup/cpbackup
tar -zcf /backup/rotation/cpbackup_$(date +"%F").tar.gz daily
cd /backup/rotation
ftp -in <<EOF
open 127.0.0.1
user yourusername yourpassword
bin
hash
prompt
put cpbackup_$(date +"%F").tar.gz
bye
EOF
-
01-12-2007, 09:26 AM #14 New Member
- Join Date
- Mar 2006
- Posts
- 2
A minor modification
These are great, simple backup scripts. Thanks for sharing.
I've made a simple modification in my implementation, however: setting a variable to the date part instead of calling date each time.
e.g.
Code:
#!/bin/sh
rm -rf /backup/rotation/monthly/$(date +"%m")
mkdir /backup/rotation/monthly/$(date +"%m")
cp -R /backup/cpbackup/daily /backup/rotation/monthly/$(date +"%m")
cp -R /var/lib/mysql /backup/rotation/monthly/$(date +"%m")
cp -R /home /backup/rotation/monthly/$(date +"%m")
Code:
#!/bin/sh
month=`date +"%m"`
rm -rf /backup/rotation/monthly/$month
mkdir /backup/rotation/monthly/$month
cp -R /backup/cpbackup/daily /backup/rotation/monthly/$month
cp -R /var/lib/mysql /backup/rotation/monthly/$month
cp -R /home /backup/rotation/monthly/$month
-
01-12-2007, 09:48 AM #15 Newbie
- Join Date
- Jan 2007
- Posts
- 11
Great Work man, thanks a lot
-
01-31-2007, 01:58 PM #16 Web Hosting Master
- Join Date
- Aug 2005
- Location
- Canada
- Posts
- 862
Originally Posted by edkay
And I'd go even further. I'm lazy.
Code:
#!/bin/sh
m=/backup/rotation/monthly/`date +"%m"`
rm -rf $m
mkdir $m
cp -R /backup/cpbackup/daily $m
cp -R /var/lib/mysql $m
cp -R /home $m
-
04-13-2009, 12:06 PM #17 New Member
- Join Date
- Apr 2009
- Posts
- 1
Because this topic pops up at the top of Google when looking for backup rotation scripts, I decided it's worth posting in it regardless of its age, because some beginning Linux users will get screwed if they implement this exactly. Sorry for the bump though.
I just have to warn against doing MySQL backups like that. Just copying the /var/lib/mysql folder is NOT the right way to do it. When the database is active, there's a very big chance you'll get corrupted files.
Also, when a client damages their database, you cannot just throw the files back, because MySQL won't let you. You would have to load them onto another MySQL server, export the database you need, and import that on the original server. So this way is not easy, not safe (there's a big chance the DB will be corrupt) and just wrong in my eyes.
Here (sourceforge.net/projects/automysqlbackup/) you can find a MySQL backup script that works like it should. It dumps the databases to files that you can import back into your database by hand or with phpMyAdmin, and most importantly, they won't be corrupt.
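The core idea behind such dump-based scripts can be sketched like this (the helper name is made up, credentials and paths are placeholders, and the real automysqlbackup does much more; DUMPCMD is a variable so you can plug in your own options):

```shell
# Sketch: one compressed mysqldump per database instead of copying
# /var/lib/mysql. Override DUMPCMD with your credentials, e.g.
# DUMPCMD="mysqldump --single-transaction -u root -pSECRET".
DUMPCMD=${DUMPCMD:-"mysqldump --single-transaction"}

dump_dbs() {
    outdir=$1; shift
    mkdir -p "$outdir"
    for db in "$@"; do
        $DUMPCMD "$db" | gzip > "$outdir/${db}_$(date +%F).sql.gz"
    done
}
# list the databases and dump each one, e.g.:
# dump_dbs /backup/rotation/sqldump $(mysql -N -e 'SHOW DATABASES' | grep -v information_schema)
```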
Also, doing a full backup every time without incremental updates is not really the way to go if you have lots and lots of data to back up. The point that "once you lose the starting point, the rest is so much dead bytes" is not true in my eyes.
With incremental backups, you will have every file as many times as it has changed. So you'll have most files only once, and others multiple times with differences.
And who says you cannot back up your backup from time to time? It will still take a lot less space that way.
For a nice guide on how to do incremental backups, you can look here (sanitarium.net/golug/rsync_backups.html).
For a complete backup rotation solution based on rsync, you might want to look at dirvish (dirvish.org/).
(bbcode and linking not used because of not having five posts here.)
Last edited by mathijs; 04-13-2009 at 12:15 PM.