  1. #1
    Join Date
    Apr 2005
    Location
    Tinterweb
    Posts
    556

    cPanel back-up script

    I have a shared account which I would like to back up via a cron job.
    I know several people have tried to create these types of scripts, but they don't seem to do the following:

    1) Full cPanel back-up (files/mysql etc)
    2) FTP the back-up to a remote server
    3) Delete the back-up on the local server
    4) Delete the old back-up on the remote server

    Does anyone know of a script that will do the above or several scripts I could use?
    Webhostgear has one, but that only moves the back-up file between servers; it doesn't back up or delete.

    Thanks in advance.
    C program run. C program crash. C programmer quit.

  2. #2
    Join Date
    Oct 2004
    Location
    Kerala, India
    Posts
    4,771
    You can find such a script here. I don't remember the exact link to that thread, but a search will get you one.
    David | www.cliffsupport.com
    Affordable Server Management Solutions sales AT cliffsupport DOT com
    CliffWebManager | Access WHM from iPhone and Android

  3. #3
    Join Date
    Feb 2005
    Location
    Australia
    Posts
    5,849
    There are so many different backup scripts because they're easy enough to write and different people have different requirements (and different levels of access).

    For what you want to do, 1, 2 and 3 are all covered by cPanel's own full backup - just view the source of the cPanel form you submit to create a full backup and use curl or similar to submit the same data automatically. But I wouldn't want this running from the account itself, because it would mean storing the cPanel account password in plain text in the script. You might consider doing it from a secure remote machine though.

    4 would be a one-line cron job on the remote machine.
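    For example, run from that remote machine, something along these lines might do it. It's only a rough sketch - the URL, port and field names (dest, email, server, user, pass, port, rdir) are guesses based on the standard x-theme backup form, and all the hostnames and credentials are placeholders - so view the source of your own form and adjust to match:

    Code:
    # Trigger a cPanel full backup, delivered by FTP to the remote server.
    # The form field names below are assumptions taken from the x-theme form;
    # verify them against the source of your own backup form first.
    curl -sk -u cpaneluser:cpanelpass \
      "https://yourserver.example.com:2083/frontend/x/backup/dofullbackup.html" \
      -d "dest=ftp" \
      -d "email=you@example.com" \
      -d "server=remote.example.com" \
      -d "user=ftpuser" \
      -d "pass=ftppass" \
      -d "port=21" \
      -d "rdir=/backups"
    And the cron job for 4 on the remote machine could be as simple as (adjust the path and retention to suit):

    Code:
    # Daily at 5am: delete remote backup copies older than a week
    0 5 * * * find /backups -maxdepth 1 -type f -name 'backup-*.tar.gz' -mtime +7 -exec rm -f {} \;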
    Chris

    "Some problems are so complex that you have to be highly intelligent and well informed just to be undecided about them." - Laurence J. Peter

  4. #4
    Join Date
    Apr 2005
    Location
    Tinterweb
    Posts
    556
    OK, I'm getting somewhere with this; I just need someone to confirm the following:
    The cPanel backup is left on the local server in the account's root directory. I don't want these to build up, so I will need a cron job which finds .tar.gz files that are 2 days old and deletes them. Will the following cron job work:

    find /home/accountname -mtime +2 -type f -name '*tar.gz' -exec rm {} \;
    Thanks
    C program run. C program crash. C programmer quit.

  5. #5
    Join Date
    Feb 2005
    Location
    Australia
    Posts
    5,849
    If you use the cPanel full backup to remote server option, the file that's left in the user directory is a dummy - only 10 or 20 bytes. If you still want to delete these I'd suggest a more specific find command, or you might find your clients get unhappy when your script deletes, for example, /home/username/public_html/our_wedding_pics.tar.gz

    Code:
    find /home/username -maxdepth 1 -type f -name 'backup*username.tar.gz' -exec rm -f {} \;
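    As a crontab entry (assuming a daily run at 4am - adjust the schedule and the username to suit) that would be:

    Code:
    0 4 * * * find /home/username -maxdepth 1 -type f -name 'backup*username.tar.gz' -exec rm -f {} \;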
    Chris

    "Some problems are so complex that you have to be highly intelligent and well informed just to be undecided about them." - Laurence J. Peter

  6. #6
    Join Date
    Apr 2005
    Location
    Tinterweb
    Posts
    556
    Quote Originally Posted by foobic View Post
    If you use the cPanel full backup to remote server option, the file that's left in the user directory is a dummy - only 10 or 20 bytes. If you still want to delete these I'd suggest a more specific find command, or you might find your clients get unhappy when your script deletes, for example, /home/username/public_html/our_wedding_pics.tar.gz

    Code:
    find /home/username -maxdepth 1 -type f -name 'backup*username.tar.gz' -exec rm -f {} \;
    Excellent, just what I needed - I wondered about it deleting .tar.gz files in other directories.
    The reason I wanted the above code is that it leaves the backup file in the root directory (810MB); it isn't a dummy file. I don't know why it's doing that - I'm still waiting for the confirmation email from the script.
    C program run. C program crash. C programmer quit.

  7. #7
    Join Date
    Apr 2005
    Location
    Tinterweb
    Posts
    556
    Errors:
    Code:
    Net::FTP>>> Net::FTP(2.77)
    Net::FTP>>> Exporter(5.58)
    Net::FTP>>> Net::Cmd(2.29)
    Net::FTP>>> IO::Socket::INET(1.29)
    Net::FTP>>> IO::Socket(1.29)
    Net::FTP>>> IO::Handle(1.25)
    Net::FTP=GLOB(0x9e81600)<<< 220 Gene6 FTP Server v3.9.0 (Build 2) ready...
    Net::FTP=GLOB(0x9e81600)>>> USER userid
    Net::FTP=GLOB(0x9e81600)<<< 331 Password required for userid.
    Net::FTP=GLOB(0x9e81600)>>> PASS ....
    Net::FTP=GLOB(0x9e81600)<<< 230 User userid logged in.
    Net::FTP=GLOB(0x9e81600)>>> TYPE I
    Net::FTP=GLOB(0x9e81600)<<< 200 Type set to I.
    Net::FTP=GLOB(0x9e81600)>>> CWD backups
    Net::FTP=GLOB(0x9e81600)<<< 250 CWD command successful. "/backups" is current directory.
    Net::FTP=GLOB(0x9e81600)>>> ALLO 850161427
    Net::FTP=GLOB(0x9e81600)<<< 200 ALLO Ok : 77267644416 bytes available.
    Net::FTP=GLOB(0x9e81600)>>> PASV
    Net::FTP=GLOB(0x9e81600)<<< 227 Entering Passive Mode (83,245,63,74,27,221)
    Net::FTP=GLOB(0x9e81600)>>> STOR backup-9.2.2007_14-24-06_accountname.tar.gz
    Net::FTP=GLOB(0x9e81600)<<< 150 Data connection accepted from ipaddress:56778; transfer starting for /backups/backup-9.2.2007_14-24-06_accountname.tar.gz
    Net::FTP=GLOB(0x9e81600): Timeout at /usr/lib/perl5/5.8.8/Net/FTP/dataconn.pm line 74
    Net::FTP=GLOB(0x9e81600)>>> QUIT
    Net::FTP: Unexpected EOF on command channel at /usr/local/cpanel/bin/ftpput line 35
    I've tried almost everything but don't seem to be able to rectify the above errors. I think this is why I have the full 830MB backup file left on the local server.
    C program run. C program crash. C programmer quit.

  8. #8
    Join Date
    Oct 2004
    Location
    Kerala, India
    Posts
    4,771
    Make sure the higher ports (range 35000 to 60000) are open on the server. If you are using APF, they are most probably blocked.
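    For example, with APF that would mean adding the range to the port lists in /etc/apf/conf.apf and reloading. A sketch only - the variable names and the ports already listed may differ on your install, so check your own conf.apf:

    Code:
    # /etc/apf/conf.apf - allow the passive FTP range (assumed values;
    # keep whatever ports are already listed on your server)
    IG_TCP_CPORTS="21,22,25,53,80,110,143,443,2082,2083,35000_60000"
    EG_TCP_CPORTS="21,25,53,80,443,35000_60000"

    # then reload the firewall
    apf -r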
    David | www.cliffsupport.com
    Affordable Server Management Solutions sales AT cliffsupport DOT com
    CliffWebManager | Access WHM from iPhone and Android

  9. #9
    Join Date
    Feb 2005
    Location
    Australia
    Posts
    5,849
    Looks like it's making the connection so it can't be blocked completely. What's the server at the other end? Can you upload a similar-sized file manually? The scp method might work better if you have that option.
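    If the remote end does run sshd, a sketch of that approach (placeholder hostnames and paths, and it assumes key-based ssh login so no password has to sit in the script):

    Code:
    # Push the cPanel backup over scp, then prune old copies on the remote box
    scp /home/username/backup-*username.tar.gz backupuser@remote.example.com:/backups/
    ssh backupuser@remote.example.com \
      "find /backups -maxdepth 1 -name 'backup-*username.tar.gz' -mtime +7 -exec rm -f {} \;"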
    Chris

    "Some problems are so complex that you have to be highly intelligent and well informed just to be undecided about them." - Laurence J. Peter

  10. #10
    Join Date
    Oct 2004
    Location
    Kerala, India
    Posts
    4,771
    Quote Originally Posted by foobic View Post
    Looks like it's making the connection so it can't be blocked completely.
    I have come across a similar issue and enabling the higher ports sorted it. It makes the connection, but no data is transferred.
    David | www.cliffsupport.com
    Affordable Server Management Solutions sales AT cliffsupport DOT com
    CliffWebManager | Access WHM from iPhone and Android

  11. #11
    Join Date
    Feb 2005
    Location
    Australia
    Posts
    5,849
    Wouldn't that be in active mode, where the FTP server makes a data connection back to a high-numbered local port? This is using passive mode. It could still be the firewall though - check the outbound filtering rules.
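    For instance, a quick way to see what's being filtered outbound (assuming iptables underneath, which is what APF manages):

    Code:
    # List the outbound rules with packet counters
    iptables -L OUTPUT -n -v

    # With APF, also check the egress port list
    grep EG_TCP_CPORTS /etc/apf/conf.apf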
    Chris

    "Some problems are so complex that you have to be highly intelligent and well informed just to be undecided about them." - Laurence J. Peter
