  1. #1

    Question Advice on moving 98 GB of data?

    How would you recommend moving 98 GB of data across data centers from a remote location?

    tar+wget, rsync, scp???

    Any pitfalls to watch out for?

    Thanks!

  2. #2
    Recommendation: do it in blocks.
    YourCheapHost.com - Low cost multi domain hosting solutions. [Legal adult content friendly]
    Reliable web site hosting is our motto. We have Alertra stats to back that up.
    Proven provider of high quality shared and reseller accounts since 2002.

  3. #3
    That's rather cumbersome to do, since the directory structure looks like this:

    /Primary Folder
    /Primary Folder/0123456789ABCDEF (16-character random directory names, a few thousand of them actually!)
    /Primary Folder/0123456789ABCDEF/0123456789ABCDEF.file (16-character random file names, again a few thousand of them, though the extensions are all the same!)

  4. #4
    Join Date: Feb 2004 | Posts: 322
    I would split the data into smaller files, about 4.5 GB each, and sftp them. I've moved an 18 GB file before by splitting it into two files of 9 GB each.
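
    Roughly something like this (archive name, chunk size, and host are just placeholders; reassembly happens on the destination):

        # build one archive, then cut it into ~4.5 GB pieces
        tar czf data.tar.gz "/Primary Folder"
        split -b 4500M data.tar.gz data.part.

        # push the pieces with sftp in batch mode (key-based login assumed)
        echo "put data.part.*" | sftp -b - user@remote-host

        # on the far end: cat data.part.* > data.tar.gz && tar xzf data.tar.gz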

  5. #5
    Join Date: Oct 2002 | Posts: 702
    scp has proven reliable for me. I suggest starting a screen session for the scp and then detaching it; that way the transfer won't die if you lose your connection to the server.
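
    Something along these lines (host and destination path are placeholders):

        screen -S copy                                      # start a named screen session
        scp -r "/Primary Folder" user@remote-host:/backup/
        # detach with Ctrl-a d; reattach later with: screen -r copy
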
    ServerMatingProject.com
    The World's first server mating experiment
    We give new meaning to I/O intensive and hot swap

  6. #6

    Re:

    Just adding a few more ideas to think about...

    Is this data something nobody else may see? SCP gives you encryption plus TCP's recovery behaviour, but if the network connection between the two servers is GREAT and you do NOT care about security, just tar+gzip the files and send the archive over plain FTP to skip the encryption overhead.
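
    For instance (curl is used here just as an example FTP client; host and credentials are placeholders):

        # archive and compress, then upload over plain FTP (no encryption overhead)
        tar czf data.tar.gz "/Primary Folder"
        curl -T data.tar.gz ftp://remote-host/backup/ --user ftpuser:ftppass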

    I like the idea of splitting up the files, just in case your transfer dies right at the end. For those "random" directories, can you run some "du" checks to find out which ones take up the most space, and then tar only the ones starting with 1, or 2, or 3, etc.?
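
    A quick sketch of what I mean (paths are placeholders; sort -h needs GNU coreutils):

        # size of each subdirectory, largest first
        du -sh "/Primary Folder"/* | sort -rh | head

        # archive only the directories whose names start with 1
        tar czf part-1.tar.gz "/Primary Folder"/1*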

    You probably want to check your peak/off-peak hours and use those times to transfer your files...

    Try a dry run with a 1 GB file first... play around a little.

    cheers

  7. #7

    Re: Re:

    Originally posted by makesecure
    ...These "random" directories you have, can you do some "df" searches to find out which take up the most space and then find a combination on how to tar only the ones starting with 1, or 2 or 3 ...etc?...
    Yes, and that would bring me down to 36 BIG files to transfer.

    Correction: All directories and filenames are 32 characters long and not 16 digits as noted above!

  8. #8
    Join Date: Apr 2003 | Posts: 959
    Buy the hard drive from the host.
    I guess that would be easier.

    Now I have to type something longer, because WHT won't let me post another message for another 90 seconds!

  9. #9
    Join Date: Nov 2002 | Location: Hot, hot Michigan... | Posts: 3,506
    Wow, so they're *all* in /primary folder/?

    I'd give rsync a try: run it in the background and check on it occasionally. If there are a ton of files, though, the first step (building the file list) is going to take ages.
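
    Something like this keeps it running in the background (host and destination are placeholders):

        # run rsync in the background and log progress to a file
        nohup rsync -avz "/Primary Folder/" user@remote-host:/backup/primary/ > rsync.log 2>&1 &
        tail -f rsync.log    # check on it occasionally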

    Failing that - buy the hard drive.
    Ion Web Services/TronicTech
    http://www.ion-web.com or Unsupported webhosting?!?
    Shared hosting, Reseller accounts, Dedicated Servers, and More
    Proudly hosting since 2002

  10. #10
    Yes, they are all in /primary folder/

    If only I could just buy the HDD; unfortunately, some companies won't easily sell you the drive, and/or only with HUGE markups! (And no, guessing the provider will not get you any rewards!)

  11. #11
    Join Date: Aug 2002 | Location: Denmark | Posts: 432
    Tar and gzip the data, then FTP it. That has worked fine for me. Alternatively, you can split the tar file into several gzipped pieces.
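
    A rough example of the split variant (chunk size and names are just examples):

        # archive, compress, and split into 2 GB chunks in one pass
        tar czf - "/Primary Folder" | split -b 2000M - data.tar.gz.part.
        # on the other side: cat data.tar.gz.part.* | tar xzf -
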
    Checkout www.crunzh.com for nice freeware programs. Including a program for monitoring your webserver.
    Any opinions in this post, unless otherwise noted, are my own personal opinions.

  12. #12
    Join Date: Jul 2003 | Posts: 139
    Rsync it over ssh. Use archival settings and compression. Easy, clean, and secure.
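
    Roughly (host and paths are placeholders; --partial lets a restarted run pick up where it left off):

        # -a preserves permissions/timestamps, -z compresses, ssh encrypts the transfer
        rsync -az --partial --progress -e ssh "/Primary Folder/" user@remote-host:/backup/primary/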

  13. #13
    Tar it, then scp it.

    You may split it up if you don't want to send one big file; otherwise scp on its own is the way to go.
