  1. #1
    Join Date
    Mar 2008
    Posts
    372

    un-tar big file - big headache - any way to overcome???

    un-tar big file - big headache - any way to overcome???
    I was un-tarring a 6.622 GB file; the extracted size should be around 14.4 GB, I think. Most of the contents are .jpg image files.

    It's taking a hell of a lot of time, not to mention the pauses of unspecified length before it starts processing again.

    I started the decompression around 40-50 minutes ago and the process is still running.

    Using CentOS 5, SSH with root access.

    Is there any way I can overcome this problem in the future? I can't afford to spend so much time on my friend's VPS just to decompress files.

  2. #2
    Join Date
    Mar 2009
    Posts
    568
    Quote Originally Posted by koolnhot View Post
    un-tar big file - big headache - any way to overcome???
    I was un-tarring a 6.622 GB file; the extracted size should be around 14.4 GB, I think. Most of the contents are .jpg image files. [...]
    Faster hardware would be my only guess, but since it's a VPS, your friend can't do much other than move to a different provider.

    VPS systems are shared by nature, so you will be fighting for disk resources with potentially a dozen or more other customers on the same machine. These systems are typically built with standard 7.2k RPM hard drives, which don't handle random access well. That works fine for a typical hosting environment where disk activity is usually low. If the VPS used 10k or 15k RPM SAS hard drives there would be far less chance of this sort of contention, but those are few and far between. 7.2k RPM drives could handle this type of load too, but not alongside lots of other users -- you would have to be on a dedicated server with your own spindles.

    --Chris
    The Object Zone - Your Windows Server Specialists for more than twenty years - http://www.object-zone.net/
    Services: Contract Server Management, Desktop Support Services, IT/VoIP Consulting, Cloud Migration, and Custom ASP.net and Mobile Application Development

  3. #3
    It's just too bad, I'm afraid. Untarring big files takes a long time; you just have to get faster hardware, or do it overnight when you don't mind the wait.

    To make sure it doesn't use up too many resources, you can use "nice" and "ionice" before the tar command to lower its priority -- see the sketch below. Type "man ionice" or "man nice" in Linux (or similar) to find out more.
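    A minimal sketch of that, assuming a gzipped archive; "backup.tar.gz" is just a placeholder name:

        # Run tar at the lowest CPU priority (nice 19) and in the idle
        # I/O class (ionice -c3), so other users' processes on the box
        # get served first. Replace backup.tar.gz with your archive.
        nice -n 19 ionice -c3 tar -zxf backup.tar.gz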

  4. #4
    Join Date
    Mar 2008
    Posts
    372
    Oh, so this will have to continue until he moves to a dedicated server.
    Damn, this is going to kill me whenever I try to decompress big files on his VPS.
    And since he is on an unmanaged plan, that's going to add even more pain to my woes.

  5. #5
    Join Date
    Mar 2008
    Posts
    372
    Thanks for the info, Stephen. I'll keep this in mind and queue up the un-tar overnight in future; I've decided not to un-tar these big files during my working hours.

  6. #6
    You can also use the "nohup" command: start the untar with it, then log out, and it will keep running.

    That, combined with running it overnight and using ionice and/or nice to lower the process priority, should sort out all the problems :-). A combined sketch is below.
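    Putting the pieces together -- a minimal sketch, again with a placeholder archive name:

        # nohup detaches the job from the terminal so it survives logout;
        # its output goes to nohup.out by default. The trailing & puts it
        # in the background immediately.
        nohup nice -n 19 ionice -c3 tar -zxf backup.tar.gz &

        # After logging back in, check whether it is still running:
        ps aux | grep 'tar -zxf'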

  7. #7
    Join Date
    Jun 2009
    Location
    Baile Átha Cliath
    Posts
    186
    Quote Originally Posted by koolnhot View Post
    un-tar big file - big headache - any way to overcome???
    I was un-tarring a 6.622 GB file; the extracted size should be around 14.4 GB, I think. Most of the contents are .jpg image files.
    If it's mostly JPEGs, I'd be surprised if a 7 GB tar file expands to 14 GB of files, as JPEGs are themselves already compressed.

    If you're regularly moving tarfiles full of JPEGs, don't bother compressing them (i.e. 'tar -cf file.tar *.jpeg' rather than 'tar -czf file.tar.gz *.jpeg'). Unpacking is then limited by filesystem read/write speed rather than CPU power. A sketch is below.
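    A minimal sketch of that workflow, with placeholder file names:

        # Create an uncompressed archive; gzip gains little on JPEGs
        # that are already compressed, and skipping it removes the CPU
        # cost on both the packing and unpacking ends.
        tar -cf images.tar *.jpg

        # Extraction is then bound mostly by disk speed, not CPU.
        tar -xf images.tar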

  8. #8
    Join Date
    Jan 2006
    Location
    Ontario, Canada
    Posts
    324
    Make sure you are not using the verbose 'v' flag in your untar operation, as printing every file name slows it down a bit too; tar -zxf bigfile.tar.gz should work a bit faster. You can time the difference yourself, as sketched below.
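    If you want to see the difference, time both forms on the same archive (a second run simply overwrites the already-extracted files):

        # Compare the wall-clock times reported by "time".
        time tar -zxvf bigfile.tar.gz   # verbose: prints every file name
        time tar -zxf bigfile.tar.gz    # quiet: skips the terminal output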

  9. #9
    Join Date
    Mar 2008
    Posts
    372
    Thanks for the suggestions. Nexbyte, you are right: I think I used -zxvf, so I will have to drop the 'v' next time too.

    Remy - not all of the files are .jpg's, but about 90% of them are, as it's an image-related site; there are other files too, such as PHP scripts.

