Without cPanel, is there any way to split a tar archive of a large directory (10K+ files) so it can be created in smaller pieces? My current shared host kills the process when the archive hits 1.5 GB. Is there any other way to tar it, or to use two servers to transfer the files?
You can use tar and split together to make several smaller tars "in flight":
tar cvzf - * | split -b 1000k - filename
The first part tars your files and pipes the output on the fly to the split command; the lone - tells split to read from standard input.
-b makes split cut the stream after a given number of bytes...
1000k sets that size to 1000 KiB per piece...
filename is the output prefix; split appends aa, ab, ac, etc. to it.
Use cat to put them back together again.
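A minimal round-trip sketch of the split-and-rejoin idea (the directory name demo and the prefix parts.tar.gz. are just placeholders):

```shell
set -e
cd "$(mktemp -d)"
mkdir -p demo && echo "hello" > demo/file.txt

# Archive to stdout and split the stream into pieces: parts.tar.gz.aa, ab, ...
tar czf - demo | split -b 1k - parts.tar.gz.

# cat restores the original byte stream in suffix order (aa, ab, ...).
cat parts.tar.gz.* > rejoined.tar.gz

# List the contents to verify the rejoined archive is intact.
tar tzf rejoined.tar.gz
```

The suffixes sort alphabetically, so the shell glob parts.tar.gz.* feeds cat the pieces in the right order.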
tar -cvlf - YOURDIRECTORY/ | split -a 2 -b 500m - tar/big_backup.tar
The reason I needed to do it that way is that I'm on shared hosting with limited resources; that command splits the archive into multiple files, which lets me tar everything. The host just won't let me create a single 1.5 GB tar.
In my case it split the archive into 5 tar files.
I still haven't tried putting them back together again xD.
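Since disk space is also tight on shared hosting, the pieces can be piped straight into tar on the restore side, so the rejoined archive never has to exist as a second full-size file on disk. A sketch with made-up names (data directory, big_backup.tar. prefix):

```shell
set -e
cd "$(mktemp -d)"
mkdir -p data && echo "payload" > data/file.txt

# Same shape as the command above: archive to stdout, split into pieces.
tar -cf - data | split -a 2 -b 1k - big_backup.tar.

# Pretend we're restoring on another machine.
rm -r data

# Stream the pieces directly into tar; no intermediate rejoined file needed.
cat big_backup.tar.* | tar -xf -
cat data/file.txt    # prints "payload"
```

This keeps peak disk usage at roughly the size of the pieces plus the extracted files, instead of pieces plus a full rejoined archive plus the extracted files.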