-
07-21-2005, 01:46 AM #1 Web Hosting Master
- Join Date
- Jul 2005
- Posts
- 598
How to FTP a directory with files inside it?
Hi,
Does anybody know how to FTP (mput, mget, get, put) a directory with the files inside it? As far as I know, the command-line FTP client can only transfer plain/regular files.
-
07-21-2005, 03:46 AM #2 Hosting provider
- Join Date
- May 2002
- Location
- Moscow
- Posts
- 1,602
You are correct, but you can use wget for the same action, something like "wget -r ftp://username:password@ftp.host.com",
and you will retrieve all the files you have on the FTP host. You can also append a path (folders) after the URL if you only need to recursively download files from the indicated folders.
TK Rustelekom LLC Dedicated server since 2002, RIPE NCC member, LIR
-
07-21-2005, 03:59 AM #3 Web Hosting Master
- Join Date
- Jul 2005
- Posts
- 598
Originally posted by worldhosting
You are correct, but you can use wget for the same action, something like "wget -r ftp://username:password@ftp.host.com",
and you will retrieve all the files you have on the FTP host. You can also append a path (folders) after the URL if you only need to recursively download files from the indicated folders.
Is there any other way to transfer files, like an FTP client where you just click on the main directory and all the files/subdirectories inside it are transferred over?
I hope my words are clear.
worldhosting, I think your method is more suitable for web files. That would be my opinion.
-
07-21-2005, 04:30 AM #4 Web Hosting Master
- Join Date
- Oct 2003
- Posts
- 570
http://www.ncftp.com/ncftp/doc/ncftpget.html (-R) or use wget, as described.
Last edited by aldee; 07-21-2005 at 04:34 AM.
-
07-21-2005, 06:31 AM #5 Web Hosting Guru
- Join Date
- Feb 2005
- Posts
- 335
mget -R <dirname>
-
07-21-2005, 07:38 AM #6 Web Hosting Guru
- Join Date
- Feb 2005
- Posts
- 335
It occurs to me I should qualify that: you can indeed recursively mget/mput with -R, but you would have to enter the directory to do it (if I recall correctly). So, as you asked in your second post, you could simply create a /path/to/folder on your machine, FTP into the /usr/bin dir on the remote system, then mget * to grab all the files into that dir.
ncftp is a much better option if you don't mind installing it, and I generally compress my between-system transfers just because (depending on the file type) it can save you a bit of time.
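For reference, a hedged sketch of the ncftp recursive form mentioned above (the hostname, user, password, and paths below are placeholders, not real credentials):

```shell
# Placeholder host/user/paths; -R tells ncftpget to fetch the remote tree recursively
# into the given local directory.
ncftpget -R -u myuser -p mypass ftp.example.com /local/dir /remote/dir
```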
-
07-21-2005, 11:53 AM #7 Aspiring Evangelist
- Join Date
- Dec 2004
- Posts
- 350
-- from the FTP man page --
mget and mput are not meant to transfer entire directory subtrees of files. That can be done by transferring a tar(1) archive of the subtree (in binary mode).
-
07-21-2005, 01:27 PM #8 Junior Guru Wannabe
- Join Date
- Mar 2005
- Posts
- 93
As nadtz suggested, I think the cleanest and most efficient way would be to compress all your files into one file before moving. This will work great no matter what kind of files, web or system.
In the directory whose files and folders you want to transfer, type:
tar cvf name.tar *
(where name is the name you want to give the file).
Then Gzip the file.
gzip name.tar
Now you will have one compressed file called name.tar.gz
Transfer that file over to where you want it, and extract it using this command.
tar -xzvf name.tar.gz
All files and folders will extract exactly as they were compressed.
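As an end-to-end sketch of the steps above (the directory names and sample file here are made-up examples, and a local copy stands in for the actual FTP transfer):

```shell
# Hypothetical round trip of the tar/gzip method; all paths are examples.
mkdir -p /tmp/ftp_demo/src/sub
echo "hello" > /tmp/ftp_demo/src/sub/file.txt

cd /tmp/ftp_demo/src
tar cvf name.tar *        # archive everything in the current directory
gzip name.tar             # produces name.tar.gz

# "Transfer" the single file (a local copy stands in for the FTP put):
mkdir -p /tmp/ftp_demo/dst
cp name.tar.gz /tmp/ftp_demo/dst/

cd /tmp/ftp_demo/dst
tar -xzvf name.tar.gz     # extracts sub/file.txt exactly as archived
cat sub/file.txt          # prints: hello
```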
-
07-22-2005, 06:09 AM #9 Web Hosting Master
- Join Date
- Apr 2004
- Location
- San Jose
- Posts
- 902
If there's no overriding reason to use FTP, try
rsync -avz /src remote:/dest
man rsync
if this is confusing.