Category: FTP Help
Recursive mget with command line ftp

hi y'all

just wondering... Let's say I ftp to a machine and cd to dir1.
Now let's say dir1 contains directories dir2, dir3, dir4, which all contain files of the same type (e.g. binaries... yeah, ok, so I'm downloading mp3s off a friend's machine). Anyway, my question is this: is there a way to use mget so that it will bring down the whole tree? Otherwise, I have to create all the directories locally, then cd into each one before ftp'ing in, cd'ing again, mget'ing, and then repeating the whole process for each directory...
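
(The manual dance I'm trying to avoid looks roughly like this for every subdirectory; "prompt" just turns off the per-file y/n question, and friends-machine is a made-up hostname:)

mkdir dir2
ftp friends-machine
ftp> cd dir1/dir2
ftp> lcd dir2
ftp> prompt
ftp> mget *
ftp> quit

...and then the same again for dir3, dir4, and so on.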

I just typed too much, but to summarise:
Can ftp's mget * be made to copy across subdirectories and their contents?

or is there an alternative command line tool?

thanks
Chris

it depends on your client. mine supports this:

ftp> recursive mget *

type "help" to see if "recursive" is supported by your client. i think it should work with any server then :)

recursive not supported. :(
I am told that wget can be used, but I can't get it working in this case, I think because I have to log into the ftp machine as me!

later
Christo

Hi CHR15T0!

from "wget --help" i canīt tell if it supports ftp logins. but i donīt think they left out this important feature.

try to set "--http-user=... --http-passwd=..." and see if it takes the same login for ftp too.

if this does not help, rsync should be able to do the same job. but i donīt have it installed, so you need to see yourself.
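
for example (just a sketch; the hostname and path are made up, newer wget versions also have --ftp-user / --ftp-password, and the rsync line assumes you have ssh access to the box, not just ftp):

wget -r ftp://chris:secret@friends-machine/dir1/
rsync -avz chris@friends-machine:dir1/ ./dir1/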

greetings,

Manuel

Hi,

I know that it's quite late for answering and I think you found the answer a long time ago, but when you search on google for "recursive + ftp + linux" this site is one of the first you get to.

So, in order to use ftp + a login with wget, you just type:


wget -r ftp://user:pass@domain...

(all one string, with no space between user: and pass; the forum software mangled the URL when I typed it normally, so excuse the formatting.)

hope that will help someone

lol - I can't believe I even asked that question - and yeah, you're kinda late, dude :D


christo

Late? Who cares. The right answer is what matters, and it's the difference between having to stare at the screen for hours with FTP and using this one precious command. The result is I'm heading down the pub while my whole collection of websites gets moved from one crappy host to their new home by wget magic. Good work, fella!

:tntworth:

This is real simple. I found you on google via "mget recursive".

For those of you who find this off a google search in the future and have passwords containing characters that might be interpreted by the shell:


wget -r 'ftp://user:pass@domain'

Wrap the whole thing in single quotes.
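
For example, with a hypothetical password like p4$$w0rd! on a made-up host: unquoted, the shell would expand the $$ and try to history-expand the !, but inside single quotes they stay literal:

wget -r 'ftp://user:p4$$w0rd!@ftp.example.com/'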


;)

This thread helped me out a lot - I didn't realize wget could use the FTP protocol.

For those saying that the answer was late: it's important to remember that these threads become a searchable knowledgebase for the entire internet. I don't really post on devshed ever, but I was trying to download an entire folder path recursively over FTP, and this forum popped up when I googled it.

So for those who get mad at users asking silly questions, or at people who wake up old threads with useful information, or at people who ask the same thing as was discussed in another, older (probably difficult to find) thread, remember: your contributions on a popular forum aren't just for the benefit of your fellow forum-goers, but for the internet as a whole!

:cheers:

Thanks,
It was really helpful. One more thing: if the folder you want to get is deep inside (for example at public_html/files/), you can simply use it like this:
wget -r 'ftp://user:pass@ftp.sitename.com/public_html/files/'
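
By default wget will recreate the whole remote path locally (ftp.sitename.com/public_html/files/...); if you'd rather have the files land straight in the current directory, something like -nH and --cut-dirs should trim that off:

wget -r -nH --cut-dirs=2 'ftp://user:pass@ftp.sitename.com/public_html/files/'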

Perry.