Net::FTP: 2000 image files

Hi all -

How would Perl handle a job putting 2000+ files using the Net::FTP module? (I don't have any numbers on the actual file sizes.)

I want to run it as a scheduled job every so often. I have experience using Net::FTP, but not with so many files.

Are there any traps to avoid when implementing this kind of program?

Thanks

This will take quite some time, since FTP needs several round trips before each transmission starts... in terms of ping times, it adds up like this:
one round trip takes, say, 50ms;
estimate 5 round trips per transfer, which makes a quarter of a second per file PLUS your actual data;
0.25s * 2000 files = 500 seconds for handshaking alone.

I have no experience using this module, but log your sessions; then you can find out the actual number of round trips...
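For what it's worth, Net::FTP can do that logging for you: pass Debug => 1 and the whole command conversation is echoed to STDERR, so you can count what a single transfer costs. A minimal sketch (host, login, and filename here are placeholders):

use strict;
use warnings;
use Net::FTP;

# Debug => 1 echoes every FTP command and server reply to STDERR,
# which makes the per-file handshake overhead visible.
my $ftp = Net::FTP->new('ftp.example.com', Debug => 1)
    or die "Cannot connect: $@";
$ftp->login('user', 'password') or die "Login failed: ", $ftp->message;
$ftp->binary;
$ftp->put('test.jpg') or warn "put failed: ", $ftp->message;
$ftp->quit;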

Do they all go to the same server, or to different ones?

Anyway, send them one after the other. Don't use threads or multiple processes with that many files.

Some servers kill your script if it runs for too long, though...



I want to run it as a scheduled job every so often


Check at the beginning whether the last job has finished, and don't start if it hasn't. Otherwise you will overload your server soon.
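One common way to do that check (a sketch, not from this thread; the lock-file name is arbitrary) is a non-blocking flock on a lock file, so a newly scheduled run exits immediately if the previous one is still going:

use strict;
use warnings;
use Fcntl qw(:flock);

# If the previous scheduled run still holds the lock, bail out.
open my $lock, '>', 'ftp_upload.lock' or die "Cannot open lock file: $!";
flock($lock, LOCK_EX | LOCK_NB)
    or exit 0;    # previous job still running -- skip this run

# ... do the uploads here; the lock is released when the script exits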

... any other suggestions?

All the files are going to the same server.

I was thinking about using multiple processes because the server might kill a single long-running process. Though, I don't know how I could use multiple processes creatively or effectively for this type of application.

Why did you suggest not using multiple processes?

Very helpful feedback.

If you want to use several processes, don't spawn one for each transfer!! That is what I meant.
Unless you are on an UltraSPARC 10K, more than a few dozen simultaneous processes will kill your CPU and your network throughput.

If you want to use several processes, don't spawn one for each transfer!! That is what I meant.
That would be way crazy. LOL. But it would probably be fun to see how the CPU reacts...

Anyway, how would somebody use multiple processes to tackle this? I was thinking about splitting the file transfers across 2, 3, or 4 separate processes. But I'm still in the learning stages with multi-process programming. And I'm learning it on Windows, which doesn't help since I'm more comfortable with *nix.

Windows technology involved :(

It is not that much different. I am no Perl pro, but fork() should be available in Perl...
Be careful with parent/child processes: you need to "disown" somewhere so the forked process won't be killed if the parent process is...
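A sketch of what that splitting could look like (purely illustrative: host, login, and directory are made up; each child opens its own connection, since an FTP handle can't be shared across processes; note that on Win32 Perl, fork() is emulated with threads):

use strict;
use warnings;
use Net::FTP;

my @files  = glob 'images/*.jpg';    # hypothetical local directory
my $nprocs = 3;                      # 2-4 processes, as discussed above

for my $i (0 .. $nprocs - 1) {
    my $pid = fork();
    die "fork failed: $!" unless defined $pid;
    next if $pid;                    # parent: go spawn the next child

    # child: take every Nth file, with its own connection
    my $ftp = Net::FTP->new('ftp.example.com') or die "connect: $@";
    $ftp->login('user', 'password') or die "login: ", $ftp->message;
    $ftp->binary;
    for my $j (grep { $_ % $nprocs == $i } 0 .. $#files) {
        $ftp->put($files[$j]) or warn "put $files[$j]: ", $ftp->message;
    }
    $ftp->quit;
    exit 0;
}
wait() for 1 .. $nprocs;             # parent reaps all children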

What kind of boxes are you transferring from/to? You might be better served by rsync, which can work over SSH.

Rsync is a very smart client: it will only upload files that have changed, and even then only the PARTS of the files that have changed. It's very, very fast as a consequence.

If you're transferring to a linux/*nix box running SSH, you can most definitely use rsync and cron to do this.
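For illustration only (hypothetical paths and host, and it assumes rsync and SSH are installed on both ends), the whole scheduled job could then collapse to a single rsync invocation, driven from cron or from Perl:

# -a preserves attributes, -z compresses in transit; only changed
# files (and only the changed parts of them) cross the wire.
system('rsync', '-az', 'images/', 'user@host.example.com:/var/www/images/') == 0
    or die "rsync failed: $?";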

Transferring from Windows 2000 to Windows 2000.

fork() is available in Perl. It doesn't, unfortunately, work quite the same way on Windows as on *nix.

Net::FTP might not be the right solution.

You guys are talking this up way too much ;p. Unless it takes 300+ seconds to transfer one image (in which case you've got bigger problems than the ones discussed here), a typical FTP server won't time you out. Just split up the images and run multiple copies of the script at once (since fork() doesn't behave the same on Windows), or use threads to have one script do multiple things at once. You might be able to do an interesting thing by setting up a 'queue' for threads to read from: read all the files you need to upload into an array (the filenames, of course, not the file content itself) and have each thread grab a file, upload it, then grab the next one.
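A rough sketch of that queue idea (not from this thread; host, login, and paths are made up, and each worker thread gets its own connection since a Net::FTP handle shouldn't be shared between threads):

use strict;
use warnings;
use threads;
use Thread::Queue;
use Net::FTP;

my @files   = glob 'images/*.jpg';        # hypothetical local directory
my $q       = Thread::Queue->new(@files); # pre-load the queue with filenames
my $workers = 3;
$q->enqueue(undef) for 1 .. $workers;     # one end-marker per worker

my @threads = map {
    threads->create(sub {
        my $ftp = Net::FTP->new('ftp.example.com') or die "connect: $@";
        $ftp->login('user', 'password') or die "login: ", $ftp->message;
        $ftp->binary;
        # grab a filename, upload it, grab the next one
        while (defined(my $file = $q->dequeue)) {
            $ftp->put($file) or warn "put $file: ", $ftp->message;
        }
        $ftp->quit;
    });
} 1 .. $workers;

$_->join for @threads;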

Also, if you're worried about getting timed out of the server -- Net::FTP will let you know if you can't upload a file for some reason:


unless ($ftp->put('myfile.jpg')) {
    # something went wrong; the server's reply is in $ftp->message
    warn "put failed: ", $ftp->message;
    # perhaps re-establish your connection with the FTP server here
}

I greatly appreciate the feedback.

I like JonLed's suggestion. I read about and tested fork() on Windows 2000 with Perl > 5.6 and it works, although not like it's documented in Lincoln Stein's "Network Programming". That's another threaded discussion.

So, trying the queue thing and forking the job sounds cool.

You haven't begun to live until you've known the true joy of rsync via SSH for transferring files... Look into it next time you need to do something like this on *nix.

But, given that you're on Win32, it would be a slightly larger pain in the butt to accomplish, though it would be possible.
