I am in the process of doing a server migration and need to FTP 230,000 images.
FileZilla doesn't like that. It takes about 35 minutes for the queue even to load, and then I'm lucky if it starts transferring at all. Also, if you save the queue and come back later to import it, it takes just as long to load again.
Another thing: if the program crashes, it just resets the queue as if nothing was transferred. It would be nice if FileZilla removed items from the saved queue as they completed, so the queue would be up to date when reloaded... like CuteFTP does.
I know it's an obscene number of files, but does anyone know a better way to transfer them?
Cheers,
Jeremy
Transferring a huge amount of files
Re: Transferring a huge amount of files
illanti wrote: I am in the process of doing a server migration and need to FTP 230,000 images.....I know it's an obscene amount of file, but does anyone know a better way to transfer this?

Well, failing a tape drive and a hire car, I'd suggest SSHing into the recipient machine and running wget from there. It seems pretty bombproof.
--limit-rate=amount, --tries=number and --waitretry=seconds might be useful options.
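Putting those options together, a sketch of what the pull from the destination server might look like (the hostname, path, and FTP username here are hypothetical placeholders; the rate and retry values are just examples to tune for your link):

```shell
# Run this on the NEW server, e.g. after: ssh you@newserver
# --mirror recurses and preserves timestamps, so an interrupted
# run can be restarted and wget will skip files it already has.
wget --mirror --no-host-directories \
     --limit-rate=500k \
     --tries=5 \
     --waitretry=10 \
     --user=ftpuser --ask-password \
     ftp://oldserver.example.com/images/
```

Because the mirror is restartable, a crash doesn't reset anything the way the FileZilla queue does: rerunning the same command picks up roughly where it left off.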