FileZilla Forums

Welcome to the official discussion forums for FileZilla

All times are UTC




Post new topic  Reply to topic  [ 4 posts ] 
PostPosted: 2011-05-30 15:23 
Offline
500 Command not understood

Joined: 2011-05-30 14:14
Posts: 5
First name: Charlie
Last name: Barrett
I just updated my Client to 3.5 a few days ago...

Last time I used FileZilla, a few months ago, it was to upload some large (40 GB) backup files from my laptop to my desktop. It took a while, but they eventually transferred despite several disconnections. (Occasional disconnections are unfortunately normal for the internal WiFi board in my old Dell D600 laptop, so a plain Windows copy to the desktop's shared folder is impossible, and I can't plug into my FIOS router because all four of its ports are taken.)

The error I'm getting after everything reconnects is "550 can't access file".

After a lot of head-scratching, I finally realized that when the network drops out, the FileZilla server keeps the files open on the old connections (at least until they time out). That would be fine if the client reused the same connections on resume, but even though I use static IP addresses, it creates new connections instead, and I'm not sure why.

Every time the WiFi reconnects and the transfer tries to resume on new connections for each file being transferred simultaneously, the server's old connections (if they haven't timed out yet) still have the file(s) locked, preventing the new connections from writing to them.
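To illustrate what I think is happening, here's a rough sketch (my own illustration, not FileZilla code; FileZilla Server on Windows holds the file with a mandatory share lock, and this Unix example just mimics that with an advisory flock() lock):

```python
# Sketch of the lock conflict: a stale handle blocks the resumed transfer.
import fcntl
import tempfile

path = tempfile.mkstemp()[1]

stale = open(path, "wb")            # stands in for the abandoned connection
fcntl.flock(stale, fcntl.LOCK_EX)   # server still holds the file locked

fresh = open(path, "wb")            # stands in for the resumed transfer
err = None
try:
    # Non-blocking attempt fails immediately, like the rapid-fire 550 errors.
    fcntl.flock(fresh, fcntl.LOCK_EX | fcntl.LOCK_NB)
except BlockingIOError as e:
    err = e
    print("550-style failure: file still locked by the old connection")

# Kicking the old connection (or letting it time out) releases the lock...
fcntl.flock(stale, fcntl.LOCK_UN)
fcntl.flock(fresh, fcntl.LOCK_EX | fcntl.LOCK_NB)
print("resume succeeded")           # ...and the retry goes through
```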

Even though I have the retry delay set to 5 minutes and the error limit set to the maximum (99), the file-access errors occur rapid-fire and quickly exceed the 99-error limit: within 10 seconds of the transfer trying to resume, the client has received its 99 errors, gives up, and moves the queued files to the failure queue.

On the server side, I can see the old abandoned connections displaying a green bar, which resembles a progress bar but isn't shown while a transfer is actually underway. The new connections pop up a similar yellow bar for a fraction of a second, the disk-access error occurs, and the sequence repeats rapid-fire until the client gets its 99 errors and gives up.

If I kick the users for all the old connections showing the green bar, or if they time out, then the new connections can gain access to the file and the transfer resumes normally. It will transfer several hundred megabytes, or a gigabyte or more, until the network drops out again, and then I have to kick all the old connections from the server and re-queue the files.

I guess this is a very lengthy path to asking the question: "Can the server or client be set to reuse the same connections on resume, or else force the old connections to close before it tries to resume transferring the same file from the same IP address?"

I can only guess that something changed since a few months ago, but I really can't say whether it's the new client's fault or something I changed in the settings since then. A few weeks ago I was messing with timeouts, the number of threads, and things like that while transferring a few smaller files, just to see if anything affected the speed. I think I changed all that back to where I had it originally, though.

Thanks
Charlie


Last edited by TXCharlie on 2011-05-30 23:12, edited 1 time in total.

PostPosted: 2011-05-30 15:56 
Offline
Contributor
User avatar

Joined: 2006-05-01 03:28
Posts: 21098
Location: Germany
As long as the old connection of the transfer is still open (not timed out), the file cannot be accessed by new connections.

_________________
FTP connection problems? Do yourself a favor and read Network Configuration.
All FileZilla products fully support IPv6. http://worldipv6launch.org
All support requests per PM will be ignored!


PostPosted: 2011-05-30 23:44 
Offline
500 Command not understood

Joined: 2011-05-30 14:14
Posts: 5
First name: Charlie
Last name: Barrett
Thanks!

Is there any way to make the client wait between retries, when the server responds with a file access error message? That would allow the old connections to time out. Currently, instead of waiting the 5 minutes I have the retry delay set to, the client is retrying several times per second, quickly exceeding the 99-error limit.
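What I'm hoping for is behavior like this sketch (hypothetical pseudologic, not FileZilla's actual code; `try_upload`, `RETRY_DELAY`, and `MAX_ERRORS` are made-up names mirroring the settings I mentioned above):

```python
import time

RETRY_DELAY = 5 * 60   # seconds; mirrors my 5-minute retry-delay setting
MAX_ERRORS = 99        # mirrors the 99-error limit

def upload_with_retry(try_upload, delay=RETRY_DELAY, max_errors=MAX_ERRORS):
    """Retry try_upload(), sleeping between attempts so the server's
    stale connection has time to expire. Returns the attempt number
    that succeeded, or None after max_errors failures."""
    for attempt in range(1, max_errors + 1):
        if try_upload():
            return attempt
        time.sleep(delay)   # wait, instead of retrying several times a second
    return None
```

With a delay actually enforced between the 550 responses, even two or three retries would outlast the stale connection's timeout.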

Maybe if I set the connection timeout to almost nothing, it will close the old connections before the wireless network has time to reconnect. I'll try that and see if there's a "sweet spot" for the timeout setting. I'm not sure why I didn't think of that before :roll: :wink:


PostPosted: 2011-05-31 06:06 
Offline
Site Admin
User avatar

Joined: 2004-02-23 20:49
Posts: 24688
First name: Tim
Last name: Kosse
The timeouts would have to be decreased on the server to be effective.


Powered by phpBB® Forum Software © phpBB Limited