FileZilla Forums

Welcome to the official discussion forums for FileZilla

All times are UTC




Post new topic Reply to topic  [ 22 posts ]  Go to page 1, 2  Next
PostPosted: 2010-07-19 20:22 
Offline
500 Command not understood

Joined: 2010-07-19 20:15
Posts: 5
Hey guys, I am using FileZilla 3.3.3 to FTP a huge number of files to a SAN on our local storage. I am getting the following error.

Filezilla Error
Can't create thread (error 8: not enough storage is available to process this command.)


My SAN has over 40TB free, so it's nowhere near out of space. The server running FileZilla has 16 CPUs that are mostly idle and 40GB of RAM, of which less than 5GB is in use. I did some searching on this error and found a Microsoft page dealing with thread issues. If anyone has any suggestions, let me know.

Thanks


PostPosted: 2010-07-20 06:22 
Offline
Site Admin
User avatar

Joined: 2004-02-23 20:49
Posts: 22556
Quote:
Hey guys, I am using FileZilla 3.3.3 to FTP a huge number of files to a SAN on our local storage. I am getting the following error.


What is huge for you? For me, it starts at a million files.

Quote:
Filezilla Error
Can't create thread (error 8: not enough storage is available to process this command.)

It's a local error on the client's machine.

My guess would be some broken DLL loaded into the process that allocates large amounts of memory with each thread but doesn't free it.


PostPosted: 2010-07-20 21:19 
Offline
500 Command not understood

Joined: 2010-07-19 20:15
Posts: 5
Hey, thanks for the reply. Yes, to me a huge set of files is millions. Right now I have 2.6 million in the queue and it's still growing. So this DLL you mention is not part of FileZilla? When the error occurs, nothing is added to the Failed Transfers list.


PostPosted: 2010-07-20 22:09 
Offline
Site Admin
User avatar

Joined: 2004-02-23 20:49
Posts: 22556
Quote:
So this DLL you mention is not part of Filezilla? When the error does occur nothing is added to the Failed Transfers list.


Yes.

I would try a fresh Windows installation, that way there are no third-party DLLs interfering.


PostPosted: 2010-07-20 22:48 
Offline
500 Command not understood

Joined: 2010-07-19 20:15
Posts: 5
Well, it's a new installation already, with very little software installed. It's a Windows Server 2008 SP1 machine with McAfee, .NET Framework 3.5 SP1, MS SQL Server 2008 Native Client, Firefox, some software for the SAN and the file system used to connect to it, and two other FTP programs I was evaluating (FlashFXP and FTP Synchronizer).

I have been watching filezilla.exe in the Task Manager, and the memory it's using is steadily growing. I had to start over, because when the queue got to about 3 million I started to see that error several times and then it crashed. Some of the files it was trying to download just got stuck before the crash. As I am writing this, I only have about 350,000 files in the queue and filezilla.exe's memory is about 360,000K and growing.

Right now it's solid, building the queue and downloading with no problems. But I bet once filezilla.exe's memory usage gets too high and the queue gets larger, I will start to see that error again and it will crash shortly after. I assume the memory should increase while it's building the queue, but it looks like there is a limit, and I am not sure why.

thanks


PostPosted: 2010-07-21 05:55 
Offline
226 Transfer OK
User avatar

Joined: 2006-05-01 03:28
Posts: 19663
Location: Germany
What memory consumption are we talking about? A single 32-bit process cannot allocate and use more than 2GiB (3GiB with the /3GB switch) of memory. So that would be one case where a native 64-bit build of FileZilla might help. Damn you, wxWidgets!

Otherwise, the overwhelming majority of FileZilla users will never reach such levels.

_________________
FTP connection problems? Do yourself a favor and read Network Configuration.
All FileZilla products fully support IPv6. http://worldipv6launch.org
All support requests per PM will be ignored!


PostPosted: 2010-07-21 06:16 
Offline
Site Admin
User avatar

Joined: 2004-02-23 20:49
Posts: 22556
Time to somehow cut down on memory consumption without sacrificing too much performance.


PostPosted: 2010-07-21 15:25 
Offline
500 Command not understood

Joined: 2010-07-19 20:15
Posts: 5
OK, I let it run overnight one more time; the queue is at 2.5 million files and still building, and I am getting several of those memory errors. I have it set to download 10 files at a time, and it looks like 7 of them are stuck, so it's only downloading 3 at a time now. The Task Manager says filezilla.exe is up to 1,895,896K of memory (Private Working Set). So it looks like the queue is held only in memory?
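A rough sanity check using only the numbers reported above (a back-of-the-envelope sketch, not a measurement of FileZilla internals) shows why the crash tends to happen around 3 million queued files in a 32-bit process:

```python
# Numbers taken from this post: 1,895,896 KiB private working set
# with 2.5 million files in the queue.
private_working_set_kib = 1_895_896
queued_files = 2_500_000

# Average process memory consumed per queued file.
bytes_per_entry = private_working_set_kib * 1024 / queued_files
print(f"~{bytes_per_entry:.0f} bytes of process memory per queued file")

# A 32-bit Windows process gets 2 GiB of user address space by default,
# so estimate how many queued files would exhaust it at this rate.
limit_files = 2 * 1024**3 / bytes_per_entry
print(f"~{limit_files / 1e6:.1f} million files until the 2 GiB limit")
```

At roughly 777 bytes per entry, the 2 GiB ceiling lands near 2.8 million files, which matches the reported crashes at around 3 million.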

One idea would be to write the queue to a file instead of RAM, but then you could have the issue of multiple processes writing to that file, and when a file is downloaded you have to remove it from the queue file. Could be a bit tricky. Another idea, which might be easier, would be a maximum number of files allowed in the in-memory queue at one time. Say the max queue size is x files but you are trying to download 2x files (twice the max). Once the in-memory queue hits x files, the process that builds the queue pauses and waits for the download process to download a file and remove it from the queue; then the next file is added. The queue will bounce between x and x-1 for a while, until the total number of files left to download is less than x and the queue-building process finishes. This way you would never hit the memory limit again.
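The bounded-queue idea described above is a classic producer/consumer pattern; here is a minimal sketch using Python's standard library (names are illustrative, this is not FileZilla code):

```python
import queue
import threading

MAX_QUEUE = 1000  # "x" in the description above: cap on in-memory entries

transfer_queue = queue.Queue(maxsize=MAX_QUEUE)

def build_queue(filenames):
    """Producer: put() blocks automatically once the queue holds MAX_QUEUE
    items, resuming only after the downloader removes one."""
    for name in filenames:
        transfer_queue.put(name)   # blocks while the queue is full
    transfer_queue.put(None)       # sentinel: no more files

def download_files(downloaded):
    """Consumer: drains the queue one file at a time."""
    while True:
        name = transfer_queue.get()  # blocks while the queue is empty
        if name is None:
            break
        downloaded.append(name)      # placeholder for the real transfer
        transfer_queue.task_done()

downloaded = []
names = (f"file_{i}" for i in range(5000))  # lazy source, 5x the queue cap
producer = threading.Thread(target=build_queue, args=(names,))
consumer = threading.Thread(target=download_files, args=(downloaded,))
producer.start(); consumer.start()
producer.join(); consumer.join()
print(len(downloaded))  # 5000, while never holding more than 1000 in memory
```

The bounded `Queue` gives exactly the "pause the builder until the downloader catches up" behaviour, with no extra bookkeeping.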

I just did some quick math based on the number of folders and files I was trying to download, and I think it's about 4 million files, so this is not something the average user will be doing. :-)

For now I am going to try selecting a few of the sub-folders at a time to download; that might work better.

thanks for the help guys


PostPosted: 2010-07-21 18:34 
Offline
Site Admin
User avatar

Joined: 2004-02-23 20:49
Posts: 22556
I think the directory listing cache is the culprit; right now it is not pruned.


PostPosted: 2010-07-21 21:23 
Offline
Site Admin
User avatar

Joined: 2004-02-23 20:49
Posts: 22556
Step 1: Optimizing existing structures.

The first structure I looked at was a single directory listing entry; 16 bytes shaved off that one.
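The post doesn't say how those bytes were found, but alignment padding from field ordering is a typical source of such savings in C++ structs. An illustrative sketch (hypothetical fields, not FileZilla's actual listing entry), using ctypes to make the padding visible:

```python
import ctypes

class ListingEntryPadded(ctypes.Structure):
    # Hypothetical layout: interleaving 1-byte flags with 8-byte fields
    # forces the compiler to insert padding before each 8-byte member.
    _fields_ = [("is_dir", ctypes.c_int8),
                ("size", ctypes.c_int64),
                ("has_time", ctypes.c_int8),
                ("mtime", ctypes.c_int64)]

class ListingEntryPacked(ctypes.Structure):
    # Same fields, 8-byte members first: the two flag bytes share one slot.
    _fields_ = [("size", ctypes.c_int64),
                ("mtime", ctypes.c_int64),
                ("is_dir", ctypes.c_int8),
                ("has_time", ctypes.c_int8)]

print(ctypes.sizeof(ListingEntryPadded))  # 32 on a typical 64-bit ABI
print(ctypes.sizeof(ListingEntryPacked))  # 24 on the same ABI: 8 bytes saved
```

Eight bytes saved per entry times four million queued files is already about 30 MiB, so small per-entry savings matter at this scale.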


PostPosted: 2010-07-25 19:06 
Offline
Site Admin
User avatar

Joined: 2004-02-23 20:49
Posts: 22556
Please try the upcoming 2010-07-26 build from http://filezilla-project.org/nightly.php, it should perform better.


PostPosted: 2011-10-28 08:39 
Offline
500 Command not understood

Joined: 2011-10-28 08:22
Posts: 2
For what it is worth, I have exactly the same problem with the latest client build on 32-bit XP. I regularly need to transfer large numbers of files, but when I try to do this with FileZilla, its memory use slowly but continuously climbs until the system runs out of memory. This makes FileZilla unusable for my application, and as HDD sizes continue to grow, more and more users will start hitting this fundamental problem.

It fails with:
Can't create thread (error 8: not enough storage is available to process this command.)
An unhandled exception occurred. Press "Abort" to terminate the program.


PostPosted: 2011-10-28 19:29 
Offline
226 Transfer OK
User avatar

Joined: 2006-05-01 03:28
Posts: 19663
Location: Germany
Like I already wrote: a single 32-bit process cannot address more than 2GiB of memory. The queue is handled by a database engine now (SQLite), so it should behave much better. There are (at least) two possibilities:
1. You're queueing up millions of files and thereby approaching the natural 2GiB limit. This is the case if the memory is released upon clearing the queue.

2. You have a resource leak somewhere in your system, e.g. from a buggy third-party DLL hooked into the FileZilla process. That you could check with a tool such as Process Explorer. A memory leak allocates memory over time and never frees it again, so if you clear the queue and the memory is not given back to the system, a leak is the likely cause.

I still have hope for a 64-bit Windows build of FileZilla some day; in conjunction with a 64-bit OS, that would overcome this limit.
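The post above says the queue is now backed by SQLite. The general idea, keeping queue entries in a database instead of a giant in-RAM list, can be sketched like this (the schema and function names are illustrative, not FileZilla's actual implementation):

```python
import sqlite3

# Database-backed queue: entries live in SQLite (use a file path instead of
# ":memory:" to spill to disk), not in an ever-growing Python list.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE queue (id INTEGER PRIMARY KEY, path TEXT)")

def enqueue(path):
    conn.execute("INSERT INTO queue (path) VALUES (?)", (path,))

def dequeue():
    """Pop the oldest entry, or return None when the queue is empty."""
    row = conn.execute(
        "SELECT id, path FROM queue ORDER BY id LIMIT 1").fetchone()
    if row is None:
        return None
    conn.execute("DELETE FROM queue WHERE id = ?", (row[0],))
    return row[1]

for i in range(3):
    enqueue(f"/data/file_{i}")
print(dequeue())  # /data/file_0
print(dequeue())  # /data/file_1
```

With a file-backed database, process memory stays roughly constant regardless of how many entries are queued, at the cost of disk I/O per operation.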



PostPosted: 2011-10-28 23:45 
Offline
500 Command not understood

Joined: 2011-10-28 08:22
Posts: 2
Thank you for your reply, but we do need to carefully avoid blame-shifting and focus on actually fixing FileZilla's bugs. Blame-shifting is deeply unhelpful and patronises the user.
For example:
Blame the user: "The user has queued too many files and has broken the software. How dare the user do this; it is clearly their fault if they do this and the software crashes."
Blame the user's PC: "There must be some other mystery software on your system that makes FileZilla crash. It's the user's fault, not FileZilla's, and it serves them right."

Actually, all software should cope gracefully and not crash, regardless of what the user asks of it.
I asked FileZilla to transfer 1.2M files by giving it one directory name, but there is no good reason for the amount of system RAM used by FileZilla to be proportional to the number of files in the entire directory tree within that directory.

To address the suggested sources of difficulty, I started a new transfer and monitored it with Windows Task Manager:
Watching FileZilla's process over time, I can see it consuming more and more system memory as it continues to allocate (and not release) memory while it runs. As I write this, it has reached 900K, which is very greedy for an application. I don't have any bogeyman third-party DLLs hooked into FileZilla that we can blame for this, and it is a straightforward WinXP-Pro OS install.

There is no good reason, when a directory upload is requested, for FileZilla to store the name of every file to be transferred and maintain that list in active memory.
It should be enough to store just the name of the directory selected for uploading, plus a temporary reference to the file and sub-directory in the hierarchy that the iterator has currently reached.
Typically the user doesn't need the name of every transferred file to be logged, and the software doesn't really need that full list either.
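The iterator approach described above is essentially what a lazy directory walk gives you: memory use is bounded by directory depth rather than by the total file count. A minimal sketch in Python (illustrative only; FileZilla itself is C++):

```python
import os
import tempfile

def files_to_transfer(root):
    """Generator: yields one path at a time, so memory stays proportional
    to directory depth, never to the total number of files."""
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            yield os.path.join(dirpath, name)

# Demo on a small temporary tree: root/a.txt and root/sub/b.txt.
root = tempfile.mkdtemp()
os.makedirs(os.path.join(root, "sub"))
for p in ("a.txt", os.path.join("sub", "b.txt")):
    open(os.path.join(root, p), "w").close()

# Consume the generator without ever materialising the full list.
count = sum(1 for _ in files_to_transfer(root))
print(count)  # 2
```

A generator like this can feed a bounded transfer queue directly, so even a tree with millions of files never needs a full in-memory listing.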

If it really needs to store the full filename log, it should use a temporary file, not try to keep it all in system RAM.
The HDD uses NTFS, so even if the temporary file grew beyond 4GB it would not be an issue. There is no excuse for using 32-bit variables in the FileZilla code; even 32-bit code can use 64-bit variables to read and write very large files on the HDD.

Regards,
Nicholas Lee


PostPosted: 2011-10-29 03:54 
Offline
226 Transfer OK
User avatar

Joined: 2006-05-01 03:28
Posts: 19663
Location: Germany
Quote:
Blame the user: "The user has queued too many files and has broken the software. How dare the user do this, it is clearly their fault if they do this and the software crashes."
Nobody has.

Quote:
Blame the user's PC: "There must be some other mystery software on your system that makes FileZilla crash. It's the user's fault not FileZilla's, and it serves them right"
It is a possibility; many users install lots of crap on their systems, with DLLs hooking into other processes. If such a DLL then causes a problem, it looks like FileZilla did it. That's not blame-shifting; it has happened before.

The "you" was not directed just at you; it was meant as general advice. And as I already said: these are possibilities, and there may be more reasons.

Quote:
To address the suggested sources of difficulty, I started a new transfer and monitored it with Windows Task Manager:
Watching FileZilla's process over time, I can see it consuming more and more system memory as it continues to allocate (and not release) memory while it runs. As I write this, it has reached 900K, which is very greedy for an application.
You probably mean 900M, not K. Windows Task Manager isn't good for this kind of monitoring, as it only displays the main process and doesn't show child processes and threads, so any third-party DLL is not visible there. That's why I mentioned Process Explorer. Use it and you'll see everything.

Quote:
There is no excuse for using 32-bit variables in the FileZilla code. Even 32 -bit code can use 64-bit variables to read and write to very large files on the HDD.
FileZilla already uses 64-bit variables. The limit is main memory, not the HDD or variable sizes. The developer must have had a reason to implement it the way it is now; I can't say much about the issue, except that keeping the queue in memory is the fastest method. The proper solution would be a 64-bit build of FileZilla (not possible right now for Windows users; *NIX users already have it), which can address plenty of memory (and the pagefile, in case it runs out of physical memory).






Powered by phpBB® Forum Software © phpBB Group