Uploading makes hard drive go crazy

Posted: 2020-05-18 18:00
by fortissimo
Hi, this has been a problem I've noticed with FileZilla for years, and I don't think it's a bug, just a design problem... but extremely annoying. Not sure why it took me this long to report it.

When uploading a file, even with my slow 2 MB/s upload speed, FileZilla accesses the hard drive like 4 times per second (loudly and obnoxiously, I might add). It's time that FileZilla stops putting hard drives through hell just to upload a single 10GB file.

I could understand this behavior back in the days when we only had 4MB of RAM. But with 16GB, 8GB, or even 4GB of RAM, I can't imagine any good reason why FileZilla apparently still has to buffer files in 250K or 500K chunks, or whatever it's doing. It should be buffering 10 or 20MB of data at a time.

Even better, people should be able to specify in the settings how much data to buffer at a time.

Re: Uploading makes hard drive go crazy

Posted: 2020-05-19 08:21
by botg
FileZilla already buffers several MB of data and keeps the buffers topped off for optimal performance.

If you are this annoyed about noise, consider getting an SSD; they are absolutely silent.

Re: Uploading makes hard drive go crazy

Posted: 2020-05-19 17:45
by fortissimo
Buffering "several" megabytes should, at the very least, mean it's buffering 2 or more MB at a time, which would mean it should only have to access the hard drive about once per second, since my upload speed is literally 2.2MB per second.

At the moment it's accessing the drive about 1.7 times per second (17 times in 10 seconds). By my calculations, that means FileZilla is reading roughly 1.29MB per access. When I pause the transfer, my hard drive is totally silent, so this is all FileZilla.
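Here's the back-of-the-envelope math behind those numbers, as a quick Python check (the 2.2 MB/s and 17-accesses-in-10-seconds figures are just my own measurements, not anything from FileZilla itself):

```python
# Back-of-the-envelope check of the observed disk-access pattern.
upload_speed_mb_s = 2.2   # measured upload speed, MB per second
accesses = 17             # disk accesses I counted...
window_s = 10             # ...over this many seconds

access_rate = accesses / window_s                    # accesses per second
implied_chunk_mb = upload_speed_mb_s / access_rate   # MB read per access

print(f"{access_rate:.1f} accesses/s, ~{implied_chunk_mb:.2f} MB per access")
# prints "1.7 accesses/s, ~1.29 MB per access"
```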

Either FileZilla does not buffer several MB at a time, or FileZilla has a bug that makes it access the hard drive every 1-2MB anyway despite being told to buffer "several" megabytes at a time.

Would it be possible to add an option to have FileZilla buffer chunks of as many MB as the user specifies (perhaps up to 100MB)? Hard drives read large amounts of data way more efficiently than small amounts of data. I'd like to be able to buffer more than just "several" megabytes of data.
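To be concrete about what I'm asking for, the idea could be sketched roughly like this in Python (a sketch only; `upload_in_chunks` and `send` are made-up names to illustrate the behavior, not FileZilla code):

```python
def upload_in_chunks(path, send, chunk_mb=20):
    """Read the source file in large, user-configurable chunks so the
    disk is touched far less often during an upload.

    path     -- file to upload
    send     -- callback that hands one buffered chunk to the network layer
    chunk_mb -- buffer size in MB (the setting I'm asking for)
    """
    chunk_size = chunk_mb * 1024 * 1024
    with open(path, "rb") as f:
        while True:
            chunk = f.read(chunk_size)  # one large sequential read
            if not chunk:
                break                   # end of file reached
            send(chunk)                 # network drains this buffer slowly
```

With a 20MB buffer and my 2.2MB/s upload speed, the drive would only be hit about once every 9 seconds instead of 1.7 times per second.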

Re: Uploading makes hard drive go crazy

Posted: 2020-05-19 22:44
by botg
No, any such functionality would add needless complexity just for dealing with legacy technology in a situation where a far superior solution already exists in the form of SSDs.

Re: Uploading makes hard drive go crazy

Posted: 2020-05-20 00:14
by fortissimo
I have an SSD, but I (and most people I know) don't store the bulk of my data on SSDs. In my case, I have about 12TB. My 500GB SSD is devoted to Windows, applications, and games. Congratulations if you can afford SSDs for all of your data needs, but that means either you have very little data (one SSD covers everything) or you have a really big data budget, with multiple SSDs giving you terabytes of storage.

There is nothing wrong with updating legacy software so it works in a way that makes sense with today's hardware. I just made another suggestion in another thread that would improve FileZilla, but you could easily dismiss it with the same reasoning: FTP is legacy technology, so I guess that's that.

Even if all of my uploading and downloading were done from an SSD, hitting the SSD 1.7 times per second instead of once every 7 seconds is still a vice, even if it's just a small one. There is no benefit to insisting that FileZilla keep buffering data as if we were dealing with the slow DSL or even dialup connections of 15-20 years ago, which is exactly what it does right now. FTP doesn't have to be "legacy" technology in the negative sense of the term - only in the positive sense that it's been around a long time. If the software that uses FTP is updated in creative and novel ways, the technology stays current even though the protocol is old.