A Segmented Downloads workaround...

Come here to discuss FileZilla and FTP in general

Moderator: Project members

topbanana
450 Internal Error
Posts: 36
Joined: 2012-11-09 02:29

A Segmented Downloads workaround...

#1 Post by topbanana » 2015-03-16 19:51

OK, FileZilla IS the best FTP client out there: it works very well, has a near-perfect user interface, etc... But it doesn't do segmented transfers of big files (due mostly to a lack of empathy, no other detailed reasons given)... BUT it does do Simultaneous Transfers, up to 10 different files at once... So let's make the most of that feature.

We can use the WinSCP client, not to transfer the files (it's a bit clunky at that), but to run remote commands on files in a familiar 'File Explorer' user interface... Select the file(s)/directory, click a button, and with a custom remote command it'll split the file into familiar multi-part rar files. These chunks can then be transferred simultaneously, maxing out your internet connection, especially if it's a bit crap or you're many hops away from your FTP server.

Get WinSCP: http://winscp.net/eng/index.php - "WinSCP is an open source free SFTP client, FTP client, WebDAV client and SCP client for Windows."
Add a Custom Command:

Code:

rar a -r -m0 -v20m "!.rar" "!"
Set it as a 'Remote Command', tick 'Apply to Directories', and add the Custom Commands toolbar.
This command will use Rar (install it on your server) to split the file(s)/directory into 20 MB numbered rar files using no compression, so it runs very quickly. If you multi-select files, they'll each get their own multi-part rar file, nice.
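As a rough sanity check on what `-v20m` produces: rar cuts the archive into fixed-size volumes, so the part count is just a ceiling division. A small sketch (the 130 MB file size is purely an example):

```python
import math

def rar_volumes(size_bytes, volume_mb=20):
    """How many volumes rar's -v<N>m switch produces for a given file size."""
    volume_bytes = volume_mb * 1024 * 1024
    return max(1, math.ceil(size_bytes / volume_bytes))

# A hypothetical 130 MB file with -v20m:
print(rar_volumes(130 * 1024 * 1024))  # 7 volumes: 6 full plus one partial
```

More volumes than your simultaneous-transfer limit (10 in FileZilla) just means the queue drains in waves, which is fine.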

We will still use FileZilla for transferring the files, using Simultaneous Transfers ;-) as it's by far the nicest/best-working FTP client out there (even despite this missing, much-needed functionality).

So now the everyday steps are:
1) Start WinSCP
2) Log in to the server
3) Browse to the file(s)
4) Select them
5) Click on the custom command's toolbar button ... it then gets to work processing (seconds or a minute perhaps)...
6) Start FileZilla
7) Log in to the server
8) Browse to the file(s) and the multi-part rars
9) Select the rars
10) Download them to the local machine ... watching the simultaneous transfers chomping through the data!
11) Unrar the local rar files when completed
12) Delete the rar files from the remote server using WinSCP, as it does it quicker.
Finished.
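For the curious, the simultaneous-transfer half of those steps is essentially a thread pool fanning out over the part files. A sketch only: the `fetch` callable is a stand-in for whatever actually moves one file (an `ftplib` or SFTP retrieval in practice), and all names here are illustrative:

```python
from concurrent.futures import ThreadPoolExecutor

def download_all(filenames, fetch, max_workers=4):
    """Fetch several files concurrently, FileZilla-style.

    `fetch` is any callable taking one filename and returning its data;
    in real use it would wrap an ftplib or SFTP retrieval."""
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        # pool.map preserves input order, so results line up with filenames
        return dict(zip(filenames, pool.map(fetch, filenames)))

# Illustration with a stand-in fetcher instead of a real FTP call:
parts = ["big.part1.rar", "big.part2.rar", "big.part3.rar"]
results = download_all(parts, lambda name: f"<bytes of {name}>")
```

Injecting the fetcher keeps the concurrency logic separate from the protocol details, which is also what makes the sketch easy to test without a server.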

It's a shame that's not just the 5 steps we dream of...

Any other workarounds that are working well???

boco
Contributor
Posts: 25289
Joined: 2006-05-01 03:28
Location: Germany

Re: A Segmented Downloads workaround...

#2 Post by boco » 2015-03-17 07:40

There's a little detail to add: for custom commands to be sent, the server must support Secure Shell (SSH/SSH2/SCP). Many servers don't.
No support requests per PM! You will NOT get any reply!!!
FTP connection problems? Do yourself a favor and read Network Configuration.
FileZilla connection test: https://filezilla-project.org/conntest.php
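As an aside, you can often tell in advance whether a server offers SSH at all: per RFC 4253, an SSH server greets you with an identification string beginning `SSH-`. A minimal probe sketch; the network helper and any host you pass it are illustrative only:

```python
import socket

def is_ssh_banner(data: bytes) -> bool:
    # RFC 4253: an SSH server's greeting line starts with "SSH-"
    return data.startswith(b"SSH-")

def server_speaks_ssh(host: str, port: int = 22, timeout: float = 5.0) -> bool:
    """Probe a host's SSH port and check the greeting; False on any socket error."""
    try:
        with socket.create_connection((host, port), timeout=timeout) as sock:
            return is_ssh_banner(sock.recv(256))
    except OSError:
        return False

print(is_ssh_banner(b"SSH-2.0-OpenSSH_8.9"))  # True: an SSH greeting
print(is_ssh_banner(b"220 ProFTPD Server"))   # False: a plain FTP greeting
```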

topbanana
450 Internal Error
Posts: 36
Joined: 2012-11-09 02:29

Re: A Segmented Downloads workaround...

#3 Post by topbanana » 2015-03-18 07:44

Ah yes, good point.

botg
Site Admin
Posts: 33182
Joined: 2004-02-23 20:49
First name: Tim
Last name: Kosse
Contact:

Re: A Segmented Downloads workaround...

#4 Post by botg » 2015-03-18 08:27

Segmented downloads are an error-prone workaround at best.

FileZilla can easily do fast transfers even when latency is high. Look at the server; something is not right there, otherwise transfers would be fast.

robena
503 Bad sequence of commands
Posts: 19
Joined: 2007-03-20 19:53

Re: A Segmented Downloads workaround...

#5 Post by robena » 2015-08-18 14:48

botg wrote:Segmented downloads are an error-prone workaround at best.

FileZilla can easily do fast transfers even when the latency is high. Look at the server, something is not right there, otherwise transfers would be fast.
That's just not true.

Every server benefits from segmented downloads. Even Filezilla server itself!

I installed FileZilla Server on my PC, connected to a very fast gigabit Internet provider (920/240 Mbit/s). Both the FileZilla and SmartFTP clients are on another PC connected to a 340/340 Mbit/s fiber provider.

SmartFTP runs 10 times faster than Filezilla.

I also regularly use Internet Download Manager. It also runs 6 to 10 times faster in segmented mode than native Firefox downloads.

In segmented mode, I peak at 850 Mbit/s from some filesharing servers. It NEVER goes that fast natively, peaking at 80 Mbit/s at best.

So, OK, you don't want to add this feature to FileZilla. It's your product (a great one, that aside), it's your choice, it's free; you are perfectly justified to do whatever YOU want with it, it's yours, period.

But, please, don't insult your users with this nonsense. Segmented transfers are useful and have become an absolute necessity with today's fast fiber-optic network access.

botg
Site Admin
Posts: 33182
Joined: 2004-02-23 20:49
First name: Tim
Last name: Kosse
Contact:

Re: A Segmented Downloads workaround...

#6 Post by botg » 2015-08-18 15:23

You might need to tune your advertised receive window. How large is your latency to the server during transfer?
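For readers wondering why the receive window matters: a single TCP connection cannot move more than window divided by round-trip time bytes per second, so the window needed to fill a link is the bandwidth-delay product. A small sketch; the 340 Mbit/s figure is the fiber link from the post above, while the 150 ms RTT is purely an assumed example value:

```python
def required_window_bytes(link_mbit_per_s, rtt_ms):
    """Bandwidth-delay product: the receive window a single TCP
    connection needs to keep a link full at a given round-trip time."""
    return int(link_mbit_per_s * 1e6 / 8 * rtt_ms / 1e3)

# 340 Mbit/s at an assumed 150 ms RTT:
print(required_window_bytes(340, 150))  # 6375000 bytes, about 6.4 MB
```

A window much smaller than that caps the connection well below link speed, no matter how fat the pipe is.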

topbanana
450 Internal Error
Posts: 36
Joined: 2012-11-09 02:29

Re: A Segmented Downloads workaround...

#7 Post by topbanana » 2015-08-20 09:09

botg wrote:You might need to tune your advertised receive window. How large is your latency to the server during transfer?
You guys keep going on about changing the settings of our clients or the server, but for many (if not most) of us the problems lie between us and our servers, and we have zero control over the ISPs, networks, and hosting companies in between.

If we do Simultaneous Transfers it goes much, much faster. FACT.

If we had Segmented Downloads, it would go much, much faster. FACT.



e.g. I'm currently on an island in Indonesia, using a 3G modem or a mobile phone's 3G hotspot. Late at night I can almost saturate my connection if I use Simultaneous Transfers, with FZ or IDM or whatever... easy. If I transfer a single file, I get maybe 1/5 to 1/4 of that throughput. Every time. 100% repeatable.
Then, a few weeks ago, something changed between me and my server. Perhaps Indonesia got some new fibres, perhaps the cellphone company improved the backhaul from the cell site on my island, perhaps my hosting company in France stopped throttling their internal traffic... I have absolutely no idea. So now, amazingly, a single-file transfer in FileZilla can sometimes (and only sometimes) run at about 80% of my link speed! WOW! I hadn't changed a thing on my client or server. But if I start adding Simultaneous Transfers (2, 3, or 4), then I can saturate my connection and get things done much, much quicker.

Wherever I live, be it in a massive city with great connectivity or on a tiny island in Indonesia, I always see the same slow single transfers and fast simultaneous transfers.

You could make a lot of people very happy and save them a lot of time by implementing Segmented Downloads.

botg
Site Admin
Posts: 33182
Joined: 2004-02-23 20:49
First name: Tim
Last name: Kosse
Contact:

Re: A Segmented Downloads workaround...

#8 Post by botg » 2015-08-20 09:46

You didn't answer the relevant question: What's your latency to the server during a transfer?

topbanana
450 Internal Error
Posts: 36
Joined: 2012-11-09 02:29

Re: A Segmented Downloads workaround...

#9 Post by topbanana » 2015-08-20 16:44

botg wrote:You didn't answer the relevant question: What's your latency to the server during a transfer?
I thought you were asking this of Robena.

You didn't answer the relevant question: what are the full technical details of the reasoning behind not implementing Segmented Downloads, when so many other clients have done so and the evidence of faster transfer speeds is so overwhelming and easy to reproduce?

botg
Site Admin
Posts: 33182
Joined: 2004-02-23 20:49
First name: Tim
Last name: Kosse
Contact:

Re: A Segmented Downloads workaround...

#10 Post by botg » 2015-08-20 17:00

The main reasons:
- Segmented transfers add significant overhead.
- With proper configuration, segmented transfers are actually slower than single transfers, especially when using rotating disks as the storage device.
- If the bottleneck is total available bandwidth, segmented transfers steal bandwidth from those users not using them. Total bandwidth across all users, however, does not go up. Ultimately it would compel everyone to use segmented transfers to steal bandwidth from each other, until congestive collapse occurs and every bit of bandwidth is used up by overhead.
- At least for uploads, many servers use temporary files or truncate the file when starting a transfer, which is inherently incompatible with segmented transfers yet not detectable client-side, due to the following issue:
- Last but not least, FTP does not have any means of verifying file integrity, so using segmented transfers will lead to corruption. The only way to verify a file is to transfer it in the other direction and compare it with the original. Nobody wants to waste that amount of bandwidth.
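Since the workaround earlier in the thread already runs remote commands over SSH, one partial mitigation for the integrity gap is checksum comparison instead of a reverse transfer: run `sha256sum` on the server (over SSH, where available) and hash the downloaded copy locally. A local-side sketch only; the remote digest is assumed to come from that server-side command:

```python
import hashlib

def sha256_of(path, chunk_size=1 << 20):
    """Stream-hash a local file so big downloads aren't read into memory at once."""
    digest = hashlib.sha256()
    with open(path, "rb") as fh:
        for chunk in iter(lambda: fh.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def matches_remote(path, remote_hexdigest):
    """Compare against the digest `sha256sum <file>` printed on the server."""
    return sha256_of(path) == remote_hexdigest.strip().lower()
```

If the server only speaks plain FTP with no shell access, this doesn't help, which is exactly the point being made above.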

Nytron
500 Command not understood
Posts: 3
Joined: 2016-01-05 10:52

Re: A Segmented Downloads workaround...

#11 Post by Nytron » 2016-01-05 11:10

botg wrote:The main reasons:
- Segmented transfers add significant overhead.
- With proper configuration, segmented transfers are actually slower than single transfers, especially when using rotating disks as the storage device.
- If the bottleneck is total available bandwidth, segmented transfers steal bandwidth from those users not using them. Total bandwidth across all users, however, does not go up. Ultimately it would compel everyone to use segmented transfers to steal bandwidth from each other, until congestive collapse occurs and every bit of bandwidth is used up by overhead.
- At least for uploads, many servers use temporary files or truncate the file when starting a transfer, which is inherently incompatible with segmented transfers yet not detectable client-side, due to the following issue:
- Last but not least, FTP does not have any means of verifying file integrity, so using segmented transfers will lead to corruption. The only way to verify a file is to transfer it in the other direction and compare it with the original. Nobody wants to waste that amount of bandwidth.
If you have the client and server geographically close to each other, with above-average internet speeds on each end, then Segmented Downloading may not be necessary. The FileZilla devs are ignoring the case where a user might download files from a server in another country/continent. You know... the entire point of the internet? Not adding Segmented Downloads as a feature to FileZilla is a giant oversight.

Depending on your location, the server's location, the ISP(s) involved, and how they choose to route traffic, sometimes a server simply cannot provide full speed on a single file download. No matter what you do, you cannot change this. By enabling Segmented Downloading, you can increase speeds tremendously. When you can increase your download speed from 300 kB/s to 1.8 MB/s, do the shortcomings/excuses that you listed even matter? Other clients have managed to pull off Segmented Downloading with no noticeable drawbacks whatsoever. It comes down to how many hops, the geography, and the ISPs involved. So it is not an issue with the user configuration. Your attempts to belittle Segmented Downloading are very poor form. FileZilla is the best FTP client I've used, but now I have to look elsewhere.

botg
Site Admin
Posts: 33182
Joined: 2004-02-23 20:49
First name: Tim
Last name: Kosse
Contact:

Re: A Segmented Downloads workaround...

#12 Post by botg » 2016-01-05 14:05

In its default configuration, FileZilla can completely saturate a 100Mbit/s link at 300ms latency. In 300ms you can easily contact a server on the exact opposite side of the earth and obtain a reply.

Let's look at the reasons for slowness, why segmented downloads are detrimental, and how to fix the issue properly:
- Badly configured servers. Just fix the server configuration; segmented downloads are no excuse for a lazy server administrator.
- Congestion on the server itself. Segmented downloads steal capacity from other users of the server. The proper solution is to upgrade the server.
- Congestion on the network. Segmented downloads steal capacity from other users. The proper solution is for the network operators to lay bigger pipes.
- ISP-level throttling. Using segmented downloads is a stopgap measure at best; it only forces the provider to eventually change its throttling algorithms, and then you're back at square one. The solution is to switch to a provider that does not throttle.
- Even higher latency or massive packet loss. Generally a symptom of congestion; solution as above.

Nytron
500 Command not understood
Posts: 3
Joined: 2016-01-05 10:52

Re: A Segmented Downloads workaround...

#13 Post by Nytron » 2016-01-05 14:35

botg wrote:In its default configuration, FileZilla can completely saturate a 100Mbit/s link at 300ms latency. In 300ms you can easily contact a server on the exact opposite side of the earth and obtain a reply.

Let's look at the reasons for slowness, why segmented downloads are detrimental, and how to fix the issue properly:
- Badly configured servers. Just fix the server configuration; segmented downloads are no excuse for a lazy server administrator.
- Congestion on the server itself. Segmented downloads steal capacity from other users of the server. The proper solution is to upgrade the server.
- Congestion on the network. Segmented downloads steal capacity from other users. The proper solution is for the network operators to lay bigger pipes.
- ISP-level throttling. Using segmented downloads is a stopgap measure at best; it only forces the provider to eventually change its throttling algorithms, and then you're back at square one. The solution is to switch to a provider that does not throttle.
- Even higher latency or massive packet loss. Generally a symptom of congestion; solution as above.
It is not as simple as latency and speed. In certain cases, the user has no control over the server or where the packets are being routed. Some ISPs' routing is not ideal, i.e. they route traffic through the worst-quality routes that cost them the least. In some areas, people are far away from the CO. I had a situation a few years ago where I was connecting to one of the largest server providers in Europe from the only DSL provider in my area in the Central USA. My download speed on a single thread was consistently capped at 350 kB/s, but turning on Segmented Downloading took it to a steady 1.8 MB/s. For some people, not most, Segmented Downloading will increase your speed from a trickle to full speed.
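For what it's worth, the arithmetic behind that jump is a simple ceiling division: if each connection is capped near 350 kB/s, the observed 1.8 MB/s aggregate implies about six parallel segments. A sketch using the numbers from the post above:

```python
import math

def segments_needed(target_kb_s, per_connection_kb_s):
    """How many equally-capped connections it takes to reach a target rate."""
    return math.ceil(target_kb_s / per_connection_kb_s)

# 350 kB/s per connection, aiming at the observed 1.8 MB/s (1800 kB/s):
print(segments_needed(1800, 350))  # 6 connections
```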

FileZilla is a client, and Segmented Downloading should be an option that is not on by default. A big portion of the world has monopolized ISPs, i.e. any given area only has one or two choices for internet. So your statement, "Solution is to switch to a provider that does not throttle," is pretty much a slap in the face at best.

eriksp
500 Command not understood
Posts: 2
Joined: 2016-01-12 13:47
First name: Erik

Re: A Segmented Downloads workaround...

#14 Post by eriksp » 2016-01-12 14:12

botg wrote:Segmented downloads are an error-prone workaround at best.

FileZilla can easily do fast transfers even when the latency is high. Look at the server, something is not right there, otherwise transfers would be fast.
In my particular case, this is not the issue.

I'm currently stationed in China, behind the "Great Firewall". This firewall doesn't much like that people want to communicate with the outside world, and it typically rate-limits everything in order to check for "unharmonious" content. Encrypted connections are particularly slow. UDP-based VPNs are extremely unstable; TCP-based ones are subject to the same rate limiting. In order to maximize my measly 12 Mbit/s, I need around 30 parallel connections. My servers, in European data centers, are all on 100 Mbit or Gbit links, and I can easily reach the full 100 Mbit/s, or around 700 Mbit/s on the Gbit ones, when transferring between them.

There is nothing I can do about this, there's no change of ISP or server tuning that will increase the individual connection speed. Without segmented transfers, I don't stand a chance of getting my data down here before it's stale.

Now, I could use a lot of workarounds: pack up the files in smaller chunks and use parallel transfers, move to a sane country, develop a SOCKS proxy using parallel TCP connections, and probably a few other solutions if I think on it more. Most of these, in essence, do exactly what segmented transfers would do, just in a more complicated fashion.
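The "pack up the files in smaller chunks" idea can be sketched without rar at all: cut the payload into fixed-size pieces, move them in parallel, and concatenate them in order on the other side. A minimal in-memory illustration (the sizes are arbitrary):

```python
def split_chunks(data: bytes, chunk_size: int):
    """Cut a byte string into fixed-size pieces; the last may be shorter."""
    return [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]

def reassemble(chunks):
    """Concatenating the pieces in order restores the original exactly."""
    return b"".join(chunks)

payload = b"x" * 1000
chunks = split_chunks(payload, 256)   # 3 full 256-byte pieces plus one of 232
assert reassemble(chunks) == payload  # lossless round trip
```

Order matters and nothing else does, which is why the chunks can travel over as many parallel connections as the network will tolerate.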

I fully understand how much of a kludge segmented transfers are, and I completely agree that in most cases they aren't really needed. But in some cases, like mine, they're invaluable. I really don't see why such a feature is refused based on "no need for it" - there *is* a need, there are clients filling that need, but none of them are of the same quality as FileZilla.

Now I'll go back to my buggy lftp, as CuteFTP is an ugly mess of crashes and brokenness. I'll return to using FileZilla as soon as I get stationed in a functioning country again :)

botg
Site Admin
Posts: 33182
Joined: 2004-02-23 20:49
First name: Tim
Last name: Kosse
Contact:

Re: A Segmented Downloads workaround...

#15 Post by botg » 2016-01-12 14:30

You are dealing with a political issue. Technological workarounds are no solution to political issues.
