File Fragmentation

GateKeeper2000
500 Command not understood
Posts: 2
Joined: 2009-02-18 18:37

File Fragmentation

#1 Post by GateKeeper2000 » 2009-02-18 18:40

Hi,

I have been using FileZilla Client and Server for a while to move fairly large files on my home network, but I have found that the resulting downloads are heavily fragmented (a 350 MB file ends up in 350 chunks, for example!) and require significant defragmentation time afterwards. Is there a setting I am missing, or is this an issue?

GK

botg
Site Admin
Posts: 35509
Joined: 2004-02-23 20:49
First name: Tim
Last name: Kosse

Re: File Fragmentation

#2 Post by botg » 2009-02-18 23:31

Defragment your disks and make sure you've got sufficiently large contiguous blocks of free disk space.

Don't worry much about it; with the advent of cheap SSD drives, disk fragmentation is a relic of the past.

boco
Contributor
Posts: 26913
Joined: 2006-05-01 03:28
Location: Germany

Re: File Fragmentation

#3 Post by boco » 2009-02-19 23:05

OT: Trouble-free SSDs are not ready yet. Right now they suffer from internal fragmentation.

Do you transfer more than one file simultaneously? Depending on your filesystem, they could end up scattered because the OS keeps writing each incoming stream to the next free block.

botg
Site Admin
Posts: 35509
Joined: 2004-02-23 20:49
First name: Tim
Last name: Kosse

Re: File Fragmentation

#4 Post by botg » 2009-02-19 23:25

What is "internal fragmentation"?

boco
Contributor
Posts: 26913
Joined: 2006-05-01 03:28
Location: Germany

Re: File Fragmentation

#5 Post by boco » 2009-02-19 23:47

There is an article about the problem:
http://www.pcper.com/article.php?aid=669
It applies at least to the Intel ones, but may be relevant to the others too, as the underlying technology is very similar.

GateKeeper2000
500 Command not understood
Posts: 2
Joined: 2009-02-18 18:37

Re: File Fragmentation

#6 Post by GateKeeper2000 » 2009-02-20 01:07

This happened as I was consolidating some data onto a drive for archiving. I had filled the drive to ~70%, at which point I ran defrag (the WinXP built-in - not the best, but it does OK) and got 0% fragmentation and fairly contiguous free space. I then copied another 20 GiB of large files (300-500 MiB each) onto the 200 GB disk and thought I would just check the fragmentation; to my horror it was really high - one file of 350 MB was in 312 segments! The transfers were one at a time - it's slower to do it any other way, especially over a 1 Gbps link.

I am not expecting 0% fragmentation, but it was really annoying to spend 4 minutes transferring the data and then another 20 defragging the mess!

It's going to take a long while for the price of SSDs to come down to the point where I can replace the 3+ TB of HDDs that are in use. Besides, heavy fragmentation isn't good whatever the hardware: the filesystem adds overhead for each non-contiguous segment because of the increased size and complexity of the file allocation tables.
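As an aside, fragment counts like the "312 segments" above can also be checked programmatically rather than by running a full defrag analysis. Below is a minimal, hypothetical stand-alone C++ sketch using the documented Win32 FSCTL_GET_RETRIEVAL_POINTERS ioctl; it is not part of FileZilla, and error handling is kept to a minimum.

// fragcount.cpp - print how many extents (fragments) a file occupies on NTFS.
// Build e.g. with: cl fragcount.cpp
#include <windows.h>
#include <winioctl.h>
#include <vector>
#include <cstdio>

int main(int argc, char* argv[])
{
    if (argc < 2) {
        std::printf("Usage: fragcount <file>\n");
        return 1;
    }

    // FILE_READ_ATTRIBUTES is sufficient for this ioctl.
    HANDLE h = CreateFileA(argv[1], FILE_READ_ATTRIBUTES,
                           FILE_SHARE_READ | FILE_SHARE_WRITE, nullptr,
                           OPEN_EXISTING, 0, nullptr);
    if (h == INVALID_HANDLE_VALUE) {
        std::printf("Cannot open %s (error %lu)\n", argv[1], GetLastError());
        return 1;
    }

    STARTING_VCN_INPUT_BUFFER in = {};   // start at virtual cluster 0
    std::vector<char> buf(1 << 20);      // room for tens of thousands of extents
    auto* out = reinterpret_cast<RETRIEVAL_POINTERS_BUFFER*>(buf.data());

    DWORD bytes = 0, total = 0;
    BOOL done;
    do {
        done = DeviceIoControl(h, FSCTL_GET_RETRIEVAL_POINTERS,
                               &in, sizeof(in), out,
                               static_cast<DWORD>(buf.size()), &bytes, nullptr);
        if (!done && GetLastError() != ERROR_MORE_DATA) {
            // Note: tiny files resident in the MFT report ERROR_HANDLE_EOF here.
            std::printf("DeviceIoControl failed (error %lu)\n", GetLastError());
            CloseHandle(h);
            return 1;
        }
        total += out->ExtentCount;
        // If there is more data, continue from where the last extent ended.
        in.StartingVcn = out->Extents[out->ExtentCount - 1].NextVcn;
    } while (!done);

    std::printf("%s: %lu extent(s)\n", argv[1], total);
    CloseHandle(h);
    return 0;
}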

GK

boco
Contributor
Posts: 26913
Joined: 2006-05-01 03:28
Location: Germany

Re: File Fragmentation

#7 Post by boco » 2009-02-20 04:08

The fuller a drive gets, the higher the fragmentation. I guess the filesystem is NTFS; in that case there are unmovable files scattered around, and the large files cannot be stored contiguously anymore.

For defragmentation, see http://kessels.com/jkdefrag.

magistar
500 Command not understood
Posts: 2
Joined: 2017-07-22 14:34
First name: S
Last name: d K

Re: File Fragmentation

#8 Post by magistar » 2019-06-14 12:16

With 1.7 GB of continuous free space on an 8 TB hard drive I am still seeing filezilla ftp server incomming files of 1 GB with 5000 fragments each. Is there a write buffer I can modify somewhere similar to what torrent clients allow? The server has 4GB of free ram so I could get at least 1 GB fragments. Os is Win10 with NTFS.
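For context, a write buffer of the kind asked about simply batches incoming data into large blocks before handing them to the filesystem, so NTFS receives a few big allocation requests instead of thousands of tiny ones. It can reduce fragmentation but does not prevent it; pre-allocation, discussed in the next reply, helps more. A rough, hypothetical C++ sketch - not how FileZilla Server is actually implemented:

#include <windows.h>
#include <vector>

// Collects incoming bytes and writes them to disk in large blocks.
class BufferedWriter
{
public:
    // bufferSize must be > 0, e.g. 64 MiB.
    BufferedWriter(HANDLE file, size_t bufferSize) : m_file(file)
    {
        m_buf.reserve(bufferSize);
    }

    ~BufferedWriter() { flush(); }

    // Append received bytes; spill to disk whenever the buffer fills up.
    bool write(const char* data, size_t len)
    {
        while (len > 0) {
            size_t space = m_buf.capacity() - m_buf.size();
            size_t chunk = (len < space) ? len : space;
            m_buf.insert(m_buf.end(), data, data + chunk);
            data += chunk;
            len  -= chunk;
            if (m_buf.size() == m_buf.capacity() && !flush())
                return false;
        }
        return true;
    }

    // Hand the whole buffer to the filesystem in a single WriteFile call.
    bool flush()
    {
        if (m_buf.empty())
            return true;
        DWORD written = 0;
        BOOL ok = WriteFile(m_file, m_buf.data(),
                            static_cast<DWORD>(m_buf.size()), &written, nullptr);
        bool complete = ok && written == m_buf.size();
        m_buf.clear();
        return complete;
    }

private:
    HANDLE m_file;
    std::vector<char> m_buf;
};

With a 64 MiB buffer, a 1 GB upload becomes roughly 16 large writes; whether NTFS keeps those blocks adjacent still depends on the free-space layout.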

boco
Contributor
Posts: 26913
Joined: 2006-05-01 03:28
Location: Germany

Re: File Fragmentation

#9 Post by boco » 2019-06-14 15:05

With only 1.7 GB left on an 8 TB disk, expect heavy fragmentation. Unless you meant 1.7 TB.

FileZilla Server does not yet support pre-allocation the way FileZilla Client already does. Maybe it's planned for the upcoming rewrite; the developer would have to answer that.
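Pre-allocation here just means telling the filesystem the final file size before any data is written, so it can try to reserve one contiguous run up front. A minimal, hypothetical Win32 sketch of the idea - not FileZilla Client's actual code:

#include <windows.h>
#include <cstdio>

// Reserve `size` bytes for an already-opened file, then rewind so the caller
// can write the real data sequentially from offset 0.
bool preallocate(HANDLE h, LONGLONG size)
{
    LARGE_INTEGER li;
    li.QuadPart = size;
    if (!SetFilePointerEx(h, li, nullptr, FILE_BEGIN))   // seek to the final size
        return false;
    if (!SetEndOfFile(h))                                // allocate up to here
        return false;
    li.QuadPart = 0;
    return SetFilePointerEx(h, li, nullptr, FILE_BEGIN) != FALSE;
}

int main()
{
    // Example: reserve 1 GiB for an incoming file before the transfer starts.
    HANDLE h = CreateFileA("incoming.bin", GENERIC_WRITE, 0, nullptr,
                           CREATE_ALWAYS, FILE_ATTRIBUTE_NORMAL, nullptr);
    if (h == INVALID_HANDLE_VALUE)
        return 1;

    if (!preallocate(h, 1LL << 30)) {
        std::printf("Pre-allocation failed (error %lu)\n", GetLastError());
        CloseHandle(h);
        return 1;
    }

    // ... write the received data sequentially here ...

    CloseHandle(h);
    return 0;
}

If a transfer is aborted, the file would need to be truncated back to the bytes actually received; otherwise it keeps the reserved size.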

Other than that, the files are not placed by FileZilla Server; they are placed by the filesystem driver (ntfs.sys). How and why it works the way it does, we cannot know, as it is a black box and only Microsoft knows its secrets.

Windows 10 will regularly defrag spinning disks. It will also defrag SSDs, to a lesser extent and for a different reason (a filesystem limitation on the maximum number of file extents stored in metadata).
