Filezilla Pro S3 Data Transfer Costs

Need help with FileZilla Client? Something does not work as expected? In this forum you may find an answer.

lnadolski
500 Command not understood
Posts: 2
Joined: 2018-12-03 19:26
First name: Lee
Last name: Nadolski

Filezilla Pro S3 Data Transfer Costs

#1 Post by lnadolski » 2018-12-03 21:02

I am trying to estimate the monthly cost of accessing data stored in AWS S3 buckets when copying files (objects) from an S3 bucket to my local machine using FileZilla Pro.

Based on the logging information provided by FileZilla Pro, I believe there is one HTTP request per file (object), plus a charge based on the amount of data for what AWS calls "Data Transfer OUT From Amazon S3 To Internet" (https://aws.amazon.com/s3/pricing/#Data ... er_pricing_).

Besides the costs for the HTTP requests and the data transfer pricing, are there any additional costs I should consider when estimating the monthly cost of copying data from an S3 bucket to my local machine(s)?
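As a rough sketch of the estimate I'm after (the per-request and per-GB rates below are placeholders, not actual AWS prices — check the pricing page for your region):

```python
def estimate_download_cost(num_objects, total_gb,
                           get_price_per_1000=0.0004,   # placeholder GET request rate (assumption)
                           transfer_price_per_gb=0.09): # placeholder "Data Transfer OUT" rate (assumption)
    """Rough cost to download objects from S3: GET requests plus data transfer out."""
    request_cost = (num_objects / 1000.0) * get_price_per_1000
    transfer_cost = total_gb * transfer_price_per_gb
    return request_cost + transfer_cost

# e.g. 50,000 objects totalling 200 GB at the placeholder rates:
print(estimate_download_cost(50_000, 200))
```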
Last edited by lnadolski on 2018-12-04 00:01, edited 1 time in total.

botg
Site Admin
Posts: 31608
Joined: 2004-02-23 20:49
First name: Tim
Last name: Kosse

Re: Filezilla Pro S3 Data Transfer Costs

#2 Post by botg » 2018-12-03 22:03

Data storage pricing.

lnadolski
500 Command not understood
Posts: 2
Joined: 2018-12-03 19:26
First name: Lee
Last name: Nadolski

Re: Filezilla Pro S3 Data Transfer Costs

#3 Post by lnadolski » 2018-12-04 00:00

Thanks for the reply, @botg.

I was looking specifically for the costs associated with copying data from AWS S3 to local machine(s). Apologies if my question was unclear; I have edited the original post.

Cheers

botg
Site Admin
Posts: 31608
Joined: 2004-02-23 20:49
First name: Tim
Last name: Kosse

Re: Filezilla Pro S3 Data Transfer Costs

#4 Post by botg » 2018-12-04 08:52

Assuming a stable connection:

For downloads, FileZilla Pro issues requests to enumerate objects for each common slash-delimited prefix, plus one GET request per object downloaded.

Uploads are more complicated: directories are also enumerated.
- Files smaller than 5 MB are uploaded in a single request.
- Files larger than 5 MB are uploaded in multiple parts, to allow resuming on unstable connections. Part size is dynamic: each part is sized to take roughly 30 seconds to transfer, but never smaller than 5 MB. Apart from one request per part, there are also requests to enumerate existing multipart uploads, to initiate a new multipart upload, and to complete a multipart upload.
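The upload accounting above could be sketched like this (the exact part-sizing heuristic is an assumption based on the description, not FileZilla Pro's actual code):

```python
import math

def upload_request_count(file_size_mb, throughput_mb_s,
                         min_part_mb=5, seconds_per_part=30):
    """Approximate request count for one upload, per the scheme described above.

    Small files: one PUT. Larger files: one request to enumerate existing
    multipart uploads, one to initiate, one PUT per part, one to complete.
    """
    if file_size_mb < min_part_mb:
        return 1  # single PUT request
    # part sized to ~30 seconds of transfer, but at least 5 MB (assumed heuristic)
    part_mb = max(min_part_mb, throughput_mb_s * seconds_per_part)
    parts = math.ceil(file_size_mb / part_mb)
    return 3 + parts  # enumerate + initiate + parts + complete
```

For example, a 100 MB file on a 1 MB/s link would use 30 MB parts, so 4 part uploads plus 3 bookkeeping requests.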



Each enumeration can take multiple requests if the server decides to truncate the answer; Amazon S3 returns at most 1,000 keys per listing response.
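So the listing cost for a prefix grows with the object count; a minimal sketch, assuming a fixed page size per truncated response:

```python
import math

def list_request_count(num_objects, max_keys_per_response=1000):
    """Number of listing requests needed to enumerate a prefix,
    given the server truncates each response at max_keys_per_response keys."""
    return max(1, math.ceil(num_objects / max_keys_per_response))
```

A prefix holding 2,500 objects would therefore need 3 listing requests before the per-object GETs even start.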
