[NBLUG/talk] FTPing large files
sjs at sonic.net
Thu Jan 7 22:31:20 PST 2010
> On Thu, Jan 7, 2010 at 5:52 PM, sean machin <smachin1000 at gmail.com> wrote:
>> I'm writing a python script for my Centos server which (among other
>> things), tries to FTP a large (7GB) archive image to another server
>> across the WAN.
>> My script calls the curl program to do the upload. Curl always seems to
>> fail after a few hundred MB, however.
> Does FTP even work for files that large? I seem to remember
> running into an issue at work where we could not upload files larger
> than a few gigabytes (can't remember the exact cut-off) via FTP due to
> some sort of 32-bit limitation. We were using PyCurl on the client (a
> Python interface to the Curl libraries) and vsftpd on the server, both
> running RHEL 5.3.
> I don't remember exactly how the transfer would fail, though.
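One quick way to tell a 32-bit limitation apart from an ordinary dropped connection is to look at the byte offset where the transfer dies: 32-bit size counters fail near 2 GiB (signed) or 4 GiB (unsigned). A minimal sketch of that check — the function and names here are my own illustration, not part of curl or PyCurl:

```python
# Suspicious cut-offs for 32-bit size counters, in bytes.
SIGNED_32BIT_LIMIT = 2**31    # 2 GiB: signed 32-bit off_t
UNSIGNED_32BIT_LIMIT = 2**32  # 4 GiB: unsigned 32-bit counter

def near_32bit_limit(failed_at_bytes, tolerance=1 << 20):
    """Return which 32-bit boundary (if any) a failure offset sits near.

    If transfers consistently die within `tolerance` (default 1 MiB) of
    2 GiB or 4 GiB, a 32-bit file-size limitation somewhere in the
    client, server, or filesystem is a likely suspect.  Failures at
    arbitrary offsets (e.g. a few hundred MB) point elsewhere, such as
    a timeout or an unstable WAN link.
    """
    boundaries = (("2GiB", SIGNED_32BIT_LIMIT), ("4GiB", UNSIGNED_32BIT_LIMIT))
    for name, limit in boundaries:
        if abs(failed_at_bytes - limit) <= tolerance:
            return name
    return None
```

By this test, the original poster's failures "after a few hundred MB" are nowhere near a 32-bit boundary, so a network or timeout problem is the more likely cause.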
From the TFTP entry on Wikipedia (the limits quoted below apply to TFTP,
not FTP proper): there used to be a 4 GB cap circa 1998, but file size
is now probably unlimited.
"The original protocol has a file size limit of 32 MB, although this was
extended when RFC 2347 <http://tools.ietf.org/html/rfc2347> introduced
option negotiation, which was used in RFC 2348
<http://tools.ietf.org/html/rfc2348> to introduce block-size negotiation
in 1998 (allowing a maximum of 4 GB and potentially higher). If the
server and client support block number wraparound, file size is
essentially unlimited."
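Since the protocol itself is probably not the limit, a practical workaround for flaky WAN transfers is to have the script retry curl with `-C -`, which asks the server how many bytes already arrived and resumes from that offset instead of restarting the 7 GB upload from zero. A rough sketch — the URL, path, and retry counts are placeholders, not values from the original post:

```python
import subprocess
import time

# Hypothetical values -- substitute your own server and archive path.
REMOTE_URL = "ftp://user:password@example.com/backups/archive.img"
LOCAL_FILE = "/var/backups/archive.img"

def build_curl_cmd(url, path):
    """Build a curl upload command that resumes a partial transfer.

    '-C -' tells curl to query the remote file size and continue the
    upload from that offset; '-T' uploads the local file; '--ftp-pasv'
    uses passive mode, which is friendlier to firewalls on a WAN link.
    """
    return ["curl", "--ftp-pasv", "-C", "-", "-T", path, url]

def upload_with_retries(url, path, attempts=10, delay=30):
    """Invoke curl repeatedly until the upload completes or we give up."""
    for attempt in range(attempts):
        if subprocess.call(build_curl_cmd(url, path)) == 0:
            return True          # curl exited 0: upload finished
        time.sleep(delay)        # back off, then resume where we left off
    return False

# Usage (this would actually run curl):
#   upload_with_retries(REMOTE_URL, LOCAL_FILE)
```

Each retry only re-sends the bytes that did not make it, so even a connection that drops every few hundred MB will eventually push the whole archive across.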