[NBLUG/talk] FTPing large files

Lincoln Peters anfrind at gmail.com
Thu Jan 7 20:54:39 PST 2010


On Thu, Jan 7, 2010 at 5:52 PM, sean machin <smachin1000 at gmail.com> wrote:
> I'm writing a Python script for my CentOS server which (among other
> things) tries to FTP a large (7GB) archive image to another server
> across the WAN.
> My script calls the curl program to do the upload.  Curl always seems to
> fail after a few hundred megabytes, however.
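
For context, I'm guessing the failing call looks something like this
(every host, path, and credential below is a made-up placeholder):

    import subprocess

    src = '/var/backups/archive.img'          # the ~7GB archive image
    url = 'ftp://user:password@ftp.example.com/backups/'

    # curl -T uploads the named file to the given URL
    rc = subprocess.call(['curl', '-T', src, url])
    if rc != 0:
        raise RuntimeError('curl exited with status %d' % rc)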

Does FTP even work for files that large?  I seem to remember
running into an issue at work where we could not upload files larger
than a few gigabytes (can't remember the exact cut-off) via FTP due to
some sort of 32-bit limitation.  We were using PyCurl on the client (a
Python interface to the Curl libraries) and vsftpd on the server, both
running RHEL 5.3.

I don't remember exactly how the transfer would fail, though.
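
For what it's worth, a bare-bones PyCurl FTP upload looks roughly like
this (host, path, and credentials are placeholders).  I can't say for
certain this is where our transfers broke, but the file-size option is
one place a 32-bit limit could bite: INFILESIZE is a 32-bit value,
while INFILESIZE_LARGE takes a 64-bit size, which matters for anything
over ~2GB:

    import os
    import pycurl

    src = '/var/backups/archive.img'   # hypothetical multi-gigabyte file
    f = open(src, 'rb')

    c = pycurl.Curl()
    c.setopt(pycurl.URL,
             'ftp://user:password@ftp.example.com/backups/archive.img')
    c.setopt(pycurl.UPLOAD, 1)
    # libcurl pulls the upload data through this callback
    c.setopt(pycurl.READFUNCTION, f.read)
    # Use the 64-bit size option rather than the 32-bit INFILESIZE
    c.setopt(pycurl.INFILESIZE_LARGE, os.path.getsize(src))
    c.perform()
    c.close()
    f.close()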

>
> Any ideas on how best to transfer this large file?  I do not have SSH
> access, BTW, so I can't use scp.

That is unfortunate, since in our experience SFTP didn't seem to have
any such problems, even when using the same PyCurl library.
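
In our case the working SFTP client code was essentially the FTP sketch
above with a different URL scheme (assuming, as I believe was true on
our boxes, that libcurl was built with SSH/libssh2 support):

    # Same upload code as above, just an sftp:// URL (placeholder host):
    c.setopt(pycurl.URL,
             'sftp://user:password@sftp.example.com/backups/archive.img')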


-- 
Lincoln Peters
<anfrind at gmail.com>


