[NBLUG/talk] FTPing large files
Christopher Wagner
waggie at waggie.net
Thu Jan 7 20:41:52 PST 2010
I think the right focus is why it's failing. That's very odd, as I've
used FTP to transfer multi-gigabyte files without difficulty.
Do you have an error message that FTP is generating? Which FTP program
are you using? Is the destination server a generic web host? They might
be deliberately killing long-running connections.
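
If it helps, here's a minimal sketch of how I'd capture what curl is
actually reporting (the file path and host below are placeholders):

    import subprocess

    # Run curl verbosely and keep its stderr: the exit code plus the
    # last few verbose lines usually say exactly why a transfer died
    # (see the EXIT CODES section of `man curl`).
    proc = subprocess.run(
        ["curl", "-v", "-T", "/backups/archive.img",
         "ftp://ftp.example.com/uploads/"],
        stderr=subprocess.PIPE, text=True)
    print("curl exit code:", proc.returncode)
    print(proc.stderr[-2000:])  # tail of the verbose log

An exit code of 28 would point at a timeout, for example, while 18
means the transfer ended early.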
- Chris
jezra wrote:
> On Thu, 07 Jan 2010 17:52:49 -0800
> sean machin <smachin1000 at gmail.com> wrote:
>
>
>> Hi All,
>>
>> I'm writing a Python script for my CentOS server which (among other
>> things) tries to FTP a large (7GB) archive image to another server
>> across the WAN.
>> My script calls the curl program to do the upload. Curl always seems
>> to fail after a few hundred MB, however.
>>
>> Any ideas on how best to transfer this large file? I do not have SSH
>> access, BTW, so I can't use scp.
>>
>> Thanks :)
>> Sean
>>
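
Until the root cause turns up, curl itself can resume an interrupted
FTP upload: "-C -" tells it to ask the server how much has already
arrived and continue from there. A rough sketch (host, credentials,
and paths are made up), re-invoking curl until it exits cleanly:

    import subprocess, time

    # "-C -" = resume the upload at the remote file's current size;
    # exit code 0 means the whole file made it across.
    cmd = ["curl", "-C", "-", "-T", "/backups/archive.img",
           "ftp://user:password@ftp.example.com/uploads/"]
    for attempt in range(20):   # bounded retries rather than forever
        if subprocess.call(cmd) == 0:
            break
        time.sleep(30)          # brief pause before resuming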
>
> Personally, I would use wget -c ftp://file/to/download for a
> download that is likely to be interrupted. The "-c" flag allows wget
> to pick up where it left off, so re-running wget -c ftp://file/to/download
> will only fetch the bytes that are still missing.
>
>
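
For the pull direction jezra describes, the same resume-until-done
idea is a few lines of Python (the URL is hypothetical):

    import subprocess

    # "-c" makes wget continue a partial download instead of starting
    # over, so each retry only fetches the bytes still missing.
    url = "ftp://ftp.example.com/uploads/archive.img"
    for attempt in range(20):
        if subprocess.call(["wget", "-c", url]) == 0:
            break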