[NBLUG/talk] ncftpget: recursive downloads

Dave Sisley dsisley at sonic.net
Thu May 20 22:50:27 PDT 2010


Thanks, Mark.  wget wins the prize.  It's downloading now.
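
For the archives, in case anyone searches for this later: what I'm 
running is something along these lines (host and paths below are 
placeholders, not the real site):

   wget -r -l inf -nH --ftp-user="username" --ftp-password="password" \
     ftp://remote-server.com/remote-directory/

-r with -l inf recurses without wget's default depth cap of 5, and -nH 
keeps it from creating a local directory named after the host.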

I just don't get why ncftpget was so selective with regard to the -R 
(recursive) flag.  I like ncftp a lot, especially for scripting, but it 
let me down here.
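
(If I'm reading the ncftpget man page right, -R first tries to have the 
server stream the whole tree as a tar file, and only falls back to 
walking directory listings; a server that handles neither cleanly might 
explain the partial downloads.  The -T flag is supposed to disable TAR 
mode, so a variation like this might have behaved differently:

   ncftpget -R -T -v -u "username" remote-server.com /local-directory \
     remote-directory

No guarantees, though; wget has the job well in hand either way.)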

...and to Sean:  I can't use scp.  The original server is a Network 
Solutions site, and I don't have shell access, just FTP.  For the 
record, I'm moving the site in question to a new space at Sonic.net, 
with shell access (the way it's supposed to be!).

-dave.

Mark Street wrote:
> Hi Dave,
>
> Take a look at wget, it works well for this sort of stuff.
>
> On 5/20/2010 5:24 PM, Dave Sisley wrote:
>> Hey, all:
>>
>> I'm trying to move a website from one server to another, and went 
>> googling for a way to move the files & directories with FTP.
>>
>> The straightforward method would seem to be:
>>
>> ncftpget -R -v -u "username" remote-server.com /local-directory 
>> remote-directory
>>
>> ...only it doesn't quite work.  As written, the command downloaded 
>> files from the server, but not directories.
>>
>> If I add '/*' to the remote-directory argument, it downloads 
>> some, but not all, of the subdirectories.
>>
>> Anybody out there know why this doesn't work?  Data transfer limits?  
>> Timeouts?  Goblins?
>>
>> I will be grateful for any clues.
>>
>> -dave.
>>
>
>


-- 
Dave Sisley
dsisley at sonic.net
roth-sisley.net



