how to use the wget command to copy a whole website to local disk

Mark C. Allman mcallman at allmanpc.com
Wed Sep 26 02:24:51 UTC 2012


On Tue, 2012-09-25 at 19:06 -0700, Dave Stevens wrote:
> Quoting yujian <yujian4newsgroup at gmail.com>:
> 
> > On 2012/9/26 9:45, Dave Stevens wrote:
> >> Quoting yujian <yujian4newsgroup at gmail.com>:
> >>
> >>> I want to copy a website to my local disk. I used the command
> >>> wget -r www.example.com, but found that only the HTML was copied.
> >>
> >> What else were you expecting to be copied? And have you read the
> >> man page? Or maybe an online tutorial?
> >>
> >> Dave
> >>
> > All the files on the website, such as pdf, doc, and exe files. I
> > did read the man page; that's why I tried wget -r to download them.
> 
> Maybe it would help if you told us the command line you used.
> 
> D

Looks to me like the command the OP used was:
    wget -r <url>
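
By default "-r" only recurses five levels deep, stays on the starting
host, and obeys robots.txt, any of which could explain why only the
HTML came down.  If the goal is just the pdf/doc/exe files, something
like this might do it (untested; the flags are straight from the man
page):

    # accept only the listed suffixes; wget still fetches HTML pages
    # to follow their links, then deletes the ones that don't match
    wget -r -l inf -np -A pdf,doc,exe http://www.example.com/

"-l inf" lifts the depth limit, "-np" keeps wget from climbing above
the starting directory, and "-A" takes a comma-separated accept list.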

I understand the original question, but I've never tried to download a
complete site.  The "-m" (mirror) and "-p" (page requisites) switches
look promising.  I'd first take a few minutes to work through the man
page; there's a lot of good documentation there.
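
For a complete, browsable mirror, combining those two with "-k" and
"-E" gives something like this (again untested):

    # mirror the site and make the local copy viewable offline
    wget --mirror --page-requisites --convert-links \
         --adjust-extension --wait=1 http://www.example.com/

"--mirror" is shorthand for "-r -N -l inf --no-remove-listing",
"--page-requisites" pulls in the images/CSS each page needs,
"--convert-links" rewrites links to point at the local copies, and
"--adjust-extension" adds ".html" where the server didn't.  The
"--wait=1" is just politeness, so the crawl doesn't hammer the server.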

-- 
Mark C. Allman, PMP, CSM
Founder, See How You Ski
Allman Professional Consulting, Inc., www.allmanpc.com
617-947-4263, Twitter:  @allmanpc



