
OK, so I've got wget 1.12 on Windows 7, and I can do basic downloads with it.

The site I'm trying to download: http://www.minsterfm.co.uk

and all images on it are stored externally at http://cml.sad.ukrd.com/image/

How can I download the site along with the externally hosted images, and ideally keep all files with their original extensions, without converting .php files to .htm?

I would appreciate any help, since I'm new to wget.

Hennes

2 Answers


The wget manual says:

Actually, to download a single page and all its requisites (even if they exist on separate websites), and make sure the lot displays properly locally, this author likes to use a few options in addition to ‘-p’:

wget -E -H -k -K -p http://the.site.com

You'll have to combine that with some of the Recursive Download options. You may also want to use --wait=xx, --limit-rate=xxK and -U agent-string so you don't get blacklisted by the server.
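As a rough sketch only (the recursion depth, 2-second wait, 50 KB/s limit, and generic Mozilla agent string are placeholder values, not recommendations), combining the manual's command with the recursive and politeness options for the site and image host from your question might look like this:

wget -r -l 2 -p -E -H -k -K --domains=minsterfm.co.uk,cml.sad.ukrd.com --wait=2 --limit-rate=50k -U "Mozilla/5.0" http://www.minsterfm.co.uk/

Here -r and -l set the recursion and its depth, --domains keeps -H (host spanning) from wandering beyond the main site and the external image server, and --wait, --limit-rate and -U slow the crawl and present an ordinary agent string. Note that -E is the option that renames pages to .html, so you can drop it if keeping the original .php extensions matters more to you than viewing the pages locally.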

Renaud

I've used BlackWidow for downloading sites recursively on Windows.

It has the following features, but is not free:

  • Scripting Engine
  • User Friendly
  • NetSpy (Network Spy)
  • SnapShot (Web page snap shot)
  • Windows Explorer like site view
  • Powerful scan filters
  • Expandable parser
  • Wildcards & Regular Expressions
Sirex