I frequently encounter webpages that offer manuals or other information accessible only through a table of contents, i.e., links to individual chapters or paragraphs. Often each leaf page consists of only a few lines, so traversing the entire tree is extremely cumbersome.
What I am seeking is a tool that would pull all pages referenced by the links on a starting page and combine them into a single concatenated HTML document, so that one could, e.g., save that page or scroll linearly through all child pages without having to click and go back 1000 times. This would also make it possible to print the entire collection as one manual, or to search through it in one pass.
Does anyone know of a good tool for this? Ideally it would offer some exclusion criteria (e.g., ignoring all "back" links, or the help/home links that appear on every page).
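In case it helps clarify what I'm after, here is a rough Python sketch of the behavior I have in mind (the start URL, the output file name, and the exclusion patterns are just placeholders to adapt):

```python
# Minimal sketch: fetch a table-of-contents page, follow its links, and
# concatenate the linked pages into one HTML file. Uses only the standard
# library; START_URL and EXCLUDE are hypothetical placeholders.
import re
import urllib.parse
import urllib.request
from html.parser import HTMLParser

START_URL = "https://example.com/manual/index.html"  # placeholder start page
EXCLUDE = re.compile(r"(back|home|help)", re.IGNORECASE)  # crude link filter


class LinkCollector(HTMLParser):
    """Collect href targets from <a> tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def fetch(url):
    with urllib.request.urlopen(url) as resp:
        return resp.read().decode("utf-8", errors="replace")


collector = LinkCollector()
collector.feed(fetch(START_URL))

seen = set()
parts = ["<html><body>"]
for href in collector.links:
    url = urllib.parse.urljoin(START_URL, href)
    if url in seen or EXCLUDE.search(href):
        continue  # skip duplicates and "back"/"home"/"help" style links
    seen.add(url)
    page = fetch(url)
    # Keep only the body content of each child page, if one is present.
    match = re.search(r"<body[^>]*>(.*?)</body>", page,
                      re.DOTALL | re.IGNORECASE)
    parts.append(f"<h2>{url}</h2>")
    parts.append(match.group(1) if match else page)
parts.append("</body></html>")

with open("combined.html", "w", encoding="utf-8") as out:
    out.write("\n".join(parts))
```

A ready-made tool would obviously be preferable, since the above only follows links one level deep and does nothing about relative image paths or stylesheets, but it shows the kind of "concatenate everything behind the TOC" behavior I mean.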