I would like to save the current state of a publicly available website as (non-legal) evidence in a fairly authentic way.
Is there a website/application for this? For example, a service that stores the MD5 hash of a given page and lets me download the page exactly as it saw it.
There is the Wayback Machine, but it only saves content that the site's robots.txt file allows.
If you need this, you could use something like curl or wget to fetch a full copy of the parts of the website that interest you. Both are standard on Linux; curl ships with macOS and with recent versions of Windows, and wget is available through Cygwin or most package managers. Store the copy away, perhaps with a note recording the URL and exactly how you retrieved it, then zip it up and sign it with GnuPG. You could then write it to a CD for safekeeping at low cost. A minimal sketch of the whole workflow follows (assuming a Linux shell with wget, zip, sha256sum, and gpg installed; https://example.com/page is a placeholder for the page you care about):
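    # Mirror the page with its images/CSS/JS, rewriting links for offline viewing
    wget --mirror --convert-links --page-requisites --no-parent \
        -P site-copy/ https://example.com/page

    # Record when and what you fetched
    date -u > site-copy/NOTES.txt
    echo "https://example.com/page" >> site-copy/NOTES.txt

    # Bundle it, record a hash, and make a detached GnuPG signature
    zip -r site-copy.zip site-copy/
    sha256sum site-copy.zip > site-copy.zip.sha256
    gpg --detach-sign --armor site-copy.zip

Anyone with your public key can later verify the archive with gpg --verify site-copy.zip.asc site-copy.zip, which ties the copy to your key and to the time you signed it.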
Screenshots are pretty decent for non-legal purposes. It can be harder to capture a website's full content that way, but if you mostly need a particular page, snapping the URL and the pertinent content is usually sufficient, especially if a time/date is visible in the image.
There are also multiple apps, utilities, and browser plugins that can capture an entire webpage, which is helpful when, as on most pages, the content extends below a single window's height: https://www.google.com/search?q=screenshot+entire+webpage
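If you'd rather script it than install a plugin, recent versions of Firefox can take a page screenshot from the command line in headless mode (a minimal sketch; https://example.com/page is a placeholder, and exact behavior can vary by Firefox version, so check the output):

    # Render the page headlessly and save a screenshot to page.png
    firefox --headless --screenshot page.png https://example.com/page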