[HacktionLab] Static archive of site - the best command

Chris Clemson Chris.Clemson at GoGreenIT.net
Tue Jul 18 07:52:49 UTC 2023


wget will do it, but if you want an easier GUI way, have you had a 
look at this:

http://www.httrack.com/

I've used it before and it's pretty good.
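
If a GUI isn't practical, HTTrack also ships a command-line client; from 
memory a minimal run looks roughly like this (the URL and output 
directory below are just placeholders, not the real site):

  httrack "https://example.org/" -O ./example-archive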

I'm sure you can achieve the same thing with wget, though I've not 
really tried to use it that way.
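
As far as I know, the usual starting point with wget is a mirror run 
along these lines (again, example.org is just a stand-in; you may want 
to add --wait or rate limiting for slow or touchy servers):

  wget --mirror --convert-links --adjust-extension --page-requisites --no-parent https://example.org/

--mirror handles the recursion, --convert-links rewrites links so the 
copy browses offline, --adjust-extension saves pages with .html 
extensions, and --page-requisites pulls in the CSS/images each page needs.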


On 17/07/2023 23:23, Mike Harris wrote:
> Hi all, but especially Mick,
>
> Last year Mick gave a talk on recovering the old Schnews website and producing a static version of it by a certain clever use of curl or wget.
>
> What’s the best command to get a complete, functional static version of an entire website, including all linked-to content?
>
> I ask because I need to grab a site for someone that’s about to ‘go dark’, and no one can get the login details to reach the file-system side of things.
>
> Cheers,
>
> Mike.
>
> Mike Harris
>
> XtreamLab
> W: https://XtreamLab.net
> T: +44 7811 671 893
> _______________________________________________
> HacktionLab mailing list
> HacktionLab at lists.aktivix.org
> https://lists.aktivix.org/mailman/listinfo/hacktionlab

-- 
Chris Clemson - IT Contractor
Go Green IT Ltd
M: +447770916901
W: http://www.gogreenit.net



