[HacktionLab] Static archive of site - the best command

m3shrom m3shrom at riseup.net
Tue Jul 18 07:57:40 UTC 2023


This post has some good content:

https://www.stevenmaude.co.uk/posts/archiving-a-wordpress-site-with-wget-and-hosting-for-free

It's focused on WordPress, but the approach is potentially relevant for other sites.

Here's a sample command I used for a WordPress network:

wget --page-requisites --convert-links --adjust-extension --mirror \
  --span-hosts --domains=mcrblogs.co.uk,www.mcrblogs.co.uk,edlab.org.uk \
  mcrblogs.co.uk/afrocats
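
For reference, roughly what each of those flags does (per the wget
manual; this breakdown is my annotation, not part of the command itself):

  --page-requisites    also fetch the CSS, JavaScript and images each page needs
  --convert-links      rewrite links in the saved pages so the copy browses offline
  --adjust-extension   save pages with a .html extension so they open in a browser
  --mirror             shorthand for -r -N -l inf --no-remove-listing
                       (recursive, infinite depth, timestamping)
  --span-hosts         allow recursion to cross onto other hosts...
  --domains=...        ...but only the hosts in this comma-separated list

If you're worried about hammering a fragile server, here's the same
command with wget's standard rate-limiting flags added (an untested
variant, not what I actually ran):

wget --page-requisites --convert-links --adjust-extension --mirror \
  --span-hosts --domains=mcrblogs.co.uk,www.mcrblogs.co.uk,edlab.org.uk \
  --wait=1 --random-wait \
  mcrblogs.co.uk/afrocats

To sanity-check the result, serve the mirror locally (assuming wget's
default layout, where files land in a directory named after the host):

cd mcrblogs.co.uk && python3 -m http.server 8000
# then browse http://localhost:8000/afrocats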

nice one
mick

On 17/07/2023 23:23, Mike Harris wrote:
> Hi all, but especially Mick,
>
> Last year Mick gave a talk on recovering the old Schnews website and producing a static version of it by a certain clever use of curl or wget.
>
> What’s the best command to get a complete, functional static version of an entire website, including all linked-to content?
>
> I ask because I need to grab a site for someone that’s about to ‘go dark’, and no one can get the login details to access the file system side of things.
>
> Cheers,
>
> Mike.
>
> Mike Harris
>
> XtreamLab
> W: https://XtreamLab.net
> T: +44 7811 671 893
> _______________________________________________
> HacktionLab mailing list
> HacktionLab at lists.aktivix.org
> https://lists.aktivix.org/mailman/listinfo/hacktionlab