I don’t have a convenient zip file or anything like that for you cause I don’t really feel like writing something to make that. But it’s not too hard to archive my website! Or any simple website like this one, for that matter. Go grab a copy of wget, either on a normal Linux machine, a VM, Windows Subsystem for Linux, or like, iSH or Termux if all you’ve got is a phone.
Then run this command:
wget --mirror --convert-links --adjust-extension --page-requisites --no-parent 'https://artemis.sh'
Or, if you only want to archive a specific post or page, run the same command but with the link to that post instead of the link to my home page.
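For example, grabbing a single post would look something like this (that post URL is made up just to show the shape of the command, swap in the real link):

# hypothetical post URL, substitute the page you actually want
wget --mirror --convert-links --adjust-extension --page-requisites --no-parent 'https://artemis.sh/2022/example-post'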
This command will download the web page and anything else needed to display it, like the CSS and images and stuff like that, all onto your local hard drive! It’ll also download any other pages the page links to, recursively, except it won’t follow links to other websites and it won’t climb up into directories above the one your page starts in. That means if you start on my home page it’ll download my entire website, going through all the pages and following all the links to each other. But if you download just one post, it’ll only grab that post and all the images and stuff it uses.
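To give you a rough idea of what ends up on disk (treat this as a sketch, the actual names depend on how the site is laid out), wget puts everything in a folder named after the domain:

artemis.sh/
├── index.html
├── css/       (whatever stylesheets the pages pull in)
├── images/
└── posts/     (the other pages it found by following links)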
It’ll also fix up all the links so you can browse the whole site locally without even using a web server. If you download the whole site you can just open up index.html in your browser. How cool is that?
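Since everything landed in that artemis.sh folder, opening the local copy can be as simple as this (xdg-open is the Linux way to open a file with your default app; on a Mac the equivalent is open, or you can just double-click the file):

xdg-open artemis.sh/index.html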