Static website converter?
Does anyone know a good tool that can convert a dynamic website into a static website?
The Thimbleweed Park blog was built using PHP and a MongoDB database and it's quite complex. The website won't ever change, and I'd like to turn it into a static site so I can host the files somewhere else and shut down the server.
Ideally it would be a tool that you just point at the url and it scrapes the site, producing a bunch of .html files.
Does such a thing exist?
wget --recursive --no-parent <url>
The only issue might be that you really just get the front-end, no logins etc., but I've found it useful. It only takes five minutes, so I think it's worth a try... Enjoy!
I suspect more than one archiving tool will convert your front-end into a WARC without issue. The trick is then rendering that WARC as non-awful output. I'd say this is very much a plan B if wget doesn't adjust paths for you.
I think it should do the trick.
wget --no-verbose \
     --recursive \
     --page-requisites \
     --convert-links \
     -e robots=off \
     --domains blog.thimbleweedpark.com \
     https://blog.thimbleweedpark.com/
Stay safe and have a nice-not-so-grumpy day.
Then you've already got all your site's posts in a database somewhere.
You should be able to write a basic PHP script to pull those entries from the database and dump them to a bunch of HTML files.
brew install httrack
httrack "https://blog.thimbleweedpark.com/" -O "/blog.thimbleweedpark.com/" -v
(-O means the output directory -- so you can change the second part. -v means verbose). If you need anything on another subdomain, then:
httrack "https://blog.thimbleweedpark.com/" -O "/blog.thimbleweedpark.com/" "+*.thimbleweedpark.com/*" -v