This small script downloads the export.json of each wiki whose domain is listed, one per line, in a text file you provide.
Consider a file domains containing
glossary.asia.wiki.org
fed.wiki.org
hello.ward.bay.wiki.org
about.fed.wiki
pods.wiki.org
which is then passed to the script on standard input:
./backup-wiki < domains
Invoking this command creates a subfolder sites, where the backups are placed.
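The script itself is what does the work; as a rough sketch of the idea (not the script's exact contents, and assuming each wiki serves its full export at the usual federated wiki endpoint /system/export.json), the backup loop amounts to something like:

    # Sketch only: read one domain per line from stdin, fetch its export,
    # and save it under sites/. The filename scheme here is an assumption.
    mkdir -p sites
    while read -r domain; do
      curl -fsSL "http://$domain/system/export.json" -o "sites/$domain.json"
    done

The real script may name the files inside sites differently or handle errors more carefully; this is only meant to show the shape of the operation.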
Consider the same domains file, but the command
./backup-wiki-pages < domains
which will create a subfolder pages, within which each domain is represented separately.
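Again as a sketch rather than the script's actual contents, splitting each export into per-page files could look roughly like the following, using jq to write one file per page slug (the exact layout under pages/ is an assumption):

    # Sketch only: one folder per domain under pages/, one JSON file per page.
    mkdir -p pages
    while read -r domain; do
      mkdir -p "pages/$domain"
      export_json=$(curl -fsSL "http://$domain/system/export.json")
      # Federated wiki exports are keyed by page slug; write each page separately.
      for slug in $(printf '%s' "$export_json" | jq -r 'keys[]'); do
        printf '%s' "$export_json" | jq --arg s "$slug" '.[$s]' > "pages/$domain/$slug.json"
      done
    done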
MIT, see the attached LICENSE file.