- Do you dump your wiki(s) to XML?
- If you don't, is it because of some missing feature?
- If you do, how do you use it?
- What would you like to see that's missing?
- Would the ability to host wiki dumps at the Internet Archive be useful?
I believe the XML wiki dump is far too incomplete to serve as a backup: too many things essential for restoring a wiki are missing (the interwiki table, user information, OpenID data, etc.). Having an XML dump alone does not let you restore a wiki to a working state.
I would not expect the dumps to be used in place of a backup; that's what DB table dumps and snapshots are for. But I would be interested in knowing whether XML dumps are generated for other purposes (analysis, allowing users to download the content in bulk, bot processing of content, etc.).
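To illustrate the "analysis" use case: a minimal sketch of processing a MediaWiki XML dump, here just extracting page titles. The sample XML below is a hand-made stand-in for a real dump (real dumps carry a versioned namespace such as `export-0.10`; the tag matching strips it so the code is not tied to one version).

```python
import xml.etree.ElementTree as ET

# Hand-made stand-in for a real MediaWiki XML dump.
SAMPLE_DUMP = """<mediawiki xmlns="http://www.mediawiki.org/xml/export-0.10/">
  <page>
    <title>Main Page</title>
    <revision><text>Welcome to the wiki.</text></revision>
  </page>
  <page>
    <title>Help:Contents</title>
    <revision><text>How to edit pages.</text></revision>
  </page>
</mediawiki>"""

def local(tag):
    """Strip the XML namespace so the code works across dump versions."""
    return tag.rsplit('}', 1)[-1]

def page_titles(xml_text):
    """Return the titles of all <page> elements in the dump."""
    root = ET.fromstring(xml_text)
    titles = []
    for page in root:
        if local(page.tag) != 'page':
            continue
        for child in page:
            if local(child.tag) == 'title':
                titles.append(child.text)
    return titles

print(page_titles(SAMPLE_DUMP))  # → ['Main Page', 'Help:Contents']
```

For dumps too large to hold in memory, the same idea works with `ET.iterparse`, clearing each `<page>` element after processing it.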
Hosting dumps on the Internet Archive is an interesting possibility. I also like the idea of using something like the WebCite bot, which walks through all the external links and archives the pages.
So, does anyone use XML dumps, and if so, for what? Do you back up your wikis, and how?
Backups are of course done: a full DB backup weekly and a differential backup daily. There is plenty of software for this; we coded our own (GPL).
Full DB dump and files dump. XML dumps are for the wikis where we don't have direct access to the database (in other words, not OUR wikis).
Same as Katkov Yury, although we have a shared commons for media files, so only one file repository.
From private email: «an easy database export that would also include a backup of the stored images would be nice; ideally one that can be run from the command line to pull a full database dump (with an option not to include deleted pages)».
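The request above can be sketched as a small wrapper that assembles the three pieces of a full backup: a SQL dump, an XML page dump via MediaWiki's maintenance/dumpBackup.php, and a tarball of the uploads directory. The database name, paths, and output directory here are assumptions; adapt them to your installation. The function only builds the commands, leaving execution (e.g. via `subprocess.run`) to the caller.

```python
from datetime import date

def backup_commands(db="wikidb", wiki_dir="/var/www/wiki", out_dir="/backup"):
    """Return the backup commands as argv lists, without executing them.
    All paths and the database name are illustrative defaults."""
    stamp = date.today().strftime("%Y%m%d")
    return [
        # Full SQL dump: keeps the tables the XML dump omits
        # (interwiki, user information, etc.).
        ["mysqldump", db, "-r", f"{out_dir}/db-{stamp}.sql"],
        # XML dump of all page revisions (--current would keep only
        # the latest revision of each page).
        ["php", f"{wiki_dir}/maintenance/dumpBackup.php", "--full",
         f"--output=file:{out_dir}/pages-{stamp}.xml"],
        # Archive the uploaded media files.
        ["tar", "-czf", f"{out_dir}/images-{stamp}.tar.gz",
         "-C", wiki_dir, "images"],
    ]

for cmd in backup_commands():
    print(" ".join(cmd))
```

Note that deleted pages are not included in XML dumps to begin with, so the "option not to include deleted pages" is the default behaviour here; restoring them would require the SQL dump.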
Extend support for URL customization options (in the HTML UI or via LocalSettings.php, not with complex configurations in multiple places).