Manual:Parameters to Special:Export


 * See also Manual:DumpBackup.php

Wiki pages can be exported in a special XML format to upload into another MediaWiki. See Help:Export for more details.

Available parameters
Below is the list of available parameters for Special:Export as of version 1.16. Not all of these are available through the Special:Export UI.

URL parameter requests do not work
The limit and offset parameters only work for POST requests. In GET requests through a URL they are ignored.

When you request the URL in a browser, you are submitting via GET. To make these parameters take effect, the request must be submitted via POST, for example with cURL.

As an example, the following request does not work: it returns all revisions of the page despite the parameter limit=5.

https://en.wikipedia.org/w/index.php?title=Special:Export&pages=XXXX&offset=1&limit=5&action=submit

Retrieving the earliest 5 revisions
cURL generates a POST request when the -d option is passed. The following retrieves the earliest 5 revisions from the English Wikipedia main page and its talk page:

curl -d "" 'https://en.wikipedia.org/w/index.php?title=Special:Export&pages=Main_Page%0ATalk:Main_Page&offset=1&limit=5&action=submit'

And here are the next 5 revisions of the main page only:

curl -d "" 'https://en.wikipedia.org/w/index.php?title=Special:Export&pages=Main_Page&offset=2002-01-27T20:25:56Z&limit=5&action=submit'

Here the timestamp from the last revision of the previous query is copied into the offset field of the URL. Because the offset field is non-inclusive, that 5th revision is not displayed again, and instead we get revisions 6-10.
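
This pagination step can be scripted. Below is a minimal sketch (not from the manual) of extracting the last revision timestamp from a saved export file so it can be fed back as the next offset; a small inline sample stands in for a real Special:Export response, which wraps each revision's timestamp in a <timestamp> element.

```shell
# Sample data standing in for a saved Special:Export response.
cat > sample-export.xml <<'EOF'
<page>
  <revision><timestamp>2002-01-26T15:28:12Z</timestamp></revision>
  <revision><timestamp>2002-01-27T20:25:56Z</timestamp></revision>
</page>
EOF

# Take the last <timestamp> in the file and strip the tags;
# the result is the value to use as "offset" in the next request.
last_offset=$(grep -o '<timestamp>[^<]*</timestamp>' sample-export.xml \
  | tail -n 1 | sed 's|<timestamp>||;s|</timestamp>||')
echo "$last_offset"   # 2002-01-27T20:25:56Z
```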

POST request to download
A more explicit example, especially if you also want to save the output to a file:

curl -d "&pages=Main_Page&offset=1&action=submit" https://en.wikipedia.org/w/index.php?title=Special:Export -o "somefilename.xml"

The URL root needs to follow the MediaWiki parameters. Also note that the -o option at the end is what saves the result to a file; without it, the XML just scrolls past on your screen and nothing is saved.

If you instead have the list of titles in a file, say title-list, you must pass the list as a parameter to cURL and percent-encode the linefeeds, since literal newlines in the POST body do not work: curl -d "&action=submit&pages=$(cat title-list | hexdump -v -e '/1 "%02x"' | sed 's/\(..\)/%\1/g' )" https://en.wikipedia.org/w/index.php?title=Special:Export -o "somefilename.xml"
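
The hexdump/sed pipeline above simply rewrites every byte of the file, including linefeeds, as a %XX escape. A small self-contained demonstration (using a made-up two-title list rather than real page names):

```shell
# Two titles separated by a linefeed, written without a trailing newline.
printf 'A B\nC' > title-list

# Dump each byte as two hex digits, then prefix every pair with "%".
encoded=$(hexdump -v -e '/1 "%02x"' < title-list | sed 's/\(..\)/%\1/g')
echo "$encoded"   # %41%20%42%0a%43
```

Here %41, %20, %42, %0a, and %43 are the bytes of "A", space, "B", linefeed, and "C", so the linefeed separating the titles survives as %0a inside the pages parameter.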

If you want to save bandwidth, append the following arguments as well: --compressed -H 'Accept-Encoding: gzip,deflate'

Stopping the export of your MediaWiki
If $wgExportAllowHistory is set to false in LocalSettings.php, only the current version can be exported, not the full history.
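
In LocalSettings.php this is a one-line setting:

```php
// Restrict Special:Export to current revisions only.
$wgExportAllowHistory = false;
```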

By default with GET requests, only the current (last) version of each page is returned.

If the $wgExportAllowHistory parameter is true in LocalSettings.php, and the "Include only the current revision, not the full history" checkbox is unchecked, then all versions of each page are returned.

To disable export completely, you need to register a hook callback in your LocalSettings.php:

function removeExportSpecial( &$aSpecialPages ) {
	unset( $aSpecialPages['Export'] );
	return true;
}
$wgHooks['SpecialPage_initList'][] = 'removeExportSpecial';

Keep in mind that exporting is still possible if you have the API enabled.
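
On the older MediaWiki versions this page targets (the API could still be switched off wholesale before MediaWiki 1.32), blocking that route as well looks like:

```php
// Disable api.php entirely, closing the remaining export route.
$wgEnableAPI = false;
```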