Manual:Parameters to Special:Export

Wiki pages can be exported in a special XML format to import into another MediaWiki installation (if this function is enabled on the destination wiki, and the user is a sysop there) or to use elsewhere, for instance for analysing the content. See also Syndication feeds for exporting other information than pages, and Help:Import on importing pages. See Help:Export for more details.

Available parameters for Special:Export


 * pages

A list of page titles, separated by linefeed (%0A) characters.
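As a sketch, a request for two pages at once can be built like this (the page titles are illustrative):

```shell
# Export two pages in one request. The page titles here are illustrative;
# %0A is the URL-encoded linefeed that separates them.
PAGES='Main_Page%0AMediaWiki'
URL="https://en.wikipedia.org/w/index.php?title=Special:Export&pages=${PAGES}"
echo "$URL"
```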


 * action

Unused; set to "submit" in the export form


 * dir

Should be set to "desc" to retrieve revisions in reverse chronological order (newest to oldest).

The default, with this parameter omitted, is to retrieve revisions in ascending order of timestamp (oldest to newest).


 * offset

The timestamp at which to start, which is non-inclusive. The timestamp may be in several formats, including the 14-character format usually used by MediaWiki, and an ISO 8601 format like the one output in the XML dumps.
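As a sketch, the two accepted formats name the same instant; the 14-character MediaWiki form is simply the ISO 8601 form with the separators stripped:

```shell
# Derive the 14-character MediaWiki timestamp from the ISO 8601 form
# by deleting the separator characters; either is accepted as offset.
ISO_OFFSET='2002-01-27T20:25:56Z'
MW_OFFSET=$(printf '%s' "$ISO_OFFSET" | tr -d ':TZ-')
echo "$MW_OFFSET"
```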


 * limit

The maximum number of revisions to return. If you request more than a site-specific maximum (1000 on Wikipedia at present), it will be reduced to this number.

This limit is cumulative across all the pages specified in the pages parameter. For example, if you request a limit of 100, for two pages with 70 revisions each, you will get 70 from one and 30 from the other.

 * addcat catname

Added later: addcat adds all members of the category catname to the list of pages to export.

For example, the following is for all pages in en:Category:Books:

http://en.wikipedia.org/w/index.php?title=Special:Export&addcat&catname=Books&pages=XXXX


 * templates

Includes any transcluded templates on any pages listed for export.

 * curonly or history

Include only the current revision (curonly) or the full history (history). The default is curonly.


 * wpDownload

Save as file: the response is sent as a download rather than displayed in the browser.
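Several of the parameters above can be combined in one request. A hypothetical sketch, assembling a URL that asks for the full history of one (illustrative) page as a file download:

```shell
# Hypothetical sketch: combine the history and wpDownload parameters
# for one illustrative page. POST the result with:
#   curl -d '' -o export.xml "$URL"
BASE='https://en.wikipedia.org/w/index.php?title=Special:Export'
URL="${BASE}&pages=Main_Page&history=1&wpDownload=1&action=submit"
echo "$URL"
```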

URL parameter requests do not work
The dir, offset and limit parameters only work in POST requests; in GET requests through a URL they are ignored.

When you open the URL in a browser, you are submitting via GET; a script or a tool such as cURL can submit via POST instead.

As an example, the following request does not work: it returns all revisions of the page despite the parameter limit=5.

http://en.wikipedia.org/w/index.php?title=Special:Export&pages=XXXX&offset=1&limit=5&action=submit&history

Retrieving the earliest 5 revisions
cURL generates a POST request when the -d option is passed. The following retrieves the earliest 5 revisions of the English Wikipedia main page:

curl -d "" 'http://en.wikipedia.org/w/index.php?title=Special:Export&pages=Main_Page&offset=1&limit=5&action=submit'

And here are the next 5 revisions:

curl -d "" 'http://en.wikipedia.org/w/index.php?title=Special:Export&pages=Main_Page&offset=2002-01-27T20:25:56Z&limit=5&action=submit'

Here the timestamp of the last revision from the previous query is copied into the offset field of the URL. Because the offset parameter is non-inclusive, that 5th revision is not returned again; instead we get revisions 6 to 10.
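The pagination step described above can be sketched in shell: extract the last timestamp from an export response and feed it back as the next offset. Here SAMPLE stands in for the XML that the previous curl request would return:

```shell
# Sketch: pull the last <timestamp> out of an export response so it can
# be passed as the next &offset= value. SAMPLE is a stand-in for the
# XML body that curl would actually return.
SAMPLE='<revision><timestamp>2002-01-27T20:25:56Z</timestamp></revision>
<revision><timestamp>2002-01-28T10:00:00Z</timestamp></revision>'
NEXT_OFFSET=$(printf '%s\n' "$SAMPLE" | grep -o '<timestamp>[^<]*</timestamp>' | tail -n 1 | sed 's/<[^>]*>//g')
echo "$NEXT_OFFSET"
```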

Stopping the export of your MediaWiki
If $wgExportAllowHistory is set to false in LocalSettings.php, only the current version can be exported, not the full history.

By default, only the current (last) version of each page is returned.

If the $wgExportAllowHistory parameter is set to true in LocalSettings.php, and the "Include only the current revision, not the full history" checkbox is unchecked, then all versions of each page are returned.