Help talk:Export

declaration of element contributor in the DTD is not complete
The declaration of the contributor element does not cover the case where there is neither an id nor an ip, and contributor instead carries a "deleted" attribute. I came across one instance where neither id nor ip was present.

Getting full list of pages
The SQL query select page_title from wiki_page where page_namespace=0 dates from 2007 and no longer works.
 * select page_title from page where page_namespace=0 -- returns blobs, because page_title is stored as a binary column.
 * However, this works for me -- select CONVERT(page_title using utf8) from `page` where page_namespace=0

--GhostInTheMachine (talk) 11:55, 24 December 2015 (UTC)

My road to update a MediaWiki 1.23 with PostgreSQL 8.4.9 to MediaWiki 1.30 with MariaDB 10.1
The ordinary update failed (possibly because the OS was old). Using PostgreSQL for MediaWiki was a mistake, so I wanted to switch to MariaDB. This is how I exported ALL pages (with complete history) from the old wiki and imported them into the new one.

/MikaelLindmark (talk) 15:47, 5 January 2018 (UTC)
 * 1) Read all the steps below before you start!
 * 2) Install MediaWiki 1.30 and MariaDB 10.1 and do the basic setup.
 * 3) Add/change some parameters needed for my export on the "from" wiki (LocalSettings.php)
 * 4) Add/change some parameters needed for my import on the "target" wiki (php.ini)
 * 5) Find the name of every page in every namespace in the wiki. The SQL part (after the -c) should work on "any" SQL. The sed part changes "|" into ":". The split part breaks the output into files of at most 100 wiki page titles each.
 * 6) For each file from the command above, copy the text into Special:Export and export it. Then rename it with the same number as the input file (so you keep track).
 * 7) Import the file with Special:Import.
 * 8) Repeat until all files are exported and imported.
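The sed and split stages of step 5 can be sketched as a shell pipeline. The original command was not preserved, so the psql invocation and SQL below are assumptions (adjust database and column names to your schema); the sample input simply simulates the "namespace|title" output of the query:

```shell
# Hedged sketch of step 5: list "namespace|title" pairs, turn "|" into ":",
# and split the result into files of at most 100 titles each.
# The SQL stage is an assumption and would look roughly like:
#   psql -A -t -d wikidb -c "SELECT ... FROM page ..." > pagelist.txt
# Simulated query output, so the rest of the pipeline is runnable:
printf 'Help|Export\nHelp_talk|Export\n' > pagelist.txt
# Change "|" into ":" and split into chunks of 100 lines (pages.aa, pages.ab, ...):
sed 's/|/:/' pagelist.txt | split -l 100 - pages.
cat pages.aa
# -> Help:Export
# -> Help_talk:Export
```

Each resulting pages.* file can then be pasted into Special:Export as described in step 6.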

Export table made with a template to join data
Good morning,

I created a table with many columns using a template. I used this template to join data from different pages, so the table contains data from many different pages. I only found out how to export each page individually, when I actually want to export the entire table. Does anyone have a solution for me? Thank you in advance. AnaisBce (talk) 07:49, 31 August 2018 (UTC)

Say how to export entire wiki
Document how to get every single page of a wiki in one step using Special:Export. (All pages in every namespace.) Jidanni (talk) 04:43, 5 November 2019 (UTC)
 * Oh. Exporting all the files of a wiki. Jidanni (talk) 04:46, 5 November 2019 (UTC)

Maximum number of titles
Mention the maximum number of titles a user can paste into the box. 100? 1000? No limit? If there is no limit, mention that too. Jidanni (talk) 11:44, 29 November 2019 (UTC)

Feature suggestion: Ability to export edit histories without content
When implemented, this feature should be accessible both through the Special:Export interface and through an HTTP GET URL parameter. 79.241.202.95 18:44, 28 February 2021 (UTC)


 * I was just looking to see whether anyone else had already suggested this. But I have something to add:

I suggest that exports contain edit tags, for example like this:
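(The example originally given here appears to have been lost. The following is a hypothetical sketch of what a revision carrying edit tags might look like in the export XML; the tags/tag element names are invented for illustration and are not part of the current export schema, while the tag values shown are real MediaWiki change tags.)

```xml
<revision>
  <id>123456</id>
  <timestamp>2021-03-30T19:53:00Z</timestamp>
  <!-- hypothetical element, not in the current export DTD/XSD -->
  <tags>
    <tag>mw-undo</tag>
    <tag>mobile edit</tag>
  </tags>
  <text xml:space="preserve">...</text>
</revision>
```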

Marcus Alanius (talk) 19:53, 30 March 2021 (UTC)