Manual talk:Importing XML dumps

link tables?
(regarding mwdumper import) I want to avoid the expensive rebuildall.php script. Looking at enwiki/20080724/, I'm wondering - should we import ALL of the SQL dump files, or are there any that should be skipped? --JaGa 00:50, 23 August 2008 (UTC)
 * OK, I went through maintenance/tables.sql and compared what importDump.php populates with what mwdumper populates (only the page, revision, and text tables). I'm thinking this is the list of SQL dumps I'll want after mwdumper finishes:


 * category
 * categorylinks
 * externallinks
 * imagelinks
 * pagelinks
 * redirect
 * templatelinks


 * Thoughts? --JaGa 07:04, 24 August 2008 (UTC)
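If that list is right, the per-table SQL dumps can be loaded straight into MySQL once mwdumper has filled page/revision/text. A rough sketch, assuming the enwiki per-table dump naming; the database name, user, and dump date are placeholders for your own setup:

```shell
# Load the precomputed link tables so rebuildall.php can be skipped.
# Filenames follow the enwiki per-table dump naming; DB name, user,
# and dump date are placeholders.
DB=wikidb
DATE=20080724
for t in category categorylinks externallinks imagelinks pagelinks redirect templatelinks; do
  f="enwiki-${DATE}-${t}.sql.gz"
  [ -f "$f" ] || { echo "skipping missing $f"; continue; }
  gunzip -c "$f" | mysql -u root -p "$DB"
done
```

Note that mysql -p will prompt for the password once per table; put the credentials in ~/.my.cnf to avoid that.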

When I try to import using this command:

 C:\Program Files\xampp\htdocs\mediawiki-1.13.2\maintenance>"C:\Program Files\xampp\php\php.exe" importDump.php C:\Users\Matthew\Downloads\enwiki-20080524-pages-articles.xml.bz2

it fails with this error:

 XML import parse failure at line 1, col 1 (byte 0; "BZh91AY&SYö┌║O☺Ä"): Empty document

What do you think is wrong?
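That "Empty document" failure is importDump.php trying to parse the raw bzip2 stream as XML (the "BZh91AY&SY…" bytes in the error are the bzip2 file header). A sketch of the workaround, assuming a bzip2 binary is on the PATH (under XAMPP on Windows you would use the equivalent 7-Zip or bzip2 tooling) and using the filename from the question:

```shell
# importDump.php is reading compressed bytes ("BZh91AY&SY...") as XML.
# Decompress the dump first, then hand the plain XML to the script.
DUMP=enwiki-20080524-pages-articles.xml.bz2
if [ -f "$DUMP" ]; then
  bzip2 -dk "$DUMP"                      # -d decompress, -k keep the .bz2
  php importDump.php "${DUMP%.bz2}"      # run from the maintenance directory
else
  echo "dump not present; nothing to do"
fi
```

Alternatively, stream it without storing a decompressed copy: `bzcat enwiki-20080524-pages-articles.xml.bz2 | php importDump.php`.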

table prefix
I have a set of wikis, each with a different table prefix. How do I tell importDump.php which wiki to use?


 * Set $wgDBprefix in LocalSettings.php —Emufarmers(T 11:10, 25 February 2009 (UTC)

Importing multiple dumps into same database?
If we try to import multiple dumps into the same database, what happens?

Will it work this way?

For example, if there are two articles with the same title in both databases, what will happen?

Is it possible to import both of them into the same database and distinguish titles with prefixes?

Merging with an existing wiki
How do I merge the dumps with another wiki I've created without overwriting existing pages/articles?

.bz2 files decompressed automatically by importDump.php?
It seems only .gz files, not .bz2, are decompressed on the fly. --Apoc2400 22:40, 18 June 2009 (UTC)


 * Filed as bug 19289. —Emufarmers(T 05:15, 19 June 2009 (UTC)

Add … to the importFromFile function
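Until bug 19289 is fixed, another workaround is to recompress the dump once, since gzip is handled on the fly. A sketch; the filename is illustrative:

```shell
# Only .gz is decompressed on the fly, so convert the .bz2 dump once:
SRC=pages-articles.xml.bz2
if [ -f "$SRC" ]; then
  bzcat "$SRC" | gzip > "${SRC%.bz2}.gz"
  php importDump.php "${SRC%.bz2}.gz"   # run from the maintenance directory
fi
```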

Having trouble with importing XML dumps into database
I have been trying to import one of the latest dumps, pages-articles.xml.bz2 from enwiki/20090604/. I don't want the front end and the other things that come with a MediaWiki installation, so I thought I would just create the database and load the dump. I tried using mwdumper, but it breaks with the following error: 18328. I also tried using mwimport; that failed due to the same problem. Does anyone have any suggestions for importing the dump successfully into the database?

Thanks Srini

Error Importing XML Files
A colleague has exported the Wikipedia help contents, and when attempting to import them I ran into an error. One of the errors had to do with Template:Seealso. The XML that is produced contains a tag which causes the Import.php module to error out. If I remove the offending line from the XML, it imports just fine. We are using 1.14.0. Any thoughts?


 * I am using 1.15, and I get the following errors:


 * Warning: xml_parse [function.xml-parse]: Unable to call handler in_ in /home/content/*/h/s/*hscentral/html/w/includes/Import.php on line 437




 * Warning: xml_parse [function.xml-parse]: Unable to call handler out_ in /home/content/*/h/s/*hscentral/html/w/includes/Import.php on line 437


 * By analyzing which entries kill the script, I found that it is protected redirects: these errors come when a page has both the redirect and the restrictions lines. Manually removing the restrictions line makes it work. I get these errors both from importDump.php and in my browser window on Special:Import when there is a protected redirect in the file. 76.244.158.243 02:55, 30 September 2009 (UTC)

Simple fix: download the updated Import.php from here: http://svn.wikimedia.org/viewvc/mediawiki/trunk/phase3/includes/Import.php?view=co and replace the original file in the /includes directory. Works fine!


 * xml2sql has the same problem:

 xml2sql-0.5/xml2sql -mv commonswiki-latest-pages-articles.xml
 unexpected element
 xml2sql-0.5/xml2sql: parsing aborted at line 10785 pos 16.

212.55.212.99 12:22, 13 February 2010 (UTC)

Error message
The error message I get is "Import failed: Loss of session data. Please try again." Ikip 02:50, 27 December 2009 (UTC)

Fix: I got this error while trying to upload a 10 MB file. After cutting it down into 3.5 MB pieces, each individual file received "The file is bigger than the allowed upload size." error messages. 1.8 MB files worked though. --bhandy 19:24, 16 March 2011 (UTC)
 * THANK YOU! This was driving me mad! LOL But your fix worked. ;) Zasurus 13:00, 5 September 2011 (UTC)

Another Fix: Put the following into your .htaccess file (adjust these figures according to the size of your dump file):

 php_value upload_max_filesize 20M
 php_value post_max_size 20M
 php_value max_execution_time 200
 php_value max_input_time 200

Does NOT allow importing of modified data on my installation
If I export a dump of the current version using dumpBackup.php --current, then make changes to that dumped file, then attempt to import the changed file back into the system using importDump.php, NONE of the changes come through, even after running rebuildall.php.

Running MW 1.15.1, SemanticMediaWiki 1.4.3.

Am I doing something wrong, or is there a serious bug that I need to report? --Fungiblename 14:09, 13 April 2010 (UTC)

And for the necro-bump.... yes, I was doing something wrong.
For anyone else who has run into this problem, you need to delete revision IDs from your XML page dumps if you want to re-import the XML after modifying it. Sorry for not posting this earlier, but this issue was addressed almost instantly as invalid in response to an admittedly invalid bug report that I filed on Bugzilla in 2010: This is exactly how it's supposed to work to keep you from overwriting revisions via XML imports. --Fungiblename 07:57, 21 September 2011 (UTC)
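For anyone scripting this, one way is to delete the <id> lines from the dump before re-importing. A rough sketch with sed, assuming each <id>…</id> element sits on its own line (as dumpBackup.php emits); it also drops contributor IDs, which the importer does not need. The filenames are placeholders:

```shell
# Strip <id> elements so importDump.php treats every revision as new
# instead of silently skipping revision IDs that already exist.
if [ -f dump-current.xml ]; then
  sed '/<id>[0-9][0-9]*<\/id>/d' dump-current.xml > dump-reimport.xml
fi
```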

Error message: PHP Warning: Parameter 3 to parseForum
Two errors: PHP Deprecated: Comments starting with '#' are deprecated in /etc/php5/cli/conf.d/imagick.ini on line 1 in Unknown on line 0

PHP Warning: Parameter 3 to parseForum expected to be a reference, value given in /home/t/public_html/deadwiki.com/public/includes/parser/Parser.php on line 3243

100 (30.59 pages/sec 118.68 revs/sec)

Adamtheclown 05:11, 30 November 2010 (UTC)

XML that does NOT come from a wiki dump
Can this feature be used on an xml file that was not created as, or by, a wiki dump? I am looking for a way to import a lot of text documents at once, that can be wikified later. Your advice, wisdom, insight, etc, greatly appreciated.
 * No: XML is a markup syntax, not a single format, so the MediaWiki XML reader only accepts MediaWiki XML dumps or similarly formatted XML.
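That said, plain text documents can be made importable by wrapping them in a minimal MediaWiki export XML skeleton yourself. A sketch under stated assumptions: the element layout follows the standard dump format, but the schema version, timestamp, and contributor name are placeholders, and the escaping only covers &, <, and >. It creates two throwaway .txt files for demonstration; point the glob at your real documents instead.

```shell
# Two throwaway text documents as stand-ins for the real files:
printf 'Some text with an & in it.' > First_page.txt
printf 'More text to wikify later.' > Second_page.txt

{
  echo '<mediawiki xmlns="http://www.mediawiki.org/xml/export-0.3/">'
  for f in *.txt; do
    title=$(basename "$f" .txt)
    # Minimal escaping: & first, then < and >, so &lt;/&gt; survive intact.
    body=$(sed -e 's/&/\&amp;/g' -e 's/</\&lt;/g' -e 's/>/\&gt;/g' "$f")
    cat <<EOF
  <page>
    <title>${title}</title>
    <revision>
      <timestamp>2012-01-01T00:00:00Z</timestamp>
      <contributor><username>Importer</username></contributor>
      <text>${body}</text>
    </revision>
  </page>
EOF
  done
  echo '</mediawiki>'
} > import.xml
```

The resulting import.xml can then be fed to Special:Import or importDump.php; MediaWiki turns the underscores in the titles back into spaces.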

Altered Display Titles
Hi, I am using MediaWiki 1.18.2 and I have been experimenting with the XML Import process.

I have noticed that the title actually displayed on each page altered by the Import process appears as Special:Import until the page is edited and saved again. I assume this is supposed to indicate that the page was edited by the Import process, but it can be very confusing to less knowledgeable users, and it also means the page RDF links produced by the Semantic MediaWiki process are incorrectly rendered.

I have noticed a similar cosmetic change to the display name after other forms of mass edits, such as the MassEditRegex extension, so I assume this is probably a core MediaWiki behaviour, but I have not been able to find any information about this issue.

I would love to be able to turn this feature off, or perhaps at least be able to hide it for certain groups of users, any help would be greatly appreciated.

Thanks Jpadfield (talk) 11:41, 5 April 2012 (UTC)

No Page error
When I try to import a template XML file from Wikipedia, I receive an error message that says "No page available to import." Any ideas why it won't find the pages in the XML file, and what workarounds are there? 12.232.253.2 15:56, 26 April 2012 (UTC)


 * The first thing to check is that the XML file actually has any page nodes within it. --98.210.170.91 18:49, 5 May 2012 (UTC)