Extension talk:Data Transfer

Adding prefix to title?
I'm looking to import a CSV file with at least 1,000 entries in it, but I want them to go into a separate namespace and provide transclusion only. Is there any way to achieve this?


 * What do you mean by transclusion? Yaron Koren (talk) 14:35, 18 October 2019 (UTC)


 * Like this: Transclusion. I don't want the actual page searchable in the wiki, but I want the data available for other pages to use. So I'm creating a custom "Asset" namespace where all the asset data will get dumped and stored in Cargo tables; the actual article pages will then run a query to find their relevant asset data.


 * Oh, I thought the transclusion thing was related to the data transfer part. You just need to add the namespace to every title in the CSV file. There are various ways to do that - one is by editing the data in a spreadsheet, then saving it back to CSV. Within a spreadsheet, you can create a separate column with just the namespace (and colon), then merge that column and the title column into one. Yaron Koren (talk) 16:47, 18 October 2019 (UTC)


 * That's what I was afraid of; I was hoping I could apply something at import time. But I think I can get awk to do what I need it to do. Thanks!
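For anyone landing here later: the prefixing can indeed be scripted before import. A minimal Python sketch, assuming the title sits in a column named "Title" and the target namespace is "Asset" (the sample rows and column names are just placeholders from this thread):

```python
import csv
import io

# A tiny sample standing in for the real CSV export (names are examples):
src = io.StringIO('Title,Model\n"Pump 1","X200"\n"Pump 2","X300"\n')

reader = csv.DictReader(src)
out = io.StringIO()
writer = csv.DictWriter(out, fieldnames=reader.fieldnames, lineterminator="\n")
writer.writeheader()
for row in reader:
    # Prepend the target namespace (and colon) to every title:
    row["Title"] = "Asset:" + row["Title"]
    writer.writerow(row)

print(out.getvalue())
```

With real files you would open the source and destination CSVs instead of the in-memory buffers; the transformation itself is the same.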

error with MW 1.34
/Special:ImportCSV Error from line 54 of ...extensions/DataTransfer/specials/DT_ImportCSV.php: Cannot access private property ImportStreamSource::$mHandle

Backtrace:


 * #0 ...extensions/DataTransfer/specials/DT_ImportCSV.php(29): DTImportCSV->importFromUploadAndModifyPages
 * #1 ...includes/specialpage/SpecialPage.php(575): DTImportCSV->execute
 * #2 ...includes/specialpage/SpecialPageFactory.php(611): SpecialPage->run
 * #3 ...includes/MediaWiki.php(296): MediaWiki\Special\SpecialPageFactory->executePath
 * #4 ...includes/MediaWiki.php(900): MediaWiki->performRequest
 * #5 ...includes/MediaWiki.php(527): MediaWiki->main
 * #6 ...index.php(44): MediaWiki->run
 * #7 {main}


 * Sorry about that - I just checked in what I think is a fix for this. Please let me know if there's still a problem! Yaron Koren (talk) 22:48, 7 January 2020 (UTC)


 * Thanks. I am now able to import the data, but there's an unrelated problem having to do with LF handling. Acnetj (talk) 00:59, 8 January 2020 (UTC)


 * What is the appropriate version to use for 1.34.x? REL1_34 results in v1.0.1 (1fc1c61) 04:42, 20 September 2019 - Revansx (talk) 23:41, 5 May 2020 (UTC)


 * You should never use the REL version of any of my extensions. You should either use the most recent version, or just the latest code. Yaron Koren (talk) 23:44, 5 May 2020 (UTC)


 * roger that .. now the trick is to remember to always check to see if an extension is one of yours. thx - Revansx (talk) 00:24, 6 May 2020 (UTC)


 * It's not just my extensions; it's any extension that has the "master" compatibility policy. Yaron Koren (talk) 02:18, 6 May 2020 (UTC)


 * Gotcha. I just learned something new. cool. - Revansx (talk) 02:59, 6 May 2020 (UTC)

LF handling
With the older version, a CSV field could include a newline (LF) without being parsed as a separate entry (this is used to include multi-line free text for the wiki). The current master (which fixes the error above), however, handles LF as if it were CRLF.

I changed this in DT_ImportCSV.php and the handling is correct:

line 133: 		$table = str_getcsv( $csvString, "\n" );

With

line 133: 		$table = str_getcsv( $csvString, "\r\n" );

it ignored the extra LF in the field. Acnetj (talk) 01:31, 8 January 2020 (UTC)
 * This fix works for me in this particular instance (because I use Ctrl+Enter for extra lines in LibreOffice); I don't know about other instances. I think it should somehow respect the double quotation marks, like it does for commas.

Acnetj (talk) 01:52, 8 January 2020 (UTC)
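To illustrate the quoting point above: a standard CSV parser already treats a bare LF inside a double-quoted field as part of the field, while splitting the raw string on "\n" tears the record apart. A small Python sketch (the sample data is made up):

```python
import csv
import io

# One record whose quoted "Free Text" field contains a bare LF:
raw = 'Title,Free Text\n"Page 1","line one\nline two"\n'

# Naive approach: split the whole string on LF - the quoted field
# is torn into two pieces, as if it were two separate records:
naive = raw.split("\n")
print(len(naive))  # prints 4: header, two torn halves, trailing empty

# Quote-aware parsing keeps the embedded newline inside the field:
rows = list(csv.reader(io.StringIO(raw)))
print(repr(rows[1][1]))  # prints 'line one\nline two'
```

The same principle applies to any CSV reader that honors RFC 4180 quoting, which is presumably what the fixed extension code does as well.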


 * Sorry about that! What a strange bug in PHP. I just checked in code that I think works better. Yaron Koren (talk) 19:35, 8 January 2020 (UTC)


 * I am getting

Unable to construct a valid title from "".

with the latest update. Acnetj (talk) 02:13, 10 January 2020 (UTC)


 * What encoding is the file in, do you know? Yaron Koren (talk) 04:12, 10 January 2020 (UTC)
 * UTF-8. Just a simple file with one line of data plus header. Acnetj (talk) 19:18, 10 January 2020 (UTC)
 * I can't reproduce that problem. Was this exact file working before? Yaron Koren (talk) 20:21, 10 January 2020 (UTC)
 * Sorry, I just found it was a problem on my end: just bad data. Things are working as they should for now. Acnetj (talk) 21:40, 10 January 2020 (UTC)
 * Great, that's a relief! Yaron Koren (talk) 21:43, 10 January 2020 (UTC)

Problem with overwriting fields of the template
MW 1.31, CSV import using "UTF-16 LE with signature" encoding

I am trying to update some content for a specific template.

For example, existing content for page "Acilius":

Or it could look like this:

To update I use this file content (using option "Overwrite only fields contained in the file", other templates also existed in page):

Title,WoodhouseENELnames[Text]
"Acilius","Ἀκύλιος, ὁ."

Result:

(I deliberately added a space in the middle of each entity so that it will not get parsed here.) So, for some strange reason, the thumbnail, link and old text are maintained, and some text is corrupted and/or turned into HTML entities. When I select "Overwrite existing content" or "Append to existing content" no such problems occur, but I cannot do that, as other templates exist in those pages.

I suspect that the "|link=" bit is parsed as an extra field, when in fact it isn't.

I even tried removing first this bit from the content:

And then trying the import. And then I got this error:

Error: the column 0 header, 'ÿþTitle', must be either 'Title', 'Free Text' or of the form 'template_name[field_name]'
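(For what it's worth, the stray ÿþ is the UTF-16 LE byte-order mark, the bytes FF FE, showing through because the file was decoded with a single-byte codec instead of UTF-16. A minimal Python illustration of the effect:)

```python
# "UTF-16 LE with signature" means the file starts with the
# byte-order mark FF FE, followed by two bytes per character:
data = b"\xff\xfe" + "Title,Body\r\n".encode("utf-16-le")

# Decoded with a single-byte codec, the BOM shows up as "ÿþ"
# glued onto the header - exactly the 'ÿþTitle' in the error above:
print(data.decode("latin-1")[:3])  # prints ÿþT

# Decoded as UTF-16, the BOM is consumed and the header is clean:
print(data.decode("utf-16")[:5])   # prints Title
```

So either decoding the file as UTF-16 on the importer's side, or re-saving the CSV as UTF-8 before upload, avoids the mangled column header.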

Edit: I have managed to partly resolve the field corruption on import by commenting out some regexes in DataTransfer\includes\DT_PageStructure.php:

// $page_contents = preg_replace( '//', '&# 123;&# 123;$1&# 125;&# 125;', $page_contents );
// escape out transclusions, and calls like "DEFAULTSORT"
// $page_contents = preg_replace( '//', '&# 123;&# 123;$1&# 125;&# 125;', $page_contents );

But still, I cannot find how to stop breaking the line with the image link from

To

Or when the existing text contains instances like compare They get turned into

compare or even mixed with other words.

For example this text:

leave in the lurch: P. and V. λείπω, λείπειν

Looked like this when imported:

leave in the lurch: προλείπειν, ἀμείβειν (Plat. but rare P.), V. ἐξαμείβειν, ἐκλιμπάνειν.

Also, when importing text that has headers (several equals signs, as for example a 3rd-level header), line breaks are lost: the header joins the text above it and the text gets mangled. For example, when importing something like:

of what kind? P. and V. ποῖος; indirect: P. and V. οἷος, ὁποῖος.
===adjective===
P. and V. πρᾶος, ἤπιος

The result is:

of what kind? P. and V. ποῖος; indirect: P. and V. οἷος, ὁποῖος.===adjective===

So I am guessing that there is some pre-processing going on before the import which is affected by what the currently existing text looks like, which is quite strange.


 * I'm having similar problems. MW 1.33.1 / Data Transfer 1.1.1. Using ImportCSV with the option "Overwrite only fields contained in the file", brackets like { in my existing content get replaced by &# 123; etc. in the output. Commenting out the relevant lines in DataTransfer\includes\DT_PageStructure.php introduces new lines and hence white space. Am I using the feature incorrectly? Is there any workaround? Any comments? --Fehlerx (talk) 17:29, 30 May 2020 (UTC)


 * In the file DT_WikiTemplate.php, line 33: comment out those two lines, and it should take care of the unwanted line breaks. I had the same issue, and it was REALLY annoying. 05:42, 5 October 2020 (UTC)


 * I believe this problem has now finally been fixed in the Data Transfer code. Sorry for the very long delay. Yaron Koren (talk) 17:22, 18 February 2021 (UTC)

Internal error on Special:ImportSpreadsheet in MW 1.34.x
[XrIAjzTNAOBUFE21T0p@MQAAAAo] /test/Special:ImportSpreadsheet PHPExcel_Reader_Exception from line 73 of /opt/htdocs/mediawiki/vendor/phpoffice/phpexcel/Classes/PHPExcel/Reader/Excel2007.php: Could not open for reading! File does not exist.
 * MediaWiki	1.34.1 (b1f6480) 18:15, 30 April 20
 * PHP	7.2.30 (apache2handler)
 * Data Transfer	1.1.1 (1fc1c61) 04:42, 20 September 2019
 * phpoffice/phpexcel	dev-master

Backtrace:

- Revansx (talk) 00:22, 6 May 2020 (UTC)
 * #0 /opt/htdocs/mediawiki/vendor/phpoffice/phpexcel/Classes/PHPExcel/IOFactory.php(281): PHPExcel_Reader_Excel2007->canRead(NULL)
 * #1 /opt/htdocs/mediawiki/vendor/phpoffice/phpexcel/Classes/PHPExcel/IOFactory.php(191): PHPExcel_IOFactory::createReaderForFile(NULL)
 * #2 /opt/htdocs/mediawiki/extensions/DataTransfer/specials/DT_ImportSpreadsheet.php(42): PHPExcel_IOFactory::load(NULL)
 * #3 /opt/htdocs/mediawiki/extensions/DataTransfer/specials/DT_ImportCSV.php(60): DTImportSpreadsheet->importFromFile(ImportStreamSource, NULL, array)
 * #4 /opt/htdocs/mediawiki/extensions/DataTransfer/specials/DT_ImportCSV.php(29): DTImportCSV->importFromUploadAndModifyPages
 * #5 /opt/htdocs/mediawiki/includes/specialpage/SpecialPage.php(575): DTImportCSV->execute(NULL)
 * #6 /opt/htdocs/mediawiki/includes/specialpage/SpecialPageFactory.php(611): SpecialPage->run(NULL)
 * #7 /opt/htdocs/mediawiki/includes/MediaWiki.php(296): MediaWiki\Special\SpecialPageFactory->executePath(Title, RequestContext)
 * #8 /opt/htdocs/mediawiki/includes/MediaWiki.php(900): MediaWiki->performRequest
 * #9 /opt/htdocs/mediawiki/includes/MediaWiki.php(527): MediaWiki->main
 * #10 /opt/htdocs/mediawiki/index.php(44): MediaWiki->run
 * #11 {main}

Here are the pertinent lines from the debug data:

Unstubbing $wgLang on call of $wgLang::_unstub from ParserOptions->__construct

[error] [XrIKvf9myczl4IVZkwmjCgAAAAs] /test/Special:ImportSpreadsheet ErrorException from line 39 of /opt/htdocs/mediawiki/extensions/DataTransfer/specials/DT_ImportSpreadsheet.php: PHP Warning: stream_get_meta_data expects parameter 1 to be resource, object given

[exception] [XrIKvf9myczl4IVZkwmjCgAAAAs] /test/Special:ImportSpreadsheet PHPExcel_Reader_Exception from line 73 of /opt/htdocs/mediawiki/vendor/phpoffice/phpexcel/Classes/PHPExcel/Reader/Excel2007.php: Could not open for reading! File does not exist. - Revansx (talk) 01:01, 6 May 2020 (UTC)

Any plans to switch to phpspreadsheet? (versus phpexcel)

 * MediaWiki	1.34.1 (b1f6480) 18:15, 30 April 20
 * PHP	7.2.30 (apache2handler)
 * Data Transfer	1.1.1 (1fc1c61) 04:42, 20 September 2019
 * phpoffice/phpexcel	dev-master

I upgraded phpexcel to phpspreadsheet, but Data Transfer refused, saying that phpexcel is required; however, all the documentation for phpexcel says that it is obsolete. Is there any talk of getting Data Transfer to use phpspreadsheet soon? - Revansx (talk) 00:22, 6 May 2020 (UTC)


 * The code was updated 5 months ago to include phpspreadsheet. It checks whether one of them (phpspreadsheet or phpexcel) has been installed. Sen-Sai (talk) 12:50, 12 November 2020 (UTC)
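(For reference, on a Composer-managed wiki the maintained replacement library can be pulled in from the wiki's root directory; the invocation below assumes Composer is available on the server:)

```shell
# Install PhpSpreadsheet (the maintained successor to PHPExcel);
# run from the MediaWiki root directory:
composer require phpoffice/phpspreadsheet
```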

Problems with importing code with {{
I wanted to import a bunch of articles that already had the template code in them. I do not want to have to separate them all out into separate items, because I want the template results to be in a list. But when I run "Import XML", I have 6,000 pages and it says that I only have 11 pages. So obviously something is wrong. I wish there were a way to simply do a direct import without messing with any template stuff.


 * I finally figured out how to properly set up the XML file, so it imports properly.

Error Import XML ?
MW 1.35.3, PHP 7.4.21, SMW 3.2.3, Data Transfer 1.2

I can import the CSV file, but I hit the error message below when I import an XML file.

[952b888cb21a4ba6e34ca73a] /demo/index.php/Special:%E5%AF%BC%E5%85%A5XML Error from line 66 of /home/sjkcyuhu/public_html/tbpedia.org/demo/extensions/DataTransfer/includes/specials/DT_ImportXML.php: Cannot access private property DTXMLParser::$mPages

Backtrace:


 * #0 /home/sjkcyuhu/public_html/tbpedia.org/demo/extensions/DataTransfer/includes/specials/DT_ImportXML.php(35): DTImportXML->modifyPages(ImportStreamSource, string, string)
 * #1 /home/sjkcyuhu/public_html/tbpedia.org/demo/includes/specialpage/SpecialPage.php(600): DTImportXML->execute(NULL)
 * #2 /home/sjkcyuhu/public_html/tbpedia.org/demo/includes/specialpage/SpecialPageFactory.php(635): SpecialPage->run(NULL)
 * #3 /home/sjkcyuhu/public_html/tbpedia.org/demo/includes/MediaWiki.php(307): MediaWiki\SpecialPage\SpecialPageFactory->executePath(Title, RequestContext)
 * #4 /home/sjkcyuhu/public_html/tbpedia.org/demo/includes/MediaWiki.php(940): MediaWiki->performRequest
 * #5 /home/sjkcyuhu/public_html/tbpedia.org/demo/includes/MediaWiki.php(543): MediaWiki->main
 * #6 /home/sjkcyuhu/public_html/tbpedia.org/demo/index.php(53): MediaWiki->run
 * #7 /home/sjkcyuhu/public_html/tbpedia.org/demo/index.php(46): wfIndexMain
 * #8 {main}


 * Sorry about that - this was fixed about a week ago. Yaron Koren (talk) 17:32, 22 September 2021 (UTC)

ImportCSV too slow
I'm importing a CSV with 650 rows. After 10 hours it had only imported about 50.

1. What can I do to accelerate this import process?

I have noticed the process goes into the background and keeps creating pages.

2. How can I kill the import process?


 * You can speed things up by running the runJobs.php script in MediaWiki's /maintenance directory. Conversely, if you want to stop the import, go into MediaWiki's "job" database table and delete the rows in that table. Yaron Koren (talk) 17:30, 22 September 2021 (UTC)
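A concrete sketch of both suggestions (the paths assume a default MediaWiki layout, and the database name "my_wiki" and the 'dtImport' job name are assumptions to verify against your own wiki before deleting anything):

```shell
# Speed up the import: process the queued background jobs directly.
# Run from the MediaWiki root directory:
php maintenance/runJobs.php

# Or kill the import: first see which job types are actually queued...
mysql my_wiki -e "SELECT job_cmd, COUNT(*) FROM job GROUP BY job_cmd;"

# ...then delete the import jobs. The job name here is an assumption;
# use whatever name the SELECT above actually showed:
mysql my_wiki -e "DELETE FROM job WHERE job_cmd = 'dtImport';"
```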