Extension talk:External Data

Fetch data from external RDF-file
As I get it, this is not supported by this extension as of now (Apr. 2020). Would it make sense to add it, or is there some other way to do that?

Scenario
We have a list of Open Hardware projects, each hosted in its own git repository. Each such repository has a meta-data.ttl (RDF/Turtle) file in it. In our MediaWiki, we would like to have a table showing this meta-data on the wiki page for the respective project.

fetchURL error after upgrade
Hello... I recently upgraded MediaWiki to 1.33 and can no longer use #get_web_data. I get errors about a string being passed to Http::get instead of an array. Even using the sample code/URLs from the External Data wiki gets the same error.

[XUNJvirhcjlnb-i73WIgxAAAA@g] /wiki/index.php?title=X&action=submit TypeError from line 98 of /myurl.com/wiki/includes/http/Http.php: Argument 2 passed to Http::get must be of the type array, string given, called in /myurl/wiki/extensions/ExternalData/includes/ED_Utils.php on line 873

Any chance someone has seen this before?


 * Are you using the very latest External Data code? This bug may have just been fixed a few days ago. Yaron Koren (talk) 02:25, 2 August 2019 (UTC)

Yeah. I just updated now to be sure, and am still seeing the same error. I did do a manual update from an older version of MW. Made the necessary updates to the syntax in my LocalSettings file. Perhaps I have missed something?


 * Are you sure you're using the latest ED code? ED_Utils.php doesn't call Http::get any longer - it now calls HttpWithHeaders::get. Yaron Koren (talk) 15:09, 2 August 2019 (UTC)


 * According to Yaron Koren's comments you need to switch that extension to the master branch. This solves the problem in my case: "MW 1.33/REL1_33" with "External Data master" Spas.Z.Spasov (talk) 21:24, 21 November 2019 (UTC).

Populating Wiki Page from CSV spreadsheet
Hi there! So, I run a website with a wiki. … I'm interested in having wiki pages contain an entry populated from a spreadsheet I will be hosting on my own website (Encyclopedia_Greyhawkania_DATA ONLY.csv).

As an example, I'd like it to look similar to this example page, where I've manually hand-written entries from this index-spreadsheet (without calling them from the spreadsheet). It's the bit at the bottom under "Encyclopedia Greyhawkania".

I'd like to call data from the spreadsheet using the as part of the call parameter. Ideally what it'd do is use the to search ColumnA, for example, then return any row (with cells A through D) that has the  in it, and have it in a table or a list of some kind.

Is there any chance I can get some help with how to call the data?


 * Something like this might work, if I understand the question correctly:


 * Yaron Koren (talk) 17:00, 24 July 2019 (UTC)
 * Thank you so much Yaron Koren!!!
 * I used a variation of that, replacing the "! A" with just the direct title of the header (e.g. "Topic"), instead of using the parameter.
 * I got it to generate the table, but with no rows below the headers. I wasn't able to make it work with using . I even tried using a different parameter (e.g. "Sulm"). It simply wasn't populating the table, but the Extension's troubleshooting section says that might have been because of the "$wgTimeOut" setting ... so I updated that in my LocalSettings.php, and sure enough, even with up to 300 seconds, it's giving a "500 error".
 * Here's what I'm using, on the off chance that you might be able to help refine it more.


 * I don't know if you're able to help troubleshoot it, but here's the address for the page I'm using as an example.
 * -IcarusATB (talk) 16:20, 25 July 2019 (UTC)


 * That's strange - I just tried out that exact wikitext on one of my wikis, and it worked out fine, displaying five well-formatted rows (plus the header row). Is it still not working for you? Yaron Koren (talk) 18:44, 26 July 2019 (UTC)
 * First, why would the behaviour of the code change on my end? I haven't changed the code further; therefore, it's still getting the same result. That's really not "strange" at all. What it indicates, to me, is that if you're writing, testing, and checking it on your server and it works fine, there's likely something on your server that isn't on mine. Probably an extension, addon, plugin, whatever, that I don't know to install, or something else, other than the code, that's making it behave differently.
 * IcarusATB (talk) 01:49, 27 July 2019 (UTC)


 * I thought maybe the CSV data was temporarily down or something for you, but it was working when I tried it. I still think it's strange. Yes, we have different setups, but I can't think of any reason why we'd be seeing these different results, assuming you're using the latest External Data code. Yaron Koren (talk) 18:04, 29 July 2019 (UTC)
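For later readers: the exact snippets discussed above were lost from this page, but a generic #get_web_data call against a hosted CSV, filtered via the extension's "filters" parameter and rendered with #for_external_table, might look roughly like this. The URL and column names here are hypothetical placeholders, not the actual ones from this thread; "Sulm" is the example value mentioned above.

```wikitext
{{#get_web_data:
 url=https://example.org/Encyclopedia_Greyhawkania.csv
 |format=CSV with header
 |data=topic=Topic,source=Source,page=Page
 |filters=Topic=Sulm
}}
{| class="wikitable"
! Topic !! Source !! Page
{{#for_external_table:<nowiki/>
{{!}}-
{{!}} {{{topic}}} {{!}}{{!}} {{{source}}} {{!}}{{!}} {{{page}}}
}}
|}
```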

Get template data from another wiki
Hello, Can I use this extension to do a semantic query from another site?

If I have a page "Foo" on SiteA that contains a template "Foo" as: and I want to create a page on SiteB called "Foo Status" that contains a template "Foo" as:

is this possible?


 * Yes - if the source wiki stores its data via either SMW or Cargo, both of those extensions provide an API of sorts to let you get query results in either CSV or JSON formats, either of which is parseable by External Data. Yaron Koren (talk) 18:46, 17 April 2019 (UTC)


 * Thanks, Yaron. Any chance you could point me to an SMW example of this? :-)


 * I can't think of one. Yaron Koren (talk) 02:18, 18 April 2019 (UTC)
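For what it's worth, one shape such a cross-wiki call could take, assuming the source wiki uses Cargo and a hypothetical "Projects" table exported as CSV via Special:CargoExport (an SMW wiki's Special:Ask CSV export can be consumed the same way):

```wikitext
{{#get_web_data:
 url=https://sitea.example.org/wiki/Special:CargoExport?tables=Projects&fields=_pageName,Status&format=csv
 |format=CSV with header
 |data=page=_pageName,status=Status
}}
Status of {{#external_value:page}}: {{#external_value:status}}
```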

prerequisites for getting LDAP data
Hi.

I'm trying to use the #get_ldap_data: function of "External Data" to get LDAP attributes about my users and I'm getting the following error: Fatal error: Call to undefined function ldap_connect in /opt/htdocs/mediawiki/extensions/ExternalData/ED_Utils.php on line 136

A quick "grep -R ldap_connect" in the "extensions/External Data" folder shows only a use-call to ldap_connect and nothing anywhere actually defining it.

A quick search online of "ldap_connect" seems to indicate that it is defined in the PHP module "PHP-LDAP"

A quick inspection of my phpinfo page shows that the php-ldap module is not loaded.

Before I install the php-ldap module on my system, can someone confirm that php-ldap is indeed a pre-requisite for the #get_ldap_data: function of "External Data" to work. Alex Mashin (talk) 18:03, 1 August 2020 (UTC)
 * I confirm it.

[Solved] Poor man's Sync from WikiA to WikiB
Hi.

Can I use this extension as a real-time clone of a page in one wiki to another?

For example, Mediawiki Site A has a page called "Foo" with arbitrary text.

Can I create a page called "Foo" on another wiki that contains something like:

and functionally get a copy of the page Foo on Site A?


 * Well, I'm pretty sure you can get the right wikitext - after all, a single piece of text is valid CSV. Whether it'll display correctly is a different story - template calls won't work, for instance, unless you have a local copy of those templates. It's probably easier to just do an iframe, using one of the iframe-supporting extensions, with "action=render" on the source URL to leave out the skin. Yaron Koren (talk) 02:25, 18 April 2019 (UTC)


 * The ideal way is also the poor man's way: set up a PyWikiBot (it's free) and use the [https://www.mediawiki.org/wiki/Manual:Pywikibot/transferbot.py transferbot.py] script. - Revansx (talk) 17:10, 13 May 2020 (UTC)


 * That would actually move over the contents, which is different. Having two different copies of the content, both of which can be edited, is not necessarily ideal. Yaron Koren (talk) 17:31, 13 May 2020 (UTC)


 * Well, if a cron job was used to run the PWB script such that WikiB was always "synched" with what WikiA had, then it would meet the need of this topic as it was originally expressed when I wrote it. - Revansx (talk) 23:45, 13 May 2020 (UTC)

Nested SQL functions throw an rdbms error



 * Per an email with Yaron, "There's no special handling of replace or any other commands."
 * This would lead me to think the below syntax should work... but it does not.


 * Does anyone have any suggestions or corrections to the code I'm trying to get working? Much appreciated if you do.

Unable to perform nested SQL REPLACE functions
I am attempting to query data from a local database, but need to replace 3 different characters with alternatives that won't trigger mediawiki parsing of the result. No matter the format I have tried, I receive the error pictured to the right when trying to display the page.

This snippet works, but only replaces one of the three characters necessary:

I need the below SQL code to work, but it creates a parse error whenever run inside a MediaWiki page:

or with HTML codes in place of characters:

and in any combination of the character/HTML code I could think of. I even tried variations using the MediaWiki replace command like this:

and

Same error, different SQL commands
In a very similar usage scenario, I would like to capitalize the first letter of the string returned, and make sure the rest of the letters are lowercase. Unfortunately, I must also replace all '_' with a space, as in my above issue. Each of these examples works separately, but once they are nested together, in any combination, I get the same error as in my image above:

Eventually, this is the SQL I would like to run, but if it's too much for the extension, I will work on finding a different solution: or

Thanks for any help provided!

Get nested JSON data
If an API returns a non-flat JSON structure, it seems like it's not possible to access the deeper data. Or is it? Sophivorus (talk) 03:27, 28 June 2019 (UTC)

Add http headers to get_web_data
We need to get data from an API with an authentication header, produced dynamically. I have read this answer, which is not helping - this is not a SOAP API.

To solve it, I inserted a hook mechanism into the get_web_data logic, so I can alter data before the request. But now I have another problem - the Http class has no way to add headers to the request. I needed to extend it just for this need. (And to redeclare also its get and post methods, because the original uses Http::request instead of self::request.)

So the full solution patch is here in Gerrit - allowing hooking the call ($url and $options passed by reference) and inserting additional headers via $options['headers']. I think that the HttpWithHeaders class is overkill - this feature would be better in the core Http class - but that's a start.

Getting file data from UNC path?
Hello, this is working great with local files, but I would like to use a JSON file stored on a network drive. I have tried using a mapped drive (which I did not expect to work anyway), but I had hoped I could access it via \\domain\sharedfolder\file.json defined in LocalSettings. I tried with both forward and backward slashes, but always get Error: No file found. I hope I am just doing something syntactically wrong. I appreciate the help!


 * I don't know - the key is that the server can access that path, not just your computer. Could that be the issue? Yaron Koren (talk) 19:22, 6 August 2019 (UTC)

Yes, it is accessible from the server. Is there a particular syntax to use for UNC paths? Thank you!


 * I don't know - External Data is using the PHP file_get_contents function, so based on what it says here, the path you entered should work. Yaron Koren (talk) 17:28, 7 August 2019 (UTC)

Stale cache
If the function manages to fetch data and then, when the cache expires, the data source is gone, will it return the stale data from the cache?

What about "Stored Procedures" in SQL systems?
Hello. There is an SQL application on my network I'm trying to get data from using "Extension:External Data", and the SQL app owner has told me that they can make a "Stored Procedure" available to me rather than grant me "read" access to the SQL database itself. Can someone shed some light on how this could be done using this extension? What should I ask for? Are SQL "Stored Procedures" the SQL implementation of SOAP? Can someone please offer some insight on how to do this, if it's possible at all. Thanks!


 * You can do it - you just need to create a "mini-API", i.e. a script in some language like PHP or anything else, that calls that stored procedure, possibly passing in to it some values from the query string, and then output the results on the screen in CSV or JSON. External Data can then access that API via #get_web_data, and make use of the results. Yaron Koren (talk) 13:23, 6 September 2019 (UTC)


 * Thanks, Yaron. I'll pursue this approach.
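In other words, the wiki side of the "mini-API" approach stays an ordinary #get_web_data call; only the endpoint is new. A sketch, where stored-proc-api.php is a hypothetical script (not part of this thread) that calls the stored procedure and prints CSV with a header row:

```wikitext
{{#get_web_data:
 url=https://intranet.example.org/stored-proc-api.php?id={{{id|}}}
 |format=CSV with header
 |data=name=name,value=value
}}
```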

Use to populate a template in a for loop?
First off, amazing work on this plugin. I really like it! However, would it be at all possible to use #for_external_table or similar to populate a template? I.e. something like:

That would totally make my day. I run an RPG system where it's handy to have information in a database for the various apps that access it (rather than scraping it from the wiki), and this would certainly save on queries per page load for those pages that list skills or objects.


 * I'm glad you like it! Yes, you can do that using #display_external_table. Yaron Koren (talk) 00:34, 22 September 2019 (UTC)

Argh how did I miss that XD Thank you!
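For reference, the #display_external_table pattern Yaron mentions generally looks like this; the URL, column names, and template name here are made up for illustration:

```wikitext
{{#get_web_data:
 url=https://example.org/skills.csv
 |format=CSV with header
 |data=name=Name,cost=Cost
}}
{{#display_external_table:
 template=Skill row
 |data=name=name,cost=cost
}}
```

The named template (here, a hypothetical Template:Skill row) is then called once per row, receiving {{{name}}} and {{{cost}}} as parameters.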

TypeError
Just to let you know: TypeError from line 98 of /w/includes/http/Http.php: Argument 2 passed to Http::get must be of the type array, string given, called in /w/extensions/ExternalData/includes/ED_Utils.php on line 873
 * MediaWiki: 1.33.0
 * PHP: 7.3.10
 * External Data: 1.9.1

Backtrace:

Jaider msg 14:48, 5 October 2019 (UTC)
 * #0 /w/extensions/ExternalData/includes/ED_Utils.php(873): Http::get(string, string, array)
 * #1 /w/extensions/ExternalData/includes/ED_Utils.php(976): EDUtils::fetchURL(string, string, integer)
 * #2 /w/extensions/ExternalData/includes/ED_ParserFunctions.php(133): EDUtils::getDataFromURL(string, string, array, string, integer, integer)
 * #3 /w/includes/parser/Parser.php(3528): EDParserFunctions::doGetWebData(Parser, string, string, string)
 * #4 /w/includes/parser/Parser.php(3235): Parser->callParserFunction(PPTemplateFrame_DOM, string, array)
 * #5 /w/includes/parser/Preprocessor_DOM.php(1285): Parser->braceSubstitution(array, PPTemplateFrame_DOM)
 * #6 /w/includes/parser/Parser.php(3409): PPFrame_DOM->expand(DOMElement)
 * #7 /w/includes/parser/Preprocessor_DOM.php(1285): Parser->braceSubstitution(array, PPFrame_DOM)
 * #8 /w/includes/parser/Parser.php(3049): PPFrame_DOM->expand(DOMElement, integer)
 * #9 /w/includes/parser/Parser.php(1359): Parser->replaceVariables(string)
 * #10 /w/includes/parser/Parser.php(491): Parser->internalParse(string)
 * #11 /w/extensions/PageForms/specials/PF_RunQuery.php(98): Parser->parse(string, Title, ParserOptions, boolean, boolean)
 * #12 /w/extensions/PageForms/specials/PF_RunQuery.php(26): PFRunQuery->printPage(string, boolean)
 * #13 /w/includes/specialpage/SpecialPage.php(569): PFRunQuery->execute(string)
 * #14 /w/includes/specialpage/SpecialPageFactory.php(558): SpecialPage->run(string)
 * #15 /w/includes/MediaWiki.php(288): MediaWiki\Special\SpecialPageFactory->executePath(Title, RequestContext)
 * #16 /w/includes/MediaWiki.php(865): MediaWiki->performRequest
 * #17 /w/includes/MediaWiki.php(515): MediaWiki->main
 * #18 /w/index.php(42): MediaWiki->run
 * #19 {main}


 * Sorry, External Data is well overdue for a new version. This problem may have been fixed already - could you try running the latest External Data code to see if the problem is still there? Yaron Koren (talk) 17:16, 6 October 2019 (UTC)


 * Yes, I have just checked out master now and I confirm it is fixed. Thanks. Jaider msg 17:35, 6 October 2019 (UTC)


 * Great. And I'll try to release a new version soon. Yaron Koren (talk) 01:21, 7 October 2019 (UTC)

MW Version 1.33.1 Upgrade
Hi,

I too have just attempted to upgrade MediaWiki from 1.27.5 to 1.33.1. I rely heavily on External Data (using the development master after 1.33.1; I have tried REL_1.33 too) to populate my pages, and would like to get my site back up and running.

As an example, when I use a page that queries MySQL (version 5.6.22), running on Apache (version 2.4) and PHP (version 7.2.21), I get overlapping tracebacks on the page. Below is the listing from the Apache log file.

I am not sure how to provide the relevant information to debug.

Any suggestions?

Thanks,

Gregg

more MW.log
[218636] PHP Warning: A non-numeric value encountered in /.../mediawiki/1.33.1/includes/libs/rdbms/database/Database.php on line 304, referer: .../mediawiki/1.33.1/index.php?title=Special:UserLogin&returnto=Special%3ARunQuery%2FSystem+Table&returntoquery=pfRunQueryFormName%3DSystem%2BTable%26System_Table%255BApproach%255D%3DStandard%26System_Table%255BPlayer%255D%3DAnyone%26System_Table%255BPartner%255D%3DAnyone%26wpRunQuery%3DDisplay%2BSystems%26pf_free_text%3D
[239131] PHP Stack trace: (same referer)
[239167] PHP  1. {main} /.../mediawiki/1.33.1/index.php:0 (same referer)
[239186] PHP  2. MediaWiki->run /.../mediawiki/1.33.1/index.php:42 (same referer)

[removed the rest]


 * I think that's just a single error message ("A non-numeric value encountered"). What code do you have on line 304 of /includes/libs/rdbms/database/Database.php? Yaron Koren (talk) 14:27, 14 October 2019 (UTC)

Hi Yaron, Here is what I see from the standard MW 1.33.1 installation: 304:     if ( $this->flags & self::DBO_DEFAULT )

Here is a bit longer code snippet:

/**
 * @note exceptions for missing libraries/drivers should be thrown in initConnection
 * @param array $params Parameters passed from Database::factory
 */
protected function __construct( array $params ) {
	foreach ( [ 'host', 'user', 'password', 'dbname', 'schema', 'tablePrefix' ] as $name ) {
		$this->connectionParams[$name] = $params[$name];
	}

	$this->cliMode = $params['cliMode'];
	// Agent name is added to SQL queries in a comment, so make sure it can't break out
	$this->agent = str_replace( '/', '-', $params['agent'] );

	$this->flags = $params['flags'];
304:	if ( $this->flags & self::DBO_DEFAULT ) {
		if ( $this->cliMode ) {
			$this->flags &= ~self::DBO_TRX;
		} else {
			$this->flags |= self::DBO_TRX;
		}
	}
	// Disregard deprecated DBO_IGNORE flag (T189999)
	$this->flags &= ~self::DBO_IGNORE;

	$this->sessionVars = $params['variables'];

	$this->srvCache = $params['srvCache'] ?? new HashBagOStuff;

	$this->profiler = is_callable( $params['profiler'] ) ? $params['profiler'] : null;
	$this->trxProfiler = $params['trxProfiler'];
	$this->connLogger = $params['connLogger'];
	$this->queryLogger = $params['queryLogger'];
	$this->errorLogger = $params['errorLogger'];
	$this->deprecationLogger = $params['deprecationLogger'];

	if ( isset( $params['nonNativeInsertSelectBatchSize'] ) ) {
		$this->nonNativeInsertSelectBatchSize = $params['nonNativeInsertSelectBatchSize'];
	}

	// Set initial dummy domain until open sets the final DB/prefix
	$this->currentDomain = new DatabaseDomain(
		$params['dbname'] != '' ? $params['dbname'] : null,
		$params['schema'] != '' ? $params['schema'] : null,
		$params['tablePrefix']
	);
}

Hope that helps and let me know if you need anything else.

Thanks again

Gregg


 * Okay, thanks. Are you modifying the value of $edgDBFlags in your LocalSettings.php file? Yaron Koren (talk) 16:47, 15 October 2019 (UTC)

-- Aha. Yes I am

$edgDBFlags['bdb'] = "DBO_DEFAULT";

Should I remove this statement?

Gregg

--

Hi Yaron,

Removing this statement seems to fix this problem!

I have different problems now, but at first glance, I am back in business for the most part.

Thanks,

Gregg


 * Great! That particular statement looks unnecessary (DBO_DEFAULT is already the default), though if you do want to have it, I think you should remove the quotes around DBO_DEFAULT. Yaron Koren (talk) 19:31, 15 October 2019 (UTC)

TAB delimited CSV files
I have a TAB delimited text data file on my server that I have configured the "External Data" extension to be able to read.

I have proven that the  argument works with   by replacing all my tab literals with the   char such that   actually works well.

My question is: how can I express a TAB as a delimiter, as that is the way the data file is being generated?

Thank you! /Rich


 * That's a good question, and I'm surprised that this never came up before, given that TSV (tab-separated values) is a somewhat popular data format. There was no way to handle this, as far as I know - I just checked in a way to do this, so now, if you have "delimiter=\t", it will handle tabs, because the "\t" will be interpreted as an actual tab. Yaron Koren (talk) 03:04, 29 October 2019 (UTC)
 * Thanks, Yaron! You rock. --- Rich
 * Unfortunately, after manually making the changes identified in your commit [1] to my v1.8.3 version of ED, it was not successful :/ .. The affected parts of "ED_ParserFunctions.php" were identical to your commit. Any idea why this isn't working for me?
 * line 103 as  works better for me. just fyi

[1] https://github.com/wikimedia/mediawiki-extensions-ExternalData/commit/33eed80a2100c4922387da91033e799c144f4618


 * Sorry about that - that always seems to happen when I don't test my changes! I checked in what I think is a fix for this. Yaron Koren (talk) 18:35, 29 October 2019 (UTC)


 * Hi Yaron, apart from the  function, could you also make the change within the   function? Thanks! --Platinops (talk) 18:02, 3 December 2019 (UTC)


 * Good idea - I think I forgot about #get_file_data... I just checked in that change. Yaron Koren (talk) 14:39, 4 December 2019 (UTC)
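Putting this thread together, a tab-delimited fetch after the fix would presumably look something like this; the file key and column names are hypothetical, and the file key is assumed to be registered via $edgFilePath in LocalSettings.php:

```wikitext
{{#get_file_data:
 file=XYZ
 |format=CSV with header
 |delimiter=\t
 |data=first=Header1,second=Header2
}}
```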

#get_web_data question
What am I doing wrong?

My wiki page isn't passing data to my php page via POST process.

Here's the code in the page itself:

And here's the code in "program_title_count.php" that is supposed to get the POSTed data:

	$whichOffice = $_POST["office"];
	if ( empty( $whichOffice ) ) {
		$whichOffice = "03808";
	}
	// Note: $whichOffice is interpolated directly into the SQL string here
	$query1 = "SELECT title, COUNT(*) AS titleCount FROM list_personnel INNER JOIN affiliation_program ON list_personnel.id_list_personnel = affiliation_program.id_list_personnel WHERE program_number = '$whichOffice' GROUP BY title";

It runs the query, but uses the default value for $whichOffice (03808) rather than "03809".

What am I missing?

Sorry, I'm not an IT professional; please forgive me if I'm missing something obvious.


 * What versions of External Data and MediaWiki are you running? Yaron Koren (talk) 18:48, 15 November 2019 (UTC)
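For comparison, External Data's #get_web_data supports a "post data" parameter for sending a POST body; the wiki-side call would presumably look something like this (the URL is a placeholder; the result columns match the query above):

```wikitext
{{#get_web_data:
 url=https://example.org/program_title_count.php
 |format=CSV with header
 |post data=office=03809
 |data=title=title,count=titleCount
}}
```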

Lua modules
Hi. Is there a way to use this extension through a Scribunto module? I would like to get the raw file and use my own functions to process it. Tinker Bell (talk) 23:47, 1 December 2019 (UTC)


 * Not entirely sure what you mean here, but this details how to call parser functions from modules: Extension:Scribunto/Lua reference manual DSquirrelGM 𝓣𝓟𝓒 00:12, 2 December 2019 (UTC)
 * No, DSquirrelGM, I just want to get a JSON file from a webserver, and process it with a function I wrote in Lua. And using callParserFunction won't work because it only generates a strip marker that can't be used by Scribunto. Tinker Bell (talk) 06:25, 6 December 2019 (UTC)

Alex Mashin (talk) 03:12, 31 July 2020 (UTC)
 * Now there is a way.

Can I use this extension to upload CSV/XML files and have MediaWiki turn them into wiki tables?
Hi, unfortunately the example you provide in the extension is not working. Will this extension allow me to convert a CSV file uploaded to my MediaWiki into a wiki table that can be used in a page? Thanks MavropaliasG (talk) 15:11, 2 December 2019 (UTC)


 * Hello, MavropaliasG, your code works as expected. I've tested it on my private wiki; here is the result: https://i.stack.imgur.com/EOhqG.png IMO, the problem in your case is the version of Extension:ExternalData; try to use the   branch instead of   for the extension. This solves a very similar problem with my MW 1.33. Regards. Spas.Z.Spasov (talk) 15:38, 2 December 2019 (UTC)

Hi, thanks for the reply. I was talking about the example given on the extension page in that box (under Download); it links to a MediaWiki page which returns an error. Anyway, I wanted to know if this extension allows me to convert an uploaded CSV file in my MediaWiki to a wiki table that can be used in a page? Thank you MavropaliasG (talk) 15:53, 2 December 2019 (UTC)


 * Yes. The error you're seeing on discoursedb.org is due to a temporary bug in another extension. Yaron Koren (talk) 17:13, 2 December 2019 (UTC)
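To make the "CSV to wiki table" answer concrete: the usual pattern is #get_web_data plus #for_external_table, roughly as below. The file URL and column headers are placeholders; the assumption is that an uploaded file's direct /images/ URL is reachable as the url parameter.

```wikitext
{{#get_web_data:
 url=https://mywiki.example.org/images/a/ab/Data.csv
 |format=CSV with header
 |data=topic=Topic,page=Page
}}
{| class="wikitable"
! Topic !! Page
{{#for_external_table:<nowiki/>
{{!}}-
{{!}} {{{topic}}} {{!}}{{!}} {{{page}}}
}}
|}
```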

php warning on rebuildall.php
Hi, I'm seeing a php warning upon deploying here. The error looks like this:

My versions are:


 * MW 1.34.0
 * Running on debian 10.2
 * php version 7.3.14
 * ED from git, and the 1_34 branch checked out

The workflow leading up to this is essentially, install MW, composer install some extensions (SMW etc.), then in order:


 * update.php --quick
 * importDump.php --no-updates (about 6000 XML dumped pages)
 * rebuildall.php (the warning occurs here 3 times)

This warning doesn't appear in the 1.31.6 LTS.

Any thoughts?

--JosefAssad (talk) 09:21, 19 February 2020 (UTC)


 * I don't know why it's happening for one MediaWiki version and not another, but I'm guessing you can ignore that warning. Are you seeing any actual problems? Yaron Koren (talk) 15:51, 19 February 2020 (UTC)

Nope, no obvious errors, but to be honest I haven't explicitly tested ED yet. :) I have been assuming it's just PHP being chatty; will update here if I see something obvious. --JosefAssad (talk) 08:15, 24 February 2020 (UTC)

rebuildData.php doesn't seem to work from cron for get_web_data [solved]
We're using ED (1.9) more and more to query data and store it in articles, and have been using the rebuildData script to keep things fresh. Some server environment variables were adjusted yesterday, which has caused some really weird behavior when it comes to the cron jobs. I suspect it's some kind of permissions piece that has gone bad, but I can't figure out why an article save would work fine and the cron (running as the same user Apache does, www-data) would not. #get_file_data updates work totally fine. Is there a nuance in the extension or environment I should be looking for?
 * When a user views an article that uses #get_web_data, everything fetches and renders as expected.
 * When a user saves an article, the #store_external_table function works as expected, storing SMW subobjects.
 * If I run rebuildData with sudo, the objects are stored. (without sudo, they do NOT get fetched/stored)
 * When the cron runs, the subobjects disappear (as though the #get_web_data failed)

Thanks!

Lbillett (talk) 17:01, 4 March 2020 (UTC)
 * Yah, so this was all my fault. The changes we made seemed to cause some obsolete environment variables to get picked up from /etc/bash.bashrc and /etc/wgetrc. While I don't know why some were set and others not depending on how it was being run (cron vs. page save), it seems clear this was the issue. All perfect now. - Lbillett (talk) 19:40, 19 March 2020 (UTC)


 * That's great to hear! Yaron Koren (talk) 20:35, 19 March 2020 (UTC)

function fetchURL: options for HTTP request differ if caching is used
Hi Yaron,

I am using External Data to fetch some JSON data from an asset management software. This worked great until I started using caching by adding $edgCacheTable = 'ed_url_cache'; to my settings. After that I only saw Error: No contents found at URL https://... on the respective pages. Looking at the code in includes/EDUtils.php, I saw that you are using different options for the HTTP request when caching is used (line 914 of that file in the master branch) than when it is not (lines 880 ff. in the master branch). After changing the options for the caching case to the same ones as for the non-caching case, I was able to get rid of that error.

Is there a reason you a using different options? Am I missing something?

Regards, HermannSchwärzler (talk)


 * Sorry about that. No good reason. I just checked in what I think is a fix for this - hopefully the new code works better for both you and the person below. Yaron Koren (talk) 01:26, 6 March 2020 (UTC)


 * Thank you, Yaron! I looked at your commit and it's exactly the change that I would have made to fix this problem. :-) I have to find some time to test this in my setup; maybe the Wuestenarchitekten are faster in testing it...

Class 'LoggerFactory' not found
I'm trying to update our wiki to this:

I get the following error:

I then found this and upgraded ExternalData to  but now get:

The mentioned URL does return a list as expected.

My Template that is using ExternalData has the following:

Any hints on what I should do? Thx! --Wuestenarchitekten (talk) 19:49, 5 March 2020 (UTC)


 * It sounds like you're seeing the same issue as the person above. Do you have $edgCacheTable set to some value? Yaron Koren (talk) 19:51, 5 March 2020 (UTC)
 * Ah - I didn't even see that - yes, same setting  - Do I need to create that table again?--Wuestenarchitekten (talk) 20:00, 5 March 2020 (UTC)


 * No. The code needs to be fixed; until that happens, you should probably just comment out that setting. Yaron Koren (talk) 20:04, 5 March 2020 (UTC)

get_db_data with special character get corrupted
Hello, I am trying to update my wiki and change the connection for External Data from MSSQL to MySQL (MSSQL is not supported in this new wiki version). I have a problem with the get_db_data function. One column has a special character in its name: ä. As you can see in the error message, this gets converted to \xC3\xA4. If I remove this column, everything works fine, so the SQL statement itself gets corrupted. The returned data also contains special characters like ä, ö, ü, and these come through correctly in the results. (I have shortened the result message and removed some internal information.)

My Installation/Configuration:
 * MediaWiki: 1.34.0
 * PHP: 7.4.5 (apache2handler)
 * MySQL: 8.0.19
 * Elasticsearch: 6.5.4

--TomyLee (talk) 11:23, 28 April 2020 (UTC)

delimiter=\t (TAB) not working after upgrade to 2.0 on MW 1.34.1
Using ED 1.9 on MW 1.31, I had a wiki page that was reading a locally hosted tab-separated values text file with the following code:

LocalSettings.php $edgFilePath['XYZ'] = "/opt/htdocs/XYZ.txt"; with the wiki page as:
 * 1) File XYZ.txt is a "tab-separated values" file with n rows of m tab-separated columns

I upgraded my site to MW 1.34.1 and ED 2.0, and now #get_file_data: produces the following error:

Notice: Undefined variable: regex in "/opt/htdocs/mediawiki/extensions/ExternalData/includes/ED_ParserFunctions.php" on line 229

and my #external_value: statements generate the error:

Notice: Undefined offset: 0 in /opt/htdocs/mediawiki/extensions/ExternalData/includes/ED_ParserFunctions.php on line 442

Any thoughts on what might have gone wrong? -- Revansx (talk) 00:02, 2 May 2020 (UTC)


 * I don't know, but there's a more recent version of External Data, 2.0.1 - you should try upgrading to that. Yaron Koren (talk) 23:10, 3 May 2020 (UTC)
 * Not working on ED 2.0.1 either - same issue. Have you tested 2.0.1 with a tab-delimited file (i.e. |delimiter=\t)? - Revansx (talk) 23:04, 12 May 2020 (UTC)

UPDATE - This fixed it:

 diff --git a/includes/ED_ParserFunctions.php b/includes/ED_ParserFunctions.php
 index 0cc9886..3b5ee12 100644
 --- a/includes/ED_ParserFunctions.php
 +++ b/includes/ED_ParserFunctions.php
 @@ -190,6 +190,12 @@ class EDParserFunctions {
  	} else {
  		$format = '';
  	}
 +
 +	$regex = $format === 'text' && array_key_exists( 'regex', $args )
 +		? html_entity_decode( $args['regex'] )
 +		: null;
 +
  	if ( $format == 'xml' ) {
  		if ( array_key_exists( 'use xpath', $args ) ) {
  			// Somewhat of a hack - store the fact that

Urlencoding encodes space to + instead of %20
This very useful function encodes spaces as "+"; it should instead encode them as "%20".
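For comparison, the two behaviours correspond to two different standards: "+" comes from the application/x-www-form-urlencoded convention for query strings, while "%20" is RFC 3986 percent-encoding. Python's standard library exposes both (a sketch for illustration only, unrelated to ED's PHP internals):

```python
from urllib.parse import quote, quote_plus

# application/x-www-form-urlencoded style: spaces become "+"
print(quote_plus("external data"))  # external+data

# RFC 3986 percent-encoding: spaces become "%20"
print(quote("external data"))       # external%20data
```

PHP has the same split: urlencode() produces "+" and rawurlencode() produces "%20", so a fix on the extension side would presumably mean switching to the raw variant.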


 * Does this lead to problems? Yaron Koren (talk) 13:41, 12 May 2020 (UTC)

Undefined variable regex in ED_ParserFunctions.php after upgrade to mw 1.34.1 and ED 2.0.1
I was successfully using External Data on my MW 1.31 wiki to read a local file on my wiki server.

After I upgraded to MW 1.34.1 and ED 2.0.1, my  call no longer works and I see this error: Notice: Undefined variable: regex in /opt/htdocs/mediawiki/extensions/ExternalData/includes/ED_ParserFunctions.php on line 233. Exact same wiki page code, exact same tab-delimited file. I switched to a true CSV and it works; tab-delimited causes the error. - Revansx (talk) 22:05, 12 May 2020 (UTC)

UPDATE - This fixed it:

 diff --git a/includes/ED_ParserFunctions.php b/includes/ED_ParserFunctions.php
 index 0cc9886..3b5ee12 100644
 --- a/includes/ED_ParserFunctions.php
 +++ b/includes/ED_ParserFunctions.php
 @@ -190,6 +190,12 @@ class EDParserFunctions {
  	} else {
  		$format = '';
  	}
 +
 +	$regex = $format === 'text' && array_key_exists( 'regex', $args )
 +		? html_entity_decode( $args['regex'] )
 +		: null;
 +
  	if ( $format == 'xml' ) {
  		if ( array_key_exists( 'use xpath', $args ) ) {
  			// Somewhat of a hack - store the fact that


 * Sure, that would fix the "undefined variable" problem - but did that also fix the problem of reading from the tab-delimited file? That would be surprising. Yaron Koren (talk) 23:28, 12 May 2020 (UTC)
 * So far so good - I think it was the same problem. The regex wasn't defined in the "doGetFileData" function at all. I don't know why, but true CSV was working and tab-separated was not. I added the patch listed above and, for whatever reason, all the errors went away and the parser function was able to read the values from the TSV file. *shrugs* - Revansx (talk) 00:36, 13 May 2020 (UTC)


 * Okay, great. I just checked in a change to initialize $regex, so hopefully everything works now. Yaron Koren (talk) 13:58, 13 May 2020 (UTC)
 * Thanks. I'll test it today and, assuming everything works well, I'll mark this topic as "solved". - Revansx (talk) 17:05, 13 May 2020 (UTC)

HTTP Request Options
I would like to create a new configuration variable for the extension to hold default HTTP options to use in EDUtils::fetchURL. I need it to handle redirects properly; I am also considering using proxies, like i2p. Would filing an issue and uploading a patch be OK, or the author won't have it? Alex Mashin (talk) 09:54, 4 June 2020 (UTC)
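A sketch of what such a setting might look like in LocalSettings.php (the variable name and keys below are purely hypothetical; nothing like this exists in the extension yet):

```php
<?php
// Hypothetical: default HTTP options merged into every EDUtils::fetchURL() call.
// Neither the variable name nor the keys are part of the actual extension API.
$edgHTTPOptions = [
	'followRedirects' => true,                    // follow 3xx responses
	'maxRedirects'    => 5,
	'proxy'           => 'http://127.0.0.1:4444', // e.g. a local i2p HTTP proxy
	'timeout'         => 30,                      // seconds
];
```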

 * Hi - sure, you can upload a patch, or just describe it, and we can discuss it either way. But what you're saying sounds reasonable. Yaron Koren (talk) 13:17, 4 June 2020 (UTC)
 * See https://phabricator.wikimedia.org/T254551. Alex Mashin (talk) 06:57, 5 June 2020 (UTC)

HTML mode
Please note this feature request. Alex Mashin (talk) 14:33, 9 June 2020 (UTC)

Stale cache
Please review this patch. Alex Mashin (talk) 09:31, 26 June 2020 (UTC)

Encoding detection
Please review a new patch to improve encoding detection in loaded texts.

Alex Mashin (talk) 07:21, 4 July 2020 (UTC)

SOAP
Did  ever work? If it did, was it cached, as the page claims? Alex Mashin (talk) 22:02, 10 July 2020 (UTC)
 * Well, it does now, at least for XML. Alex Mashin (talk) 17:48, 13 July 2020 (UTC)
 * I don't know how well it worked - I never tried it myself. Yaron Koren (talk) 18:02, 13 July 2020 (UTC)

Patchset for Scribunto
Please review this patchset. Alex Mashin (talk) 17:48, 13 July 2020 (UTC)

Graceful handling of missing data
If I can't be sure the external URL will be present or active, is there a graceful way for it to fail without the red message 'Error: No contents found at URL', so I can just silently not display the data? Vicarage (talk) 09:58, 27 July 2020 (UTC)

I can hack around it by suppressing .error with CSS and using {{#ifeq:{{#external_value:title}} to check whether any data was imported, but the method is crude and nasty. Vicarage (talk) 17:12, 27 July 2020 (UTC)
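If the error message is emitted with class="error" (which the CSS workaround above suggests), the ParserFunctions {{#iferror:}} function might give a cleaner guard than hiding it with a stylesheet. A sketch, with an illustrative URL and field name:

```wikitext
{{#iferror: {{#get_web_data: url=https://example.org/data.csv
 |format=CSV with header |data=title=title }}
 | <!-- fetch failed: swallow the error message and display nothing -->
}}
{{#if: {{#external_value:title}} | Title: {{#external_value:title}} }}
```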


 * Have you considered caching the data, so that it will still be displayed if the URL temporarily goes offline? Or is this more than a temporary problem? Yaron Koren (talk) 17:17, 27 July 2020 (UTC)


 * It's mapping 5000 pages to 1000 directories, both in flux, so it's a matter of speculatively hoping data might be there rather than hard-coding 1000 links. Vicarage (talk) 22:22, 27 July 2020 (UTC)

 * Bug T259326. Alex Mashin (talk) 03:32, 31 July 2020 (UTC)

Exception with get_db_data - "Prefix must be a string"
I use get_db_data with MySQL and MariaDB, and now I have to specify a prefix, even if it is an empty string.

To fix this I added entries like this to each database definition in LocalSettings.php: Alex Mashin (talk) 02:28, 11 August 2020 (UTC)
 * Could you update the extension and try again without setting ?

Patch for review
Please review this patch. Alex Mashin (talk) 04:45, 12 August 2020 (UTC)

Cannot get #for_external_table to work on MW 1.34
MW 1.34, External Data 2.0.1 (9a1fdcb)

First, I verified that the URL https://discoursedb.org/AfricaCSV.txt has correct data.

Then I created a page called "Test" in my wiki with the wikitext:

with the page "Template:Country info row" as: The results are:
 * Only Table Heading renders. No Data Rows in the table
 * No errors on the page
 * No errors in the debug

Can someone help me debug this?

Thanks!

/Rich
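For comparison, the documented pattern for pairing #get_web_data with #for_external_table looks roughly like this (the column names for AfricaCSV.txt are guesses here, so check them against the actual header row of the file):

```wikitext
{{#get_web_data: url=https://discoursedb.org/AfricaCSV.txt
 |format=CSV with header
 |data=name=Country, capital=Capital
}}
{{#for_external_table: {{Country info row | {{{name}}} | {{{capital}}} }} }}
```

One classic cause of the "heading renders, no rows, no errors" symptom is a mismatch between the column names in data= and the names in the CSV header row, which fails silently.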