Extension talk:External Data

Invalid JSON on #get_web_data
I plan on doing some upgrades to MediaWiki and possibly PHP as a shot in the dark, but before doing that I figured I would ask here.

Here is the exact json of the table I'm trying to display. The data itself is junk just to play with.

[{"IPAddress":"192.168.000.001","Type":"Server","Device":null,"Location":null,"Comment":null,"HostnameDNS":null,"OSFirmware":null,"LastVerified":"1\/7\/2021","PlantNetwork":"999Office"},{"IPAddress":"192.168.000.002","Type":"Server","Device":"Virtual Server","Location":"ERA - Server Room","Comment":"File and Print Server","HostnameDNS":"fileserver","OSFirmware":"Server 2016","LastVerified":"1\/7\/2021","PlantNetwork":"999Office"},{"IPAddress":"192.168.000.003","Type":"Server","Device":"Virtual Server","Location":"ERA - Server Room","Comment":"Video Server","HostnameDNS":"videoserver","OSFirmware":"Linux","LastVerified":"1\/7\/2021","PlantNetwork":"999Office"},{"IPAddress":"192.168.000.004","Type":"Server","Device":"Physical Server","Location":"ERA - Server Room","Comment":"Web Server","HostnameDNS":"webserver","OSFirmware":"IBM OS\/400 7.2","LastVerified":"1\/7\/2021","PlantNetwork":"999Office"},{"IPAddress":"192.168.000.005","Type":"Server","Device":"Physical Server","Location":"ERA - Server Room","Comment":"Mainframe","HostnameDNS":"mainframe","OSFirmware":"IBM OS\/400 7.2","LastVerified":"1\/7\/2021","PlantNetwork":"999Office"}]

I'm getting this from a fairly generic json_encode( $data ) call in select_ipam.php, which queries a Microsoft SQL Server database. I have already verified that my header information is set correctly, as another user pointed out.

header("Content-Type: application/json; charset=utf-8");
echo json_encode( $data );

I also tried using the "use jsonpath" approach but it just gives me the same thing.
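For what it's worth, the payload above is syntactically valid JSON: the escaped forward slashes ("\/") are a legal JSON escape that decodes to a plain "/". A quick standalone check on a shortened sample of the same data:

```python
import json

# Shortened sample of the payload above; "\/" is a legal JSON escape for "/".
payload = '[{"IPAddress":"192.168.000.001","LastVerified":"1\\/7\\/2021"}]'

records = json.loads(payload)
print(records[0]["LastVerified"])  # the escape decodes to a plain slash: 1/7/2021
```

If json.loads() raises an error, the producer side is at fault; here it parses cleanly, which points at the consumer (wiki) side rather than select_ipam.php.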


 * I'm not sure what the exact problem you're seeing is, but I would recommend switching to the latest version of External Data - the version you're running is rather old. Yaron Koren (talk) 00:46, 8 January 2021 (UTC)


 * I was able to get it working with a version update and some different syntax that doesn't use a template. Now if I can just figure out how to get it to update more frequently. When I click edit and save the page without making any other changes, it updates the table, but otherwise it doesn't want to grab new data from the PHP script. Are there any plans to integrate an "ajax mode" that would allow the table data to be refreshed, say, every 10 seconds without a full page reload?


 * Okay, that's good. No... that sounds like it would require a whole different approach - JavaScript-based instead of PHP-based. Yaron Koren (talk) 20:18, 8 January 2021 (UTC)
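For anyone who wants to roll that themselves, a JavaScript-based refresh is easy to sketch outside of External Data entirely. Everything here is hypothetical: the select_ipam.php endpoint, the element id, and the column choice are stand-ins for your own setup:

```javascript
// Hypothetical sketch: poll a JSON endpoint and re-render a table body
// client-side. Endpoint and element id are placeholders, not part of
// External Data.
function renderRows(records) {
  return records
    .map(r => `<tr><td>${r.IPAddress}</td><td>${r.Type}</td></tr>`)
    .join("");
}

function refreshTable() {
  fetch("select_ipam.php")
    .then(resp => resp.json())
    .then(records => {
      document.querySelector("#ipam-table tbody").innerHTML = renderRows(records);
    });
}

// Re-fetch every 10 seconds without a full page reload:
// setInterval(refreshTable, 10000);
```

Note this bypasses the wiki's parser cache entirely, which is exactly why it needs a different approach than the PHP-based parser functions.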

get_web_data error after updating MediaWiki
Updating to MediaWiki 1.35.1 and using the latest External Data results in my template erroring with:

Could not get URL http://localhost/api.php?action=query&list=categorymembers&cmtitle=Category:CHOPs&format=xml&cmlimit=500 after 3 tries.

Clicking the URL, though, returns the expected XML.

--Wuestenarchitekten (talk) 21:47, 13 January 2021 (UTC)


 * Silly question, but did this work before? Yaron Koren (talk) 14:00, 14 January 2021 (UTC)

#get_db_data with 'join on' statement returning blank table
Apologies in advance, I am somewhat new to SQL but I have some experience. I am attempting to use #get_db_data to pull data from a local server DB to display sports statistics. One of the tables has a field (Position) that I need for the wikitable, so I was hoping to join these two tables using the 'PlayerID' field. I can't do this server-side because it would add considerable time to processing and, again, I don't have enough experience.

Using the aliases throughout and executing the query returns no syntax issues or DB errors, but #for_external_table returns an empty table. I get the appropriate number of rows in the table, so I assume it's returning nulls or default values. Without the ability to debug there's no way to track down where in the process it's losing the values.

Here's the code I wrote:

Documentation is limited for the 'join on' statement for this extension, and there aren't even any reported issues from the past (I've checked the archives). Should I create a template and use #display_external_table instead? Any help would be appreciated.

2603:8001:4948:DF00:7D69:EB88:902:704 09:50, 26 January 2021 (UTC)gplehner
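For context, a join-based call along the lines described would typically look something like the sketch below. All table, field, and alias names are hypothetical, and the exact parameter grammar (especially "join on") should be checked against the current #get_db_data documentation:

```wikitext
{{#get_db_data:
 db=stats
 |from=Players, SeasonStats
 |join on=Players.PlayerID = SeasonStats.PlayerID
 |data=name=Players.Name, position=Players.Position, points=SeasonStats.Points
}}
{{#for_external_table: {{{name}}} plays {{{position}}} and scored {{{points}}} points. }}
```

The thread below suggests that the alias-qualified names in the "data" parameter are exactly where things go wrong.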


 * I don't know. My guess is that the problem is in the "data" parameter, somehow - that External Data is failing to find the external variables listed. I would try taking out those table aliases, i.e. keeping the original table names... there's a chance that that will fix the problem. Yaron Koren (talk) 15:11, 26 January 2021 (UTC)


 * Thanks, I had time to come back to this today. Removing the table aliases from the "data" parameters returned Error 1052 (ambiguous column). I incrementally removed each alias until I narrowed the issue down to two parameters, team and playerID. Both tables have fields named 'Team' and 'PlayerID'. It doesn't seem to matter which one I request in the "data" parameter; both print out the default/'else' values I set. So the "data" parameter evidently isn't resolving the alias-qualified names.


 * I should also mention that I switched to using '#display_external_table' in an attempt to pass data parameters that way but it didn't work either. Thanks for your response & suggestions. 45.51.157.21 08:21, 28 January 2021 (UTC) -gplehner


 * If you have database server access, you should create a VIEW in the database and then run the select on that view.
 * In the view you can join whatever tables you want without touching the tables themselves.
 * See: https://www.w3schools.com/sql/sql_view.asp
 * In your case, you would create a view called PlayerSeasonStats. --Felipe (talk) 13:14, 28 January 2021 (UTC)
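A sketch of that suggestion; all table and column names here are hypothetical, inferred from this thread, so adjust them to the actual schema:

```sql
-- Hypothetical schema: perform the join once, in the database, so the
-- wiki-side query can select from a single flat relation with no aliases.
CREATE VIEW PlayerSeasonStats AS
SELECT p.PlayerID,
       p.Name,
       p.Position,
       p.Team,
       s.Season,
       s.Points
FROM Players p
JOIN SeasonStats s ON s.PlayerID = p.PlayerID;
```

#get_db_data can then use from=PlayerSeasonStats with no join at all, sidestepping the alias problem entirely.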


 * I meant removing table aliases from the entire query, by the way, not just from "data". Yaron Koren (talk) 16:26, 28 January 2021 (UTC)

Alex Mashin (talk) 06:30, 3 August 2021 (UTC)
 * I'm seeing the same issue in the current version: "data" doesn't seem to be able to access fields when they are prefixed with their table alias. It seems this behavior was introduced with REL1_31.
 * Can you reproduce the undesired behaviour on this database?

function to check whether any line of external data matches a string
My use case for External Data is reading a single CSV file and then displaying different tables for lines that contain video/pdf/image etc., indicated by a type column. While I can use {{#ifeq:{{#external_value:type}} to check that the first line of data (the header) has been read, I can't choose to display the video table only if videos are available. How hard would it be to add a function that checked all entries in a particular column against a search string, returning true if any matched? At the moment I have lots of tables with headers and no content. Vicarage (talk) 16:48, 7 March 2021 (UTC)


 * If I understand this question correctly, you may be able to accomplish this with the Variables extension - instead of displaying the table directly, set it in a variable, and then only display that variable (and its header) if it's non-empty. Yaron Koren (talk) 02:12, 8 March 2021 (UTC)
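A sketch of that suggestion, assuming the Variables extension is installed; the variable name, the VideoRow template, and the data mapping are all hypothetical:

```wikitext
{{#vardefine: videorows
 | {{#display_external_table: template=VideoRow | data=title=title, url=url }}
}}
{{#if: {{#var: videorows}}
 | {{VideoTableHeader}}{{#var: videorows}}{{VideoTableFooter}}
}}
```

The table rows are captured into a variable first, and the header/footer are only emitted when that variable is non-empty.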

Thanks, I'll look at that. My initial workaround was to create a counts.csv file with no header, and check against {{#ifexpr:{{#external_value:imagecount}} > 0, where imagecount is column 4 of the file. Vicarage (talk) 11:57, 8 March 2021 (UTC)

retrieving a template reference from External Data
I'd like to be able to retrieve a json object that contains one key ('data') and a string that contains mediawiki formatted text. Is there a way to parse that retrieved string, so that it displays on the page as formatted data, rather than plain text?

Example: I'd like to retrieve  from an external source, and have it render the template

173.16.242.134 04:23, 8 March 2021 (UTC) Alex Mashin (talk) 14:54, 24 July 2021 (UTC)
 * Try using Lua bindings and.
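A rough sketch of the Lua route. The binding name follows my reading of the extension's Scribunto documentation but should be verified against your version, the URL is a placeholder, and the return shape (single values vs. column arrays) varies by version. The key step is frame:preprocess(), which expands wikitext that arrives as a plain string:

```lua
-- Hypothetical module: fetch {"data": "...wikitext..."} and render it.
local p = {}

function p.render(frame)
    -- Binding name per the External Data docs; verify against your version.
    local values = mw.ext.externalData.getWebData{
        url = 'https://example.org/snippet.json',  -- placeholder URL
        format = 'json',
        data = { wikitext = 'data' },
    }
    -- Depending on the version, values.wikitext may be a string or an
    -- array of strings; adjust accordingly.
    -- frame:preprocess() expands templates and parser functions, so the
    -- retrieved string renders as formatted wikitext, not plain text.
    return frame:preprocess(values and values.wikitext or '')
end

return p
```

Invoked as {{#invoke:ModuleName|render}}, the retrieved string is parsed like any other wikitext on the page.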

display_external_table should do nothing if its template function returns nothing
In http://wiki.johnbray.org.uk/Problem1 I read a 2 column file, and want to put items from the second column in a table ONLY when the first column has value "good". The template function has a switch so nothing is returned, but display_external_table adds a blank line to the last table box, which spoils the format. Using delimiter= just makes the table wider.

As this is a key use case for display_external_table, it would seem to make sense to just do nothing if there was nothing to display. Vicarage (talk) 09:52, 10 March 2021 (UTC)

Fatal exception of type "InvalidArgumentException" when trying to access a MySQL database
Recently updated to MediaWiki 1.35 from 1.31, and updated the extension to the latest version for 1.35. Previously the get_db_data queries were working fine, but now they are throwing an InvalidArgumentException.

When I run a debug log I get the following output, related to one of the exceptions.

[exception] [YGRqrMgm-gbpiKaSuVS3ewABaww] /index.php?title=Template:CementKiln&action=submit InvalidArgumentException from line 68 of /home1/wikiwast/public_html/includes/libs/rdbms/database/domain/DatabaseDomain.php: Prefix must be a string.

I have re-loaded the extension several times, and tried a couple of different databases.
#0 /home1/wikiwast/public_html/includes/libs/rdbms/database/Database.php(291): Wikimedia\Rdbms\DatabaseDomain->__construct(string, NULL, NULL)
#1 /home1/wikiwast/public_html/includes/libs/rdbms/database/DatabaseMysqlBase.php(111): Wikimedia\Rdbms\Database->__construct(array)
#2 /home1/wikiwast/public_html/includes/libs/rdbms/database/Database.php(433): Wikimedia\Rdbms\DatabaseMysqlBase->__construct(array)
#3 /home1/wikiwast/public_html/extensions/ExternalData/includes/EDUtils.php(238): Wikimedia\Rdbms\Database::factory(string, array)
#4 /home1/wikiwast/public_html/extensions/ExternalData/includes/EDParserFunctions.php(377): EDUtils::getDBData(string, string, array, string, array, NULL, array)
#5 /home1/wikiwast/public_html/includes/parser/Parser.php(3340): EDParserFunctions::doGetDBData(Parser, string, string, string, string)
#6 /home1/wikiwast/public_html/includes/parser/Parser.php(3047): Parser->callParserFunction(PPFrame_Hash, string, array)
#7 /home1/wikiwast/public_html/includes/parser/PPFrame_Hash.php(253): Parser->braceSubstitution(array, PPFrame_Hash)
#8 /home1/wikiwast/public_html/includes/parser/Parser.php(2887): PPFrame_Hash->expand(PPNode_Hash_Tree, integer)
#9 /home1/wikiwast/public_html/includes/parser/Parser.php(1556): Parser->replaceVariables(string)
#10 /home1/wikiwast/public_html/includes/parser/Parser.php(651): Parser->internalParse(string)
#11 /home1/wikiwast/public_html/includes/content/WikitextContent.php(374): Parser->parse(string, Title, ParserOptions, boolean, boolean, NULL)
#12 /home1/wikiwast/public_html/includes/content/AbstractContent.php(590): WikitextContent->fillParserOutput(Title, NULL, ParserOptions, boolean, ParserOutput)
#13 /home1/wikiwast/public_html/includes/EditPage.php(4282): AbstractContent->getParserOutput(Title, NULL, ParserOptions)
#14 /home1/wikiwast/public_html/includes/EditPage.php(4187): EditPage->doPreviewParse(WikitextContent)
#15 /home1/wikiwast/public_html/includes/EditPage.php(2965): EditPage->getPreviewText()
#16 /home1/wikiwast/public_html/includes/EditPage.php(701): EditPage->showEditForm()
#17 /home1/wikiwast/public_html/includes/actions/EditAction.php(71): EditPage->edit()
#18 /home1/wikiwast/public_html/includes/actions/SubmitAction.php(38): EditAction->show()
#19 /home1/wikiwast/public_html/includes/MediaWiki.php(527): SubmitAction->show()
#20 /home1/wikiwast/public_html/includes/MediaWiki.php(313): MediaWiki->performAction(Article, Title)
#21 /home1/wikiwast/public_html/includes/MediaWiki.php(940): MediaWiki->performRequest()
#22 /home1/wikiwast/public_html/includes/MediaWiki.php(543): MediaWiki->main()
#23 /home1/wikiwast/public_html/index.php(53): MediaWiki->run()
#24 /home1/wikiwast/public_html/index.php(46): wfIndexMain()
#25 {main}

The LocalSettings.php entries are:

$edgDBServer['engy'] = "localhost";
$edgDBServerType['engy'] = "mysql";
$edgDBName['engy'] = "wikiwast_energy";
$edgDBUser['engy'] = "wikiwast_energy";
$edgDBPass['engy'] = " ";

The query is being called from a template, where the value of EPR is passed to it through the template call. All the data values are correct and contained within the MySQL database.

I am at a loss as to what the problem is, I hope that someone might see this and be able to help.


 * For External Data - and really any extension whose compatibility policy is "Master maintains backwards compatibility", which includes all of my extensions - you should not use a REL_ branch, but instead just use the latest code - either the truly latest code, or the latest released version. Using either one may fix this specific problem. Yaron Koren (talk) 13:12, 31 March 2021 (UTC)
 * Thanks a million, Yaron, that has fixed the issue. I will remember that in future.

Usage in DE:WP
Hi! This extension seems not to be installed on any Wikipedia, not even on mediawiki.org! I tried a use case on DE:WP, without success. Is there any possibility of getting it enabled on a local Wikipedia? Regards, Uwe Martens (talk) 20:44, 25 April 2021 (UTC)


 * That's outside of my domain. But it would be great if this extension could be installed on a Wikimedia site! Yaron Koren (talk) 12:26, 26 April 2021 (UTC)


 * Thanks for your response! I'll discuss it with a responsible admin. I had hoped one of them would be active here, but anyway, best regards, Uwe Martens (talk) 13:07, 26 April 2021 (UTC)

No data with XML namespace
Hi Yaron, I bumped into something weird. It seems that External Data is not always able to fetch data when the XML file contains a URL for xmlns. I guess the reason could be, in some cases, that the URL is not accessible (you'd be surprised how often that happens) or in others, that the URL redirects to a new one. Lots of sites still use xmlns="http://www.w3.org/1999/xhtml", with that URL redirecting to the more secure https://www.w3.org/1999/xhtml/. I've created a snippet that you can try out for yourself:

   Jo Nesbø 

Cavila 11:13, 19 August 2021 (UTC)


 * Use  and add it to the XPath query:
 * res: Alex Mashin (talk) 11:43, 19 August 2021 (UTC)
 * Alternative:
 * res: Alex Mashin (talk) 11:52, 19 August 2021 (UTC)


 * Thanks, that seems to work! And thanks also for the patch. Cavila 17:39, 19 August 2021 (UTC)
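The underlying behaviour here is standard XML namespace handling rather than anything specific to External Data: once a document declares a default xmlns, unqualified names in a query no longer match its elements. A minimal illustration (Python is used only for demonstration; the element name is hypothetical):

```python
import xml.etree.ElementTree as ET

# A document with a default namespace, like the XHTML case above.
doc = '<root xmlns="http://www.w3.org/1999/xhtml"><author>Jo Nesbø</author></root>'
root = ET.fromstring(doc)

# An unqualified name no longer matches anything...
print(root.find('author'))                                     # None
# ...but the namespace-qualified name does.
print(root.find('{http://www.w3.org/1999/xhtml}author').text)  # Jo Nesbø
```

This is why registering the namespace (or otherwise qualifying the XPath query) makes the data appear, independent of whether the namespace URL itself is reachable or redirects.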

Extension Page
So, the great dismemberment of this page is afoot. But I don't think that merely moving its sections to subpages is enough. The structure of the documentation needs refactoring, whether it is stored in one page or split between several. Alexander Mashin (talk) 14:05, 30 September 2021 (UTC)

Alexander Mashin (talk) 16:35, 30 September 2021 (UTC)
 * What kind of things specifically do you recommend? Yaron Koren (talk) 16:17, 30 September 2021 (UTC)
 * I think a more logical structure for the manual, regarding text formats, would be: fetching - preparsing - parsing - mapping - displaying data. At this point, everything about parsing text is covered within the section about , although it also applies to and . The  is also unbalanced. The Lua part should be better integrated. Anyway, refactoring the manual will not be an easy task, due to the complexity and "non-linearity" of the subject, but I think it is necessary.


 * Oh, I forgot that #get_program_data also uses those same formats. Yes, it does seem to make sense to move all the information about the different formats into a new page, including related information like using XPath and so on. I don't know if that's what you refer to as preparsing or parsing, or both (I'm guessing both). Similarly, it probably makes sense to move the information about caching data into a new page - right now it's in the documentation for #get_web_data, for no strong reason. And it probably makes sense to create a page for mapping as well, i.e. the handling of the "data" parameter - I thought at first that this was too simple to justify a separate page, but I see now that quite a few features have been added to it. What do you mean by the #get_db_data part being unbalanced? Was it the incorrect header sizes? (I just fixed that.) I disagree with integrating Lua better - I think, at least for now, that the use of Lua is rare enough that it's fine to require anyone who wants to use Lua for External Data to learn about the parser functions first, even if they're not planning to use the parser functions. Yaron Koren (talk) 17:58, 30 September 2021 (UTC)


 * Well, I've now made a few changes to the documentation, based on this discussion; let me know what you think. Yaron Koren (talk) 21:49, 1 October 2021 (UTC)
 * The structure makes more sense now. Alexander Mashin (talk) 17:53, 2 October 2021 (UTC)

Call database function
Is it possible, or will a way ever be implemented, to call a database function? I'm actually working on PostgreSQL. Alexander Mashin (talk) 12:11, 18 December 2021 (UTC) Alexander Mashin (talk) 15:11, 19 December 2021 (UTC)
 * The MediaWiki RDBMS library quotes all table names; the parentheses end up inside the double quotes and are treated by the database server as part of the identifiers; so, currently, no.
 * If you update the extension now, you can define a prepared statement for your connection to the PostgreSQL database, as described here. It does not get quoted.
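A hedged sketch of what such a configuration might look like. The $wgExternalDataSources layout and the 'prepared' key follow my reading of the recent documentation, but every key name, the connection ID, and the function name here are assumptions to verify against your installed version:

```php
// Hypothetical connection ID 'stats'; adjust credentials to your setup.
$wgExternalDataSources['stats'] = [
    'server'   => 'localhost',
    'type'     => 'postgres',
    'name'     => 'statsdb',
    'user'     => 'wiki',
    'password' => 'secret',
    // A named prepared statement: its SQL text is sent verbatim, so the
    // function call and its parentheses are not quoted as identifiers.
    'prepared' => [
        'player_stats' => 'SELECT * FROM player_stats_fn($1);',
    ],
];
```

The wiki-side call then references the statement by name instead of passing raw SQL, which is how the quoting problem is avoided.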

One missing XML element means no data is retrieved at all
I'm using something along these lines:

If the XML contains elements xxx, yyy and zzz, all works fine.

If the XML is missing anything, for example xxx, it'll say:
 * For the variables:  etc.

Is there any way to ensure that yyy and zzz are set even when xxx is missing from the XML source? gets rid of the first error message but has no effect on whether yyy and zzz are set. Thanks.
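As a display-time workaround, each value can be wrapped in a wiki-side default using plain parser functions; this is not an External Data feature, and the variable names simply mirror the hypothetical xxx/yyy/zzz above. Note it can only substitute for values the extension left empty, not recover data the extension never set:

```wikitext
{{#if: {{#external_value:yyy}} | {{#external_value:yyy}} | (no value) }}
{{#if: {{#external_value:zzz}} | {{#external_value:zzz}} | (no value) }}
```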