Extension talk:External Data

From MediaWiki.org

fetchURL error after upgrade[edit]

Hello... I recently upgraded MediaWiki to 1.33 and can no longer use #get_web_data. I get errors about a string being passed to Http::get() instead of an array. Even using the sample code/URLs from the External Data wiki, I get the same error.

[XUNJvirhcjlnb-i73WIgxAAAA@g] /wiki/index.php?title=X&action=submit TypeError from line 98 of /myurl.com/wiki/includes/http/Http.php: Argument 2 passed to Http::get() must be of the type array, string given, called in /myurl/wiki/extensions/ExternalData/includes/ED_Utils.php on line 873

MediaWiki 1.33.0
PHP 7.1.14 (cgi-fcgi)
MySQL 5.6.41-84.1
ICU 4.2.1

Any chance someone has seen this before?

Are you using the very latest External Data code? This bug may have just been fixed a few days ago. Yaron Koren (talk) 02:25, 2 August 2019 (UTC)

Yeah. I just updated now to be sure, and am still seeing the same error. I did do a manual update from an older version of MW. Made the necessary updates to the syntax in my LocalSettings file. Perhaps I have missed something?

Are you sure you're using the latest ED code? ED_Utils.php doesn't call Http::get() any longer - it now calls HttpWithHeaders::get(). Yaron Koren (talk) 15:09, 2 August 2019 (UTC)

Populating Wiki Page from CSV Spreadsheet[edit]

Hi there! So, I run a website with a wiki. … I'm interested in having wiki pages contain an entry populated from a spreadsheet I will be hosting on my own website (Encyclopedia_Greyhawkania_DATA ONLY.csv).

As an example, I'd like it to look similar to this example page, where I've manually hand-written entries from this index spreadsheet (without calling it from the spreadsheet). It's the bit at the bottom under "Encyclopedia Greyhawkania".

I'd like to call data from the spreadsheet using {{PAGENAMEE}} as part of the call parameter. Ideally, it would use {{PAGENAMEE}} to search column A, for example, then return any row (with cells A through D) that contains the {{PAGENAMEE}}, displayed as a table or a list of some kind.

Is there any chance I can get some help with how to call the data?

Something like this might work, if I understand the question correctly:
 {{#get_web_data:url=URL goes here
 |format=CSV with header
 |data=A=A,B=B,C=C,D=D}}
 {| class="wikitable"
 ! A
 ! B
 ! C
 ! D {{#for_external_table:<nowiki/>
 {{!}}-
 {{!}} {{{A}}}
 {{!}} {{{B}}}
 {{!}} {{{C}}}
 {{!}} {{{D}}} }}
 |}
Yaron Koren (talk) 17:00, 24 July 2019 (UTC)
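To return only the rows matching the current page, External Data's documented filters= parameter can be added to a call like the one above. A sketch, with the URL and the column names being placeholders guessed from this thread - note {{PAGENAME}} (with spaces) is more likely to match plain spreadsheet text than {{PAGENAMEE}}, which encodes spaces as underscores:

 {{#get_web_data:url=URL goes here
 |format=CSV with header
 |data=topic=Topic,type=Type,product=Product,page=Page
 |filters=Topic={{PAGENAME}}}}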
Thank you so much Yaron Koren!!!
I used a variation of that, replacing the "! A" with the direct title of the header (e.g. "Topic") instead of using the parameter.
I got it to generate the table, but with no rows below the headers. I wasn't able to make it work using {{PAGENAMEE}}; I even tried using a different parameter (e.g. "Sulm"). It simply wasn't populating the table, but the extension's troubleshooting section says that might've been because of the "$wgTimeOut" setting... so I updated that in my LocalSettings.php, and sure enough, even with up to 300 seconds it's giving a "500 error".
Here's what I'm using, on the off chance that you might be able to help refine it more.
   |format=CSV with header
   {| class="wikitable"
   ! Topic
   ! Type
   ! Product
   ! Page/Card/Image {{#for_external_table:<nowiki/>
   {{!}} {{{A}}}
   {{!}} {{{B}}}
   {{!}} {{{C}}}
   {{!}} {{{D}}}
I don't know if you're able to help troubleshoot it, but here's the address for the page I'm using as an example.
-IcarusATB (talk) 16:20, 25 July 2019 (UTC)
That's strange - I just tried out that exact wikitext on one of my wikis, and it worked out fine, displaying five well-formatted rows (plus the header row). Is it still not working for you? Yaron Koren (talk) 18:44, 26 July 2019 (UTC)
First, why would the behaviour of the code change on my end? I haven't changed the code further, so it's still getting the same result. That's really not "strange" at all. What it indicates, to me, is that if you're writing, testing, and checking it on your server and it works fine, there's likely something on your server that isn't on mine - probably an extension, addon, plugin, whatever, that I don't know to install, or something else, other than the code, that's making it behave differently.
IcarusATB (talk) 01:49, 27 July 2019 (UTC)
I thought maybe the CSV data was temporarily down for you or something, but it was working when I tried it. I still think it's strange. Yes, we have different setups, but I can't think of any reason why we'd be seeing these different results, assuming you're using the latest External Data code. Yaron Koren (talk) 18:04, 29 July 2019 (UTC)

Get template data from another wiki[edit]

Hello, can I use this extension to do a semantic query from another site?

If I have a page "Foo" on SiteA that contains a template "Foo" as:


and I want to create a page on SiteB called "Foo Status" that contains a template "Foo" as:

{{Foo|Bar= {{#get_web_data:url=SiteA |format={???}}} |}}

is this possible?

Yes - if the source wiki stores its data via either SMW or Cargo, both of those extensions provide an API of sorts to let you get query results in either CSV or JSON formats, either of which is parseable by External Data. Yaron Koren (talk) 18:46, 17 April 2019 (UTC)
Thanks, Yaron. Any chance you could point me to an SMW example of this? :-)
I can't think of one. Yaron Koren (talk) 02:18, 18 April 2019 (UTC)
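For what it's worth, a hedged sketch of what such a call could look like, assuming the source wiki runs Semantic MediaWiki: Special:Ask results can be requested in CSV form via URL parameters. The host, query condition, and property name below are hypothetical:

 {{#get_web_data:
 url=https://sitea.example.com/index.php?title=Special:Ask&q=[[Foo]]&po=?Bar&p[format]=csv
 |format=CSV with header
 |data=bar=Bar}}
 {{#external_value:bar}}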

prerequisites for getting LDAP data[edit]


I'm trying to use the #get_ldap_data: function of "External Data" to get LDAP attributes about my users and I'm getting the following error:

Fatal error: Call to undefined function ldap_connect() in /opt/htdocs/mediawiki/extensions/ExternalData/ED_Utils.php on line 136

A quick "grep -R ldap_connect" in the "extensions/ExternalData" folder shows only a call to ldap_connect and nothing anywhere actually defining it.

A quick search online for "ldap_connect" indicates that it is defined in PHP's LDAP module (php-ldap).

A quick inspection of my phpinfo() page shows that the php-ldap module is not loaded.

Before I install the php-ldap module on my system, can someone confirm that php-ldap is indeed a prerequisite for the #get_ldap_data: function of "External Data" to work?

Poor man's Sync from WikiA to WikiB[edit]


Can I use this extension as a real-time clone of a page in one wiki to another?

For example, Mediawiki Site A has a page called "Foo" with arbitrary text.

Can I create a page called "Foo" on another wiki that contains something like:


and functionally get a copy of the page Foo on Site A?

Well, I'm pretty sure you can get the right wikitext - after all, a single piece of text is valid CSV. Whether it'll display correctly is a different story - template calls won't work, for instance, unless you have a local copy of those templates. It's probably easier to just do an iframe, using one of the iframe-supporting extensions, with "action=render" on the source URL to leave out the skin. Yaron Koren (talk) 02:25, 18 April 2019 (UTC)

Nested SQL functions throw an rdbms error[edit]

[Image: ExternalData (rdbms error with nested sql commands).png]
Per an email with Yaron, "There's no special handling of replace() or any other commands."
This would lead me to think the syntax below should work... but it does not.
Does anyone have any suggestions or corrections to the code I'm trying to get working? Much appreciated if you do.

Unable to perform nested SQL REPLACE functions[edit]

I am attempting to query data from a local database, but I need to replace three different characters with alternatives that won't trigger MediaWiki parsing of the result. No matter the format I have tried, I receive the error pictured to the right when trying to display the page.

This snippet works, but only replaces one of the three characters necessary:

|data=zoneNAME=replace(zone_settings.name,'_',' ')

I need the below SQL code to work, but it creates a parse error whenever run inside a mediawiki page:

|data=zoneNAME=replace(replace(replace(zone_settings.name,'_',' '), '[','('), ']',')')

or with html codes in place of characters:

|data=zoneNAME=replace(replace(replace(zone_settings.name,'_',' '), &#91;,&#40;), &#93;,&#41;)

and in any combination of the characters/HTML codes I could think of. I even tried variations using the MediaWiki {{#replace:}} parser function, like this:

|data=zoneNAME={{#replace:{{#replace:(replace(zone_settings.name,'_',' ')|&#91;|&#40;}}|&#93;|&#41;}}


|data=zoneNAME={{#replace:{{#replace:(replace(zone_settings.name,'_',' ')|<nowiki>[</nowiki>|<nowiki>(</nowiki>}}|<nowiki>]</nowiki>|<nowiki>)</nowiki>}}

Same error, different SQL commands[edit]

In a very similar usage scenario, I would like to capitalize the first letter of the string returned and make sure the rest of the letters are lowercase. Unfortunately, I must also replace all '_' with a space, as in my issue above. Each of these examples works separately, but once they are nested together, I get the same error as in my image above:

|data=itemNAME=REPLACE(item_basic.name,'_',' ')

Once nested, in any combination, the same error as above is shown:

|data=itemNAME=REPLACE(UCASE(item_basic.name),'_',' ')

Eventually, this is the SQL I would like to run, but if it's too much for the extension, I will work on finding a different solution:

CONCAT(UCASE(LEFT(item_basic.name, 1)), LCASE(SUBSTRING(item_basic.name, 2)))
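If the nested SQL never cooperates, one fallback is to fetch the raw column with a plain data= mapping and do the string munging outside the query. For illustration only, the intended effect of that SQL (underscores to spaces, first letter upper-cased, the rest lower-cased) looks like this in Python; the function name is mine, not anything the extension provides:

```python
def pretty_item_name(raw):
    """Mirror the SQL transform: REPLACE(x,'_',' ') combined with
    CONCAT(UCASE(LEFT(x, 1)), LCASE(SUBSTRING(x, 2)))."""
    spaced = raw.replace("_", " ")
    # Slicing handles the empty string safely: ""[:1] is just "".
    return spaced[:1].upper() + spaced[1:].lower()

print(pretty_item_name("BRONZE_SWORD"))  # Bronze sword
```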



Thanks for any help provided!

Get nested JSON data[edit]

If an API returns a non-flat JSON structure, it seems like it's not possible to access the deeper data. Or is it? Sophivorus (talk) 03:27, 28 June 2019 (UTC)
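One workaround - assuming you control the endpoint, or can put a small proxy script in front of it - is to flatten the nested JSON before External Data sees it, so every value ends up under a flat, dotted key. A minimal Python sketch (not part of the extension):

```python
import json

def flatten(obj, prefix=""):
    """Recursively flatten nested dicts/lists into a single-level dict
    keyed by dotted paths, e.g. {"a": {"b": 1}} -> {"a.b": 1}."""
    flat = {}
    if isinstance(obj, dict):
        for key, value in obj.items():
            path = f"{prefix}.{key}" if prefix else str(key)
            flat.update(flatten(value, path))
    elif isinstance(obj, list):
        for i, value in enumerate(obj):
            path = f"{prefix}.{i}" if prefix else str(i)
            flat.update(flatten(value, path))
    else:
        flat[prefix] = obj
    return flat

nested = json.loads('{"user": {"name": "Foo", "tags": ["a", "b"]}}')
print(flatten(nested))  # {'user.name': 'Foo', 'user.tags.0': 'a', 'user.tags.1': 'b'}
```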

Add HTTP headers to get_web_data[edit]

We need to get data from an API with an authentication header, produced dynamically. I have read this answer, which is not helping - this is not a SOAP API.

To solve it, I inserted a hook mechanism into the get_web_data logic, so I can alter the data before the request. But now I have another problem - the Http class has no way to add headers to the request. I needed to extend it just for this need (and to redeclare its get() and post() methods as well, because the original uses Http::request instead of self::request).

So the full solution patch is here in Gerrit - it allows hooking the call ($url and $options passed by reference) and inserting additional headers via $options['headers']. I think the HttpWithHeaders class is overkill - this feature would be better in the core Http class - but that's a start.

Getting file data from UNC path?[edit]

Hello, this is working great with local files, but I would like to use a JSON file stored on a network drive. I have tried using a mapped drive (which I did not expect to work anyway), but I had hoped I could access it via \\domain\sharedfolder\file.json defined in LocalSettings.php. I tried with both forward and backward slashes, but always get "Error: No file found". I hope I am just doing something syntactically wrong. I appreciate the help!

I don't know - the key is that the server can access that path, not just your computer. Could that be the issue? Yaron Koren (talk) 19:22, 6 August 2019 (UTC)

Yes, it is accessible from the server. Is there a particular syntax to use for UNC paths? Thank you!

I don't know - External Data uses the PHP file_get_contents() function, so based on what it says here, the path you entered should work. Yaron Koren (talk) 17:28, 7 August 2019 (UTC)

Stale cache[edit]

If {{#get_web_data:}} manages to fetch data, and then, when the cache expires, the data source is gone, will the function return the stale data from the cache?

What about "Stored Procedures" in SQL systems?[edit]

Hello. There is an SQL application on my network that I'm trying to get data from using Extension:External Data, and the SQL app's owner has told me that they can make a "stored procedure" available to me rather than grant me "read" access to the SQL database itself. Can someone shed some light on how this could be done using this extension? What should I ask for? Are SQL "stored procedures" the SQL implementation of SOAP? Thanks!

You can do it - you just need to create a "mini-API", i.e. a script in some language (PHP or anything else) that calls that stored procedure, possibly passing it some values from the query string, and then outputs the results as CSV or JSON. External Data can then access that API via #get_web_data and make use of the results. Yaron Koren (talk) 13:23, 6 September 2019 (UTC)
Thanks, Yaron. I'll pursue this approach.
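For completeness, here is a sketch of such a "mini-API" in Python rather than PHP. Everything specific here is an assumption: the procedure name (get_report), the connection object, and the stored_results() call (which is mysql-connector-specific); only the CSV output contract - something #get_web_data can fetch and parse - comes from the thread.

```python
import csv
import io

def rows_to_csv(header, rows):
    """Serialize query results as CSV text, header row first, so that
    #get_web_data with |format=CSV with header can parse it."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(header)
    writer.writerows(rows)
    return buf.getvalue()

def run_procedure(conn, proc_name, args=()):
    """Call a stored procedure over a DB-API 2.0 connection and return
    (header, rows). stored_results() is mysql-connector-specific."""
    cur = conn.cursor()
    cur.callproc(proc_name, args)
    header, rows = [], []
    for result in cur.stored_results():
        header = [col[0] for col in result.description]
        rows.extend(result.fetchall())
    return header, rows

if __name__ == "__main__":
    # With a real connection this would be:
    #   header, rows = run_procedure(conn, "get_report", ("some_arg",))
    # Here we just show the CGI-style CSV response with sample data.
    print("Content-Type: text/csv\n")
    print(rows_to_csv(["id", "name"], [(1, "Foo"), (2, "Bar")]), end="")
```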