Extension talk:External Data

Populating Wiki Page from CSV spreadsheet
Hi there! So, I run a website with a wiki. … I'm interested in having wiki pages contain an entry populated from a spreadsheet I will be hosting on my own website (Encyclopedia_Greyhawkania_DATA ONLY.csv).

As an example, I'd like it to look similar to this example page, where I've manually hand-written entries from this index spreadsheet (without calling them from the spreadsheet). It's the bit at the bottom under "Encyclopedia Greyhawkania".

I'd like to call data from the spreadsheet using the as part of the call parameter. Ideally what it'd do is use the to search ColumnA, for example, then return any row (with cells A through D) that has the  in it, and have it in a table or a list of some kind.

Is there any chance I can get some help with how to call the data?


 * Something like this might work, if I understand the question correctly:
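 (The snippet that originally followed was lost in this archive. The sketch below shows the general shape such a call might take with External Data; the URL, header names, and template name are placeholders, not the original suggestion.)

```wikitext
{{#get_web_data:
url=https://example.org/Encyclopedia_Greyhawkania_DATA_ONLY.csv
|format=CSV with header
|data=topic=Topic,book=Book,page=Page,notes=Notes
|filters=Topic={{PAGENAME}}
}}
{{#display_external_table:
template=GreyhawkaniaRow
|data=topic,book,page,notes
}}
```

 Here `filters=` restricts the rows to those whose first column matches the current page name, and `#display_external_table` passes each matching row to a template that renders one table row.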


 * Yaron Koren (talk) 17:00, 24 July 2019 (UTC)
 * Thank you so much Yaron Koren!!!
 * I used a variation of that, replacing the "! A" with just the direct title of the header (e.g. "Topic"), instead of using the parameter.
 * I got it to generate the table, but with no rows below the headers. I wasn't able to make it work using  . I even tried using a different parameter (e.g. "Sulm"). It simply wasn't populating the table, but the Extension troubleshooting section says that might've been because of the "$wgTimeOut" thingie ... so I updated that in my LocalSettings.php, and even with it set as high as 300 seconds, it's giving a "500 error".
 * Here's what I'm using, on the off chance that you might be able to help refine it more.


 * I don't know if you're able to help troubleshooting it, but, here's the address for the page I'm using as an example.
 * -IcarusATB (talk) 16:20, 25 July 2019 (UTC)


 * That's strange - I just tried out that exact wikitext on one of my wikis, and it worked out fine, displaying five well-formatted rows (plus the header row). Is it still not working for you? Yaron Koren (talk) 18:44, 26 July 2019 (UTC)
 * First, why would the behaviour of the code change on my end? I haven't changed the code further, therefore it's still getting the same result. That's really not "strange" at all. What it indicates, to me, is that if you're writing, testing, and checking it on your server and it works fine, there's likely something on your server that isn't on mine. Probably an extension, addon, plugin, whatever, that I don't know to install, or something else, other than the code, that's making it behave differently.
 * IcarusATB (talk) 01:49, 27 July 2019 (UTC)


 * I thought maybe the CSV data was temporarily down or something for you, but it was working when I tried it. I still think it's strange. Yes, we have different setups, but I can't think of any reason why we'd be seeing these different results, assuming you're using the latest External Data code. Yaron Koren (talk) 18:04, 29 July 2019 (UTC)

Get template data from another wiki
Hello, can I use this extension to do a semantic query against another wiki?

If I have a page "Foo" on SiteA that contains a template "Foo" as: and I want to create a page on SiteB called "Foo Status" that contains a template "Foo" as:

is this possible?


 * Yes - if the source wiki stores its data via either SMW or Cargo, both of those extensions provide an API of sorts to let you get query results in either CSV or JSON formats, either of which is parseable by External Data. Yaron Koren (talk) 18:46, 17 April 2019 (UTC)
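 A sketch of what the Cargo side of this might look like (the host name, table, and field names below are made up; for SMW, the equivalent is exporting a Special:Ask query result as CSV or JSON and fetching that URL instead):

```wikitext
{{#get_web_data:
url=https://sitea.example.org/wiki/Special:CargoExport?tables=Foo&fields=_pageName,Status&format=csv
|format=CSV with header
|data=page=_pageName,status=Status
}}

{{#external_value:status}}
```

 Check the exact export URL and parameter names against your Cargo (or SMW) version before relying on this.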


 * Thanks, Yaron. Any chance you could point me to an SMW example of this? :-)


 * I can't think of one. Yaron Koren (talk) 02:18, 18 April 2019 (UTC)

Prerequisites for getting LDAP data
Hi.

I'm trying to use the #get_ldap_data: function of "External Data" to get LDAP attributes about my users and I'm getting the following error: Fatal error: Call to undefined function ldap_connect in /opt/htdocs/mediawiki/extensions/ExternalData/ED_Utils.php on line 136

A quick "grep -R ldap_connect" in the "extensions/External Data" folder shows only a use-call to ldap_connect and nothing anywhere actually defining it.

A quick search online for "ldap_connect" indicates that it is defined by PHP's LDAP module (php-ldap).

A quick inspection of my phpinfo page shows that the php-ldap module is not loaded.

Before I install the php-ldap module on my system, can someone confirm that php-ldap is indeed a prerequisite for the #get_ldap_data: function of "External Data" to work?
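Yes: ldap_connect() is a function provided by PHP's LDAP extension, not by External Data itself, so php-ldap is a prerequisite for #get_ldap_data:. One way to check for it and install it (Debian/Ubuntu shown; package and service names vary by distribution and PHP version):

```
# Check whether the LDAP extension is loaded in the PHP your web server uses:
php -m | grep -i ldap

# Debian/Ubuntu example install, then restart the web server:
sudo apt-get install php-ldap
sudo systemctl restart apache2
```

Make sure you check the same PHP installation that serves the wiki; the CLI php can be a different build than the web server module.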

Poor man's Sync from WikiA to WikiB
Hi.

Can I use this extension as a real-time clone of a page in one wiki to another?

For example, Mediawiki Site A has a page called "Foo" with arbitrary text.

Can I create a page called "Foo" on another wiki that contains something like:

and functionally get a copy of the page Foo on Site A?
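(The wikitext in the question was lost; it was presumably something along these lines. The URL and variable name are guesses, and raw wikitext fetched this way will not render templates, per the reply.)

```wikitext
{{#get_web_data:
url=https://sitea.example.org/index.php?title=Foo&action=raw
|format=CSV
|data=content=1
}}
{{#external_value:content}}
```

Note that because External Data parses the result as CSV, commas and line breaks in the page text will split it into extra rows and columns, which is part of why this is only a "poor man's" sync.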


 * Well, I'm pretty sure you can get the right wikitext - after all, a single piece of text is valid CSV. Whether it'll display correctly is a different story - template calls won't work, for instance, unless you have a local copy of those templates. It's probably easier to just use an iframe, via one of the iframe-supporting extensions, with "action=render" on the source URL to leave out the skin. Yaron Koren (talk) 02:25, 18 April 2019 (UTC)
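 A sketch of the iframe approach Yaron describes, assuming an extension that permits raw iframe tags is installed (the host name is a placeholder):

```wikitext
<iframe src="https://sitea.example.org/index.php?title=Foo&action=render"
        width="100%" height="600" frameborder="0"></iframe>
```

 The "action=render" parameter makes MediaWiki return only the parsed page body, without the skin chrome, so the framed content looks like part of the local page.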

Nested SQL functions throw an rdbms error



 * Per an email exchange with Yaron: "There's no special handling of replace or any other commands."
 * This would lead me to think the syntax below should work... but it does not.


 * Does anyone have any suggestions or corrections to the code I'm trying to get working? Much appreciated if you do.
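 For reference, nested REPLACE calls are themselves valid SQL - the snippet below runs directly against MySQL - so the failure is presumably in how the extension or MediaWiki's rdbms layer handles the expression, not in the SQL syntax itself (table and column names here are placeholders):

```sql
SELECT REPLACE(REPLACE(REPLACE(page_title, '_', ' '),
                       '[', '&#91;'),
               ']', '&#93;') AS clean_title
FROM page;
```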

Unable to perform nested SQL REPLACE functions
I am attempting to query data from a local database, but need to replace 3 different characters with alternatives that won't trigger MediaWiki's parsing of the result. No matter the format I have tried, I receive the error pictured to the right when trying to display the page.

This snippet works, but only replaces one of the three characters necessary:

I need the SQL code below to work, but it creates a parse error whenever run inside a MediaWiki page:

or with HTML codes in place of the characters:

and in any combination of the character/HTML code I could think of. I even tried variations using the MediaWiki replace command, like this:

and

Same error, different SQL commands
In a very similar usage scenario, I would like to capitalize the first letter of the returned string and make sure the rest of the letters are lowercase. Unfortunately, I must also replace all '_' with a space, as in my issue above. Each of these examples works separately, but once they are nested together, in any combination, I get the same error as in my image above:

Eventually, this is the SQL I would like to run, but if it's too much for the extension I will work on finding a different solution: or
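One workaround, not suggested in this thread: push the string manipulation into a database view, so the #get_db_data call only has to select a plain column name. The view and column names below are placeholders:

```sql
-- MySQL: replace underscores, capitalize the first letter, lowercase the rest
CREATE VIEW page_display AS
SELECT
  page_id,
  CONCAT(
    UPPER(LEFT(REPLACE(page_title, '_', ' '), 1)),
    LOWER(SUBSTRING(REPLACE(page_title, '_', ' '), 2))
  ) AS display_title
FROM page;
```

With the view in place, the wiki side becomes an ordinary query, e.g. from=page_display and data=title=display_title, with no nested functions for the extension to mis-parse.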

Thanks for any help provided!

Get nested JSON data
If an API returns a non-flat JSON structure, it seems like it's not possible to access the deeper data. Or is it? Sophivorus (talk) 03:27, 28 June 2019 (UTC)
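In recent versions of External Data, nested values can be reached with JSONPath; the call looks roughly like the following (the URL and field paths are made up, and older releases may lack this format, so check the extension's documentation for your version):

```wikitext
{{#get_web_data:
url=https://example.org/api/item.json
|format=JSON with JSONPath
|data=name=$.item.name,price=$.item.offer.price
}}
{{#external_value:name}} costs {{#external_value:price}}
```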

Add http headers to get_web_data
We need to get data from an API with an authentication header that is produced dynamically. I have read this answer, which does not help - this is not a SOAP API.

To solve it, I inserted a hook mechanism into the get_web_data logic, so I can alter the data before the request. But now I have another problem - the Http class has no way to add headers to the request. I needed to extend it just for this. (And also to redeclare its get and post methods, because the original uses Http::request instead of self::request.)

So the full solution patch is here in Gerrit - it allows hooking the call ($url and $options passed by reference) and inserting additional headers via $options['headers']. I think the HttpWithHeaders class is overkill - this feature would belong better in the core Http class - but it's a start.
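Roughly, the idea in the patch looks like this (the hook name, its signature, and the MyAuthProvider helper are illustrative, reconstructed from the description above - they are not a released API):

```php
<?php
// Hypothetical hook registration in LocalSettings.php: alter the URL and
// request options before External Data performs the HTTP call.
$wgHooks['ExternalDataBeforeWebCall'][] = function ( &$url, array &$options ) {
	// Add a dynamically produced authentication header.
	// MyAuthProvider is a placeholder for whatever produces your token.
	$options['headers']['Authorization'] = 'Bearer ' . MyAuthProvider::getToken();
	return true;
};
```

The key point is that $url and $options are passed by reference, so the hook can inject per-request headers without subclassing the Http class at all.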