Extension talk:External Data/Archive 2017 to 2018

#display_external_table:... lower case parameter names
Hi, I was in the process of updating the documentation for External Data but I got stuck with a question. It seems that #display_external_table can only handle lower-case parameter names as defined in #get_web_data. Is this by design or is this a "bug"?

When you change any of the letters in a parameter name to upper case, like paraMeter, #get_web_data still works just fine but #display_external_table stops working. Example:
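A minimal illustration of the issue; the URL, template, and column names here are hypothetical:

```
{{#get_web_data:url=http://example.com/data.csv
|format=CSV with header
|data=paraMeter=price}}

{{#display_external_table:template=PriceRow
|data=paraMeter=paraMeter}}
```

With the mixed-case name paraMeter, the #get_web_data call succeeds but the #display_external_table call produces nothing.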

--Jongfeli (talk) 13:22, 30 January 2014 (UTC)


 * That definitely sounds like a bug, yes. Yaron Koren (talk) 16:49, 30 January 2014 (UTC)


 * Well Yaron, I was a little too fast with this one. We were not running the latest version of External Data and it was already fixed "ages" ago, thanks. Sorry for the "trouble" :) --Felipe (talk) 07:24, 31 January 2014 (UTC)


 * Great, that's a relief! Yaron Koren (talk) 13:02, 31 January 2014 (UTC)

Basic auth in conjunction with External Data?
First of all: great extension! So many possibilities! :-) As of now, I'm using External Data to access semantic data on another wiki. It took me some time, but I think I have tackled all of my problems, or at least most of them. The next step for me is accessing XML and JSON data from some of our university's web services. The problem is, I need basic auth to access these files. Is this even possible? If so, could anyone give me a clue as to where to start looking?

-- Tobi (talk) 10:43, 19 February 2014 (UTC)


 * Sorry for the very long delay on this. No, it's not possible - see below for one idea on how it could be implemented. Yaron Koren (talk) 12:27, 14 March 2014 (UTC)

ldap tabular result
Extension:External_Data says: It currently only handles a single row of results, and so is most appropriate for querying directories for data about individual users. For our use case we'd need tabular results. What needs to be done to modify the extension's behavior appropriately? --Seppl2013 (talk) 10:31, 22 February 2014 (UTC)


 * Sorry for the long delay. Unfortunately, I don't know much about the LDAP-accessing portion - all that code was written by someone else. Hopefully it's not hard to do. Yaron Koren (talk) 12:28, 14 March 2014 (UTC)

Using #get_web_data to retrieve data from file on the same wiki where login is required to read
If I understand things correctly, if you have a wiki where users must log in in order to read, then External Data won't be able to read the files that have been uploaded. I've seen the discussion here where you mention that the only ways around this are to either create a web page to log into the wiki or use #get_db_data.

Is there some way I could patch the extension to get around this? One possibility that comes to mind is to create a separate account for the extension and then use these credentials for it to log in. This seems like a bit of a hack since the extension is part of the wiki but doesn't have permission to actually access it. Is there some other approach I should be looking into?

--ChrisDavis (talk) 12:56, 12 March 2014 (UTC)


 * It would be great if External Data could support something like that. It would most likely require using an account to log in, yes, as hack-ish as that sounds. I'm imagining something like the setup for #get_db_data, where the username and password, and possibly a login URL as well, are stored in LocalSettings.php for each site/wiki being accessed. I don't know how hard or easy that would be to implement, especially given the different kinds of authentication setups that different kinds of sites have. Yaron Koren (talk) 12:27, 14 March 2014 (UTC)
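To make the idea concrete, here is a purely hypothetical sketch of what such per-site settings could look like in LocalSettings.php, modeled on the existing $edgDB* variables; none of these $edgLogin* variables actually exist in the extension:

```
## Hypothetical settings -- not implemented in External Data
$edgLoginURL['mywiki']      = "https://wiki.example.com/index.php?title=Special:UserLogin";
$edgLoginUsername['mywiki'] = "ExtensionBot";
$edgLoginPassword['mywiki'] = "secret";
```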

Use as RSS reader
I am thinking of using this as an RSS reader since existing RSS readers have various issues.

What is the method to limit the number of posts with this extension? Acnetj (talk) 16:54, 23 March 2014 (UTC)


 * That's an interesting usage; if you can get it working, that's great. Unfortunately, there's no current way to limit results, although this would be easy to accomplish, by just adding a "limit=" parameter to #get_web_data, in the same way that #get_db_data has one now. Is that the only obstacle you know of to allowing External Data to serve as an RSS reader? Yaron Koren (talk) 17:49, 23 March 2014 (UTC)
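For comparison, the existing parameter on #get_db_data looks like this (the database ID, table, and column names are made up):

```
{{#get_db_data:db=mydb
|from=posts
|data=title=title,date=date
|limit=5}}
```

A hypothetical limit= on #get_web_data would presumably follow the same pattern.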


 * I can get it working, except that I cannot limit the data. The other thing is that the date and time from the RSS cannot be reformatted through existing functions. 203.185.14.234 06:05, 24 March 2014 (UTC)


 * Right now I am switching to a JS-based RSS reader through the Widgets extension, and that gives me a better result. Acnetj (talk) 16:06, 24 March 2014 (UTC)


 * Alright, cool. Adding a "limit=" parameter sounds like a good idea in any case, though. Yaron Koren (talk) 16:26, 24 March 2014 (UTC)

Oracle database
I'm having problems setting up/verifying my connection to an external Oracle database. The database is local, Oracle XE 11g. I'm using BitNami MediaWiki 1.22.5. (And I'm a newbie at this)

This is what I have in LocalSettings.php:

$edgDBServer['XE'] = "jdbc:oracle:thin:@//:1521/XE";
$edgDBServerType['XE'] = "oracle";
$edgDBName['XE'] = "XE";

This is my query:

I'm sure that the $edgDBServer value is wrong, but I have tried all kinds of combinations, with and without jdbc:oracle:thin, with and without /XE, and can't figure out what this should be. I either get an error about compiling PHP with the --with-oci8 option, or "(Cannot contact the database server)". I can log into the database using PL/SQL Developer and SQL*Plus with the same user and password.
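For what it's worth, External Data connects through PHP's database drivers rather than JDBC, so a jdbc:oracle:thin URL is unlikely to work; with PHP's oci8 driver the server value is usually an Easy Connect string of the form host[:port]/service. A guess at the settings, assuming the database runs locally (the user and password values are placeholders):

```
$edgDBServer['XE']     = "localhost:1521/XE";
$edgDBServerType['XE'] = "oracle";
$edgDBName['XE']       = "XE";
$edgDBUser['XE']       = "myuser";
$edgDBPass['XE']       = "mypassword";
```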

How do I figure out what the structure of this value should be?

Thanks! Parrotheadcali (talk) 15:46, 17 April 2014 (UTC)

#get_web_data - CSV Filters
So this might be a naive question but I have a CSV file with the following structure and data:

Here is what I have on the Wiki page:

So my question:

Is there a way to use a greater-than, less-than, etc. filter instead of a filter that only does an equality comparison?
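For reference, the only filtering #get_web_data supports is an exact match via filters= (the URL and column names here are hypothetical):

```
{{#get_web_data:url=http://example.com/data.csv
|format=CSV with header
|data=name=name,price=price
|filters=price=100}}
```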

Thank you. Clifford.kasper (talk) 21:09, 8 May 2014 (UTC)


 * Unfortunately, no - the filtering capability of #get_web_data is quite limited. You'd have to do that kind of filtering on the server side (i.e., some kind of dynamic API to generate the CSV), or via SMW, by storing the CSV data as internal semantic data and then querying on that. Yaron Koren (talk) 21:26, 8 May 2014 (UTC)


 * Thanks for the information Yaron. --Clifford.kasper (talk) 13:50, 23 May 2014 (UTC)

Just copying every line of the input file to the page?
This is a great extension, but I've run into some problems making it work properly. I have a MediaWiki instance which I'm feeding with the results of my PhD. I'd like to include the files with the raw results (CSV), and then have them shown in a chart on the page. So, I just need to copy every line of the input file into the page. To do this, I'm doing:





Example lines contained in the CSV:

,80%,60%,40%,20%
0,1,3,5,8

This is not working. However, if I input the lines directly, it works (so it looks to me like a problem with External Data :S). The extension's page states that copying every row is supported directly by #for_external_table, but nowhere does it explain how to do it. Could somebody shed some light on this point?

Thank you.


 * I'm not familiar with <pLines>, but I'm guessing that it's a tag defined by a MediaWiki extension. The problem with such tag functions is that they're called before their contents are parsed - so what you're passing in to it is literally the string "#get_web_data...". Thankfully, there's an easy solution: you can use MediaWiki's built-in parser function #tag to call tag functions in a way so that their contents will be parsed correctly. Yaron Koren (talk) 12:35, 15 May 2014 (UTC)
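A sketch of the suggested workaround, assuming the tag is called pLines; the URL and column names are hypothetical:

```
{{#tag:pLines|
{{#get_web_data:url=http://example.com/results.csv
|format=CSV
|data=x=1,y=2}}
{{#for_external_table: {{{x}}},{{{y}}} }}
}}
```

Because #tag is a parser function, the #get_web_data and #for_external_table calls are expanded before the result is handed to the pLines tag.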

There is a section on the pChart4mw extension page that shows how to use the charting functionality with parser functions. I think this will help you:


 * 1) Use the External Data call outside of the charting
 * 2) Use the Parser Function syntax

That should work for you. --Clifford.kasper (talk) 14:03, 23 May 2014 (UTC)

Issue with internal CSV lists since upgrade [RESOLVED]
My host just upgraded to PHP 5.4.4 and MySQL 5.5.37, and something odd happened after that. I'm using #get_web_data to get values internally from a CSV list on a wiki page (between includeonly tags). Those values are called through #for_external_table or #display_external_table to produce a tabular output. This was working fine before the upgrade, but soon after it all values in the first column of each table went missing. It's possible to cheat one's way out of it by adding a dummy column in front, in this case by adding a comma at the beginning of every row of the comma-separated list. Clearly something goes wrong with the way that list is read or parsed. Cavila (MW 1.22, MySQL 5.5.37-0, Php 5.4.4-14 squeeze, SMW 1.9.2, SF 2.7) 08:55, 20 May 2014 (UTC)


 * No idea - could you include here the calls you're making, and/or the data set? Yaron Koren (talk) 15:14, 20 May 2014 (UTC)

Yaron has looked into this and submitted a fix to ED_Utils.php (git link). Thanks for this! Cavila (MW 1.22, MySQL 5.5.37-0, Php 5.4.4-14 squeeze, SMW 1.9.2, SF 2.7) 10:38, 24 May 2014 (UTC)

{{#iferror: broken with this extension
@Yaron_Koren The error message is not wrapped in anything with class="error", so I can't feed it through {{#iferror:}} and get useful results (demonstration).
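The intended pattern, which currently fails because External Data's error output lacks the class="error" wrapper that #iferror looks for (the URL is hypothetical):

```
{{#iferror: {{#get_web_data:url=http://example.com/data.csv|format=CSV|data=x=1}} | Could not fetch the data. | Data fetched. }}
```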

Also, https://git.wikimedia.org/metrics/mediawiki%2Fextensions%2FExternal_Data which is linked from the main page does not work, and https://github.com/wikimedia/mediawiki-extensions-ExternalData is a mirror to which I can't add issues, so it's slightly tricky to find a better place to report this.


 * Oh, I never thought about that #iferror thing - that's a good point. The places to report bugs are here, or on Bugzilla, or on the SMW mailing list. Yaron Koren (talk) 01:08, 16 July 2014 (UTC)