Extension:External Data

Description
The External Data extension allows MediaWiki pages to retrieve, filter, and format data from one or more sources. In addition to external URLs, data can also come from a regular wiki page, an uploaded file, an LDAP directory, or a relational database.

The extension defines six parser functions:
 * #get_external_data retrieves CSV, GFF, JSON or XML data from a URL and assigns it to variables that can be accessed on the page.
 * #get_db_data retrieves data from a relational database.
 * #get_ldap_data retrieves data from an LDAP server.
 * #external_value displays the value of any such variable.
 * #for_external_table cycles through all the values retrieved for a set of variables, displaying the same "container" text for each one.
 * #store_external_table cycles through a table of values, mimicking a call to the Semantic Internal Objects extension's #set_internal function for each row (Semantic Internal Objects must be installed for this function to work).

Download
You can download the External Data code in either of these two compressed files:

 * external_data_1.1.tar.gz
 * external_data_1.1.zip

You can also download the code directly via SVN from the MediaWiki source code repository, at http://svn.wikimedia.org/svnroot/mediawiki/trunk/extensions/ExternalData/. From a command line, you can call the following:

svn checkout http://svn.wikimedia.org/svnroot/mediawiki/trunk/extensions/ExternalData/

You can also view the code online here.

Installation
To install this extension, create an 'ExternalData' directory (either by extracting a compressed file or downloading via SVN), and place this directory within the main MediaWiki 'extensions' directory. Then, in the file 'LocalSettings.php', add the following line:
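Assuming the standard MediaWiki extension-include pattern of that era and the 'ExternalData' directory name, the line is:

```php
require_once( "$IP/extensions/ExternalData/ExternalData.php" );
```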

Authors
External Data was written by Yaron Koren, reachable at yaron57@gmail.com; Michael Dale; and David Macdonald. Important code contributions were also made by Siebrand Mazeland and Ryan Lane.

Retrieving data
Data can be retrieved from three different sources: from a URL containing structured data (including a URL on the wiki itself), from a database, and from an LDAP server.

#get_external_data - CSV, GFF, JSON, XML
To get data from an external URL, call the following:
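A sketch of the call, using the positional syntax shown in the examples later on this page (all names here are placeholders):

```
{{#get_external_data:URL
|format
|localVariable1=externalVariable1
|localVariable2=externalVariable2}}
```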

An explanation of the fields (the parameters after URL and format are divided into two types, filters and mappings):
 * URL is the full URL of the CSV, GFF, JSON or XML file. (CSV, JSON and XML are standard data formats; GFF, or the Generic Feature Format, is a format for genomic data.)
 * format specifies the format of the file: it should be one of either 'CSV', 'CSV with header', 'GFF', 'JSON' or 'XML'. The difference between 'CSV' and 'CSV with header' is that 'CSV' is simply a set of lines with values; while in 'CSV with header', the first line is a "header", holding a comma-separated list of the name of each column.
 * parameters containing a '==' are filters - they do additional filtering on the set of rows being returned. It is not necessary to use any filters; most APIs, it is expected, will provide their own filtering ability through the URL's query string.
 * parameters containing a '=' are mappings - they let you connect local variable names to external variable names. External variable names are the names of the values in the file (in the case of a header-less CSV file, the names are simply the indexes of the values (1, 2, 3, etc.)), and local variable names are the names that are later passed in to #external_value.

More than one #get_external_data call can be used in a page. If this happens, though, make sure that every local variable name is unique.

For data from XML sources, the variable names are determined by both tag and attribute names. For example, given XML text like the following (the outer tag name is illustrative):

 <fruit type="Apple"><color>red</color></fruit>

the variable type would have the value Apple, and the variable color would have the value red.

Similarly, XML text like the following would be interpreted as a table of values defining two variables named type and color (the type values are illustrative):

 <fruit type="Apple"><color>red</color></fruit>
 <fruit type="Pear"><color>brown</color></fruit>
 <fruit type="Kiwi"><color>brown</color></fruit>

Currently, a CSV file must be literally a CSV file, i.e. delimited by commas. A call for a headerless CSV file might look like "{{#get_external_data:URL|CSV|name=1|price=2}}", while a call for a CSV file with a header row might look like "{{#get_external_data:URL|CSV with header|name=Name|price=Price}}" (the variable and column names here are illustrative).

You can also set caching to be done on the data retrieved, and string replacement to hide API keys; see the "Usage" section, below, for how to do both of those.

Getting data from a non-API text file
If the data you wish to access is on a MediaWiki page or in an uploaded file, you can use the above methods to retrieve it, assuming the page or file contains only data in one of the supported formats:
 * for data on a wiki page, use "&action=raw" as part of the URL;
 * for data in an uploaded file, use the full path.

If the MediaWiki page with the data is on the same wiki, it is best to use the fullurl: parser function, e.g.
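For instance, assuming a data page called "MyDataPage" holding CSV with a header row (the variable mappings are illustrative):

```
{{#get_external_data:{{fullurl:MyDataPage|action=raw}}|CSV with header|name=Name|population=Population}}
```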

Similarly, for uploaded files, you can use the filepath: function, e.g.
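For instance, assuming an uploaded file called "MyData.csv" (the variable mappings are illustrative):

```
{{#get_external_data:{{filepath:MyData.csv}}|CSV with header|name=Name|population=Population}}
```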

For wiki pages that have additional information, the External Data extension provides a way to create an API of your own, at least for CSV data. To get this working, first place the data you want accessed in its own wiki page, in CSV format, with the headers as the top row of data (see here for an example). Then, the special page 'GetData' will provide an "instant API" for accessing either certain rows of that data, or the entire table. By adding "field-name=value" to the URL, you can limit the set of rows returned.
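An illustrative 'GetData' URL, assuming a wiki at mywiki.com, a data page called "Fruits data", and a "color" column to filter on:

```
http://mywiki.com/wiki/Special:GetData/Fruits_data?color=red
```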

A URL for the 'GetData' page can then be used in a call to #get_external_data, just as any other data URL would be; the data will be returned as a CSV file with a header row, so the format argument of #get_external_data should be set to 'CSV with header'. See here for an example of such data being retrieved and displayed using #get_external_data and #for_external_table. In this way, you can use any table-based data within your wiki without the need for custom programming.

Data caching
You can configure External Data to cache the data contained in the URLs that it accesses, both to speed up retrieval of values and to reduce the load on the system whose data is being accessed. To do this, you can run the SQL contained in the extension file 'ExternalData.sql' in your database, which will create the table 'ed_url_cache', then add the following to your LocalSettings.php file, after the inclusion of External Data:
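The setting that points External Data at the cache table would look like this (the variable name follows the extension's $edg naming convention; verify it against your version):

```php
$edgCacheTable = 'ed_url_cache';
```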

You should also add a line like the following, to set the expiration time of the cache, in seconds; this example line will cache the data for a week:
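A week is 604,800 seconds, so the line would look like the following (the variable name follows the extension's $edg naming convention; verify it against your version):

```php
$edgCacheExpireTime = 7 * 24 * 60 * 60; // one week, in seconds
```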

String replacement in URLs
One or more of the URLs you use may contain a string that you would prefer to keep secret, like an API key. If that's the case, you can use the array $edgStringReplacements to specify a dummy string you can use in its place. For instance, let's say you want to access the URL "http://worlddata.com/api?country=Guatemala&key=123abcd", but you don't want anyone to know your API key. You can add the following to your LocalSettings.php file, after the inclusion of ExternalData:
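Based on the example below, the mapping from dummy string to real key would look like this (both values are illustrative):

```php
$edgStringReplacements['WORLDDATA_KEY'] = '123abcd';
```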

Then, in your call to #get_external_data, you can replace the real URL with: "http://worlddata.com/api?country=Guatemala&key=WORLDDATA_KEY".

Whitelist for URLs
You can create a "whitelist" for URLs accessed by #get_external_data: in other words, a list of domains such that only URLs from those domains can be accessed. If you are using string replacements to hide secret keys, it is highly recommended that you create such a whitelist, to prevent users from discovering those keys by including the dummy strings in a URL within a domain that they control.

To create a whitelist with one domain, add the following to LocalSettings.php:
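A sketch, using the extension's whitelist variable with a single domain (the domain is illustrative):

```php
$edgAllowExternalDataFrom = 'http://worlddata.com/';
```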

To create a whitelist with multiple domains, add something like the following instead:
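A sketch with an array of domains (the domains are illustrative):

```php
$edgAllowExternalDataFrom = array(
	'http://worlddata.com/',
	'http://example.org/'
);
```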

#get_db_data - retrieve data from RDBMS
The parser function #get_db_data allows retrieval of data from external databases. This function executes a simple SELECT statement and assigns the results to local variables that can then be used with the #external_value or #for_external_table functions.

A note about security: if you are going to use #get_db_data, you should think hard about the security implications. Configuring a database in LocalSettings.php will allow anyone with edit access to your wiki to run SQL against that database. You should use a database user that has the minimum permissions needed for what you are trying to achieve. It is quite possible that complex SQL constructions could be passed to this function to make it do things vastly different from what it was designed for. You should know what you are doing before enabling this function.

Configuration
You need to configure each database server in LocalSettings.php. Add the following stanza for each server:

$edgDBServer['serverID'] = "server.com";
$edgDBServerType['serverID'] = "dbType";
$edgDBName['serverID'] = "mydb";
$edgDBUser['serverID'] = "myuser";
$edgDBPass['serverID'] = "mypassword";

Where:


 * serverID is a label for this database which is used when calling #get_db_data
 * server.com is the hostname on which the database lives
 * dbType is the type of database, e.g. mysql, postgres, mssql or oracle (Oracle support works only on MediaWiki 1.16 or greater)
 * mydb, myuser and mypassword are details for accessing the database.

Usage
To get data from an external database, call the following:
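A sketch of the call, assuming named parameters matching the field list below (all values are placeholders; the exact delimiters may differ by version):

```
{{#get_db_data:server=serverID
|from=tableName
|where=someColumn='someValue'
|data=localVariable1=columnName1,localVariable2=columnName2}}
```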

An explanation of the fields:


 * server - the serverID label configured in LocalSettings.php
 * from - an SQL from clause
 * where - an SQL where clause
 * data - mapping of database column names to local variables (syntax: localVariable=databaseColumn - i.e. "strName" is the name of the database column in the example below).
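Since the bullet above mentions a database column named strName, a reconstructed sketch of such a call (the table name and where clause are assumptions):

```
{{#get_db_data:server=myDB
|from=employees
|where=emp_id=42
|data=name=strName}}

Name: {{#external_value:name}}
```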

#get_ldap_data - retrieve data from LDAP directory
The parser function #get_ldap_data allows retrieval of data from external LDAP directories. This function executes LDAP queries and assigns the results to local variables that can then be used with the #external_value function. It currently only handles a single row of results, and so is most appropriate for querying directories for data about individual users.

A note about security: if you are going to use #get_ldap_data, you should think hard about the security implications. Configuring an LDAP server in LocalSettings.php will allow anyone with edit access to your wiki to run queries against that server. You should use a domain user that has the minimum permissions needed for what you are trying to achieve. Wiki users could run queries to extract all sorts of information about your domain. You should know what you are doing before enabling this function.

Configuration
You need to configure each LDAP server in LocalSettings.php. Add the following stanza for each server:

$edgLDAPServer['domain'] = "myldapserver.com";
$edgLDAPBaseDN['domain'] = "[basedn]";
$edgLDAPUser['domain'] = "myDomainUser";
$edgLDAPPass['domain'] = "myDomainPassword";

Where:


 * domain is a label to be used when calling #get_ldap_data
 * myDomainUser and myDomainPassword are credentials used to bind to the LDAP server
 * [basedn] is the base DN used for the search.

Example:

$edgLDAPServer['foobar'] = "foobar.com";
$edgLDAPBaseDN['foobar'] = "OU=Users,dc=foobar,dc=com";
$edgLDAPUser['foobar'] = "u12345";
$edgLDAPPass['foobar'] = "mypassword";

Usage
To query the LDAP server add this call to a wiki page:
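A sketch of the call, using named parameters matching the field list below (the filter and attribute names are illustrative):

```
{{#get_ldap_data:domain=foobar
|filter=(cn=John Smith)
|data=name=cn,email=mail}}
```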

Where:


 * domain is the label used in LocalSettings.php
 * filter is the LDAP filter used for the search
 * data is the mappings of LDAP attributes to local variables

An example that retrieves a user from Windows 2003 Active Directory, using a userid passed to a template:
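A reconstructed sketch of such a call (the LDAP attribute names and the {{{userid|}}} template parameter are assumptions):

```
{{#get_ldap_data:domain=foobar
|filter=(samaccountname={{{userid|}}})
|data=name=cn,phone=telephonenumber,email=mail}}

Name: {{#external_value:name}}
Phone: {{#external_value:phone}}
```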

Note that #get_ldap_data will only retrieve one result.

Displaying data
Once you have retrieved the data onto the page, from any source, there are two ways to display it on the page: #external_value and #for_external_table.

Displaying individual values
If the call retrieved a single value for each variable specified, you can call the following:
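The call is simply:

```
{{#external_value:variableName}}
```

where variableName is a local variable assigned by one of the #get_... functions.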

As an example, this page contains the following text:

{{#get_external_data:http://semanticweb.org/wiki/Special:Ask/-5B-5BGermany-5D-5D/-3FBorders/-3FPopulation/-3FArea/format%3Dcsv/sep%3D, |csv with header|bordered countries=borders|population=population|area=area}}

 * Germany borders the following countries and bodies of water: {{#external_value:bordered countries}}
 * Germany has a population of {{#external_value:population}}.

The page gets data from a URL at semanticweb.org, generated by the Semantic MediaWiki extension. That URL contains the following text:

Germany,"North Sea,Denmark,Baltic Sea,Poland,Czech Republic,Austria,Switzerland,France,Luxembourg,Belgium,Netherlands","82,411,000",3.5705e+11 m²

The page then uses #external_value to display the 'bordered countries' and 'population' values; although it uses the #arraymap function, defined by the Semantic Forms extension, to apply some transformations to the 'bordered countries' value (you can ignore this detail if you want).

Displaying a table of values
The data returned by #get_external_data or #get_db_data (#get_ldap_data doesn't support this feature) can also be a "table" of data (many values per field), instead of just a single "row" (one value per field). In this case, you can call the function #for_external_table to display it. For example, this URL at semanticweb.org contains information similar to that above, but for all the countries in Africa instead of just one country. Calling #get_external_data with this URL, with the same format as above, will set the local variables to contain arrays of data, rather than single values. You can then call #for_external_table, which has the following format:
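As described just below, the call takes a single expression:

```
{{#for_external_table:expression}}
```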

...where "expression" is a string that contains one or more variable names, surrounded by triple brackets. This string is then displayed for each retrieved "row" of data.

For an example, this page contains a call to #get_external_data for the semanticweb.org URL mentioned above, followed by this call:
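A reconstructed sketch of such a call (the column variables country, population and area are assumptions based on the data described above; {{!}} is the pipe template explained below):

```
{| class="wikitable"
! Country !! Population !! Area
{{#for_external_table:
{{!}}-
{{!}} {{{country}}} {{!}}{{!}} {{{population}}} {{!}}{{!}} {{{area}}}
}}
|}
```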

The call to #for_external_table holds a single row of a table, in wiki-text; it's surrounded by wiki-text to create the top and bottom of the table. The presence of "{{!}}" is a standard MediaWiki trick to display pipes from within parser functions; to get it to work, you just have to create a page called "Template:!" that contains a single pipe. Much simpler calls to #for_external_table can be made if you just want to display a line of text per data "row", but an HTML table is the standard approach.

There's one other interesting feature of #for_external_table: it lets you URL-encode specific values, by calling them with {{{variableName.urlencode}}} instead of just {{{variableName}}}. For instance, if you wanted to show links to Google searches on a set of retrieved terms, you could call:
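A sketch of such a call (the variable name term is an assumption):

```
{{#for_external_table:* [http://www.google.com/search?q={{{term.urlencode}}} {{{term}}}] }}
```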

Using External Data with Semantic MediaWiki
A common approach is to use External Data in conjunction with the Semantic MediaWiki extension: the data is retrieved using External Data, then stored using Semantic MediaWiki tags, so it can then be queried, aggregated, mapped etc. on the wiki. If you take this approach, you should note a common problem, which is that the data stored by SMW does not get automatically updated when the data coming from the external source changes. The best solution for this, assuming you expect the data to change over time, is to create a cron job to call the SMW maintenance script "SMW_refreshData.php" at regular intervals, such as once a day; that way, the data is never more than a day old.
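A daily crontab entry might look like the following (the installation path is an assumption; check the script's own options before relying on it):

```
0 3 * * * php /var/www/wiki/extensions/SemanticMediaWiki/maintenance/SMW_refreshData.php
```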

If you're using a version of SMW older than 1.4.3, then in order to be able to run SMW_refreshData.php you will need to add a small patch to Semantic MediaWiki. In the file "/includes/storage/SMW_SQLStore2.php", under line 1278 ("foreach ($titles as $title) {"), you should add the following two lines:

If you want to store an entire table of data in a single page, instead of just individual values, you should use the #store_external_table function. This function works as a hybrid of the #for_external_table function and the #set_internal function from the Semantic Internal Objects extension (that extension must be installed for this function to work). A call to it looks a lot like a call to #set_internal (see the Semantic Internal Objects page for documentation), but it loops over each row, and uses variables, in the same way as #for_external_table. You can see a demonstration of this function on the page fruits semantic data; the call to #store_external_table on that page looks like:
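A sketch of such a call, mimicking #set_internal's syntax of an object-to-page property followed by property=value pairs (the property and variable names are assumptions):

```
{{#store_external_table:Has fruit
|Fruit name={{{name}}}
|Color={{{color}}}
}}
```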

Common problems

 * If the call to #get_external_data or #for_external_table isn't returning any data, and the page being accessed is large, it could be because the retrieval call is timing out. You should set the $wgHTTPTimeout variable in your LocalSettings.php file (which represents a number of seconds) to some number greater than 3, its default value. You could call, for instance:
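For example, to allow up to a minute per request:

```php
$wgHTTPTimeout = 60;
```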




 * If the data being accessed has changed, but the wiki page accessing it still shows the old data, it is because that page is being cached by MediaWiki. There are several solutions to this: if you are an administrator, you can hit the "refresh" tab above the page, which will purge the cache. You can also easily disable caching for the entire wiki; see here for how. Finally, if you wait long enough (typically no more than 24 hours), the page will get refreshed on its own and display the new data.


 * If your wiki uses HTTPS, you may be unable to connect to external URLs that use only HTTP, or do not have a proper security certificate. To get around this problem, you can add the following to LocalSettings.php:



Version
External Data is currently at version 1.1. See the entire version history.

Bugs and feature requests
You should use the Semantic MediaWiki mailing list, semediawiki-user, for any questions, suggestions or bug reports about External Data. If possible, please add "[ED]" at the beginning of the subject line, to clarify the subject matter.

(Although the External Data extension in general is independent of Semantic MediaWiki, the fact that it uses Semantic Internal Objects for one of its functions, along with the fact that the two extensions are often used together, means that External Data can be considered part of the Semantic MediaWiki "family" for communication purposes.)

You can also send specific code patches to Yaron Koren, at yaron57@gmail.com.

Translating
Translation of External Data is done through translatewiki.net. The translation for this extension can be found here. To add language values or change existing ones, you should create an account on translatewiki.net, then request permission from the administrators to translate a certain language or languages on this page (this is a very simple process). Once you have permission for a given language, you can log in and add or edit whatever messages you want to in that language.

Links

 * Presentation at Wikimania 2009