Extension:LinkedWiki

The LinkedWiki extension lets you reuse Linked Data in your wiki. You can fetch data from Wikidata or another source directly with a SPARQL query. The extension also provides Lua functions for building modules, so that you can write your data to your RDF database.

Quick start
After installing this extension:
 * 1) Open the special page: SPARQL editor
 * 2) Select a SPARQL service: a new SPARQL endpoint or one of your configured SPARQL services
 * 3) Insert a SPARQL query (examples of SPARQL queries)
 * 4) Select a visualization: an HTML table or a Sgvizler2 visualization
 * 5) For a Sgvizler2 visualization, you can click on the "See the doc" button to find its available options.
 * 6) Check the result
 * 7) To finish, open the "See Wiki code" tab and copy the generated wiki code into a page of your wiki
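The generated wiki code produced in step 7 looks roughly like the sketch below. The parameter names (`endpoint`, `chart`) and the Sgvizler2 class name are assumptions based on this page; the SPARQL editor's "See Wiki code" tab gives the exact syntax for your query:

```
{{#sparql:
SELECT ?country ?countryLabel
WHERE {
  ?country wdt:P31 wd:Q6256 .
  SERVICE wikibase:label { bd:serviceParam wikibase:language "en". }
}
LIMIT 10
|endpoint=https://query.wikidata.org/sparql
|chart=bordercloud.visualization.DataTable
}}
```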

See details: #sparql reuses your data in your wiki

Maps




Only three parameters are necessary to display a map in your wiki:
 * 1) a SPARQL query
 * 2) a SPARQL service (by default Wikidata)
 * 3) a visualization (charts or table, pivot, etc.)

For the leaflet.visualization.Map visualization with OpenStreetMap, you can add several options.

You can also use the google.visualization.Map visualization and see the log to debug your query or the visualization.

You can replace the config parameter with a SPARQL endpoint URL, but if it does not work, a specific configuration has to be created for this SPARQL service.
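A minimal map call could look like the sketch below. The coordinate variables and the overall shape are assumptions: Sgvizler2 map visualizations generally expect latitude/longitude columns in the query result, and the `chart` parameter names the leaflet.visualization.Map class mentioned above:

```
{{#sparql:
SELECT ?lat ?long ?cityLabel
WHERE {
  ?city wdt:P31 wd:Q515 ;
        p:P625/psv:P625 [ wikibase:geoLatitude ?lat ;
                          wikibase:geoLongitude ?long ] .
  SERVICE wikibase:label { bd:serviceParam wikibase:language "en". }
}
LIMIT 50
|chart=leaflet.visualization.Map
}}
```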

HTML table


By default, the #sparql parser function uses an HTML table, which can be customized with wiki templates.

This visualization supports SPARQL services with credentials, which have to be described in your LocalSettings.php.

Datatable
Alternatively, use a DataTable, where you can customize the HTML tag and the style of each column.

Build SPARQL queries
The LinkedWiki extension provides two SPARQL editors. The Flint editor works with SPARQL 1.1 or 1.0 endpoints, but it sometimes fails, for example with Wikidata.

We developed a new SPARQL editor where you can select, in one click, an endpoint already defined in your configuration and read (and write, if you want) via SPARQL directly in this editor.

See details: Special pages to test your queries and to build your visualizations for your wiki

Visualize SPARQL results
The extension provides the #sparql parser function to reuse your data or Linked Open Data in your wiki.

You can use a new SPARQL endpoint or reuse a SPARQL service already defined in the configuration of your wiki.

See details:
 * #sparql reuses your data in your wiki
 * (deprecated) #wsparql

Write data in the pages
The tag allows you to write RDF/Turtle (1.0 or 1.1) directly on any page of the wiki. All pages with this tag are placed in the category "RDF page".

If the option "check RDF Page" is enabled, the wiki checks the RDF before saving the page. If there is an error, the wiki shows the line of the Turtle code where the problem is.

Example of a page with the tag: an RDF documentation.

Share data
IRIs (or URIs) of pages with the tag are Cool IRIs. So a machine sees only the RDF content, while a human sees the RDF content with a description in natural language on the same page.

If your wiki is private, it is possible to open it only to your RDF database (see the installation).

Write data of main pages in a data namespace
The LinkedWiki extension creates the namespaces Data and UserData. Users navigate to these namespaces via a Data tab on all main/user pages.

Only users in the Data group can change these namespaces. A user or a bot can use them to write RDF/Turtle content related to a main page.

Push private data into an open knowledge base
For example, in a private wiki, you can enable the Push tab in order to easily push a private page, with its subpages, data, files, modules, etc., to a public wiki.

Configuration of SPARQL services
The configuration of SPARQL services is often non-trivial and varies a lot between RDF databases. In the extension's configuration, you can correctly declare public SPARQL services as well as your private SPARQL services.

The extension supports SPARQL services with credentials, so the users of your wiki can reuse your private data without seeing your credentials.

See details: Configuration of the LinkedWiki extension



Module: Lua class to read/write your data
Generally, for users, a wiki page is like an object whose properties they want to be able to read and write. Unfortunately, RDF schemas can be complex, and contributors are rarely experts in RDF or SPARQL.

The LinkedWiki extension can simplify the work of contributors and developers without imposing definitive RDF schemas on your data.

With the Lua class of LinkedWiki extension, you can build your own module (for example an infobox) where you are able to write or read a property of your RDF database via a SPARQL service.

If you want to change your RDF schemas, you simply change your modules and refresh your database and all the pages of your wiki via the special page "Refresh database".
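As a rough illustration, such a module could look like the sketch below. The require path and the method names (`getCurrentIRI`, `getValue`) are hypothetical; the real API is described on the "Use LinkedWiki in your module" page.

```lua
-- HYPOTHETICAL sketch of an infobox module using the LinkedWiki Lua class.
-- The require path and method names are assumptions, not the documented API.
local linkedwiki = require('linkedwiki')
local p = {}

function p.label(frame)
    -- read a property of the current page's subject from the RDF database
    local subject = linkedwiki.getCurrentIRI()
    return linkedwiki.getValue(subject, 'rdfs:label')
end

return p
```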

See details: Use LinkedWiki in your module

Write constraints and generate a SHACL report
The tag supports an attribute to specify how to check your data.

For the moment, LinkedWiki supports only SHACL. If RDFUnit is installed, the special page "RDF test cases" generates the SHACL report of your database with the rules written in the wiki. This special page shows the last calculated report and can recalculate it (which may take many minutes).

To enable the constraints to verify, you need to insert this attribute in the tag. Example:

Download instructions
You can download the latest version via this link.

Installation of LinkedWiki
To install this extension:
 * 1) copy the extension into the folder   of your wiki
 * 2) in that folder, execute   and   (or  ). If you have not installed composer or yarn, see the page: "How to install composer and yarn?"
 * 3) add the following line to :
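As a sketch, the steps above typically look like this on the command line. The folder layout and the composer/yarn subcommands are the usual MediaWiki conventions, not taken from this page; `wfLoadExtension` is the standard way to load a modern extension from LocalSettings.php:

```shell
# From the root of your MediaWiki installation (paths are assumptions
# based on the usual MediaWiki layout):
cd extensions/LinkedWiki

# Install the PHP and JavaScript dependencies:
composer install --no-dev
yarn install

# Load the extension in LocalSettings.php:
echo "wfLoadExtension( 'LinkedWiki' );" >> ../../LocalSettings.php
```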

You can now use Special:SPARQLEditor on your wiki to build a query with its chart/table and copy this chart/table into any page of your wiki. On LinkedWiki.com, you can find examples of queries with their wiki text.

By default, a query without endpoint or configuration is resolved by Wikidata.

Configuration of local SPARQL services
Wikidata is the default endpoint of your wiki (read-only).

If you add a new SPARQL service and change the default endpoint of your wiki, you need to add parameters to your LocalSettings.php.

Example for a Virtuoso SPARQL service, where we add the configuration "http://database-test/data":

If you want to replace Wikidata with this new SPARQL service, you also need to add this line:

If you want to use this SPARQL service to save all the RDF data of the wiki, you need to add this line:
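The original code examples are missing here. As a rough illustration only, a Virtuoso declaration in LocalSettings.php could look like the sketch below; all setting names (`$wgLinkedWiki…`) and array keys are hypothetical, and the real names are on the "Configuration of the LinkedWiki extension" page:

```php
// HYPOTHETICAL sketch for LocalSettings.php: the setting names and keys
// below are assumptions, not the extension's documented API.
// Declare a Virtuoso SPARQL service under the key "http://database-test/data":
$wgLinkedWikiSPARQLServices["http://database-test/data"] = [
    "endpointRead"  => "http://database-test:8890/sparql",       // public read endpoint
    "endpointWrite" => "http://database-test:8890/sparql-auth",  // authenticated write endpoint
    "login"         => "dba",
    "password"      => "dba",
];

// Replace Wikidata as the wiki's default endpoint:
$wgLinkedWikiDefaultConfig = "http://database-test/data";

// Use this service to store all the wiki's RDF data:
$wgLinkedWikiStorageConfig = "http://database-test/data";
```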


 * See details: Configuration
 * Examples of other endpoints: List of configurations

Make an infobox with the lua functions
If you want to make an infobox with the Lua functions of LinkedWiki, you need to install Extension:Scribunto and Extension:Capiunto. Next, read the quick start with Lua.

Add a data tab for main pages and user pages
LinkedWiki provides the tag to write RDF/Turtle directly in a page (its ontology or its SHACL rules, for example), but people often prefer to separate natural language from RDF/Turtle in their wiki.

In your LocalSettings.php, the following line inserts a new Data tab on all main pages and user pages. Afterwards, you can grant users the rights to see (or not) this tab and to change (or not) the pages in the Data namespace.

Add a push tab
When you have finished working in private (i.e., in a private wiki), you may want to push your pages (with their modules, templates, files, and data pages) to another (public) wiki. This installation inserts a discreet Push tab on your pages. On the Special:BotPasswords page of the target wiki, you only need to create the login and password for the LocalSettings.php of your source wiki.

You can add push targets (wikis you can push content to) by adding elements to the  array. The array keys should be the names of your wikis, and the values should point to the wiki root, without a trailing slash. You can find your wiki root by clicking the history tab on a page, finding '/index.php' in the URL, and taking everything to the left of it. You create the logins and passwords via Special:BotPasswords on the target wikis.

Example:
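Since the original example is missing, here is a rough illustration only; the array name `$wgLinkedWikiPushTargets` is an assumption, not the documented setting:

```php
// HYPOTHETICAL sketch for the LocalSettings.php of the source wiki;
// the array name is an assumption.
$wgLinkedWikiPushTargets = [
    // key: name of the target wiki, value: wiki root without a trailing slash
    "Public wiki" => "https://public.example.org/w",
];
```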

Details:

Compatibility: MediaWiki 1.33.1+

Check the RDF/Turtle syntax before saving
In the extension, you can enable the feature "check RDF Page". In your LocalSettings.php, you enable this feature with this line:
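The original line is missing; as an illustration only (the setting name is an assumption):

```php
// HYPOTHETICAL setting name; check the extension's configuration page
$wgLinkedWikiCheckRDFPage = true;
```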

This feature uses rapper to parse the Turtle syntax (1.0 and 1.1).

This tool is installed at the same time as the Raptor RDF library from Redland. You have to install redland or raptor2. For example, on CentOS, the commands are:
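The original commands are missing; a sketch for CentOS could be the following (package names are assumptions and may differ by release; raptor2 is typically available from EPEL):

```shell
# Install raptor2, which provides the rapper command-line tool
# (package names are assumptions and may differ by CentOS release)
sudo yum install -y epel-release
sudo yum install -y raptor2

# Check that rapper is available:
rapper --version
```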

Generate a SHACL report
You need to install RDFUnit. This tool is still experimental, but it is stable.

The extension expects RDFUnit in the folder /RDFUnit of your server (or a link to this folder). The special page "RDF Unit" shows the command line to test the installation and displays the SHACL report or the errors of RDFUnit.

Here is an example of installing RDFUnit v0.8.21 (the latest release) on a CentOS server:
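The original commands are missing; a sketch could be the following. RDFUnit is a Java/Maven project hosted at github.com/AKSW/RDFUnit; the tag name and build options are assumptions:

```shell
# Build RDFUnit from source into /RDFUnit (requires git, Java, and Maven;
# the tag name "v0.8.21" is an assumption based on the release number above)
sudo yum install -y git java-1.8.0-openjdk-devel maven
sudo git clone --branch v0.8.21 https://github.com/AKSW/RDFUnit.git /RDFUnit
cd /RDFUnit
mvn clean install -DskipTests
```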

Refresh your RDF database with RDF pages of private wiki
If your wiki is private, the special page "Refresh RDF database" does not work without Extension:NetworkAuth. The RDF database's SPARQL service needs to read the RDF pages of the private wiki without credentials.

If your database is installed on the same server as the wiki, the configuration for Extension:NetworkAuth will be:
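The original snippet is missing; a sketch for LocalSettings.php could look like this. The exact keys are described in the Extension:NetworkAuth documentation, and the IP range and account name here are assumptions:

```php
// Sketch: let requests from the local database host read the wiki
// as a dedicated account; see Extension:NetworkAuth for the exact keys.
$wgNetworkAuthUsers[] = [
    'iprange' => [ '127.0.0.1' ],  // the database runs on the same server
    'user'    => 'RDF database',   // wiki account used for these requests
];
```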

Errors about cURL
If, after the installation, you get errors about cURL, you probably need to install the php7.X-curl library on your server. Example for PHP 7.3 on Ubuntu, Debian, or Fedora:
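The original commands are missing; typical package installs would be the following (package names are assumptions and vary by distribution and PHP version):

```shell
# Ubuntu / Debian (PHP 7.3):
sudo apt-get install -y php7.3-curl

# Fedora (the cURL extension is usually bundled with the php packages;
# the package name here is an assumption):
sudo dnf install -y php-common

# Restart your web server afterwards, e.g.:
sudo systemctl restart apache2
```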

Questions?

 * FAQ and problems

How to install composer and yarn?
For Debian or Fedora:
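The original commands are missing; typical installs would be the following (package names are assumptions; see getcomposer.org and yarnpkg.com for the official instructions):

```shell
# Composer:
sudo apt-get install -y composer   # Debian/Ubuntu
sudo dnf install -y composer       # Fedora

# Yarn via npm (assumes Node.js/npm is already installed):
sudo npm install -g yarn
```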

How to propose a new feature?

 * You can propose a new task: Propose a new feature

How to declare a problem?

 * You can declare a problem here: Declare a problem

See also


 * SPARQL examples: LinkedWiki.com, University of Paris-Saclay and Wikidata
 * SPARQL in Wikipedia
 * SPARQL tutorial in French (Wikiversity)