From MediaWiki.org

This extension is still under active development. This document contains some additional information that may be of interest to developers contributing to this project.

Working with Wikibase[edit]

WikibaseMediaInfo sits on top of Wikibase. If you're working with WikibaseMediaInfo you'll need to keep an eye on what's happening in Wikibase, because changes there can affect you. For example, Wikibase JS config vars can come and go, and if one disappears it might catch you out.

Also watch out for conceptual differences between the two. For example, in MediaInfo every File page has a corresponding MediaInfo item. Sometimes that item doesn't exist in the database yet, in which case it is a virtual item consisting only of an id. As far as Wikibase is concerned such an item doesn't exist. This has tripped us up with Wikibase's entityLoaded hook: it doesn't fire if there is no Wikibase entity in the db, and we need it to fire for virtual items as well as concrete ones, so we can't use it.

Wikibase code is heavily abstracted and it can take some work to understand it and how to use it from WikibaseMediaInfo. Instantiating objects in particular can be a bit tricky - factories are wrapped inside callbacks that are in turn wrapped inside dispatching factories (factories that delegate object instantiation to other factories depending on their inputs). There's a WikibaseRepo service locator which you can access statically using WikibaseRepo::getDefaultInstance(), and to get utility classes like serializers or lookups you can usually find some kind of get*() method on the service locator that will give you what you need.

Here's an example - on the File page we want to be able to make the MediaInfo item associated with the page available to javascript. We do that by writing a serialized version of the item to a js config var inside the onBeforePageDisplay() hook in WikibaseMediaInfoHooks.php. Here's a simplified version of the code:

// Note: the OutputPage object $out is passed into the hook by default
use Wikibase\MediaInfo\DataModel\MediaInfoId;
use Wikibase\Repo\WikibaseRepo;

// Step 1: get the entity from storage
$pageId = $out->getTitle()->getArticleID();
// EntityLookup expects an EntityId object, not a plain string
$entityId = new MediaInfoId( 'M' . $pageId );
$entityLookup = WikibaseRepo::getDefaultInstance()->getEntityLookup(); // service locator
$entity = $entityLookup->getEntity( $entityId );

// Step 2: serialize the entity ($entity is null for a virtual item)
$serializer = WikibaseRepo::getDefaultInstance()->getAllTypesEntitySerializer(); // service locator
$serializedEntity = ( $entity ? $serializer->serialize( $entity ) : [] );

// Step 3: write to a js config var
$out->addJsConfigVars( [ 'wbEntity' => $serializedEntity ] );
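On the client side the serialized item can then be read back with mw.config.get(). A minimal sketch — the mw object here is a stand-in stub so the snippet runs outside MediaWiki, and the wbEntity shape is illustrative:

```javascript
// Stand-in for MediaWiki's global `mw.config`, so this sketch is
// self-contained; in the browser you would use the real `mw.config`.
var mw = {
	config: {
		values: { wbEntity: { id: 'M123', statements: {} } },
		get: function ( key ) { return this.values[ key ]; }
	}
};

// Client-side code on the File page recovers the serialized item.
// A statement-less object is what a virtual item (no db row) looks like.
var entity = mw.config.get( 'wbEntity' );
var hasStatements = Object.keys( entity.statements || {} ).length > 0;
```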

The factories that the service locator ultimately uses are defined in WikibaseMediaInfo.entitytypes.php - for example the serializer used above is defined like this:

return [
	MediaInfo::ENTITY_TYPE => [
		'serializer-factory-callback' => function( SerializerFactory $serializerFactory ) {
			return new MediaInfoSerializer(
				$serializerFactory->newTermListSerializer(),
				$serializerFactory->newStatementListSerializer()
			);
		},
		// ... other factory callbacks (deserializer, view, etc.)
	],
];
Wikibase code munges all the serializer factories together into a dispatching factory, and then when you call getAllTypesEntitySerializer() from the service locator with a MediaInfo entity id it uses the callback defined above to return a MediaInfoSerializer.
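The dispatching pattern itself is simpler than the layers of wrapping suggest. Here is a language-agnostic sketch of it in plain JS (the names are illustrative, not Wikibase's actual classes): each entity type registers a factory callback, and the dispatcher picks one based on the entity type it is asked for.

```javascript
// Illustrative sketch of a dispatching factory: delegates object creation
// to per-entity-type callbacks, like Wikibase does for serializers.
function DispatchingSerializerFactory( callbacksByType ) {
	this.callbacksByType = callbacksByType;
}

DispatchingSerializerFactory.prototype.newSerializer = function ( entityType ) {
	var callback = this.callbacksByType[ entityType ];
	if ( !callback ) {
		throw new Error( 'No serializer registered for entity type: ' + entityType );
	}
	return callback();
};

// Register a callback for the 'mediainfo' type, mirroring the
// serializer-factory-callback entry in WikibaseMediaInfo.entitytypes.php
var factory = new DispatchingSerializerFactory( {
	mediainfo: function () {
		return { serialize: function ( entity ) { return { id: entity.id }; } };
	}
} );

var serializer = factory.newSerializer( 'mediainfo' );
```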


Federation[edit]

To set up federation you need a local Wikidata and a local Commons running on separate URLs.

Vagrant does this automatically at http://dev.wiki.local.wmftest.net:8080 and http://wikidata.wiki.local.wmftest.net:8080.

Once you have both set up, it's a good idea to populate your local Wikidata with at least some of the data from live Wikidata. To do this you can install https://github.com/filbertkm/WikibaseImport and then:

  • import all properties: mwscript extensions/WikibaseImport/importEntities.php --wiki wikidatawiki --all-properties
  • import some items: mwscript extensions/WikibaseImport/importEntities.php --wiki wikidatawiki --range Q1:Q999

Finally, set up your local Wikidata as a remote repository for your local Commons. Here is how I've done it in LocalSettings.php:

if ( $wgDBname === 'wiki' ) {
	wfLoadExtension( 'WikibaseMediaInfo' );
	$wgEnableWikibaseRepo = true;
	$wgEnableWikibaseClient = true;

	unset( $wgWBRepoSettings['entityNamespaces']['item'] );
	unset( $wgWBRepoSettings['entityNamespaces']['property'] );

	$wgWBRepoSettings['useEntitySourceBasedFederation'] = true;
	$wgWBRepoSettings['sharedCacheKeyPrefix'] = 'local-dev';
	$wgWBRepoSettings['localClientDatabases'] = [];
	$wgWBRepoSettings['entityNamespaces'] = [ 'mediainfo' => NS_FILE ];
	$wgWBRepoSettings['foreignRepositories']['d'] = [
		'repoDatabase' => 'wikidatawiki',
		'baseUri' => 'http://wikidata.wiki.local.wmftest.net:8080',
		'supportedEntityTypes' => [ 'item', 'property' ],
		'prefixMapping' => [],
		'entityNamespaces' => [ 'item' => 0, 'property' => WB_NS_PROPERTY ],
	];

	$wgWBClientSettings['useEntitySourceBasedFederation'] = true;
	$wgWBClientSettings['sharedCacheKeyPrefix'] = 'local-dev';
	$wgWBClientSettings['repositories'] += [
		'wiki' => [
			'repoDatabase' => 'wiki',
			'entityNamespaces' => [ 'mediainfo' => '6/mediainfo' ],
			'baseUri' => 'http://dev.wiki.local.wmftest.net:8080/wiki/Special:EntityData/',
			'prefixMapping' => [],
		],
	];

	$wgMediaInfoExternalEntitySearchBaseUri = 'http://wikidata.wiki.local.wmftest.net:8080/w/api.php';
}

Testing Strategy[edit]

WikibaseMediaInfo is a complicated extension, with complicated dependencies (i.e. Wikibase). Automated testing can play an important role in helping to manage this complexity.

To do this, we are using three different types of tests, which can be likened to levels in a "testing pyramid"[1]. The three levels are: JS and PHP unit tests (the "base" of the pyramid), PHPUnit API/integration tests (the middle layer), and end-to-end tests in Selenium (the top of the pyramid).

Javascript unit tests (headless Node/QUnit)[edit]

WikibaseMediaInfo introduces lots of new JS code, much of which is concerned with new UI elements that enable users to view and edit structured data in various places (File pages, UploadWizard, Search, etc.). Wherever possible, we want to test these new JS components in isolation, using a headless Node.js testing framework instead of the traditional Special:JavaScriptTest approach. There is a good discussion of the advantages of and reasoning behind this approach in this RFC on Phabricator.


Node.js v10 is required to run these tests. QUnit is used as the testing framework. The JSDOM and Sinon libraries are also used extensively.

Writing Tests[edit]

For JS code in a MediaWiki extension to be testable this way, we need to be able to load it in an isolated context using Node's require statement. This means that the relevant part of the codebase needs to be rewritten using ResourceLoader's new PackageFiles feature. The individual JS files in such a module must then define a module.exports property (and no longer need to be wrapped in self-executing functions). In addition to making code more testable, refactoring in this way lets us write JavaScript that is more in line with current practices in the wider JS community. This refactoring is currently in progress (some modules in our extension.json use PackageFiles, while others still define an array of scripts).
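For instance, a PackageFiles-style module simply assigns its constructor to module.exports instead of attaching it to a global inside an IIFE; the simplified widget shape below is illustrative, not the real LicenseDialogWidget:

```javascript
// Hypothetical PackageFiles-style module: no IIFE wrapper needed, the file
// just exports its constructor so require() (and ResourceLoader) can load it.
'use strict';

function LicenseDialogWidget( config ) {
	config = config || {};
	this.accepted = false;
}

// Track whether the user has accepted the license terms.
LicenseDialogWidget.prototype.accept = function () {
	this.accepted = true;
};

module.exports = LicenseDialogWidget;
```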

Tests live in: tests/node-qunit and are organized into subfolders. Here is an example with a few simple tests for the LicenseDialogWidget, a basic UI component.

Having good coverage at the JS component level will help to catch regressions and make it easier to refactor code. Things to test for at this level include basic interactions (toggling a component in or out of edit state, for example), ensuring that appropriate API requests are sent when an action is taken, etc.
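As a sketch of the "appropriate API requests are sent" case: with a Sinon-style spy in place of the real API client, a test can assert on the outgoing call. The widget, API action and spy below are simplified stand-ins for the real components and for Sinon itself:

```javascript
// Minimal Sinon-style spy so the sketch runs with plain Node; the real
// tests use the actual Sinon package.
function makeSpy() {
	function spy() {
		spy.callCount++;
		spy.lastArgs = Array.prototype.slice.call( arguments );
	}
	spy.callCount = 0;
	spy.lastArgs = null;
	return spy;
}

// Hypothetical widget that posts to the API when a caption is saved
// (the action name and shape are illustrative).
function CaptionsPanel( api ) {
	this.api = api;
}
CaptionsPanel.prototype.save = function ( caption ) {
	this.api.postWithToken( 'csrf', { action: 'wbsetlabel', value: caption } );
};

// Inject the spy in place of the real API client and exercise the widget.
var api = { postWithToken: makeSpy() };
new CaptionsPanel( api ).save( 'A cat crossing the street' );
```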

Running Tests[edit]

To run Node QUnit tests, open a terminal and run npm run test:unit. They are also included in the larger npm test script (which means they will run in CI).

PHP tests[edit]

PHPUnit tests are located in tests/phpunit. They must be run using MediaWiki core's phpunit.php, like this: sudo -u www-data php phpunit.php --wiki wiki (in the Vagrant dev environment phpunit.php is in /vagrant/mediawiki/tests/phpunit/).

Normal unit tests are in tests/phpunit/mediawiki/. Integration tests are in tests/phpunit/integration/.

End-to-end tests (Selenium)[edit]

End-to-end tests represent the highest level of the "testing pyramid". Tests at this level should focus on the "happy path" for a user. They can also be used to ensure that basic functionality (like logging in and editing a page) is never hampered by a regression.

Currently it is not feasible to run extension-specific Selenium tests for WikibaseMediaInfo in the regular CI process. Instead, tests can be run against Beta Commons on a regular schedule. These tests need to live in their own location ("specs_betacommons" instead of "specs") so that they are not picked up by the Selenium script run by Core (which does happen in the CI pipeline).

There is currently an in-progress patch that adds Selenium tests to this extension here. This document will be updated with more information about how to write and run these tests once that patch is merged.