Manual:MediaWiki architecture

This page contains the actual text of the MediaWiki architecture document project. This content will be merged into the appropriate mediawiki.org pages once the specific formatting required for inclusion in the AOSA book is not needed any more.

Introduction
From the start, MediaWiki was developed to be Wikipedia's software. Over the years, efforts have been made to facilitate reuse by third party users, but Wikipedia's influence and bias have shaped MediaWiki's architecture during its ten years of existence.

Wikipedia is one of the top ten websites in the world, currently getting about 400 million unique visitors a month. It gets over 100,000 hits per second. Wikipedia isn't commercially supported by ads; it is entirely supported by a non-profit organization, the Wikimedia Foundation, which relies on donations as its primary funding model. This means that not only must MediaWiki run a top-ten website, but it must do so on a shoestring budget. To meet these demands, MediaWiki has a heavy bias towards performance, caching and optimization. Expensive features that can't be enabled on Wikipedia are either reverted or disabled through a configuration variable; there is an endless balance between performance and features.

The influence of Wikipedia on MediaWiki's architecture isn't limited to performance. Contrary to generic CMSes, MediaWiki was originally written for a very specific purpose: supporting a community for the creation and curation of freely reusable knowledge on an open platform. This means, for example, that MediaWiki doesn't include regular features found in corporate CMSes (like a publication workflow or ACLs), but on the other hand offers a variety of tools to handle spam and vandalism.

There's been a constant interplay between the development of MediaWiki, and the needs and actions of a constantly evolving community of Wikipedia participants. The architecture of MediaWiki has been driven many times by initiatives started or requested by the community, like the creation of Wikimedia Commons, or the Flagged Revisions feature. Major architectural changes, like MediaWiki 1.12's preprocessor, were implemented because the way MediaWiki was used by Wikipedians made it necessary.

MediaWiki has also gained a solid external user base by being open-source software from the beginning. Third-party reusers know that as long as such a high-profile website as Wikipedia uses MediaWiki, the software will be maintained and improved. MediaWiki used to be really focused on Wikimedia sites, but in recent years, efforts have been made to make it more generic and better accommodate the needs of these third-party users. For example, MediaWiki now ships with an excellent web-based installer, making the installation process much less painful than when everything had to be done via the command line, and the software contained hardcoded paths for Wikipedia.

Still, MediaWiki is and remains Wikipedia's software, and this shows throughout its history and architecture.

Phase I: UseModWiki
Wikipedia was launched in January 2001. At the time, it was mostly an experiment, to try and boost the production of content for Nupedia, a free-content, but peer-reviewed, encyclopedia created by Jimmy Wales. Because it was an experiment, Wikipedia was originally powered by UseModWiki, an existing GPL wiki engine written in Perl, using CamelCase and storing all pages in individual text files.

It soon appeared that CamelCase wasn't really appropriate for naming encyclopedia articles. In late January 2001, UseModWiki developer and Wikipedia participant Clifford Adams added a new feature to UseModWiki: free links, i.e. the ability to link to pages with a special syntax (double square brackets), instead of automatic CamelCase linking. A few weeks later, Wikipedia upgraded to the new version of UseModWiki and enabled free links.

While this initial phase isn't about MediaWiki per se, it provides some context and shows that, even before MediaWiki was created, Wikipedia started to shape the features of the software that powered it. UseModWiki also influenced some of MediaWiki's features, for example its markup language.

Phase II: the PHP script
In 2001, Wikipedia was not yet a top 10 website; it was an obscure project sitting in a dark corner of the Interwebs, unknown to most search engines, and hosted on a single server. Still, performance was already an issue, notably because UseModWiki stored its content in a flat file database. At the time, Wikipedians were worried about being "inundated with traffic" following articles in the New York Times, Slashdot or Wired.

In Summer 2001, Wikipedia participant Magnus Manske (then a university student) started to work on a dedicated wiki engine for Wikipedia in his free time. The goal was to improve Wikipedia's performance using database-driven software, but also to be able to develop Wikipedia-specific features that couldn't be provided by a "generic" wiki engine. Written in PHP and MySQL-backed, the new engine was simply called the "PHP script", "PHP wiki", "Wikipedia software" or "phase II".

The "PHP script" was made available in August 2001, shared on SourceForge in September, and tested until late 2001. As Wikipedia suffered from recurring performance issues because of increasing traffic, the English-language Wikipedia eventually switched from UseModWiki to the PHP script in January 2002. Other language versions created in 2001 were slowly upgraded as well, although some of them would stay powered by UseModWiki until 2004. An automated program called "User:Conversion script" converted the last version of each existing article to the phase II format; the articles' pre-January 2002 revision history was partly restored by developer Brion Vibber in September 2002.

As PHP software using a MySQL database, the PHP script was the first iteration of what would later become MediaWiki. It also introduced many critical features still in use today, like namespaces to organize content (including talk pages), skins, and special pages (including maintenance reports, contributions lists and user watchlists).

Phase III: MediaWiki
Despite the improvements from the PHP script and database back-end, the combination of increasing traffic, expensive features and limited hardware continued to cause performance issues. In 2002, developer Lee Daniel Crocker rewrote the code again, calling the new software "Phase III". Because the site was experiencing frequent difficulties, Lee thought there "wasn't much time to sit down and properly architect and develop a solution", so he "just reorganized the existing architecture for better performance and hacked all the code".

The Phase III software kept the same basic interface, and was designed to look and behave as closely to the Phase II software as possible. A few new features were also added, like a new file upload system, side-by-side diffs of content changes, and interwiki links.

It was deployed on the English Wikipedia in July 2002, along with a hardware move to a new (but still single) server. Other features were added over 2002, like new maintenance special pages, or the "edit on double click" option. Performance issues quickly reappeared, though: for example, view count and site stats, which caused two database writes on every page view, were disabled in November 2002, then re-enabled. The site would occasionally be switched to read-only mode to maintain the service for readers, and expensive maintenance pages would be disabled during high-access times because of table-locking problems.

In early 2003, developers discussed whether they should properly re-engineer and re-architect the software from scratch, before the fire-fighting became unmanageable, or whether they should rather continue to tweak and improve the existing code base. The latter solution was chosen, chiefly because most developers were sufficiently happy with the code base, and confident enough that further iterative improvements would be enough to keep up with the growth of the site.

In June 2003, a second server was added, serving as a database server separate from the web server. The database server also served as the web server for non-English Wikipedia sites. Load-balancing between the two servers would be set up later that year. A new page caching system was also enabled, that used the file system to cache rendered, ready-to-output pages for anonymous users.

June 2003 is also when Jimmy Wales created the Wikimedia Foundation, a nonprofit to support Wikipedia, and manage its infrastructure and day-to-day operations. The "Wikipedia software" was officially named "MediaWiki" in July, as a word play on the Wikimedia Foundation's name. What was thought at the time to be a clever pun would confuse generations of users and developers.

New features were added in July, like the automatically-generated table of contents, and the ability to edit page sections, both still in use today. The first release under the name "MediaWiki" happened in August 2003, concluding the long genesis of a software whose overall structure would remain fairly stable from there on.

PHP
PHP was chosen as the framework for Wikipedia's "Phase II" software in 2001; MediaWiki has grown organically since then, and is still evolving. Most MediaWiki developers are volunteers contributing in their free time, and they were very few in the early years. Some software design decisions or omissions may seem wrong in retrospect, but it's hard to criticize the founders for not implementing some abstraction which is now found to be critical, when the initial code base was so small, and the time taken to develop it so short.

For example, MediaWiki uses unprefixed class names, which can cause conflicts when PHP core and PECL developers add new classes: MediaWiki's Namespace class had to be renamed to MWNamespace to ensure compatibility with PHP 5.3. Consistently using a prefix for all classes (e.g. "MW") would have made it easier to embed MediaWiki inside another application or library.

Relying on PHP was probably not the best choice for performance, as it has not benefited from improvements that some other dynamic languages have seen in recent years. Using Java would have been much better for performance, and scaling up the execution of back-end maintenance tasks would have been simpler. On the other hand, PHP is very popular, which facilitates the recruitment of new developers.

Even though MediaWiki still contains "ugly" legacy code, major improvements have been made over the years, and new architectural elements have been introduced to MediaWiki throughout its history; they include the Parser, ObjectCache and Database classes, the Image class and the FileRepo class hierarchy, ResourceLoader, and the Action hierarchy. MediaWiki started without any of these things, despite the fact that all of them support features that have been around since the beginning. Many developers are driven primarily by feature development, and architecture is often left behind, only to catch up later, as the cost of working within an inadequate architecture becomes apparent.

Security
Because MediaWiki is the platform for high-profile sites such as Wikipedia, security is paramount. A strong security-minded development culture has been established among MediaWiki developers; to make it easier to write secure code, they are provided with wrappers around HTML output and database queries to handle escaping.

User input sanitization is done with the WebRequest class, which analyzes data passed in the URL or via a POSTed form. It removes "magic quotes" slashes, strips illegal input characters and normalizes Unicode sequences. Cross-site request forgery (CSRF) is avoided by using tokens, and cross-site scripting (XSS) by validating inputs and escaping outputs, usually with PHP's htmlspecialchars() function. MediaWiki also provides (and uses) an XHTML sanitizer with the Sanitizer class, and database functions that prevent SQL injection.
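The two wrapper ideas above (escape on output, parameterize queries) can be sketched in a few lines. This is an illustrative Python analogue of the pattern, not MediaWiki's actual API:

```python
import html
import sqlite3

def escape_output(user_text):
    # Escape on output: <, >, &, and quotes become entities, so user
    # text can't break out of the surrounding HTML (the XSS defense).
    return html.escape(user_text, quote=True)

def find_user(conn, name):
    # Parameterized query: the driver escapes `name` itself,
    # preventing SQL injection (the database-wrapper defense).
    cur = conn.execute("SELECT id, name FROM user WHERE name = ?", (name,))
    return cur.fetchone()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE user (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("INSERT INTO user (name) VALUES (?)", ("alice",))
```

A classic injection string like `alice' OR '1'='1` simply matches no row, because it is treated as data, not SQL.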

Configuration
MediaWiki offers hundreds of configuration settings, stored in global PHP variables. Their default value is set in DefaultSettings.php, and the system administrator can override them by editing LocalSettings.php.

Global configuration variables offered better performance than other configuration methods in older versions of PHP, but have left MediaWiki with other problems. Globals cause serious security implications with PHP's register_globals setting (not needed any more by MediaWiki since version 1.2). This system also limits potential abstractions for configuration, and makes optimization of the start-up process more difficult. Moreover, the configuration namespace is shared with variables used for registration and object context, leading to potential conflicts. From a user perspective, global configuration variables have also made MediaWiki seem difficult to configure and maintain, and hurt third-party reuse of MediaWiki's code (since most other projects cannot share the same MediaWiki global variable names).
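The two-layer model (defaults shipped with the software, overridden by the site administrator) can be sketched as follows. The setting names are hypothetical stand-ins for MediaWiki's $wg... globals:

```python
# Hypothetical setting names; MediaWiki actually uses PHP globals such as
# $wgSitename, defaulted in DefaultSettings.php and overridden in
# LocalSettings.php. This sketches the same two-layer idea.
DEFAULT_SETTINGS = {
    "Sitename": "MediaWiki",
    "EnableUploads": False,
    "LanguageCode": "en",
}

def load_config(local_settings):
    # Defaults first, then the site administrator's overrides on top.
    config = dict(DEFAULT_SETTINGS)
    config.update(local_settings)
    return config

config = load_config({"Sitename": "My Wiki", "EnableUploads": True})
```

Settings the administrator never touches (here, "LanguageCode") keep their shipped default.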

Database
MediaWiki has been using a relational database back-end since the Phase II software. The default (and best supported) DBMS for MediaWiki is MySQL, which is the one used by all Wikimedia sites, but other DBMSes (such as PostgreSQL, Oracle, and SQLite) have community-supported implementations. The DBMS can be chosen by the system administrator during installation, and MediaWiki provides both a database abstraction layer and a query abstraction layer, which simplify database access for developers.
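A query abstraction layer of this kind might look like the following sketch: SQL is built from a table name, a field list and a conditions map, with all values parameterized. The function name and signature are illustrative, loosely modeled on the idea rather than MediaWiki's actual wrappers:

```python
import sqlite3

def select(conn, table, fields, conds):
    # Build "SELECT ... FROM ... WHERE k = ? AND ..." so callers never
    # concatenate values into SQL; the driver handles escaping.
    sql = f"SELECT {', '.join(fields)} FROM {table}"
    if conds:
        sql += " WHERE " + " AND ".join(f"{k} = ?" for k in conds)
    return conn.execute(sql, tuple(conds.values())).fetchall()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE page (page_id INTEGER, page_title TEXT, page_namespace INTEGER)")
conn.execute("INSERT INTO page VALUES (1, 'Apple', 0), (2, 'Apple', 1)")
```

Because callers express queries as structured data rather than raw SQL, a layer like this can also rewrite queries per back-end (MySQL vs. PostgreSQL vs. SQLite).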

The database went through dozens of schema changes over the years. The current layout contains dozens of tables, many of which are related to the wiki's content (e.g. page, revision, and text). Other tables include data about users, media files, caching and internal tools (module_deps for ResourceLoader, job for the job queue), among others.

Because of the size of Wikimedia sites, their databases, and their readership, database handling in MediaWiki is extremely optimized for performance. As of 1.4.5 (2005), MediaWiki can store page text externally on a separate database server, and old page revisions can be compressed to reduce the size of the database. This change significantly boosted the performance of operations like renaming and deleting pages with very long edit histories. Indices and summary tables are used extensively in MediaWiki, as SQL queries that scan huge numbers of rows can be very expensive, particularly in the context of Wikimedia; unindexed queries are generally discouraged in MediaWiki.

MediaWiki has built-in support for load balancing, added as early as 2004 in MediaWiki 1.2 (when Wikipedia got its second server, a big deal at the time). Load balancing is now a critical part of Wikimedia's infrastructure, which explains its influence on some algorithm decisions in the code. Also visible in the code is the influence of Wikimedia's multi-database environment, where writes are made to a unique master database, and then replicated to read-only slaves. Native handling of replication lag is one of those MediaWiki features that few software developers ever have to worry about.
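Replication-lag handling can be illustrated with a toy read-server picker: prefer the least-lagged replica, and fall back to the master when every replica is too far behind. The threshold and host names are assumptions for the sketch, not Wikimedia's actual policy:

```python
MAX_LAG_SECONDS = 5  # assumed staleness threshold

def choose_read_server(master, replica_lag):
    # replica_lag maps replica hostname -> seconds behind the master.
    # Prefer the least-lagged replica under the threshold; if every
    # replica is too stale, read from the master instead.
    fresh = {host: lag for host, lag in replica_lag.items() if lag < MAX_LAG_SECONDS}
    if not fresh:
        return master
    return min(fresh, key=fresh.get)
```

A real balancer also weights servers by capacity and pins a user's session to a consistent view, but the lag check is the part most applications never need.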


 * MySQL has lagged in a few specific areas that have hurt performance:
 * lack of full native UTF-8 support (finally present in the latest versions, but it requires jumping through some hoops, and there are years of legacy databases);
 * no or limited online ALTER TABLE, which makes even simple schema changes painful to deploy, slowing some development;
 * a data dump format that is very hard to parallelize well; even with few changes to the database, building a dump takes forever because of the compression.

Execution workflow of a web request
index.php is the main entry point for MediaWiki, and handles most requests processed by the application servers (i.e. requests that were not served by the caching infrastructure; see below). The code executed from index.php performs security checks, loads configuration settings from LocalSettings.php, instantiates a MediaWiki object, and creates a Title object depending on the title and action parameters from the request.

index.php can take a variety of action parameters in the URL request; the default action is view, which shows the regular view of an article's content. For example, the request en.wikipedia.org/w/index.php?title=Apple&action=view displays the content of the article "Apple" on the English Wikipedia. Other frequent actions include edit (to open an article for editing), submit (to preview or save an article), history (to show an article's history) and watch (to add an article to the user's watchlist). Administrative actions include delete (to delete an article) and protect (to prevent edits to an article).

MediaWiki::performRequest() is then called to handle most of the URL request. It checks for bad titles, read restrictions, local interwiki redirects, redirect loops, and determines whether the request is for a normal or a special page.

Normal page requests are handed over to MediaWiki::initializeArticle(), to create an Article object for the page, and then to MediaWiki::performAction(), which handles "standard" actions. Once the action has been completed, MediaWiki::finalCleanup() finalizes the request by committing DB transactions, outputting the HTML and launching deferred updates through the job queue. MediaWiki::restInPeace() commits the deferred updates and closes the task gracefully.

If the page requested is a Special page (i.e. not a regular wiki page, but a special page of the software), SpecialPageFactory::executePath is called instead of initializeArticle(); the corresponding PHP script is then called. Special pages can do all sorts of magical things, and all have a very specific purpose, usually independent of any specific article or its content. Special pages include various kinds of reports (recent changes, logs, uncategorized pages) and wiki administration tools (user blocks, user rights changes), among others. Their execution workflow depends on their function.
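The dispatch logic described above (special page vs. normal page, then per-action handling) can be caricatured in a few lines; the handler names and return strings are invented for illustration:

```python
def handle_request(title, action="view"):
    # Special pages bypass the normal article code path entirely.
    if title.startswith("Special:"):
        return f"special page handler: {title[len('Special:'):]}"
    # "Standard" actions on a normal wiki page, keyed by the
    # action parameter from the URL; view is the default.
    handlers = {
        "view": lambda t: f"render {t}",
        "edit": lambda t: f"edit form for {t}",
        "history": lambda t: f"history of {t}",
    }
    if action not in handlers:
        return "unknown action"
    return handlers[action](title)
```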

For debugging, many functions contain profiling code, which makes it possible to follow the execution workflow, if profiling is enabled.

Caching
MediaWiki itself is optimized for performance because it plays a central role on Wikimedia sites, but it is also part of a larger operational ecosystem that has influenced its architecture. Wikimedia's caching infrastructure has imposed limitations on MediaWiki; developers worked around the issues, not by trying to shape Wikimedia's extensively optimized caching infrastructure around MediaWiki, but rather by making MediaWiki more flexible, so it could work within that infrastructure without compromising on performance and caching needs.

On Wikimedia sites, most requests are handled by reverse caching proxies (Squids), and never even make it to the MediaWiki application servers. Squids contain static versions of entire rendered pages, served for simple reads to users who aren't logged in to the site. MediaWiki natively supports Squid and Varnish, and integrates with this caching layer by, for example, notifying them to purge a page from the cache when it has been changed.

For logged-in users, and other requests that can't be served by Squids, Squid forwards the requests to the web server (Apache), adding the "X-Forwarded-For" header containing the remote address. Preserving the client's address is important, since MediaWiki uses IPs for a variety of purposes, e.g. to block users and to credit edits made without a user account.
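Recovering the client address behind trusted proxies might look like this sketch: walk the X-Forwarded-For chain from the right, skipping addresses known to be our own proxies. The trusted-proxy list is an assumption for the example:

```python
TRUSTED_PROXIES = {"10.0.0.1", "10.0.0.2"}  # assumed addresses of our Squids

def client_ip(remote_addr, xff_header):
    # If the direct peer isn't one of our proxies, trust the socket address
    # (a client could forge X-Forwarded-For otherwise).
    if remote_addr not in TRUSTED_PROXIES or not xff_header:
        return remote_addr
    # X-Forwarded-For is "client, proxy1, proxy2": walk from the right,
    # skipping our own proxies, and keep the first external address.
    hops = [h.strip() for h in xff_header.split(",")]
    for ip in reversed(hops):
        if ip not in TRUSTED_PROXIES:
            return ip
    return hops[0]
```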

The second level of caching happens when MediaWiki renders and assembles the page from multiple objects, many of which can be cached to minimize future calls. Such objects include the page's interface (sidebar, menus, UI text) and the content proper, parsed from wikitext. The in-memory object cache has been available in MediaWiki since the early 1.1 version (2003), and is particularly important to avoid re-parsing long and complex pages. The default object/parser cache uses memcached.

Login session data can also be stored in memcached, which lets sessions work transparently on multiple front-end web servers in a load-balancing setup (Wikimedia heavily relies on load balancing, using LVS with PyBal).

Since version 1.16, MediaWiki uses a dedicated object cache for localized UI text; this was added after noticing that a large part of the objects cached in memcached consisted of UI messages localized into the user's language. The system is based on fast fetches of individual messages, minimizing memory overhead and start-up time in the typical case.

The last caching layer consists of the PHP opcode cache, commonly enabled to speed up PHP applications. Compilation can be a lengthy process; to avoid compiling PHP scripts into opcode every time they're invoked, a PHP accelerator can be used to store the compiled opcode and execute it directly without compilation. MediaWiki will "just work" with many accelerators such as APC, PHP accelerator and eAccelerator.

Because of its Wikimedia bias, MediaWiki is optimized for this complete, multi-layer, distributed caching infrastructure. Nonetheless, it also natively supports alternate setups for smaller sites. For example, it offers an optional simplistic file caching system that stores the output of fully rendered pages, like Squid does. Also, MediaWiki's abstract object caching layer lets it store the cached objects in a number of different places, including the file system, the database, or the opcode cache.

ResourceLoader
As in many web applications, MediaWiki's interface has become more interactive and responsive over the years, mostly through the use of JavaScript. Usability efforts initiated in 2008, as well as advanced media handling (e.g. online editing of video files), called for dedicated front-end performance improvements.

ResourceLoader was developed to optimize the delivery of JavaScript and CSS assets. Started in 2009, it was completed in 2011 and has been a core feature of MediaWiki since version 1.17. ResourceLoader works by loading JS and CSS assets on demand, thus reducing loading and parsing time for features that are unused, for example by older browsers. It also minifies the code, groups resources to save requests, and can embed images as data URIs.
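ResourceLoader's two core ideas (serve only the requested modules, and combine them into one minified response) can be sketched as follows; the module registry and the whitespace-only "minifier" are deliberately simplistic stand-ins:

```python
# Illustrative module registry; module names echo real ones, the
# JavaScript bodies are invented.
MODULES = {
    "jquery": "function $( sel ) {\n    return document.querySelector( sel );\n}",
    "mediawiki.util": "function addPortletLink( id ) {\n    /* add a link to a menu */\n}",
}

def minify(js):
    # Crude whitespace-only "minification", purely for illustration.
    return " ".join(js.split())

def load(modules):
    # Serve only the requested modules, concatenated into one response,
    # so the browser makes a single HTTP request instead of one per file.
    return "\n".join(minify(MODULES[m]) for m in modules if m in MODULES)
```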

ResourceLoader is a particularly interesting case, as it's one of the few core components of MediaWiki that benefited from proper architecting prior to development. The main reason is that its developers not only wanted to "do it right", but also had the opportunity to do so, as the Wikimedia Foundation grew its engineering staff and more developers were available.

Context and rationale
Imagine a world in which every single human being can freely share in the sum of all knowledge. This is Wikimedia's vision. A central part of effectively contributing and disseminating free knowledge to all is to provide it in as many languages as possible. Wikipedia is available in more than 280 languages, and encyclopedia articles in English represent less than 20% of all articles.

Because Wikipedia and its sister sites exist in so many languages, developers and translators are committed to fully localizing MediaWiki. And because Wikimedia sites are based on collaboration, it is important not only to provide the content in the readers' native language, but also to provide a localized interface, and effective input and conversion tools, so participants can contribute content. Some languages are really rare on the Internet; Wikimedia provides them with a platform to create educational content, through MediaWiki.

For this reason, localization and internationalization (l10n and i18n) are a central component of MediaWiki. The i18n system is pervasive, and impacts many parts of the software; it's also one of the most flexible and feature-rich i18n systems around. Translator convenience is usually preferred over developer convenience, but this is believed to be an acceptable cost.

MediaWiki is currently localized in more than 350 languages, including non-Latin and right-to-left (RTL) languages, with varying levels of completion. The interface and content can be in different languages, and have mixed directionality.

Content language
MediaWiki originally used per-language encoding, which led to a lot of issues; for example, it was impossible to write native names in a language requiring a different encoding, and pages' source was cluttered with HTML entities and numeric character references. UTF-8 was adopted instead. Latin-1 support was dropped in 2005, along with the major database schema change in MediaWiki 1.5; content must now be encoded in UTF-8.

Special characters can be customized and inserted via MediaWiki's Edittools, or its JavaScript version. The WikiEditor extension for MediaWiki, developed as part of a usability effort, merges special characters with the edit toolbar. Another extension, called "Narayam", provides additional input methods and key-mapping features for non-ASCII characters.

Recent and future improvements include better support for right-to-left text, bidirectional text (LTR and RTL text on the same page) and WebFonts.

Interface language
Localization of the user interface messages was implemented in many different ways in the early years of MediaWiki, especially in MediaWiki extensions. Efforts were made to standardize them; interface messages are now all stored in PHP arrays of key-value pairs. Each message is identified by a unique key, which is assigned different values across languages. This standard was kept partly for legacy reasons, and partly because other systems were deemed not flexible enough for MediaWiki; gettext, for example, doesn't support plural forms that depend on multiple variables.

MediaWiki messages can embed parameters provided by the software, which will often influence the grammar of the message. In order to support virtually any possible language, MediaWiki's localization system has been improved, and made more complex, over time to accommodate languages' specific traits and exceptions, often considered oddities by English speakers.
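For instance, a {{PLURAL:...}}-style construct embedded in a message can be expanded like this. The regex and the two-form rule are a simplification; real MediaWiki plural rules are per-language and can have more than two forms:

```python
import re

def expand_plural(message, n):
    # Expand a two-form {{PLURAL:$1|singular|plural}} construct,
    # then substitute the $1 parameter itself.
    def repl(m):
        return m.group(1) if n == 1 else m.group(2)
    text = re.sub(r"\{\{PLURAL:\$1\|([^|}]*)\|([^|}]*)\}\}", repl, message)
    return text.replace("$1", str(n))

msg = "Deleted $1 {{PLURAL:$1|page|pages}}"
```

Because the parameter's value decides the grammar, the choice has to happen at render time, inside the message, rather than in the calling code.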

For example, adjectives are invariable words in English, but languages like French require adjective agreement with nouns; if users have specified their gender in their preferences, the {{GENDER:}} construct lets messages address them correctly. Interface messages are stored in MessagesXx.php files, where Xx is the ISO 639 code of the language (e.g. MessagesFr.php for French); default messages are in English and stored in MessagesEn.php. MediaWiki extensions use a similar system, or host all localized messages in an [extension name].i18n.php file. Along with translations, message files also include language-dependent information such as date formats.

Contributing translations used to be done by submitting PHP patches for the MessagesXx.php files. In December 2003, MediaWiki 1.1 introduced "database messages", a subset of wiki pages in the MediaWiki namespace containing interface messages. The content of the wiki page MediaWiki:<key> is the message's text, and overrides its value in the PHP file. Localized versions of the message are at MediaWiki:<key>/<language code>, e.g. MediaWiki:Aboutsite/fr.

This feature has allowed power users to translate (and customize) interface messages locally on their wiki, but the process doesn't update i18n files shipping with MediaWiki. In 2006, MediaWiki developer Niklas Laxström created a special, heavily hacked MediaWiki website (now hosted at translatewiki.net) where translators can easily localize interface messages in all languages, simply by editing a wiki page. The MessagesXx.php files are then updated in the MediaWiki code repository, where they can be automatically fetched by any wiki, and updated using the LocalisationUpdate extension. On Wikimedia sites, database messages are now only used for customization, and not for localization any more. MediaWiki extensions and related programs, such as bots and Toolserver tools, are localized at translatewiki.net, too.

In order to help translators understand the context and meaning of an interface message, it is considered good practice in MediaWiki to provide documentation for every message. This documentation is stored in a special message file, with the qqq language code, which doesn't correspond to a real language. The documentation for each message is then displayed in the translation interface on translatewiki.net. Another helpful tool is the qqx language code: there is no associated MessagesQqx.php file, and it isn't possible to select this fake language in the user's preferences. But when used with the uselang parameter to display a wiki page (e.g. en.wikipedia.org/wiki/Special:RecentChanges?uselang=qqx), MediaWiki will display the message keys instead of their values in the user interface; this is very useful to identify which message to translate or change.

Registered users can set their own interface language in their preferences, in which case it overrides the site's default interface language. MediaWiki also supports fallback languages: if a message isn't available in the chosen language, it will be displayed in the closest possible language, and not necessarily in English. For example, the fallback language for Breton is French.
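A fallback lookup can be sketched as a walk up a fallback chain; the chain below mirrors the Breton-to-French example, while the message strings themselves are illustrative:

```python
# Each language points to the language it falls back to; English is the
# end of every chain.
FALLBACKS = {"br": "fr", "fr": "en"}

MESSAGES = {
    "en": {"search": "Search", "save": "Save"},
    "fr": {"search": "Rechercher", "save": "Enregistrer"},
    "br": {"save": "Enrollañ"},  # illustrative: only partially translated
}

def get_message(key, lang):
    # Walk the fallback chain until the message is found.
    while lang is not None:
        if key in MESSAGES.get(lang, {}):
            return MESSAGES[lang][key]
        lang = FALLBACKS.get(lang)
    return f"<{key}>"  # last resort: show the key itself
```

A Breton user thus sees the French "Rechercher" for the missing message, not the English "Search".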

Users
Users are represented in the code using instances of the User class, which encapsulates all of the user-specific settings (user id, name, rights, password, email address, etc.). Client classes use accessors to access these fields; they do all the work of determining whether the user is logged in, and whether the requested option can be satisfied from cookies or whether a database query is needed. Most of the settings needed for rendering normal pages are set in the cookie to minimize use of the database.

MediaWiki provides a very granular permissions system, with basically a user permission for every possible action. For example, to perform the "Rollback" action (i.e. to "quickly rollback the edits of the last user who edited a particular page"), a user needs the rollback permission, included by default in MediaWiki's sysop user group. But it can also be added to other user groups, or have a dedicated user group only providing this permission (this is the case on the English Wikipedia, with the rollbacker group).

Namespaces
MediaWiki's content is organized into namespaces, which separate pages by function: encyclopedia articles, their discussion ("talk") pages, user pages, and so on. Pages in a namespace are identified by a prefix in the page title (e.g. "User:" or "Talk:"); the main content namespace has no prefix. Wikipedia users quickly adopted them, and they provided the community with different spaces to evolve. Namespaces have proven to be an important feature of MediaWiki, as they create the necessary preconditions for a wiki's community and set up meta-level discussions, community processes, portals, user profiles, etc.

The default configuration for MediaWiki's main content namespace is to be flat (no subpages), because that's how Wikipedia works, but it is trivial to enable them. They are enabled in other namespaces (e.g. the "User:" namespace, where people can for instance work on draft articles) and display breadcrumbs.

Namespaces separate content by type; within the same namespace, pages can be organized by topic using categories, a pseudo-hierarchical organization scheme introduced in MediaWiki 1.3. There are countless examples of how Wikipedia and Wikimedia sites have influenced MediaWiki's features and architecture; namespaces and categories are two examples of rarer cases where, conversely, MediaWiki developers introduced unexpected features that have influenced how Wikipedia functions and how users work.

Content processing: MediaWiki markup language & Parser
The user-generated content stored by MediaWiki isn't in HTML, but in a markup language specific to MediaWiki, sometimes called "wikitext". It allows users to make formatting changes (e.g. bold, italic using quotes), add links (using square brackets), include templates, insert context-dependent content (like a date or signature), and make an incredible number of other magical things happen.

In order to display a page, this content needs to be parsed, assembled from all the external or dynamic pieces it calls, and converted to proper HTML. The parser is one of the most essential parts of MediaWiki, which also makes it difficult to change or improve. Because hundreds of millions of wiki pages worldwide depend on the parser to continue outputting HTML the way it always has, it has to remain extremely stable.

The markup language wasn't formally spec'd from the beginning; it started based on UseModWiki's markup, then morphed and evolved as needs have demanded. In the absence of a formal specification, the MediaWiki markup language has become a complex and idiosyncratic language, basically only compatible with MediaWiki's parser. The current parser's specification is jokingly referred to as "whatever the parser spits out from wikitext, plus a few hundred test cases".

There have been many attempts at alternative parsers, but none has succeeded so far. In 2004, an experimental tokenizer was written by developer Jens Frank to parse wikitext, and enabled on Wikipedia; it had to be disabled three days later, because of the poor performance of PHP array memory allocations. Since then, most of the parsing has been done with a huge pile of regular expressions, and a ton of helper functions. The wiki markup, and all the special cases the parser needs to support, have also become considerably more complex, making future attempts even more difficult.
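To give a feel for this regex-driven approach, here is a deliberately tiny Python sketch handling just three constructs. It is in no way MediaWiki's actual parser (which is PHP, multi-pass, and covers hundreds of special cases); it only illustrates why regular expressions were a natural, if fragile, fit for the markup:

```python
import re

def toy_wikitext_to_html(text):
    """Drastically simplified, illustrative wikitext converter.

    Handles only bold, italic and internal links; real MediaWiki
    parsing involves many passes and hundreds of edge cases.
    """
    # '''bold''' -> <b>bold</b> (must run before the italic rule)
    text = re.sub(r"'''(.+?)'''", r"<b>\1</b>", text)
    # ''italic'' -> <i>italic</i>
    text = re.sub(r"''(.+?)''", r"<i>\1</i>", text)
    # [[Page name]] -> internal link
    text = re.sub(r"\[\[([^|\]]+)\]\]", r'<a href="/wiki/\1">\1</a>', text)
    return text

print(toy_wikitext_to_html("'''Bold''' and ''italic'' and [[Sandbox]]"))
```

Even in this toy version, the ordering of the rules matters (bold before italic, since both use quotes), hinting at how interdependent the real parser's passes are.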

A notable improvement was Tim Starling's preprocessor rewrite in MediaWiki 1.12, whose main motivation was to improve the parsing performance on pages with complex templates. The preprocessor converts wikitext to an XML DOM tree representing parts of the document (template invocations, parser functions, tag hooks, section headings, and a few other structures), but can skip "dead branches" in template expansion, such as unfollowed #switch cases and unused defaults for template arguments. The parser then iterates through the DOM structure and converts its content to HTML.
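The key idea behind skipping dead branches can be sketched in a few lines of Python. The class name and structure below are invented for illustration and do not match MediaWiki's internal preprocessor classes; the point is only that branch bodies stay unexpanded until one is actually selected:

```python
class SwitchNode:
    """Toy analogue of a #switch construct in a preprocessor tree.

    Branch bodies are stored as thunks (callables) and expanded
    lazily, so branches that are never followed cost nothing.
    """
    def __init__(self, branches, default=""):
        self.branches = branches   # {case_value: callable returning text}
        self.default = default

    def expand(self, key):
        branch = self.branches.get(key)
        if branch is None:
            return self.default    # unused cases were never evaluated
        return branch()            # expand only the live branch

# Each branch hides a potentially expensive template expansion.
node = SwitchNode({
    "en": lambda: "Hello",
    "fr": lambda: "Bonjour",       # never expanded when key == "en"
})
print(node.expand("en"))
```

On a template-heavy Wikipedia page, where a single #switch may carry dozens of cases, expanding only the live branch is where the 1.12 preprocessor won its performance back.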

Recent work on a visual editor for MediaWiki has made it necessary to improve the parsing process (and make it faster), so work has resumed on the parser and intermediate layers between MediaWiki markup and final HTML (see Future, below).


Magic words and templates
MediaWiki offers "magic words" that modify the general behavior of the page or include dynamic content in it. They consist of: behavior switches like __NOTOC__ (to hide the automatic table of contents) or __NOINDEX__ (to tell search engines not to index the page); variables like {{CURRENTTIME}} or {{SITENAME}}; and parser functions, i.e. magic words that can take parameters, like {{lc:...}} (to output its argument in lowercase). Constructs like {{GENDER:}}, {{PLURAL:}} and {{GRAMMAR:}}, used to localize the UI, are parser functions.

The most common way to include content from other pages in a MediaWiki page is to use templates. Templates were really intended to be used to include the same content on different pages, e.g. navigation panels or maintenance banners on Wikipedia articles; having the ability to create partial page layouts and reuse them in thousands of articles with central maintenance made a huge impact on sites like Wikipedia.

However, templates have also been used (and abused) by users for a completely different purpose. MediaWiki 1.3 made it possible for templates to take parameters that change their output; the ability to add a default parameter (introduced in MediaWiki 1.6) enabled the construction of a functional programming language implemented on top of PHP, which was ultimately one of the most costly features in terms of performance.
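In wikitext, a template parameter is written {{{1}}} and a parameter with a default as {{{1|default}}}. The substitution step can be sketched in Python; this is a hedged toy, not MediaWiki's implementation (which, among other differences, leaves unsupplied parameters without defaults as literal text):

```python
import re

def expand_parameters(template_body, args):
    """Replace {{{name|default}}} and {{{name}}} placeholders.

    `args` maps parameter names (or positional numbers, as strings)
    to the values supplied at the template call site.
    """
    def repl(match):
        name, _, default = match.group(1).partition("|")
        return args.get(name, default)
    return re.sub(r"\{\{\{([^{}]*)\}\}\}", repl, template_body)

body = "Hello {{{1|world}}}!"
print(expand_parameters(body, {}))                   # default used
print(expand_parameters(body, {"1": "Wikipedia"}))   # argument supplied
```

It is exactly this combination of substitution plus defaults that let users nest templates into conditionals and build a de facto programming language on top of the parser.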

MediaWiki developer Tim Starling then developed additional parser functions (the ParserFunctions extension) as a stopgap measure against the insane constructs created by Wikipedia users with templates. This set of functions included logical structures like #if and #switch, and other functions like #expr (to evaluate mathematical expressions) and #time (for time formatting).

Soon enough, Wikipedia users started to create even more complex templates using the new functions, which considerably degraded the parsing performance on template-heavy pages. The new preprocessor introduced in MediaWiki 1.12 (a major architectural change) partly remedied this issue. More recently, MediaWiki developers have discussed the possibility of using an actual, lower-level scripting language to improve performance.

Media files
Uploading is done through the Special:Upload page; it is possible to configure the allowed file types through an extension whitelist. Once uploaded, files are stored in a folder on the file system, and their thumbnails in a dedicated thumb directory.

Because of Wikipedia's educational mission, MediaWiki supports file types that may be uncommon in other web applications or CMSes, like SVG vector images and multipage PDF and DjVu documents. These are rendered as PNG files, and can be thumbnailed and displayed inline like more common image formats (GIF, JPEG, PNG).

When a file is uploaded, it is assigned a description page containing information entered by the uploader; this can include copyright information (author, license) and items describing or classifying the content of the file (description, location, date, categories, etc.). While private wikis may not care much about this information, on media libraries like Wikimedia Commons it is critical to ensure the legality of sharing these files. It has been argued that most of this metadata should, in fact, be stored in a queryable structure like a database table. This would considerably facilitate search, as well as attribution and reuse by third parties, for example through the API.

Most Wikimedia sites allow "local" uploads to each wiki, but freely-licensed media files are centralized on Wikimedia's free media library, Wikimedia Commons; any Wikimedia site can transparently display a file hosted on Commons as if it were hosted locally. This avoids having to upload a file to every wiki in order to use it there.

As a consequence, MediaWiki natively supports foreign media repositories, i.e. the ability to access media files hosted on another wiki through its API and the ForeignAPIRepo system. Since version 1.16, any MediaWiki installation can easily use files from Wikimedia Commons through the InstantCommons feature. When using a foreign repository, thumbnails are stored locally in order to save bandwidth. However, it is not (yet) possible to upload to a foreign media repository from another wiki.

Levels
MediaWiki's architecture provides different ways to customize and extend the software, at different levels of access:
 * System administrators can install extensions and skins, and configure the wiki's separate helper programs (e.g. for image thumbnailing and TeX rendering) and global settings (see Configuration above).
 * Wiki sysops (sometimes called "administrators" too) can edit site-wide gadgets, JavaScript and CSS settings.
 * Any registered user can customize their own experience and interface using their preferences (for existing settings, skins and gadgets) or make their own modifications (using their personal JS and CSS pages).

External programs can also communicate with MediaWiki through its machine API, if it's enabled, basically making any feature and data accessible to the user.

JavaScript and CSS
MediaWiki can read and apply site-wide or skin-wide JavaScript and CSS using custom wiki pages; these pages are in the MediaWiki namespace, and thus can only be edited by sysops. For example, JavaScript modifications from MediaWiki:Common.js apply to all skins, CSS from MediaWiki:Common.css applies to all skins, but MediaWiki:Vector.css only applies to users with the Vector skin.

Users can make the same kind of changes, which will only apply to their own interface, by editing subpages of their user page (e.g. common.js for JavaScript on all skins, common.css for CSS on all skins, or vector.css for CSS modifications that only apply to the Vector skin).

If the Gadgets extension is installed, sysops can also edit gadgets, i.e. snippets of JavaScript code providing features that can be turned on and off by users in their preferences. Upcoming developments on gadgets will make it possible to share gadgets across wikis, thus avoiding duplication.

This set of tools has had a huge impact and greatly increased the democratization of MediaWiki's software development. Individual users are empowered to add features for themselves; power users can share them with others, both informally and through globally-configurable sysop-controlled systems. This framework is ideal for small, self-contained modifications, and presents a lower barrier of entry than heavier code modifications done through hooks and extensions.

Extensions and skins
When JavaScript and CSS modifications are not enough, MediaWiki provides a system of hooks that let third-party developers run custom PHP code before, after, or instead of MediaWiki code for particular events. MediaWiki extensions use hooks to plug into the code.

Before hooks existed in MediaWiki, adding custom PHP code meant modifying the core code, which was neither easy nor recommended. The first hooks were proposed and added in 2004 by Evan Prodromou; many more have been added over the years when needed. Using hooks, it is even possible to extend MediaWiki's wiki markup with additional capabilities, using tag extensions.
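The general shape of such a hook mechanism can be sketched in Python (MediaWiki's real implementation is PHP, where handlers are registered in $wgHooks and a handler returning false aborts further processing; the class below is a toy analogue, not MediaWiki code):

```python
class HookRegistry:
    """Minimal event/hook dispatcher, loosely modeled on MediaWiki's.

    Handlers run in registration order; a handler returning False
    stops the chain, letting an extension veto or replace behavior.
    """
    def __init__(self):
        self.hooks = {}

    def register(self, event, handler):
        self.hooks.setdefault(event, []).append(handler)

    def run(self, event, *args):
        for handler in self.hooks.get(event, []):
            if handler(*args) is False:
                return False       # a handler aborted the action
        return True                # all handlers agreed to proceed

registry = HookRegistry()
registry.register("ArticleSave", lambda title: print("saving", title))
registry.register("ArticleSave", lambda title: title != "Spam")
print(registry.run("ArticleSave", "Sandbox"))
```

The veto convention is what makes hooks powerful: core code fires the event at a well-defined point, and any extension can observe it or cancel the operation without core knowing the extension exists.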

The extension system isn't perfect: extension registration is based on code execution at startup, rather than on cacheable data, which limits abstraction and optimization and hurts MediaWiki's performance. But overall, the extension architecture is now a fairly flexible infrastructure that has helped make specialized code more modular, keeping the core software from expanding (too) much, and making it easier for third-party users to build custom functionality on top of MediaWiki.

Conversely, it's very difficult to write a new skin for MediaWiki without reinventing the wheel. In MediaWiki, skins are PHP classes, each extending the parent Skin class; they contain functions that gather the information needed to generate the HTML.

API
The other main entry point for MediaWiki, besides index.php, is api.php, used to access its machine-readable query API (Application Programming Interface). Originally a read-only interface located at query.php, it evolved into a full-fledged read and write API providing direct, high-level access to the data contained in the MediaWiki database.

Client programs can use the API to log in, get data, and post changes. The API supports thin web-based JavaScript clients and end-user applications. Almost anything that can be done through the web interface can also be done through the API. Client libraries implementing the MediaWiki API are available in many languages, including Python and .NET.
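For example, a read query against a wiki's api.php endpoint can be assembled with nothing but the Python standard library. The parameters below follow the documented action=query API; the actual network fetch is left as a comment so the sketch stays self-contained and offline:

```python
import json
from urllib.parse import urlencode

def build_query_url(endpoint, titles):
    """Build an api.php URL asking for basic info about some pages."""
    params = {
        "action": "query",
        "titles": "|".join(titles),   # the API separates multiple titles with |
        "prop": "info",
        "format": "json",
    }
    return endpoint + "?" + urlencode(params)

url = build_query_url("https://en.wikipedia.org/w/api.php", ["Main Page"])
print(url)
# Fetching the result is one call away, e.g.:
#   import urllib.request
#   data = json.load(urllib.request.urlopen(url))

# Parsing a truncated, illustrative response of the kind the API returns:
sample = '{"query": {"pages": {"1": {"pageid": 1, "title": "Main Page"}}}}'
pages = json.loads(sample)["query"]["pages"]
print([p["title"] for p in pages.values()])
```

Write operations work the same way (action=edit, action=login, etc.), with the addition of tokens obtained through the API itself to prevent cross-site request forgery.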

Future
What started as a summer project done by a single volunteer PHP developer has grown into MediaWiki, a mature, stable wiki engine powering a top-ten website with a ridiculously small operational infrastructure. This has been made possible by constant optimization for performance, iterative architectural changes and a team of awesome developers.

The evolution of web technologies, and the growth of Wikipedia, call for persistent improvements and new features, some of which require major changes to MediaWiki's architecture. This is for example the case for the ongoing visual editor project, which has prompted renewed work on the parser and conversion between the MediaWiki markup language, the DOM and the final HTML.

MediaWiki is a tool used for very different purposes. Within Wikimedia projects, it is used to create and curate an encyclopedia (Wikipedia), to power a huge media library (Wikimedia Commons) or to transcribe scanned reference texts (Wikisource). In other contexts, MediaWiki is used as a corporate CMS or as a data repository, sometimes combined with a semantic framework. These specialized uses, which weren't planned for, will probably continue to drive constant adjustments to the software's internal structure. As such, MediaWiki's architecture is very much alive, just like the immense community of users it supports.
