Manual:MediaWiki architecture

This page contains the actual text of the MediaWiki architecture document project. This content will be merged into the appropriate mediawiki.org pages once the specific formatting required for inclusion in the AOSA book is not needed any more.

We are tentatively using an "architecture through history" approach, where MediaWiki's history is told chronologically, and its different parts presented along a timeline, surrounded by the context (reasons, assumptions, constraints) in which they were introduced.

Introduction

 * context
 * Wikipedia and sister sites
 * optimization and performance because of requirements of high-profile sites like those operated by the WMF

Phase I

 * Wikipedia launched in January 2001; was using existing wiki software UseModWiki, written in Perl, stored all pages in individual text files
 * used CamelCase. See w:Wikipedia:UuU, w:Wikipedia:CamelCase and Wikipedia
 * February 19, 2001, Wikipedia enabled and recommended free links.

Phase II
PHP & MySQL
 * limitations of UseModWiki (which? performance? others?) => mid-2001: Magnus Manske started to work on a dedicated wiki engine for Wikipedia
 * deployed to English Wikipedia in January 2002, gradually deployed to other language versions http://meta.wikimedia.org/w/index.php?title=Wikipedia_software_upgrade_status&oldid=48204
 * "the PHP script" / "phase II"

Phase III

 * performance /load issues led to another rewrite by Lee Daniel Crocker: "Phase III". Same PHP/MySQL architecture, same basic interface
 * deployed to Wikipedia in July 2002
 * WMF announced in June 2003. The software was named "MediaWiki" in July, a play on "Wikimedia" that has caused confusion between the two ever since.

The Coming of age

 * See also MediaWiki

2004
 * Categorization system

2005
 * Hooks!
 * Major database redesign decoupling text storage from revision tracking

2006
 * ParserFunctions extension

2007
 * Gadgets extension

2008
 * FlaggedRevisions extension

2009-2010
 * ResourceLoader
 * Usability initiative (Vector skin, WikiEditor extension, etc.)

Milestones to integrate

 * creation, testing, and initial deployment of Magnus' "phase 2"
 * creation, testing, and initial deployment of Lee's "phase 3"
 * Refactorings & performance improvements in the early Brion & Tim years
 * internationalization & unicode
 * addressable logs
 * 1.5 schema refactor; Proposed Database Schema Changes/October 2004
 * compression & external storage
 * web-based installer (1.2)
 * regular releases
 * Early empowerment of end-users
 * user/site JS/CSS
 * extensions
 * 1.3: The MonoBook skin, categories, templates and extensions.
 * CentralAuth
 * Gadgets
 * API
 * 1.12 or so - preprocessor rewrite (Tim); improvements to template performance
 * 1.17 - resourceloader; beginning of strong JavaScript module APIs
 * The creation of api.php, and the addition of write actions (including edit) to it

Use

 * reusability of MediaWiki by other people, not just for our purposes; used to be difficult to install (command line installer, many references to Wikipedia, hardcoded paths); now easier with the web-based installer
 * installer. A quick note about the installer would be good. During the early days (ask Tim Starling), you had to run a shell script to install MediaWiki. Later it was changed so that you just uploaded the files and ran the /config/ wizard.
 * Making it FLOSS since the very beginning was very important for its popularity. MediaWiki has become the 800-lb gorilla of wiki software, and it wouldn't have happened with a closed development model.
 * Reusing MediaWiki to build Wikimedia Commons: it was not adapted to handling millions of media files.

Configuration

 * Globals for configuration -- partially this is because we started out in the PHP 4 world, but it has really hurt third parties over time and made the software seem rather difficult to configure and maintain.


 * Configuration variables are placed in the global namespace.
 * This had serious security implications with register_globals.
 * It limits potential abstractions for configuration, and makes optimisation of the startup process more difficult.
 * The configuration namespace is shared with variables used for registration and object context, leading to potential conflicts.
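
The points above can be sketched in a few lines. This is a hypothetical illustration (not MediaWiki's actual API, and in Python rather than PHP): module-level settings in the style of MediaWiki's $wg* globals share one namespace with everything else, whereas a config object owns its own namespace.

```python
# Global-namespace style, as in MediaWiki's $wgSitename etc.:
# any other module-level name can silently collide with a setting.
wgSitename = "MyWiki"
wgServer = "http://localhost"

class SiteConfiguration:
    """Namespaced alternative: one object owns all settings,
    which also makes the set of settings easy to cache or abstract."""

    def __init__(self, **settings):
        self._settings = dict(settings)

    def get(self, name, default=None):
        return self._settings.get(name, default)

    def set(self, name, value):
        self._settings[name] = value

config = SiteConfiguration(Sitename="MyWiki", Server="http://localhost")
print(config.get("Sitename"))  # MyWiki
```

With the object form, configuration can be loaded lazily or serialized as data, which is exactly the kind of startup optimisation the global approach makes difficult.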

Architecture overview

 * PHP, DB
 * what is the overall workflow of a user request?
 * index.php dispatches to MediaWiki class; SpecialPage class; page, revision, & user tables, Title & WikiPage classes
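
A toy sketch of that workflow, with invented code but class and table names mirroring the ones mentioned above (Title, WikiPage, the page/revision tables): the entry point resolves the requested title, looks up the page, and returns the latest revision text.

```python
class Title:
    """Normalized page title; MediaWiki stores underscores, not spaces."""
    def __init__(self, text):
        self.db_key = text.strip().replace(" ", "_")

class WikiPage:
    """Stand-in for a row in the page table plus its revision history."""
    def __init__(self, title, revisions):
        self.title = title
        self.revisions = revisions  # stand-in for the revision table

    def latest_text(self):
        return self.revisions[-1] if self.revisions else None

def run(query, pages):
    """index.php-style dispatcher: resolve title, fetch page, render."""
    title = Title(query.get("title", "Main Page"))
    page = pages.get(title.db_key)
    if page is None:
        return "404: no such page"
    return page.latest_text()

pages = {"Main_Page": WikiPage(Title("Main Page"), ["Welcome!"])}
print(run({"title": "Main Page"}, pages))  # Welcome!
```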

PHP

 * Unprefixed class names.
 * PHP core and PECL developers have the attitude that all class names that are English words inherently belong to them, and that if any user code is broken when new classes are introduced, it is the fault of the user code.
 * Prefixing e.g. with "MW" would have made it easier to embed MediaWiki inside another application or library.


 * We try to do things cleanly if there are benefits to it (e.g. separation of logic and output in the architecture) but at the same time we're not afraid to ignore standards or rules if that's better for us (e.g. not fully complying with stupid provisions of HTML4, denormalizing the DB schema where that brings performance benefits)


 * MediaWiki seems to have been started mostly by people who weren't really very expert in their field, and as a result a lot of ugly old code is lying around that lacks proper logic/view separation and has other nasty issues

MediaWiki grew organically and is still evolving. It's hard to criticise the founders for not implementing some abstraction which we now find to be critical, when the initial codebase was so small, and the time taken to develop it so short.

Structure and classes

 * main classes

We've seen major new architectural elements introduced to MediaWiki throughout its history, for example:


 * The Parser class
 * The SpecialPage class
 * The Database class
 * The Image class, then later the media class hierarchy and the filerepo class hierarchy
 * ResourceLoader
 * The upload class hierarchy
 * The Maintenance class
 * The action hierarchy

MediaWiki started without any of these things, despite the fact that all of them support features that have been around since the beginning. Many developers are driven primarily by feature development -- architecture is often left behind, only to catch up later as the cost of working within an inadequate architecture becomes apparent.


 * Internal API

Database

 * schema rewrite for performance (MediaWiki 1.5?); done early enough, and very well thought out. We've added on to it over time, but by and large it has remained the same to this day and has continued to serve us reasonably well.


 * DBMS support & abstraction
 * Manual:Database layout
 * http://yellowstone.cs.ucla.edu/schema-evolution/index.php/Schema_Evolution_Benchmark


 * DB API, cache handling (Squid purge, memcached).
 * every time the workflow requests data, there is a cache layer to speed it up
 * the first one being web caches (Squid, Varnish), then memcached for data and parser output, then the database query cache (not sure it is used)
 * queries are also split across multiple databases, caches are partitioned, etc.
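
A minimal sketch of that layered lookup (names and structure invented for illustration): try each cache tier fastest-first, fall back to the database, and refill the tiers that missed on the way out.

```python
def layered_get(key, tiers, db):
    """tiers: fastest-first list of dict-like caches; db: a callable
    standing in for the authoritative database query."""
    missed = []
    for tier in tiers:
        if key in tier:
            value = tier[key]
            for m in missed:       # refill the faster tiers we missed in
                m[key] = value
            return value
        missed.append(tier)
    value = db(key)                # source of truth, reached on full miss
    for m in missed:
        m[key] = value
    return value

apc, memcached = {}, {}           # stand-ins for local and shared caches
calls = []

def db(key):
    calls.append(key)             # record each time the DB is actually hit
    return f"row for {key}"

layered_get("page:1", [apc, memcached], db)
layered_get("page:1", [apc, memcached], db)
assert calls == ["page:1"]        # second lookup never reached the DB
```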

Performance

 * schema rewrite for performance (MediaWiki 1.5?);
 * we wrote our own database abstraction layer and load balancer. At that time, there was not much around, so we HAD to write one


 * relying on PHP/MySQL: probably not the best choices for performance. But very popular and facilitates recruitment of new devs
 * hurt: PHP has not benefited from performance improvements that some other dynamic languages have seen in recent years (eg JavaScript VMs now have aggressive JITs etc, but Zend's PHP still doesn't ship an opcode cache, much less try to actually compile anything)
 * Obviously using Java would have been much better for performance, and scaling up the execution of backend maintenance tasks would have been simpler.


 * hurt: MySQL has had a few specific areas it's lagged in that have been problematic:
 * lack of full native UTF-8 (this is finally in the latest versions, but you have to jump through some hoops and we have years of legacy databases)
 * no or limited online alter table makes even simple schema changes painful to deploy, slowing some development
 * data dump format is very hard to parallelize well; even with few changes to the database it takes forever to build one due to the compression.


 * Adding support for memcached (in-memory cache) and APC (PHP opcode cache) had a HUGE impact on performance.


 * endless compromise between performance and features, e.g. the template system degraded performance. One possibility is to turn templates into dedicated extensions


 * ResourceLoader
 * Using jQuery for javascript.


 * hurt performance: awful, awful syntax making it harder to plug in better-performing parser bits etc.


 * MediaWiki has to be webscale ;-) Unlike most PHP applications MediaWiki has been built for years now with performance as a major design goal since it absolutely must scale to WMF sites (primarily: enwiki)


 * Adding a generic caching layer did amazing things for performance. We can throw practically anything expensive into the cache and expect it to come out :)


 * As discussed above: the configuration and extension registration systems hurt MediaWiki's performance.


 * The template argument default parameter feature was ultimately the most costly feature in terms of performance. It enabled the construction of a functional programming language implemented on top of PHP.


 * When the limitations imposed on us by WMF's caching infrastructure caused problems in MediaWiki or friction with things we wanted to do in MediaWiki, we found ways around that. We didn't try to shape WMF's caching-heavy optimized-to-death architecture around MW, but we did almost the reverse: make MW more flexible so it can handle our crazy caching setup, without compromising on our performance and caching needs.


 * things we support that 'normal' people never think about:
 * DB replication/lag handling
 * reverse caching proxies
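
The replication/lag handling can be sketched like this (a hypothetical illustration; MediaWiki's real LoadBalancer is far more involved): route reads to the least-lagged replica under a threshold, and fall back to the primary when every replica is too far behind.

```python
def pick_reader(replicas, max_lag_seconds=5):
    """replicas: list of (name, lag_seconds) pairs.
    Returns the name of the server a read should go to."""
    candidates = [(lag, name) for name, lag in replicas
                  if lag <= max_lag_seconds]
    if not candidates:
        return "primary"          # all replicas too stale; read the master
    return min(candidates)[1]     # least-lagged acceptable replica

print(pick_reader([("db2", 1.5), ("db3", 0.2)]))   # db3
print(pick_reader([("db2", 30), ("db3", 45)]))     # primary
```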


 * Factors that contribute to a 'performance culture':
 * We have a few very expert people on board who know a lot about performance optimization (DB performance is a big chunk of it, but the rest is important too)
 * MediaWiki must run on a top-ten web site. Things that would not scale to that size are fixed, reverted, or put behind a config var
 * MediaWiki must run on a top-ten web site operated on a shoestring budget, which has additional implications for performance and caching
 * Specific things that have improved performance:
 * Generic caching layer support that looks the same to the developer whether the backend is memcached (preferred), APC/ECache/whatever, a database, or even nothing (null cache)
 * Using disk-backed object cache for the parser cache greatly improved the pcache hit rate and produced some awesome hit rate graphs
 * PoolCounter prevents a Michael Jackson-esque cache stampede. It's sort of difficult to verify it actually works, but we've seen things that lead us to believe it does indeed work
 * Specific things that have harmed performance
 * The fact that wikitext slowly evolved into this almost Turing-complete programming language. Wikitext that exploits these features takes forever to parse
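
The PoolCounter idea above can be sketched as follows (an invented minimal version, not the real daemon): when a hot cache entry expires, only one worker regenerates it while the others block on a per-key lock and then reuse the result, instead of stampeding the parser and database.

```python
import threading

class PoolCounter:
    """Hands out one lock per key, so concurrent workers for the same
    page serialize on regeneration."""
    def __init__(self):
        self._locks = {}
        self._guard = threading.Lock()

    def lock_for(self, key):
        with self._guard:
            return self._locks.setdefault(key, threading.Lock())

cache, pool = {}, PoolCounter()
regenerations = []

def get_page(key, render):
    if key in cache:
        return cache[key]
    with pool.lock_for(key):       # one regeneration per key at a time
        if key in cache:           # filled by another worker while we waited
            return cache[key]
        cache[key] = render(key)
        regenerations.append(key)
        return cache[key]

threads = [threading.Thread(target=get_page,
                            args=("Michael_Jackson",
                                  lambda k: f"<html>{k}</html>"))
           for _ in range(8)]
for t in threads: t.start()
for t in threads: t.join()
assert regenerations == ["Michael_Jackson"]   # rendered exactly once
```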


 * couple slides from talk at CCC in 2004

Security

 * Strongly focusing on security by providing wrappers around HTML output and DB queries that handle escaping for you, and making their use pretty much mandatory. This means that everyone is expected to write secure code, while at the same time writing secure code is made easy so everyone can do it. Thanks mostly to Tim Starling, we have institutionalized a security-minded development culture, and I think that contributes to the low number of security flaws found in MediaWiki.
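
A hedged sketch of the "make the safe way the easy way" idea (function names invented; in Python rather than PHP): small wrappers that escape HTML output and parameterize DB queries, so ordinary feature code never concatenates raw user input.

```python
import html

def element(tag, attrs, text):
    """Build an HTML element; everything user-supplied gets escaped."""
    attr_str = "".join(f' {k}="{html.escape(str(v), quote=True)}"'
                       for k, v in attrs.items())
    return f"<{tag}{attr_str}>{html.escape(text)}</{tag}>"

def select(cursor, table, field, value):
    """Parameterized query: the driver escapes the value, not the caller.
    (table and field come from code, never from users.)"""
    cursor.execute(f"SELECT * FROM {table} WHERE {field} = ?", (value,))
    return cursor.fetchall()

print(element("a", {"href": "/wiki/Foo"}, "<script>alert(1)</script>"))
# <a href="/wiki/Foo">&lt;script&gt;alert(1)&lt;/script&gt;</a>
```

Because the wrappers escape by default, a developer has to go out of their way to produce unescaped output, which inverts the usual failure mode.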


 * WebRequest / Sanitizer
 * user input sanitization. We have WebRequest to grab parameters given by a user and make them safe, to avoid code injection
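
A minimal sketch of the WebRequest idea (method names approximate, the code illustrative): all user parameters come through typed getters with defaults, so malformed input degrades safely instead of flowing raw into the application.

```python
class WebRequest:
    """Typed access to request parameters; never hand out raw values."""
    def __init__(self, params):
        self._params = params

    def get_int(self, name, default=0):
        try:
            return int(self._params.get(name, default))
        except (TypeError, ValueError):
            return default          # malformed input degrades safely

    def get_text(self, name, default=""):
        value = self._params.get(name, default)
        return str(value).replace("\x00", "")   # strip NUL bytes

req = WebRequest({"limit": "50", "offset": "abc"})
print(req.get_int("limit"))    # 50
print(req.get_int("offset"))   # 0
```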

Languages
(both for content and interface, and omnipresent in MediaWiki, hence dedicated section)


 * Using per-language encodings led to a lot of issues. Eventually everything was migrated to UTF-8, which makes things easier when you deal with hundreds of different languages. Per-language encoding was eliminated in 1.5, along with the "big schema change".


 * i18n/l10n system


 * We are fully committed to internationalizing our software into any imaginable language. This i18n support is quite pervasive and impacts many parts of MediaWiki, but we stuck with it anyway, and we now have a very feature-rich i18n system.


 * things we support that 'normal' people never think about:
 * internationalization in 350+ languages
 * the ability to have the interface and content in different languages
 * right-to-left languages
 * mixed directionality (i.e. interface language and content language have opposite directionality)

Authentication

 * A cleaner accounts system that spanned multiple sites from the beginning would have saved lots of trouble; CentralAuth is still a bit hacky to work with.

Permissions

 * Lack of a unified, pervasive permissions concept.
 * This stymied the development of new user rights and permissions features, and led to various security issues.

Content structure

 * Page title: CamelCase, then free links. Free links allowed for greater flexibility in link text and page names and made things less confusing. They have since become the de facto standard for internal links in most wiki software.
 * subpages
 * namespaces
 * The flat namespace for articles is too simple: for Wikipedia it encourages overly long pages (which leads to performance problems, as we have to parse and copy around huge chunks of text that will not usually be read all at once, and makes it harder to navigate to relevant, more digestible chunks of data). Other sites like Wikibooks, Wikisource, Wikiversity, heck even mediawiki.org, could benefit a lot from more structured entities that consist of multiple subpages. It also makes it harder to separate draft or provisional pages from the published article space.

Wikitext & Parser

 * tokenizer to parse wikitext (JeLuF wrote it). Unfortunately, poor performance of PHP array memory allocations led to a revert after 3 days of running it on the live site. We have been back to the huge pile of regexps since then.
 * Cleaner markup syntax near the beginning would simplify our lives a lot with template & editing stuff
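
A toy illustration of the "pile of regexps" approach (grossly simplified; the real parser handles vastly more syntax, and the edge cases are exactly the problem): each pass is a substitution over the whole text.

```python
import re

def parse(wikitext):
    """Convert a tiny subset of wikitext to HTML, one regex pass each."""
    text = re.sub(r"'''(.+?)'''", r"<b>\1</b>", wikitext)      # bold
    text = re.sub(r"''(.+?)''", r"<i>\1</i>", text)            # italics
    text = re.sub(r"\[\[([^|\]]+)\|([^\]]+)\]\]",              # piped link
                  r'<a href="/wiki/\1">\2</a>', text)
    text = re.sub(r"\[\[([^\]]+)\]\]",                         # free link
                  r'<a href="/wiki/\1">\1</a>', text)
    return text

print(parse("'''Bold''' and [[Free links|links]]"))
# <b>Bold</b> and <a href="/wiki/Free links">links</a>
```

Even this toy shows why the approach is fragile: the passes interact (bold must run before italics), and nesting or malformed markup quickly produces surprising output.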


 * The parser wasn't formally spec'd from the beginning--it just morphed and evolved as needs have demanded. This makes it difficult for alternative parsers to exist and has made changing the parser hard. The parser's spec is whatever the parser spits out, plus a few hundred test cases.


 * The parser has to remain very very stable. Hundreds of millions of wikipages worldwide depend on the parser to continue outputting HTML the way it always has. It makes changing the parser difficult.

Making wikitext such a complex and idiosyncratic language that parsing it with 'normal' parsers is very hard was definitely a bad move, and we're feeling the pain now

Editing

 * metadata (categories, interwiki links) in the body of the text. This should really have been stored in a different table / interface.


 * The visual editor project is way overdue. We're fixing it now, and that's good, but it's kind of ridiculous that, in 2011, the main interface of one of the largest sites on the web is still a plain &lt;textarea&gt; from the 90s

Templates

 * While we have plenty of things to whinge about in the syntax, management etc, the ability to create partial page layouts and reuse them in thousands of articles with central maintenance has been a big boon.

ParserFunctions

 * ParserFunctions never should've seen the light of day. Granted, we were responding to the needs of the time, but if we did it over, we probably should've taken the time to solve the problem properly rather than putting in PFuncs as a stopgap measure. This page has some interesting history on the subject (Also this and also older versions of this page)

Media

 * DB & filesystem storage layout for media files is very awkward, with a number of problems that hinder our ability to mirror, cache, and do major online maintenance.


 * File repository code could've been done slightly differently. Ideally wikis should be able to upload *to* foreign repos, rather than just read from them. Also, most of the code assumes a local filesystem or NFS, which isn't very flexible -- other backends like the database, Swift, etc. shouldn't be so hard to add.


 * Replication support

Community

 * Open source from the start


 * MediaWiki's first couple of years involved zero engineering budget and a lot of volunteer turnover. The original authors of the phase 2 and phase 3 PHP codebases were Wikipedians with a technical bent, as were most of our other early devs; folks like Brion started on fixes, internationalization, and feature support based on their ability to see the source in CVS and the bug tracker on SourceForge, discuss with other devs on the wikis and mailing lists, and get software updates actually pushed to production.


 * Regular releases and the installer. Making sure the software was easy to set up was a BIG factor in getting more volunteer developers involved, both directly (lowering the barrier for people already interested in contributing) and indirectly (easier third-party usage, leading to people sending fixes and customizations upstream).


 * Deploy-then-release

backwards compatibility

 * Some aspects, such as hooks or configuration variables, remain very stable for a long time. When they change, they typically go through a slow deprecation process to allow users and extension authors to catch up.
 * However, our internal APIs change all the time, which can be frustrating to extension authors (and even core devs!)
 * I think this will improve in the coming releases though. A lot of our "omg rewrite" situations in the past ~2 years have been to bring ancient code into the 21st century.
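
That deprecation process can be sketched like this (an invented illustration in Python; names and version numbers are approximate, not MediaWiki's actual API): the old entry point keeps working but emits a warning, then disappears a couple of releases later.

```python
import warnings

class Message:
    """Stand-in for the newer message API that replaces the old helper."""
    def __init__(self, key):
        self.key = key

    def text(self):
        return f"<{self.key}>"

def wf_msg(key):
    """Deprecated wrapper kept around for extension compatibility."""
    warnings.warn("wf_msg is deprecated; use Message instead",
                  DeprecationWarning, stacklevel=2)
    return Message(key).text()

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    assert wf_msg("welcome") == "<welcome>"            # still works...
    assert caught[0].category is DeprecationWarning    # ...but complains
```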


 * Inconsistent. Such is the result of having many volunteer developers with many different opinions on this fraught issue.


 * Most old methods/functions are kept in the code; nowadays they are marked as deprecated and removed after 2 or 3 releases. We still support ten-year-old skins.
 * The wikitext parser and renderer still support hacks we really want to remove for performance reasons.

QA

 * Peer reviewing of every single patch since day 1.

Beyond MediaWiki

 * Several levels:


 * System administrator:


 * Wiki administrator:


 * Common.js, Common.css, and skin-specific counterparts
 * Gadgets (per wiki, but can be enabled/disabled by each user)
 * New developments on gadgets (central repo, UI improvements)


 * User:


 * User:Foo/vector.css, User:Foo/common.js

Side programs

 * interaction with other pieces of software (e.g. LaTeX, ImageMagick, etc.)

Skins & extensions

 * hooks
 * Extension architecture. fairly flexible infrastructure which has helped us to make specialized code more modular, keeping the core software from expanding (too) much and making it easier for 3rd-party reusers to build custom stuff on top.
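
The hook mechanism can be sketched in miniature (hook and function names invented for illustration): core code fires named events, extensions register callbacks, and a handler returning False aborts the operation, so features can live entirely outside core.

```python
hooks = {}

def register_hook(name, callback):
    """Called by an extension at setup time."""
    hooks.setdefault(name, []).append(callback)

def run_hooks(name, *args):
    """Called by core; runs handlers in order.
    A handler returning False aborts the operation."""
    for callback in hooks.get(name, []):
        if callback(*args) is False:
            return False
    return True

log = []
register_hook("ArticleSave", lambda title, text: log.append(title))
register_hook("ArticleSave", lambda title, text: "spam" not in text)

assert run_hooks("ArticleSave", "Main Page", "hello") is True
assert run_hooks("ArticleSave", "Spam Page", "buy spam") is False
```

The key property is that core only knows the hook names, never the extensions, which is what keeps specialized code out of the core codebase.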


 * The skin system has been terrible since the beginning. It's damn near impossible for 3rd parties to write their own custom skins without reinventing the wheel.


 * Extension registration based on code execution at startup rather than cacheable data.
 * Limits abstraction and optimisation

Gadgets and site/user JS/CSS

 * Hugely impactful: this has greatly democratized MediaWiki's software development. Individual users are empowered to add features for themselves; power users can share these with others, both informally and through globally-configurable, admin-controlled systems.


 * Using jQuery for javascript.

Future

 * Current RfCs, etc.
 * New parser (specification), wiki dom, Visual editor