Quality Assurance

Quality Assurance at the Wikimedia Foundation is about answering two questions:


 * What should the software do? (Are we building the right thing?)
 * How should the software function? (Are we building the thing right?)

The biggest issue facing Wikipedia today is the declining number of editors. So the answer to the first question, "What should the software do?", is: the software should increase the number of editors for Wikipedia.

Quality Assurance and testing works with the software development projects to answer the second question, "How should the software function?". The software development effort at WMF is divided into several projects:


 * Core features is devoted to building significant features for Wikipedia dedicated to encouraging and supporting editors. Flow is currently the team's most important project.
 * Growth is devoted to projects that bring in new editors. This team does a lot of A/B testing.
 * VisualEditor: having to learn wikitext is a barrier to editing Wikipedia, and sometimes a burden even to those who know it well, so we are building a visual editor for Wikipedia.
 * Find out about Mobile software projects at the Mobile Gateway.
 * Find out about Language software projects at the Language portal.

Software testing
We have two main approaches to software testing: QA/features testing, which relies heavily on exploratory testing ("ET"), and automated browser testing.

Exploratory Testing is "simultaneous learning, test design and test execution" or "test design and test execution at the same time". ET is a powerful approach that everyone should know.

Our automated browser tests use Cucumber to define test scenarios and implement the Page Object design pattern. Most of the browser test code is in the repositories of the extensions being tested, but we manage some tests independently.
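The Page Object pattern mentioned above can be sketched as follows. This is an illustrative example, not WMF code: the class, URL, and locators are hypothetical, and a stub driver stands in for a real browser driver such as Selenium WebDriver. The point of the pattern is that locators and page mechanics live in one class, so test steps read as user intent and a UI change means editing a single place.

```python
class FakeDriver:
    """Stand-in for a real browser driver (e.g. Selenium WebDriver)."""

    def __init__(self):
        self.url = None
        self.fields = {}

    def goto(self, url):
        self.url = url

    def fill(self, locator, text):
        self.fields[locator] = text


class LoginPage:
    """Hypothetical page object for a wiki login page."""

    # Locators and the URL are defined once here; if the UI changes,
    # only this class needs updating, not every test scenario.
    URL = "https://example.wiki/wiki/Special:UserLogin"  # hypothetical URL
    USERNAME_FIELD = "#wpName1"      # hypothetical locator
    PASSWORD_FIELD = "#wpPassword1"  # hypothetical locator

    def __init__(self, driver):
        self.driver = driver

    def open(self):
        self.driver.goto(self.URL)
        return self

    def login(self, username, password):
        self.driver.fill(self.USERNAME_FIELD, username)
        self.driver.fill(self.PASSWORD_FIELD, password)
        return self


# A Cucumber step definition would then read at the level of intent,
# with no locator bookkeeping in the test itself:
driver = FakeDriver()
LoginPage(driver).open().login("Example", "secret")
```

In a real suite, the step definitions behind a Cucumber scenario ("Given I am logged in...") would call methods like `login` on page objects, keeping the scenario text readable by non-programmers.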

Test environments
We have two main test environments.

One test environment is known as "beta labs" or "the beta cluster". Here we run the latest version of the master branch of the wiki software and all extensions. The code on beta labs is updated automatically every few minutes, and the databases about every hour. On the beta cluster we test the most recent software features that are assumed to be viable. We do not host wild experiments or unsupported features on beta labs; it carries only the latest master branch of features to be deployed to production.

The other test environment is "test2wiki" or "test2". This environment is a node on the production wiki cluster, a peer wiki to English Wikipedia, Commons, etc. This environment is the first target for a potentially deployable branch of all the code, and is updated weekly, one week ahead of all the production wikis.

Resources
The QA mailing list is a great resource, not only for testing Wikipedia software but also for general discussion of QA and testing practice.

We do issue tracking in Bugzilla.

Our source code is in Gerrit and is mirrored on GitHub.

More information
Because our QA effort is spread across Wikimedia Engineering we are not always 100% engaged with every project. We have a guide on when to use QA services.

We also collaborate with Bug management, Continuous integration, Wikimedia Labs and the testing plans of other Wikimedia Engineering teams.

Status
See also the QA Strategy and Roadmap.

Features testing
QA on new features happens mainly through manual testing of software builds produced by the continuous integration process. Developers and community testers manually check functions and features, either on their own or through organized testing activities.

If you are interested in manual/feature/functional testing, join the proposed MediaWiki Group Features testing.

For more details, see Features testing.

We practice three levels of, or approaches to, features testing.

Browser testing
We create and maintain automated browser-level tests with a focus on compatibility and regressions.

We are looking for contributors! Technical experience is NOT required.

If you are interested in automated browser tests, join the proposed MediaWiki Group Browser testing.

For more details, see Browser testing.