Quality Assurance/Strategy

Manual testing
This group will become our central repository for information about manual, functional, and feature testing.

Goals

 * Improve Wikimedia software products:
   * User-perceived quality.
   * Areas difficult to cover with automated testing.
 * Grow the Wikimedia technical community:
   * Accessible entry point for Wikimedia users and editors; no technical skills required.
   * Good motivation for experienced and professional testers.

We still need a central "these are our QA priorities" page.

Volunteer profiles

 * Wikipedia/Wikimedia editors
 * Motivated users willing to try what's coming next.
 * Experienced / professional testers willing to contribute.
 * Companies developing products where Wikipedia/Wikimedia software needs to run well.
 * ... and of course other regular contributors at https://bugzilla.wikimedia.org/ willing to get involved in a more structured way.

Consolidating a testing team
We need to identify, empower, and hand leadership to those experienced in testing and QA, and to those experienced in Wikimedia software and community.

We must build a healthy meritocratic structure, with a dose of fun and incentives for those making great progress and helping others progress as well.

Based on this, a set of profiles:


 * Senior testers - can help and teach others to test.
 * Organizers - can increase the quantity and quality of QA activities.
 * Connectors - can bridge QA volunteer efforts with development teams.
 * Promoters - can help reach out to new volunteers.

Activities
In theory, almost all combinations apply across:


 * Testing vs. bug triaging.
 * Online vs. on-site.
 * DIY vs. team sprint.
 * Synchronous vs. asynchronous.

However, not all combinations are equally productive in every context or for every goal. For instance, a face-to-face team sprint requires well-defined scope and goals, and heavy involvement from the development team. Discrete (individual) tasks can provide great results, as long as they don't block urgent deliveries and critical paths.

We need good documentation so that we can replicate at least these three procedures efficiently:


 * Online testing sprint: how to organize, announce, perform, and evaluate it.
   * Proposed: right after deployment of new MediaWiki versions to non-Wikipedias.
   * Proposed: right after feature deployments.
   * Note: this requires availability of effective announcements and release notes.
   * For examples, see Mozilla Test Days, Fedora Test Days, and our own Weekend Testing Americas held on 2012-05-05 and Article Feedback Testing on 2012-06-09.
   * See also session-based testing.
 * Individual testing: tasks that a person can perform and report on, anytime and anywhere.
 * Individual bug triaging: bug reports to look at, and instructions to improve their status.

Reaching out
We need to go beyond sporadic, isolated efforts and build a continuous, incremental flow of activities. The success of each activity must contribute to future successes.

We need to let people know about ongoing / DIY opportunities as well as events. We need to reach out to the current MediaWiki / Wikimedia communities as well as to external groups and potential new contributors.


 * Calendar: a central place where activities are announced. It should be possible to subscribe and receive notifications of new activities.
 * QA communities: reaching out and having processes in place to promote our activities.
   * Work with promoters to spread the news.
   * Contact companies testing Wikipedia in their products, e.g. browser developers.
   * Organize on-site activities engaging local groups.
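The calendar item above could be backed by a subscribable iCalendar feed, which most calendar clients can poll for new activities. A minimal sketch of generating such a feed with the Python standard library (the event names, dates, and feed hosting are hypothetical, not decided anywhere in this document):

```python
from datetime import datetime, timezone

def make_ics(events):
    """Build a minimal iCalendar feed from (summary, start, end) tuples.

    Timestamps are expected to be timezone-aware UTC datetimes.
    """
    fmt = "%Y%m%dT%H%M%SZ"
    lines = ["BEGIN:VCALENDAR", "VERSION:2.0",
             "PRODID:-//QA//Activity Calendar//EN"]
    for summary, start, end in events:
        lines += [
            "BEGIN:VEVENT",
            # A stable UID lets subscribers deduplicate across refreshes.
            "UID:%s-%s" % (start.strftime(fmt), summary.replace(" ", "-")),
            "DTSTART:%s" % start.strftime(fmt),
            "DTEND:%s" % end.strftime(fmt),
            "SUMMARY:%s" % summary,
            "END:VEVENT",
        ]
    lines.append("END:VCALENDAR")
    return "\r\n".join(lines) + "\r\n"

# Hypothetical entry for an online testing sprint.
feed = make_ics([
    ("Online testing sprint",
     datetime(2012, 7, 14, 16, 0, tzinfo=timezone.utc),
     datetime(2012, 7, 14, 18, 0, tzinfo=timezone.utc)),
])
```

The same feed, published at a stable URL, would cover both the "subscribe" and "notification" requirements without any custom client software.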

Follow-up activities
Testing events require follow-up to:


 * Evaluate and announce the outcome.
 * Triage and process the feedback received into the regular development flow.
 * Keep the contributors engaged.
 * Warm up for the next event.

For instance, it is a good idea to organize an online bug triaging sprint after a testing event.

We need a way to keep in touch with participants in events and get back to them.

Community incentives
To be defined. Some ideas:


 * Tester barnstar.
 * "I test Wikipedia" shirt.
 * Sponsored training, e.g. AST courses.
 * Sponsored travel to Wikimania.

Test automation

 * Browser testing Group
 * How the community works on browser test automation
 * Jenkins host with connections to Sauce Labs
 * QA/test backlog

Future
Emphasis on good unit tests

Measuring success
Activities
 * One Features testing activity per month.
 * One Browser testing activity per month.
 * One Bug management activity every other week.
 * Co-organized with different teams, focusing on different areas.
 * In sync with WMF and other community priorities.

Results
 * Tangible progress after each activity: bugs reported and triaged, scenarios written, improved end-user perceived quality.
 * Improvements to documentation and processes, enabling the QA volunteer effort to scale.
 * The development team values the exercise and is willing to repeat it.

Participation
 * Pool of at least 25 regular non-WMF contributors in each QA category.
 * At least 10 non-WMF contributors in synchronous activities.
 * Mix of newcomers and repeaters in each activity.
 * Involvement of Wikimedia editors.
 * Involvement of contributors active in other OSS projects, new to Wikimedia.
 * Involvement of existing groups.

Organization
 * Topic and dates are agreed upon two weeks before the activity starts.
 * Goals are defined before an activity and evaluated right after.
 * Preparing a new activity takes less effort.
 * The stakeholders related to an activity are aware and engaged.
 * Participants can start learning / contributing right away.
 * Non-WMF contributors involved in the organization.

Evaluation checklist

 * For an example, see QA/Browser testing/Search features.

★★★☆☆


 * Summary:
 * Results:
 * Participation: WMF? Wikimedia? Other?
 * New:
 * Repeating:
 * Mood:
 * Documentation:
 * Promotion:
 * Developer team:
 * Organizers:
 * Lessons learned:

Individual activities

 * How easy it is, and how long it takes, to make a first contribution.
 * Positive / negative feedback, complaints, bugs.
 * Individual contributors showing up in community channels and team activities.
 * Statistics on individual contributors (to be defined).
 * QA contributor retention.
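One way to make the retention statistic concrete while it is "to be defined": given a log of (contributor, activity) pairs, measure the fraction of contributors who return for more than one activity. A minimal sketch with hypothetical data (the data source and the exact definition of retention are open questions, not decisions made here):

```python
from collections import Counter

def retention_rate(participations):
    """Fraction of contributors who took part in more than one activity.

    participations: iterable of (contributor, activity) pairs; duplicate
    pairs (the same person signed into the same activity twice) are ignored.
    """
    events_per_person = Counter(who for who, _ in set(participations))
    if not events_per_person:
        return 0.0
    repeaters = sum(1 for n in events_per_person.values() if n > 1)
    return repeaters / len(events_per_person)

# Hypothetical log: alice attended two activities, bob only one.
log = [("alice", "sprint-1"), ("alice", "triage-1"), ("bob", "sprint-1")]
rate = retention_rate(log)  # 1 of 2 contributors repeated -> 0.5
```

Tracked per quarter, this single number would also feed the "mix of newcomers and repeaters" goal above, since the same log distinguishes first-time from returning participants.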