Language Testing Plan

Automatic Browser Tests
The Language Engineering team is collaborating with the QA team to build automatic browser tests for its extensions and features. The scenarios are written in Cucumber and implemented in Ruby. The following section lists the test scenarios for several features, which can then be moved to Gerrit and GitHub.

Test Scenarios

 * 1) ULS Test Scenarios
 * 2) Translate Test Scenarios
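To give a flavour of the scenarios above, here is a hypothetical Cucumber scenario for a ULS feature; the exact step wording and feature text are illustrative assumptions, not part of the team's actual suite:

```gherkin
Feature: Universal Language Selector
  Scenario: Changing the interface language
    Given I am logged in on a test wiki with ULS enabled
    When I open the language selector from the personal toolbar
    And I search for "suomi" and select it
    Then the interface language should change to Finnish
```

Each such scenario is then backed by Ruby step definitions that drive the browser.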

Bug Triage
The Language Engineering team hosts an open bug triage session on the 4th Wednesday of every month, at 17:00 UTC.

Announcements are made over mailing lists and are also marked in the Project:Calendar.

The team also carries out regular bug triages whenever required.

Notes from these sessions can be found in the following section.

Notes from past sessions

 * 24 April, 2013 - Translate (TUX) bugs
 * 22 August, 2012 - i18n bugs in Mediawiki extensions
 * 9 May, 2012 - Translate feature bugs
 * 18 January, 2012 - Webfonts
 * 14 September, 2011 - Miscellaneous i18n bugs

Test days - Plan overview
THE FOLLOWING SECTION IS ONLY A DRAFT

This section outlines the plan for a pilot series of testing events for tools developed by Wikimedia’s Language Engineering team. The plan is based on the current development release; the testing framework may change depending on the focus of each release cycle and on feedback about the previous testing cycle.

Pre-Event Tasks

 * Mark a completion stage within a release cycle (i.e. mapped to the Mingle-based sprints) for testing. The test events in the series are:
   * One big test day approximately halfway into the release (on translatewiki.net or on a deployment in WMF Labs)
   * Another big test day before the final bugs-based sprint, on a dark-launched version of the feature
   * One more before the public launch (optional)


 * Set up a lab/virtual environment with the released version of the component(s) to be tested
 * Prepare test scenarios for the components and build test cases for each
 * Prepare complete walkthrough instructions for the test cases
 * Set up a wiki page (or any better option) with the test cases, the instructions, and a link for recording results
 * Results should offer the following fields (a Google spreadsheet + form may be useful):
   * Pass/fail marks (✓ and ✗)
   * Additional notes
   * Screenshots
   * Bug number
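The result fields above can be sketched as a simple record, here in Ruby; the field names, the sample test-case name, and the bug number are hypothetical placeholders:

```ruby
# One recorded test result, mirroring the fields listed above:
# pass/fail, additional notes, a screenshot link, and a bug number.
TestResult = Struct.new(:test_case, :passed, :notes, :screenshot_url, :bug_number)

# Hypothetical example entry a tester might submit.
result = TestResult.new(
  "ULS-01: switch interface language",  # illustrative test-case name
  false,                                # ✗ — the check failed
  "Language dropdown did not close",
  nil,                                  # screenshot is optional
  47123                                 # illustrative bug number
)

# Count passes and failures across all submitted results.
def summarize(results)
  results.partition(&:passed).map(&:length)  # [pass_count, fail_count]
end
```

A spreadsheet export could be fed into `summarize` to get a quick pass/fail tally for the event report.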


 * Add an option to file a bug (through a template) for an issue (but don’t make it mandatory for the testers)
 * List known issues so that testers don’t spend too much time on them
 * Set a 7-10 day window for the event to be open (time zones and volunteer availability are a concern)
 * Have a team member (the QA person) do a dry run of the test event
 * Open the event and send announcements to:
   * appropriate mailing lists
   * social media
   * etc.


 * Outgoing communication should include:
   * Links to the testing event instructions
   * The Google Hangout/IRC channel where discussion related to the testing event will take place
   * Ways to get information if no one is around to provide live help

Tasks during the Event

 * Test organizers (or people with enough information to help testers) must be present on Google Hangout/IRC
 * Someone who knows how to fix the lab installation if it breaks should also be around

Post-Event Tasks

 * Go through all the feedback received from the test results
 * File bugs/issues for inclusion into the development cycles
 * Prepare event reports

Notes:


 * Types of Testing
   * A test event based on a development release cycle can target regression and new-feature tests
   * A general test event can be used for smaller general updates (such as changes in the Message or Language class, jqueryMsg, cldr, etc., as these happen quite often)


 * Frequency
   * A development release cycle can have at least two test events
   * The second one goes before the final bugs-based sprint, to catch any potential blockers
   * A general test event can be held less frequently


 * Test Cases
   * These are set based upon the type of test event
   * Previously written testing instructions can be reused for regression testing


 * Deployment requirements for the testing environment
   * Scripts to automatically deploy MediaWiki:
     * In the needed version (Git commit hash)
     * With the right extensions
     * In multiple languages (all, or a subset defined for testing)
     * With at least some content in those languages
     * With relevant interface customizations from the respective Wikipedia (or another project): gadgets, common.css, common.js, etc.
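The deployment script above might begin as a helper that assembles the shell commands to run, sketched here in Ruby. The repository URLs, wiki name, and install flags are assumptions for illustration, not a working deployment recipe:

```ruby
# Hypothetical sketch: build the list of shell commands that would deploy
# MediaWiki at a given commit with a given set of extensions. The commands
# are only assembled here, not executed.
def deploy_commands(commit:, extensions:, languages:)
  cmds = [
    # Assumed repository URL; the real script would use the team's Gerrit setup.
    "git clone https://gerrit.wikimedia.org/r/mediawiki/core mediawiki",
    "git -C mediawiki checkout #{commit}",   # the needed version (commit hash)
  ]
  extensions.each do |ext|                   # the right extensions
    cmds << "git clone https://gerrit.wikimedia.org/r/mediawiki/extensions/#{ext} " \
            "mediawiki/extensions/#{ext}"
  end
  # Installing with the first test language; content in the other languages
  # and interface customizations (gadgets, common.css, common.js) would be
  # imported in later steps not sketched here.
  cmds << "php mediawiki/maintenance/install.php --lang #{languages.first} testwiki admin"
  cmds
end
```

For example, `deploy_commands(commit: "abc123", extensions: ["UniversalLanguageSelector", "Translate"], languages: ["fi", "he"])` yields the clone, checkout, and install commands for a two-extension test wiki.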


 * Pony features
   * A test dashboard that will list:
     * Upcoming test days
     * A link to the testing environment
     * A link to start a test, leading to the test instructions and to recording the test results
   * Benefit: this can be used as the go-to place for anything related to Language testing, and only one URL needs to be handed out when communicating about test events