Readers/Web/Instrumentation Overview

2 Months from Launch

 * Gather prerequisites
 * Plan in advance what to do with results and what actions to take.

1.5 Months from Launch
Begin writing instrumentation

1 Month from Launch
Complete A/B Test instrumentation

2 Weeks from Launch
Enable a dummy A/B test ahead of time on the Beta Cluster and Test Wiki (production) to verify the A/B test mechanism separately from the actual test.

1 Week from Launch
Deploy on smaller language wikis and test

Launch Date
Deploy on English Wikipedia

Two Weeks after Launch
Turn off A/B test

Prerequisites

 * Get the research objective and schema from the data analyst; both can be found in the Phabricator ticket.
 * Name of what is being tested: identify the component that will be tested. In this example, we are A/B testing the Zebra skin, so the name is skin-vector-zebra-experiment. Keep this name for later.
 * Variations: identify the different versions of the component that you'll be testing. The variations in this example are vector-feature-zebra-design-enabled and vector-feature-zebra-design-disabled.
 * Make sure the answer is "yes" to the following question: "Is the focus or the aim of the test on change management and phasing out features to users?"

Files to modify
skin.json


 * 1) Modify the lines containing the A/B test configuration.
 * 2) In "VectorWebABTestEnrollment", set "name" to the value decided on earlier (skin-vector-zebra-experiment).

{{terminal|title=Example:|text=
"VectorWebABTestEnrollment": {
	"value": {
		"name": "skin-vector-zebra-experiment",
		"enabled": false,
		"buckets": {
			"unsampled": { "samplingRate": 0 },
			"control": { "samplingRate": 0.5 },
			"treatment": { "samplingRate": 0.5 }
		}
	}
}
}}

ServiceWiring.php

If the experiment is not enabled or does not match the specified name, the requirement returns true. Otherwise, it calculates the user's variant from their central user ID and returns a boolean indicating whether the user is in the "control" or "treatment" group. The requirement is registered under the name REQUIREMENT_ZEBRA_AB_TEST, which is defined in Constants.php.
 * 1) Register the A/B test requirement with the Vector skin's feature manager.
 * 2) Create a MultiConfig object that contains two configurations: one with a single key (in this instance REQUIREMENT_ZEBRA_AB_TEST) set to true, and the main configuration obtained through $services->getMainConfig().
 * 3) Register a new instance of ABRequirement with the FeatureManager. This checks whether the user is enrolled in the skin-vector-zebra-experiment experiment by reading the VectorWebABTestEnrollment parameter from the main configuration.
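Put together, the wiring can be sketched roughly as follows. This is an illustration of the three steps above, not the exact Vector code: the ABRequirement constructor signature and the registerRequirement method name are assumptions, while Constants::REQUIREMENT_ZEBRA_AB_TEST and the experiment name come from the steps above.

```php
// ServiceWiring.php (sketch only): register the A/B test requirement
// with the feature manager. HashConfig supplies the single
// REQUIREMENT_ZEBRA_AB_TEST key; MultiConfig layers it over the main
// configuration so all other keys still resolve normally.
$featureManager->registerRequirement(
	new ABRequirement(
		new MultiConfig( [
			new HashConfig( [ Constants::REQUIREMENT_ZEBRA_AB_TEST => true ] ),
			$services->getMainConfig(),
		] ),
		// Must match the "name" in VectorWebABTestEnrollment.
		'skin-vector-zebra-experiment'
	)
);
```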

Other useful tools

 * 1) Hue – use this to query events. (Remember: data is limited to 90 days.)
 * 2) DataHub – historically, the team used Google Spreadsheets for schema tracking, but we are currently transitioning to referencing and recording schemas here.

Writing the variations

 * 1) Configure testing parameters in LocalSettings.php, such as the percentage of traffic that will see each version.
 * 2) Define an array for A/B testing in the Vector skin of MediaWiki.

To allocate 50% of users to the "control" bucket and 50% to the "treatment" bucket, use the following format.
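A sketch of that LocalSettings.php array, mirroring the skin.json default shown earlier; the $wg-prefixed global is the standard MediaWiki way to override a skin.json config key, and the value shape is taken from the VectorWebABTestEnrollment example above:

```php
// LocalSettings.php – 50/50 split between "control" and "treatment".
// "unsampled" holds users excluded from the test entirely.
$wgVectorWebABTestEnrollment = [
	'name' => 'skin-vector-zebra-experiment',
	'enabled' => true,
	'buckets' => [
		'unsampled' => [ 'samplingRate' => 0 ],
		'control' => [ 'samplingRate' => 0.5 ],
		'treatment' => [ 'samplingRate' => 0.5 ],
	],
];
```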

Launching test
Once you've set up your A/B test and determined your sample size, commit the patch containing the test as seen here.

Coordinate with the PM and CRS and make sure the messages announcing the test have been posted:


 * at least a week before the estimated time of launch – to give heads-up about a change of user experience, and make it possible for the communities to look for possible bugs in the user-generated code
 * in the week of the estimated time of launch – when the quality of configuration has been confirmed

Phase 1
Limit configuration enabling to test.wikipedia.org and test2.wikipedia.org. Avoid launching the test on active content wikis without launching it on test wikis or closed content wikis first.
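In Wikimedia production, per-wiki enabling of this kind is typically expressed as overrides in the wmf-config InitialiseSettings.php file. The stanza below is an illustrative sketch of limiting the test to testwiki, not the exact production configuration; the key names follow the LocalSettings.php format, and test2wiki would get an analogous override.

```php
// Sketch (hypothetical stanza): keep the test disabled everywhere by
// default and enable it only on test.wikipedia.org during Phase 1.
'wgVectorWebABTestEnrollment' => [
	'default' => [
		'name' => 'skin-vector-zebra-experiment',
		'enabled' => false,
		'buckets' => [
			'unsampled' => [ 'samplingRate' => 0 ],
			'control' => [ 'samplingRate' => 0.5 ],
			'treatment' => [ 'samplingRate' => 0.5 ],
		],
	],
	'testwiki' => [
		'name' => 'skin-vector-zebra-experiment',
		'enabled' => true,
		'buckets' => [
			'unsampled' => [ 'samplingRate' => 0 ],
			'control' => [ 'samplingRate' => 0.5 ],
			'treatment' => [ 'samplingRate' => 0.5 ],
		],
	],
],
```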

Analyst handoff
After the test has run for a sufficient amount of time, the analyst will check the results to determine which variation performed better.

Next steps

 * Once we have identified the winning variation and the project manager approves, open a new patch to implement it. (Example forthcoming)
 * Ensure that the changes are properly documented and communicated to relevant stakeholders.