Growth/Personalized first day/Variant testing

In December 2019, the Growth team began testing variants of features against each other, instead of just building one version of a feature and seeing whether or not it increases new editor retention. Testing variants will help us learn and iterate faster. This page contains running lists of variants we want to test for our different features, which we'll draw from when planning tests going forward. The lists are deliberately exhaustive, but we expect to test fewer than 10 of these per year.

Newcomer tasks variants
Below is the list of feature variants we would be interested in testing for newcomer tasks. Completed tests have summarized results in the table and full results in the section below.

General homepage variants
Below is the list of feature variants we would be interested in testing for the newcomer homepage.

Initiation Part 1 (A vs. B)

 * Summary
   * Variant A: suggested edits is initiated from the start module, and then the user receives mandatory onboarding screens.
   * Variant B: suggested edits is pre-initiated when the user arrives on the homepage, and there is no onboarding.
 * Hypothesis: having the suggested edits module visible on arrival will be attractive to users, and they will be more likely to interact with it and do more edits.
 * Result: though Variant B did cause more interaction, it did not cause more edits. We believe that the onboarding in Variant A helps users understand what to do once they arrive on an article, making them more likely to complete edits.
   * On desktop, Variant B yields double the interaction (clicking on anything in the suggested edits module), 60% more navigation (clicking on arrows to navigate to different suggested articles), and 30% more clicking on suggested articles.
   * On mobile, the variants perform the same for interaction and navigation, but Variant B leads to 15% less clicking on suggested articles.
   * On mobile, Variant A leads to 40% more newcomers editing than Variant B, but the edit rates are statistically the same on desktop.
 * Takeaways:
   * Making the suggested edits module more prominent on the homepage gets more users to interact with it.
   * Neither Variant A nor B successfully made the module prominent on mobile.
   * Though the overlays were not present in Variant B, they may still play an important role in giving the user context for what they're supposed to be doing with the module. It's possible that this is reflected in the data in two ways:
     * Variant A yields 40% more newcomers editing on mobile than Variant B.
     * The increases from Variant A to Variant B on desktop are lower with each successively "more engaged" type of engagement. In other words, simply interacting with the module increased by 100%, but navigating between articles increased by only 60%, and selecting an article by only 30%. This may show that users don't have sufficient context to understand that they are supposed to navigate between and select articles.
 * Next steps: we turned Variant A on for all newcomers while we build Variants C and D, because it was shown to yield more newcomers completing edits.
   * We'll use these learnings to design Variants C and D, with work happening here.

Initiation Part 2 (C vs. D)

 * Summary
   * Variant C
     * Desktop: suggested edits is pre-initiated and takes up half of the screen. Onboarding is optional through a popup.
     * Mobile: suggested edits is a task card shown directly on the homepage. Onboarding is optional through a popup.
   * Variant D
     * Desktop: the suggested edits module takes up half of the screen, but contains mandatory onboarding materials inside before displaying suggestions.
     * Mobile: suggested edits is a call-to-action card that leads to mandatory onboarding.
 * Hypothesis: making the module much more prominent will lead to more interaction and edits in both variants, but we're not sure which variant will be the winner.
 * Result: both variants led to more interaction and more edits than Variants A and B. Variant D led to more edits on desktop, and Variant C led to more edits on mobile.
   * On both desktop and mobile, Variant C led many more users to interact with the suggested edits module. This makes sense because in that variant, the module is active and visible upon arrival.
   * Those higher levels of initial interaction for Variant C cascaded down to higher levels of navigation and clicking on tasks on both platforms.
   * When it comes to edits being saved, the results are mixed:
     * On desktop, Variant D leads to 2.5% of account creators saving a suggested edit, whereas Variant C leads to 2.2%.
     * On mobile, it's the other way around, with Variant C leading to 2.0% of account creators saving a suggested edit, whereas Variant D leads to 1.7%.
 * Next steps: we'll deploy the winning variant on each platform to all newcomers via T272754. All desktop users will get Variant D, and all mobile users will get Variant C. We'll revisit these learnings when we work on the next set of variants.
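The comparisons above (e.g. 2.5% vs. 2.2% of account creators saving an edit on desktop, or rates that are "statistically the same") come down to comparing two proportions between experiment groups. As a rough illustration only — the group sizes below are invented for the example, not the experiment's real denominators, and this page does not state which test the team actually used — such a comparison can be sketched with a standard two-proportion z-test:

```python
import math

def two_proportion_z_test(successes_a, n_a, successes_b, n_b):
    """Two-sided z-test for a difference between two proportions."""
    p_a = successes_a / n_a
    p_b = successes_b / n_b
    # Pooled rate under the null hypothesis that both groups share one rate.
    pooled = (successes_a + successes_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF (via math.erf).
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical groups of 10,000 account creators each, at 2.5% vs. 2.2%.
z, p = two_proportion_z_test(250, 10_000, 220, 10_000)
print(f"z = {z:.2f}, p = {p:.3f}")
```

With these made-up sample sizes the 0.3-point gap is not significant at the usual 0.05 level, which shows why the real group sizes matter when reading differences like those above.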