Growth/Personalized first day/Newcomer homepage/Measurement and experiment plan

The goal of the Homepage is to provide users, particularly those who are newly registered, with easy access to things like help resources, suggestions for work they can do, and information that highlights the lively activity of the Wikipedia community and the impact of the work being done. This type of information can help lower the barrier to making constructive contributions to the encyclopedia, thereby increasing editor activation (the proportion of new users who go on to make edits). It can also potentially increase editor retention (the proportion of new users who return to make additional edits), which is the overarching goal of the Growth Team.

This page documents our plans for running a controlled experiment with the Homepage and what measurements we will use to learn how users are interacting with it.

Research questions
The research questions are the higher-level questions we want answered. By “homepage treatment” we mean that a user both has the homepage on by default and receives the various ancillary features that drive them to the homepage, which may include banners, talk page messages, or links on the Main Page. In the initial version, the only ancillary feature will be the fact that the username link in personal tools will go to the homepage.


 * Does receiving the homepage treatment increase editor activation?
 * Does receiving the homepage treatment increase editor retention?
 * Does having access to the homepage treatment increase the average number of edits in the first two weeks after registration?
 * Does having access to the homepage increase the proportion of constructive edits?
   * In this context, “constructive edits” means edits that are not reverted. As the Growth Team is focused on new user retention, we plan to measure this both overall and by user experience level (e.g. tenure in days, number of edits, or combinations of these).
 * Do users who have access to the homepage go to, and return to, the homepage? What patterns do we see in homepage visits?
 * Which modules engage newcomers, and to what extent? Do modules that ask users to take action lead to users taking those actions?
 * Are we able to effectively personalize the homepage?
   * First step: do users behave in correspondence with their responses to the Welcome Survey? For example, do users who say they are interested in being contacted to get help with edits use the mentor module to reach out to a mentor, and do users who say they are interested in creating an article use the modules that help them learn more about how to do that?
   * Second step: once we can personalize the homepage, do we see increased engagement for users whose homepages are personalized?
 * When the homepage is customizable, to what extent do users avail themselves of that capability?
 * If/when the effort exists to survey newcomers about the homepage, what can we learn qualitatively? Such a qualitative study could be done through Quick Surveys, or through a homepage module that simply asks about the rest of the homepage.
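
To make the activation and retention questions concrete, here is a minimal sketch of how the two metrics could be computed from a user's registration time and edit timestamps. The window lengths (one day for activation, two weeks for the return period) are illustrative assumptions for the example, not the team's exact operationalization.

```python
from datetime import datetime, timedelta

def is_activated(registration: datetime, edits: list,
                 window_days: int = 1) -> bool:
    """Activation: the user made at least one edit in the window after registering."""
    cutoff = registration + timedelta(days=window_days)
    return any(registration <= e < cutoff for e in edits)

def is_retained(registration: datetime, edits: list,
                activation_days: int = 1, return_days: int = 15) -> bool:
    """Retention: an activated user came back to edit again in a later period."""
    activation_end = registration + timedelta(days=activation_days)
    return_end = registration + timedelta(days=return_days)
    # Both conditions must hold: an edit in the activation window,
    # and another edit in the later return window.
    activated = any(registration <= e < activation_end for e in edits)
    returned = any(activation_end <= e < return_end for e in edits)
    return activated and returned
```

Aggregating these booleans over all users in each experimental bucket would yield the activation and retention proportions the research questions compare.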

Controlled experiment
In order to understand the Homepage’s impact on editor activation and retention, we propose an experiment that will run for at least six months. During that time, 50% of new registrations on the target wikis (currently Czech, Korean, and Vietnamese) will have the homepage enabled by default, and 50% will have it disabled. We will most likely be running other experiments on the target wikis at the same time (e.g. variants of our Help Panel). If those experiments require stratified sampling, we will adjust our sampling strategies as necessary.
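
A 50/50 split at registration could be implemented with deterministic hash-based bucketing, as in this sketch. The salt and bucket names are hypothetical, not the actual configuration.

```python
import hashlib

# Hypothetical experiment salt; changing it reshuffles all assignments.
EXPERIMENT_SALT = "growth-homepage-experiment"

def homepage_bucket(user_id: int) -> str:
    """Assign a newly registered user to treatment or control.

    Hashing (salt + user_id) makes the assignment deterministic, so a
    user lands in the same bucket every time it is recomputed, and the
    hash output splits the population roughly 50/50.
    """
    digest = hashlib.sha256(f"{EXPERIMENT_SALT}:{user_id}".encode()).hexdigest()
    return "homepage-enabled" if int(digest, 16) % 2 == 0 else "homepage-disabled"
```

If stratified sampling is needed for overlapping experiments, the modulus and bucket boundaries can be adjusted per stratum rather than applied uniformly.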

While the experiment is running, we expect to also be testing variants of the Homepage to understand how specific interface elements affect user behavior. We will keep in mind that the stronger our hypothesis that an altered interface affects activation or retention, the more a test of that interface confounds our longer-term experiment on activation and retention.

Measurements
These measurements are driven by our research questions and focus on specific actions taken by users who have the ability to access the homepage and interact with its modules. The list below contains measurements for the four modules present on the Homepage at time of launch: Start, Help, Impact, and Mentor. When we develop new modules for the Homepage, we will also develop similar measurement plans for those.


 * General
   * What percent of users access the homepage?
   * What percent of those users access the homepage multiple times?
     * When users access the homepage multiple times, over what timeline does that happen?
   * Are there differences between mobile and desktop users?
   * How soon after creating their account does a user first visit the homepage?
   * How do users get to the homepage?
     * By clicking on their username in the top navigation?
     * By clicking on the tab from their User or User talk page?
     * Via another driver, such as a banner on wiki or a link in an email?
     * If the user clicked on a link to the Homepage from the top navigation or a tab, we want to capture the context they were in: specifically, what namespace they were in when they clicked it, and whether they were reading or editing.
   * How much time do users spend interacting with homepage modules per visit?
     * During their first visit?
     * During subsequent visits?
   * What percent of users follow links from at least one module?
   * Are users who have access to the homepage more or less likely to create a user page?
   * Are users who have access to the homepage more or less likely to interact with other users through user talk pages?
   * Are users who have access to the homepage more or less likely to interact with the community through article and project talk pages?
 * Modules
   * General
     * How often is each module clicked on? What proportion of users click on each one?
     * How many modules do users tend to interact with (where “interact” means any click)?
     * Which modules tend to be interacted with repeatedly?
     * Which modules are interacted with most by those users who are activated and retained (acknowledging that the causality may go in either direction)?
     * How long do desktop users hover over each module? (The analogous measurement for mobile depends on the pending mobile design.)
     * Does the placement of a module on the page appear to influence interactions?
     * What is the impact on retention of completing the call-to-action on each module?
       * For example: what percentage of users who contact their mentor are retained?
   * Specific modules
     * Help module
       * What percent of users click one or more of the help links?
         * If so, which link(s) do they click?
       * What percent of users use the search functionality?
         * 1: Focus on the search box
         * 2: Submit a search, number of characters in the search
         * 3: Click a result, number of results, ranking of the clicked result
       * How far do users go in the workflow of asking a question on the Help Desk?
         * 1: Click to ask a question
         * 2: Type a question
         * 3: Submit
         * 4: Click a link in the confirmation dialog
       * What is the average length of questions asked?
       * What percent of users who post a question appear to return to view the answer, by doing any of the following?
         * Clicking the Echo notification saying they have been pinged in the response.
         * Clicking on the link in the “Your recent questions” list.
         * Responding to the answer on the Help Desk page.
       * What percent of newcomers who had no email address add one when asking a question?
       * What percent of newcomers confirm their email address (within 48 hours) after asking a question?
       * What percent of newcomers ask a question without an email address?
       * What percent of users who asked at least one question see one or more archived questions when they view the homepage?
       * What percent of users who click on one of the links in the “Your recent questions” list clicked on a link to a question that was archived?
     * Mentorship module
       * What percent click on a link to learn more about the mentor?
       * How far do users go in the workflow of asking their mentor a question?
         * 1: Click to ask a question
         * 2: Type a question
         * 3: Submit
         * 4: Click a link in the confirmation dialog
       * What is the average length of questions asked?
       * (Measured by hand) What percent of users who post a question receive an answer from their mentor?
         * Do they get an answer from someone who is not their mentor?
       * What percent of users who post a question appear to return to view the answer, by doing any of the following?
         * Clicking the Echo notification saying they have been pinged in the response.
         * Clicking on the link in the “Your recent questions” list.
         * Responding to the answer on the talk page.
       * (Measured by hand) What percent of newcomers who ask a question post a second time on their mentor’s talk page?
         * How often does the first question become a conversation?
         * How often is a second question asked?
         * How often do conversations move beyond transactional wiki talk?
       * What percent click on one of the links to view their question after they’ve asked it?
       * What percent of newcomers who had no email address add one when asking a question?
       * What percent of newcomers confirm their email address (within 48 hours) after asking a question?
       * What percent of newcomers ask a question without an email address?
       * Are newcomers more likely to ask questions of mentors who have a high edit count and a short time since last edit than of mentors with lower edit counts and/or a longer time since last edit?
       * What percent of users who asked at least one question see one or more archived questions when they view the homepage?
       * What percent of users who click on one of the links in the “Your recent questions” list clicked on a link to a question that was archived?
     * Impact module
       * What percent click on a link when the module is in its “unactivated state” (when the user has no edits to articles)?
       * What percent click to view an article?
       * What percent click to view the pageviews analysis tool?
       * What percent click to view all their contributions?
       * How often do users return to open the pageviews analysis tool multiple times?
     * Start module
       * What percent of users who had no email address add an email address through this module?
       * What percent of users confirm their email address through this module?
       * What percent of users click the button for the tutorial?
       * What percent of users click the button to start their user page, and what percent of them actually save a user page?
       * What percent of users click the link to learn more about creating a user page?

Leading indicators and plans of action
The duration of the A/B test is at least six months because it is impossible to detect changes in new editor retention on mid-size wikis in less time than that (unless we drastically impact retention, which we see as somewhat unlikely). While we wait for our results, we want to be able to take action if we suspect that something is amiss. Below, we sketch out a set of scenarios based on the data described in the instrumentation strategy above, and for each scenario we propose a plan of action to remedy the situation.
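
The six-month duration is ultimately a statistical power question: small retention effects need many registrations to detect. As a back-of-the-envelope sketch (the baseline retention rate and effect size below are illustrative, not measured values), the standard two-proportion z-test approximation gives the registrations needed per experimental arm:

```python
import math
from statistics import NormalDist

def required_n_per_arm(p_control: float, p_treatment: float,
                       alpha: float = 0.05, power: float = 0.8) -> int:
    """Sample size per arm for a two-proportion z-test (normal approximation)."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # two-sided significance
    z_beta = NormalDist().inv_cdf(power)
    p_bar = (p_control + p_treatment) / 2
    effect = abs(p_treatment - p_control)
    numerator = (z_alpha * math.sqrt(2 * p_bar * (1 - p_bar))
                 + z_beta * math.sqrt(p_control * (1 - p_control)
                                      + p_treatment * (1 - p_treatment))) ** 2
    return math.ceil(numerator / effect ** 2)
```

With these illustrative numbers, detecting a one-percentage-point lift from a 4% baseline retention rate requires roughly 6,700 registrations per arm, which is why mid-size wikis need months of accumulated registrations.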