Growth/Positive reinforcement/es

This page describes the work around the concept of "positive reinforcement" as part of the Growth features. It contains the main elements, designs, open questions, and decisions.

Most progress updates will be posted on the general Growth team updates page, with some major or detailed updates posted here.



Current status

 * 2021-03-01: project page created
 * 2022-02-25: project kickoff and discussions with the team
 * 2022-03-01: project page expanded
 * 2022-05-11: community discussion
 * 2022-08-12: user testing completed
 * 2022-11-24: current designs, experiment plan, and metrics added
 * 2022-12-01: new Impact module available on the pilot wikis
 * 2023-02-07: started work on the Leveling up and Personalized praise projects, along with a second round of community discussion
 * 2023-02-14: published the Newcomer task milestone analysis that will help guide the Leveling up work
 * 2023-03-22: Leveling up features released as an A/B test at Growth pilot wikis
 * 2023-03-24: published Thanks Usage analysis
 * 2023-05-25: released Personalized praise module on Growth pilot wikis
 * Next: release Personalized praise on all Wikipedias

Summary
The Growth team has focused on building a "coherent newcomer experience" that gives newcomers the "access" they need to the elements that help them join the community of Wikipedia users. For example, with newcomer tasks we have given them access to opportunities to participate, and with the mentorship module we have given them access to mentorship. Suggested edits have gotten a larger number of newcomers to make their first edits. Building on that success, we want to take steps to encourage newcomers to keep making more edits. This draws our attention to an undeveloped element that newcomers need access to: performance evaluation. We call this project "positive reinforcement".

We want newcomers to understand that there is progression and value in sustained contributions to Wikipedia, increasing the retention of those users who took the first step of making an edit.

Our big question is: how can we encourage newcomers who have visited our homepage and tried our features to keep editing and build on that initial momentum?

Background
When the newcomer homepage was deployed in 2019, it contained a basic "impact module" that listed the pageview counts of the pages the newcomer had edited. That is the only part of the Growth features that gives newcomers a sense of their impact, and we have not improved it since it was first implemented. With this as a starting point, we have gathered some important learnings about positive reinforcement:


 * Community members have given us positive feedback about the module, and experienced editors say it is interesting and valuable to them.
 * Recognition from other users has been shown to increase retention, as with "thanks" (here and here) and in an experiment on German Wikipedia. We believe this reinforcement coming from real people would be more effective than automated messages from the system.
 * Community members have explained that it is a priority for newcomers to move on to more substantial tasks after starting with the easier ones, rather than getting stuck doing only easy tasks.
 * Other platforms, such as Google, Duolingo, and GitHub, use many positive reinforcement mechanisms such as badges and goals.
 * Communities are wary of incentivizing unhealthy editing. We have seen that editing contests offering cash prizes, or simply having useful user rights such as "extended confirmed" depend on edit counts, can incentivize people to make many problematic edits.

User persona
There are many parts of the newcomer journey where we could try to increase retention. We could focus on newcomers who have stopped editing after one or a few edits, or we could look at newcomers who have stopped editing after weeks of activity. For this project, we have decided to focus on newcomers who have completed their first editing experience and whom we want to return for a second session. The diagram shows them with a yellow star.

We want to focus on newcomers at this stage because it is the next step in the editor funnel where we can help improve retention. It is also where we currently see a very significant drop-off rate, so if we can help retain newcomers at this point, it should have a significant impact on editor growth over time.



Research and design
We researched the various mechanisms that have been used to encourage people to contribute content to products on- and off-wiki. Some of the main findings of this research are below:


 * Wikipedia editors' motivations are varied and change over time and with experience. New editors tend to be driven more by curiosity and social connection than by ideology.
 * Internal projects focus on intrinsic incentives, appeal to altruistic motivations, and are not applied systematically.
 * Broadening motivations beyond the ideological can improve the diversity of retained Wikipedia editors.
 * Positive messages from experienced users and mentors have proven effective for short-term retention.

For a summary of the current design ideas around positive reinforcement, see this Design brief. Our designs will continue to evolve based on community feedback and several rounds of user testing.

Ideas
We have three main ideas for positive reinforcement. We may pursue several of these ideas as we work through this project.

Impact

 * Impact: A revamp of the Impact module that adds statistics, graphs, and other information about contributions. The revised Impact module would give new editors more context about their impact, as well as encourage them to keep contributing. Areas of exploration include:
 * A suggested edits milestone, to encourage users to try suggested edits.
 * Statistics on how much the user has edited over time (similar to what is available in X Tools).
 * A "thanks received" count, to highlight the ability to receive recognition from the community.
 * Recent editing activity: show the consecutive days a newcomer has edited ("streaks") to encourage continued participation or remind people to restart their contributions.
 * Reading activity over time on the articles a newcomer has edited (similar to the information at Wikipedia:Pageview_statistics).



Leveling up

 * Leveling up: It is important to communities that newcomers progress toward more valuable tasks. For those completing many easy tasks, we want to encourage them to try more complex ones. This could happen after completing a certain number of easy tasks, or through a prompt on their homepage. Areas of exploration include:
 * Post-edit success messages that motivate newcomers to make more edits at the same or a different difficulty level.
 * The suggested edits module proposing more difficult edits, so newcomers can become more skilled editors.
 * A milestone counter or awards area in the Impact module.
 * A new homepage module with set challenges that earn a reward (badge/certificate).
 * Notifications prompting newcomers to try a more difficult task.



Personalized praise

 * Personalized praise: research shows that recognition and encouragement from other users increase newcomer retention. We want to explore how to encourage experienced users to thank and reward newcomers for their good contributions. Mentors might be encouraged to do this from their mentor dashboards or through notifications. We can use existing communication mechanisms that previous research has shown to have some degree of positive effect. Areas of exploration include:
 * A personal message from the newcomer's mentor, displayed on the homepage.
 * An Echo notification from the mentor or the Wikimedia Growth team.
 * A "thanks" for a specific edit.
 * A new milestone badge awarded by the mentor or the Wikimedia Growth team for a specific edit.



Community discussion
We discussed the Positive Reinforcement project with community members at ar:ويكيبيديا:مشروع فريق النمو (التعزيز الإيجابي), bn:উইকিপিডিয়া:আলোচনাসভা, cs:Diskuse k Wikipedii:Zkušenosti nových wikipedistů/Pozitivní posílení, fr:Discussion Projet:Aide et accueil/Volontaires, and here on mediawiki.org.

We received direct feedback on the three main ideas, along with many other ideas for improving the retention of new editors.

Below is a summary of the main themes raised, along with how we plan to iterate based on the feedback received.

Impact


Personalized praise


Other ideas:
Community members suggested other ideas to improve newcomer engagement and retention. We think all of these are valuable (some we are already exploring or want to work on in the future), but the following would not fit within the scope of the current project:
 * Sending welcome emails to newcomers (the Growth team is exploring engagement emails in collaboration with the Marketing and Fundraising teams).
 * Introducing newcomers to WikiProjects related to their interests.
 * Include a customizable widget on the newcomer homepage to allow wikis to promote certain newcomer tasks or events.
 * Send notifications to users who welcome newcomers once the newcomer reaches certain editing milestones (to help prompt the user to offer Thanks or Wikilove).



Second community consultation
In February 2023, we completed a community consultation in which we reviewed the most recent Leveling up designs with the Growth pilot wikis. This consultation was completed in English on MediaWiki, and at Arabic Wikipedia, Bengali Wikipedia, Czech Wikipedia, and Spanish Wikipedia. (T328356) In general, feedback was quite positive. These two tasks help address feedback mentioned by those who responded to our questions:


 * Leveling up: Community configuration (T328386)
 * Leveling up: Second design iteration of "Try a new task" dialog (T330543)

In March 2023, we completed a community consultation in which we reviewed the most recent Personalized praise designs with the Growth Pilot wikis. This consultation was completed on English Wikipedia, Arabic Wikipedia, Bengali Wikipedia, Czech Wikipedia, French Wikipedia, Spanish Wikipedia, and at MediaWiki in English. (T328356) Most feedback was supportive of Personalized praise features, but several further improvements were requested. We've created Phabricator tasks to address these further improvements.


 * On Arabic Wikipedia, and other wikis with Flagged Revisions, mentors want to see not only the number of edits a user had completed, but more details on the review status of edits (T333035)
 * Mentors want to be able to view the number or percentage of reverts their mentee has, and customize how many reverts a newcomer can have to be considered praiseworthy (T333036)
 * Mentors would appreciate knowing which edit a mentee is Thanked for (T51087)



User testing
Along with community conversations, we wanted to validate and expand our initial designs and hypotheses by testing the designs with readers and editors from several countries. Our design research team therefore conducted Positive Reinforcement user testing aimed at better understanding the project's impact on newcomer contribution across several different languages.

We tested several static Positive Reinforcement designs with Wikipedia readers and editors in Arabic, Spanish, and English. Along with testing Positive Reinforcement designs we introduced data visualizations from xtools as a way to better understand how these data visualizations are perceived by newcomers.



User testing results

 *  Make impact data actionable:  Impact data was a compelling feature for participants with more experience editing, which several related to their interest in data—an unsurprising quality for a Wikipedian. For those new to editing, impact data, beyond views and basic editing activity, may be more compelling if linked to goal-setting and optimizing impact.
 *  Evaluate the ideal editing interval:  Across features, daily intervals seemed likely to be overly ambitious for new and casual editors. Participants also reflected on ignoring similar mechanisms on other platforms when they were unrealistic. Consider consulting usage analytics to identify “natural” intervals for new and casual editors to make goals more attainable.
 *  Ensure credibility of assessments:  Novice editor participants were interested in the assurance of their skills and progress that the quality score, article assessment, and badges offer. Some hoped that badges could lend credibility to their work when reviewed by more experienced editors. With that potential, it could be valuable to verify that the assessments are meaningful measures of skill and to further explore how best to leverage them to build community trust in newcomers.
 *  Reward quality and collaboration over quantity:  Both editor and reader participants from esWiki were more interested in recognition of their knowledge or expertise (quality) than the number of edits they have made (quantity). Similarly, some Arabic and English editors are motivated by their professional interests and skill development to edit. Orienting goals and rewards to other indicators of skilled edits, such as adding references or topical contributions, and collaboration or community involvement may also help mitigate concerns about competition overtaking collaboration.
 *  Prioritize human recognition:  While scores and badges via Growth tasks are potentially valued, recognition from other editors appears to be more motivational. Features that promote giving, receiving, and revisiting thanks seemed most compelling, and editors may benefit from being able to select the impact data that demonstrates the engagement with readers or editors most compelling to them.
 *  Experiment with playfulness of designs:  While some positive reinforcement features can be seen as the product of “gamification”, some participants (primarily from EsWiki) felt that simple, fun designs were overly childish or playful for the seriousness of Wikipedia. Consider experimenting with visual designs that vary in levels of playfulness to evaluate broader reactions to “fun” on Wikipedia.

Design
Below are the current designs for Positive Reinforcement. We have refined the three main ideas outlined above, but the scope of plans and the actual designs have evolved based on feedback from community discussions and user testing.

Impact
The updated Impact module gives new editors more context about their impact. The new design includes much more personalized information and more data visualizations than the previous design. It is quite similar to the design we shared earlier when we discussed this feature with the communities. You can follow the current engineering progress on the beta wiki, and we expect to release this tool to the Growth pilot wikis soon.

Leveling up
The Leveling up features focus on encouraging newcomers to progress to more valuable tasks. Ideas also include some prompts for new editors to try suggested edits, since structured tasks have been shown to improve newcomer activation and retention.
 * “Level up” post-edit dialog message: A new post-edit dialog message type is added to encourage newcomers to try a new task type. We hope this will encourage some users to learn new editing skills as they progress to different, more challenging tasks.
 * Post-edit dialog for non-suggested edits: Introduce newcomers who complete ‘normal’ edits to suggested edits. We plan to experiment by showing newcomers a prompt after their 3rd and 7th edits. Desktop users who click through to try a suggested edit will also see their Impact module, which we hope helps engage newcomers and provides a small degree of automated positive reinforcement. We will carefully measure this experiment and ensure there aren't any unintentional negative effects.
 * New notifications: New Echo notifications to encourage newcomers to start or continue suggested edits. These act as a stand-in for "win-back" emails for users who have an email address set and have enabled email notifications.
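The post-edit prompt described above (shown after the 3rd and 7th non-suggested edit) can be sketched roughly as follows. The function name, parameters, and thresholds are illustrative assumptions for this page, not the actual GrowthExperiments implementation:

```python
# Hypothetical sketch of the post-edit prompt logic: newcomers who
# complete 'normal' (non-suggested) edits see a dialog inviting them
# to try suggested edits after their 3rd and 7th edit.

PROMPT_EDIT_COUNTS = {3, 7}  # show the dialog after these edit counts

def should_show_suggested_edits_prompt(edit_count: int, was_suggested_edit: bool) -> bool:
    """Return True if the post-edit dialog should invite the newcomer
    to try suggested edits after this edit."""
    if was_suggested_edit:
        # Suggested edits already show their own post-edit dialog.
        return False
    return edit_count in PROMPT_EDIT_COUNTS
```

A prompt shown only at fixed milestones keeps the nudge rare enough that it does not compete with the regular post-edit experience.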



Personalized praise
The Personalized praise features build on research findings showing that encouragement and thanks from other people increase editor retention.
 * Motivating mentors: We will add a new module to the Mentor dashboard, designed to encourage mentors to send personalized messages to newcomers who meet certain criteria. Mentors will be able to customize and control the criteria by which their "praiseworthy" mentees are surfaced, and how and when they are shown.
 * Increasing thanks on the wikis: We plan to fulfill the Community Wishlist request to enable the Thanks button by default on watchlists and recent changes (T51541, T90404). We hope this will increase thanks and positivity on the wikis, and that newcomers will benefit directly or indirectly.



Hypotheses
The Positive Reinforcement features aim to provide or improve the tools available to newcomers and mentors in three specific areas that will be described in more detail below. Our hypothesis is that once a newcomer has made a contribution (say by making a structured task edit), these features will help create a positive feedback cycle that increases newcomer motivation.

Below are the specific hypotheses that we seek to validate across the newcomer population. We will also have hypotheses for each of the three sets of features that the team plans to develop. These hypotheses drive the specifics for what data we will collect and how we will analyse that data.


 * 1) The Positive Reinforcement features increase our core metrics of retention and productivity.
 * 2) Since the Positive Reinforcement features do not feature a call to action that asks newcomers to make edits, we will see no difference in our activation core metric.
 * 3) Newcomers who get the Positive Reinforcement features are able to determine that making un-reverted edits is desirable, and we will see a decrease in the proportion of reverted edits.
 * 4) The positive feedback cycle created by the Positive Reinforcement features will lead to a significantly higher proportion of "highly active" newcomers.
 * 5) The Positive Reinforcement features increase the number of Daily Active Users of Suggested edits.
 * 6) The average number of edit sessions during the newcomer period (first 15 days) increases.
 * 7) "Personalized praise" will increase mentors' proactive communication with their mentees, which will lead to an increase in retention and productivity.

Experiment plan
As we have done for previous Growth team projects, we want to test our hypotheses through controlled experiments (also called "A/B tests"). This allows us to establish a causal relationship (e.g. "The Leveling Up features cause an increase in retention of xx%"), and to detect smaller effects than if we were to give the features to everyone and analyze the effects pre/post deployment.

In this controlled experiment, a randomly selected half of users will get access to Positive Reinforcement features (the "treatment" group), and the other randomly selected half will instead get the current (September 2022) Growth feature experience (the "control" group). In previous experiments, the control group has not gotten access to the Growth features. The team has decided to move away from that (T320876), which means that the current set of features is the new baseline for a control group.
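One common way to implement the 50/50 split described above is deterministic bucketing by user ID, sketched below. This is only an illustration of the technique; the Growth features assign users server-side at registration, and the experiment name here is a placeholder:

```python
# Illustrative sketch of 50/50 treatment/control assignment via hashing.
# Hashing a stable key gives a deterministic, reproducible split without
# storing a per-user assignment table.
import hashlib

def assign_group(user_id: int, experiment: str = "positive-reinforcement") -> str:
    """Deterministically place a user in 'treatment' or 'control'."""
    key = f"{experiment}:{user_id}".encode()
    bucket = int(hashlib.sha256(key).hexdigest(), 16) % 2
    return "treatment" if bucket == 0 else "control"
```

Because the assignment depends only on the experiment name and user ID, a user always lands in the same group, and a new experiment name reshuffles users independently of previous experiments.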

The Personalized Praise feature is focused on mentors. There is a limited number of mentors on every wiki, whereas when it comes to newcomers the number increases steadily every day as new users register on the wikis. While we could run experiments with the mentors, we are likely to run into two key challenges. First, the limited number of mentors could mean that the experiments would need to run for a long time. Second, and more importantly, mentors are well integrated into the community and communicate with each other, meaning they are likely to figure out if some have access to features that others do not. We will therefore give the Personalized Praise features to all mentors and examine activity and effects on newcomers pre/post deployment in order to understand the feature’s effectiveness.

In summary, this means we are looking to run two consecutive experiments with the Impact and Leveling up features, followed by a deployment of the Personalized Praise features to all mentors. These experiments will first run on the pilot wikis. We can extend this to additional wikis if we find a need to do that, but it would only happen after we have analyzed the leading indicators and found no concerns.

Each experiment will run for approximately one month, and for each experiment we will have an accompanying set of leading indicators that we will analyze two weeks after deployment. The list below shows what the planned experiments will be:


 * 1) Impact: treatment group gets the updated Impact module.
 * 2) Leveling up: treatment group gets both the updated Impact module and the Leveling up features.
 * 3) Personalized praise: all mentors get the Personalized praise features.

Leading indicators and plan of action
While we believe that the features we develop are not detrimental to the wiki communities, we want to be careful when experimenting with them. It is good practice to define a set of leading indicators together with plans for what action to take if a leading indicator suggests something isn't going the way it should. We have done this for all our past experiments, and we do so again for the experiments we plan to run as part of this project.

Impact
 Impact module interactions:  We find that the proportion of newcomers who interact with the old module (6.1%) is significantly higher than for the new module (5%): $$\chi^2 = 17.5, df = 1, p \ll 0.001$$ This difference showed up early on in the experiment, and we have examined the data more closely to understand what is happening. One issue we identified early on was that not all interaction events were instrumented, which we subsequently resolved. Examining further, we find that many of those who get the old module click on links to the articles or the pageviews. In the new module, a graph of the pageviews is available, thus removing some of the need for visiting the pageview tool. As a result, we decided that no changes were needed.
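For readers unfamiliar with the test used above, the chi-square statistic for comparing two proportions can be computed by hand from a 2x2 contingency table. The interaction counts below are hypothetical (only the proportions, 6.1% vs. 5%, are reported above); with roughly 20,000 newcomers per group the statistic lands in the same highly significant range, well above the df = 1 critical value of about 10.83 for p = 0.001:

```python
# Pearson chi-square (no continuity correction) for a 2x2 table,
# comparing "interacted" vs "did not interact" across two groups.

def chi_square_2x2(a, b, c, d):
    """Chi-square statistic for the table [[a, b], [c, d]], df = 1."""
    n = a + b + c + d
    row_sums = (a + b, c + d)
    col_sums = (a + c, b + d)
    chi2 = 0.0
    for i, row in enumerate(((a, b), (c, d))):
        for j, observed in enumerate(row):
            expected = row_sums[i] * col_sums[j] / n
            chi2 += (observed - expected) ** 2 / expected
    return chi2

# Hypothetical counts: old module 1220/20000 interacted (6.1%),
# new module 1000/20000 interacted (5.0%).
chi2 = chi_square_2x2(1220, 18780, 1000, 19000)
```

Here `chi2` comes out near 23, comfortably past the p < 0.001 threshold, consistent in direction with the reported result.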

 Mentor module interactions:  We find no significant difference in the proportion of newcomers who interact with the Mentor module. The proportion for newcomers who get the old module is 2.4%, for those who get the new module it's 2.2%. A Chi-square test finds this difference not significant: $$\chi^2 = 1.5, df = 1, p = 0.219$$

 Mentor module questions:  We do not see a substantial difference in the number of questions asked between the old module (269 edits) and the new module (281 edits). The proportion of newcomers who ask their mentor a question is also the same for both groups, at 1.5%.

 Edits and revert rate:  We do not see a substantial difference in the number of edits nor in the revert rate between the two groups measured on a per-user average basis. There are differences between the groups, but these are driven by some highly prolific editors, particularly on the mobile platform.

Levelling up
 Levelling up post-edit dialog interactions:  We find a higher proportion of newcomers interacting with the post-edit dialog in the Levelling Up group (90.8%) compared to the standard post-edit dialog (86.5%). This is largely driven by mobile where the Levelling Up interaction proportion (88%) is a lot higher than the other group (81.6%). The proportion is still higher for the Levelling Up group on desktop (93.6%) compared to the control (92.2%), but we regard it as "virtually identical" because the high proportion in the control group means there is little room for an increase.

 Try a suggested edit click through rates:  21.9% of newcomers who see the "Try a suggested edit" post-edit dialog choose to click through, which is significantly higher than the threshold we set. The proportion is higher on desktop (24%) than on mobile (19.7%), but in neither case is there a reason for concern.

 Increase your skill level click through rates:  We find that 73.1% of newcomers who see the "increase your skill level" dialog click through to see the new task, which is a lot higher than our expected threshold of less than 10%. Proportions are high on both desktop (71.1%) and mobile (77.3%).

 Get started click through rates:  3.8% of newcomers who get the "Get started" notification click through to the Homepage. Users who registered on desktop are more likely to click the notification (5.5%) than those on mobile (2.5%). Because the 5% threshold is not met, we are investigating further to understand this difference between desktop and mobile behaviour, particularly to understand whether our 5% threshold is reasonable.

 Keep going click through rates:  We find that 9.6% of users who get the "Keep going" notification click through to the Homepage. As with the "Get started" notifications, we find a much higher proportion on desktop (16.2%) compared to mobile (4.7%). Our investigations into differences in notification behaviour by platform will hopefully give us more insight into this difference.

 Activation:  We find a decrease in constructive article activation (making a non-reverted article edit within 24 hours of registration): 27% compared to 27.7%. As soon as we noticed this we opened T334411 to investigate, focusing on patterns in geography (countries and wikis) and technology (devices and browsers). We did not find clear patterns explaining the issue. This decrease in activation will be investigated further: T337320.

Personalized praise
Data was gathered on 2023-06-13, from the four pilot wikis where the feature is deployed (Arabic Wikipedia, Bengali Wikipedia, Czech Wikipedia, and Spanish Wikipedia).

Personalized praise notification click through: Although this is still a relatively small sample, results seem healthy and show that Mentors are indeed receiving notifications and clicking through to view their praise-worthy mentees.

Personalized praise mentor dashboard module click through: Only 27.5% of Mentors are clicking through to a mentee's talk page, however it's to be expected that some of the mentees we are surfacing aren't deserving of praise. Based on this data and feedback from Mentors, the Growth team will pursue the following tasks to help improve this feature:


 * Add revert scorecard to Personalized praise module on Mentor dashboard (T337510)
 * Exclude blocked accounts from the Personalized praise suggestions (T338525)

Experiment Results
Many of the experiments that the Growth team runs will focus on the same set of key metrics (commonly referred to as KPIs), and this includes all of the Positive Reinforcement experiments. The key metrics are defined as follows:


 * Constructive activation is defined as a newcomer making their first edit within 24 hours of registration, and that edit not being reverted within 48 hours.
 * Activation is defined in the same way as constructive activation, but without the non-revert requirement.
 * Constructive retention is defined as a newcomer coming back on a different day in the two weeks after constructive activation and making another edit, with said edit also not being reverted within 48 hours.
 * Retention is defined in the same way as constructive retention, but without the non-revert requirements.
 * Constructive edit volume is the overall count of edits made in a user's first two weeks, with edits that were reverted within 48 hours removed.
 * Revert rate is the proportion of edits that were reverted within 48 hours out of all edits made. This is by definition 0% for users who made no edits, and we generally exclude these users from the analysis.

Impact module experiment results
We found a significant decrease in constructive activation for newcomers who registered on mobile web and got the New Impact module. There was no difference in activation for newcomers who registered on desktop. This was quite surprising as the empty state for the old Impact module was nearly identical to the empty state of the new Impact module.

First-day activity correlates strongly with later activity, and as a result we also found a significant decrease in edit volume for mobile web users. Again, there was no difference for desktop users.

We found no difference in retention rates and revert rates. While there are features in the New Impact module that focus on staying active and making good contributions, such as the number of thanks received and the streak counter, we often do not see significant impacts on metrics unless there's a clear call to action or we are able to isolate a specific subgroup motivated by the feature.

As soon as we learned about the decrease in activation we started investigations into probable causes of this in T330614. Unfortunately we could not identify a specific reason and we also found that the issue was not replicated in another dataset. We decided to add activation as a leading indicator to the Levelling Up experiment so that we could take action more quickly. When we noticed that the issue persisted, we started a new investigation in T334411 and created an "epic" task that connects all relevant subtasks: T342150. We restarted experiment data collection after making several small changes, and we now see that activation is identical between the experiment and control group, which is what we would expect.

Although we are pleased that we have received positive feedback from new editors regarding the new Impact module, we have found that the Impact module alone hasn't resulted in significant changes in newcomer retention, edit volume, or revert rates. Our next experiment will combine the new Impact module with the Leveling up features. We hope that this combination of Positive Reinforcement features will lead to substantial improvements in activation, retention, and edit volume. We will soon publish a detailed report that highlights the outcomes of this experiment.