Wikimedia Product/Inclusive Product Development/Draft Playbook

Background
In fall 2021 through spring 2022, a select number of teams (Growth, Community Tech, Android, Editing, Campaigns, Platform Engineering Team, Web) in Product and Technology elected to use version 1 (V1) of the Inclusive Product Development playbook. The playbook is intended to serve as just one of the tools teams use to check whether their practices welcome members of the community with diverse experiences and to ensure their processes and practices are inclusive and equitable. The Beta teams provided feedback about V1, and that input generated a draft version 2 (V2). Draft V2 underwent two rounds of feedback from 30 people, resulting in the V2 of the playbook below. V2 will be used by all teams at the Foundation that have a Product Manager for the next fiscal year. We will request that teams leave notes about the steps, and we will conduct a survey to understand how their processes and impact have changed based on V2 of the Inclusive Product Development Playbook.

Technology teams will be interviewed so that a version of the playbook can be created for teams without Product Managers.



V1 Beta Teams Test Results
In May 2022, our Beta testing teams completed a survey and turned in their copies of the playbook. The information from the surveys and playbooks was synthesized, and an external consultant shared recommendations based on this feedback.

Themes
Below are the themes shared between teams.

Time vs Deadlines
Over time it is common for teams to work toward optimization, streamlining their processes in order to hit deadlines. Common as it may be, this individualized approach often lacks resilience, and as a result of incremental adjustments, teams can become increasingly divergent in the ways they get their work done. While this isn’t always a bad thing, it can certainly present challenges when attempting broad adoption of new tools and methodologies.

Teams that introduced the Playbook earlier in the process seemed to have an easier time adding it into their workflow. Teams that were further along in the process had to pause and rework their goals in order to include the playbook’s recommended steps.

The playbook helped guide team conversations, but left teams feeling unsure about how much time to invest in specific tasks, versus moving on to the next step required to meet their larger project goal. Teams questioned the rigidity of the process, wondering if all tasks must be completed before they could move to the next phase.

There was a shared belief that the approach defined in the playbook would require much more time. For instance, defining and recruiting a representative group of participants takes time. Translating between all relevant languages takes time (and costs money). The more you focus on people and give them time to engage, the more time everything takes. This slower pacing can feel at odds with more agile ways of working, leaving people to ask what a prototype/MVP looks like with this new approach. Not all of the 111 tasks in the playbook require the same level of investment - some tasks seem deceptively simple, only to end up occupying far more time than expected. Providing more guidance around WMF’s expectations for time spent within each stage in the process would help teams more accurately scope timelines.

Clarity & Specificity
In many cases, concerns around time came back to a lack of clarity around the expectations or the specificity of the task itself. From determining which guidelines to follow, to knowing whom to ask if you got stuck, to knowing whether you’ve completed a task to expectations, all of these issues could be clearer, if not specified on a task-by-task basis.

Many of the tasks have been distilled down to their simplest form, which puts a lot of work onto the plate of team members looking to unpack the meaning of each bullet point, without always feeling certain that they’ve unpacked it correctly.

Requesting that a team consult with a group, or adhere to a best practice without some further guidance and definition leaves a lot of room for interpretation, and continues the challenge of teams developing their own divergent approach to getting the work done.

Many teams raised questions about how their work would be measured. Teams wanted to know how they would be reviewed, in order to learn and improve for next time. Some went so far as to request that the tasks be more binary or quantifiable, presumably so that they could self-assess their own performance with the playbook.

While the playbook did provide some direction for folks trying to find answers to their questions, the process of bringing someone in and getting them up to speed can feel like a significant time sink.

Recommendations
The majority of the following recommendations are designed to address the challenges outlined in the themes above. In many cases, when addressing an issue of specificity there is a direct benefit to the concerns around time requirements.

Ensure that every task in the Playbook is as specific as possible.
The more detail you can provide in the task the better; if that isn’t possible, link to the details or to a resource that makes it easier for someone to complete the task in a timely manner.

Clearly define the desired outcome for each task.
If the goal is to have every WMF project be WCAG 2.1 AAA compliant, be clear that this is how success will be determined. Checklists are best suited to yes/no questions, but if a task isn’t as simple as yes or no, help people know they’re doing the right thing by clarifying the end goal of the task.

Conduct a time audit to ensure that all tasks are achievable within a reasonable period of time.
Reviewing each of the 111 tasks in the playbook and assigning each an estimate of the time required to complete it would be helpful when thinking about each task’s impact on project timelines. A time audit will be much easier after all of the previous recommendations have been addressed. Reducing ambiguity will remove significant levels of uncertainty, which in turn reduces hesitation, debate, and the need to search for answers.

Sort the binary (quantifiable) tasks and the mindset or qualitative tasks.
This isn’t always clear; even the WCAG 2.1 checklist starts with internalizing a set of four principles. Having open-ended items (like principles) alongside more quantifiable tasks introduces ambiguity and uncertainty. How do you know when to move on to the next item if you’re not sure you’ve sufficiently or accurately internalized an ‘inclusive mindset’? A particular mindset or deeply internalized principle can be a powerful thing, and essential for guiding someone through the ambiguity of a project, but it takes time to shift your way of thinking and working, and that time rarely fits within the scope of a single project.

Clarify WMF time expectations to ensure a more equitable product design process.
If a team doesn’t know the foundation’s expectations around time, they will resort to the product priorities as they have in the past. If WMF states that a minimum of 20% of every project timeline should be dedicated to explicitly ensuring more equitable product outcomes, that prioritizes the playbook over business as usual.

Reconsider the rollout plan for the playbook.
Rather than introducing the playbook as part of the project kick-off, align the phases of the playbook to the specific project timeline or milestones. Some teams seem to have done this already; the difference would be having someone from the DEI Playbook team with them at each stage. This phased rollout would provide an opportunity for a DEI Playbook team member to be present for reflections on the previous steps, as well as to contextualize the playbook and answer questions to guide the project team into the next stage.

Provide more contextual support to the project teams.
The Wikimedia Product/Inclusive Product Development page is a useful resource, but even so, teams had questions about how best to tackle a task, or whom to ask when they got stuck. The playbook should include a list of support resources for each phase: experts’ names and contact information, links to relevant documents, a link to the Slack support channel, and the ability to @-mention the Playbook team in their document. The project team’s PM should also have a series of brief, regularly scheduled standups with a member of the DEI playbook team; this time can be used to answer team questions or to provide feedback and suggestions that help the team meet the goals defined by the playbook.

SUPPORT MATERIALS
Within the recommendations above, there are hints and references to materials that don’t exist or may not exist in a way that fully addresses the need. This section includes some of the materials that will provide more support to teams as they work through the playbook. These materials range from role assignments to digital collections.


 * Playbook Marketing Campaign
 * Playbook v2
 * Slack Support as a Community of Practice
 * Examples, Case Studies & Templates
 * Resource Library
 * Open Office Hours

STRATEGIZE
Plan for the work. Align it to the larger strategy. Do risk assessment. Articulate “why” and “for whom?”

REQUIRED

 * Involve all functions (Product Management, engineering, design, community, analytics, etc) in setting strategy
 * Welcome diverse opinions on what the plan should be by allowing time for dissent, listening to dissent, and allowing multiple channels of communication for feedback (both oral and written)
 * Specify a DACI (Driver, Approver, Contributors, Informed) for decision-making for different types of decisions
 * Ask what partners need to be brought into each phase


 * Understand the Organizational and Product Platform strategic plans and goals
 * Explicitly discuss DEI as part of the plan
 * Consider a wide variety of use cases
 * Set explicit accessibility goals at the beginning of the project
 * Be intentional about choosing which communities to engage with (have a clear why)
 * Coordinate the choice of communities with other teams
 * Ask, “Who are we leaving out?” and incorporate either outreach or checkpoints later in the process
 * Be intentional and explicit in your OKRs about who, including DEI centered OKRs
 * Get feedback on clarity of goals and strategic alignment of these goals with the team, users, expert insight, and stakeholders
 * Share plans with other teams and leadership
 * Conduct a Pre-Mortem to identify risks
 * Iterate on plan based on risks identified in Pre-mortem
 * Strategy and planning are constantly evolving through learnings; recognize that the plan may change, and discuss this with the team.
 * Check in with legal on any issues/considerations for the regions you are planning to work in.

SUGGESTED

 * Review the Community Engagement Guidelines as a team
 * Review your plans with another PM, asking them to review it for DEI objectives.
 * Incorporate user research when creating your plans
 * Plan Discovery research with risk factors from Pre-Mortem in mind
 * Use data when creating your plans
 * Have a team offsite when starting a new project--focused time is valuable.
 * Plan for reactive work in order to ensure space for proactive work.
 * Plan to use the Best Practices Playbook for every stage of the process.
 * Create the space for differing opinions--ask others “what should be different and why?”
 * Review your heuristics for choosing partner wikis
 * Ask how you will champion specific populations as you move through the development process
 * Share the context of your DEI-related goals and plans with others
 * Have a team mission that explicitly states your intentions around developing in a DEI-aware way.
 * Talk with people in historically underrepresented communities to find out what they think about existing products, services, or brands similar to yours.

DISCOVER
The goal of this phase is to identify the problems you’ll be solving for. This includes identifying users you’ll be working with, understanding the needs and goals they have and how these experiences fit into their lives.

REQUIRED

 * Identify the problems you are trying to solve--user problems and needs, product problems and inclusion problems
 * Consult with user research on the problems you are solving for
 * Provide context for any partner teams, including tech teams, that you are working with; help them understand the user experience and potential impact of the work.
 * Ask “who else?” as you define the work; be specific. E.g., how might this problem look different across different socio-economic groups? Who might not be able to use this product and why?
 * What is the WCAG 2.1 level AA guidance in this area?
 * At the end of this phase, submit your Playbook for the project to [email alias here] [1]
 * Avoid leading questions in research, review research questions with Design Researcher before conducting research
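One item above asks what the WCAG 2.1 level AA guidance is in a given area. As a concrete illustration, the most automatable piece of that guidance is the contrast minimum (Success Criterion 1.4.3: at least 4.5:1 for normal text, 3:1 for large text). The sketch below implements the relative-luminance and contrast-ratio formulas as defined in WCAG 2.1; it is a minimal helper, not a substitute for a full accessibility review.

```python
# Minimal sketch of the WCAG 2.1 contrast check (SC 1.4.3, level AA).
# The luminance and ratio formulas follow the WCAG 2.1 definitions.

def relative_luminance(rgb):
    """Relative luminance of an (r, g, b) color, each channel 0-255."""
    def channel(c):
        c = c / 255
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (channel(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    """Contrast ratio (lighter + 0.05) / (darker + 0.05), from 1:1 to 21:1."""
    l1, l2 = sorted((relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

def passes_aa(fg, bg, large_text=False):
    """AA requires >= 4.5:1 for normal text, >= 3:1 for large text."""
    return contrast_ratio(fg, bg) >= (3.0 if large_text else 4.5)

# Black text on a white background gives the maximum ratio, 21:1.
print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 1))  # → 21.0
```

A check like this can run in CI against a project’s color palette, flagging combinations that fall below the AA threshold before designs ship.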

SUGGESTED

 * Consult with external organizations/DEI experts that focus on specific populations to increase learning
 * Share insights through written documentation so we have institutional memory of our learnings
 * Share with other functions/partners/departments (marketing etc)
 * Partner with research agencies that are in the market you are serving

TEAM NOTES
[1]    The checklist in this Playbook will be our internal mechanism for tracking the kinds of practices, behaviors and activities our teams are using. We will be comparing what we learn via this reporting channel to the baseline research we did at the beginning of this process in order to track progress toward a more DEI-centered product development process.

DEFINE
The goal of this phase is to turn research, insights, etc into an addressable and achievable scope of work with a concise problem statement.

REQUIRED

 * Evaluate all of your decision heuristics for biases
 * Develop your hypotheses with your partner communities
 * Gather feedback on definitions from partner communities, including underrepresented populations
 * Use neutral pronouns or specific actor roles whenever possible in the documentation
 * Are any images used representative of different genders, cultures and backgrounds?
 * Are multiple input modes such as voice and text available if data input is part of the feature?
 * Documentation should be easily understandable to laypeople, free of jargon
 * If using AI or ML algorithms, then check training data for biases
 * Work with analytics on how to measure impact, including specific populations
 * Ensure you gather views and data from multiple sources instead of relying on a single team member to represent all people with whom they share an identity.  Avoid tokenization, including for user personas.
 * Define code coverage and any needed unit testing plan to be executed as part of the development cycle
 * Identify typical browser and device usage for targeted users, including underrepresented communities, and develop a testing plan
 * Identify if your team needs specialized skills or additional training for development
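One required item above is checking training data for biases when AI or ML algorithms are used. A full bias review needs human judgment, but one simple, automatable first pass is flagging groups that are badly underrepresented in the data. The sketch below illustrates that idea; the `language` field and the 5% floor are hypothetical examples, not Foundation standards.

```python
# Illustrative sketch only: flag groups whose share of the training data
# falls below a minimum threshold. Field name and threshold are examples.
from collections import Counter

def underrepresented_groups(records, field, min_share=0.05):
    """Return {group: share} for groups below `min_share` of `records`."""
    counts = Counter(r[field] for r in records)
    total = sum(counts.values())
    return {g: n / total for g, n in counts.items() if n / total < min_share}

# A hypothetical dataset that skews heavily toward English-language records.
sample = ([{"language": "en"}] * 90
          + [{"language": "sw"}] * 8
          + [{"language": "ha"}] * 2)
print(underrepresented_groups(sample, "language"))  # → {'ha': 0.02}
```

Representation counts are only a starting point; label quality, annotator diversity, and downstream error rates per group also belong in the review.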

SUGGESTED

 * Examine assumptions that could lead to exclusion, e.g., assumptions about ability, age, language, technical connectivity, etc.
 * Use common language around DEI efforts
 * Question your hypothesis: how will you measure underserved populations?
 * Share definitions publicly to get feedback
 * Identify and question the assumptions about the feature solutions.
 * Identify different Customer Journeys that reflect multiple viewpoints
 * Use a colorblind tester or simulator to diagnose any color contrast issues.
 * Evaluate and refine your hypothesis through Causal Loop Diagrams
 * Hold a design sprint focused on bringing in different viewpoints
 * Review data models for inclusivity and privacy
 * Consider how legacy terminology in policies (and elsewhere) might be replaced with less “insider” synonyms, and simplify language where possible [1]

TEAM NOTES
[1]    For example “revert” is a technical way of expressing “undo” – which might be a more layperson-friendly way of expressing the same concept. Consider when a legacy term might need to be challenged with the community for the purpose of making the platform more accessible to newcomers.

DEVELOP
Development of experiences identified in the STRATEGIZE phase, with refinement and further development of the most successful concepts in preparation for the DELIVER phase. This includes design iterations.

REQUIRED

 * Look for bias in underlying tools, systems such as AI or ML. Work with the responsible teams to address the bias
 * Develop multiple concepts
 * Adhere to accessibility design and technical standards
 * Test concepts with your partner communities, in their preferred language
 * Incorporate community feedback
 * Iterate on the most promising designs
 * Design according to design standards
 * Use MediaWiki best practices and engineering architecture principles for frontend and backend development (Vue, etc.)
 * Design with accessibility goals in mind
 * Conduct concept testing in the native language of the partner communities you are working with
 * Conduct concept testing with your target populations
 * Run a community liaison PR effort to socialize the work with target populations

SUGGESTED

 * What technical barriers are there to inclusion? Reach out to other teams and Engineering leadership to identify solutions.
 * Ask for volunteer developers support
 * Conduct concept testing across a variety of populations and languages

DELIVER
Testing and delivery of final code and/or interface. This could mean delivering either a smaller-scale test or the final, fully scaled experience to production.

REQUIRED

 * Perform user testing with specific populations, and people from different backgrounds
 * Beta test with people from different backgrounds, genders, ages and cultures
 * Conduct user testing in the native language of the partner communities you are working with
 * Evaluate performance for low bandwidth users
 * Test with assistive devices
 * Do we have standard devices that we develop and test on that should be included here?
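For the low-bandwidth evaluation above, a useful back-of-the-envelope check is how long a page payload takes to transfer at typical connection speeds before any real-device testing happens. The bandwidth figures below are rough illustrative assumptions, not measurements or Foundation targets.

```python
# Sketch: estimate transfer time for a page payload on slow connections.
# Bandwidth figures are rough illustrative assumptions (kilobits per second).
BANDWIDTH_KBPS = {"2G": 50, "3G": 750, "4G": 4000}

def transfer_seconds(payload_kb, network):
    """Seconds to transfer `payload_kb` kilobytes on the named network."""
    return payload_kb * 8 / BANDWIDTH_KBPS[network]

# A 500 KB page on a 50 kbps 2G link: 500 * 8 / 50 = 80 seconds.
print(transfer_seconds(500, "2G"))  # → 80.0
```

An estimate like this makes payload budgets concrete early; it complements, rather than replaces, testing on real devices and throttled connections.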

SUGGESTED

 * Instrument for downstream risk factors identified in the planning phase

OWN
Ongoing monitoring of positive and negative impacts resulting from the product/code delivered.

REQUIRED

 * Do retrospectives on your work
 * Who was left out of previous work and why? Does this provide input into future strategy?
 * Plan for future technical/design debt around DEI goals
 * Gather user feedback and share it with other teams and users
 * Make the results of your work public
 * Consider language barriers in feedback--ask community ambassadors, Community Relations Specialists, and research for help in solving this.
 * Work with analytics on how to measure specific populations over time as needed
 * Work with analytics on ensuring privacy through metrics.  Question if you need to keep measuring specific metrics.
 * Elevate awareness of how goals shifted over time, and why
 * Evaluate real results compared to expected
 * Have a plan for localization as you scale
 * What is the community’s feedback?
 * Track risk factors identified in the STRATEGIZE phase
 * Exit intensive community engagement responsibly and respectfully

SUGGESTED

 * Create a “chore wheel” to gather and share user feedback
 * Throughout this process, have a “future work” document where you can collectively write down insights/learnings for the future.
 * Establish user or community councils for feedback
 * Use user feedback surveys
 * Share your DEI Prod Developments efforts publicly to get feedback
 * Identify unexpected uses
 * Identify unexpected results
 * Identify the audience actually using (and not using) it. Identify why.