User:DannyH (WMF)/2018 Product points of view/Trust

Overview
The concept of ‘Trust’ on Wikipedia is complicated, as the amateur, crowdsourced nature of our content runs counter to traditional ideas of expert contribution and professional review. However, over the years, Wikipedia has established itself as a trustworthy knowledge resource on the web in the global north. Trust is essential to content consumption, content leverage, and continued content creation. We need to invest in this strength and protect our earned reputation, particularly in the modern era of misinformation.

Global Consumers
Strengthening trust will require more transparency about our processes and our weaknesses. Specifically, we need to give users better tools for understanding what to trust and what not to trust on Wikipedia. Citations are part of this, but represent an incomplete solution. We believe easily digestible insights into content creation and curation represent a big under-resourced opportunity in this area.

Knowledge Equity
As we consider the next 2 billion people who are coming online, we need to understand the factors currently limiting trust in Wikipedia among marginalized people. If someone goes to Wikipedia and doesn’t see their own experience reflected there, and doesn’t see a transparent process for contributing, how can they truly trust it? In important ways, the tools that create trust among the majority by relying on and mirroring traditional structures of power and authority undermine the trust of those who have been negatively impacted by this hierarchy. Solving for this tension will not be easy, but it is central to our mission.

Contributors
Wikipedia content is more-or-less accurate because of open review processes and because the tools encourage construction over destruction. These systems work for content that existing reviewers can easily familiarize themselves with, but as new voices arrive on Wikipedia, the existing community needs the tools, training, and abilities to make appropriate decisions about the accuracy of new content. At the same time, these new voices need the tools, knowledge, and ability to add durable references that meet an ever-maturing standard of notability.

If Wikipedia is to defend and improve upon the trust that underwrites content creation and consumption, it will need to be more transparent about how content is created and make it easier for users to accurately generate, judge, and protect trustworthy content.

Areas of Impact

 * Curation workflows
 * Experienced editors
 * Citations