Wikimedia Product/Perspectives/Trust

Overview
The concept of ‘Trust’ on Wikipedia is complicated, because the amateur, crowdsourced nature of our content runs counter to traditional ideas of expert contribution and professional review. Over the years, however, Wikipedia has established itself as a trustworthy knowledge resource on the web in the Global North. Trust is essential to content consumption, content leverage, and continued content creation alike. We need to invest in this strength and protect our earned reputation, particularly in the modern era of misinformation.

Strengthening trust will require more transparency about our processes and our weaknesses. Specifically, we need to give users better tools for understanding what to trust and what not to trust on Wikipedia. Citations are part of this, but we believe easily digestible insights into content creation and curation represent a big under-resourced opportunity.

As we consider the next 2 billion people coming online, we need to understand the factors that currently limit trust in Wikipedia among marginalized people. If someone goes to Wikipedia and sees neither their own experience reflected there nor a transparent process for contributing, how can they truly trust it? In important ways, the tools that create trust among the majority, by relying on and mirroring traditional structures of power and authority, undermine the trust of those who have been negatively impacted by that hierarchy. Resolving this tension will not be easy, but it is central to our mission.

Wikipedia content is largely accurate because of its open review processes and because its tools encourage construction over destruction. These systems work well for content that existing reviewers can easily familiarize themselves with, but as new voices arrive on Wikipedia, the existing community needs the tools, training, and ability to make appropriate decisions about the accuracy of new content. At the same time, these new voices need the tools, knowledge, and ability to add durable references that meet an ever-maturing standard of notability.

If Wikipedia is to defend and improve upon the trust that underwrites content creation and consumption, it will need to be more transparent about how content is created and make it easier for users to accurately generate, judge, and protect trustworthy content.

Aspects

 * [DRAFT] Verifiability of Content
 * [DRAFT] Transparency
 * [DRAFT] Accountability of Contributors

Examples

 * Copyvio Tools
 * Archive Bots
 * Interaction Timeline

Areas of Impact

 * Curation workflows
 * Experienced editors
 * Citations



Resources

 * D. Kamir, 2011. User Drork: A Call for a Free Content Alternative for Source (p. 288). https://www.networkcultures.org/_uploads/%237reader_Wikipedia.pdf
 * R. Faulkner, 2012. Etiquette in Wikipedia: Weening New Editors into Productive Ones. http://www.opensym.org/ws2012/p17wikisym2012.pd
 * H. Ford, 2013. Getting to the source: where does Wikipedia get its information from? https://drive.google.com/open?id=1i3NkQatHG7mR7InP-iGomJi__4H5hpm6
 * J. Reagle, 2012. Good Faith Collaboration: The Culture of Wikipedia. https://books.google.com/books?id=msLxCwAAQBAJ&dq=wikipedia+%22assume+good+faith%22&lr=
 * D. Laniado, 2012. Emotions and dialogue in a peer-production community: the case of Wikipedia. http://chato.cl/papers/laniado_kaltenbrunner_castillo_fuster_2012_emotions_wikipedia.pdf
 * A. Menking, 2015. The Heart Work of Wikipedia: Gendered, Emotional Labor in the World’s Largest Online Encyclopedia. https://drive.google.com/open?id=1ahvgXf-knzaEE-YTiIQbKL-9W046r4Ki
 * J. Reagle, 2008. In Good Faith: Wikipedia Collaboration and the Pursuit of the Universal Encyclopedia. https://reagle.org/joseph/2008/03/dsrtn-in-good-faith.pdf
 * J. Antin, 2011. Gender Differences in Wikipedia Editing. http://pensivepuffin.com/dwmcphd/syllabi/info447_wi14/readings/03-GenderAndWikipedia/antin.et.al.GenderDiffInEditing.WikiSym11.pdf
 * S. Das, 2018. Pushing Your Point of View: Behavioral Measures of Manipulation in Wikipedia. https://arxiv.org/pdf/1111.2092.pdf
 * S. Kumar, 2016. Disinformation on the Web: Impact, Characteristics, and Detection of Wikipedia Hoaxes. http://infolab.stanford.edu/~west1/pubs/Kumar-West-Leskovec_WWW-16.pdf
 * C. Keating, 2018. Tensions facing movement strategy. https://meta.wikimedia.org/wiki/User:The_Land/Tensions_facing_movement_strategy
 * A. Shaw, 2014. Mind the skills gap: the role of Internet know-how and gender in differentiated contributions to Wikipedia. https://drive.google.com/open?id=0B500zraS_1RvR0RhOG1MYjRCWkxDNEN6NjZ6Mjk5RDlWUTI0
 * A. Shaw, 2018. The Pipeline of Online Participation Inequalities: The Case of Wikipedia Editing. https://drive.google.com/open?id=0B500zraS_1RvOVBXanNHb3VUYU9jM2Z0ei02MHJPTG5wY1RJ
 * M. Redi, 2018. What are the ten most cited sources on Wikipedia? Let’s ask the data. https://blog.wikimedia.org/2018/04/05/ten-most-cited-sources-wikipedia/