Wikimedia Product/Perspectives/Trust

Overview
The concept of ‘Trust’ on Wikipedia is complicated: the amateur, crowdsourced nature of our content runs counter to traditional ideas of expert contribution and professional review. Nevertheless, over the years Wikipedia has established itself as a trustworthy knowledge resource on the web in the Global North. We must protect this earned reputation in the modern era of misinformation while also establishing trust in new markets if we want to achieve true knowledge equity.

Wikipedia content is more or less accurate because of its open review processes and because the tools encourage construction over destruction. These systems work for content that existing reviewers can easily familiarize themselves with, but as new voices arrive on Wikipedia, the existing community needs the tools, training, and abilities to make appropriate decisions about the accuracy of new content. At the same time, these new voices need the tools, knowledge, and ability to add durable references that meet an ever-maturing standard of notability.

As the internet grows, the world becomes smaller and threats to privacy become more challenging. Wikimedia must continue to protect our contributors by keeping their credentials and identities secure. This protection must not come at the cost of making it easier for malicious actors (such as vandals or misinformation campaigners) to continue their disruption, or of allowing our content to be censored in high-risk parts of the world.

Aspects

 * [DRAFT] Verifiability of Content
 * [DRAFT] Transparency
 * [DRAFT] Accountability of Contributors

Examples

 * Copyvio Tools
 * Archive Bots
 * Interaction Timeline

Areas of Impact

 * Curation workflows
 * Experienced editors