Wikimedia Product/Perspectives/Augmentation/Governance


Governance

Summary

To meet our movement’s goal of making all the world’s information available to everyone, we have more work to do than human editors can do alone. We need help in the form of augmentation: humans and algorithms working together. Though augmentation is not new to the wikis, it will be a growing part of their future. To ensure that the contributions made by algorithms are productive, unbiased, and fair, we will need to stay true to our movement’s principles of openness, transparency, and the ability for anyone to contribute. We should build closed-loop infrastructure and interfaces that allow anyone to contribute new algorithms and that let even non-technical editors participate in training and tuning those algorithms. These principles apply to all types of augmentation, whether for content generation, content curation, or governing interactions between people.
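As a rough illustration of what such a closed loop could mean in practice, the Python sketch below pairs every algorithmic score with a channel for human judgment and feeds the accumulated judgments back into retraining. The names used here (Edit, Judgment, DamageModel) are hypothetical and do not refer to any existing Wikimedia service; the toy perceptron-style update simply stands in for whatever learning method a real scoring service would use.

```python
import math
from dataclasses import dataclass
from typing import List


@dataclass
class Edit:
    """A toy representation of a wiki edit with numeric features (hypothetical)."""
    edit_id: int
    features: List[float]  # e.g. [proportion_of_text_removed, is_anonymous]


@dataclass
class Judgment:
    """A human's verdict on an edit that the algorithm had scored."""
    edit: Edit
    is_damaging: bool


class DamageModel:
    """A deliberately simple linear scorer with perceptron-style updates."""

    def __init__(self, n_features: int, learning_rate: float = 0.1):
        self.weights = [0.0] * n_features
        self.learning_rate = learning_rate

    def score(self, edit: Edit) -> float:
        """Return a damage score in [0, 1]; higher means more likely damaging."""
        raw = sum(w * x for w, x in zip(self.weights, edit.features))
        return 1.0 / (1.0 + math.exp(-raw))

    def retrain(self, judgments: List[Judgment]) -> None:
        """Close the loop: nudge the weights toward the humans' judgments."""
        for j in judgments:
            error = (1.0 if j.is_damaging else 0.0) - self.score(j.edit)
            for i, x in enumerate(j.edit.features):
                self.weights[i] += self.learning_rate * error * x


if __name__ == "__main__":
    model = DamageModel(n_features=2)
    review_queue: List[Judgment] = []
    edits = [Edit(1, [0.9, 1.0]), Edit(2, [0.1, 0.0]), Edit(3, [0.8, 1.0])]

    # 1. The algorithm proposes a score; 2. any editor can confirm or overturn it;
    # 3. those judgments become the next round of training data.
    for edit in edits:
        print(f"edit {edit.edit_id}: damage score {model.score(edit):.2f}")
        human_says_damaging = edit.features[0] > 0.5  # stand-in for a human review step
        review_queue.append(Judgment(edit, human_says_damaging))

    model.retrain(review_queue)
    print("retrained weights:", model.weights)
```

In a real system the "human review" step would be an interface open to any editor, in the spirit of JADE, so that non-technical contributors can shape the algorithms simply by judging their output.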

Governance is a word meant to capture a broader scope than “Code of Conduct”. It refers to all the ways that people interact with each other on wiki projects, in both constructive and unconstructive situations. Today, newcomers rarely continue contributing past their initial edits because of negative experiences with quality control mechanisms, algorithmic tools (bots), or policy. We see augmented governance practices, grounded in a set of human-centered principles, as the vehicle that will both safeguard and empower the Wikimedia community to become the desired safe haven for knowledge discourse.


White Paper

DRAFT

Resources

A. Halfaker, Basic Description of JADE. https://www.mediawiki.org/wiki/JADE/Intro_blog/Short_story

A. Halfaker, 2017. Interpolating Quality Dynamics in Wikipedia and Demonstrating the Keilana Effect. In Proceedings of the 13th International Symposium on Open Collaboration (OpenSym '17). ACM, New York, NY, USA, Article 19, 9 pages. DOI: https://doi.org/10.1145/3125433.3125475

J. Buolamwini, 2018. The Dangers of Supremely White Data and The Coded Gaze [Video from Wikimania 2018]

J. Zhang, J.P. Chang, C. Danescu-Niculescu-Mizil, L. Dixon, Y. Hua, D. Taraborelli, and N. Thain, 2018. Conversations Gone Awry. https://www.mediawiki.org/wiki/File:Conversations_Gone_Awry_(slides).pdf