Wikimedia Product/Perspectives/Trust/Transparency

Summary
The general level of trust in digital platforms, and in one another, has hit a new low. As a culture, we can now see the social costs of moving fast and breaking things. And while it's easy to bemoan the experience gap between the wiki projects and other people-powered platforms, there is a hidden upside. The open knowledge model has an innate stability: very little of its output changes at the pace of the world around it, which makes it inherently more reliable. Our projects adapt at a plodding (human) pace; change is slow and painfully incremental. But this slowness is an advantage, because it lets anyone see what is happening under the hood.

In the current climate of distrust, Wikipedia's reputation for trustworthiness presents an opportunity. With so much positive social capital built up over so many years, Wikipedia can take risks that platforms like Facebook and Google cannot. When issues are known but go unaddressed, they invite greater scrutiny down the road; being called out on a known issue is far more damaging to an organization than tackling the issue head-on. As we see with Facebook, an organization's resources are far more strained by damage control than they would be by self-initiated remedies. In light of this, we posit that critiquing the flaws in our own glass house will be necessary to retain trust in the long run. And while being more intentionally transparent about the messy process by which knowledge is created would almost certainly invite criticism, it is also the only way to begin addressing gaps and bias at a system level. Transparency is just good business.
