What insulates Wikipedia from the criticisms other massive platforms endure? We explored some answers—core values, lack of personalization algorithms, and lack of data collection—in last week’s “How Wikipedia Dodged Public Outcry Plaguing Social Media Platforms.”
But wait, there’s more:
Wikipedia moderation is conducted in the open.
“The biggest platforms use automated technology to block or remove huge quantities of material and employ thousands of human moderators.” So says Mark Bunting in his July 2018 report “Keeping Consumers Safe Online: Legislating for platform accountability for online content.” Bunting makes an excellent point, but he might have added a caveat: “The biggest platforms, like Facebook, Twitter, and YouTube, but not Wikipedia.”
Wikipedia, one of the top web sites worldwide for about a decade, works on a different model. The volunteers writing Wikipedia’s content police themselves, and do so pretty effectively. Administrators and other functionaries are elected, and the basic structure of Wikipedia’s software helps hold them accountable: actions are logged, and are generally visible to anybody who cares to review them. Automated tools are used as well, but their code and their actions are transparent and subject to extensive community review. In extreme cases, Wikimedia Foundation staff must be called in, and certain cases (involving extreme harassment, outing, self-harm, etc.) require discretion. But the paid moderators certainly don’t number in the thousands; the foundation employs only a few hundred staff overall.
More recently, a Motherboard article explored Facebook’s approach in greater depth: “The Impossible Job: Inside Facebook’s Struggle to Moderate Two Billion People.” It’s a long, in-depth article, well worth the read.
One point in that article initially stood out to me: presently, “Facebook is still making tens of thousands of moderation errors per day, based on its own targets.” That’s a whole lot of wrong decisions, on potentially significant disputes! But if we look at that number and think, “that number’s too high,” we’re already limiting our analysis to the way Facebook has presented the problem. Tech companies thrive on challenges that can be easily measured; it’s probably a safe bet that Facebook will achieve something it can call success…that is, a solution that serves the bottom line. Once Facebook has “solved” that problem, bringing the errors down to, say, the hundreds, Facebook execs will pat themselves on the back and move on to the next task.
The individuals whose lives are harmed by the remaining mistakes will be a rounding error, of little concern to the behemoth’s leadership team. On a deeper level, the “filter bubble” problem will remain; Facebook’s user base will be that much more insulated from information we don’t want to see. Our ability to perceive any kind of objective global reality—much less to act on it—will be further eroded.
As artificial intelligence researcher Jeremy Lieberman recently tweeted, we should be wary of a future in which “…news becomes nearly irrelevant for most of us” and “our own private lives, those of our friends; our custom timelines become the only thing that really matters.” In that world, how do we plan effectively for the future? When we respond to natural disasters, will only those with sufficiently high Facebook friend counts get rescued? Is that the future we want?
It’s not just moderation—almost all of Wikipedia is open.
“If you create technology that changes the world, the world is going to want to govern [and] regulate you. You have to come to terms with that.” —Brad Smith, Microsoft, May 2018. As quoted in Bunting (2018).
From the start, Wikipedia’s creators identified their stakeholders, literally, as “every single human being.” This stands in stark contrast to companies whose primary aim is commercial success. Wikipedia, on the whole, is run by a set of processes that is open to review and open to values-based influence.
This point might elicit irate howls of frustration from those whose ideas or efforts have been met with a less-than-respectful response. Catch me on another day, and the loudest howls might be my own. But let’s look at the big picture, and compare Wikipedia to massive, corporate-controlled platforms like YouTube, Facebook, or Google.
- Wikipedia’s editorial decisions are made through open deliberation by volunteers, and are not subject to staff oversight.
- Most actions leading up to decisions, as well as decisive actions themselves, are logged and available to public review and comment.
- It’s not just the content and moderation: the free software that runs Wikipedia, and the policies that guide behavior on the site, have been built through broad, open collaboration as well.
- The Wikimedia Foundation has twice run extensive efforts to engage volunteers in strategic planning, and in many instances has effectively involved volunteers in more granular decision-making as well.
There is room for improvement in all these areas, and in some cases improvement is needed very badly. But inviting everyone to fix the problems is part of what makes Wikipedia thrive. Treating openness as a core value invites criticism and good faith participation, and establishes a basic framework for accountability.
“While principles and rules will help in an open platform, it is values that [operators of platforms] should really be talking about.” — Kara Swisher in the New York Times, August 2018.
Wikipedia lacks relentless public relations & financial shareholders.
There’s another frequently overlooked aspect of Wikipedia: financially speaking, the site is an ant among elephants.
The annual budget of the Wikimedia Foundation, which operates Wikipedia, is about $120 million. That may sound like a lot, but consider this: Just the marketing budget of Alphabet (Google’s parent company) is more than $13 billion.
In terms of the value Wikipedia offers its users, and the respect it shows for their rights, Wikipedia arguably outstrips its neighbors among the world’s top web sites. But it does so on a minuscule budget.
Wikipedia doesn’t have armies of public relations professionals or lobbyists making its case. So part of the reason you don’t hear more about Wikipedia’s strategy and philosophy is that there are fewer professionals pushing that conversation forward. The site just does its thing, and its “thing” is really complex. Because it works fairly well, journalists and policymakers have little incentive to delve into the details themselves.
Wikipedia is driven by philosophical principles that most would agree with; so the issues that arise are in the realm of implementation. There is little pressure to compromise on basic principles. Tensions between competing values, like business interests vs. ethical behavior, drive the debate over corporate-controlled platforms; but those tensions basically don’t exist for Wikipedia.
The reasons you don’t hear much about Wikipedia’s governance model are that it is rooted in clearly articulated principles, works fairly well, is reasonably open to benevolent influence, and lacks a public relations campaign.
Those are all good things—good for Wikipedia and its readers. But what about the rest of the Internet? The rest of the media world, the rest of society? If the notion of objective truth is important to you, and if you’re concerned about our access to basic facts and dispassionate analysis in today’s rapidly shifting media landscape, you might want to challenge yourself to learn a bit more about how Wikipedia has been made and how it governs itself…even if you have to dig around a bit to do so.
This is the second in a three-part series. Read on:
- How Wikipedia dodged public outcry plaguing social media platforms
- “Open” everything, and minimal financial needs: Wikipedia’s strengths
- A proven innovation could benefit Facebook’s users—and its shareholders, too