For the average internet user, Wikipedia operates in the background, its 44 million entries serving as a priceless resource, rarely thought of until you need to know the capital of Azerbaijan. This week, however, Wikipedia’s volunteer editors and the nonprofit that makes its work possible, the Wikimedia Foundation, suddenly found themselves in the news, tasked once again with providing a ground-level truth for a platform unwilling to provide one of its own.
On stage at the South by Southwest conference on Tuesday, YouTube CEO Susan Wojcicki announced that her company would begin adding “information cues” to conspiracy theory videos, text-based links intended to provide users with better information about what they are watching. One of the sites YouTube plans to use is Wikipedia. “We’re just going to be releasing this for the first time in a couple weeks, and our goal is to start with the list of internet conspiracies listed where there is a lot of active discussion on YouTube,” Wojcicki said on stage.
The move came as a surprise—even to the Wikimedia Foundation. “In this case, neither Wikipedia nor the Wikimedia Foundation are part of a formal partnership with YouTube. We were not given advance notice of this announcement,” the organization said in a statement.
YouTube, a multibillion-dollar corporation flush with advertising cash, had chosen to offload its misinformation problem in part to a volunteer, nonprofit encyclopedia without informing it first. YouTube did not immediately respond to a request for comment, but the move prompted protestations from the media and some of Wikipedia’s editors.
“As a longtime Wikipedia editor, I wondered whether YouTube thought deeply about how relying on Wikipedia to combat disinformation on YouTube videos is going to impact Wikipedia and the community of editors,” says Amanda Levendowski, a clinical teaching fellow at the Technology Law & Policy Clinic at New York University Law School.
Wikipedia has never claimed to be perfect, and it isn't.
But YouTube is far from the first tech company, or even the first social platform, to use Wikipedia’s content for its own goals. Its parent company, Alphabet, frequently uses Wikipedia content in Google search results. Facebook is also testing using Wikipedia to fight its own misinformation problem, though it informed the Wikimedia Foundation of its intentions first. Artificial intelligence researchers also frequently use the online encyclopedia—which still adds 20,000 new entries each month—to train algorithms or teach smart assistants. And Levendowski notes that Alphabet-owned Jigsaw used Wikipedia article discussion pages, in part, to train its open-source troll-fighting AI.
“Our content powers hundreds of semantic web services and knowledge graphs, including those maintained by Google, Apple, and Yahoo!. Our traffic data is used to track the flu virus, analyze changes in the stock market, and predict which movies will top the box office. Our structured and linked data platform, Wikidata, is used to organize datasets from the Library of Congress to the Metropolitan Museum of Art,” says Katherine Maher, the executive director of the Wikimedia Foundation.
Which is to say that much of the tech industry uses Wikipedia—it’s not only YouTube that has to consider the consequences of making it the arbiter of truth.
Who Writes History
It’s worth acknowledging that Wikipedia is, for the most part, remarkably good at its job. The site is a free, generally reliable, vast source of information. But it does have its issues. Only 16 percent of the site’s volunteer editors identify as female, according to a 2013 study. Nearly half of all articles about geographic places were written by inhabitants of just five countries: the United Kingdom, the United States, France, Germany, and Italy, a 2015 Oxford University study concluded. The same study found that more edits have been made from the Netherlands than all of Africa combined.
These disparities have real consequences, both for the kind of content that ends up on Wikipedia and for how it's written. While the site presents itself as a source of facts, articles can have their own slants. "For certain political topics, there's a central-left bias. There's also a slight, when it comes to more political topics, counter-cultural bias. It's not across the board, and it's not for all things," says Sorin Adam Matei, a professor at Purdue University and the author of Structural Differentiation in Social Media, a book that studied 10 years' worth of Wikipedia editing logs.
The vast majority of edits to Wikipedia are also made by a tiny fraction of its volunteers. Seventy-seven percent of Wikipedia’s content is written by one percent of its editors,…