At face value, Wikipedia presents itself as the internet's de facto encyclopedia. But comedian Stephen Colbert had a different definition for the site on a 2007 episode of “The Colbert Report,” calling it, “The encyclopedia where you can be an authority even if you don’t know what the hell you’re talking about.”
In his criticisms of Wikipedia, Colbert coined the term “wikiality” to describe a shift: “When Wikipedia becomes our most trusted reference source, reality is just what the majority agrees upon.”
To demonstrate what he saw as Wikipedia’s loose grip on the truth, he asked viewers in 2006 to edit the site’s article on elephants and write that the elephant population had tripled over the last six months. Viewers took to the task and updated the article with the erroneous statement. The edits prompted administrators to lock the article, preventing further changes, and to block a user they believed was Colbert from editing any other Wikipedia articles.
Colbert’s experiment is far from an isolated case. Founded in 2001 by Jimmy Wales and Larry Sanger, Wikipedia has steadily grown to house more than 40 million articles across 301 languages, and is the fifth most visited website in the world. Although it holds power as an online arbiter and provider of information, the forces driving its influence are a vast network of volunteers: the people who create and edit its articles.
Samantha Lien works at the Wikimedia Foundation, the nonprofit organization that houses Wikipedia. She said there are about 250,000 people who edit Wikipedia on a daily basis. Lien added that the Wikimedia Foundation does not set any editorial policy for Wikipedia.
“It’s open for everyone to edit, and people edit in a variety of different ways,” she said. “Some folks like finding a grammatical error or a spelling mistake, and they’ll go in and fix that. Some people like writing whole articles.”
Pitfalls in the system
This decentralized, volunteer-based structure — the fact that anybody can make changes or add information to any article on the site — may worry those concerned with the accuracy and verifiability of information found online. If just about anybody can become a contributor to a robust online encyclopedia, who’s to say the information is correct in the first place?
Then there’s the issue of the review process for articles and edits that Colbert brought up: What kind of truthfulness does information on Wikipedia really carry if it is deemed factual simply because a majority of volunteer editors and writers say so?
Lien said Wikipedia has a verifiability policy in place that’s meant to analyze the characteristics and qualities of the source material. She said one hard and fast rule on the site is that original research is not accepted as a verifiable source.
“For example, is that source known for publishing corrections?” Lien said. “They also look at the publisher, where that comes from, and there’s actually a whole notice board on Wikipedia that vets and evaluates sources and what kinds of sources can be used on the site. And it also looks at the piece of work itself: So, for example, is it a newspaper article or is it a book? Is it a third-party, neutral source?”
Despite the site’s intentions to remain factual and accurate, its volunteer-based structure comes with pitfalls. Because any person can make changes to a Wikipedia article, vandalism can occur. One notorious hoax falsely stated that retired journalist John Seigenthaler had been involved in the assassinations of John and Robert Kennedy. Seigenthaler wrote an editorial about the false accusation and said it sat on the site for four and a half months before being taken down.
One Wikipedia editor, Philip Cross, has been criticized for unfairly editing Wikipedia pages, either to delegitimize prominent UK figures or to remove critical information about people he supports. For instance, Cross’ editing history shows that he removed information from columnist Oliver Kamm’s Wikipedia page about a court case against him for harassment and defamation. Cross had previously expressed views agreeing with Kamm, who supported the Iraq war. Cross also removed information from columnist Melanie Phillips’ page about her climate change denial.
Some Wikipedia articles have also been “protected” from any kind of editing, including articles on the 2004 United States election voting controversies in Ohio, Cuba, Islamophobia, Kosovo and human rights in the People’s Republic of China. Semi-protected articles — which can only be edited by users who have been registered with Wikipedia for at least four days — include Palestinian refugee, Michael Jackson, gay, Jew, God, Ku Klux Klan, the 9/11 attacks, Afghanistan, anarchism and more.
Past reports have also shown that Wikipedia articles can be influenced by parties with conflicts of interest who are willing to pay for the privilege. An article in The Atlantic found that notable figures concerned with how they’re portrayed on Wikipedia can pay freelancers, PR firms and Wikipedia “experts” to make changes to certain articles. Of course, any changes still need to pass the verifiability test, but contributors acting in a certain party’s interest — rather than analyzing the pure accuracy of sources — can still skew the neutrality of Wikipedia articles.
Tweaks to the structure
In 2006, the site’s top administrators strengthened their efforts to cut down on vandalism and thereby reinforce the site’s commitment to accuracy. Jimmy Wales told the New York Times that the Wikipedia community would prioritize the quality of articles over sheer quantity. The site now also relies on artificial intelligence in the form of bots to distinguish between vandalism and trustworthy changes to articles.
Aaron Halfaker of the Wikimedia Foundation told Cade Metz of Wired in a 2016 article that, after that shift, Wikipedia volunteers fact-checked and analyzed articles with greater effort and became more wary of new users.
But, as Metz wrote:
“As site administrators fought to maintain quality, they created an environment that led to steady decline in the size of Wikipedia volunteer community. The ultimate irony is that, as fewer and fewer people edit Wikipedia, we run the risk of a small group of people bending reality to suit their particular opinions or attitudes or motivations.”
Later, in an effort to fight the tide of fake news, Wales introduced WikiTribune, a new media organization designed as a hybrid, combining the open Wikipedia community with traditional news media.
“The news is broken,” Wales said in his announcement of WikiTribune’s launch in 2017. “But we figured out how to fix it.”
Nearly two years later, WikiTribune is in operation, but with a structure that stands out from more traditional digital media outlets. On its homepage, WikiTribune describes itself as a news platform dedicated to “neutral, factual, high-quality news.”
“Come collaborate with us,” the site’s banner reads, “because facts really do matter.”
But, as on Wikipedia, any volunteer can add a new story to WikiTribune or edit an existing one in real time, whether it’s a story on political unrest in Venezuela or a piece fact-checking a politician’s statement. So, as Colbert pointed out more than a decade ago, how much do facts matter if any person has the power to determine what is fact and what is not?
A flawed structure
Despite its new efforts to emphasize accuracy, Wikipedia’s guidelines on verifying source information are not without their flaws. Rosie Stephenson-Goodknight, who has been editing and contributing to Wikipedia articles since July 2015, said one of the site’s biggest issues is that the contributor and editor community continues to follow guidelines on reliable sources that were written in the early 2000s.
“Is the definition of a reliable source in 2001 still accurate in 2018?” she said. “And the answer is no, it is not still appropriate.”
Stephenson-Goodknight said she has created close to 5,000 articles for the site so far. She also said that continuing to follow 2001-era rules in 2019 impacts the kinds of articles the site publishes in the first place.
“We’ve come to realize that that is biased, that if we continue to live with that bias we will have a very skewed encyclopedia that will omit things that are otherwise considered notable,” she said.
Stephenson-Goodknight argues that Wikipedia’s outdated policies have already affected the types of articles that have been published. In 2014, she founded the Women Writers Project with Wikipedia, with the purpose of increasing the number of biographies of women in history. Since starting the project, Stephenson-Goodknight said, the share of biographies of women as compared to men on Wikipedia has grown from 15 percent to nearly 18 percent.
“Women did not get the same kind of opportunities, let alone coverage,” she said. “So if someone was a woman and she was a writer, she was a scientist, she was an artist 150 years ago in the 19th century, she would have to be pretty amazing to get a mention.”
Because of that disparity, Stephenson-Goodknight said, contributors and editors have to work harder and consult more sources to put together an article about a woman.
“We have to find many more sources in order to pull together an article about her than we would’ve had to about a man,” she said. “It’s easy because there’s been so much written about men and not so much about women.”
The gender imbalance in Wikipedia’s content is further reflected in the demographics of its content creators — Stephenson-Goodknight said only 9 percent of the editors on English Wikipedia are women. She said this steep gender disparity affects which policies are crafted and how they are implemented.
“You can imagine probably 90 percent being men,” she said. “Those are the ones who are forming the decisions regarding just about everything, regarding what should the notability policy be, should we make changes, what about reliable sources and civility?”