After Russian Election Ads, Facebook Turns to Wikipedia to Fight 'Fake News'

After revealing election-time operations by Russian-linked accounts, Facebook enlists Wikipedia to kill what it dubs ‘junk news’ - but is the crowdsourced encyclopedia enough?


Facebook has revealed a key part of its plan to fight “fake news”: Wikipedia. According to a statement and video released by Facebook on Friday, the social media giant will now be using the free online encyclopedia that anyone can edit to inform users about publishers’ identity - or lack thereof.

From now on, Facebook explained, clicking the little “i” (information) button next to articles published by a page belonging to a purported media outlet will show users details about the publisher. Called “article context,” the tool aggregates information from Wikipedia to give users a bit of background about the publisher behind the Facebook page, letting them decide for themselves whether the news is real or what Facebook dubs “junk news.”
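Facebook has not said exactly how this aggregation works under the hood. As a rough illustration only, Wikipedia’s public REST API already exposes the kind of lead-paragraph summary such a tool would need. In the Python sketch below, the summary endpoint is real, but the helper function, its name and its fallback behavior are assumptions:

```python
import requests
from typing import Optional

# Wikipedia's public REST API; the endpoint is real, the rest is illustrative.
WIKI_SUMMARY = "https://en.wikipedia.org/api/rest_v1/page/summary/{title}"

def publisher_context(publisher_name: str) -> Optional[str]:
    """Fetch a short Wikipedia summary for a publisher, if one exists.

    Hypothetical helper - Facebook's actual aggregation pipeline is not public.
    """
    resp = requests.get(
        WIKI_SUMMARY.format(title=publisher_name.replace(" ", "_")),
        headers={"User-Agent": "article-context-demo/0.1"},
        timeout=5,
    )
    if resp.status_code != 200:
        # No entry at all - which is itself a signal about an obscure outlet.
        return None
    data = resp.json()
    if data.get("type") == "disambiguation":
        return None  # ambiguous name; a real system would need to disambiguate
    return data.get("extract")  # the lead-paragraph summary of the publisher

print(publisher_context("The New York Times"))
```

Notably, a missing entry is informative in its own right: the absence of any Wikipedia background about a publisher - the “lack thereof” Facebook alludes to - can itself help users judge an outlet’s credibility.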

The news comes amid reports that Facebook will give Congress information on the election-time activities of thousands of Russian-linked accounts that spread so-called fake news and even organized events on U.S. soil around divisive issues at the heart of Donald Trump’s message.

Crowdsourcing the truth

As more details of Russia’s involvement in the U.S. election become known, it is clear that undermining the distinction between what is true and false, real and imagined, was key to its alleged interference in Trump’s favor. This raises the question of whether outsourcing the verification of news sites’ reliability to an easily manipulable website like Wikipedia is wise.

Facebook, which until a few months ago denied its platform was complicit in the “fake news” phenomenon, has recently started using algorithms to try to quell the spread of unsubstantiated and politically biased reports by targeting certain publishers, with varying and generally limited success. Only last week, in the wake of the mass shooting in Las Vegas that left over 50 dead, false reports originating with Russian and alt-right sites made their way into Facebook’s Trending Stories service, which shows users popular articles.

The embrace of Wikipedia indicates a shift to human - as opposed to technological - moderation of content on Facebook and a change in focus to articles - as opposed to publishers. Human editors are the heart of Wikipedia and help maintain its complex system of self-governance. While anyone can indeed edit the content of Wikipedia - for example, rewriting an entry for a “junk” news outlet to cast it in a legitimate light - the community has gotten exceptionally good at weeding out fake news, so much so that Wikipedia’s founder, Jimmy Wales, recently announced that he was expanding the Wikipedia model to news production with his WikiTribune project.

Indeed, Wikipedia no longer allows unregistered users to create new pages, and most pages must go through some form of review by an active editor after being published. The site also has fairly strict, community-enforced standards for sourcing claims. Moreover, so-called “bots” help human editors patrol new content on the encyclopedia.
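The raw material for that patrolling is public: the MediaWiki API exposes a feed of recent changes that both bots and human patrollers watch. The Python sketch below only pulls that feed - the API parameters are real, but real patrol bots, such as ClueBot NG, layer machine-learned vandalism scoring on top, which is omitted here:

```python
import requests

API = "https://en.wikipedia.org/w/api.php"

def new_pages(limit: int = 10) -> list:
    """Return the most recent page creations, as a patrol bot would see them.

    Toy sketch: real patrol bots such as ClueBot NG add machine-learned
    vandalism scoring on top of this raw feed.
    """
    params = {
        "action": "query",
        "list": "recentchanges",
        "rctype": "new",                   # page creations only
        "rcprop": "title|user|timestamp",  # fields to return per change
        "rclimit": limit,
        "format": "json",
    }
    resp = requests.get(API, params=params,
                        headers={"User-Agent": "patrol-demo/0.1"}, timeout=5)
    resp.raise_for_status()
    return resp.json()["query"]["recentchanges"]

for change in new_pages():
    print(change["timestamp"], change["title"], "by", change["user"])
```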

Nonetheless, until a few years ago Wikipedia was considered by many an early incarnation of the fake news problem. In arguably the most famous such incident, the article on John Seigenthaler, a journalist and friend of the Kennedy family, falsely stated that he was suspected of involvement in the assassinations of John and Robert Kennedy, prompting some to vandalize Jimmy Wales’s own article to claim he had been murdered, just to prove that the Wikipedia model was open to manipulation.

Wikipedia even has an entire article dedicated to hoaxes on Wikipedia, which catalogs everything from fake academics to fake bands and even fake books and theories, some of which stayed live on the site for over four years and were cited in popular media.

The potential problem with Facebook’s solution, then, is that although this army of human editors manages to keep Wikipedia generally reliable, the encyclopedia is far from foolproof.

Mark Zuckerberg says he doesn't want his social media behemoth to be used to undermine democracy, but the idea of crowdsourcing everything - from truth to journalism - might have more in common with fake news than anyone in Silicon Valley wants to admit.