Facebook has rolled out a new feature in the United States to try to combat fake news. Announcing the news Tuesday, Facebook wrote that the new "features [aim to] provide more context for people so they can decide for themselves what to read, trust and share."
According to an earlier statement and video released by Facebook, the social media giant will now use Wikipedia – the free online encyclopedia that anyone can edit – to inform users about publishers' identities (or lack thereof).
"Based on research, we’re making it easy for people to view context about an article, including the publisher’s Wikipedia entry, related articles on the same topic, information about how many times the article has been shared on Facebook, where it has been shared, as well as an option to follow the publisher’s page," Facebook wrote.
"When a publisher does not have a Wikipedia entry, we will indicate that the information is unavailable, which can also be helpful context," it added.
In addition to aggregating knowledge about the publisher from Wikipedia – a move that may further burden the free encyclopedia, which has now also been enlisted into YouTube's fight against fake news – Facebook will also highlight other stories from the same publisher.
It will also let users see which of their friends shared the story, and it will test a new feature that makes author bylines clickable within the site's Instant Articles format – making it easier to gain context not just on the story but on the person behind it.
Crowdsourcing the truth
As more details of Russia's involvement in the U.S. election become known, it is clear that undermining the distinction between what is true and false, real and imagined, was key in its alleged interference in Trump's favor.
This raises the question of whether outsourcing the verification of news sites' reliability to an easily manipulable website like Wikipedia is wise.
Facebook, which until a few months ago denied its platform was complicit in the fake news phenomenon, has recently started using algorithms to try to quell the spread of unsubstantiated and politically biased reports by targeting certain publishers, with varying and generally limited success.
For example, following the tragic mass shooting in Las Vegas last October that left over 50 people dead, false reports originating in Russia and on alt-right sites made their way into Facebook's Trending Stories service, which shows users popular articles.
The embrace of Wikipedia indicates a shift to human – as opposed to technological – moderation of content on Facebook and a change in focus to articles as opposed to publishers.
Human editors are the heart of Wikipedia and help maintain its complex system of self-governance. While anyone can indeed edit the content of Wikipedia – for example, rewriting an entry for a junk news outlet to cast it in a legitimate light – the community has gotten exceptionally good at weeding out fake news. So much so, in fact, that Wikipedia's founder, Jimmy Wales, announced last year he was expanding the Wikipedia model to news production with his WikiTribune project.
Wikipedia no longer allows unregistered users to create new pages, and most pages need to go through some form of review by an active editor after being published. The site also has pretty strict standards, enforced by its community, about sourcing claims. Moreover, so-called bots help human editors patrol new content on the encyclopedia.
Nonetheless, until a few years ago, Wikipedia was considered by many to be the first iteration of the fake news logic. In arguably the most famous such incident, the article for John Seigenthaler – a journalist and friend of the Kennedy family – erroneously stated he was a suspect in the Kennedy assassinations. The hoax prompted some to vandalize Jimmy Wales' own article to claim he had been murdered, just to prove the point that the Wikipedia model was prone to manipulation.
Wikipedia even has an entire article dedicated to hoaxes on Wikipedia. These include everything from fake academics to fake bands, and even fake books and theories, some of which were live on Wikipedia for over four years and were cited in popular media.
One potential problem with Facebook's new solution is that, though an army of human editors manages to keep Wikipedia generally reliable, it is far from foolproof.
Mark Zuckerberg says he doesn't want his social media behemoth to be used to undermine democracy. But the idea of crowdsourcing everything – from truth to journalism – might have more in common with fake news than anyone in Silicon Valley wants to admit.