Analysis

Trump Voters Trapped in Facebook's 'Filter Bubbles'

Social media can filter problematic information sources, but with so much material out there it’s hard to distinguish between hard truth and comforting lies.

FILE PHOTO: Polish protesters against events, groups and profiles blocked by Facebook, in front of the Facebook office in Warsaw, Poland, Saturday, Nov. 5, 2016. (Czarek Sokolowski, AP)

“If I were to run, I’d run as a Republican. They’re the dumbest group of voters in the country. They believe anything on Fox News, I could lie and they’d still eat it up. I bet my numbers would be terrific.” Donald Trump didn’t actually say this in a 1998 People Magazine interview, but that minor detail didn’t stop thousands of people from sharing the quote on Facebook and believing that it was true (I pray I wasn’t among the sharers, but I really can’t be sure).

The same thing happened with the “proof” that the creators of “The Simpsons” had predicted the future. It turns out the shot of Trump declaring his candidacy, complete with his descent down the escalator, was not taken from a 16-year-old episode but from a short video posted on the Fox Animation Channel in July 2015.

These are just two of the myriad false stories that were spread during the recent election campaign via Facebook, Twitter and WhatsApp groups, shared thousands of times over. For the most part, people clicked on “Share” without giving a moment’s thought to the most basic question: Is this authentic? Nowadays, with the information we digest coming in nonstop bursts of clickbait headlines, critical thinking has largely become a quaint relic of the past.

Until now, this was a subject for dry research studies, the occasional journalistic lament and the odd Facebook debate, but the just-ended United States election campaign was the perfect storm that made social media’s effect on “truth” and facts a burning issue for America and beyond.

The “filter bubbles” that Facebook is so good at putting us in, so that we hear only from people who think like us and never want to leave the site for a minute, meshed with the ease with which social media spreads news from a wide range of uncorroborated sources. And this was happening just as Facebook, whose business model depends on promoting sponsored posts, was reducing the reach of established news sources.

Facebook chief Mark Zuckerberg on November 10, 2016. (Lluis Gene, AFP)

Facebook founder Mark Zuckerberg has been trying to put out the fire. The company is merely a channel for communication and has no influence, he maintains, insisting that false news stories are just a tiny fraction of the material shared on the network. But even some Facebook employees have rejected these arguments.

The problem isn’t confined to Facebook. Google’s search engine still leads to unsubstantiated news reports, and then there’s Twitter, which is brimming with tweets that aren’t always backed up by reliable sources.

There are at least partial solutions available. Facebook newsfeeds and Google search results are already filtered to a degree by the companies’ algorithms, which take numerous factors into account. The companies could flag unreliable information sources and eliminate them from the newsfeed or lower their ranking in search results. We wouldn’t even notice they were gone. But with 62 percent of Americans, and God knows how many Israelis, receiving their news through Facebook’s very personally tailored feed, some problems remain.
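To make that concrete, here is a minimal sketch, in Python, of how such flagging and downranking might work. The sources, reliability scores and threshold below are invented for illustration; nothing here reflects Facebook’s or Google’s actual algorithms, which weigh far more signals.

    # A toy ranking pass: drop sources flagged as unreliable and
    # downrank everything else by a per-source reliability score.
    # All names and numbers are hypothetical.
    SOURCE_RELIABILITY = {
        "established-paper.example": 0.9,
        "partisan-blog.example": 0.4,
        "fabricated-news.example": 0.05,
    }

    BLOCK_THRESHOLD = 0.1  # sources scoring below this are dropped outright

    def rank_feed(items):
        """Re-rank feed items by relevance weighted by source reliability."""
        ranked = []
        for item in items:
            reliability = SOURCE_RELIABILITY.get(item["source"], 0.5)  # unknown: neutral
            if reliability < BLOCK_THRESHOLD:
                continue  # the flagged source is eliminated from the feed
            ranked.append((item["relevance"] * reliability, item))
        ranked.sort(key=lambda pair: pair[0], reverse=True)
        return [item for _, item in ranked]

    feed = [
        {"title": "Fake Trump quote", "source": "fabricated-news.example", "relevance": 0.95},
        {"title": "Election results", "source": "established-paper.example", "relevance": 0.8},
        {"title": "Hot take", "source": "partisan-blog.example", "relevance": 0.7},
    ]
    print([item["title"] for item in rank_feed(feed)])
    # -> ['Election results', 'Hot take']; the fabricated item silently vanishes

Even this toy version shows why we “wouldn’t even notice”: a blocked source never appears at all, and a downranked one quietly sinks out of view.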

The first is that it’s basically up to Facebook to police itself on this issue, and it has been doing its utmost to evade responsibility, as evidenced by Zuckerberg’s comments. The second is that with so many information sources out there, and so many possible angles and shadings to every story, it has become a lot harder to distinguish between truth and falsehood. That goes for people, and even more so for algorithms. And the third is that while Facebook may decide to do more to block false “news” sources, it probably won’t be so keen to eliminate the echo chambers it creates, even though doing so could expand users’ view of the world.

And so we come to one of the central problems: Facebook’s audience, which is perhaps the most programmed element of the site. The social network wraps each user in a bubble that makes staying there pleasant and comforting, lulling us into consuming more of the same.
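A toy simulation illustrates the dynamic. Everything below, from the topic names to the click probabilities, is invented; the point is only that a feed which keeps reinforcing whatever we already click converges on more of the same.

    # A toy feedback loop: the feed shows topics in proportion to past
    # clicks, and clicks reinforce the weights. All numbers are invented.
    import random

    random.seed(42)

    TOPICS = ["politics-left", "politics-right", "sports", "science"]

    # Start with a perfectly balanced feed.
    engagement = {topic: 1.0 for topic in TOPICS}

    for _ in range(1000):
        weights = [engagement[t] for t in TOPICS]
        shown = random.choices(TOPICS, weights=weights, k=1)[0]
        # Invented behavior model: the user clicks stories matching their
        # lean 80% of the time, everything else 20% of the time.
        clicked = random.random() < (0.8 if shown == "politics-left" else 0.2)
        if clicked:
            engagement[shown] += 1.0  # reinforce what was clicked

    total = sum(engagement.values())
    for topic in TOPICS:
        print(f"{topic}: {engagement[topic] / total:.0%} of the future feed")
    # After a thousand rounds, one topic dominates: the bubble has closed.

No editor chooses this outcome; the narrowing emerges from the engagement loop itself.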

One reason for the proliferation of sensationalist and baseless items is that they make for perfect clickbait. Until we realize that even in the ostensibly fair and neutral arena of Facebook, we are being exposed to a daily diet of distortions, we will keep falling victim to, and participating in, the spread of falsehoods.