Putin's Dark U.S. Election Plot Is More Deceptive Than You Thought

Putin is employing an array of bots and trolls to sway the U.S. election for Trump. But how does it work? Two studies explain

Signs in Wisconsin. “We don’t [all] live in a post-truth age, but about half of the GOP’s hard-core supporters do,” says Benkler. Credit: Bing Guan / Reuters
Omer Benjakob

Dr. William Marcellino has seen a thing or two in his time. After all, he’s a former reserve officer in the U.S. Marines who trained as a tank commander and served in the Philippines. Still, when he says today that, “It’s impossible to believe the shocking things I’ve seen,” he’s not referring to his period in the military. He’s talking about the posts and tweets he encountered when he and a team of researchers conducted an in-depth study of the army of bots and trolls that Russian President Vladimir Putin is deploying against American democracy ahead of the November 3 election.

Marcellino, who specializes in behavioral science, is a senior researcher at the RAND Corporation, a think tank established by the U.S. Army Air Forces after World War II. This month, following a lengthy investigation, his team published what is perhaps the most comprehensive report on efforts to use social media to interfere in the election. Their study, which focused on the months leading up to Joe Biden's Democratic presidential nomination, asserts unequivocally that a concerted effort is afoot by foreign elements to intervene in next week’s vote. The report does not name Russia explicitly as the guilty party, but doesn’t leave much room for doubt, either: “Although the origin of the accounts could not be identified definitively, this interference serves Russia’s interests and matches Russia’s interference playbook.”

On the face of it, this finding sounds almost axiomatic. Ask anyone, and they’ll tell you that Russia is using social media to interfere in elections in the United States. The goal, they will tell you, is to subvert the country’s democratic institutions and sow distrust. The means: helping Donald Trump remain in the White House, intensifying social rifts and disseminating a political culture of hatred, suspicion, racism and misogyny.

Indeed, as the election looms, there is no doubt that an intensive effort is underway to disseminate false information and thus deceive American voters. Facebook and Twitter, with bitter experience from the 2016 election, confirm that they have identified activities linked to the Internet Research Agency, Putin’s so-called troll farm in St. Petersburg. Law enforcement, military and civilian intelligence units, from the FBI to the CIA to the U.S. Cyber Command, all attest to finding repeated evidence that Russia is working to sabotage Biden's campaign and even to try to distort actual voting through either cyberattacks or influence campaigns.

Just last week, the authorities in the United States uncovered a series of foreign-based attempts to intervene in the election. These included the dissemination of false information, but also genuine cyberattacks on voter databases and voting infrastructure. They even turned up an alleged case of Iranian intervention: emails sent to voters in Florida, purportedly from a far-right American organization, turned out to be fake, and U.S. officials attributed their source to Tehran. Though Israeli cyber experts seemed somewhat skeptical of the Iran claim, and hinted the source may actually be Russia, it’s worth noting that both Iran and Russia chose to focus their efforts on voters and the voting process.

American authorities say Russia is the bigger threat of the two and are convinced that the aim of such hacks is not necessarily to cause concrete damage to voting infrastructures, but rather to create the feeling that they are vulnerable to manipulation. There is one overriding objective: to cast doubt over the voting procedure and sully the credibility of the elections.

If so, it seems there is no doubt about the existence of a Russian effort. But what do we really know about it? What does it look like? How does it work? Interviews with two researchers – both of whom are at the forefront of monitoring these developments on social networks – offer a look inside the operation, its methods and the shadowy world in which it is being conducted. At the same time, they demonstrate how difficult it is to survey the Russian project and to estimate its impact. Indeed, while one study confirms the existence of a disinformation effort, another says it may be completely ineffective. Together, their detailed analyses of how disinformation spreads, and by whom, help shed light on how the election is being called into question and the electorate polarized. They also show how little we know, and how many of the issues being faced today are a continuation of battles started long ago, be it the Cold War or the birth of Fox News.

The Russians are coming

With Facebook having blocked access to its data from most researchers, and Russia constantly changing its methods, Marcellino and his team at RAND developed a complex and novel research method to detect disinformation efforts on Twitter. To locate the suspect users, the investigators resorted to a mix of qualitative and quantitative research methods. Combining network analysis with textual analysis, they worked with artificial intelligence to process vast amounts of information and with human intelligence to analyze the texts. On this basis, they scanned the entire Twitter network and located what they term “publics” – communities of users that congregate around one subject, such as whether or not the president should have been impeached – and characterized them on the basis of the content they distribute. “Our algorithm doesn’t give a ‘yes’ or ‘no’ answer as to whether a certain user is a bot or troll,” says Marcellino. “It tells us how likely it is, based on how similar their behavior is to that exhibited by users we know for certain were Russian trolls active during the 2016 election.”
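To make the first step concrete, here is a minimal sketch (my own illustration, not RAND’s actual pipeline; all account names and edges are invented) of how such “publics” can be extracted from interaction data: build a graph of who retweets or mentions whom, then pull out the densely connected communities.

```python
# A minimal sketch (not RAND's pipeline) of the community-detection step:
# build a graph of who retweets/mentions whom, then extract densely
# connected "publics" using networkx's greedy modularity algorithm.
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

# Hypothetical interaction edges: (account, account_it_retweeted)
edges = [
    ("a1", "a2"), ("a2", "a3"), ("a1", "a3"),   # one tight cluster
    ("b1", "b2"), ("b2", "b3"), ("b1", "b3"),   # another cluster
    ("a3", "b1"),                                # a lone bridge between them
]
g = nx.Graph(edges)

# Each detected community is a candidate "public" whose content
# can then be characterized separately.
for i, community in enumerate(greedy_modularity_communities(g)):
    print(f"public {i}: {sorted(community)}")
```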

This way, the researchers effectively trained their AI program to identify accounts that are likely Russian trolls – users operated either manually or automatically to push out divisive or false content. “We have a data set of hundreds of accounts confirmed to be connected to the Internet Research Agency,” says the researcher.
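The classification step can be illustrated the same way. The sketch below is again mine, not RAND’s code, and every feature and number in it is invented for the example: a simple classifier is trained on accounts already labeled as IRA trolls and emits a probability, rather than a verdict, for an unseen account.

```python
# A minimal sketch of probability-based troll scoring: train on accounts
# known to be IRA trolls, then output a likelihood rather than a yes/no.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical behavioral features per account:
# [tweets_per_day, share_of_political_tweets, follower/following ratio,
#  longest_dormancy_days, tweets_in_first_week_after_waking]
known_accounts = np.array([
    [48.0, 0.97, 0.1, 700, 900],   # confirmed IRA troll
    [52.0, 0.99, 0.2, 640, 1100],  # confirmed IRA troll
    [3.0,  0.20, 1.1, 14,  5],     # confirmed genuine user
    [1.5,  0.05, 0.9, 30,  2],     # confirmed genuine user
])
labels = np.array([1, 1, 0, 0])    # 1 = troll, 0 = genuine

model = LogisticRegression(max_iter=1000).fit(known_accounts, labels)

# Score an unseen account: the output is a probability, not a verdict.
unknown = np.array([[45.0, 0.95, 0.15, 680, 850]])
print(f"P(troll) = {model.predict_proba(unknown)[0, 1]:.2f}")
```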

“Therefore we can say we are very confident in our ability to say that this or that user is a troll or what we call a ‘super-connector,’” he says. “A typical troll profile might have an object (like a coffee mug) as a profile picture, tweet only about politics from a hyper-partisan perspective (never talk about kids, pets, sports, etc.), and tweet around the clock,” he adds, explaining that in 2016 and 2018, “we saw troll accounts with either no picture or gorgeous Eastern European women as their picture.”

Marcellino provides a classic description of the behavioral MO of a fake, suspect account: “These tend to follow only other users suspected of also being bots or trolls; they won’t tweet for two years and then suddenly ‘wake up’ and start tweeting frantically about a single subject – only to then suddenly stop. No one real acts that way,” he says.
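As a toy illustration of those red flags – a hypothetical heuristic of my own, with invented thresholds, not anything from the study – an analyst might flag the “sleep for years, then flood one topic” pattern like this:

```python
# Hypothetical heuristic for the dormancy-then-burst pattern Marcellino
# describes. Thresholds are invented for illustration only.
from datetime import timedelta

def looks_suspicious(tweet_times, topic_share, followed, suspect_set,
                     min_gap_days=365, min_burst_per_day=20.0):
    """Flag the 'sleep for years, wake up and flood one topic' pattern."""
    times = sorted(tweet_times)
    if len(times) < 2:
        return False
    # Find the longest silence and the index where the account "wakes up".
    gaps = [(b - a, i + 1) for i, (a, b) in enumerate(zip(times, times[1:]))]
    longest_gap, wake_idx = max(gaps)
    if longest_gap.days < min_gap_days:
        return False
    # Tweet rate after waking up.
    burst = times[wake_idx:]
    burst_days = max((burst[-1] - burst[0]).days, 1)
    burst_rate = len(burst) / burst_days
    # Share of follows that are themselves suspected trolls.
    suspect_share = len(set(followed) & suspect_set) / max(len(followed), 1)
    return (burst_rate >= min_burst_per_day
            and topic_share >= 0.9          # nearly all tweets on one subject
            and suspect_share >= 0.8)       # follows mostly other suspects
```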

However, he quickly qualifies his assessment, conceding that, “we don’t have a smoking gun. We can never say with 100 percent certainty that a specific account is fake. But we are very confident about our findings.”

Their study identified more than 630,000 suspect Twitter accounts, which produced a total of 2.2 million tweets between January and May of this year. Two types of accounts were found: trolls – accounts that are most likely fake and may even be automatically operated bots that have been mobilized in favor of one or another political side; and “super connectors” – not bots, but rather avatars that are operated by a real person and serve as nodes for the dissemination and amplification of information. If the goal of the trolls is to post radicalizing content, the goal of the connectors is to amplify it throughout the wider social network.

Putin and Trump, at last year’s G20 summit.
Putin and Trump, at last year’s G20 summit. Credit: Susan Walsh / AP

“We discovered 11 communities, which are groups of Twitter accounts,” Marcellino explains. “The smallest comprised 7,000 accounts, the largest 150,000.” The largest groups are those that supported Trump or Biden; another large group backed Bernie Sanders, and there were small groups that supported other Democratic candidates, such as Pete Buttigieg. In addition, the team found another pair of opposing groups: those who believe that Trump should be ousted over what is called “RussiaGate” and deal exclusively with that, and those that are convinced that the impeachment effort was a case of political persecution by the “deep state.”

At the basic level, the researchers found that the accounts in the so-called pro-Trump network followed the same motifs as those disseminated during the election four years ago, with minor updates to preserve their political relevance. “If during the previous elections we saw a lot of activity focusing on Ferguson [Missouri, site of 2014 protests over police violence against Blacks], for example, then this time it will be about Black Lives Matter,” notes Marcellino. “In this regard we also saw a lot of seemingly non-political posts made by this group that did not necessarily address BLM, but rather just portrayed African Americans in a negative light – for example, posts with videos highlighting incidents of violence by members of the Black community. This does not directly help Trump, but it does resonate with some of the propaganda his supporters are pushing out.” For example, he says accounts of this kind overplayed incidents of “violence, rioting and looting,” which he says were used to spread the message that “crazy socialists and anarchists have taken over the Democratic party,” or others in that vein.

This, they found, was true for the anti-Trump side as well. Accounts purportedly supporting Biden disseminated content to the effect that all right-wingers were “Nazis” or “fascists.” In fact, what he terms “parallelism” exists in the information disseminated to the different camps, the research says. The similarity in the operation’s methods and forms of messaging is what indicates that these accounts are indeed coming from the same source, and it made it possible for the researchers to identify the “tradecraft” of the Russian activity across political divides.

“For example, one thing we found is that troll accounts affiliated with both the far right and left posted derogatory memes depicting first ladies – Michelle Obama and Melania Trump. These memes usually included unflattering if not manipulated images and were a form of political misogyny: Each side attacks the other side in an identical manner, even if the content seems different. Each community is targeted with the most radicalizing messaging you can imagine.”

A similar parallelism exists on other subjects as well. Jeffrey Epstein, for example, was a common theme for trolls on both sides: Trolls on the left pushed claims that Trump was close to the late serial pedophile, while trolls on the right made similar claims about Biden. Both sides alleged the same thing: that incriminating information exists about the candidate, and that the evidence will soon be made public and destroy his political future. It is worth noting that it is precisely conspiratorial claims of this sort (though not about Epstein) that stand at the heart of the far-right QAnon discourse, but they were also used here to target Biden and Bernie Sanders supporters.

Both sets of trolls also distributed parallel messages in connection with the coronavirus, with the aim of calling into question the actual severity of the epidemic. “Left-wing trolls claimed the virus was going to be used by Trump to create a crisis to prevent Democrats from voting, while those on the right claimed it was a hoax aimed at suppressing Republican votes,” Marcellino says.

People on each side like to believe that they think critically, and the other side is stupid, but that is exactly the goal: To make sure we Americans no longer trust our own neighbors.

Marcellino

The parallel character of the Russian propaganda – the heart of the “us” and “them” method that serves as the core of what is termed Russia’s “active measures” – is one of the most significant features found by the researchers.

“I think the single biggest thing people don’t understand is that this targets and affects both sides,” says Marcellino. “People on each side like to believe that they think critically, and the other side is stupid, but that is exactly the goal: To make sure we Americans no longer trust our own neighbors, that we think that it’s not just that we don’t agree politically, but that those that we disagree with are secretly either Nazis or communists.”

As Marcellino puts it, “This is how they [the Russians] undermine America’s democracy, by fostering mutual distrust and polarization that will keep us so busy that we won’t have time to focus on them and what they are doing to us.”

The parallelism was not confined only to Twitter accounts, of course, and actually seems to be a cornerstone of Russian political disinformation.

For example, a Russian network that was recently removed from Facebook operated a genuine news site with real writers – inexperienced freelancers identified with the American left – who had no idea they were working for the Russian agency and helping it create propaganda aimed at sowing discord. And if doubt lingers about the source of the activity, the site was called PeaceData – a play on the juicy Russian curse pizdetz (roughly, “fucked-up”).

Last month it was revealed that the Russians also made use of a dummy right-wing site that carried content favorable to Trump. The surprising fact about this revelation was that the site was connected with PeaceData, the site that was aimed at the American far left. In fact, the two sites were mirror images of one another, and both were apparently administered by the Russian Internet Research Agency.

Democratic presidential candidate former Vice President Joe Biden speaks at a drive-in rally at Broward College, Oct. 29, 2020, in Coconut Creek, Florida. Credit: Andrew Harnik / AP

Helping Biden help Trump

The primary Russian effort in the 2020 election hasn’t been directed at creating support for Donald Trump – but rather at undermining his opponent. Indeed, one of the RAND report’s sections surveys how the ostensibly left-wing accounts operated. In essence, they expressed support for Biden but in practice were working in Trump’s favor. A central tool in this regard involved attempts to cause Biden supporters to doubt the former vice president’s personal and political integrity. While the pro-Trump accounts promoted a positive message about the president to his supporters, the accounts from the pro-Biden camp did the opposite, trying to exacerbate internal tensions within the Democratic Party and attacking Biden from the left. Posts from these users, the RAND researchers found, cast doubt on “whether Biden stood a chance of winning, asked whether he was really progressive and in some cases openly supported Democratic opponents or even praised Sanders directly,” Marcellino says.

Many of these users ostensibly tried to do this while maintaining a positive spirit. “This is classic Russian tradecraft,” Marcellino notes. “They seem to think that the American left are hippie types, so a lot of these posts had this peace-and-love tone they think resonates with American progressives.”

But there were also users who went farther. These trolls were seemingly the farthest to the left – “what you would call ‘Bernie Bros,’ revolutionary types who shared Marxist content and wanted to take down the system” – and they were used to undermine Biden regularly.

“For example, these accounts stressed Biden’s connections to corporations to cast him as a capitalist who should not be supported,” Marcellino says. This line is almost identical to the criticism leveled at Biden by both the Russian establishment and Trump, for example in connection with his and his son’s alleged involvement in Ukraine, and it also echoes many of the claims both lobbed against Hillary Clinton four years ago.

In contrast to the peace-and-love messaging coming out of the community that “supported” Biden, the community that expressed support for Sanders employed far more negative and fear-based rhetoric. “It was less pro-Sanders than it was anti-everybody-else,” Marcellino says. “These posts explained why Biden was a bad candidate and, during the primaries, why Elizabeth Warren or Pete Buttigieg were terrible candidates. There was just a ton of negative stuff. Generally there was a lot of scaremongering regarding the economy collapsing and the looming ecological disaster and that we need to start a new society.”

The pro-Sanders community, though relatively small in terms of number of users, was the largest in terms of its interactions, and wielded more influence even than the community that supported Biden. “The pro-Sanders camp [of operatives] was not big but it was the only one that communicated with people from both the pro-Trump and pro-Biden communities,” Marcellino points out. “These communities, on the other hand, tended to communicate only within themselves or with other like-minded communities,” he says. For example, users affiliated with the pro-Biden camp were also in contact with users from another smaller community that supported Trump’s impeachment and discussed only that topic. “It’s a different community but politically it is similar in that both are left leaning,” he says, contrasting them to politically ambidextrous pro-Sanders users.

The methods of operation described by the researchers are consistent with the information gathered by U.S. intelligence organizations. FBI Director Christopher Wray testified before the House Homeland Security Committee last month that much of the Russian effort is aimed precisely at the left side of the political map. His statement confirmed previous testimony by officials in the Office of the Director of National Intelligence. Wray maintained that Russia’s efforts to harm Biden’s campaign remain “very active” even now.

QAnonism

A central theme runs through the Russian activity that is directly and openly supporting Trump in this election: QAnon. According to this conspiracy-theory movement, connected to the far right, the Democratic Party, along with the liberal elite of Hollywood, the deceased Epstein and his colleagues, the military-security establishment and an array of deep-state mechanisms are secretly operating a child sex-trafficking ring and a pedophile network. Q, the theory runs, is a senior government official who is in possession of incriminating information about the network. Far-fetched as it may sound (like “Pizzagate,” in which Hillary Clinton was accused of being involved in a network of pedophiles that operated out of a pizzeria), the theory touches on themes at the heart of the disinformation effort in support of Trump.

“The pro-Trump troll camp could actually have been labeled the pro-QAnon camp,” Marcellino says. “It’s always the same type of story: Someone has some devastating information that would help Trump take down the swamp or the deep state, but someone – usually the Jews or the media – is preventing them from making the information public.”

Though reluctant to share specific posts due to ethical constraints followed by social media researchers, Marcellino describes one user who pretended to be a senior military intelligence officer and published a long thread about how he would soon reveal information that would “take down” the defense establishment. “This is really the main Q theme: Any day now Q or one of his followers will publish the truth that will shake America to its core. Trolls in this camp would sometimes just write conspiratorial-sounding posts and add the letter ‘Q’ to signal to others and create a sense of community.”

About two weeks ago, Facebook announced that it had closed down a large number of groups connected with QAnon, and added that it would also ban from Instagram (which it owns) every account identified with the theory, such as those that use the term “The Storm,” which for the theory’s supporters represents the day of reckoning that Donald Trump is planning against the pedophilic forces of darkness.

What makes the QAnon theme – that information is being repressed by elites – so potent, the researcher observes, is that “it makes it very easy to convince people that just sharing content about a potential Q ‘drop’ will help Trump take down the pedophile network the Democrats are working so hard to hide.”

In this Aug. 2, 2018, file photo, a protester holding a Q sign waits in line with others to enter a campaign rally with President Donald Trump in Wilkes-Barre, Pa. Credit: Matt Rourke / AP

The fake ‘fake news’ scare

While researchers, officials, social media firms and other experts agree that Russia is involved in a massive disinformation campaign, not everyone agrees that Russia is the principal source of the false information. Moreover, not everyone is convinced that falsehoods are disseminated mainly on social networks, or that these have any real effect on the American electorate.

In fact, another study published last week portrays the problem of disinformation on the web in a completely different light.

The study, led by Yochai Benkler, of the interdisciplinary Berkman Klein Center for Internet and Society at Harvard University, doesn’t contradict the RAND findings but does reframe them substantially. It’s very possible and in fact likely that there is a Russian operation involving trolls and super-connectors spreading conspiracy theories. However, the study suggests they are a marginal element in the story of how disinformation is affecting American politics. The majority of the American public is indeed exposed to non-credible and politically biased information, but not because of Russian bots, suggests Benkler. It’s because of President Trump, his mouthpieces in the Republican Party and the media that cover them. It’s they who are supplying the raw materials and social divides that the bots are capitalizing on.

The Israeli-born Benkler, a professor at Harvard Law School, is considered one of the world’s foremost thinkers about the internet. His 2006 book “The Wealth of Networks” is considered a foundational study of the digital economy, and together with people like Lawrence Lessig, he has helped create much of the vocabulary through which we think about the digital arena, including terms like “peer production,” which relates to the way content can be created online in a collaborative manner.

Benkler has long studied the subject of disinformation, and in 2018 he co-authored the seminal work on the topic, “Network Propaganda: Manipulation, Disinformation, and Radicalization in American Politics.”

In both that book and the current study, which focuses on the 2020 election, Benkler and his team maintain that there was no real significance to the Russian intervention in the 2016 election. According to him, it’s not clear that “even one single vote” moved from Clinton to Trump because of false information on the web.

“This whole idea that social media is central and that if some John or Jane Doe opens their computer and encounters some dis- or misinformation on Twitter or Facebook, they then form a new political belief, is baseless – there is simply no empirical evidence to prove this actually happened. Or at least, it has never been shown to operate at a meaningful scale,” Benkler tells Haaretz.

This is classic Russian tradecraft. They seem to think that the American left are hippie types, so a lot of these posts had this peace-and-love tone they think resonates with American progressives.

Marcellino

On the other hand, he says, what the researchers did find is that “the way public attention was actually captured online started with classic or regular media, coming directly from Trump and other leaders, and from there to the mass media, rather than garnering attention on social media, which we found played a secondary role. The low-quality [user] accounts usually associated with Russian efforts simply re-circulated and reinforced these messages.”

The researchers examined 55,000 different news stories, five million tweets and nearly 75,000 Facebook posts from January to August 2020, which together received millions of views.

The crux of his study’s findings, and what makes Benkler’s argument so compelling, is the claim regarding who is actually being exposed to this information. “In studies that looked directly at trolls and who interacts with them, you see they actually only interact with people who already believe them – people from the far end of the political spectrum, especially the far right. You don’t just catch bad beliefs by going online for 15 minutes; that’s just not how social media works.”

The study, titled “Mail-In Voter Fraud: Anatomy of a Disinformation Campaign” and published in early October, focuses precisely on the mainstream media’s role in the dissemination of false information. Benkler examined Trump’s effort to claim that voting by mail is a method susceptible to abuse, which the scholar characterizes as “the central disinformation campaign of this election.”

“This campaign is designed to suppress mail-in voter participation by raising doubts about its validity,” he says, and adds that the goal is “generally, to raise questions about the election’s legitimacy should the results be close enough to require a recount.”

His team focused on disinformation related to that subject, and found that “the core driver behind the mail-in voter fraud campaign was Donald Trump, as the leader of the Republican Party, not Trump as the crazy uncle in the attic. His statements were always supported and amplified through official accounts, coordinated with statements made by other Republican leaders and White House communications staff members,” he says. These were then debated extensively in the media.

Yochai Benkler. Credit: Eyal Toueg

However, only people on the extreme right, who already live in a highly unreliable media environment and were already intending to vote for Trump, were truly exposed to the Russian disinformation efforts, he says. It is on this basis that he claims social media played a secondary role, while the president took the primary one, playing the mass media like a fiddle.

“There is only a small segment of American society that actually lives in a post-truth world. But it is like that because it has been subjected to decades of disinformation and propaganda,” he says, noting Fox News and climate change denial as earlier pre-social media examples.

“In the U.S., our previous work found that there is an asymmetry between the right-wing media and the rest of the media landscape. This was our single most important finding in the book: That we don’t live in a post-truth age, but that about half of the GOP’s hard-core supporters do. Fox News viewers, evangelicals and talk-radio listeners, people who follow Sean Hannity or Rush Limbaugh – this is about 40 million Americans whose lives both online and offline are a post-truth reality. But the majority of the remaining 200 million or more voting-age Americans don’t live in a post-truth world. They actually believe the media and know where to get their news,” he says.

But it is exactly here that the mainstream media that works so hard to debunk Trump and counter disinformation fails them, he says: “Despite not living in a post-truth world, these Americans get bombarded non-stop by claims from trusted media outlets that tell them we are living in a post-truth age, and these outlets end up actually giving a platform to propaganda this way.”

As the study found: “Throughout the first six months of the disinformation campaign, the Republican National Committee (RNC) and staff from the Trump campaign appear repeatedly and consistently on message at the same moments, suggesting an institutionalized rather than individual disinformation campaign. The efforts of the president and the Republican Party are supported by the right-wing media ecosystem, primarily Fox News and talk radio functioning in effect as a party press. These reinforce the message, provide the president a platform, and marginalize or attack those Republican leaders or any conservative media personalities who insist that there is no evidence of widespread voter fraud associated with mail-in voting.”

Benkler recommends that news outlets adopt a method that has been proven to counter disinformation. The model, called a “truth sandwich” and recently deployed by The New York Times, presents the inaccurate information from the very opening as a politically motivated lie.

Benkler offers an example: “If back in July the Associated Press had run a story about Trump claiming that mail-in balloting was susceptible to fraud, then the journalistic practice of equal time would open with Trump’s claim, followed by some Democrat saying that the claims are not true, and then another counter-argument by another Republican, and only in the fourth or fifth paragraph would there be an expert explaining that there is no evidence of this type of fraud ever having taken place.”

On the other hand, in the “truth sandwich” model, Trump would not be quoted in the opening, but rather “the first paragraph and headline explain that Trump lied today about mail-in ballots as part of his political strategy. The second paragraph is then the Trump quote and the third is the experts explaining this is just not true. That way the lie is sandwiched between two truths.” Indeed as Benkler notes, the reason Trump succeeds at “playing the media like a fiddle” is because he understands how to manipulate three key aspects of it: “elite institutional focus (if the president says it, it’s news); headline seeking (if it bleeds, it leads); and balance, neutrality, or the avoidance of the appearance of taking a side.” It is his use of these that makes Trump such an efficient agent of disinformation.

Claims that Americans are living in a post-truth era rile Benkler. In fact, he says, the very fact that this allegation is voiced in the media is the biggest Russian victory of all.

“I’m not a Russia expert, but everything I’ve read from what I consider to be good work on Russian propaganda, suggests that the primary role of Russian propaganda is to create a world where ‘nothing is true and everything is possible.’ Putin is playing us, and the whole point is to get us not to trust anything anymore. They only win if we retell their story as if it’s true.

“From the perspective of Russian propaganda efforts, if your goal is to disorient your opponents’ population, then nothing is better for Putin than the mainstream media saying we live in a post-truth age. The media is taking all the remaining trust and authority it has not yet lost to Putin and using it to say exactly what he wants.”

Only last week, Trump’s former national security adviser H.R. McMaster told the Washington Post that the biggest threat to the election is not Russia. Rather, he said, “It’s what we’re doing to ourselves. The Russians cannot create these fissures in our society, but they can widen them.”

An example of this can be seen in recent reports of Russian cyberoperations. Alleged attempts to hack America’s electoral system, some say, are not actually intended to work, but rather only to add to a sense that America’s voting mechanisms are exposed and at risk. This is termed “perception hacking,” and it is just a new way to think about psychological warfare, in which the goal is to create the sense that the enemy is much stronger than it actually is.

“Some of this stuff is so clumsy that perhaps they want to get caught, so that there will be reports saying things like, ‘the Russians are strong and you shouldn’t believe anything you read’,” Benkler says.

Facebook CEO Mark Zuckerberg prior to testifying at a 2019 hearing of the House Financial Services Committee. Credit: Andrew Harnik / AP

Post-truth politics

One thing is clear: It is far more difficult to monitor Russian intervention today than it was in 2016. There are two reasons for this. First, the Russians have improved both their messages and their operational methods. They have begun to use artificial intelligence to create trolls that appear to have original profile photos; the fake users are taking care to maintain a presence on more than one social network (even LinkedIn) in order to generate greater credibility and to complicate the task of locating them; and, in addition, by now they are not only disseminating fake news but also creating commentary.

Nothing is better for Putin than the mainstream media saying we live in a post-truth age. The media is taking all the remaining trust and authority it has not yet lost to Putin and using it to say exactly what he wants.

Benkler

Second, if in previous elections researchers had broad access to Facebook information, today, after the Cambridge Analytica affair, that access is largely blocked. Many researchers maintain that the company, citing the need to preserve users’ privacy, is preventing attempts to monitor the social networks. When Facebook takes action of its own in this regard, it does so far from the public eye. The company removes users and even whole networks without notifying anyone, not even those who were exposed to the content disseminated by those accounts.

Facebook, for its part, says that it’s fighting the problem on its own and has set up a research initiative called Social Science One, which allows people like Benkler to work with them to do productive research. However, researchers involved in the project told Haaretz that it has been mired in foot-dragging and that even those given access to data are still severely restricted.

Although the term “fake news” entered our lives only a few years ago, the RAND researchers see nothing new in it. “Actors like Russia understood long before us the power of information,” Marcellino explains. It is the continuation of the Soviet Union’s Cold War efforts to wage psychological warfare against the American public.

“They’ve been developing these techniques for over 70 years,” he says. “We, on the other hand, don’t even have a proper term for it – is it information warfare? Fake news? Disinformation? We know, for example, that true facts can also be used for these efforts,” he says, adding that “it’s funny, but in this sense they are actually light years ahead of us.”

The studies conducted by the RAND Corporation have this goal in mind: to present an American response in the war for hearts and minds. The researchers are working to create tools and new investigative methods to identify and map the information on the web, to propose defenses against it, and also to try to estimate how much damage has been done to the concepts of truth and lie among the American public.

Still, the researchers are aware that their effort is limited, above all because they don’t have access to most of the information – that which is found on Facebook. Marcellino admits this with frustration and says that the blockage of information has dealt a substantial blow to the ability to monitor the Russian intervention in the election. He also points an accusing finger at the social networks (that is, Facebook and Twitter), which he says could do more.

Another issue preventing proper research is ethical limitations: Personal user data cannot be shared with researchers, and even in cases where researchers are permitted access, they cannot share the identity of users – with the media, for example – due to the risk of exposing actual people.

Though Benkler does not doubt the methods used by the RAND researchers, he is skeptical of this methodological and ethical explanation.

“Facebook has a real paradox, if not conflict of interest, here. There is a way to check the effectiveness of these operations, but there has been almost no work done on exposure to fake news. They’re doing it for sure, but we don’t know what actually happens to the people who see it.”

For him, the explanation is simple: Facebook does not want us to know. “If they let us study this, one of two things will happen – both potentially devastating for Facebook: One is that we will find out that Trump really is in the White House because of something that happened on Facebook. The second is that we discover it made no difference whatsoever, but then what the hell is Facebook actually selling?”

The ethical limitations, he says, are just an excuse for Facebook. “They don’t want this out there and they use the privacy debate to prevent academics from answering these questions.”

He urges me and the public not to buy the narrative according to which Facebook is somehow protecting our privacy by keeping all the data about us to itself. For Benkler, the real reason they don’t want to share this information is because it could show that their entire business model is less effective than we’ve come to assume. In a sense, you could say that Facebook benefits from the fact that we live in a society that believes that content, if well-targeted, can change people’s minds. This narrative, he says, serves Facebook’s financial interests more than anything: “The bottom line is that both what Russia and even Cambridge Analytica did was never actually proven to be effective. Please don’t buy their snake-oil sales pitch, there is no evidence they moved a single voter, let alone through psychological targeting. We looked at this, and there is no science behind it, so let’s not project our anxieties on reality.”

Nonetheless, he and RAND say more can be done. For example, the RAND researchers shared with Twitter the long list of suspect accounts they located. So far, the network hasn’t done much about it. “A large number of the accounts are still active,” Marcellino says with disappointment.

However, Benkler urges us not to think of the issue of disinformation and foreign intervention as a social-media problem, or even as a technological one.

“Part of the problem is the business model of Facebook and Google – they collect too much information on us and doing research on this is impossible without violating people’s privacy. So one aspect of the problem is that this model cannot preserve the dignity of the subjects,” he says.

However, for him, the issues arising from the massive collection of data by tech giants only exacerbate much larger issues stemming from wider historical and economic shifts. “We also have a bigger problem of the collapse of neoliberalism, and the fact that racists, misogynists and fascists are harnessing the misery of millions for their own political opportunism.” That, he says, is not a uniquely American problem, and certainly not a technological one.

Regardless of whether you agree with Benkler, with RAND or with both, one thing is clear: In today’s world, it is nearly impossible to track disinformation efforts, and researchers, experts and journalists are prevented from playing a substantial role in monitoring social media – an apt metaphor, it seems, for a world in which expertise has been rendered obsolete.
