A dark cloud is casting a shadow over the next Knesset election. The 2016 U.S. presidential election and the Brexit referendum in Britain that same year showed the extent to which Western democracies are vulnerable to online campaigns combining disinformation (or fake news), focused marketing through social media messages and armies of bots and fake profiles promoting secret agendas. Just last week Reuters reported on the scope of the Iranian influence operation, which included fake accounts in 11 languages on various platforms.
Headlines over the past few months have trumpeted the bots (automated accounts that mimic human users to shape opinion) uncovered on Twitter, but they are no more than twigs in a huge forest of influence and fake news operations on the internet.
The real problem is that the bots – which enable the automatic generation of certain messages or advocacy of certain ideas – are not working alone. They are part of an arsenal of tools to influence discourse. Some of these tools are legitimate, such as recruiting communities, and some are less so, like “real” profiles working for parties with vested interests or false profiles like that of “Adam Gold” (a popular right-wing activist whose profile was removed by Facebook after it was revealed to be fake). The more fake likes and followers crowd into the arena, and the less transparent they are, the foggier the picture becomes.
Wide-ranging research conducted at Oxford University, which compared social media manipulation in 28 countries, revealed a general rise in the use of bots, but, interestingly, in Israel no such activity was discerned. On the other hand, the study found activity by accounts belonging to parties with clear vested interests. “Mobilization of ideological communities is part of human political behavior, which is not unique to the digital age,” says Dr. Anat Ben-David, head of the Open Media and Information Laboratory (OMILab) at the Open University. “But the use of disinformation or the propagation of such tools is newer,” she adds.
'Too bad you’re not our president'
Prime Minister Benjamin Netanyahu, or more specifically, his Facebook page, illustrates one of the strange ways the internet works.
Ben-David and Open University data scientist Dan Bareket analyzed all the responses to posts by Knesset members on Facebook over a two-year period (2015–2017). They discovered that, on the one hand, political discussion on these pages was quite random – 49.9 percent of responders to political posts responded only once. On the other hand, discussion was influenced by a very small group of users, with 5 percent responsible for some 70 percent of responses.
Ben-David and Bareket divided the responders into additional groups: 18.54 percent responded to only one politician. Another 3.5 percent were “basers” – supporters of that politician’s party – who responded to various politicians all belonging to the same party. Seven percent were “campers,” meaning they responded to politicians from different parties on the same side of the political map, and 16.7 percent responded to politicians from different parties and camps. This last group – together with a prominent minority on the web – conducts the critical discourse. Indeed, Israelis love to argue: Critical responses accounted for a quarter of all responses the researchers reviewed. Moreover, the same 5 percent were responsible, on average, for a third of the responses on the MKs’ Facebook pages.
The activity on Netanyahu’s Facebook page dwarfs that of all the other Israeli politicians, accounting for half of all responses. A total of 350,000 people responded to his posts – more than three times the number who responded to posts by the next most popular Facebook pol. That would be Yesh Atid’s Yair Lapid, who garnered 100,000 responders.
But the strangest finding involves the one-time responders and fans (supporters of one politician). Of this group, 48 percent post warm responses in English such as “God bless you” and “too bad you’re not our president,” and disappear. The number of such posts is estimated at 160,000 – more than the total number of responses to Lapid’s Facebook page. Similar comments come from Netanyahu’s fans, who account for a quarter of the responses to his page, as opposed to an average of 10 percent of the responses to posts by other politicians.
The Open University researchers noticed one characteristic shared by almost all of the one-time responders and many of the others (most of whom are not bots): They are connected to Christians United for Israel, an evangelical group headed by Pastor John Hagee, an outspoken supporter of Israel – with at times strange ways of showing it. In a sermon, Hagee once expressed gratitude to Hitler for helping to bring about the establishment of the Jewish state. Hagee’s organization, which fosters warm ties to Netanyahu, also contributes to the far-right Israeli group Im Tirtzu.
Christians United for Israel is considered one of the most important evangelical groups in the world and conducts significant digital campaigns, some of which have as their goal the sending of thank-you letters to Netanyahu.
On the face of it, there is no problem with that. But the very dissemination of posts by sympathetic responders heavily shapes the discussion. Even if criticism is voiced, it will be swallowed up in the cascade of responses by Hagee’s army of commenters. Ben-David adds that the one-time responses make it easier for the operators of bots, saving them the need to work to build credibility.
'A vicious cycle'
Leftist activists will certainly be infuriated at what they see as the hypocrisy of the prime minister and Im Tirtzu, which unceasingly accuse the left of working with foreign organizations and governments, while all the while they are getting extensive support from the Christian right. But this activity cannot be labeled illegitimate in the same way that the use of bots could be.
“A lot of money goes through the disinformation industry,” Ben-David says, explaining that the more time people spend on social media, the more politicians have to increase their presence and invest more money in this avenue. “A vicious cycle is created in which politicians depend on the platforms for disseminating information, and the platforms, like Facebook, YouTube and Twitter, are the main beneficiaries,” Ben-David says.
Still, a major portion of the debate takes place outside the parameters of social media, in closed groups communicating through chats on WhatsApp and Telegram, and of course in real, face-to-face conversations. “Anybody who has a little money can run bots, or go to a PR firm that will magnify the discourse,” Ben-David told Haaretz, but that is not necessarily sufficient to set the agenda. “It’s not enough just to slant the discussion over time; one cannot ignore the very deep foundations of existing networks, such as churches, yeshivas or other strong organizations.”
Ben-David made a list of companies that offer clients shortcuts to reach the pinnacle of popularity on social media. Only a few of them offer the option of buying Twitter followers. Most prefer to provide various promotional services on Facebook, Instagram and YouTube, as well as other platforms like SoundCloud, with prices ranging from 10 shekels ($3.50) for 10 followers on Facebook to hundreds of thousands of shekels for other services. And while Twitter bots may look like the next big thing today, Facebook is far more influential in shaping the discourse – and its use by biased parties is already here.
In June the Israel Television News Company reported on a venomous Facebook campaign against Yair Lapid which used various Facebook pages under different names to attack the Yesh Atid leader. The report found that the campaign was run by a company called Spotlight Political Research, whose major client is the Israeli Labor Party. The Labor Party confirmed the report, but said that it saw no problem with the method used. “We are trying to reach new audiences, especially a young audience that consumes digital media,” the Labor Party said in response. “It goes without saying that everything was done according to the law, and was reported and overseen as required.”
But Israeli law is having trouble keeping up with the technology. “The legal framework is of course the Election Law (Propaganda Methods), which is intended to ensure fair dissemination of election propaganda in an election campaign. But as we know, the law does not deal at all with propaganda on the internet,” says Prof. Niva Elkin-Koren of the Haifa University law faculty. Elkin-Koren says the Supreme Court has repeatedly stressed that the current law is archaic and should be amended; so did a committee headed by former Supreme Court President Dorit Beinisch, which last year recommended applying the law to the web.
“I propose thinking about political bots as a kind of digital megaphone that magnifies dissemination of the message,” Elkin-Koren says, adding that bots do not necessarily have to be disqualified outright. The use of bots, she says, raises concerns that public opinion will be based on a misleading presentation of the popularity of certain positions (thus strengthening extreme positions that would otherwise remain on the sidelines). “On the other hand, of course, there is a concern that prohibiting the use of bots will exclude minorities whose voices it is important to hear in public discourse,” Elkin-Koren adds.
“The main problem of bots is, of course, the lack of transparency,” Elkin-Koren says. It’s not hard to see who’s using a megaphone, “but if we can’t distinguish between real people in civil society taking part in the elections and bots, we might give added weight to messages spread by bots.” That is why, Elkin-Koren says, there are legislative initiatives in the works to ensure that bots and their messages are marked as such.