The former director general of Israel’s Justice Ministry, Emi Palmor, is the most senior Israeli on Facebook’s Oversight Board. In an interview, she explains why she believes the board can play a positive role and urges Facebook to take more responsibility.
The oversight board on Wednesday upheld the social media giant’s suspension of former U.S. President Donald Trump, but said the company was wrong to make the suspension "indefinite" and asked it to reassess the penalty.
"The Board has upheld Facebook's decision on Jan. 7 to suspend then-President Trump from Facebook and Instagram. Trump's posts during the Capitol riot severely violated Facebook's rules and encouraged and legitimized violence," it wrote in its decision.
However, it further noted that they "found Facebook violated its own rules by imposing a suspension that was 'indefinite.' This penalty is not described in Facebook's content policies. It has no clear criteria and gives Facebook total discretion on when to impose or lift it."
Palmor, who in the past led initiatives to address racial discrimination in Israel and advance access to justice via digital services and platforms, justifies the company’s blocking Trump’s account. However, she criticizes Facebook’s inconsistent content management policy, which she says discriminates among users.
“Trump’s Facebook and Instagram accounts still exist,” Palmor says in an interview with Haaretz. “He hasn’t been removed from the platform, he just can’t go on posting content to his accounts. That’s something Facebook invented just for him.”
“The board is demanding that Facebook act consistently and fairly and apply a uniform policy toward all its users – whether they’re leaders or ordinary people.”
'Facebook made a commitment'
Facebook’s oversight board, which began operating in October 2020, consists of 30 high-profile judges, journalists, social activists and former state leaders.
It is financed by an independent body based on a $130 million trust provided by Facebook founder Mark Zuckerberg to cover the first six years of operation.
The trust set up a subsidiary company that employs the board members, who are committed to working 15 hours a week and are supposed to handle 60 cases a year, to be selected by the board itself. The board members are reportedly each paid a six-figure annual salary. The board is independent, and Zuckerberg promised that its decisions will be binding. Trump’s case was one Facebook asked the board to address.
“In our discussions we use the rules of international law – which apply to large companies. We use legal tools like the UN’s Universal Declaration of Human Rights and we talk about proportionality. Our goal is to make it clear that leaders don’t have privileges in practicing freedom of expression compared to ordinary users, but a leader does have greater influence and there are damages that must be taken into consideration,” says Palmor.
The entire process is accompanied by legal aides, and decisions are written in a number of languages to make them accessible. The decisions will be translated into Hebrew soon, so Israelis will also be able to learn about its decision-making processes, she says.
“Facebook is committed to abiding by the board’s decisions – we’ve already established that they were wrong, and they acted in accordance with our recommendations. They are obliged to respond within 30 days, so we’ll see. The board’s power is now mainly manifested through the media coverage of our decisions. Without following up and spreading the board’s messages, we won’t be able to exert pressure on the company to change improper policy.”
‘I’m suspicious too’
This is the second time the board has published decisions. At the end of January it issued its first rulings, overturning six of the seven Facebook content decisions it reviewed. In other words, the board reversed Facebook’s decisions to block posts on various subjects and ordered the company to reinstate them.
“The intention is to publish reports and set high demands,” says Palmor.
“The board consists of high quality, balanced people, not populists or sycophants. They shine a light on the company’s weak points. It’s time Facebook grows up and realizes there’s no choice. To deal with such a scope of content management it will have to reach into its purse. It will have to invest in both personnel and technology and it will cost a lot of money.”
The board wrote in its recent decision that Facebook had been asked 46 questions about Trump during the investigation process, nine of which it declined to answer.
“That angers us, we don’t intend to make allowances and let them cut corners,” she states.
“The burden of proof is on us. It’s not something whose impact we’ll see tomorrow, but we expect to see results in the coming year. The company has six months to respond in Trump’s case, and by then it will have to form a clear policy and won’t be able to evade it.”
Despite the cynical claim by many critics that the board serves as a fig leaf for a company that in any case doesn’t act transparently, Palmor believes the board will make a difference in Facebook’s behavior.
“I too am suspicious, but at the same time I see it as a very interesting experiment. The more I got to know the board members, the more I understood that their reputation is very important to them. They all hold other positions, so none of them will sell herself or himself to Facebook alone,” she says.
The problem with ruling on Trump’s suspension is that the case arrived when the board had only just begun operating, before it had accumulated a track record of decisions the public could point to.
“We want to understand how they make a decision to block someone,” says Palmor. “To avoid making mistakes, we must understand what this person did wrong. The problem runs in two directions – most users are frustrated, but they haven’t read the community rules either, so it’s time they do that. On the other hand, Facebook cannot act without giving any explanation. It must learn to explain why it blocks people.”