Censoring the Internet Is No Way to Make a Living

A New York-based artist-couple delves into the murky world of 'content moderators,' the poorly paid workers meant to protect us from horrific Internet images.

Franco and Eva Mattes.
Natan Dvir

“I feel a bit like a superhero when I think about how I might be keeping a child from seeing a graphic photo, but at the same time I don’t necessarily feel like it’s my place to have the power to decide that the photographer shouldn’t be allowed to display his or her work. I never speak with friends or family about what I do. They know I do online work, but they don’t know any details. They’ve never really asked and I haven’t offered anything.”

This disturbing confession, delivered by an anonymous speaker, is part of a challenging new exhibition by Eva and Franco Mattes titled “I Would Prefer Not to Include My Name.” The New York-based artist-couple have been exploring the dark sides of digital culture for two decades. The current exhibition, on view through December 6 at the small Essex Flowers Gallery on the Lower East Side, consists of three video screens that show, in an endless loop, disturbing interviews with employees of the shadow industry of Internet censorship, euphemistically called “content moderation.”

The three interviews in the exhibition, chosen from among more than 100 that the Matteses, both Italian-born and 39 years old, conducted with content moderators via Internet chat services, expose the sophisticated and exploitative way in which tens of thousands of people in places such as the Philippines and Latin America are employed to serve as a de facto “Internet police.” It’s they who decide whether the post you’ve uploaded to Facebook or the video you shared on YouTube or Vimeo can be construed as “offensive content.”

For a paltry monthly wage of between $300 and $500, these workers spend most of the day watching graphically appalling content, including abuse of animals, acts of pedophilia, beheadings, gory documentations of suicides, and other images whose existence the average surfer would prefer not to know about.

Not surprisingly, burnout is very high in this job. According to a report by Adrian Chen, published a year ago in the magazine Wired, most of these workers leave within a few months and few last longer than a year. Still, this is an industry that employs more than 100,000 salaried and freelance personnel around the world at any given time, most of them through third-party companies. But even if Facebook, Google and others don’t employ these people directly, they have a vested interest in ensuring that surfers will not accidentally encounter the latest ISIS video or images of brutalized bodies while leisurely browsing clips of cuddly kittens.

“As digital artists, one of the things we do when we are facing any content whatsoever – whether we’re reading a blog or watching YouTube – is to ask what is not there, what is absent,” Franco Mattes told me in an interview I conducted with him and his wife, Eva, in a café near the gallery in Manhattan. “For us, it’s an interesting way to map the boundaries of a given system. When we first logged into Facebook, we immediately noticed there was a suspicious lack of controversial content. I couldn’t see any neo-Nazi propaganda, or ‘white supremacy’ groups, or anything that might provoke me.

Image from 'The Others,' 2011: A young man whose head is partially shaved stares intently at the camera.
Eva and Franco Mattes

“We started looking around and couldn’t find any disturbing content on either our feeds or our friends’ feeds,” he continued.

“We came up with two possible explanations: Either there are no neo-Nazis or crazy maniacs out there – which we know for a fact is not the case – or these kinds of groups don’t want to use social media to promote their agendas, which is also not very believable. More likely, these people try to use these tools, but the kind of content they upload is rejected or quickly removed from the Internet. We started investigating this void and very quickly encountered the content-moderation industry, working to make the Internet a ‘family-friendly’ place.”

Was your work influenced by Adrian Chen’s piece in Wired about the starvation-wage workers who filter content for the Silicon Valley giants?

Eva: “We started our research a few years ago, but were deeply inspired by Chen’s piece. We asked him to serve as an advisor on this project, which he did.”  

How were you and Franco able to track down all these dozens of employees?

Eva: “The only way to get to these people proved to be posting a job ad and posing as a content company. We used the same system and methods the outsourcing companies use to hire content moderators. We ended up talking to more than a hundred moderators, mainly in Eastern Europe, but also in the U.S., Latin America and the Philippines. We interviewed them via chat – they all wanted to remain anonymous, because they are required to keep the true nature of their job secret from family and friends, and to sign nondisclosure agreements.”

But if you never met them, how do you know that their testimonies are reliable?

Franco: “I know they are reliable because they were consistent. We heard similar stories from different sources. When 40 different people tell you the same story, there is no reason not to trust them, especially given that they are anonymous and therefore have no interest in lying about their experience.”

The stories presented in the exhibition are eye-opening for anyone who may have thought that the Internet is a playing field for adults, a magical kingdom where, at worst, one might encounter offensive and racist comments or, in extreme cases, public shaming.

Given the interviewees’ reluctance to be identified, the Matteses came up with the creative solution of processing their voices, eliminating evidence of such human traits as age, gender or ethnic origin. Visually, the testimonies are read by avatars: computerized images of men and women of different ages that morph randomly every few seconds. The result is a disorienting work of video art that prevents the viewer from stereotyping the speaker. Without the ability to determine whether the speaker is male or female, a young Filipino man or an elderly American woman, all that remains is to listen to the distressing content and be exposed to the underbelly of the digital world.

'catt' by Franco and Eva Mattes, on display at London's Carrol/Fletcher art gallery.
Julian Abrams

The deeply unpleasant content and the workers’ disinclination to tell family and friends what they are doing make these jobs highly demanding and traumatizing.

“I’ve seen children in sexually suggestive clothing and poses; I’ve also seen humans having sex with animals,” one of the interviewees testifies in the video-art installation. “I try very hard to avoid the requesters [i.e., clients] that post these kinds of images, and stick with requesters I know that generally don’t have these kinds of images up for moderation. Some requesters are known for always having really graphic images, and other requesters – like the Mormon Church, for example – very rarely or never have images like this. I prefer to work for these ‘safer’ requesters, to reduce the chances of seeing the really awful images.

“In the case of the Mormon Church,” the worker continues, “we moderate images that go into the church genealogy archives and website. For those, you have to flag and remove anything that shows very much skin, a low-cut neckline, a swimsuit, people of the same sex who are acting affectionately toward each other, men wearing women’s clothing, etc. They only want approval on very modest images and absolutely nothing that could imply homosexuality, cross-dressing or anything like that.”

Another major problem raised by the exhibition stems from the fact that this is a global industry that operates in the gray zones of the law – namely, in the absence of coordination between the content-monitoring companies and the law enforcement authorities.

“I do have a common frustration,” the same worker says. “When I see an image that needs to be tagged because it is really graphic or illegal, I flag it and then move on to the next image or the next task. There is no follow-up. I never know what happens, or if anyone saw that it was flagged, or that anyone took care of it. Maybe I flag the image and they remove it from their data set, so it’s not visible online anymore, but what if there was something illegal in the image, like child pornography, that needed to be reported to the police?

“Does anyone at the company call the police? Does anyone try to help the child in the image? I have no idea. Since the requesters are almost always anonymous, I can’t contact the company to even ask these questions. I click to flag the image, but after that I don’t know if anyone ever cares.”

Scientology threats

Although it’s easy enough to understand the need to remove certain material from the Web, a disturbing byproduct of this unofficial censorship is the way wealthy and influential political actors use these companies to further their goals. Thus, for example, a worker who was employed full-time by Vimeo (“even though I never set foot in the company’s offices and never met other workers”) related that in two different cases he was asked to remove clips from the popular video site.

“There were some conspiracy-theory movie clips that we were made to take off the site,” he told the Matteses. “These videos contained documentary-style information about events like 9/11. There were also videos removed about Scientology, and the rumor was that Scientology had threatened to sue IAC [the company that owns Vimeo]. The videos that were removed dealt with reaching a high ranking in the religion, and were private documents that only those reaching that level would be allowed to see.”

The Matteses both say they were surprised to discover that many companies bow to political pressure. “We were absolutely blown away by the scale and extent of censorship online,” Franco said. “Initially, this project started as a survey of content moderation, and it emerged from our ongoing fascination with the dark side of humanity. But very quickly the research took a surprising turn when we started to learn about political censorship on the Internet. For example, when Osama bin Laden was killed, many companies ordered their content moderators to remove any video or content about the assassination from their websites.”

'Nike Ground,' an artwork by artists Eva and Franco Mattes, 2003.
Courtesy

Who is giving these kinds of orders? Who are the people controlling the content?

“We don’t know. Interviewing content moderators is like interviewing a policeman in the street – they won’t be able to tell you who designed the policy they are asked to enforce. They only know that their superior gave them orders they must follow in order to keep their jobs. We do know that these orders come from both the government and private companies, for many different reasons. A few years ago, content moderators were asked to remove a video of a Buddhist monk who set himself on fire as a protest against the Chinese government. This video was not especially violent, gory or graphic – and yet it was removed. Around the same time, Facebook started to allocate resources in order to gain more influence in China. So it makes perfect sense that Mark Zuckerberg doesn’t want to upset the Chinese government.”       

Despite the difficulty of knowing who gives the orders, the bottom line is that the decision to remove content is based on random, unclear, shifting categories. A clip that was okayed yesterday could be shelved tomorrow, and what was considered offensive two years ago might return to our lives as a viral video, as indeed happened on several occasions with clips of beheadings that ISIS managed to upload to social networks such as Facebook and Twitter.

This may sound like something from the plot of a Philip K. Dick novel, but the article in Wired revealed that truth can indeed be stranger than fiction. In 2010, the magazine reported, “Google’s legal team gave moderators the urgent task of deleting the violent sermons of American radical Islamist preacher Anwar al-Awlaki, after a British woman said she was inspired by them to stab a politician.” The result was that Awlaki’s sermons disappeared from the Web overnight as though they had never existed. (Some sites will note when material has been removed, but there are no agreed-upon procedures for this.)

You interviewed about 100 workers but eventually decided to include only three interviews in the gallery exhibition. What do you plan to do with the data you’ve collected?

“These videos are only the first samples of a much bigger project, titled ‘Dark Content,’” Eva explains. “We’re planning to publish a new episode every month on the darknet, for two reasons. First, the darknet is the place where most of the content that gets filtered by the moderators eventually ends up. This is where banned clips or photos get a second life after attempts to kill them. On the Internet, it’s almost impossible to eliminate content for good. The censored content is probably sitting on some dusty hard drive in some basement in the Nevada desert. The second reason is that we really want to encourage people to venture into the darknet to find these artworks, and to start using the darknet as an anonymous way to browse and find information.”

How would you describe the darknet for someone who has never heard of it?

Franco: “It’s a sub-portion of the Internet that requires anonymity. So the technology is designed in a way that makes it very difficult for anyone to trace where the content is coming from – who published the information and who gets to see it. Some artists use the darknet as a source of inspiration, salvaging content from the darknet and placing it in the gallery space. We wanted to offer a role reversal – from the gallery into the darknet.”

In your joint career you have created many projects under fictitious names. In fact, your real names are not Eva and Franco Mattes. Why is anonymity important to you?

Eva (after noting with a smile that she and Franco prefer to talk about their art and not about themselves, and after refusing a request to reveal her real name): “There cannot be free speech without anonymity. You can never freely express yourself if you know that you’re under surveillance, which is when self-censorship comes into play. Therefore, anonymity is the very basis of democracy. This is why voting is always anonymous. Otherwise, the voter might be subjected to different threats, pressures or influences.”

The 'Dark Content' exhibition by artists Eva and Franco Mattes at Essex Flowers gallery, 2015.
Kyle Knodell

Darko Maver dies

Given the amazingly rich history of the Matteses, who have been creating art under different names for the past 20 years – since they first met in Madrid in 1995 – their fondness for the darknet and for aliases is understandable. The couple, who moved to the United States about 10 years ago and live in Brooklyn’s Bushwick neighborhood with their two-year-old son, are considered pioneers of digital art. Not long after they met, and without any formal art studies, they started to replicate websites. In 1998, they built a digital double of the Vatican’s official website and ran it for a year, posting anti-clerical material on it, a few years before the American activists known as The Yes Men adopted a similar tactic by replicating the site of George W. Bush.

In addition to digital works that were ahead of their time, the couple had a major breakthrough in 1999, when the prestigious Venice Biennale proposed holding a retrospective of the works of Darko Maver, a Serbian artist whose provocative art on the theme of violence in the Balkans had led to his persecution and arrest by the regime. Only after Maver’s purported death in prison in 1999, during the NATO bombing, did it emerge that he was a fictitious artist who never existed. Eva and Franco, who at that time worked under the mysterious name 0100101110101101.org, revealed that Maver was a product of their incandescent imagination.

In light of this history, it’s clear why Franco and Eva – and the different names they have used – have been widely covered in the media and variously called “hackers,” “activists,” “storytellers” and “Web artists.” They themselves have reservations about these categorizations.

Eva: “I find ‘activist,’ ‘storyteller’ and ‘hacker’ very flattering, but unfortunately I guess we’re just ‘artists.’ To be an activist, you have to have really clear ideas about ‘good’ and ‘evil,’ and we’re much more confused than that. To be a hacker, you need a set of technical skills that we don’t have. And while there’s an important component of storytelling in our work, I wouldn’t claim the title ‘storytellers’ because if we were good at that, we would write books.”

You said in an interview with The Guardian that the idea of copyright is boring. Why?

Franco: “Intellectual property is substantially different from physical property. Ideas are not like objects. You can share them without stealing them. There is this great quote by George Bernard Shaw: ‘If you have an apple and I have an apple and we exchange these apples then you and I will still each have one apple. But if you have an idea and I have an idea and we exchange these ideas, then each of us will have two ideas.’ So we both get enriched, and there is no loss in the process. Mostly, copyright laws are a way to apply a system designed for objects and material things to abstract ideas while denying the inherent differences between these two categories.”

But some will argue that this is the only way for creative people – such as yourselves – to make a living by doing what they love.

Franco: “In theory – yes. In practice, the only people copyright laws protect are the mediators: publishers, agents, music labels and so forth. These are the people who earn money as a result of these laws and regulations.”  

In another current New York exhibition by the Matteses, titled “The Others,” the artists present 10,000 photographs and clips that they stole from the personal computers of individuals who used group-sharing sites – without their agreement or knowledge. The result, on view during the month ahead at the Abrons Art Center on the Lower East Side, is thousands of images from the daily lives of anonymous users: slumbering cats, the Eiffel Tower, family events, a girl baring her breasts, a couple kissing in the kitchen. In sum, a multitude of random, low-quality testimonies to people’s relentless need to document every moment of their lives. Their purpose, Franco notes, was to generate awareness of the extent to which our content is accessible.

Franco, who teaches at several art schools in the New York area, adds that many of his students think that the “cloud” is a program installed on the computer. They don’t understand that their most personal information is in fact stored on countless semi-secure servers in all kinds of remote server farms in Texas or Mexico. When it comes to the cloud, there is no way of knowing who might gain access to our content, he says.

Do you believe that the Internet should give expression to all the types of human perversions and passions, even if they are sadistic?

Franco: “I don’t want to see graphic violent content either, but I want to know that it is out there. The real issue, however, is that this is a slippery slope: Until recently, Facebook removed photos of same-sex couples kissing, and it still refuses to publish photos of female nipples. You can make the claim that people might be offended by a woman’s nipple, but personally I don’t mind seeing these images.

“The problem is that no one will ask me what my red lines are. Today, content moderation is an obscure and mysterious process. We don’t know who removed the content, and why. Without transparency, Internet users cannot know what guidelines content moderators follow, and we don’t know what happens to the content once it’s removed. Since these are private companies, they act as if they have no accountability.”

Eva: “If we care so much about freedom of speech, we have to redefine and openly debate its limitations and contours. That process should not take place in the shadows.”