Opinion

Why I Refused to Install a Baby Monitor in My Daughter’s Bedroom

People who are monitored start to behave and think differently – and no one knows it better than Israelis

A baby monitor. Credit: Shutterstock

Last month, only a few days separated the publication of TheMarker’s list of the 100 most influential people in Israel – which included top executives of the surveillance and intelligence firms Black Cube and NSO – from a report in Haaretz about a tender issued by the National Insurance Institute for collecting information via the web about recipients of NII allowances.

On the one hand, there is the desire of a government agency – whose purpose is to ensure the public’s welfare and protection – to maintain incessant surveillance of citizens, and on the other hand, the extolling of representatives of the industry that makes such surveillance possible: covert monitoring contractors who are willing to provide espionage services to all manner of people, from banking tycoon Nochi Dankner to Hollywood mogul Harvey Weinstein. The recent comprehensive investigative report in this paper, about Israel’s extensive spyware industry and its exports, only heightens the oppressive feeling one is left with.

Israel is happy to adopt surveillance as a way of life. Of course, it’s not the only country that does so, nor is it the country with the most intensive surveillance (the United States and Britain are no less active on that front). But Israeli surveillance comes with a particular aftertaste.

When we talk about the “startup nation,” it’s easy to forget how closely intertwined the technologies being developed here are with the security-intelligence establishment. “Graduates of 8200” – referring to a vaunted Israeli army intelligence unit – is a description that accompanies many employees in local high-tech. But the implication of the term is not only of an “army programmer”; it’s of an “army programmer who has participated in creating systems for mining personal information in incomprehensible quantities.” We should pause to consider how far this view of the world – according to which personal information is simply “out there” and all that’s needed is to find the appropriate technology to locate it – has trickled down into the business-security consensus in Israel.

Many people ridiculed the letter signed by Unit 8200 reservists in 2014, in the wake of Operation Protective Edge in Gaza, in which they declared that they would refuse to collect personal information that would be used against Palestinians. Pearls of wisdom were bandied about at the time, mocking them as personifying “Tel Avivism,” as consumers of espresso and sushi. Others were angry at the fact that combat personnel endanger their lives, and denigrated the “moral difficulty” entailed in working in cushy air-conditioned offices, which paves the way to a career in high-tech and fulfillment of the Israeli middle-class dream.

But those who listened carefully could grasp the distress. Nothing foments more despair than the intimacy that’s forged within the framework of blanket surveillance, between those being tracked and the trackers. After a time, the choice is between rebelling against the system or yielding completely to that way of thinking.

My grandfather, who grew up in the Soviet Union in the waning period of Stalinism, once told me that for years he used to break off every conversation he was holding, however innocent it was, if someone he didn’t know approached within a few meters of him.

A rendering of the Panopticon, devised by English philosopher Jeremy Bentham and described by French theorist Michel Foucault.

An impressive metaphor that originated in a concrete plan of action is that of the Panopticon, devised by the 18th-century English philosopher Jeremy Bentham (as described by the French theorist Michel Foucault). It is a round building with observation posts in the center and open cells that allow an observer, such as a prison guard, to keep an eye on a large number of people simultaneously.

But Bentham didn’t conceive the Panopticon solely as a prison. In his thinking about scientific efficiency, he wanted to use this model in the creation of schools, workshops, hospitals and any other institution in which people of all types need to be under observation and inspection. An additional element, which is particularly relevant for the surveillance industry in Israel, is that a special system of shutters and lighting was intended to prevent those in the Panopticon cells from knowing whether they are being observed, and when. People who constantly suspect that their behavior may be monitored will be more circumspect and more effective subjects, in terms of the sovereign ruling them, than those who know when they are being watched and when they are not.

When our first child was born, my partner and I wanted to install a baby monitor in her bedroom, to allow us to watch over her even when we were not by her side. At that time, most of the available products were already “smart” and networked (IP cameras connected to the web), and offered the possibility of broadcasting video footage directly to the smartphone, with recording options.

I was vehemently against that. In the end we bought a simple radio-based device with only a microphone.

My opposition stemmed not only from security reasons (though I was indeed appalled by cases in which hackers penetrated those devices). Most of all I was apprehensive that our daughter would develop, from a pre-aware stage, as an object of surveillance. I still can’t explain exactly how I expected an infant to understand or internalize the idea that someone would always be watching her, from inside the room or elsewhere, but I was concerned that she would understand nevertheless. People who are monitored start to behave differently and also to think differently. What’s private disappears and great mental energy is invested in constant self-examination: What am I revealing? How can it be used against me? I couldn’t know whether my daughter would grow up differently from someone who wasn’t raised under surveillance, but I didn’t want to take the risk.

This situation is termed “liquid surveillance” by the researcher David Lyon and the philosopher Zygmunt Bauman (who co-wrote a book with that title) – referring to means of surveillance that trickle into every opening of modern society, not only through cameras but also, and indeed mainly, by the constant collection and cross-matching of information and metadata. Even now, as I write these words, a slight apprehension unsettles me. I am here voicing criticism, albeit indirect, of an industry whose purpose is to extract details about people from their activity on the web and elsewhere, and to use it against them.

I don’t think that Black Cube and NSO are the problem. They are only its symptom. But if people there don’t like what I’ve written, those companies can cause me plenty of damage, at least in terms of the mental distress that accompanies the exposure of personal details. If my surfing history or the WhatsApp messages I have sent my partner were to be leaked tomorrow, there wouldn’t be much I could do about it, just as a citizen who’s suspected by the NII will find it almost impossible to cope in the face of the technological surveillance mechanisms deployed against him – “black boxes” whose contents can’t be known. And that’s exactly the rising power of the Black Cube nation.

Alex Gekker is a lecturer in new media and digital culture at the University of Amsterdam.