‘You Nazi!’: The Man Who Wanted the Internet to Stop Invoking Hitler Has a Plan to Save Social Media

Mike Godwin, father of the internet’s most famous adage, says all is not lost for the web, but warns: ‘Never again is not a prediction, it's a moral imperative’

Trump taps the screen on a mobile phone during a roundtable discussion on the reopening of small businesses in the State Dining Room at the White House in Washington, June 18, 2020. Credit: Leah Millis / Reuters
Omer Benjakob

Most people might not know about Mike Godwin, but the internet adage bearing his name is perhaps the most famous predictor of online activity: Godwin’s Law stipulates that, “As a discussion on the internet grows longer, the likelihood of a comparison involving Hitler or the Nazis increases.” If you are active on social media, even if you’ve never heard of Godwin or his eponymous law, you’ve probably experienced it first-hand.

Godwin conceptualized his law at the end of the 1980s on the basis of his experiences as an early user of the online forums of yesteryear (so-called bulletin board systems). “I realized I could’ve formulated a more earnest version: ‘Please, please do not bring up Hitler, unless you cite your sources,’” he says in an interview with Haaretz and my new podcast Disambiguation (with Slate’s Stephen Harrison). “But I wanted to create something that had enough memetic and self-propagating power so that it would not need me around – and part of that was making it funny.”

Speaking of memes, Godwin was also the first to apply that term – coined by Richard Dawkins in his book “The Selfish Gene,” to describe a unit of cultural inheritance – to digital culture.

At 63, Godwin is one of the youngest members of what can be thought of as the internet’s founding generation. As a law student (at the University of Texas), he made a name for himself by working on one of the first and most famous hacker cases, Steve Jackson Games, Inc. v. United States Secret Service. The case helped spur the establishment of the Electronic Frontier Foundation, an organization focused on internet rights that’s still active today, and of which Godwin was the first paid employee.

While it may seem today that the internet is mostly populated by for-profit conglomerates led by a new class of billionaires like Jeff Bezos and Mark Zuckerberg, Godwin’s work, on the other hand, takes place in a different, more historical and ideological arena of the internet – that of nonprofit organizations. Among others, Godwin has worked with the Internet Society, the NGO that owns the rights to the .org domain, and served as general counsel for the Wikimedia Foundation, which supports the activities of Wikipedia and its plethora of sister projects. He even helped Wikipedia fend off a legal threat from the FBI in 2010, when the bureau demanded that the free encyclopedia remove the G-men’s logo from its article about the FBI, claiming its use amounted to impersonation of the federal agency. Rejecting that claim, Godwin sent the FBI a sharply worded yet humorous response, which he also leaked to the media. Today, we would say the letter “went viral,” and together with many of his other activities it serves as a relic of a simpler time, when the biggest challenge to the internet was legal overreach by national authorities. (The FBI backed down.)

Today, reflecting on over 30 years of digital activism, he flags other, bigger threats to the internet – namely, how social networks have eroded the public discourse so critical to democracy’s survival – and discusses why he thinks social media can still be saved. However, he says that “to some extent, it’s quite often the case that what’s purported to be a new problem that arises out of social media is actually an old problem that we dealt with – or at least tried to deal with – 20 or 25 or 30 years ago.”

What was already clear back then?

“We correctly understood that internet communications were going to be disruptive in various ways, because for most of human history, having access to a mass medium was something that required a lot of capital – you had to own a newspaper or radio station – to reach audiences of tens or hundreds of thousands or even millions of people… so obviously this was going to be disruptive.

Mike Godwin. Credit: Say Mony / Voice of America

“Also, like anyone who spent time online, I knew that when other people got online and especially when they were new to that experience, they would often get into quarrels and say bad things about people. Once you learn what people actually think, that’s much more upsetting than walking around and assuming people think like you – apropos Godwin’s Law – and when you realize how horribly different other people’s opinions are, you may have a reaction to that.”

What, then, is a new and different problem?

“As the internet became public, and people who were not at universities or think tanks or the government [gained] access to it, one would properly expect many of these same behaviors to manifest themselves at more and more levels. But I think the part that wasn’t predictable was the consolidation of media platforms and certain services.”

You mean the emergence of social media giants like Facebook?


“Because the barrier to entry for creating a blog or starting your own internet service or forum was relatively low, we thought there would always be thousands or even millions of them. Ten or 15 years ago was the heyday of blogs – anyone could have a blog or use a blog-hosting service, and there were many of them out there. This could and did lead to a multiplicity of voices.

“What couldn’t be predicted or anticipated is that social media platforms like Facebook or Myspace or even Yahoo would lower the barriers to entry so much, allowing you to connect very quickly to networks of friends and family and like-minded people, [that people would not need alternatives to them]. That led to consolidation, and there are economies of scale that come from that, and once you have a dominant platform, it attracts advertising dollars. I just don’t think we thought that someone like Mark Zuckerberg would have more users than most nation-states – except perhaps China – have citizens.

“The ’90s was a time when everyone had a homepage and [the web] was still a vital forum for discourse – and I don’t think it was that great. It was great – better than people stewing silently at home [like now] – but all the behaviors, the bad ones but also the good ones, were already apparent in the early days.

“The difference is that once you have dominant platforms, they can easily be captured – either by advertisers, for commercial purposes, or by political interests, which raises a different set of problems. It’s no wonder that Facebook and Zuckerberg feel like they are victims of their own success in that respect.”

Facebook CEO Mark Zuckerberg testifying before Congress on July 29, 2020. Credit: Graeme Jennings / AP

Sorry, but I want to disagree with you: They are not really victims of what became of their platforms, because they set them up that way. For example, if you look at Tim Berners-Lee’s fundamental decision to entrust the World Wide Web to a nonprofit, it created structural or legal safeguards that may not work for everyone, but do provide a different ethos and model than what Zuckerberg is doing.

“I think that on the one hand, there is a role for social organization that is not just about – or not primarily about – making money. The NGO world is important in that respect. But these NGOs need money… Everybody’s got to raise money if they are going to do the work they feel needs to be done.”

“Traditional journalism is also dependent on advertising, which typically relies on capturing audiences with appealing [editorial] content and hoping they look at your ads. We don’t want to outlaw newspapers just because they are for-profit, or tell The New York Times they cannot continue to exist if they sell ads.”

Culture of disinformation

One of the most interesting claims Godwin makes against social media giants like Facebook has to do with a new and unique problem he identifies in contemporary internet culture: disinformation. For him, Facebook has shirked responsibility by framing the issue as one of free speech – proposing to deal with it by way of content moderation – instead of treating it as what he views as a cybersecurity problem, if not a national-security one.

“I am very critical of Facebook and what they’ve been doing with their oversight board... they have not been forward-looking,” he says. Facebook announced in May the creation of a so-called ‘supreme court’ to deal with content issues on the site, in the wake of years of criticism during which it had done little to stymie hate speech and so-called fake news. The oversight board is to be charged with deciding when individual pieces of content should or should not be displayed on the site, but for Godwin, this misses the point.

“It is the nature of the human condition that there is going to be a lot of misinformation out there, as people misunderstand an issue or try to understand stuff that is beyond their expertise… We are inherently vulnerable. But disinformation – the deliberate attempt to mislead people by misstating facts – is something [new, and] we need to do a lot to correct for it.”

A billboard depicting Trump as Pinocchio outside the White House, May 28, 2020. Credit: Eric Kayne / AP

Perhaps there is also room for regulatory intervention? For example, as President Trump has suggested, by repealing Section 230 of the Communications Decency Act, so that online platforms are treated the same way as publishers?

“Section 230 was meant to empower the internet companies to undertake content intervention in an ad hoc manner... But here’s where things went wrong: The internet companies correctly understood that, to the extent they intervene on content sometimes... it raises expectations of more content moderation and prevention – and that doesn’t scale well for big platforms like Facebook. So these companies decided to be hands-off, because if and when they intervene, people will start complaining that they either censor too much, don’t censor enough, or censor the wrong stuff.”


For Godwin, whatever Facebook still has to do, people also need to learn to be much more critical consumers of the content they encounter online: “We taught people how to fix their own car – that’s really hard! I think people can also learn to do this basic, obligatory citizen stuff – like being more analytical and critical – but we don’t have a culture that encourages those things.”

We don’t have that culture because social media is geared toward interaction – not toward fostering real, critical debate. When you open Facebook, the first thing you see is an empty box that asks you, ‘What’s on your mind?’ Facebook wants me to share my feelings, and therefore I find it hard to believe they can be the ones to promote critical thinking.

“I’m just very cautious of any characterization of the media landscape as being pre- or postlapsarian... If you look at what Facebook has been trying to do with its oversight board, there is a bunch of criticism that one could have, since they haven’t really done any work yet.

“One critique is that [Facebook] confines itself to content moderation issues – but that’s not the only source of complaints about social media platforms and tech companies. There are also tons of complaints about how user information is handled, and that there is this trove of info about each of us as consumers that Facebook and advertisers have access to, and that we don’t have much control over.

“The second issue is [political] microtargeting, where you’re aiming for tiny, tiny groups of weirdly specific demographics and [presenting an ad whose] purpose is to suppress their vote or get them to vote against somebody – I think it’s reasonable to be creeped out by that.”

In your book “The Splinters of Our Discontent: How to Fix Social Media and Democracy Without Breaking Them” [2019], you suggest setting up a code of ethics for technology firms as a wider-ranging solution to all these different issues.

“I think the tech industry needs to respond to the criticisms not by boardroom-level reform, but by beginning with intra-industry reform: There needs to be [a situation similar to the one in which] all lawyers and doctors share an ethical position with a common language that allows everyone to understand when you’ve done something right or wrong… Maybe we still distrust lawyers, but at least we have a common language to define when they are engaged in malpractice.”

Godwin envisages a binding ethical code, written on the basis of what he terms “consensus between all the relevant stakeholders” – from tech companies to the public and lawmakers.

“This has been a [successful] pattern, in Western democracies at least – once you have a well-developed code of ethics within a profession, it is really easy to get the government to buttress that with enforcement frameworks, so that if you violate medical ethics, it’s not just that you don’t get to hang out with other doctors anymore – you lose your license to practice medicine. True, not everyone trusts lawyers or even doctors, but it does give you a framework to state your complaints and have them addressed.”

Karl Popper. Credit: LSE Library

Open society principles

For Godwin, perhaps the best example of an internet project with a clear ethical code is Wikipedia, where the community of users is also active in defining the project’s policies and goals. For him, such ethics go to the core of what the internet can be.

“Wikipedia has many flaws. But the reason it’s so successful... is that the principles around which the society of Wikipedians has organized itself are critical and positively oriented. It’s basically a lot of ‘Open Society’-type principles. I use ‘Open Society’ quite literally, from Karl Popper. One of the things Popper does in [his book] “The Open Society and Its Enemies” is to say that, instead of trying to figure out who is the right person to be in charge of society, you build institutions so that even when it’s the wrongest of the wrong who is in charge, the society still survives – it’s robust. And around the world today, societies are undergoing testing of exactly this type.”

In light of this new political reality, is it time to update Godwin’s Law?

“I’m the father of the law, but my child is now about 30, so I’m not in charge. Godwin’s Law has its own life now and sends me a postcard at Christmas; it has its own autonomy. I think people have a misunderstanding of what it proves, and I say the emergence of the comparison [between Hitler and contemporary leaders] shows how accurate it is.

“The purpose of Godwin’s Law was primarily to underscore what it really means to ‘never forget,’ and what it really means to remember the Holocaust. What it means is that you don’t invoke it liberally as a so-called ‘trump card’ in a public debate. Hardly any politician alive is like Hitler. So, you don’t want to see that comparison trotted out routinely.

“That said, obviously when you look around the world – in Brazil and in the Philippines, and in my wife’s home country of Cambodia, or in Myanmar – you see a bunch of different governmental actions or government-sanctioned actions that are hugely destructive, and they look like examples of totalitarian regimes. So you don’t want to say that it can never happen again: ‘Never again’ doesn’t mean it can’t happen again; it’s a moral admonition, not a prediction.”
