The (Honest) Truth About Dishonesty:
How We Lie to Everyone − Especially Ourselves, by Dan Ariely. Harper, 285 pages, $27

Raised in Israel and now a professor of psychology and behavioral economics at Duke University, Dan Ariely is one of those best-selling, globe-trotting, charm-the-pants-off-the-reader authors who use the results of psychology lab experiments, engaging storytelling and pop-culture references to explain to us why we humans behave in the seemingly irrational and venal ways that we do (and, implicitly, what we should do about it). It’s a fraternity that also includes Steven Levitt and Stephen Dubner (authors of “Freakonomics”), Malcolm Gladwell (“The Tipping Point”) and, in his recent book “Thinking, Fast and Slow,” the godfather of behavioral economics, Israeli-born Nobel laureate Daniel Kahneman.

In his new book, “The (Honest) Truth About Dishonesty,” Ariely makes the case that various forms of what he calls “dishonest” behavior (more on that label below) are the result less of rational, cost-benefit calculation than of a range of irrational, subconscious emotions and influences, including conflicts of interest, self-deception, rationalization, “everyone-else-is-doing-it” reasoning, psychological distance, fatigue and even altruism. In support of his argument, Ariely offers a range of fascinating data − culled from his own and others’ ingenious empirical studies − as well as a collection of mostly amusing (though sometimes self-indulgent) personal anecdotes and cultural commentary.

Much of this is quite fascinating. For example, in one experiment, subjects were given designer sunglasses to wear. Some were told that they were authentic, some that they were counterfeits, and others, nothing (in fact, they were all authentic − Ariely, ever the charmer, had somehow convinced the Chloé company to lend him $47,000 worth of luxury goods). The subjects were then given a task designed to test their honesty by self-reporting their success in solving various number problems. Ariely found that while a still-significant 30 percent of those who had been told their glasses were the real thing dishonestly reported solving more problems than they actually had, a whopping 74 percent of those who believed they were wearing fakes engaged in misreporting. Ariely speculates that “once we knowingly put on a counterfeit product, moral constraints loosen to some degree, making it easier for us to take further steps down the path of dishonesty.”

Similarly intriguing is Ariely’s tongue-in-cheek discussion concerning the common phenomenon of college students’ sending e-mails to their professors reporting the sudden “death” of their grandmothers. According to a study conducted by biology professor Mike Adams, and related by Ariely, “grandmothers are ten times more likely to die before a midterm and nineteen times more likely to die before a final exam.” The vast majority of the students, of course, are lying. Consistent with his view that fatigue makes people more likely to cheat and lie, Ariely theorizes that “students become so depleted by the months of studying and burning the candle at both ends that they lose some of their morality.” Perhaps. But a better explanation may be simply that, as the day of the exam approaches and study time grows short, the prospect of failure rises, and students have more incentive to cheat. Here, as elsewhere, Ariely seems to succumb to a kind of confirmation bias, finding in the data support for just the conclusion he’s looking for.

Throughout the book, Ariely seeks to puncture the idea, first advanced by University of Chicago economist Gary Becker, that people cheat and lie because it’s in their rational interest to do so. Surely Ariely is right that people do not engage in criminal activity solely as a result of weighing the benefit to be gained from criminal behavior against the expected punishment if caught and the probability that they will be caught. But Ariely misses an important element of rational calculation by failing to acknowledge that people fear not just actual jail time but also the stigma and ostracism they will suffer if their wrongdoing is detected. Indeed, Ariely’s own studies, which he says support the view that cheating is “infectious,” can in some cases just as easily be interpreted as supporting the view that cheating is more probable when detection is unlikely.

Horn-blowing

Another, albeit minor, complaint I have about “The (Honest) Truth About Dishonesty” is that, charming as Ariely often is, he has a tendency to blow his own horn a bit too much − about his academic accomplishments, his famous friends, his busy speaking schedule. For example, Ariely tells us that a “typical itinerary” of his recently took him from his “home in North Carolina to New York City, then on to São Paulo, Brazil; Bogotá, Colombia; Zagreb, Croatia; San Diego, California; and back to North Carolina” and a “few days later,” to “Austin, Texas; New York City; Istanbul, Turkey; Camden, Maine; and finally (exhausted) back home.” There’s really no reason to include all of this, especially when it has almost nothing to do with the otherwise interesting point he’s making (about the “infectiousness” of immorality); indeed, it distracts from the interest of his argument, and a good editor should have taken it out. (Nor, I suspect, do most readers really need to be told that San Diego is in California or that Istanbul is in Turkey.)

My biggest problem with Ariely’s book, however, is that it tends to treat a collection of key moral concepts − including dishonesty, cheating, deception, lying, conflicts of interest and stealing − as if they were interchangeable.

They are not. As we normally understand such concepts, cheating consists of violating a rule with an intent to obtain some competitive advantage. Lying involves making a statement that is not only intended to deceive, but is also literally false. Other forms of deception do not require literal falsity. Stealing consists of unlawfully depriving another of her property rights. Conflicts of interest typically involve a breach of loyalty. Dishonesty is something of an umbrella term that signifies some moral defect, but need not involve deception, stealing, cheating or disloyalty.

Ariely, unfortunately, makes no such distinctions, and the result is a good deal of conceptual confusion. To take just one characteristic example, Ariely says: “Companies also find many ways to cheat. ... Think about credit card companies that raise interest rates ever so slightly for no apparent reason and invent all kinds of hidden fees and penalties (which are often referred to, within companies, as ‘revenue enhancements’). Think about banks that slow down check processing so that they can hold on to our money for an extra day or two or charge exorbitant fees for overdraft protection and for using ATMs. ... [I]t is important to discourage ... [these] forms of dishonesty.”

No one, certainly no consumer, would disagree that it’s annoying that credit card companies and banks engage in these kinds of practices. But before we can determine whether any of this involves cheating, or even dishonesty, we need to know more facts. For example, was the credit card company legally prohibited from raising its rates or adding fees? If so, then we can justifiably say that it cheated. But what if no rule was broken? In that case, we might still say that the company was behaving aggressively or avariciously or perhaps even exploitatively − but not that it cheated, and perhaps not even that it acted dishonestly.

These kinds of distinctions matter, in law and in morality. Consider someone who’s misrepresented his income or claimed deductions he is not entitled to. In moral terms, we can say that this person has cheated and, in legal terms, that he has committed the crime of tax evasion. By contrast, a person who aggressively claims every deduction and exemption that he’s entitled to has engaged in nothing more than lawful tax avoidance − not a crime, not cheating, and perhaps not even dishonest behavior. Similarly, one who, while testifying under oath, lies about a material matter, has committed perjury, a serious crime that, in the United States, can lead to a punishment of up to five years in prison. But one who offers literally true testimony, even when it is misleading, has not lied, and therefore has not committed perjury.

Such distinctions matter from a psychological perspective as well. The fact that “everyone else is doing it” makes people more likely to cheat on Ariely’s numbers problems doesn’t necessarily tell us anything about the effect of such rationalization on their propensity to commit adultery or illegally download software from the Internet (to use two examples of what Ariely seems to regard as cheating). Similarly, the fact that people who were tired were more likely to cheat on a multiple-choice quiz about the history of Florida State University tells us little or nothing about their inclination, when tired, to steal or to run red lights (again, all examples given by Ariely).

The point is not that Ariely’s experiments aren’t valuable or intriguing; they are both. But his failure to recognize significant differences among various forms of unethical and antisocial behavior, and to treat them all as if they were fungible, causes him to claim too much for the significance of his data. In the end, Ariely’s account misses much of the subtlety that makes our moral lives so richly perplexing. And it may blind him to the need for further study.

Stuart P. Green is the author of “Lying, Cheating, and Stealing: A Moral Theory of White Collar Crime,” and, most recently, “Thirteen Ways to Steal a Bicycle: Theft Law in the Information Age.”