A professor of behavioral economics and psychology at Duke University, Ariely is the author of Predictably Irrational: The Hidden Forces that Shape Our Decisions, and The Upside of Irrationality: The Unexpected Benefits of Defying Logic, both New York Times bestsellers. Ariely’s new book, The (Honest) Truth About Dishonesty, explores some of the surprising reasons we lie to each other, and to ourselves. Raised in Israel, Ariely holds Ph.D.s in both business administration and psychology. Wired senior editor Joanna Pearlstein spoke with Ariely as part of the Live Talks Business Forums series at the City Club of Los Angeles.
Wired: One of the key ideas in your book is that in order to combat dishonesty, we should understand the reasons people lie and cheat. Break it down for us: Why are we dishonest?
Dan Ariely: If you’re a fan of a sports team, it’s easy to see a call against your team as the referee being evil or stupid. You need to have a motivation to see reality in a certain way. The second thing you need is flexible rules. If the rules are incredibly strict, you can’t bend them in any way. But if the rules are slightly grey, there’s a zone in which you can cheat. And finally, you need a way to rationalize your actions for yourself.
I mean, today you probably will have some opportunities to take stuff and put it in your backpack, and nobody would know. You could take some silverware, there’s cups, there’s all kinds of things you could take home with nobody noticing, and no probability of being caught. If you’re a rational economist you would say, you should take all those options, there’s opportunity! But of course you think differently — if you did that, you’d feel you’re a bad person, and that’s what’s stopping you. The curious thing is that what stops us doesn’t stop us completely. We have a fudge factor, an ability to rationalize some dishonesty, and as long as we cheat just a little bit, we can still rationalize it.
Wired: What implications does this have for our ability to prevent dishonesty?
Ariely: If you thought that crime or dishonesty is driven by a cost-benefit analysis, then you have some very basic solutions — for example, put people in prison. And people who were going to commit a crime would say, ‘Okay, I’ll go to prison, not worth it.’ I’ve been talking to big cheaters, including people who have been to prison, and I tell you, nobody I’ve talked to has ever thought about the long-term consequences of their actions. How many people who did insider trading thought about the probability of being caught and how much time they would get in prison? The number is incredibly close to zero, maybe exactly zero. What will happen if we increase the prison sentence? Basically nothing, because it’s not part of their mindset. What we need to understand is the process by which people become dishonest.
We can look at a cheater and say, we would have never been able to do that. But when we look at the long sequence of events, you see it happened over time. You can ask, did the person who was the criminal think they would take all of these actions, or did they just take one? They took one step that they could rationalize. And after they took one, they became a slightly different person. And then they took another step, and another step. And now you think very differently about dishonesty.
Wired: You did an experiment in which you got a designer label to give you some sunglasses. You gave the sunglasses to two groups and told one they were wearing authentic designer sunglasses and one they were wearing fakes. Then, you gave them a test and tempted each group to cheat. What happened?
Ariely: If you think about counterfeits, they’re incredibly good these days from the perspective of technology. It’s almost impossible to distinguish between the counterfeit and the real thing. In our experiments, we give people a test and then ask them to tell us how many answers they completed. We find that on average, people exaggerate a little bit — they solve four problems but report solving six. And what happens if you’re wearing a fake? People cheat a little bit more.
Usually we think about fashion as what we tell the world about ourselves, but maybe fashion has some role in also what we tell ourselves about who we are. If you think about the idea of external signaling — telling the outside who you are, versus the internal signaling of telling yourself who you are — counterfeits tell the world one thing, but you yourself know.
If you think about it, your own idea of morality is really kind of binary, you’re either good or bad, nobody thinks to themselves, I’m 80 percent good. What happens if you pass the threshold? If you pass the threshold and you can no longer think of yourself as good, you say to yourself, I’m a bad person, I’m an immoral person, I might as well enjoy it.
From that perspective fashion is a unique experience. Because imagine that you’re dishonest in one aspect of your life, you have illegal downloads on your computer, you cheat a little on taxes — how likely is that to pass to another category? Not very likely; you do your taxes in one day, you forget about it, you rationalize it, you move on. But if you’re walking around with a fake product from a high-fashion designer? Now you have a constant reminder of your own shaded morality, and because of that, I think fashion, unlike other things, could actually create dishonesty in other things as well.
Wired: Princeton University has a very strict honor system. You describe how incoming students attend lectures about honesty, sign a pledge, et cetera. Does all of that work?
Ariely: Yes, but not in the way people think. Here is the issue: What happens when people sign the honor code? You take a test and you sign something at the top that says, I promise not to cheat and not to lie. The standard theory is that this makes it clear that there’s a consequence, because if I get caught I will get expelled and so on, that’s about the cost-benefit analysis.
There’s another account that says it’s not about the cost benefit analysis, it’s about the fact that if you just wrote something down that says you’re going to be honest, you will have a harder time rationalizing, at least for a short time, your own dishonesty. You are more aware, more thoughtful, more careful, and therefore you will be more honest for a short while.
Now what do you think happens if you finish the test and you sign an honor code at the bottom? Nothing. By the time it’s over, people have already finished cheating. So the signature at the bottom does nothing. At the top, the signature does a lot. I think when we say we’ll teach people about the honor code one time, and think it’ll be good for four years, that’s a little too naïve. We really have to remind people over and over.
We talk about honesty, but the reality is we have lots of human values, and they are not all compatible. We don’t always tell the truth about everything, no matter what the consequences. If you have an internal truth of what you think and an external truth of what you say to society, in the social domain it’s called politeness, and in many cases it’s okay. The problem arises when this becomes commercial rather than personal. If you’re an accountant and you have an internal truth of what’s happening in your company and you have an external truth, you can see where this goes. Honesty is a complex and tricky thing, and we don’t want to be honest all the time.
Wired: As a teenager you were in a terrible accident in which you were burned very badly. You write in your book about when you were in the hospital receiving treatment, the nurses told you a white lie about how much pain a particular procedure would inflict.
Ariely: This is a great example of where dishonesty is useful. There was something that was incredibly painful, not for too long, but incredibly painful, and I knew about it for six weeks, and they told me for six weeks, it’s going to be painless. If they had told me how painful it was, I would have had six really miserable weeks of anticipation. So at the end of it all, I think they were right.
But I will tell you a story about when they were not correct. One day I went back for a burn checkup and a doctor found me and said, ‘Dan, I have a wonderful new treatment for you.’ I was burned on the right side of my face, so I have no hair on that side, but the left side of my face has little dots from when I shave. The sides are not the same. So what was this proposed genius solution? He was going to tattoo little dots on my face so it’s even. He showed me pictures of people he’d done this for, and I thought about it and came back and said, ‘I don’t think I want this, I feel kind of strange about it.’ And he said to me, ‘What’s wrong with you, do you enjoy being nonsymmetrical, do you enjoy looking different?’ Now, I was in the hospital for a very long time, I did lots of treatments, I refused some treatments, but I never got this magnitude of a guilt trip. I was really kind of baffled. And I went to his deputy and said, ‘What’s going on, where is this guilt trip coming from?’ And the deputy tells me, they’ve done this to two people and they need a third for an academic paper.
Now the thing that’s interesting is, this guy was a fantastic physician, thoughtful, he cared. But at that moment, his own conflicts of interest pushed him over the line.
Wired: Pharmaceutical companies are known for wooing doctors and their staffs with gifts, meals, etc. You write that when a pharmaceutical company pays a doctor to give a speech, they are looking for something a bit unexpected. What was that?
Ariely: The drug companies pay physicians to stand in front of their colleagues and give a short lecture about a new drug. And what’s happening is the companies don’t care about the people who are listening, they care about the physician who is giving the speech. Because they’ve found that the moment you express an opinion you start believing it to a higher degree.
By the way, there’s some really disturbing results showing that the best financial investment in the U.S. is lobbying. Return on the money is really high. I mean, you can give someone a sandwich and they will start seeing the world from your perspective to a slight degree. The moment you do favors for somebody, the moment you put them in a different situation, their view does change.
Wired: You write that people find it easier to rationalize stealing when they’re taking things rather than actual cash. You did an experiment where you left Coca-Colas in a dorm refrigerator along with a pile of dollar bills. People took the Cokes but left the cash. What’s going on there?
Ariely: This, I think, is one of the most worrisome experiments we’ve ever conducted, and it’s again about rationalization. There’s a story about a kid who gets in trouble at school for stealing a pencil from another kid, and the father comes home and says, ‘Johnny, that’s terrible, you never steal, and besides, if you need a pencil, let me know and I’ll bring you a box from the office.’
Why is that slightly amusing? Because we recognize that if we were taking the pencil from the office we would not have to confront that we are being immoral, in the way that we would if we took $10 from the petty cash box (even if we used that cash to buy pencils).
Now the reason this worries me is we’re moving to a cashless society; we’re soon going to have all kinds of electronic wallets. We have all kinds of esoteric financial instruments. We have lots of things that are multiple steps removed from money. We are moving to a situation which allows people to rationalize dishonesty to a much, much higher degree. And because of that, whenever we have financial instruments that are further away from money, we just need to be more careful.
Wired: You’ve done some research on how stress affects people’s behavior: You gave groups of people different mental tasks and then presented them with some snacks. Who ate what, and why?
Ariely: So imagine that in one condition I ask you to do something that doesn’t take much of your thinking capacity — remember a two-digit number. And in another condition, I ask you to think about a seven-digit number. And you have to keep on remembering. And then we say to you, would you like chocolate cake or an apple? When people have to remember a much longer number, they’re much more likely to take the chocolate cake.
If you’re asked to think about the seven-digit number, part of your brain is busy, and the voices that you hear in your mind come from the emotional side; the cognitive side is busy. As we get tired, we are more likely to do the thing that is the path of least resistance. We have an impulsive nature that is basically saying, chocolate cake.
The experiments show quite clearly that as you resist more and more temptation, you’re actually more and more likely to fail. And we also find this in dishonesty. So we show that if we exhaust people mentally — we give them all kinds of complex tasks in which they have to suppress their initial instincts — as they do it more and more, they also cheat to a higher degree. Because cheating offers an immediate reward, while honesty lives in long-term thought, and as that long-term part gets weaker and weaker, we follow more of our impulsivity.
Wired: What should people do about that? It’s impossible to eliminate stress.
Ariely: I think people should work on their taxes first thing in the morning.
Wired: Different religions have opportunities for penance and atonement — Catholics have confession, Jews have Yom Kippur. Are those helpful?
Ariely: I went to Italy to talk to a Catholic priest, and I said, please explain confession to me. And this is no offense to the Catholics in the room, but from an economics perspective, confession is really odd. Because if you can confess and get absolved, shouldn’t you cheat more? Shouldn’t you cheat on the way to confession to minimize time in purgatory? So the priest said, ‘No, this is not how it works.’
You say to yourself, ‘I can steal, I can get caught, but I’ll also have to talk to a priest and it’ll be unpleasant.’ And you say, ‘You know what, not worth it.’ It’s as if the priest is increasing the cost of the crime.
Another possibility is that, much like thinking about the honor code, you come out of confession, and you feel pure and wonderful, and you stay with that feeling for a little bit longer and you don’t cheat after confession.
Now imagine what would happen if we started to do this more commonly in society: What if we allowed our bankers and politicians to start fresh? Now if you think that people basically want to be good, as long as you remind them about it, that means that a confessional process could actually allow people to achieve that. Because if you behave badly for a while, if it’s a slippery slope, there’s basically no way for you to reset.
In Catholicism, you as the sinner decide when to confess. In Judaism it’s once a year, outside of your control. The second difference is that in Judaism, because it’s a fixed day, everybody confesses at the same time. So there’s a social coordination mechanism.
Wired: And in Judaism, on Yom Kippur, you’re fasting, depleting yourself.
Ariely: And fasting is again really important. Suffering. What is the role of suffering in forgiveness? We have done experiments on torturing yourself, and it does look like it is cleansing. I mean it’s kind of sad, right? It would be nice if we could cleanse ourselves by doing something good for other people, but we can’t. We really have to suffer personally to do that.