Who do we do good for, anyway? Can we sense the “right thing to do” on some existential, foundationally human level? Does it just feel good knowing the “right” thing was done? Are we doing it for ourselves, proving we are good, moral people? Are we being “moral” because society has deemed certain behaviours moral and desirable, and that sounds like it has a lot of perks? What is it that motivates ethical actions? Better yet, what motivates us to be unethical in spite of it all?
In 1997, Batson and colleagues set out on a journey: a series of studies designed to determine why we behave morally. Are we doing it for our own moral integrity, or are we showing off? The results brought up an interesting question: how aware are people of their own hypocrisy? When we perform immoral actions, are we sinisterly aware of our deeds, or are we tricking ourselves into thinking they're justified?
Ever seen a mirror placed behind the counter at a bar? Well, there's a reason for it other than letting you fix your hair: it's actually there to discourage people from stealing drinks. Could you do it? Could you look your reflection in the eye, asking yourself if you can go through with what you're about to do, and pour yourself a drink while the bartender is looking the other way? Batson and colleagues did this with their subjects, and they found that forcing people to see what they were doing gave them enough pause to bail out. So, this must prove that moral hypocrisy happens because we aren't thinking about what we're doing; this must prove that we deceive ourselves into a state of ignorance, right?
Well, three European psychologists (Lönnqvist, Irlenbusch, & Walkowitz, 2014) were skeptical, and brought forth another possibility. A mirror doesn't just make you more aware of yourself; it makes you feel observed. If you can see yourself, then who else can see you? The mirror heightens the feeling of being watched, so the implication is that people in front of a mirror stop themselves from being immoral not necessarily because they would judge themselves, but because someone else might be around to judge them. So the Batson study didn't really settle things one way or the other, and these three psychologists set about modifying the experiment in a way that might give them answers. When people who claim to be moral act like hypocrites, is it a crime of passion committed in a lapsed state of mind, or is it premeditated?
The paradigm that started all this was a form of the “dictator” game. It isn’t as intense as it sounds; it just means one person gets all the power to make a decision that affects other people. The game Batson et al. used was very simple and had some surprisingly hilarious results (at least to me), and this study used a modified version.
Students at the University of Cologne in Germany were assigned to pairs and then given either the role of dictator or not dictator (i.e., helpless citizen). They were put individually into cubicles to make sure they didn't interact with anyone and could feel like they were alone. The dictator was then given further instructions. They were given 10€ (remember, this is Germany) and told they would be deciding how it would be distributed between them and their assigned partner. They could either distribute it 8 for themselves and 2 for the other, or split it 5 and 5. Here's the fun part: they were also given a third option. The dictators were given a special coin, one side saying “5/5” and the other saying “8/2,” and were told they could use the coin to make their decision if they wanted. There was a computer in the cubicle on which they could indicate which decision they made and whether or not they used the coin, and the experimenter then left them completely alone in their cubicle with no one watching them, not even secretly. Presumably, the partner was in another cubicle twiddling their thumbs during all this.
After making their decision, the participant would then answer some questionnaires on the computer about personal values (specifically to measure their affinity to conform and their commitment to fairness and justice and all that).
Lönnqvist and colleagues found in their results exactly what Batson and colleagues had found in theirs. By far the majority of dictators went with the 8/2 distribution (because obviously), a little over half of them without using the coin. The real interest is in the results for those who indicated that they did use the coin. What are the odds on a coin flip again? It's 50/50, right? Well, funnily enough, no less than 100% of people who used the coin left with 8€.
So what happened here? About half of them must have been lying, statistically speaking; maybe even more. There was no actual person watching what they did, but doesn't directly taking the 8/2 just sound like a dick move? What people realized they could do was flip the coin and, whether or not it landed on the right side, look over their shoulders to confirm no one was there, and then check 8/2. It isn't mentioned in the paper, but if the question was worded like “did you use the coin to help you decide?” they technically weren't even lying. Although, some of them probably never even flipped the coin; they just outright lied. The majority of participants wanted to make the decision that was good for them at the expense of another, but doesn't that just feel wrong? If you could blame it on a coin flip, would it still feel wrong?
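The arithmetic behind “about half must have been lying” is worth a quick back-of-the-envelope check. The paper's exact count of coin users isn't quoted here, so the n = 20 below is purely an assumed number for illustration:

```python
# Back-of-the-envelope check on the "everyone who flipped got 8/2" result.
# Assumption for illustration: n = 20 dictators reported using the coin
# (the actual count isn't quoted in this post).
n = 20

# If every flip were honest, the chance that a fair coin favours the
# dictator all n times is vanishingly small.
p_all_selfish = 0.5 ** n

# Under honest flipping we'd expect about half the coin users to land on
# 5/5, so zero 5/5 outcomes implies roughly half of them (or more) lied.
expected_5_5 = n * 0.5

print(f"P(all {n} honest flips come up 8/2) = {p_all_selfish:.7f}")
print(f"Expected 5/5 outcomes under honest flipping: {expected_5_5:.0f}")
```

With 20 honest flippers, an all-8/2 outcome has odds of about one in a million, which is why the lying explanation is the only plausible one.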
One of the suggested reasons for moral behaviour is that we do it for the social benefits that come with being known as a moral person. At the same time, moral behaviours require a sacrifice of some kind, but what if you could get the benefits and avoid the costs? That's exactly what people in this study ended up doing: they got the money and they avoided the blame at the same time. It's a win-win!
The measure of personal values didn’t show anything too surprising: those who were more committed to fairness were the few who chose 5/5, and those high in conformity were usually the sneaks who said they flipped the coin.
The second study tested basically the same thing: will people claim to be good even when, given the opportunity, they would avoid the moral costs? Students at the same university were brought in for essentially the same experiment, only this time there was no coin and the partners were also given something to do. In fact, they were given the same thing to do; both were told they were the dictator. Only, one of them was told it was all a hypothetical situation and they weren't actually going to get any money. If you were put in that situation, what would you do? They found that those given the hypothetical were split about 50/50 on taking more money or giving fairly, while most of those actually dealing with money made the choice that benefitted them.
This highlights a problem that has always been present in psychological research. Hypotheticals will never be quite the same as actually being in that situation. We all want to appear moral, and we want to insist that even in a real life situation we would make the “right” decision, but that’s not always the case.
The third and final study took away the option to fudge the results. It was the same setup as the first study, only the coin was digital, on the computer, and its verdict was immediately entered as the answer. There was no possibility of lying about that one, and the participants knew it. So what did they do? What would you do? The people who used the coin the first time weren't necessarily malicious; they just wanted a way to get what they wanted and not feel bad. Now that the “coin” was a binding decision, it became a gamble. Would you be willing to risk it? Would you put 3€ on the line just because you don't want to be seen as the obvious jerk? What's the worry there? Are others going to judge you for it, or will you judge yourself? What if we increased the stakes, multiplied everything by a hundred so now it was 300€ on the line? Would you care so much about your reputation then?
Unsurprisingly, the majority of people this time directly chose the 8/2 option. So what does this mean for our original question? Do people trick themselves into thinking they’re not being immoral, or are they fully aware and lie to make the rest of the social world like them? The authors of this paper claim that there is no self-deception involved in this case because clearly even people who actually flipped the coin must have known going in that they would fudge the results if necessary, but there was another component to this study that gives me pause.
As we saw in the second study, hypotheticals and real-life situations can give very different results, and for very different reasons. Thinking about it in a hypothetical sense, we feel that they must have been aware of their intent to lie, but in the moment, faced with the tough decision and an isolated freedom, they could awkwardly stumble their way into using the coin to choose 8/2. You could flip the coin just for fun and, if it lands on 8/2, go with it; or deceive yourself into thinking that first coin flip didn't count, or that you should do best two out of three, or enter a weird meditative trance that ends in you choosing 8/2. It sounds weird when you know all the components and can think about it rationally, as the experimenter or anyone else can, but I think it's very easy to ignore our own behaviours and act without thinking at all; the decision to lie can be made after the fact.
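That after-the-fact rationalization can be caricatured as a toy simulation. Nothing here comes from the paper; the fair 50/50 flip and the “report 8/2 regardless” rule are just assumptions chosen to reproduce the pattern in the data:

```python
import random

random.seed(42)  # reproducible toy run

def report_outcome():
    """Toy model of a coin-using dictator: flip honestly once, but if
    the coin says 5/5, quietly rationalize ("that one didn't count")
    and report 8/2 anyway."""
    honest_flip = random.choice(["8/2", "5/5"])
    lied = honest_flip == "5/5"  # the report never matches a losing flip
    return "8/2", lied

# Simulate 1,000 coin users: every one of them reports 8/2, and roughly
# half of them had to fudge the flip to get there.
results = [report_outcome() for _ in range(1000)]
liars = sum(lied for _, lied in results)
print(f"Reported 8/2: {len(results)} / 1000")
print(f"Had to fudge the flip: roughly {liars}")
```

The point of the toy model is that the observed data (100% report 8/2) is exactly what you'd get whether the lie was premeditated or improvised on the spot; the behaviour alone can't distinguish the two.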
References
Batson, C. D., Kobrynowicz, D., Dinnerstein, J. L., Kampf, H. C., & Wilson, D. W. (1997). In a very different voice: Unmasking moral hypocrisy. Journal of Personality and Social Psychology, 72, 1335–1348. doi: 10.1037/0022-3514.72.6.1335
Lönnqvist, J., Irlenbusch, B., & Walkowitz, G. (2014). Moral hypocrisy: Impression management or self-deception? Journal of Experimental Social Psychology, 55, 53–62. doi: 10.1016/j.jesp.2014.06.004