Say you’re in sales and your boss privately offers to let you choose between two accounts. Both clients have the same commission potential, but one of them has an unpleasant reputation. Would you hesitate before choosing the nicer client and letting a colleague deal with the other one?
More people than you might think would rather avoid making the obviously selfish decision in this kind of situation. So, if their boss offers to flip a coin, they are likely to do so. But should the coin fall on the “wrong” side, they are also likely to disregard the toss and ultimately assign themselves the nicer client. After all, the boss said that they could choose.
In essence, this is what Dale Miller of Stanford and I show in our recent paper, “A dynamic perspective on moral choice: Revisiting moral hypocrisy”. When people flip a coin (out of fairness), it is not with the intention of disregarding the outcome if it disadvantages them. They are sincerely hoping to get the better deal through luck, thus avoiding any damage to their self-regard. But if Lady Luck isn’t on their side, only then do they revise their decision, using the justification that flipping the coin was optional in the first place. It is a two-step process.
Fairness carries a weight
Acting fairly or doing good deeds for others often comes with a cost. For instance, you may need to expend time, money or effort. But acting selfishly is not cost-free either – the bill comes due when you look at yourself in the mirror. For many people, this creates a painful dilemma. My co-author and I ran four studies to examine how people deal with it.
In our first study, survey participants considered four possible ways of assigning the more advantageous of two tasks: assigning it to oneself or to someone else, or obtaining it – or losing it – through chance (a coin toss). Getting the better deal via a coin flip was perceived as the happiest possible outcome. On a seven-point scale, participants rated it 5.91, higher than simply assigning it to oneself (5.24). The least happy outcome was losing the better deal on a toss (3.58). If the other party is to get the better deal, it might as well be your own doing – at least you could feel good about being such a kind person.
Doing the right thing
Our second study showed that 62 per cent of people chose to flip a coin to assign the better task even when they were told that the decision would be final. After being reminded that a coin toss was considered the fairest practice, 495 participants out of 804 chose to let an automated randomiser decide whether they – or someone else – would be assigned a task that came with a potential US$10 bonus (as opposed to a task without the possibility of a bonus). The takeaway: a substantial proportion of people value fairness enough to risk losing the more attractive of two options.
In the third study, we showed that ignoring the outcome of the toss and assigning the pleasant task to oneself was viewed as more justifiable when the coin flip was optional rather than mandatory. This suggests that people regard the defence that “I could have picked the self-interested outcome for myself from the start” as a reasonable one.
The last study allowed us to show to what extent people will try to hang on to principles of fairness. Participants could click one button to use a randomiser, or another button to skip the randomiser and assign themselves the task. Many participants (54 per cent) clicked on the randomiser, even though they could simply have assigned themselves the better task.
When the randomiser did not favour them, the majority moved on to assign the tasks, ostensibly according to the randomiser’s outcome. About half of these people were honest, assigning the other participant the positive task, whereas half were dishonest, assigning themselves the less onerous task.
Our contention is that the latter half – those who morally capitulated and assigned themselves the positive task – started out with the best of intentions. They expected to follow the toss outcome. But once they were faced with the negative result, they justified their change of tack by saying, “I could have just assigned myself the positive task anyway.” This way, they were able to justify prioritising their self-interest.
As evidence of the two-step process, we examined whether, when given the chance, people would actually “undo” their decision to use the randomiser after seeing a negative outcome. We saw that indeed, when placed in such a situation, some participants “un-clicked” the randomiser, as if they had never seen its result to begin with, then assigned themselves the positive task as if that had been their intention all along. We confirmed that this was not just due to carelessness or mis-clicking: those who received the positive outcome were much less likely to un-click. They instead followed the random outcome and assigned themselves the positive task, probably thinking that they would have done the fair thing if the randomiser had gone the other way.
The importance of accountability
When people act selfishly after professing their commitment to fairness, we tend to think that they are moral hypocrites. We assume that they knew all along that they would favour their self-interest and that they just paid lip service to notions of fairness. In reality, they may have been sincere.
Self-interest eventually prevails, once it becomes impossible to both have one’s cake (acting in one’s best interest) and eat it too (avoiding shame and guilt). In other words, people stay moral – until they cannot. They overestimate their ability to follow the flip’s outcome.
However, the dilemma at hand allows for a variety of responses. While a minority of people (dubbed the selfish) chose the best task for themselves outright, some people (called altruists) immediately assigned the positive task to the other person. Others (the fair doers, or morally trapped) flipped the coin and honestly followed the outcome, whether it disadvantaged them or not. Of interest to us were people who set out to follow the coin toss, but ultimately changed their minds. Half of them – the lucky ones – got the positive outcome and could follow their self-interest guilt-free. But the other half – dynamic moral shifters – flipped the coin yet ultimately gave in to their self-interest, once they found a justification.
The practical implication is that people, as a rule, do want to achieve a fair outcome. But whether they follow through often depends on how accountable they are. So, if a company adopts a prosocial practice – say, recycling – but suddenly stops, it may be because some external factor, like an increase in costs, tipped the balance. If the company does not hold itself accountable, it may simply relent, even on its best intentions.
Ultimately, most decision-makers set out with better intentions than you might think. But someone needs to hold them accountable when the going gets tough.