Imagine you have been asked to participate in an experiment in which you solve a series of puzzles. For each correct solution, you earn $10. The results are self-scored, and when time is up you are in the group that is permitted to shred its answer sheet. The proctor mentions in passing that the research program is well funded.
Quick poll: would you be tempted to cheat in this experiment? What if you were awarded $100 for each correct solution? The researchers found that participants allowed to shred their answer sheets reported significantly higher scores than those who were not. When cash was sitting on a table in the room, participants were also more likely to cheat. The researchers’ ultimate conclusion was that people who, in the abstract, believe they are honest will cheat when given such an easy, unverifiable opportunity. Participants would later rationalize their behaviour so that their positive self-image remained intact (e.g. “I would have easily solved that problem if I’d had 10 more seconds or hadn’t made a careless mistake.”)
What’s going on here? In Blind Spots: Why We Fail to Do What’s Right and What to Do About It, Max Bazerman and Ann Tenbrunsel focus on “bounded ethicality”: the systematic and predictable ways in which humans act unethically beyond their own awareness. So, why aren’t you (not me, of course!) as ethical as you think you are? The authors say we need to take a good look at our “should” and “want” selves. The “should” self is influential when predictions are being made. In other words, prior to being faced with an ethical dilemma, people predict they will make the ethical choice (“I should never improperly withdraw funds from a client trust fund, so I won’t.”) At decision time, the “want” self can be more influential (“I need the money to pay a late bill, and I’ll pay the trust fund back”). The “want” self doesn’t see the ethical implications of the decision. Post-decision, we can be motivated to reduce any dissonance between the “should” and “want” selves: we remember behaviours that support our self-image, or we shift our standards of what counts as unethical (“I quickly paid the client trust fund back, so no harm was done”).
What can we do to avoid unwittingly acting unethically? Bazerman and Tenbrunsel recommend narrowing the gap between the “want” and “should” selves. First, acknowledge that we are all vulnerable to unconscious biases. Second, during planning, bring out the “want” self by considering the motivations you will likely feel at the moment of decision. This promotes more accurate predictions about the challenges you will face at decision time. They also suggest practicing responses to ethical situations and sharing ethical choices with a trusted, unbiased and (of course) ethical person.
Giving more voice to the “should” self is another tool that the authors recommend. Here are a few of their suggestions:
The authors recommend applying the above suggestions to all of your important decisions because you may not know which ones have an ethical component. In the book, they discuss the concept of “ethical fading”, where the individual simply doesn’t see the ethical dimension of the decision.
Finally, debrief your decisions promptly and on a regular basis. Have a trusted friend or colleague act as a “devil’s advocate” — someone who is not afraid to say “really?!” when your decision is questionable.
If your reaction to this discussion is “I’m fine. I’d never do anything unethical,” that’s a big red flag! We are all somewhat vulnerable and unprepared for the influence of self-interest at the moment of decision, and it’s in our best interests to constantly examine our behaviours.