In the early 1960s, Stanley Milgram published one of the most famous (and depressing) experiments of the 20th century. In it, he found that the vast majority of participants were willing to inflict physical pain, via electric shock, on strangers when instructed to do so by an authority figure in a lab coat.
The experiment may have been ethically dubious (understandably, many participants were deeply traumatized by the ordeal), but it raised important questions about human nature and obedience.
In a new study published in Current Biology, researchers attempt to get at the root of why we are so easily persuaded to commit unethical actions when ordered to do so by another person. To answer that, they first needed to parse out the differences between what happens in the brain when we act badly of our own volition vs. when we are compelled to do so by someone else.
Participants were given £20 each, assigned the role of agent or victim, and paired off (eventually, they would swap roles). All agents were then given three tabs: The first administered “a demonstrably painful electric shock” to their assigned victim, the second enabled them to steal a small percentage of the victim’s £20 for themselves, and the third did nothing. In some cases, a researcher told the participants which tab to press. In others, they were allowed to choose for themselves. Either way, every time a tab was pressed, a tone sounded.
According to the study, “coercion increased the perceived interval between action and outcome, relative to a situation where participants freely chose to inflict the same harms.” In other words, agents who proactively decided to either shock or steal from their assigned victim reported hearing the tone sooner than those who were ordered to perform the same action.
This is important because previous research indicates our perception of time is fluid. When we consciously decide to perform an action and then follow through, the outcome feels immediate. In contrast, when an action feels unintentional and outside of our control, our perception of the speed at which it occurs slows down.
The difference in perceived interval between action and outcome indicates that “people who obey orders may subjectively experience their actions as closer to passive movements than fully voluntary actions,” the authors write.
“Specifically, coercion may reduce the linkage that normally binds the experience of actions to their outcomes,” they continue. “Indeed, emotional distancing from distasteful outcomes of one’s own necessary actions forms a specific part of training and professional culture in medicine and in the military.”
All of which is to say, when we receive orders to act badly, our brains help us distance ourselves from the consequences of our behavior. In real life, this disconnect may help explain why corrupt and morally reprehensible behavior has a tendency to spread through organizations and even industries.
How, then, do we combat this harmful dissociation? According to the study, “learning the true valence of one’s actions’ outcomes might potentially make the sense of agency more resilient to the undermining effects of coercion.”
So the next time your boss asks you to act in a way that violates your moral code, remember: just because someone told you to perform the action doesn’t mean you aren't responsible for the consequences.