
It’s actually easy to force people to be evil

Neurological evidence that people feel less responsible for actions when taking orders.

If the military forces you to destroy an alien species in space, your brain won't process it the same way it would if you chose to destroy the aliens of your own free will.
Ender's Game

We've known for a long time that people will do terrible things under orders, such as hurting strangers. But why are we so easily persuaded to do things we wouldn't otherwise choose, even when nobody is holding a gun to our heads? A new experiment sheds light on this ancient ethical question.

University College London neuroscientist Patrick Haggard and his colleagues wanted to measure what happens in the human brain when a person is ordered to do something, versus when that person chooses to do it. In Current Biology, the researchers report how they reenacted a famous twentieth-century experiment to find out.

Back in the early 1960s, a Yale psychologist named Stanley Milgram conducted a now-infamous set of experiments about how far people will go to follow orders. He asked volunteers to deliver electric shocks to a stranger. Unbeknownst to the volunteers, there was no shock; the people they were "shocking" were actors pretending to be terribly hurt, even feigning heart attacks. Milgram found that most people would keep delivering the shocks when ordered by a person in a lab coat, even when they believed the stranger was gravely injured. Only a tiny percentage of people refused.

The Milgram experiments raised ethical red flags—about human nature, certainly, but also about how the experiment itself had been conducted. Many of the volunteers were emotionally scarred by the experience. Plus, Milgram had lied to them to get his results. Partly as a response to Milgram's work, universities and other research bodies created institutional review boards (IRBs) that oversee all experiments on humans.

Haggard and his colleagues wanted to revisit Milgram's experiment and do it right this time. They were especially interested in how subjects felt when they were under orders and whether that feeling translated into recognizable patterns in an EEG reading. So they gave a group of subjects £20 each, wired up their scalps, and began.

"Agents" were given a device with three keys on it. One would deliver a shock to a "victim," another would take a small amount of the £20 from a victim, and another would do nothing. A researcher would sometimes tell agents which keys to press and sometimes would let them choose. Each time the agents pressed a key, they would hear a tone.

The tone was, in a sense, the crux of the experiment. That's because the researchers relied on a well-documented quirk of human time perception known as intentional binding: when people intend to do something, they perceive the outcome as following more quickly than when they do something unintentionally. In other words, if you kick a ball deliberately, it seems to fly through a window faster than if the kick was an accident.

People acting under orders feel like they have less control over their actions, almost as if they are acting unintentionally, like that person who accidentally kicked the ball. The researchers surmised that people under orders might also experience this time distortion. So each time an agent pressed a key, whether under orders or by free choice, the researchers asked the agent to estimate how much time passed before the tone sounded.
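To make the measure concrete, here's a minimal sketch of how the interval reports from the two conditions might be compared. The numbers are entirely hypothetical and this is not the authors' analysis code; it simply illustrates the intentional-binding comparison described above.

```python
# Minimal illustration of the intentional-binding comparison.
# All interval values are hypothetical, invented for this sketch;
# this is not the study's actual data or analysis code.
from statistics import mean, stdev

# Reported keypress-to-tone intervals, in milliseconds.
free_choice_ms = [190, 205, 180, 210, 195, 200, 185, 198]
coerced_ms = [240, 255, 230, 260, 245, 250, 235, 248]

for label, intervals in [("Free choice", free_choice_ms),
                         ("Under orders", coerced_ms)]:
    print(f"{label}: mean = {mean(intervals):.1f} ms, "
          f"sd = {stdev(intervals):.1f} ms")

# A longer perceived interval under coercion means weaker intentional
# binding: the action feels less like the agent's own.
diff = mean(coerced_ms) - mean(free_choice_ms)
print(f"Coercion lengthened the perceived interval by {diff:.1f} ms")
```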

It turns out the researchers' hunch was right. Agents who were ordered to press the pain key perceived a longer delay before the tone than agents who chose to press it. The researchers write:

Coercion increased the perceived interval between action and outcome, relative to a situation where participants freely chose to inflict the same harms. Interestingly, coercion also reduced the neural processing of the outcomes of one’s own action. Thus, people who obey orders may subjectively experience their actions as closer to passive movements than fully voluntary actions.

This goes some way toward explaining why it's so easy to make people do evil things, or at least not very nice ones. When we take action because we've been ordered to, we feel less in control of the outcome. We feel less responsible. The experience is so profoundly different that our brains actually process it differently.

Of course we're still left with an ethical question. Why don't more people fight their urge to follow orders, and make their own choices? The answer to that one may be beyond the realm of science—for now.

Current Biology, 2016. DOI: 10.1016/j.cub.2015.12.067
