Military personnel are trained to follow orders, but that doesn't erase their moral processing when doing so. (Bumble Dee/Shutterstock)

In a nutshell

  • Even when people follow direct orders to harm others, their brains still engage regions associated with moral responsibility, suggesting that obedience doesn’t erase the sense of agency.
  • Military officer cadets reported feeling less responsible than civilians when following orders, but brain scans showed no meaningful neural differences between the two groups.
  • The study challenges the idea that “just following orders” absolves moral accountability, supporting legal and ethical standards that hold individuals responsible for actions under coercion.

BRUSSELS — Is “following orders” a valid excuse for abandoning our moral compass? In a new study from Belgium, researchers watched people’s brains in real time as they chose whether to inflict pain on others, either freely or under direct orders. They found that even when we’re following commands, our brains are still very much keeping track of right and wrong.

In a study published in the journal Cerebral Cortex, researchers used brain imaging technology to peer inside the minds of people making moral decisions. Whether someone acted on their own or followed commands, similar brain regions remained active when processing their sense of agency, or the feeling of being responsible for an action’s consequences.

While following orders did reduce people’s sense of agency compared to acting freely, it wasn’t eliminated entirely. Even under direct orders, people’s brains continued to process moral responsibility in ways that were similar to free choice situations.

Belgian researchers recruited 43 participants (19 military officer cadets and 24 civilians) for an unusual moral experiment. Participants were placed in an MRI scanner and given the opportunity to deliver mild electric shocks to another person in exchange for small monetary rewards.

The study involved brain imaging of soldiers and civilians. (© Bumble Dee – stock.adobe.com)

Sometimes participants could freely choose whether to shock the “victim,” and other times they received direct orders from the experimenter. Researchers also tested two different roles—being the person who directly administered the shock (the “agent”) or being a commander who ordered someone else to do it.

Participants could actually see the victim’s hand through a video feed and watch it twitch each time a shock was delivered, making the consequences of their actions very real.

To measure participants’ sense of responsibility, scientists used a clever psychological trick called “temporal binding.” When people feel responsible for an action, they tend to perceive less time passing between their action and its consequence. For instance, if you freely choose to press a button that causes a sound, the gap between your button press and the sound seems shorter than if someone forced you to press it.

After each decision, participants estimated how much time had elapsed between their action and a tone that followed. They also rated how responsible they felt for the consequences and how bad they felt about potentially harming someone.
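The temporal binding logic described above can be sketched in a few lines of code. This is a simplified illustration, not the study’s actual analysis pipeline: the scoring rule, interval values, and function names are all assumptions made for clarity.

```python
# Hedged sketch of a "temporal binding" score: the gap between the real
# action-to-tone interval and the interval a participant reports.
# Underestimating the interval (positive score) is read as a stronger
# sense of agency. All numbers below are made up for illustration.

def binding_score(actual_ms: float, estimated_ms: float) -> float:
    """Actual minus estimated interval; larger = more perceived agency."""
    return actual_ms - estimated_ms

def mean_binding(trials: list[tuple[float, float]]) -> float:
    """Average binding score over (actual, estimated) trial pairs."""
    return sum(binding_score(a, e) for a, e in trials) / len(trials)

# Hypothetical data: free-choice trials tend to show more underestimation
free_choice = [(600, 450), (600, 480), (600, 430)]
ordered     = [(600, 560), (600, 590), (600, 570)]

print(mean_binding(free_choice))  # → ~146.7 ms of underestimation
print(mean_binding(ordered))      # → ~26.7 ms of underestimation
```

Under these invented numbers, free-choice trials yield a larger binding score than ordered trials, mirroring the pattern the article describes: following orders reduces, but does not abolish, the temporal binding associated with felt responsibility.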

Military vs. Civilian Mindsets

You might expect soldiers, trained to follow orders without question, to feel less responsible when commanded to act. But the brain scans revealed something different.

At the neural level, both groups showed nearly identical patterns of brain activity. The researchers found that “no differences emerged between military and civilians at corrected thresholds, suggesting that daily environments have minimal influence on the neural basis of moral decision-making.”

However, military officers said they felt significantly less responsible than civilians when following orders, even though their brains showed similar activity patterns. This suggests that military training might affect how people consciously interpret their feelings of responsibility, but not necessarily the underlying neural processes.

Inside the Brain During Moral Decisions

Even when we follow direct orders, our brains don’t simply switch off moral processing. Neural machinery that makes us feel responsible for our actions continues to work, suggesting that claims of diminished responsibility may be more complex than they appear.

When researchers looked at brain activity across all conditions, they found activation in several key areas including the occipital lobe, frontal regions, precuneus, and lateral occipital cortex. These regions remained active regardless of whether people were acting freely or following orders.

This could be important research for court decisions involving those following orders from authority figures. (Photo by EKATERINA BOLOVTSOVA from Pexels)

Participants who delivered more shocks during free-choice trials also showed interesting patterns. Those who shocked more frequently felt worse about their actions afterward, suggesting that even when people choose to harm others, they’re not immune to moral discomfort.

What does this mean for responsibility in hierarchical organizations, from military units to corporate structures? The findings suggest that even within rigid command structures, individuals retain significant neural markers of moral agency.

This doesn’t mean that following orders is morally equivalent to acting freely. The study clearly showed differences in how people experience and report responsibility. But it does suggest that the “just following orders” defense may not align with what’s actually happening in people’s brains when they make moral decisions.

Scientists also examined whether being the direct actor versus giving orders affected moral processing. Surprisingly, both roles showed similar neural patterns, though being the direct agent produced slightly stronger brain responses associated with agency.

This research builds on decades of famous psychology experiments, including Stanley Milgram’s obedience studies in the 1960s, which showed that ordinary people would deliver apparently dangerous electric shocks when ordered by an authority figure. While Milgram’s work demonstrated our troubling capacity for obedience, this new brain imaging study adds nuance by showing that obedience doesn’t eliminate moral processing.

The findings also align with legal and ethical frameworks that hold individuals accountable for their actions even when following orders. International law, established after World War II, recognizes that superior orders are not a complete defense for crimes against humanity—a principle that this neuroscience research now supports with biological evidence.

Understanding the neural basis of moral decision-making under authority could inform training programs for military personnel, help develop better corporate governance structures, or even aid in legal proceedings where questions of diminished responsibility arise.

You can’t train the human brain out of being human. When push comes to shove, our neural wiring keeps us morally accountable, whether we like it or not.

Paper Summary

Methodology

Researchers conducted an fMRI brain imaging study with 43 participants (19 military officer cadets and 24 civilians) aged 18-36. Participants were placed in MRI scanners and asked to make decisions about delivering mild electric shocks to another person in exchange for small monetary rewards (+€0.05). The experiment included four conditions: acting as an agent (directly delivering shocks) or commander (ordering someone else to deliver shocks), and either making free choices or following direct orders. To measure sense of agency, researchers used temporal binding—how people perceive time intervals between their actions and consequences. Participants estimated time delays between their button presses and subsequent tones, with shorter perceived intervals indicating greater sense of responsibility.

Results

Brain scans revealed that both civilians and military personnel showed similar neural activity patterns when making moral decisions, regardless of whether they acted freely or followed orders. Key brain regions associated with moral processing remained active in both conditions, though the sense of agency was reduced when following orders compared to free choice. Military officers reported feeling less responsible than civilians when following orders, despite showing similar brain activity. Participants who delivered more shocks during free-choice trials showed specific brain activity patterns and reported feeling worse about their actions afterward.

Limitations

The study had a relatively small sample size of 43 participants, which may limit the generalizability of findings. The military sample consisted only of officer cadets (lieutenants in training), not experienced soldiers or other military ranks. The MRI environment may have affected social interactions compared to real-world scenarios. The moral stakes were relatively low (mild electric shocks), which may not fully represent higher-stakes moral decisions. Some analyses showed only trends rather than statistically significant results, possibly due to high individual variability in brain responses.

Funding and Disclosures

The research was funded by the BIAL Foundation (Grant 150/18), an ERC Starting Grant DISOBEY (grant number: 101075690), and the Fond Erasme research convention. The study was also supported by the Association Vinçotte Nuclear and computational resources from the Consortium des Équipements de Calcul Intensif. The authors declared no conflicts of interest.

Publication Information

This study was published in Cerebral Cortex (Volume 35, Issue 3) in 2025. The paper was received September 29, 2024, revised January 3, 2025, and accepted February 9, 2025. The research was conducted by Emilie A. Caspar (Ghent University), Antonin Rovai (Université Libre de Bruxelles), Salvatore Lo Bue (Royal Military Academy), and Axel Cleeremans (Université Libre de Bruxelles).
