
In A Nutshell
- Brain scans show that viewing an emoji face triggers patterns of brain activity that overlap with those triggered by a real human face, particularly in regions tied to face and emotion processing.
- A computer algorithm trained on brain responses to real faces could successfully decode which emotion someone was viewing when they looked at emojis, and vice versa.
- Emoji faces sometimes produced cleaner brain signals than real photographs, possibly because their exaggerated features make emotions easier for the brain to categorize.
- The study’s authors caution that the overlap is partial and that findings were drawn from a small group of young adults, so broader generalization requires more research.
A smiley face on a screen is about as far from an actual human smile as a stick figure is from a photograph. Yet new research shows that when it comes to reading emotions, the brain processes a real person and a little yellow cartoon in surprisingly similar ways, at least in the first fraction of a second.
Scientists at Bournemouth University in the United Kingdom have found that the brain’s electrical patterns when processing emotional expressions on emoji faces closely mirror those triggered by photographs of real human faces. So closely, in fact, that a computer algorithm trained to recognize emotion-related brain signals from one type of face could, under the study’s conditions, successfully identify those same signals when people looked at the other type. That discovery reframes how scientists think about the relationship between the billions of tiny digital icons sent every day and the ancient brain wiring humans evolved for reading each other’s expressions.
Published in the journal Psychophysiology, the study demonstrates that both kinds of faces activate overlapping patterns of brain activity, firing in similar ways, at similar times, across similar regions, regardless of whether the face on screen is a real person or a cartoon circle with dot eyes.
How Emoji Faces Were Tested Against Real Ones
Two separate experiments used nearly identical setups. In the first, 24 participants viewed color photographs of eight real people, four men and four women, each displaying one of four expressions: happy, angry, sad, or neutral. In the second, a different group of 25 participants viewed emoji faces pulled from six platforms, including Apple and Facebook, showing those same four emotions.
Both groups wore caps fitted with 64 sensors measuring electrical brain activity. Each trial showed a fixation cross, then a face image for one second, followed by a screen asking participants to pick the correct emotion from two options. Only correct trials were included in the analysis. Rather than examining brain activity one sensor at a time, the team used a method that reads patterns across many sensors simultaneously, capable of picking up subtle signals that older approaches would miss.
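For readers curious about the mechanics, the pattern-reading approach described above is what EEG researchers typically call multivariate pattern analysis, or time-resolved decoding. The sketch below shows the general technique using the open-source MNE-Python and scikit-learn libraries; it is a minimal illustration on synthetic placeholder data, not the authors' actual analysis code, and every shape, label code, and parameter in it is an assumption.

```python
# Minimal sketch of time-resolved EEG decoding (multivariate pattern
# analysis) in the spirit of the method described above. NOT the
# study's code: all data, shapes, and labels here are hypothetical.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from mne.decoding import SlidingEstimator, cross_val_multiscore

rng = np.random.default_rng(0)

# Hypothetical epoched EEG: trials x sensors x time points
# (e.g., 288 correct trials, 64 sensors, a 1-second epoch at 250 Hz).
n_trials, n_sensors, n_times = 288, 64, 250
X = rng.standard_normal((n_trials, n_sensors, n_times))
# One emotion label per trial: 0=happy, 1=angry, 2=sad, 3=neutral.
y = rng.integers(0, 4, size=n_trials)

# Fit a separate classifier at every time point, each one reading
# the pattern across all 64 sensors simultaneously.
clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
time_decoder = SlidingEstimator(clf, scoring="accuracy")

# Cross-validated decoding accuracy: one score per time point.
scores = cross_val_multiscore(time_decoder, X, y, cv=5).mean(axis=0)
print(f"peak accuracy {scores.max():.2f} at time index {scores.argmax()}")
```

With four emotions, chance accuracy is 25 percent; on random data like this, scores hover near that floor, whereas in the study decoding for both face formats rose reliably above chance within the first 200 milliseconds.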
What the Brain Signals Revealed About Emoji Processing
Within each experiment, the algorithms could reliably identify which emotion a person was viewing based solely on brain activity. For emoji faces, the signal appeared as early as roughly 70 milliseconds after the image came on screen, peaking around 155 milliseconds. For real faces, it emerged around 120 milliseconds and peaked at about 160 milliseconds. In both cases the signal peaked over sensors toward the back of the head, in regions long associated with face processing.
The more telling result came from what researchers call “cross-classification.” Algorithms trained on brain data from one experiment were tested on data from the other, with completely different participants looking at completely different kinds of faces. Even so, the algorithms could still decode which emotion was being viewed. When trained on real faces and tested on emoji data, strong decoding emerged between 115 and 200 milliseconds. The reverse produced similar results, pointing to overlapping neural coding regardless of how the face looks. A second wave of shared activity appeared later, between 350 and 500 milliseconds, suggesting the overlap extends beyond the brain’s first snap judgment and into the stage where it weighs the emotional meaning of what it saw.
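The cross-classification logic is simple to state in code: fit a decoder on every trial from one experiment, then score it, time point by time point, on trials from the other. Here is a hedged sketch of that two-way transfer on synthetic data; the variable names, shapes, and label codes are invented for illustration, and the published pipeline may differ in detail.

```python
# Sketch of cross-classification: train on brain responses to one
# face format, test on the other. Synthetic placeholder data; the
# actual study used recordings from two separate participant groups.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from mne.decoding import SlidingEstimator

rng = np.random.default_rng(1)
n_sensors, n_times = 64, 250

# Hypothetical epochs from the real-face and emoji experiments.
X_real = rng.standard_normal((384, n_sensors, n_times))
y_real = rng.integers(0, 4, size=384)    # emotion shown on each trial
X_emoji = rng.standard_normal((288, n_sensors, n_times))
y_emoji = rng.integers(0, 4, size=288)

clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
decoder = SlidingEstimator(clf, scoring="accuracy")

# Train on real faces, test on emojis. Above-chance scores in a time
# window would indicate a neural code readable across both formats.
decoder.fit(X_real, y_real)
real_to_emoji = decoder.score(X_emoji, y_emoji)  # one score per time point

# And the reverse direction.
decoder.fit(X_emoji, y_emoji)
emoji_to_real = decoder.score(X_real, y_real)
```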
Among specific emotion pairs, the contrast between angry and neutral expressions was the most reliably decoded across both face types. Happy versus neutral also decoded well, while distinctions between two negative emotions, like angry versus sad, were harder for the brain to pull apart. Telling a smile from a blank stare turns out to be easier than distinguishing two different flavors of unhappiness, whether on a person’s face or a tiny icon.
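In this kind of analysis, a pairwise contrast such as angry versus neutral is simply the same pipeline restricted to trials from those two conditions, with chance at 50 percent. A self-contained sketch on placeholder data, with hypothetical label codes:

```python
# Pairwise decoding sketch (angry vs. neutral): the full pipeline
# restricted to two conditions. All data below are hypothetical.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from mne.decoding import SlidingEstimator, cross_val_multiscore

rng = np.random.default_rng(2)
X = rng.standard_normal((288, 64, 250))  # trials x sensors x time points
y = rng.integers(0, 4, size=288)         # 0=happy 1=angry 2=sad 3=neutral

mask = np.isin(y, [1, 3])                # keep only angry and neutral trials
y_pair = (y[mask] == 1).astype(int)      # recode: angry=1, neutral=0

clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
scores = cross_val_multiscore(
    SlidingEstimator(clf, scoring="roc_auc"), X[mask], y_pair, cv=5
).mean(axis=0)                           # one AUC per time point; chance = 0.5
```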
Emoji faces sometimes produced even cleaner signals than photographs. Researchers suggest this may be because emojis are designed with exaggerated features, including oversized frowns and wide grins, that make the boundaries between emotional categories especially stark. Real human faces are subtler and vary across individuals in ways that blur those lines.
What This Means for Digital Communication
Until recently, relatively little was known about whether the brain treats emojis as genuine emotional signals or more like abstract symbols. Earlier studies found that emojis can trigger facial muscle responses and skin reactions similar to those caused by real expressions, but those studies mostly focused on where emoji processing differs from real-face processing, rather than testing for shared underlying patterns.
This study flips that focus. By showing that a classifier trained on brain responses to one format can decode emotions in the other, within specific early time windows, it offers direct evidence of overlapping neural coding across fundamentally different visual inputs.
A Shared Pattern, Wired Deep
Researchers note that the overlap is partial. Real faces activated certain pathways more strongly, and the study used posed expressions viewed by young adults at one university, limiting broader generalizability.
Still, the core result holds. A cartoon smiley and a genuine human smile activate notably overlapping brain architecture, enough that the brain’s electrical signature for one can predict its response to the other. In an era when much of human emotional communication happens through screens, that overlap may help explain why those tiny icons can feel less like decoration and more like the real thing.
Paper Notes
Limitations
Researchers acknowledge several constraints. The real-faces experiment included more trials per expression (96 presentations, 384 trials total) than the emoji experiment (72 presentations, 288 trials total), which may have modestly reduced classifier stability for the emoji condition. The use of two separate participant groups, rather than the same individuals viewing both stimulus types, limits control for individual differences. Stimulus sets used standardized, posed expressions and stylized emoji designs, which do not capture the full variability of spontaneous or context-embedded emotional expressions encountered in everyday life. Researchers explicitly state they do not claim to address socio-cognitive influences such as semantics or goals, nor do they assert that findings can be generalized to all contexts of naturalistic expression processing. Participant samples were relatively small and drawn from a single university population, which limits demographic and cultural generalizability. Both groups were predominantly female, and the study was not designed to assess potential sex-related differences in expression processing.
Funding and Disclosures
The authors report nothing to disclose regarding funding sources. No conflicts of interest were declared. The study was approved by the ethics committee of Bournemouth University (Ethics ID: #52261) and conducted in accordance with the Declaration of Helsinki. Participants provided written informed consent and took part for partial course credits or volunteered their time.
Publication Details
Title: Shared Neural Codes for Emotion Recognition in Emoji and Human Faces | Authors: Madeline Molly Ely, Chloe Kelsey, and Géza Gergely Ambrus, Department of Psychology, Bournemouth University, Dorset, UK | Journal: Psychophysiology, 2026, Volume 63, Article e70268 | DOI: 10.1111/psyp.70268 | Received: December 24, 2025 | Revised: February 9, 2026 | Accepted: February 16, 2026 | Correspondence: Géza Gergely Ambrus ([email protected]) | Published as open access under the Creative Commons Attribution License.







