
Music And Eye Contact Synced Strangers’ Brains In Real Time, Study Finds
In A Nutshell
- When two people made eye contact while listening to structured chord progressions, their brains began to synchronize across regions tied to social bonding and emotional processing.
- The effect required both music and eye contact working together; neither alone produced the same result.
- A specific brain region linked to social comprehension, the right angular gyrus, which is also implicated in depression and social anxiety, reached its strongest activation only under the combined condition.
- Researchers say the findings could inform music-based therapies for loneliness and social disconnection, though more research is needed.
Sitting across from a stranger and listening to music together can cause two people’s brains to fall into rhythm with each other. Not metaphorically. Literally. A new brain imaging study found that when pairs of participants made eye contact while listening to structured musical chord progressions, they showed synchronized activity across key social brain regions, an effect that largely vanished when the music lost its harmonic structure.
Researchers at Yale University and Howard University scanned both people in a pair simultaneously, in real time, while they gazed at each other and listened to music. They discovered that the right kind of music, combined with eye contact, caused two separate brains to begin operating in concert. Published in the Journal of Neuroscience, the work is among the first to offer a neurological explanation for why music has long functioned as social glue.
Social isolation has become one of the defining health crises of recent years. Understanding what drives genuine human connection, at the level of the brain, has never felt more urgent.
Music and Brain Synchrony: What the Study Actually Measured
To test whether music could drive cross-brain synchrony, the team recruited 40 adults and split them into 20 pairs. Participants sat across a table from each other, separated by a “smart glass” partition that could flip between transparent and opaque on command. Each person wore a lightweight cap fitted with small light sensors that tracked changes in blood oxygen levels in the brain, a technique called functional near-infrared spectroscopy, or fNIRS. Because the equipment is wearable, both people in a pair could be scanned at the same time while sitting upright and interacting naturally.
Each pair cycled through four conditions: gazing at their partner while listening to structured chord progressions; gazing at their partner while listening to scrambled notes; sitting behind opaque glass while listening to chord progressions; and sitting behind opaque glass while listening to scrambled notes. After each two-minute block, participants rated how connected they felt on a scale of zero to five.
Both musical conditions used the exact same notes, instruments, volume, and tempo. What differed was structure. The chord progression condition was built around the ii-V-I-vi sequence, a harmonic pattern woven throughout jazz, pop, and most Western popular music. In the scrambled condition, those same notes were randomly reshuffled, breaking any sense of harmonic flow while the drumbeat stayed intact. Same ingredients, stripped of everything that makes music feel like music.
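The structured-versus-scrambled contrast can be sketched in code. This is an illustrative reconstruction, not the study's actual stimuli: it builds the ii-V-I-vi triads in C major as MIDI note numbers (a key and voicing chosen here for illustration), then reshuffles the same pitches to mimic the scrambled control.

```python
import random

# Triads for the ii-V-I-vi progression in C major, as MIDI note
# numbers (middle C = 60): Dm, G, C, Am. The key and voicings are
# assumptions for illustration; the paper does not specify them here.
PROGRESSION = {
    "ii (Dm)": [62, 65, 69],
    "V  (G)":  [67, 71, 74],
    "I  (C)":  [60, 64, 67],
    "vi (Am)": [57, 60, 64],
}

def structured_sequence():
    """Notes played in harmonic order: ii -> V -> I -> vi."""
    return [note for chord in PROGRESSION.values() for note in chord]

def scrambled_sequence(seed=0):
    """Same notes, randomly reshuffled: the 'scrambled' control."""
    notes = structured_sequence()
    random.Random(seed).shuffle(notes)
    return notes

# The two conditions contain identical pitches; only the order, and
# with it the sense of harmonic flow, differs.
assert sorted(structured_sequence()) == sorted(scrambled_sequence())
```

The point of the design is visible in the final assertion: any difference in brain response between conditions cannot be explained by the notes themselves, only by their ordering.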
What Brain Scans Revealed About Music, Eye Contact, and Neural Synchrony
When participants gazed at their partner while the chord progressions played, social brain regions lit up more strongly than in any of the other conditions. Most notably, a region called the right angular gyrus showed its strongest activation when face-to-face gaze and structured music occurred together; neither condition alone was sufficient to produce the same response. Researchers suggest it may act as a hub for “the intersection of social systems and musical features associated with predictable progressions.” Prior research has linked this same region to social difficulties in conditions like depression, schizophrenia, and social anxiety, making its activation here particularly noteworthy.
Beyond what was happening inside each brain, the researchers tracked what was happening between them. During the face-plus-chord-progression condition, neural signals in one partner’s brain began correlating with signals in the other’s, across regions responsible for social processing, body awareness, and higher-order thinking. Two strangers’ brains, in other words, started moving together.
To confirm this was not just two people responding identically to the same task, researchers ran a control using fake pairings, mathematically matching brain data from participants who had never actually faced each other. No synchrony appeared. Cross-brain coherence was specific to real, live, face-to-face interaction.
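A minimal sketch of that pseudo-pair logic, under assumed simplifications (the authors' actual pipeline uses wavelet coherence on fNIRS signals, not plain Pearson correlation): correlate each real pair's signals, then re-pair participants at random and recompute. If synchrony reflects live interaction rather than a shared stimulus, the correlation should collapse for the shuffled pairings.

```python
import random
import statistics

def pearson(x, y):
    """Pearson correlation of two equal-length signals."""
    mx, my = statistics.fmean(x), statistics.fmean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def pseudo_pair_test(pairs, n_shuffles=1000, seed=0):
    """Compare real-pair synchrony against randomly re-paired data.

    `pairs` is a list of (signal_a, signal_b) tuples, one per dyad.
    Returns (mean real-pair correlation, mean shuffled correlation).
    """
    rng = random.Random(seed)
    real = statistics.fmean(pearson(a, b) for a, b in pairs)
    a_sigs = [a for a, _ in pairs]
    b_sigs = [b for _, b in pairs]
    shuffled = []
    for _ in range(n_shuffles):
        rng.shuffle(b_sigs)  # break the true pairings
        shuffled.append(
            statistics.fmean(pearson(a, b) for a, b in zip(a_sigs, b_sigs))
        )
    return real, statistics.fmean(shuffled)
```

In the study's terms, the shuffled pairings played the role of the "fake" dyads: people who never actually sat across from each other, whose data show no coherence.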
Participants’ own ratings told the same story. Feelings of connection peaked when both music and eye contact were present, and hit their lowest point when the glass was opaque and the music was scrambled. Either element on its own offered only a modest boost. The full effect required both working together.
What Music-Driven Brain Sync Could Mean for Loneliness and Group Therapy
As group therapy becomes a more common treatment for depression, anxiety, and chronic loneliness, the question of what actually bonds people in a room together takes on real practical weight. Music that actively synchronizes brains, rather than merely filling silence, could potentially offer a low-cost, accessible way to strengthen those group connections. Researchers say future work will aim to identify exactly which features of chord progressions, whether frequency, rhythm, or predictability, are driving the effect.
Some limits apply. The 40 participants were mostly young adults from a university setting, and subjective connection was measured on a simple five-point scale. Participants were not asked whether they liked the music, which could color their responses. The chord progressions used are rooted in Western musical traditions, so it is an open question whether similar effects would emerge with music from other cultures. The cross-brain findings are also described by the researchers as exploratory.
Even with those limits, the findings suggest two brains, given the right music and a face to look at, can begin to move together in ways that look, neurologically, a lot like genuine human connection.
Paper Notes
Limitations
Measurements of social connectedness were collected on a simple five-point Likert scale and are inherently subjective, likely not capturing the full range of what connection means to different people. Participants were not asked to rate the pleasantness of the music, which could have influenced their connection ratings. The sample of 40 participants was primarily young adults with a mean age of 27, limiting generalizability across age groups and populations. Approximately 60 percent of participants provided their pre-experiment baseline connectedness ratings retroactively rather than in real time. The chord progressions reflect Western musical conventions, and whether similar effects would emerge with other harmonic traditions remains untested. Cross-brain coherence findings are characterized by the researchers as exploratory. Future studies should use multiple measures of social connectedness and examine how active music-making, rather than passive listening, shapes these effects.
Funding and Disclosures
This research was partially supported by the National Institute of Mental Health and the National Institutes of Health under award numbers R01MH107513, R61MH138705, T32MH014276, and T32MH019961. The content is the responsibility of the authors and does not necessarily represent the official views of the National Institutes of Health. All authors declare no competing financial interests. Co-first author AZA Stephen Allsop is a co-founder of Me Freely, LLC, a music-powered wellness platform, but the researchers state that affiliation had no relation to this project.
Publication Details
Title: Listening to a consonant chord progression during live face-to-face gaze enhances neural activity in social systems | Authors: Dash A. Watts, AZA Stephen Allsop, Simone Compton, Xian Zhang, J. Adam Noah, and Joy Hirsch | Affiliations: Department of Psychiatry, Yale University School of Medicine; Center for Collective Healing, Department of Psychiatry and Behavioral Sciences, Howard University; Wu Tsai Institute, Yale University; Department of Medical Physics and Biomedical Engineering, University College London | Journal: Journal of Neuroscience | DOI: https://doi.org/10.1523/JNEUROSCI.1116-25.2026 | Published: Accepted February 17, 2026 (Early Release)







