ADHD on TikTok


In A Nutshell

  • A review of 27 studies found that misinformation about mental health and neurodivergent conditions is widespread on social media, with TikTok having the highest rates of any platform analyzed.
  • More than half of TikTok videos about ADHD contained inaccurate or unsupported claims; autism content on the platform was wrong 40 to 41 percent of the time.
  • Content about ADHD and autism had consistently higher misinformation rates than content about other mental health conditions, likely because these topics attract more personal storytelling and anecdote than clinical input.
  • Researchers called on mental health organizations, clinicians, and platforms to do more to counter misinformation and strengthen content moderation.

A teenager watches a few TikTok videos about ADHD and starts to wonder if that’s why she can’t focus in school. A young man stumbles into an autism corner of the app and decides the traits he’s seeing must be his own. Many people now turn to social media for mental health answers before ever speaking to a doctor. A new review of the research suggests that when they land on TikTok specifically, what they find is often inaccurate.

Researchers at the University of East Anglia analyzed 27 studies covering more than 5,000 social media posts and videos across TikTok, YouTube, Facebook, Instagram, and X, formerly known as Twitter. Published in the Journal of Social Media Research, the review found that on TikTok, more than half of videos about ADHD contained inaccurate or scientifically unsupported claims. Autism content on the platform was inaccurate 40 to 41 percent of the time. In some cases, especially for ADHD, misleading claims were more common than accurate ones. For a generation increasingly turning to short-form video for health guidance and self-diagnosis, those numbers matter.

Why Self-Diagnosing From TikTok Mental Health Content Is Dangerous

Researchers noted that young people self-diagnosing with ADHD, autism, and other mental health conditions after watching social media content has become a documented trend. When the content driving those self-diagnoses is inaccurate, the consequences can be serious: delayed or inappropriate treatment, unnecessary anxiety about symptoms that may have a different explanation, and a distorted picture of what these conditions actually involve. The paper cautions that research on these outcomes is still developing, and that links between online misinformation and real-world harm, while plausible and supported by related evidence, have not been definitively proven in all cases.

Inaccurate beliefs about the causes of mental illness, such as the idea that mental health problems stem from personal weakness, can discourage people from seeking professional help. Content promoting unproven treatments may push someone further down the wrong path before they ever see a clinician.

Across all platforms and topics, misinformation rates averaged roughly 26 percent. On TikTok, that average climbed to nearly 35 percent. (© tashatuvango – stock.adobe.com)

TikTok Mental Health Misinformation by the Numbers

Researchers searched four major academic databases and identified 27 studies evaluating the accuracy, quality, or reliability of mental health and neurodivergence-related social media content. In total, 5,057 posts and videos were analyzed. YouTube was the most studied platform, appearing in 18 studies, followed by TikTok with 5, Facebook with 2, and Instagram and X with one each. The X study focused specifically on schizophrenia-related content.

Researchers looked at two broad categories: mental health conditions such as depression, anxiety, bipolar disorder, and anorexia, and neurodivergent conditions including ADHD and autism. Misinformation was defined as any claim that was false or misleading, or that rested on anecdote rather than scientific evidence.

Across all platforms and topics, misinformation rates averaged about 26 percent, though measurement methods varied across studies. On TikTok, that average climbed to nearly 35 percent. ADHD content on TikTok hit 52 percent in one study, while autism content ran between 40 and 41 percent across two separate studies. Videos about postpartum depression on YouTube and Facebook had some of the lowest rates in the review, ranging from about 3 to 8 percent.

Why ADHD and Autism Face the Worst TikTok Mental Health Misinformation Rates

Content about ADHD and autism consistently had higher misinformation rates than mental health content more broadly. Researchers suggest this may be because conditions with less clearly defined public understanding, ones that attract heavy volumes of personal storytelling and anecdote online, are especially prone to distortion. When people lack a solid baseline knowledge of what a condition involves, oversimplified takes spread more easily.

TikTok’s design makes things worse. Its recommendation algorithm serves users content that matches their existing viewing behavior, creating feedback loops where inaccurate claims get reinforced rather than challenged. Researchers noted that platform-specific factors like algorithmic design and content moderation may influence how misinformation spreads. That connection is supported by existing theory on echo chambers and confirmation bias, though the review did not directly measure algorithm effects.

YouTube averaged around 22 percent misinformation overall, well below TikTok’s rate. YouTube Kids scored the lowest of all, with one study reporting no misinformation in content about anxiety and depression aimed at children, most likely a result of that platform’s stricter content standards.

Who’s Sharing the Most Reliable Mental Health Information Online

Across nearly every platform and topic in the review, content made by healthcare professionals scored higher on reliability and quality than material from non-professionals. On YouTube, videos from doctors and healthcare organizations consistently outperformed content from patients, news channels, and general users on standardized measures.

One finding cut against that pattern. For bipolar disorder, content from medical professionals and patients scored comparably on reliability measures, with no meaningful difference between the two groups. Researchers flagged this as a concern, since professionals are generally expected to outperform non-professionals on accuracy. This may reflect inconsistencies in how bipolar disorder is presented online rather than a gap in clinical knowledge, though the review was not designed to explain the disparity.

A disconnect between accuracy and usefulness also appeared in the data. A video can avoid outright falsehoods while still leaving a viewer with an incomplete picture of a condition, which matters just as much when someone is using that content to make decisions about their health.

Researchers called on mental health organizations and clinicians to get more active on social media by creating and sharing accurate, evidence-based content. They also pushed for stronger platform moderation and clearer, more consistent standards around what constitutes mental health misinformation. As the authors wrote, “Addressing these issues is vital to protect public mental health and improve the reliability of online information.”


Disclaimer: This article is based on a systematic review of previously published studies and does not constitute medical advice. Misinformation rates varied across individual studies and measurement methods. If you have concerns about your mental health or a potential diagnosis, consult a licensed healthcare professional.


Paper Notes

Limitations

Most of the 27 included studies focused on YouTube, which limited platform comparisons for Facebook, Instagram, and X. Studies assessing other qualities of social media content, such as how understandable or actionable the information was, fell outside the review’s scope. The included studies also relied on a mix of evaluation tools, including different versions of the DISCERN reliability scale and the Global Quality Scale, which made direct comparisons across platforms and topics harder to draw. Finally, most studies analyzed content in only one language, and the majority came from the Global North, which limits how broadly the findings apply internationally.

Funding and Disclosures

This study received no specific funding from public, commercial, or nonprofit organizations. Authors declared no conflicts of interest.

Publication Details

Carter, A., Gracey, F., Moody, J., Ovens, A., & Chatburn, E. (2026). Quality, reliability and misinformation in mental health and neurodivergence content on social media: A systematic review. Journal of Social Media Research, 3(1), 60-77. DOI: https://doi.org/10.29329/jsomer.84

The five authors are affiliated with the University of East Anglia’s Norwich Medical School, Department of Clinical Psychology and Psychological Therapies (Carter, Gracey, Ovens, Chatburn); Norfolk and Suffolk NHS Foundation Trust (Moody); and the University of Cambridge, Department of Public Health and Primary Care (Chatburn).

About StudyFinds Analysis

Called "brilliant," "fantastic," and "spot on" by scientists and researchers, our acclaimed StudyFinds Analysis articles are created using an exclusive AI-based model with complete human oversight by the StudyFinds Editorial Team. For these articles, we use an unparalleled LLM process across multiple systems to analyze entire journal papers, extract data, and create accurate, accessible content. Our writing and editing team proofreads and polishes each and every article before publishing. With recent studies showing that artificial intelligence can interpret scientific research as well as (or even better than) field experts and specialists, StudyFinds was among the earliest to adopt and test this technology before approving its widespread use on our site. We stand by our practice and continuously update our processes to ensure the very highest level of accuracy. Read our AI Policy (link below) for more information.

Our Editorial Process

StudyFinds publishes digestible, agenda-free, transparent research summaries that are intended to inform the reader as well as stir civil, educated debate. We neither agree nor disagree with any of the studies we post; rather, we encourage our readers to debate the veracity of the findings themselves. All articles published on StudyFinds are vetted by our editors prior to publication and include links back to the source or corresponding journal article, if possible.

Our Editorial Team

Steve Fink

Editor-in-Chief

John Anderer

Associate Editor
