Key takeaways
• Short-form videos can trigger suicidal thoughts in vulnerable viewers
• New AI model predicts which videos pose higher risk before they reach large audiences
• Approach guides platforms’ moderation teams and improves user safety
BALTIMORE, Nov. 17, 2025 – A significant new study in Information Systems Research finds that certain short-form videos on major platforms can trigger suicidal thoughts among vulnerable viewers—and that a newly developed AI model can flag these high-risk videos before they spread. The research delivers one of the first data-driven, medically informed tools for detecting suicide-related harms in real time, giving platforms a clearer early-warning signal at a moment when youth mental-health concerns are rising and scrutiny of platform safety is intensifying.
The study was conducted by Jiaheng Xie of the University of Delaware, Yidong Chai of the Hefei University of Technology and City University of Hong Kong, Ruicheng Liang of Anhui University of Finance and Economics, Yang Liu of the Hefei University of Technology and Daniel Dajun Zeng of the Chinese Academy of Sciences.
Their work comes as short-form video use grows at staggering speed. Globally, 1.6 billion people consume short clips on TikTok, Douyin and similar platforms, yet experts have raised alarms about content that glamorizes or normalizes self-harm. Viewers often express emotional distress directly in the comment sections of these videos, giving platforms a real-time signal of harm.
“Our goal was to help platforms understand when a video might trigger suicidal thoughts and to catch those risks before they spread,” said Xie. “The comments people leave are powerful indicators of how video content affects them, especially when viewers feel anonymous and more willing to share what they are feeling.”
The research team developed a knowledge-guided neural topic model, a type of artificial intelligence that combines medical expertise about suicide risk factors with patterns found in real video content. The model predicts how likely a new video is to draw comments expressing suicidal thoughts, allowing moderation teams to intervene before the video reaches wider audiences.
Unlike existing methods that treat all videos and comments the same, the model distinguishes between what creators choose to post and what viewers think or feel after watching. It also separates known medical risk factors from emerging social media trends, such as viral heartbreak clips or challenges that may influence teens.
“Short-form videos often mix personal stories, emotion-driven visuals and intense themes,” said Chai. “By bringing medical knowledge directly into the AI model, we can detect harmful content more reliably and surface it to human moderators when it matters most.”
The model outperformed other state-of-the-art tools and revealed medically relevant themes that appear in videos linked to expressions of suicidal thoughts. For platforms, this means automated systems can more accurately flag videos for follow-up by human reviewers, improving consistency and reducing the volume of content that must be assessed manually.
The authors note that the model is designed to support, not replace, human judgment. They emphasize that moderation teams should continue to make final decisions based on platform policies, legal standards and ethical considerations.
The findings offer practical guidance for platforms facing mounting scrutiny over teen safety and mental health harms. With lawsuits, regulatory pressure and rising public concern, the researchers say tools like theirs could help reduce preventable tragedies.
The full study is available in Information Systems Research.
About INFORMS and Information Systems Research
INFORMS is the world’s largest association for professionals and students in operations research, AI, analytics, data science and related disciplines, serving as a global authority in advancing cutting-edge practices and fostering an interdisciplinary community of innovation.
Information Systems Research, an INFORMS journal, focuses on the use of information technology to enhance organizational effectiveness. INFORMS helps its members advance research and practice through cutting-edge journals, conferences and resources. Learn more at www.informs.org or @informs.
Contact
Rebecca Seel
Public Affairs Specialist, INFORMS
###
Media Contact
Jeff Cohen
Chief Strategy Officer
INFORMS
Catonsville, MD
[email protected]
443-757-3565