If a therapy bot walks like a duck and talks like a duck then it is a medically regulated duck

Bibliographic Details
Main Authors: Ostermann, Max; Freyer, Oscar; Verhees, F. Gerrik; Kather, Jakob Nikolas; Gilbert, Stephen
Format: Article (Journal)
Language: English
Published: 05 December 2025
In: npj Digital Medicine
Year: 2025, Volume: 8, Pages: 1-5
ISSN: 2398-6352
DOI: 10.1038/s41746-025-02175-z
Online Access: Publisher, free of charge, full text: https://doi.org/10.1038/s41746-025-02175-z
Publisher, free of charge, full text: https://www.nature.com/articles/s41746-025-02175-z
Description
Summary: Large language models (LLMs) are increasingly used for mental health interactions, often mimicking therapeutic behaviour without regulatory oversight. Documented harms, including suicides, highlight the urgent need for stronger safeguards. This manuscript argues that LLMs providing therapy-like functions should be regulated as medical devices, with standards ensuring safety, transparency and accountability. Pragmatic regulation is essential to protect vulnerable users and maintain the credibility of digital health interventions.
Item Description: Published: 05 December 2025
Accessed on 27 February 2026
Physical Description: Online Resource