Tech Beetle briefing GB

Teenagers Increasingly Turn to AI Chatbots for Mental Health Support Amid Service Gaps

Essential brief


Key facts

One in four teenagers in England and Wales have used AI chatbots for mental health support in the past year.
Usage is highest among young people affected by violence; chatbots offer privacy and 24/7 accessibility.
Conventional mental health services face long waiting lists and may lack empathy, driving teens to AI alternatives.
Experts warn that AI cannot replace human support and call for better regulation and youth involvement in decisions.
OpenAI is improving chatbot responses to mental distress and may notify authorities in serious cases.

A recent study in England and Wales reveals that about 25% of teenagers aged 13 to 17 have used AI chatbots for mental health support in the past year.

This trend is particularly pronounced among youth affected by violence, with nearly 40% of those involved in or victims of youth violence turning to AI tools like ChatGPT and Snapchat’s AI.

The research, conducted by the Youth Endowment Fund with over 11,000 participants, highlights that AI chatbots are filling a critical gap left by conventional mental health services, which often have long waiting lists and are perceived by some young users as lacking empathy.

Teenagers report that chatbots provide a sense of privacy, accessibility, and non-judgmental interaction that traditional services sometimes fail to offer.

For example, one Tottenham teenager, Shan, found AI chatbots to be a safer and more approachable source of support after experiencing trauma from the deaths of close friends.

She described the AI as a “friend” that is available 24/7 and does not share information with parents or teachers, which is a significant concern for many young users wary of confidentiality breaches.

The study also notes disparities in usage, with Black children twice as likely as white children to use AI for mental health support.

However, experts and youth leaders caution against over-reliance on AI, emphasizing the importance of human interaction in mental health care.

Jon Yates, CEO of the Youth Endowment Fund, stressed that vulnerable young people “need a human not a bot.” There are also growing concerns about the risks of prolonged engagement with chatbots, particularly in light of reports of tragic outcomes linked to AI interactions.

OpenAI, the developer of ChatGPT, has responded by enhancing its systems to better recognize distress and guide users toward real-world help, including potentially alerting authorities in serious cases.

Researchers call for increased regulation and youth-led decision-making to ensure safe and effective use of AI in mental health contexts.

While AI chatbots offer immediate and accessible support, they are not substitutes for professional care.

For those seeking help, various helplines such as Papyrus and Samaritans in the UK, the 988 Lifeline in the US, and Lifeline in Australia provide vital human support.