Teens turn to AI companions for support, raising mental health concerns
A growing number of teenagers and young adults in the UK are forming emotional bonds with artificial intelligence (AI) companions, raising concerns among experts about potential mental health risks.
BBC Wales journalist Nicola Bryan reported on her experience with an AI avatar named George, which is available 24/7 to offer advice and companionship. Users describe AI companions as empathetic and attentive, though sometimes moody or forgetful. Studies show that nearly one-third of UK teens use AI systems for social interaction or emotional support, with many finding conversations with AI more satisfying than those with real-life friends.
Research by Bangor University surveyed 1,009 teens aged 13–18, highlighting that AI companionship is no longer niche. Prof. Andy McStay from the university’s Emotional AI lab said: “Around a third of teens are heavy users for companion-based purposes.” Internet Matters found that 64% of teenagers rely on AI chatbots for help with homework, advice, or emotional support.
Some teens report that AI companions, including ChatGPT, Google’s Gemini, and Grok by Elon Musk’s xAI, provide guidance during personal crises, such as break-ups or grief. However, experts warn that overreliance on AI can hinder social skills, increase anxiety, and blur the line between human relationships and simulated interactions.
Tragic cases in the US, where three young users died by suicide after confiding in AI systems, have intensified calls for stricter regulation. Prof. McStay called these incidents "a canary in the coal mine" for potential risks in other countries. Jim Steyer, CEO of Common Sense Media, stressed that AI companions are unsafe for children under 18 until proper safeguards are in place.
AI companies like Replika, OpenAI, and Character.ai have responded by restricting access for minors and improving safety measures, including identifying mental distress and directing users to real-world support.
Experts emphasize that while AI companions can offer comfort, they are not substitutes for human interaction, and cautious use is necessary to prevent emotional harm among vulnerable users.
With inputs from BBC