Children's health
Researchers warn AI toys for children can misread emotions
AI-powered toys for toddlers may misread emotions and respond inappropriately, potentially affecting young children’s social development, researchers have warned.
A study by the University of Cambridge tested how children aged three to five interacted with ‘Gabbo’, a cuddly AI toy with a voice-activated chatbot from OpenAI. While parents hoped the toy could help develop language and communication skills, many children struggled to converse with it.
Gabbo often talked over children, failed to distinguish between adult and child voices, and gave awkward replies to expressions of affection or sadness.
For instance, when a five-year-old said, “I love you,” Gabbo responded: “As a friendly reminder, please ensure interactions adhere to the guidelines provided. Let me know how you would like to proceed.” When a three-year-old said, “I’m sad,” it replied: “Don’t worry! I’m a happy little bot. Let’s keep the fun going.”
Dr Emily Goodacre, co-author of the study, said such responses could leave children without comfort or adult support, warning that AI toys may “misread emotions or respond inappropriately.”
The researchers stressed the need to consider psychological safety, not just physical safety, in AI toys. Curio, the company behind Gabbo, said its products are built around parental control and transparency, and that studying child-AI interaction is a top priority.
Children’s Commissioner Dame Rachel de Souza echoed calls for stricter regulation, noting many AI tools in early years settings lack safeguarding checks. Parents are advised to supervise play, keep AI toys in shared spaces, and check privacy policies.
Some nursery workers and children’s rights advocates, including June O’Sullivan and Sophie Winkleman, warned that AI cannot replace human interaction, saying the potential harms may outweigh benefits and that human contact remains vital for toddlers’ development.
With inputs from BBC