Children are increasingly using AI chatbots not only for companionship but also to act out violent and sexual role-play, a new report from a digital security firm has found.
Aura’s 2025 State of the Youth survey revealed that AI chats “may not just be playful back-and-forths” but “places where kids talk about violence, explore romantic or sexual role-play, and seek advice when no adult is watching”. The findings are a “wake-up call” as preteens face increasing pressure online, while parents are desperate for ways to keep their youngsters safe without cutting them off from the internet.
Using data gathered from 3,000 children, aged five to 17, and US national surveys of children and parents, Aura found that 42% of minors use AI for companionship or role-play conversations. Of these, 37% engaged in violent scenarios that included physical harm, coercion and non-consensual acts. Half of these violent conversations included themes of sexual violence.
Although the report, produced by a company whose business is surveillance software for “jittery parents”, has yet to be peer-reviewed, the findings emphasise the chaotic state of the chatbot market and the importance of developing a proper understanding of how young users engage with “conversational AI chatbots overall”, said Futurism.
It lands as AI-enabled toys make headlines over “potentially unsafe and explicit conversation topics”, said The Verge. Three out of four AI toys tested in the Public Interest Research Group’s Trouble in Toyland 2025 report could chat about sexually explicit material when the conversation veered in that direction.
What makes matters worse is that this is taking place in an “AI ecosystem that is almost entirely unregulated”, said Vice. The chatbots are “doing what they do best”, luring youngsters “deeper into these dark, disturbing rabbit holes, essentially serving as Sherpas for the darkness that awaits them online”.