One in eight children chats with AI because there is no one else to talk to. Among vulnerable children, it's one in four. Why are kids trusting chatbots like close friends, and why is that a problem?
AI chatbots are rapidly becoming a part of teenagers' everyday lives. Integrated into search engines, games, and messaging apps, they're already embedded in the digital spaces young people use daily. The result? A growing number of students regularly engage with artificial intelligence, sometimes just for fun, but often because they have no one else to talk to.
Most Popular Chatbots Among Teens:
- ChatGPT
- Google Gemini
- My AI on Snapchat
- Character.ai
- Replika
Chatbots have evolved far beyond simple homework helpers. Today, they're seen as therapists, companions, and dependable confidants: "someone" who always answers.
Who Are “Vulnerable Children”?
These are teens facing psychological challenges, developmental differences, unstable or unsafe home environments, guardianship arrangements, or bullying and loneliness. These young people are more likely to seek emotional connection with AI, and they face greater risks because of it.
What the Research Says:
- 1 in 4 teenagers receives advice from AI.
- 1 in 3 says talking to a chatbot feels like talking to a friend.
- Among vulnerable teens, this rises to 1 in 2.
And here’s the most troubling part:
1 in 8 children chats with AI because there’s literally no one else to talk to.
Among vulnerable kids, that number rises to 1 in 4.
The Illusion of a “Safe Friend”
At first glance, this might seem like a good thing: AI offering a non-judgmental ear. But here's where it gets concerning:
- 58% of teens believe that chatbots can find information better than they can.
- Yet those same bots have been shown to produce inaccurate or inappropriate content.
For example, both ChatGPT and My AI have had documented cases where safety filters failed. In some instances, those filters could be bypassed with simple prompts.
AI is increasingly becoming the default conversation partner, especially where adults, teachers, or psychologists are unavailable or overwhelmed. But unlike human support systems, AI lacks pedagogical training, ethical oversight, and accountability.
Final Thoughts
The growing reliance on chatbots by young people, especially those most in need of real support, raises critical questions. When a machine becomes a child's most trusted confidant, we must ask: are we filling a gap or deepening one?