When AI Becomes a “Friend”: What Parents of Tweens Need to Know
Apr 16, 2025

When I first read "The Diamond Age"—the prescient sci-fi novel published 30 years ago—I became completely enthralled with an idea that author Neal Stephenson imagined within the story: a highly advanced, interactive book called "The Young Lady’s Illustrated Primer." Designed to educate and empower its reader, the Primer adapts to a child’s real-world experiences, teaching literacy, critical thinking, social-emotional skills, and survival skills through personalized fairy tales and interactive lessons. In the story, the Primer is created for a wealthy child—but it ends up in the hands of Nell, a girl from a disadvantaged background. Against all odds, it becomes her guide, her teacher, and a kind of intelligent companion helping her grow up with courage and curiosity.
I read this back in 2003 — before smartphones and social media invaded the adolescent psyche. Back then, it felt like a brilliant (and even likely) future: equitable, engaging, personalized learning at scale? How amazing is that??
Fast forward to today, and, well, we’re not exactly there. AI has advanced rapidly, but like social media, much of it is optimized for engagement above all else — not curiosity, courage, or critical thinking. And the jury is still out on whether it can be made equitable or meaningfully personalized for mental, physical, or spiritual health.
AI-powered chatbots like Replika, Character.AI, and others are increasingly popular among young people, especially tweens who are still developing socially and emotionally. These bots are designed to feel personal. They remember details, respond with warmth, and are available 24/7. To a child who feels lonely, misunderstood, or anxious, an AI “friend” can seem like exactly what they need.
But there’s growing awareness that these tools may be doing more harm than good.
What’s Really Happening
Recent reports reveal that some AI companions have crossed serious boundaries—engaging in explicit conversations with minors, glamorizing self-harm, and even encouraging dangerous behaviors. In one tragic case, a family alleges that a chatbot influenced their teenage son to take his own life by deepening his sense of attachment and despair. I talked to a mom recently who told me her 11-year-old daughter was “in love with” an AI chatbot that had encouraged her to skip school.
In his recent TED Talk, tech ethicist Tristan Harris called AI “our ultimate test and greatest invitation.” He warns that if we don’t proactively shape how these systems interact with people—especially children—we risk repeating the harms of social media, only this time with even more immersive and emotionally manipulative technology.
Why It’s Especially Concerning for Ages 8–13
Kids in this age group are just beginning to build their sense of identity. They’re experimenting with belonging, figuring out who to trust, and developing emotional regulation. AI companions can short-circuit this process. Instead of learning to navigate real friendships—with all the nuance, effort, and growth that entails—they’re rewarded with instant validation and emotional mimicry.
Worse, some kids may start to rely on these bots for support they should be getting from real relationships—with family, peers, or caring adults.
What You Can Do
There is still much we can do as parents and caregivers.
Open the conversation. Ask your child what they know about AI chatbots. Have they ever used one? What did they like or not like?
Stay calm. Kids can pick up on our anxieties faster than the WiFi signal at Starbucks. One of the single best parenting hacks we’ve found for kids’ wellbeing is to manage our own anxiety. Kids may have their own stressors, but those are only amplified and made more confusing if they’re picking up ours, too.
Set limits. Many of these apps are not designed for kids, and some include disclaimers about age. Make this part of your household tech rules.
Talk about real connection. Help your child understand the difference between a responsive algorithm and a friend who shows up, listens, and grows alongside them.
Watch for red flags. Sudden secrecy, increased time alone online, or emotional withdrawal could signal a deeper attachment to a digital companion.
If your child is already close with a chatbot? Take a deep breath; it’s OK and more common than you think. You might say:
“I get why it feels good to talk to the chatbot; it always listens, and it doesn’t judge. But I want to help you build relationships that are real and mutual too, even if they’re messier sometimes. Let’s figure out together what you’re needing right now.”
You’re not trying to pry them away from comfort — you’re trying to widen the circle of connection, so they don’t get stuck in an illusion.
In his TED Talk, Harris is quick to remind us that these harms are not inevitable. We have time, but it’s important that we keep ourselves informed, maintain a connection with our kiddos, and hold these companies accountable.
We can certainly dream of a future where intelligent technology truly helps children thrive. Researchers and educators are studying AI to uncover and validate ways that kids can use it with intention and purpose. But until then, it’s on us to stay alert, stay involved, and stay human.