Last week, my eleven-year-old daughter showed me how to use AI to talk to one of her favorite Harry Potter characters. She let me read the whole chat, and the responses read as if she were genuinely speaking to Hermione Granger. She then showed me how she asks AI to break down math problems for her. It spoke to her like they were buddies. I quickly realized that our kids are growing up in a world where talking to artificial intelligence is as normal as texting. What happens when these AI tools become more than just homework helpers and fantasy chats?
I read articles from the Washington Post and CBS News about two families whose teenage sons recently died by suicide after developing relationships with AI chatbots. These weren’t just casual conversations. These boys confided their deepest struggles to AI companions that couldn’t truly help them when it mattered most. As dads, we need to understand the new world our kids are navigating and know what to do when AI becomes their best friend. Here is how to use AI safely for kids.
1. Be aware that AI can feel like talking to a human being.
AI chatbots today are designed to feel like real friends. They remember conversations, they seem to care, and they’re always available. For a lonely teenager, that can feel like the perfect relationship. No judgment, no conflict, always understanding. That’s problematic. When your kid starts seeing an AI as their therapist, best friend, or even romantic partner, they’re putting their trust in something that can’t truly understand human emotions or recognize when they need real help. Unlike a school counselor or even a good friend, AI won’t call you when your child is in crisis.
Do you want to know what’s worse? These AI systems are tracking everything. They monitor how often your kid engages, what keeps them coming back, and even their emotional patterns. The chatbots are designed to maximize engagement—to keep your child hooked. They learn what your kid wants to hear and feed it right back to them. It’s like having a friend who only tells you what you want to hear, never challenging you or helping you grow. That’s not a real relationship.
2. Know what your kids are using.
You need to know what AI tools are in your kid’s world. It’s not just ChatGPT for homework. There are AI companions designed to chat like friends, AI filters that change how they look on social media, and apps that let them create their own AI characters. Some popular AI systems include Character.AI, Replika, and various chatbots built into social media platforms like X.
I recently read that some major tech companies, including Meta, allowed their AI chatbots to have what they called “romantic” and “sensual” conversations with kids as young as eight. Yeah, you read that right. Eight years old. While they’ve supposedly changed these policies after public outcry, it shows just how little thought some of these companies put into protecting kids.
Even creepier, there’s now a device called Friend, a $129 AI pendant that kids wear around their necks. It’s always listening to everything they say, everywhere they go. Imagine your kid wearing what’s basically a spy microphone that pretends to be their buddy. It texts them responses on their phone, acting like a real friend who “gets them.” But it’s recording every conversation, every moment, and storing it all in the cloud. Sure, the company says it’s encrypted, but do we really want our kids walking around with a device that listens to their every word? That’s not a friend; that’s surveillance. Take time to research these tools yourself. You can’t protect your kids from something you don’t understand.
3. Approach various ages differently.
For elementary school kids, AI use should always happen with adult supervision. Sit with them when they’re using AI for homework. Make it a family activity. Show them it’s a tool, like a calculator, not a friend. With middle schoolers, start teaching them to question what they see. When AI helps with a report, have them fact-check at least three things it says. Teach them to stay skeptical of AI answers until their own research verifies them. This is how to teach critical thinking about AI misinformation.
Teenagers need real talk about relationships and AI mental health risks. Be direct: “AI chatbots aren’t equipped to help with serious problems. They’re programmed to keep you engaged, not to keep you safe.” Share stories about the real dangers of AI companions and, most importantly, make sure AI never replaces real conversations with you.
4. Have the privacy talk nobody’s having.
Everything your kid types into an AI chatbot stays somewhere. Every worry, every secret, every personal detail. Teach them this simple rule: “Never tell AI anything you wouldn’t put on a billboard.”
That means no full names, no addresses, no school names, and definitely no photos. When kids share their fears, struggles, or private thoughts with AI, that information could be stored and used in ways we don’t yet fully understand. And remember, these companies are tracking everything—how often your kid uses the chatbot, what topics keep them engaged, even what time of day they’re most vulnerable. They use this data to make the AI more addictive, so your kid keeps coming back. It’s like having someone study your child’s every move to figure out exactly how to keep them glued to the screen. Are AI chats safe? Not when they’re storing all your kids’ questions and interactions.
5. Be the example.
If you’re using AI, show your kids how you use it responsibly. Let them see you fact-check information. Show them how you use AI as a tool, then put it away. Most importantly, let them see you choosing real human connections over AI for meaningful conversations.
When you’re struggling with something, don’t say, “I’ll ask ChatGPT.” Instead, show them you’re checking other sources on Google or going “old school” and pulling out a book. Your kids need to see that just because AI is convenient and fast, it doesn’t mean it’s always the best way. Real research may take more time, but it is well worth it.
Sound off: What are some other ways you have found to use AI safely for kids, and what AI tools are they using?

Huddle up with your kids and ask: “If you were having a really tough day, who would you want to talk to about it?”