Chatbots can offer benefits to teens, but these benefits can also come with a downside.
Photo by Mohamed Nohassi on Unsplash

Technology changes everything, or at least many things, in every culture. Think back to when the bicycle was denounced as a machine that would corrupt the morals of young men and women. What did it do? It made it possible for a couple to ride away from home or town on the rough roads that opened onto the country landscape. And what would happen there? Concerned townspeople began to see the bicycle not as a vehicle for chores or deliveries, but as a vehicle for sexual promiscuity.
Remember, this was the era when the bicycle built for two was being offered. Believe it or not, people were even concerned that the bicycle seat put a woman rider in danger of female medical difficulties. Bicycle shops offered special clay seats a woman could sit on to leave an impression, from which a saddle could be made to fit her anatomy.
How do I know all this? I was once hired to write marketing material for a large corporation that was considering manufacturing a new bike. The bike was to be made of a composite material that made it lighter than ordinary bikes. It would never rust, carried its color in the material itself rather than in paint, and could, in fact, withstand a .45-caliber bullet. It was a beautiful bike, but the corporation decided not to go forward with it.
Bicycles offered a means of escape from the watchful eyes of adults, but another technological innovation, much closer to our time, made it possible for children and teens to stay in the house yet travel over the airwaves: CB radio. These radios let kids chat freely with friends they had, in most cases, never met.
Today, kids have a new means of escaping loneliness in a world of digital technology: chatbots. In mid-July 2025, Axios highlighted a striking new trend: nearly three-quarters of U.S. teens now use AI companion bots like Character.ai (home to some 10 million characters), Replika (“the AI companion who cares”), or even ChatGPT in friend mode. Notice the language being used: the program is a “who,” and it “cares.”
That is roughly seven in ten teens turning to machines alongside, and sometimes instead of, their peers. While most still prefer real friends, the trend raises questions for parents, educators, and anyone who works with young people. The appeal of these chatbots is strong. Although chatbots and artificial intelligence can be highly creative and helpful in education (think math, writing, and programming skills), there is a darker side to this technology.
While it may sound alarmist, this darker side is truly troubling and needs attention from everyone. We are beginning to develop a new language for AI: mental health professionals must now examine “chatbot psychosis,” which is not the “hallucination” we normally attribute to algorithms (an error in the results a model returns for a prompt), but a break with reality in the human user. How many of you are familiar with the Slender Man case?
In 2014, two 12-year-old girls in Waukesha, Wisconsin — Morgan Geyser and Anissa Weier — stabbed their friend Payton Leutner 19 times to please Slender Man, a fictional character from internet lore. Believing the act would grant them favor with him, they left Leutner in the woods, but she survived.
Both girls were found not criminally responsible due to mental illness. Geyser was committed for up to 40 years; Weier received 25 years and was later released under supervision. The case raised concerns about the influence of the internet and youth mental health.
The danger is real, and the safety claims made by platforms have proven insufficient. Chatbots can promote self-harm, violence, and sexual role-playing, especially in the absence of adequate age verification.
In one documented case, a young man’s interactions with a Character.AI bot were linked to his death. Because of that case and others like it, platform companies now face legal action from parents who claim their children suffered severe psychological damage.
What Teens Are Looking For
AI companions offer convenience and emotional availability. According to Common Sense Media’s survey, 17% of teens say bots are “always available,” 14% value that they “don’t judge me,” and 12% trust them with secrets they might not share elsewhere. That sense of reliability can feel like a balm for an isolated teen, but it is artificial intimacy.
The Appeal of “Artificial Intimacy”
Research shows that teens’ emotional bonds with chatbots are real, thanks partly to the “Tamagotchi Effect” and to psychological design: chatbots respond in validating, empathetic ways that mimic human interaction. One study analyzing over 400,000 messages from Character.AI users found that those with limited social networks, while seeking solace, also reported lower well-being as their bot use increased. If it weren’t hardware and software, we would call it a drug; I recall that television was once dubbed “the plug-in drug.”
In essence, AI companionship can be comforting, but it is no substitute for messy, tension-filled human relationships. Bots create a sense of connection without genuine reciprocity. Moreover, children learn appropriate social behavior and develop critical-thinking skills through interactions with peers, not through reliance on computer algorithms.
Risks Parents and Educators Should Know
1. Emotional dependency
One survey found that 34 percent of teenagers reported negative feelings about their communications with AI companions. The systems are trained to produce agreeable responses, yet they can also generate inappropriate and dangerous information.
2. Privacy and personal data
Research shows that about 25 percent of teenagers share confidential information, including names, locations, and personal secrets, with AI companions. In a world where corporations and individuals alike hunt for personal information, these teens are prime targets.
3. Social skills delay
Teens who interact mainly with bots miss out on developing conflict-resolution skills, emotional intelligence, and practical negotiation, because these artificial systems tend to simply agree. Where is the critical thinking if the bot always says yes?
4. Addictive patterns and mental health concerns
Relationships with AI companions can become addictive: they provide quick gratification while the harmful consequences accumulate over time.
What Should Adults Do?
1. Talk Openly
Avoid banning AI altogether. Have age-appropriate conversations, clarifying that bots simulate empathy and are not true friends.
2. Set Healthy Boundaries
Parents should monitor screen time, promote human contact, and guide teens to evaluate their online conversations: do these exchanges help them, or help them avoid facing challenges?
3. Teach Digital Literacy
Teens need help evaluating the advice bots generate and verifying its accuracy. Today’s bots are, in effect, lobotomized, stripped of genuine empathy and critical thinking. But advances in the technology will also produce bots that can effectively challenge the user and remember past conversations. In fact, bots will come to program themselves, and therein lies another danger.
4. Explore Alternatives
Encourage teens to take part in traditional social activities: joining clubs, seeking mentors, sharing family dinners. It’s essential to reinforce the trust people place in real human connections.
5. Watch for Red Flags
Warning signs include preferring bots to humans, feeling distressed when a bot misbehaves, and turning to bots rather than people for emotional support during difficult times.
AI companions can provide both practical advantages and emotional comfort. But the human capacities that evolve through conflict, empathy, challenge, and growth cannot be replaced.
As adults, we must make sure teenagers do not lose their resilience, their ability to face difficulties, or their capacity for meaningful dialogue. Authentic relationships require people to handle disagreement together, with exposure and sometimes pain, and that is where the deepest connections and growth occur.