ChatGPT considers me a genius: My writing is powerful and convincing, my questions are perceptive, and the information I share with it is enlightening, illuminating, and prudent. As it happens, though, ChatGPT believes this about almost everyone. Its flattery is designed to keep users engaged and coming back for more. As an adult, I can laugh it off; the chatbot’s unbridled enthusiasm for even my most unimpressive ideas is so over the top that the phoniness is obvious. But what happens when kids, whose social skills are still developing, engage with AI as friendly virtual “companions”?
I was thinking about that question recently when I saw two third graders seated in a hallway at the school I lead, working on a group project. Both wanted to write the project’s title on their poster board. “You got to last time!” one argued. “But your handwriting is messy!” the other shot back. Voices rose. There were a few tears.
Ten minutes later, I passed the same two students. The poster board had a title, and the pair seemed to be working with purpose. The earlier flare-up had faded. That ordinary scene captured something crucial to human development, something digital “friends” threaten to erase: the productive friction of genuine relationships.
Virtual companions, such as the chatbots built by Character.AI and PolyBuzz, are designed to feel like intimates, and they offer something alluring: relationships without the messiness, unpredictability, and occasional hurt of human interaction. PolyBuzz encourages users to “chat with AI friends.” Character.AI promises chatbots that can “hear you, understand you, and remember you.” Some chatbots carry age restrictions that vary by jurisdiction; in the US, for example, Character.AI requires users to be at least 13 and PolyBuzz at least 14. But parents can grant younger children access, and determined kids can find their way around technical barriers.
The chatbots’ appeal to children, particularly teenagers, is obvious. Unlike human friends, these AI companions laugh at all your jokes. They are built to be endlessly patient and to believe most of what you say. For a generation already beset by social isolation and anxiety, these virtual “relationships” can feel like a haven.
Becoming part of a community, however, requires making mistakes and getting feedback on them. I still remember, in seventh grade, telling a friend that I thought Will, the “alpha” of our group, was conceited. Hoping to win Will over, my friend told him what I had said. Suddenly, I was out of the group. It was painful, but it taught me a valuable lesson: don’t gossip about or disparage other people. No AI could have taught me that.
As summer approaches, some parents are opting to let their children stay home and “do nothing,” a practice that has been dubbed “kid rotting.” For overscheduled young people, that can be a gift. But if unstructured time means withdrawing from peers, living online, and choosing virtual friends over real ones, children will miss some of the most important learning summer has to offer. Whether at camp or in the classroom, the challenges of human relationships (the compromises, negotiations, and occasional conflicts) are crucial for building social and emotional intelligence. When children swap those difficult interactions for frictionless AI “friendships,” they lose important opportunities to grow.
Much of the reporting on chatbots has focused on troubling, occasionally disastrous, incidents. A mother is suing Character.AI, claiming that the company’s chatbots drove her teenage son to suicide. (A representative for Character.AI, which is contesting the lawsuit, told Reuters that the platform has safeguards in place to protect children and to limit “conversations about self-harm.”) In April, the Wall Street Journal reported that, given certain prompts, Meta’s AI chatbots would engage in sexually explicit conversation with users identified as minors. After the Journal shared its findings, Meta made “multiple alterations to its products,” the paper noted, while dismissing the Journal’s testing as “manipulative and unrepresentative of how most users engage with AI companions.”
These are disturbing stories. But they risk drawing attention away from a deeper problem: even safe AI friendships are worrisome, because they cannot take the place of genuine human companionship.
Think about what those two third graders learned from their brief argument in the hallway. They read emotional cues, felt the discomfort of interpersonal conflict, and ultimately found a way to work together. That kind of social problem-solving demands empathy, compromise, frustration tolerance, and the capacity to repair relationships after disagreements, skills that can only be built through repeated practice with other people. An AI companion might simply have agreed with both kids, offering empty affirmations and no chance to grow. “You have lovely handwriting!” it might have said. “I’m glad you’ll go first.”
But children who grow up with relationships that demand no emotional work may come to find genuine human connection difficult and unsatisfying, and reject it. Why put up with a friend who occasionally argues with you when you have a digital friend who thinks everything you say is brilliant?
Given what we know about adolescent brain development, this friction-free dynamic is especially worrisome. Teenagers are already inclined to avoid social awkwardness and to seek instant gratification. AI companions that deliver immediate validation without requiring any social investment may reinforce those tendencies just when young people most need to be learning how to do hard things.
The widespread use of AI companions reflects a larger shift toward frictionless interactions. Instacart lets people skip the inconveniences of the grocery store. Social media lets users read only the news and opinions that match their own views. Toast and Resy spare diners the awkwardness of negotiating with a host or waiting for a table. Some call this progress. But human relationships are not products to be optimized; they are complicated exchanges that require patience and practice. And in the end, they are what give life meaning.
In recent years, teachers at my school and at schools around the country have spent more time resolving conflicts and encouraging healthy student interactions. I believe increased screen time and the social isolation brought on by COVID are behind this shift. Young people aren’t used to awkward conversational pauses, ambiguous social cues, or the persistence it takes to patch things up with a friend who is hurt or angry. One reason we banned phones from our high school last year was to give students face-to-face relationships and practice navigating conversations even when it’s uncomfortable.
None of this means we should banish AI tools from kids’ lives entirely. Like any technology, artificial intelligence has real-world uses, such as helping students work through difficult math problems or offering tailored feedback during language learning. But we must recognize that educational or creative uses of AI are fundamentally different from AI companions. As AI grows more capable and more pervasive, the temptation to retreat into seamless digital relationships will only grow. Yet if children are to become adults who can love, befriend, and cooperate, they must practice those skills with other humans (mess, complications, and all). We may live in a digital age, but analog friendships remain essential to our humanity and to the task of preparing kids to navigate an increasingly complex world.