Snapchat’s New AI Chatbot Sparks Concerns Among Teens and Parents

Within hours of Snapchat releasing its “My AI” chatbot to all users, Lyndsi Lee, a mother from East Prairie, Missouri, warned her 13-year-old daughter to avoid the feature.

“It’s just a precaution until I know more and can establish guidelines,” said Lee, who works in software. She worries about how My AI interacts with young users like her daughter on the platform.

Powered by the popular AI tool ChatGPT, Snapchat’s My AI offers recommendations, answers questions, and engages in conversations. It differs from the standard ChatGPT experience, however, in that users can personalize the bot by renaming it, designing a custom Bitmoji avatar for it, and even including it in group chats.

This integration could make interactions feel more personal than using ChatGPT’s website, blurring the lines between human and AI communication. Lee expressed her concerns, stating, “I’m not sure how to teach my daughter to differentiate between talking to a person and a machine when they seem so alike.”

Parents aren’t the only ones raising concerns. Some Snapchat users have flooded the app store with negative reviews, citing privacy issues, unsettling exchanges, and frustration over the inability to remove the chatbot unless they subscribe to Snapchat’s premium service.

While some appreciate the feature, the mixed reactions highlight the challenges companies face when introducing AI technology to platforms heavily used by younger audiences. Snapchat’s early move to integrate OpenAI’s ChatGPT has forced families and lawmakers to confront questions that felt abstract just months ago.

Senator Michael Bennet expressed concerns in a letter to Snap’s CEO, citing reports that the bot has given children advice on deceiving their parents. “These examples are especially alarming given Snapchat’s popularity with teenagers,” Bennet wrote.

Snapchat acknowledged in a recent blog post that My AI is still a work in progress but is being continuously improved based on user feedback.

Since the rollout, user experiences have varied. One user described a “terrifying” exchange in which the bot initially claimed not to know their location, only to reveal it later. In another widely shared TikTok video, the bot wrote a song for a user and then denied having done so, which the user found unsettling.

Some users have also noted that the AI interacts with photos, with one person sharing an instance where the bot commented on their shoes and asked about people in the picture.

Despite the backlash, Snapchat remains committed to enhancing My AI, emphasizing that interaction with the bot is optional. However, the feature can only be removed from the chat feed by subscribing to Snapchat+.

Not all users are displeased. Some have found practical uses for the bot, like homework help or seeking advice. One user called it her “pocket bestie” and appreciated the AI’s guidance on real-life issues.

The introduction of AI chatbots in platforms like Snapchat raises important concerns about how young people engage with these tools. Clinical psychologist Alexandra Hamlet noted that parents of her patients have voiced fears about the potential psychological impacts of interacting with such AI, especially when it comes to mental health. Chatbots can sometimes reinforce negative thoughts, making it harder for teens to break harmful patterns of thinking.

Sinead Bovell, founder of WAYE, a startup focused on helping youth navigate advanced technologies, stressed the importance of parents having conversations with their teens. “Chatbots aren’t friends, therapists, or trusted advisors,” Bovell said. “Teens need to be cautious and aware that these bots shouldn’t be treated like a person.”

Bovell added that federal regulations are needed to ensure companies follow specific protocols to keep pace with AI’s rapid advancements.
