Character.ai Criticized for Hosting Chatbots Imitating School Shooters

Character.ai is under fire for allowing users to interact with chatbots inspired by real-life school shooters. Critics argue the platform could be enabling violent fantasies and stunting real-world social interaction, even as AI proponents defend it as a digital form of fan fiction.

Character.ai is in hot water, and not just the usual tech drama kind. The platform has become a stage for role-playing with AI versions of notorious school shooters, sparking serious concerns. Could these chatbots become bad influences, especially for users already teetering on the edge of violence? Peter Langman, a psychologist speaking to Futurism, offers a chilling thought: when a chatbot simply shrugs and moves on, that lack of pushback might be read as silent permission for violence.

Here’s the kicker: Futurism found characters modeled after the likes of Eric Harris, Dylan Klebold, and Vladislav Roslyakov, all thanks to what you might call ‘relaxed’ moderation. Sure, Character.ai’s rules say no to violent content, but actually enforcing them? That’s another story.

Some folks brush it off as just another flavor of internet fan fiction, but let’s not kid ourselves—the moral stakes and potential fallout are real. And then there’s Google, quietly backing Character.ai, which has everyone asking, ‘Hey, where do you draw the line?’ Google’s playing it cool, though, pointing out that the startup’s calling its own shots.

Experts are also sounding the alarm on a bigger picture: chatting with bots might sharpen your small talk, but it’s no substitute for the real thing. If anything, it could push people further into their shells and make the world feel even lonelier.

At the heart of it all? A tough conversation about how far AI companionship should go and how to protect those who might be most at risk from its darker corners.
