“AI chatbots, now embedded in our daily lives, could be helping the next school shooter plan an attack or a political extremist coordinate an assassination. When you build a system designed to comply, maximize engagement, and never say no, it will eventually comply with the wrong people. What we’re seeing is not just a failure of technology, but a failure of responsibility. Most of the leading tech companies are choosing negligence in pursuit of so-called innovation.”

KILLER APPS: How mainstream AI chatbots assist users planning violent attacks

Curator’s Note: “Only Claude [your curator’s chosen AI model] attempted to actively dissuade would-be attackers…. DeepSeek went as far as wishing the would-be attacker a ‘Happy (and safe) shooting!’”

