cross-posted from: https://aussie.zone/post/2798829
According to prosecutors, Chail sent “thousands” of sexual messages to the chatbot, which was called Sarai on the Replika platform. Replika is a popular AI companion app that advertised itself as primarily being for erotic roleplay before eventually removing that feature and launching a separate app called Blush for that purpose. In chat messages seen by the court, Chail told the chatbot “I’m an assassin,” to which it replied, “I’m impressed.” When Chail asked the chatbot if it thought he could pull off his plan “even if [the queen] is at Windsor,” it replied, “smiles yes, you can do it.”
Bro, how the fuck can you have so little brainpower?
The court had heard how Chail had a “significant history of trauma” and experienced psychotic episodes.
But the case raises concerns over how people with mental illnesses or other issues interact with AI chatbots that may lack guardrails to prevent inappropriate interactions.
So mental illness… :/ That's sad.
Exacerbated by unsafe AIs. At least back in the day we had to use our imagination to get encouragement from our dogs, Jodie Foster, or Air Looms.