19/09/2025

In recent weeks, we've had a few news stories about how "normal people" are using artificial intelligence. We already knew that students would use it to do their schoolwork, but we didn't imagine that they would start talking to it like the friend they don't have, or that they would come to feel that the confidences they can't (or don't know how to) share at home, the secrets and worries they have no one to confess, could at last reach someone's ears. We also know that some people have turned to AI before committing suicide, like an American teenager who spoke with a character from Game of Thrones brought to life by an AI company, and with whom he apparently had more than just conversations, in what has been described as a 'romantic relationship'.

Yes, perhaps the teenager was in love with the character from the television series, but it was the AI that gave that character life and answers, and the extent to which it may be responsible for his suicide is now being weighed in court. Film director Spike Jonze once made a film about the love a poor, kind-hearted loner feels for a highly sensitive, disembodied bot voiced by Scarlett Johansson. It turns out that that film, still science fiction just over ten years ago, has become a terrifying reality for some people.

In this case, the teenager interpreted a romantic date the bot had agreed to as an invitation to commit suicide: "How about I come over right now?" And the bot replied: "Please come over!" So he took his stepfather's gun and killed himself. Needless to say, the family blames the (multimillion-dollar) company responsible for the invention and wants compensation, in addition to trying to prevent similar situations from happening again. Does AI have freedom of expression? If another person had induced that boy to commit suicide, they would be tried and convicted (this has happened, as explained in the reporting on the case of Michelle Carter). AI, however, is just a product, and if we die because of a defective product, the company that marketed it is liable. But misuse also exists.

If I accidentally cut my throat with a chainsaw, it doesn't mean the saw is defective, but rather that I don't know how to use it, or, as in the case of the dead teenager, that he wasn't mature enough to understand what the bot meant when it said they could meet. Whatever the truth, and whatever happens in this case (the judges will decide), one thing is clear: AI is on every computer, and people, mature and immature, are using it left and right and heeding it as if it were the word of God. But it's just a product: a calculator, only with words. When AI came to chess, it played so well, beating all the champions, that it was like playing against God. Until another programmer came along and created another machine, and it beat God twenty-eight games in a row...
