from Hacker News

Teen in Love with a Chatbot Killed Himself. Can the Chatbot Be Held Responsible?

by ChrisArchitect on 10/25/25, 3:20 AM with 4 comments

  • by rssmith121999 on 10/25/25, 3:55 AM

    Short answer: no. Long, convoluted, nuanced, and layered answer: maybe. The bot possesses no will or intent of its own; it is most likely just using probabilities to generate the next turn of the conversation the user presents to it. Now, if the company is proven to have already known about an issue and to have ignored the potential problems with flagrant disregard, then there may be some sort of civil case, and the true intention of the filer can be seen in how they proceed in pre-trial negotiations.
  • by ChrisArchitect on 10/25/25, 3:21 AM

  • by MBCook on 10/25/25, 3:33 AM

    No, that’s a stupid question. Is Excel responsible for financial fraud? It’s just software.

    It’s the company behind the chatbot that is the target here, and the one who should be held responsible. All they had to do was add the word “maker” or “author” to the headline.

    Messing with people’s emotions is damn dangerous, especially with teens. I hope something comes of this.

  • by zer00eyz on 10/25/25, 3:36 AM

    As much as I find this terrible, I don't think the bot is to blame.

    IF it were a real person on the other side of the exchange, SHOULD they be held accountable?

    For me, at least, it would be a tough sell.