from Hacker News

Florida student asks ChatGPT how to kill his friend, ends up in jail: deputies

by trhway on 10/5/25, 6:54 PM with 2 comments

  • by quantumcotton on 10/6/25, 8:07 PM

    They probably shouldn't announce this. I get that they're trying to make an example of him, but now kids are going to know to download a local LLM and do it there instead. At least this way you can catch them one at a time.
  • by higginsniggins on 10/5/25, 7:29 PM

    I don't think this is what he had in mind when trying to jailbreak the program...