“ChatGPT killed my son”: Parents’ lawsuit describes suicide notes in chat logs
arstechnica.com

ChatGPT taught a teen to jailbreak the bot so it could assist in his suicide, lawsuit says.
Personal note: if you are having suicidal thoughts, please be aware that the number of suicidal thoughts a "normal" person is supposed to have is zero. As someone who has struggled with this, I urge you to get help before it gets worse.