According to the report, when 16-year-old Adam Raine turned to ChatGPT during a difficult time, his parents believed he was just doing schoolwork. But months later, after he took his own life, they uncovered disturbing conversations revealing that he had discussed ending his life with the AI chatbot. Now his grieving family is suing OpenAI, raising urgent questions about chatbot safety, emotional dependency, and whether artificial intelligence is ready for the human heart.
Anyone who uses ChatGPT knows very well that the AI does not give encouragement unless you seek it. It's just a set of programs; whatever you feed it, you get an answer accordingly. It's up to you how you use it. So it seems absurd that a basic-level AI could encourage someone to take their own life. ChatGPT has no emotions. Why do some people turn to it when it isn't human? What a mess. Adam was too young to die; he should have brought his problems to his parents. He might have been struggling with love for someone of the opposite sex.