9 Matching Annotations
  1. Last 7 days
    1. But after several therapy sessions, it became clear that the man was fixated on ChatGPT. He quit his job as a graphic designer after using ChatGPT 100 hours a week and having delusional thoughts about solving the energy crisis.

      It didn't occur to me that AI could be addictive, but I can totally see why it would be.

    2. Safety guardrails that stop chatbots from encouraging suicide can break down when people engage the bots in extended conversations over days or weeks.

      It's scary that these guardrails were not stress-tested to prevent this from happening; that seems pretty important to me.

    3. It’s not unusual for new technologies to inspire delusions. But clinicians who have seen patients in the thrall of A.I. said it is an especially powerful influence because of its personal, interactive nature and authoritative tone.

      I feel like AI is amplifying an issue that has been around for a while, but now we will see it on a much more dramatic scale as AI evolves and more and more people use it.

    4. For a product with 800 million users, that translates to 1.2 million people with possible suicidal intent and 560,000 with potential psychosis or mania.

      This is a lot of people over the course of just a month; that's terrifying. We need to do more to help people with mental health struggles instead of dismissing the issue.

    5. For a very small percentage of users in mentally fragile states there can be serious problems,

      Just because the number is small doesn't make it insignificant; I am disappointed in this response to the issue.

    6. More than 30 described cases resulting in dangerous emergencies like psychosis or suicidal thoughts

      30 out of 100 mental health professionals reporting AI-fueled emergencies is a scary number. Mental health is so important and can be so fragile at times. I wonder if anyone is doing any large-scale studies on this at the national level yet.

    7. tip people from simply having eccentric thoughts into full-on delusions.

      We don't need more possible triggers for people dealing with eccentric thoughts, especially when the consequences can be so detrimental to them and to others.

    8. who had no history of mental illness

      It's scary to me that someone could go from 0 to 100 with improper use of an LLM/AI. I hadn't really heard of this being a problem until recently, with the developments in AI, and it is very concerning to me. It makes me worried about how others are using AI and how it is impacting them mentally.