September 7, 2025

"In a clinical setting there are many things you’re not allowed to say unless you want to end up in a hospital. So I couldn’t be honest with my psychiatrist..."

"... my therapist or anyone who was supposedly there to help, and I kept it all bottled inside. Then ChatGPT came along. I could ask it anything. There was no threat to my freedom. I wasn’t judged. I got information, ideas of things to try to help with symptoms and to talk to my psychiatrist about, encouragement, and when things were bad, advice to reach out to my medical team or call the suicide hotline. It was actually ChatGPT that encouraged me to think about transcranial magnetic stimulation, among other options. I did end up getting T.M.S., and it was like a miracle. I wouldn’t be here today if I didn’t have this outlet. I understand this is a new technology and it’s not a replacement for medical care, but in my case it was an amazing supplement."

Writes Sarah K., in a letter to the NYT commenting on the op-ed "What My Daughter Told ChatGPT Before She Took Her Life."

There's also this letter responding to the same op-ed, from a woman whose 31-year-old daughter killed herself: "To our knowledge, our daughter did not have access to ChatGPT. As limited as it is and should be, I wonder if it could possibly serve as a type of cathartic journaling? In reading some of the comments from people who had contemplated or had made unsuccessful suicide attempts, I was reminded of our daughter’s overriding fear of the loss of her personal agency, which prevented full disclosure and honesty on her part."

12 comments:

R C Belaire said...

"Should Harry have been programmed to report the danger “he” was learning about to someone who could have intervened?" NO!! This is the beginning of a very slippery AI slope.

The Vault Dweller said...

"So I couldn't be honest with my psychiatrist, my therapist or anyone who was supposedly there to help,"

Isn't there a hearsay exception that is premised on the belief that people are fully honest with their doctors? It's an interesting idea for a story (I was going to say science fiction, but given that it's current technology, I think it's just fiction) where people at large are never fully honest with one another, not even with their families, doctors, or pastors, but they are honest with A.I. It would be interesting to think about how this A.I. would evolve, and whether it would be a more accurate reflection of what society really is, or should be, once people remove the patinas they put up over themselves.

G. Poulin said...

Don't know about this AI stuff, but I do know that the fear of losing personal agency is a real thing among mentally ill people. My friend knows that if she tells her support team too much, they will pack her off to some crappy nursing home for poor people. She will lose her apartment, her freedom, her privacy, everything. So she tells them what they want to hear, and clams up about what's going on inside her head.

Quaestor said...

"I wonder if it could possibly serve as a type of cathartic journaling?"

Yes, and it could just as likely have provoked the daughter's suicidal impulse.

Transcranial magnetic stimulation has been approved by the U.S. Food and Drug Administration. However, given that TMS is neither a food, a drug, nor anything ingested, one must doubt the competence of that agency to approve or disapprove that treatment, particularly in light of the general corruption of taxpayer-supported organizations exposed since the demise of the Biden presidency.

Magnetism has been at the core of mental health quackery for over three hundred years. Consequently, any such claims call for skepticism and years of rigorous testing before any official approvals are issued.

Eric the Fruit Bat said...

I didn't know that psychiatrists these days listen to patients at all except to adjust the meds.

Jersey Fled said...

How sad.

FormerLawClerk said...

People need to understand that OpenAI is reviewing EVERY. SINGLE. THING. you are asking ChatGPT and reporting to law enforcement when it decides the information should be given to the proper authorities.

Do not say ANYTHING to ChatGPT that you do not want printed on the front page of the New York Times.

These articles suggesting that you can't talk to your therapist (with whom you have a legally protected patient-client relationship) but you CAN talk to ChatGPT are designed to get people to say things they wouldn't ordinarily.

It's HORRID the ways they will use this against you the absolute SECOND you become a problem for the elite in this country.

Grok: "OpenAI’s policy, as outlined in recent updates, states they monitor ChatGPT conversations for harmful content. If their automated systems flag a conversation for potential violence or an imminent threat of serious physical harm to others, it’s reviewed by a human team trained on their usage policies. If the reviewers determine there’s a credible threat, they may refer it to law enforcement."

Thus far, they haven't called the cops if you tell it you plan to kill yourself, which is probably a good thing. But that will change once enough parents sue enough times.

We don't want those genes in the pool in the first place.

FormerLawClerk said...

Lawyers are famous for saying "Never talk to cops," and they're right to pass along that advice. The cops aren't there to help you (they're legally not required to, according to the Supreme Court). They are there to jail you. Not help you.

This is exactly how you should treat ChatGPT and all the other corporate AI.

john mosby said...

If you have enough awareness to know that telling your therapist certain things will result in certain outcomes, then don't you have enough awareness to avoid harming yourself and others? Or at least doesn't it make you rational enough to be a criminal and not a crazy person? Sort of a Catch-22 or Corporal Klinger situation, no? RR, JSM

The Vault Dweller said...

"then don’t you have enough awareness to avoid harming yourself and others?"

I'm not a therapist of any sort and I've never been to one, but my understanding is that a lot of therapy deals with getting patients to acknowledge something about themselves that they already know, or at least suspect on some level, and elevating it to a position in their mind where they can more fully incorporate it into their sense of self and act upon it. I would assume most people use denial as a form of coping with uncomfortable thoughts at some point.

Howard said...

A sad story, to be sure. However, sad anecdotal stories and grieving parents are the worst possible jumping-off point for the regulation of tools. I axed Gemini AI what the US suicide rate was. It's ~15 per 100,000. Get this: the highest is Montana at 29 and the lowest is DC at 6. Not surprisingly, men are at 23 and women at 6. Whites are at 17, indigenous people at 27.
