ChatGPT Sent Me to the ER

10 points by tedsanders · 7 comments · 9/14/2025, 5:59:20 AM · benorenstein.substack.com

Comments (7)

mwetzler · 54m ago
my dad has a similar story. the voice of reason can be very helpful for people who take pride in telling themselves “it’s fine”. thanks Chat!
boopity2025 · 1h ago
Wild to think we’ve reached the point where “my AI told me to go to the ER” is a plausible sentence and not the setup to a Black Mirror episode. Pre‑ChatGPT, you’d Google “droopy eyelid” and get a mix of WebMD hypochondria and SEO‑bait wellness blogs. Now you get a differential diagnosis, a list of red flags, and a gentle shove toward not dying.

AI had carotid dissection in mind from the first message, just quietly waiting for the plot to thicken.

Sure, there’s a lot to worry about with AI, but in this case it basically played the role of the one friend who says “you look weird, go to the doctor” and turns out to be right. Which is both comforting and slightly terrifying.

zahlman · 49m ago
> AI had carotid dissection in mind from the first message

This does not follow from the evidence presented, even if we disregard questions of what "mind" means in this context. It's entirely plausible that the possibility of carotid dissection only made sense to consider partway through the conversation.

anovikov · 1h ago
I guess saying anything positive about LLMs is anathema here now, so there are no comments...
zahlman · 47m ago
It's late night in North America; you said this less than an hour after the post went up; and plenty of posts get little traction on HN (including submissions of links that later become very popular on a separate submission or from the curated "second chance" queue).
zahlman · 50m ago
This title is clickbait. The implication ("following ChatGPT advice caused an emergency requiring an ER visit") is nearly the opposite of the central claim made ("ChatGPT encouraged me to go to the ER, and it turned out to be a life-saving decision").
wiseowise · 16m ago
That’s your interpretation.

When I read the title I thought about positive case [ChatGPT saved my life], not the negative one.