Endoscopist deskilling risk after exposure to AI in colonoscopy

26 points · smartmic · 15 comments · 8/17/2025, 7:30:26 PM · thelancet.com ↗

Comments (15)

neom · 2h ago
Here is the pre-print: https://papers.ssrn.com/sol3/papers.cfm?abstract_id=5070304

This part is interesting to me:

"We believe that continuous exposure to decision support systems like AI may lead to the natural human tendency to over-rely on their recommendations, leading to clinicians becoming less motivated, less focused, and less responsible when making cognitive decisions without AI assistance."

decimalenough · 2h ago
Which makes sense. If you're making the call alone, it's on you if you get it wrong and somebody dies. But with AI recommendations, nobody will blame you, it's the AI's fault if it gets it wrong.
ares623 · 1h ago
Better outcomes for the people that matter most.
SoftTalker · 1h ago
Are the outcomes better?
fhars · 1h ago
Yes, what the GP was implying is that the important people don't get convicted in court when a patient dies due to a wrong diagnosis if the AI is responsible.
amelius · 1h ago
Sounds similar to the effect of relying on AutoPilot™


apwell23 · 30m ago
this is what is happening to me in coding
hazard · 1h ago
"We believe that continuous exposure to transportation support systems like cars may lead to the natural human tendency to over-rely on their engines, leading to travelers becoming less motivated, less focused, and less responsible when riding horses."
jandrewrogers · 1h ago
The closer analogy is modern turn-by-turn directions and the number of people that will blithely follow them even when something is clearly amiss.
exe34 · 1h ago
to be fair people do move their bodies a lot less if they can just sit in a car and get there for a lot less effort. most of the Western world is struggling with obesity to an extent.
schappim · 1h ago
The real risk isn't that AI will be "wrong" too often, it's that it will be right often enough that humans stop practising the skill. Pilots lose manual flying proficiency with autopilot, drivers lose wayfinding sense with GPS, and radiologists already double-check less when the AI agrees with them.

What makes medicine different is that the tail risks matter: you only need to miss one subtle but lethal case because you've dulled your instincts. And unlike navigation or driving, you don't get daily "reps" to stay sharp. Deskilling here isn't hypothetical, it compounds silently until a crisis forces a clinician to act without the crutch.

mikewarot · 56m ago
>drivers lose wayfinding sense with GPS

My experience is the opposite. Google Maps on my phone is a more than suitable replacement for the now almost-impossible-to-find road atlas of my youth, with its expanded metropolitan maps that were always out of date, and my skills haven't been degrading. I now get real-time feedback on traffic conditions, which lets me make better choices between routes I know by heart.

Just this past week I took a friend to visit someone out in the far suburbs, I used Google maps to get there, but he was astonished that I was completely comfortable driving home without it, despite all the twists and turns of modern US suburbs.

My experience with LLMs generating code is similar, they are better guides than the old school method of reading the manual and other books, but I remain able to get a handle on the code written when necessary.

teddyh · 1h ago
> Pilots lose manual flying proficiency with autopilot

Eloquently explained by Warren Vanderburgh in 1997: <https://www.youtube.com/watch?v=lIusD6Z-3cU>

rscho · 1h ago
Of course deskilling will happen. But marketing says the machine is more often right than the operator is, and people also want it (we can replace docs with AI today, yadda yadda). Soooo... to be expected? It's just that the machine has to work correctly, which is not on the endoscopist, right?
k310 · 1h ago
I'm reading about deskilling these days. I'll admit that in narrow specialties, with really clean training data and results-checking by experts, AI can lighten the load on professionals. But here we have professionals losing their edge. How and why? Well, that's another study, I suppose.

My main concern is for young people. They are given problem assignments of increasing difficulty in order to learn by thinking things through. They often rely on pushbutton answers. I recall one tough physics course where I read through solutions rather than working "from scratch". Long story short, I learned methods and steps along the way, instead of copying and pasting a result.

Will young people not even see the approach and steps?

Perhaps courses should emphasize problem-solving over answers, or if AI is everyone’s “wingman”, how to use it reliably and responsibly (if that is possible).

DHH [0] pointed out the futility of CVs, in that they conceal the important bits, whether a human or an AI reads them. I don't know what to make of this, being one of those people who took things apart to learn how they worked, back in the days when you could take things apart, before they were composed of black boxes, or were entirely a black box.

“Look at real work” he says. How?

[0] https://xcancel.com/dhh/status/1956770356770873845#m