Below are extracts from an article on Medscape, including comments from author and HIFA member Charlotte Blease. Full text: https://www.medscape.com/viewarticle/ai-eroding-cognitive-skills-doctors...
September 29, 2025
'In a prospective study from Poland published last month in The Lancet Gastroenterology and Hepatology, gastroenterologists who’d grown accustomed to an AI-assisted colonoscopy system appeared to be about 20% worse at spotting polyps and other abnormalities when they subsequently worked on their own. Over just 6 months, the authors observed that clinicians became “less motivated, less focused, and less responsible when making cognitive decisions without AI assistance.”...
The study data “seems to run counter to what we often see,” argues Charlotte Blease, PhD, an associate professor at Uppsala University, Sweden, and author of Dr. Bot: Why Doctors Can Fail Us―and How AI Could Save Lives. “Most research shows doctors are algorithmically averse. They tend to hold their noses at AI outputs and override them, even when the AI is more accurate.”
If clinicians aren’t defaulting to blind trust, why did performance sag when the AI was removed? One possibility is that attitudes and habits change with sustained exposure. “We may start to see a shift in some domains, where doctors do begin to defer to AI,” she says. “And that might not be a bad thing. If the technology is consistently better at a narrow technical task, then leaning on it could be desirable.” The key, in her view, is the “judicious sweet-spot in critical engagement.”...
The longer horizon returns to identity. Blease is unapologetically direct. “I believe that over time, some degree of deskilling is inevitable, and that’s not necessarily a bad thing,” she says. “If AI becomes consistently better at certain technical skills, insisting that doctors keep doing those tasks just to preserve expertise could actually harm patients.”...
The best outcome is that AI reshapes medicine on purpose: We choose the tasks it should own, we measure when it helps or harms, and we train clinicians to stay exquisitely human while the machines do scalable pattern work. In that future, clinical judgment is less displaced than redeployed, with physicians spending fewer hours wrestling software and more time making sense of people.
==
HIFA profile: Neil Pakenham-Walsh is coordinator of HIFA (Healthcare Information For All), a global health community that brings all stakeholders together around the shared goal of universal access to reliable healthcare information. HIFA has 20,000 members in 180 countries, interacting in four languages and representing all parts of the global evidence ecosystem. HIFA is administered by Global Healthcare Information Network, a UK-based nonprofit in official relations with the World Health Organization. Email: neil@hifa.org