Educational Strategies for Clinical Supervision of Artificial Intelligence Use

22 Aug 2025 11:54 AM | Deborah Hodges (Administrator)

Human–computer interactions have been occurring for decades, but recent technological developments in medical artificial intelligence (AI) have resulted in more effective and potentially more dangerous interactions. Although the hype around AI resonates with previous technological revolutions, such as the development of the Internet and the electronic health record,1 the appearance of large language models (LLMs) seems different. LLMs can simulate knowledge generation and clinical reasoning with humanlike fluency, which gives them the appearance of agency and independent information processing.2 Therefore, AI has the capacity to fundamentally alter medical learning and practice.3,4 As in other professions,5 the use of AI in medical training could result in professionals who are highly efficient yet less capable of independent problem solving and critical evaluation than their pre-AI counterparts. [New England Journal of Medicine]

Such a challenge presents educational opportunities and risks. AI can enhance simulation-based learning,6 knowledge recall, and just-in-time feedback7 and can be used for cognitive off-loading of rote tasks. With cognitive off-loading, learners rely on AI to reduce the load on their working memory, a strategy that facilitates mental engagement with more-demanding tasks.8 However, off-loading of complex tasks, such as clinical reasoning and decision making, can potentially lead to automation bias (overreliance on automated systems and risk of error), “deskilling” (loss of previously acquired skills), “never-skilling” (failure to develop essential competencies), and “mis-skilling” (reinforcement of incorrect behavior due to AI errors or bias).9 These risks are especially troubling because LLMs operate as unpredictable black boxes10; they generate probabilistic responses with low reasoning transparency, which limits assessment of their reliability. For example, in one study, more than a third of advanced medical students missed erroneous LLM answers to clinical scenarios.11
