A new perspective from ophthalmology researchers and aviation safety specialists is raising concerns that overreliance on artificial intelligence (AI) in healthcare could erode clinicians’ core skills – echoing challenges previously seen in the airline industry.
The paper, published in npj Digital Medicine, brings together experts from the UCL Institute of Ophthalmology, Moorfields Eye Hospital, and Lufthansa’s flight safety division to examine how automation has reshaped pilot expertise – and what medicine can learn from the aviation industry as AI becomes more embedded in clinical workflows.
The authors of the paper highlight the so-called “automation paradox,” whereby increasing reliance on automated systems diminishes human competence and situational awareness. In aviation, this phenomenon produced a generation of pilots dubbed “children of the magenta line” – a reference to younger pilots who grew overly reliant on following the magenta-colored flight path drawn by the aircraft’s flight management system, at the expense of their own manual flying skills.
Lead author of the paper, Ariel Ong, an ophthalmology registrar at Oxford University Hospitals NHS Foundation Trust and doctoral fellow at University College London, warns that medicine may be on a similar trajectory. “Medicine risks repeating aviation’s early automation mistake of placing too much faith in the machine while losing critical skills,” Ong said in a recent press release. “Aviation learned that the goal was never to replace the pilot, but to enable rigorous simulation training. We argue for the need to embrace that same philosophy to ensure clinician judgement is not eroded as AI becomes increasingly embedded in healthcare.”
To that end, the team outlined five key recommendations to safeguard clinical expertise: benchmark clinicians and monitor their unaided performance; prioritize independent reasoning in early training; ensure clinicians understand AI limitations; introduce scenario-based simulation training; and cultivate operational understanding.
Commenting in the same press release as Ong, Josef Huemer, consultant ophthalmologist at Moorfields and senior author of the report, said: “Medicine has already borrowed heavily from the aviation industry. For example, surgical checklists, safety time-outs, human factors simulation training, and a culture of incident reporting and analysis that allows healthcare workers to feel safe reporting errors without retribution, all have their origins in flight safety. With AI now poised to reshape medical workflows, we should consider how we can learn lessons on automation from the aviation industry to avoid making the same mistakes.”
Ultimately, the paper calls for regulatory frameworks that move beyond viewing AI purely as a medical device, instead addressing the dynamics of the human-AI partnership. The authors conclude that optimal care will depend on a co-intelligent “synergistic relationship” between clinicians and AI, with AI being viewed as a “digital co-pilot” rather than as a technology simply aimed at replacing human input.