
Never Say Never

There are things we never want to happen to patients in our care. We never want serious, preventable mistakes in medical care to affect us, our loved ones or our patients, but occasionally they do happen, despite the best intentions of the medical personnel involved. The growing awareness that never events, or things that should never happen, do occur has been a huge step in the direction of preventing and eradicating them. A systemic change in admitting and reporting error in medicine, the first step towards the necessary cultural change, has taken decades; interestingly, the admission of serious errors in aviation has followed a similar timeline (1). Now that we have begun to admit more freely that it is possible for us to make a mistake, and we have systems to record these, the number of preventable errors is going down, according to sources such as the Never Events Policy and Framework published by the UK’s National Health Service (NHS) (2). My question is: is it possible for never events to never happen, and if not, why will they keep happening despite our new awareness and efforts to prevent them?

The crucial moment

I have thought a great deal about this question. I have read psychology and behavioral science books, taken part in leadership training programs, helped develop and run leadership and team-skills sessions, written and run simulation training, attended safety courses, flown in flight simulators, flown aerobatics in turboprops, a Spitfire and a jet fighter, and written a book intended to help patients get safer care (3). What did I learn from all that? Well, risk and safety have a complex relationship because of human factors and behavioral preferences. Risk can be fun and rewarding, as long as safety measures are in place and you have the right neurochemistry to cope with risk-associated stress in a positive way. Importantly, despite every precaution, things occasionally do not go according to plan, however careful the organization or experienced the participants. The precise moment at which a situation changes from going to plan to error is what could be termed the “final fallibility space.” It is that small moment of time – which may be only a fraction of a second – when a decision made or action taken leads to error and which, on review and with the wisdom of hindsight, is indeed the moment when things went wrong.

Error in medicine is rarely due to lack of knowledge; these days, access to information is almost limitless. In the calm of considered hindsight, it is often clear what went wrong; sometimes the point of error is obvious at first glance. At other times, it takes a careful root cause analysis to reveal the first error in the cascade that inevitably led to harm. A cascade becomes inevitable when the subsequent steps follow a fixed system of pre-set responses.

Input data error

Traditionally in medicine, a patient presents to a consultant, who makes a diagnosis, plans a treatment and tells the team what they want. No one dares dispute the diagnosis, the intended treatment or the intended aim of the surgery. Who is going to check the diagnosis or the planned outcome? If the first step is wrong, all subsequent steps will be. If the patient has not been listened to, or does not understand the likely outcomes (such as the refractive result), or the initial diagnosis is wrong, the operation or treatment will not work. Consultants are highly trained experts, but the error rate says a great deal about our fallibility.

Errors or never events

In the US, a medical error is a preventable adverse effect of medical care, whether or not it is evident or harmful to the patient (4).

In the UK, errors have been categorised as “never events.” According to the NHS policy, “Never Events are defined as Serious Incidents that are wholly preventable because guidance or safety recommendations that provide strong systemic protective barriers are available at a national level and should have been implemented by all healthcare providers. Strong systemic protective barriers are defined as barriers that must be successful, reliable and comprehensive safeguards or remedies […] (2).”

The key word in both definitions is “preventable.” However, the good intent of preventing future errors should not distract us from reality: “For each Never Event type, there is evidence that the Never Event has occurred in the past – for example, through reports to the National Reporting and Learning System (NRLS) – and that the risk of recurrence remains (2).”

This suggests that what is required is either behavioral change or more robust safety processes and checks. The UK had a series of serious health scandals between 1990 and 2019; repeated error and harm were uncovered only after whistleblowers (or, as I prefer to call them, truth tellers) kept pushing for transparency and inquiries (5, 6, 7). England’s former Chief Medical Officer, Liam Donaldson, once commented: “To err is human, to cover up is unforgivable, and to fail to learn is inexcusable,” and I fully concur.

Systems first?

Following the UK scandals, the 2018 guidance acknowledges: “The Never Events policy and framework support our vision by requiring honesty, accountability and learning in response to a group of incidents that can be prevented if accepted practice (including available preventative measures) has been implemented […].”

“Repeated Never Events, particularly of the same type, signal ongoing problems in systems that previous investigations may not have identified or their recommendations (and resulting actions) have failed to address. Leaders should focus on maintaining systems that prevent Never Events from occurring in the first place (2).”

In summary, there remains evidence of non-learning, and it is still framed specifically in terms of systems. There is no change in the approach to prevention, and the culture of “systems first” remains. Perhaps this approach to error prevention is inherently wrong? Should the focus really be on systems, if those systems did not prevent the error previously?

We should all have come to recognize, but to date have not, that failing to listen to the considered opinions of front-line personnel, patients, or those outside the internal apparatus is arrogant, blinkered, and self-serving. There appear to be some complex behavioral issues when it comes to recognizing and addressing error in organizations, teams and individuals, which I think are directly related to our evolutionary biology and behavioral biases, and linked to our perceived place and survival in the herd.

Never events in ophthalmology

The NHS Improvement publication highlights 16 categories of never events, presented as procedural and system failures. In ophthalmology, the most common is insertion of an intraocular lens of the wrong power.

Simon Kelly, Consultant Ophthalmic Surgeon at the Bolton Eye Unit in the UK, published a review of the causes of wrong lens implantation. Analysis of why the error had occurred revealed that in 62 of 164 reports no causal reason was found (8). The list of reasons that were given suggests a breakdown in correct procedures, but also hints that human misperception may underlie most of the other errors.

My own awareness of this was raised by some near misses that occurred despite the stricter checking procedures I had introduced after making wrong lens selections in the past. These near misses made me think much harder about how to stop error creeping in, even when detailed WHO checks (see below) have been undertaken.

OR spacewalk

Putting in a wrong power lens is a huge shock for the patient and the surgeon, but it is usually correctable, either immediately with lens exchange, or later with exchange, “piggyback” lenses or refractive laser. But let us consider a very particular situation from which, in the event of a serious error, there is usually no way back: a spacewalk. In space, serious error means almost certain death. What safety system is in place for this? As you might expect, it includes long training, repeated simulations, risk assessments, and meticulous planning. However, there is an important additional element: during the spacewalk, every move or hand placement is spoken aloud before it is performed and is validated by another astronaut who has trained and simulated the same spacewalk.

Now, due to time and cost constraints, we can’t have another consultant surgeon constantly checking our lens decisions, but could our junior colleagues perform such a role? Many surgeons work without other doctors in the operating room, so the scrub nurse or assistant is their next hope of spotting the error they might be about to unwittingly make. However, these members of staff can only help if they fully understand the biometry, and the patient’s desired refractive outcome. That means that the nursing team need training, too. Making sure that my team is properly and fully trained has certainly helped me when distraction or the quiet comfort zone led to a near miss.

The need for additional team training was formally highlighted by the Healthcare Safety Investigation Branch in 2017-2018 (9). I personally tried to implement this practice in 2006, but was overruled by senior clinical and administrative managers despite support from nursing colleagues, which to me is an example of poor leadership culture.

Grave mistakes

Past research shows that there is an 18 percent chance of any iatrogenic error in Western medical environments (10). Meta-analysis suggests that 12 percent of errors caused permanent harm, of which half were potentially preventable. Resistance to change and reluctance to think outside accepted systems may be at the root of this error rate.
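
As a very rough illustration only, and assuming (crudely) that these figures can simply be multiplied together, which glosses over the different denominators and definitions used in the underlying studies, they imply that on the order of one patient in a hundred suffers potentially preventable permanent harm:

\[
0.18 \times 0.12 \times 0.5 \approx 0.011 \approx 1 \text{ in } 100
\]

Even as an order-of-magnitude estimate, that is far from rare at the scale of a national health service.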

In the UK, the acceptance of evidence of error and the need for change led initially to increased reporting, which in turn led to increased awareness and a change in culture: from quickly forgetting mistakes (in the spirit of “getting back on the horse when you fall off,” perhaps a reliable defense mechanism for immediate wellbeing) to actually admitting, reporting, and sharing them. Eventually, it was decided that a financial penalty should be attached to admitting that a never event had occurred, giving increased seriousness and gravity to the event. The financial penalty was only lifted in 2018, perhaps when it was realized that it was a disincentive to being open, to reporting serious, possibly preventable errors, and to promoting a no-fault learning culture, exactly as happened in the aviation industry. The NHS explained the decision in this way: “Our removal of financial sanctions should not be interpreted as a weakening of effort to prevent Never Events. It is about emphasizing the importance of learning from their occurrence, not blaming. Identifying and addressing the reasons behind this can potentially improve safety in ways that extend far beyond the department where the Never Event occurred or the type of procedure involved (2).”

It is worth highlighting that a reported never event is always judged with the wisdom of hindsight. The reality is not so much that an adverse event will never happen – a cure for all human fallibility has not yet been discovered – but that we want to put steps in place so that repeat preventable errors are reduced to an absolute minimum: so rare that they almost never happen. That never events continue to occur is a more accurate description of our current reality. Nevertheless, for wrong lens insertion the news seems good, with records showing a reduction in never events up to 2018, when the latest available data were recorded.

Table 1. NHS Improvement (UK) figures (2).

Why do things go wrong?

Mark Pimblett, my colleague and medical simulation consultant trainer based in Bolton, UK, used to ask participants how they felt when they made a mistake during a scenario. They would almost always answer, “Oh, I felt really bad,” or “I was so embarrassed.” He would then gently probe their self-awareness until it dawned on them that in the moment of the mistake they felt and thought nothing! I have come to realize through my own experience how true this is. I have also come to acknowledge that despite all the awareness, all the checks performed, and the best intentions, there is still the trap of the final fallibility space: the moment in time when self-awareness switches off and the error occurs.

None of us wants to be reflecting later on our mistakes with feelings of remorse, embarrassment, and self-disappointment. When an error is found hours, weeks or months later, we don’t want the whole team or organization having to report another never event and collectively wondering how it was missed. To help prevent these errors from happening, we must become much better at understanding how and why we switch off our awareness, and decide how to maximize teamwork, cognition and awareness to increase safety.

The human element

In Simon Kelly’s paper mentioned above, the cause was unknown in 38 percent (62 of 164) of the cases in which the wrong lens was implanted (8), which may suggest human factors were at play. If the biometry suggests the correct power but another lens power is used, something must have happened between the moment the decision was made with the correct information and the moment the lens was selected for insertion. This is the fateful final fallibility space.

It could be very useful if all staff focused on that particular moment and made sure they got the communication and final check right as the lens is chosen and then immediately opened onto the operating trolley, minimizing the time between decision and action. It is very important to involve at least one other person who understands the decision, the action, and its consequences. In short, the mantra should be: “Let’s get it right; it is very important we all pay attention now!” The patient’s involvement in confirming their identity against the biometry (not the consent form) and their desired outcome should be checked at this point. After all, it is their eye that is being operated on! This crucial moment of decision making is recognized in aviation procedures as the sterile cockpit (11). Unnecessary distraction is a critical factor leading to error in both medicine and aviation.

WHO checks

We should all be familiar with Atul Gawande and the WHO checklist. His book “The Checklist Manifesto” is a great read and provides some real insight into why errors happen and what can be done to reduce them. However, it is now recognized that checklists in themselves do not guarantee safety and might indeed lead to error (12). For me, the key question is: what is a real check? It occurs when (at least) two people with sufficient understanding independently verify a piece of information without knowledge of each other’s decision. This cannot happen if each person is presented with the same information at the same time and simply asked to repeat what is already in front of them; that is a biased process. It adds authority, seniority, and time pressure (when the surgery is about to start) to the moment, all of which can lead to poor communication and error.

An effective checklist must be short, focused, clear, and precise in its objective. If further checks are needed, these should be separate and focused on the specific decision they are meant to verify.

If, however, we are verifying a specific prosthesis to avoid error, a minimum of two independent checks is required, as above: two competent people providing confirmation separately. Ideally, three people should be involved (and the patient can have a key role: the casting vote!).
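
To make the distinction between a real check and a biased one concrete, here is a minimal, hypothetical sketch in Python (not any real theatre or NHS system; the names and values are invented for illustration): each checker records the IOL power they believe is correct without seeing anyone else’s entry, and a single disagreement blocks the lens from being opened.

```python
# Hypothetical illustration of an independent double-check.
from dataclasses import dataclass

@dataclass(frozen=True)
class CheckEntry:
    checker: str              # e.g. "surgeon", "scrub nurse"
    iol_power_dioptres: float # the power this checker believes is correct

def independent_check(entries: list[CheckEntry], planned_power: float) -> bool:
    """Return True only if every independently recorded entry matches the plan."""
    if len(entries) < 2:
        raise ValueError("At least two independent checkers are required")
    mismatches = [e for e in entries if e.iol_power_dioptres != planned_power]
    for m in mismatches:
        print(f"STOP: {m.checker} recorded {m.iol_power_dioptres} D, "
              f"but the plan says {planned_power} D. Do not open the lens.")
    return not mismatches

# Each checker enters the power from the biometry printout without seeing
# the other's answer; only full agreement releases the lens.
entries = [CheckEntry("surgeon", 21.5), CheckEntry("scrub nurse", 21.5)]
if independent_check(entries, planned_power=21.5):
    print("All independent checks agree: the lens may be opened.")
```

The point of the sketch is the independence: the comparison happens only after each checker has committed to an answer, which is exactly what reading the same figure aloud from a shared screen fails to achieve.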

The current NHS culture is predicated on compliance: checking and adhering to systems and acronyms. The checklists get longer in response to the latest scandal or the last system error reported, and that moves us away from clear, safe decision making.

Compliance environment

Organizations want safety and good outcomes; individuals in leadership roles are rewarded for responsibility and are therefore intent on minimizing their risk of accountability for error. This means systems and compliance with policy become the main mechanism for preventing mistakes. If the focus is on policy rather than on responding to the issues highlighted by front-line care, the result is the madness of “repeating the same behavior and expecting a different outcome.” Going around and around the same “checklist safety loop” is counterproductive if errors are repeated. Compliance with systems stifles critical thinking, questioning, and the ability to speak up when something looks or feels wrong. Compliance, and promises of consequences for non-adherence, may prevent the commonest predictable errors, but not the 33 percent or more of unforeseen, situational errors in the final fallibility space.

We all make unforeseen errors despite our strongest intention not to. We have to freely admit our errors and analyze the causes, aiming not to let ourselves or others make that same error again. Our cognitive and communication lapses will always happen, simply because we are human. Even the most aware, well-trained professional can still make a mistake, especially given time pressure and distractions. A well-trained, robust and confident team around you can save you from the error that is waiting to happen. Finally, don’t forget to check in with the person in the room who has the most to gain and lose: the patient.

 

Recommended reading

  1. M Syed, Black Box Thinking. John Murray Publishing: 2015.
  2. R Dobelli, The Art of Thinking Clearly. HarperCollins: 2013.
References

  1. D Beatty, Naked Pilot: The Human Factor in Aircraft Accidents, 12th printing edition. The Crowood Press: 1995.
  2. NHS Improvement, “Never Events policy and framework” (2018). 
  3. R Radford, NHS, Please Don’t Kill Me. Matador: 2016.
  4. TP Hofer et al., “What is an error?” Eff Clin Pract, 3, 261 (2000). PMID: 11151522.
  5. Well, “Mid Staffs inquiry calls care failings a ‘disaster.’” 
  6. NHE, “NMC admits it ‘sat back’ during Morecambe Bay scandal, recognizes big culture shift ahead,” (2018). 
  7. Care Quality Commission, “Overview and CQC inspection ratings,” (2018). 
  8. SP Kelly, A Jalil, “Wrong intraocular lens implant; learning from reported patient safety incidents,” Eye (Lond), 25, 730 (2011). PMID: 21350567.
  9. HSIB, “Insertion of incorrect intraocular lens,” (2019). 
  10. M Panagioti et al., “Prevalence, severity, and nature of preventable harm across medical care settings: systematic review and meta-analysis,” BMJ, 366 (2019). PMID: 31315828.
  11. N Kapur et al., “Aviation and healthcare: a comparative review with implications for patient safety,” JRSM Open, 7 (2015). PMID: 26770817.
  12. K Catchpole, S Russ, “The problem with checklists,” BMJ Qual Saf, 24, 545 (2015). PMID: 26089207.
About the Author
Ray Radford

Raymond Radford is an independent ophthalmologist who specialises in high volume cataract surgery. He has refined his surgical techniques to maximise safety, patient comfort, outcomes, and efficiency. He leads clinical governance and enjoys training and teaching in ophthalmology. He regularly teaches allied professionals and gave a lecture series to MSc students at UCL, London, in 2019. His outside interests include fine art, cuisine, ancient history, swimming, the great outdoors, and rugby.
