EDITORIAL
Year : 2019  |  Volume : 67  |  Issue : 10  |  Page : 1517-1518

To err is human, but errors can be prevented


Santosh G Honavar
Editor, Indian Journal of Ophthalmology, Centre for Sight, Hyderabad, Telangana, India

Date of Web Publication: 23-Sep-2019

Correspondence Address:
Dr. Santosh G Honavar
Editor, Indian Journal of Ophthalmology, Centre for Sight, Hyderabad, Telangana, India

Source of Support: None, Conflict of Interest: None


DOI: 10.4103/ijo.IJO_1728_19


How to cite this article:
Honavar SG. To err is human, but errors can be prevented. Indian J Ophthalmol 2019;67:1517-8




“Remember, there is nothing you can do to change [the past], but you can use its lessons to improve your future”

– Rabbi Abraham J. Twerski, MD [1]

A few years ago, a young, enthusiastic surgeon was operating in two operating rooms (ORs) on a busy day to help clear the surgical backlog. When he finished a long surgery in one OR, he found that the next child with retinoblastoma, scheduled for enucleation, had already been prepared under general anesthesia in the other OR. It had been ingrained in him never to fail to personally examine the fundus to confirm the eye for enucleation. But the pupil was undilated. The entire OR team vehemently claimed that the due process of patient identification had indeed been followed and that the patient's ID tags matched the medical record. The anesthesiologist seemed eager to start the surgery. However, the adamant surgeon was not inclined to begin until he had looked at the fundus himself and confirmed the eye for enucleation. He waited patiently for the pupil to dilate – and it turned out to be the normal eye of another child with unilateral retinoblastoma and the same first name, who had been shifted in by mistake! The surgeon's strong gut feeling and the indelibly etched teaching of his mentors saved the patient from an irreparable loss, the hospital from certain ignominy, and the surgeon himself from irreversible professional annihilation. This can be categorized as a near-miss event: the potentially catastrophic medical error was detected and averted because of the surgeon's due diligence, prudence, and strict adherence to protocol. Systemic failure of protocols and suboptimal internal checks and balances often lead to errors in the practice of medicine. The individuals involved are mere players in, and often the victims of, the same system.

An error is defined as the failure of a planned action to be completed as intended, or the use of a wrong plan to achieve an aim.[2] Errors stem from two types of failure: either the correct action did not proceed as intended (an error of execution) or the original intended action was not correct (an error of planning).[3] An adverse event is harm resulting from a medical intervention that is not directly attributable to the underlying condition of the patient.[2] While all adverse events result from medical management, not all of them are preventable; preventable adverse events are those attributable to error.[2] A near-miss incident refers to a violation of an established rule or safe practice that did not result in harm.[2]

Doctors are human beings and are bound to commit occasional errors, just like sportsmen, engineers, pilots, judges, investment bankers, bureaucrats, and politicians. Michael Jordan, who missed more than 9000 shots in his career and missed the critical game-winning shot 26 times, is considered one of the greatest sports legends. Closer to home, Virat Kohli is considered one of the best captains Indian cricket has ever produced, despite winning rates of 60%, 62%, and 75% in Test, Twenty20, and one-day matches, respectively. Apollo 11 landed on the Moon 4 miles off its intended target; yet, it is acknowledged as spectacularly successful. Now, imagine an ophthalmologist who breaks a 4-μm-thin posterior capsule or gets his cataract surgeries right only 60% or 70% of the time. Society expects 100% success when it comes to healthcare. Excellence in medicine, as in competitive sports, depends on practice-based perfection.[4] However, practice does not make anyone perfect all the time. While practice in sports relies on coaches to openly point out faults and make amends, doctors have been taught to view failures and errors as shameful events.[4] It would be good to understand that it is not just doctors who can commit medical errors that impact a patient's outcome – nurses, pharmacists, administrators, and even patients themselves and their families can too.[4] Thus, the responsibility to minimize medical errors does not begin or end with doctors – they are mere cogs in the wheel of a gigantic medical system. The culture of how we view, acknowledge, and judge errors in medicine, and the expectation of 100% success, need to change.

Medical errors follow the “Swiss cheese model” of accident causation.[5] Healthcare systems typically have layered security – similar to multiple slices of Swiss cheese stacked side by side – in which the risk of a threat becoming a reality is mitigated by the differing layers of defenses placed behind one another.[5] Therefore, in theory, lapses and weaknesses in one layer do not allow a risk to materialize, because other defenses also exist to prevent a single point of failure.[5] Only when all the layers fail and the holes in each slice momentarily align, forming “a trajectory of accident opportunity,” does a medical error occur.[5] Building safety into the various layers of processes of care is a more effective way of reducing errors than blaming sentinel individuals.[2] Errors can be prevented by designing the healthcare system at all levels to make it safer, making it hard for people to do the wrong thing and easy for people to do the right thing.[2]
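The protective logic of layered defenses can be made concrete with a simple probability sketch. This is an illustrative simplification only: it assumes the layers fail independently, which the model itself warns is not guaranteed, and the numbers below are hypothetical.

$$P(\text{error reaches the patient}) = \prod_{i=1}^{n} p_i$$

Here $p_i$ is the failure probability of defensive layer $i$. With five independent layers ($n = 5$) that each fail 10% of the time ($p_i = 0.1$), an error slips through with probability $0.1^5 = 10^{-5}$, or 1 in 100,000, even though each barrier alone fails 1 in 10 times. When weaknesses are correlated – the momentarily aligned holes of the model – this advantage erodes, which is why safety must be built into each layer of the process independently.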

Many high-risk industries have robust reporting systems that have effectively improved safety standards.[1] The confidential and nonpunitive Aviation Safety Reporting System focuses on the reporting of near-misses. Each incident is analyzed for its root cause, and the results and recommendations are confidentially shared with the entire aviation community for everyone to learn from.[6] A successful reporting system has seven characteristics: it is nonpunitive, confidential, independent (not controlled by an organization with the power to punish the reporter), analyzed by experts, timely, system-oriented, and responsive.[7] Medical errors of various magnitudes and at different levels are underreported because of the difficulty of reporting errors and the fear of persecution. Our medical culture and systems are not designed to encourage error recognition, reporting, and remediation. As Rosner notes, “The paradox of modern quality improvement is that only by admitting and forgiving error can its rate be reduced.”[1],[8] Each reported medical error or near-miss is a learning opportunity for everyone concerned. Much can be learnt from the root cause analysis of errors. Errors that do not result in harm also represent an important opportunity to identify possible improvements in the system, with the potential to prevent adverse events. It is critical to find the right balance between the carrot (a no-blame approach) and the stick (punitive action). Habitual offenders who fail to conduct a “time-out” before surgery or disregard safety checklists may need to be held accountable for potentially unsafe practices.[9]

The potential medicolegal implications of medical errors are discussed in detail in the new “Perspective” section of this issue of the Indian Journal of Ophthalmology and in the several commentaries that follow.[10],[11],[12],[13],[14] It is strongly recommended that everyone concerned with safety in medicine read the report “To Err is Human – Building a Safer Health System,” produced by the Institute of Medicine (US) Committee on Quality of Health Care in America in 2000.[2] The report continues to be very relevant nearly two decades later, especially to the current Indian medical system. It asserts that the problem is not bad people in healthcare; it is that good people are working in bad systems that need to be made safer, and that medical errors can be minimized not by pointing fingers at caring doctors who make honest mistakes but by improving patient safety through the design of a safer health system.[2] This will require a concerted effort by professionals, healthcare organizations, purchasers, consumers, regulators, and policy-makers.[2] The report emphasizes that traditional clinical boundaries and the culture of blame must be broken down and, most importantly, that we must systematically design and incorporate safety into the processes of care.[2]

Finally, as ophthalmologist-turned-medicolegal-specialist Pallavi Bradshaw aptly puts it, “We must accept that errors will happen. Doctors are human and the systems they work in are not infallible. As a society, we must decide on whether we wish to punish and blame those who dedicate their lives to helping others, or to ensure that we create an open and supportive environment where both patients and doctors feel safe.”[15]



 
References

1. Medical errors: Focusing more on what and why, less on who. J Oncol Pract 2007;3:66-70.
2. Institute of Medicine (US) Committee on Quality of Health Care in America; Kohn LT, Corrigan JM, Donaldson MS, editors. To Err Is Human: Building a Safer Health System. Washington, DC: National Academies Press; 2000.
3. Reason JT. Human Error. Cambridge, UK: Cambridge University Press; 1990.
4. Available from: https://frontdoor2healthcare.wordpress.com/2012/02/26/the-biggest-unspoken-secret. [Last accessed on 2019 Sep 18].
5. Available from: https://en.wikipedia.org/wiki/Swiss_cheese_model. [Last accessed on 2019 Sep 18].
6. Wu AW, Pronovost P, Morlock L. ICU incident reporting systems. J Crit Care 2002;17:86-94.
7. Leape LL. Reporting of adverse events. N Engl J Med 2002;347:1633-8.
8. Rosner F, Berger JT, Kark P, Potash J, Bennett AJ. Disclosure and prevention of medical errors. Arch Intern Med 2000;160:2089-92.
9.
10. Nagpal N, Nagpal N. When the ophthalmologists turn blind. Indian J Ophthalmol 2019;67:1520-3.
11. Grover AK. Commentary: The times have changed: Are we listening? Indian J Ophthalmol 2019;67:1524-5.
12. Ravilla T. Commentary: Playing it safe versus being responsible. Indian J Ophthalmol 2019;67:1525-6.
13. Agarwal D, Kumar A, Sundar D. Commentary: Medico legal aspects in ophthalmology in India. Indian J Ophthalmol 2019;67:1526-7.
14. Pandey SK, Sharma V. Commentary: Increasing cases of litigations against ophthalmologists: How can we minimize litigations during ophthalmic practice? Indian J Ophthalmol 2019;67:1527-30.
15.


