3 Why Do Errors Happen?
Pages 49-68



From page 49...
... However, even apparently single events or errors are due most often to the convergence of multiple contributing factors. Blaming an individual does not change these factors and the same error is likely to recur.
From page 50...
... An Illustrative Case in Patient Safety Infusion devices are mechanical devices that administer intravenous solutions containing drugs to patients. A patient was undergoing a cardiac procedure.
From page 51...
... Major accidents, such as Three Mile Island or the Challenger accident, grab people's attention and make the front page of newspapers.
From page 52...
... Furthermore, any element in a system probably belongs to multiple systems. For example, one operating room is part of a surgical department, which is part of a hospital, which is part of a larger health care delivery system.
From page 53...
... Hindsight bias also misleads a reviewer into simplifying the causes of an accident, highlighting a single element as the cause and overlooking multiple contributing factors. Given that the information about an accident is spread over many participants, none of whom may have complete information,11 hindsight bias makes it easy to arrive at a simple solution or to blame an individual, but difficult to determine what really went wrong.
From page 54...
... For example, in medicine, a slip might be involved if the physician chooses an appropriate medication, writes 10 mg when the intention was to write 1 mg. The original intention is correct (the correct medication was chosen given the patient's condition)
From page 55...
... Latent errors pose the greatest threat to safety in a complex system because they are often unrecognized and have the capacity to result in multiple types of active errors. Analysis of the Challenger accident traced contributing events back nine years.
From page 56...
... As a new employee, she may have been hesitant to ask for help or may not have known who to ask. Focusing on active errors lets the latent failures remain in the system, and their accumulation actually makes the system more prone to future failure.21 Discovering and fixing latent failures, and decreasing their duration, are likely to have a greater effect on building safer systems than efforts to minimize active errors at the point at which they occur.
From page 57...
... In contrast to studying the causes of accidents and errors, other researchers have focused on the characteristics that make certain industries, such as military aircraft carriers or chemical processing, highly reliable.22 High reliability theory holds that accidents can be prevented through good organizational design and management.23 Characteristics of highly reliable industries include an organizational commitment to safety, high levels of redundancy in personnel and safety measures, and a strong organizational culture for continuous learning and willingness to change.24 Correct performance and error can be viewed as "two sides of the same coin."25 Although accidents may occur, systems can be designed to be safer so that accidents are very rare.

The National Patient Safety Foundation has defined patient safety as the avoidance, prevention and amelioration of adverse outcomes or injuries stemming from the processes of health care.26 Safety does not reside in a person, device or department, but emerges from the interactions of components of a system.
From page 58...
... Making environments safer means looking at processes of care to reduce defects in the process or departures from the way things should have been done. Ensuring patient safety, therefore, involves the establishment of operational systems and processes that increase the reliability of patient care.
From page 59...
... Compared to tightly coupled systems, loosely coupled systems can tolerate processing delays, can reorder the sequence of production, and can employ alternative methods or resources. All systems have linear interactions; however, some systems additionally experience greater complexity.
From page 60...
... Therefore, the delivery of health care services may be classified as an industry prone to accidents.38 Complex, tightly coupled systems have to be made more reliable.39 One of the advantages of having systems is that it is possible to build in more defenses against failure. Systems that are more complex and tightly coupled, and thus more prone to accidents, can reduce the likelihood of accidents by simplifying and standardizing processes, building in redundancy, developing backup systems, and so forth.
From page 61...
... This usually involves having to monitor automated systems for rare, abnormal events43 because machines cannot deal with infrequent events in a constantly changing environment.44 Fortunately, automated systems rarely fail. Unfortunately, this means that operators do not practice basic skills, so workers lose skills in exactly the activities they need in order to take over when something goes wrong.
From page 62...
... Equipment may not be designed using human factors principles to account for the human-machine interface.48 In the case study, safer systems could have been designed by taking into consideration characteristics of how people use machines and interact with each other in teams. For example:
· Redesign the devices to default to a safe mode
· Reduce the difficulties of using multiple devices simultaneously
· Minimize the variety of equipment models purchased
· Implement clear procedures for checking equipment, supplies, etc., prior to beginning surgery
· Orient and train new staff with the team(s)
From page 63...
... The first is critical incident analysis. Critical incident analysis examines a significant or pivotal occurrence to understand where the system broke down,
From page 64...
... The analysis uncovers the factors weighed and the processes used in making decisions when faced with ambiguous information under time pressure. In terms of applying human factors research, David Woods of Ohio State University describes a process of reporting, investigation, innovation, and dissemination (David Woods, personal communication, December 17, 1998).
From page 65...
... 5. Current responses to errors tend to focus on the active errors.
From page 66...
... Discovering and fixing latent failures and decreasing their duration are likely to have a greater effect on building safer systems than efforts to minimize active errors at the point at which they occur.
From page 67...
... 27. Dye, Kevin M.C.; Post, Diana; Vogt, Eleanor, "Developing a Consensus on the Accountability and Responsibility for the Safe Use of Pharmaceuticals," Preliminary White Paper prepared for the National Patient Safety Foundation, June 1, 1999.
From page 68...
... 55. "Current Projects," Human Factors Research and Technology Division, Ames Research Center, NASA, http://human-factors.arc.nasa.gov/frameset.html 56.

