3 Examination of the Current State of the Art in System Safety and Its Relationship to the Safety of Health IT–Assisted Care
Pages 59-76



From page 59...
... 3 Examination of the Current State of the Art in System Safety and Its Relationship to the Safety of Health IT–Assisted Care

To understand the complex relationship between implementation and safety, this chapter presents key concepts of system safety and applies those concepts to the domains of health IT and patient safety.

SAFETY IN COMPLEX SYSTEMS

Complex systems are in general more difficult to operate and use safely than simple systems: with more components and more interfaces, there are more ways for untoward events to happen.
From page 60...
... In many cases, adaptations require human operators to select a well-practiced routine from a set of known and available responses. In other cases, adaptations require human operators to create on the fly novel combinations of known responses, or entirely new approaches, to avert failure. Safe operation thus results from good design as well as from adaptation and intervention by those who use the system on a routine basis.
From page 61...
... Safety issues in health IT are largely driven by that complexity and by the failure to proactively take appropriate systems-based action at all stages of the design, development, deployment, and operation of health IT.

THE NOTION OF A SOCIOTECHNICAL SYSTEM

The sociotechnical perspective takes the approach that the system is more than just the technology delivered to the user.
From page 62...
... • Organization refers to how the organization installs health IT, makes configuration choices, and specifies interfaces with health …1

1 As one illustration, the introduction of a computer-based patient record into a diabetes clinic was associated with changes in the strategies physicians used for information gathering and reasoning. Differences between paper records and computer records were also found in the content and organization of information: paper records had a narrative structure, while the computer-based records were organized into discrete items of information.
From page 63...
... A comprehensive analysis of the safety afforded by any given health care organization requires consideration of all of these domains taken as a whole and how they affect each other, that is, of the entire sociotechnical system.2 For example, an organization may develop formal policies regarding workflow. In the interests of saving time and increasing productivity, health care professionals may modify the prescribed workflow or approved practices in ways of which organizational leadership may be unaware.
From page 64...
... vendor testified to the committee that a user error occurs when …

BOX 3-1
Mismanaging Potassium Chloride (KCl) Levels, Part I: A Sociotechnical View

As a result of multiple medication errors, an elderly patient who was originally hypokalemic (suffering from low potassium levels) …
From page 65...
... The two classes are related; bad user interfaces usually reflect an inadequate understanding of the user's domain and the absence of a coherent and well-articulated conceptual model. By blaming users for making a mistake and not considering poor human factors design, the organization accepts responsibility only for training the individual user to do better the next time similar circumstances arise.
From page 66...
... Third, the distinction between "human error" and "computer error" is misleading. Human errors should be seen as the result of human variability, which is an integral element in human learning and adaptation (Rasmussen, 1985)
From page 67...
... . The primary lesson from this perspective on safety can be described as the following: "Task analysis focused on action sequences and occasional deviation in terms of human errors should be replaced by a model of behavior-shaping mechanisms in terms of work system constraints, boundaries of acceptable performance, and subjective criteria guiding adaptation to change" (Rasmussen, 1997)
From page 68...
... These factors, including the poorly designed CPOE interface, may not be identified in a single event chain, yet each independently contributed to the patient's excessive KCl levels. Looking for a single "root cause" responsible for the patient's adverse condition would fail to address the other factors that may continue to put future patients at risk.
From page 69...
... Such a case cannot be made by relying primarily on adherence to particular software development processes, although such adherence may be part of a case for safety. Nor can the safety case be made by relying primarily on a thorough testing regimen.
From page 70...
... THE (MIS)MATCH BETWEEN THE ASSUMPTIONS OF SOFTWARE DESIGNERS AND THE ACTUAL WORK ENVIRONMENT

Generally, health IT software is created by professionals in software development, not by clinicians as content experts.
From page 71...
... However, software developers and clinicians generally come from different backgrounds, making the communication of ideas more difficult. As a result, these processes for gaining input rarely capture the full richness and complexity of the actual operational environments in which health professionals work, environments that vary enormously from setting to setting and from practitioner to practitioner.6

Deviations Versus Adherence to Formal Procedures

Indeed, in most organizations, the guidance provided by formal procedures is rarely followed exactly by health professionals.
From page 72...
... As discussed previously, unsafe outcomes result not from human failures per se but rather from the way the various components of the larger sociotechnical system interact with each other.

Clumsy Automation

A particularly relevant illustration of mismatches between the assumptions of software designers and the actual work environment can be seen in the notion of clumsy automation (Woods et al., 2010)
From page 73...
... Poorly designed computerized interfaces tend to make interesting and noteworthy things invisible when they hide important data behind a number of windows on the screen (Woods et al., 2010)
From page 74...
... . Based on the notion that the safety afforded by an organization can benefit more from learning from mistakes than from punishing the people who make them, a Just Culture organization encourages people to report errors and to suggest changes as part of their normal everyday duties.
From page 75...
... of many different health care organizations. The next chapter suggests various levers with which to improve safety.

