Medicine has adopted the language of manufacturing, with terms such as efficiency, reliability, and “lean processes.” An unintended consequence may be an increased risk of system failure. Twenty years ago, Dr. Richard I. Cook published a paper on big ideas for safety in complex systems. His first point was that “complex systems are intrinsically hazardous systems.” One idea that has become prominent recently is the notion of “never events,” and CMS will penalize hospitals that have one. While this seems laudable, it rests on faulty assumptions about systems and the behavior of the people who operate them. Let me offer more quotes from Dr. Cook to explain why.
He noted that “complex systems are heavily and successfully defended against failure” and that “complex systems run in degraded mode.” The defenses against failure are necessary precisely because all systems run in a degraded state; technical, human, and regulatory processes provide multiple layers of defense. A failure, an accident, therefore requires failures in multiple processes occurring simultaneously.
Physicians by nature and training want to get an “A,” and so do most managers. As money has become tighter, though, there has been greater emphasis from everyone on improving “efficiency,” which generally means eliminating “unnecessary,” and often costly, redundancies, particularly in staffing. Yet it is difficult, if not impossible, to predict when curtailing redundancy will decrease clinical safety and quality.
He emphasized that “catastrophe is always just around the corner.” Some years ago, we had a unit in the hospital that had achieved a remarkable streak of infection-free days. I asked the medical director to describe what they had done to be so successful, which she did. She did not mention the problem of vigilance, but when I asked her if she expected the system to fail, her answer was: “Of course.”
One of my major challenges as medical director was to remind my staff not to assume things were going to be okay. I found many instances where experienced staff made incredibly short-sighted decisions because “everybody was doing well.” I often pointed out that the challenge for the successful dialysis nurse was to strive for a day when “nothing happened” while always being ready for the next cardiac arrest or bleeding episode. Too much emphasis on risk leads to paralysis; too little leads to lackadaisical care. Keeping the balance is one of those issues of constant re-calibration, which requires experienced staff and thoughtful, attentive leadership. It is not a machine that “will go of itself.”
A related issue is the failure to recognize the truth that systems operate “because people can make it function despite the presence of many flaws.” Turnover of clinical staff remains a huge issue in most medical organizations. It has been true of nurses for years, and now that more physicians are being directly employed by hospitals, it is becoming true for them as well. Yet a newcomer, no matter how qualified or well-trained, is not going to be familiar with the tacit knowledge used to keep processes out of the ditch.
“Human operators have dual roles: as producers and as defenders against failure. This dynamic quality of system operation, the balancing of demands for production against the possibility of incipient failure is unavoidable. Outsiders rarely acknowledge the duality of this role.”
Administrators may resent the way their clinicians disparage them as “the suits,” but this is a defensive measure by clinicians. Clinicians are fully aware of the risk of failure and are personally liable in case of an accident, even if it was the result of a system failure beyond anyone’s control. The organization may also be sued, but rarely is the administrator held personally responsible. As in the story about the difference in commitment between the chicken and the pig concerning a breakfast of ham and eggs, the clinician is the pig. Recognizing this leads to “defensive medicine,” a term often used pejoratively to imply sloppy thinking and inefficient practice, but defensive medicine is also how clinicians try to increase safety. The same is true of clinician resistance to changing routines. Administrators may become frustrated when surgeons won’t “flex” their OR routines, but experience has taught the surgeons that those routines are safe. Getting change requires demonstrating how the change makes things even safer, not merely cheaper.
Dr. Cook made three other points worth emphasizing. First, “all practitioner actions are gambles.” Second, “human practitioners are the adaptable element of complex systems.” Third, “human expertise in complex systems is constantly changing.” Many of the “experts” are calling for more “reliability,” often shorthand for doing it cheaper with less “unexplained” variation, and the regulatory environment makes administrators more eager to codify and “fix” things than might otherwise be the case. The challenge, of course, is to standardize those things that can and should be standardized, but no more; for clinical care, there almost always needs to be a bypass available. For instance, there is no reason for a hospital to forego a standard sliding-scale insulin regimen. But the “standard” needs to be easily bypassed if the patient is known to be either brittle or resistant, as such patients won’t be served by the usual approach. It may be fair to require documentation of exceptions, but that requirement should not serve as a deterrent. Too often, the bureaucratic hoops involved in making patient-centered decisions cause the clinician to simply give up and move on to the next patient. We need to recognize that all clinical decisions are calculated gambles. Having expert gamblers helps, but it does not make the outcome a sure thing.
To summarize, all medical organizations have to deal with clinical failures regularly. Since it is not possible to eliminate the wager at the heart of clinical care, we need to recognize the need for redundant systems, eternal vigilance, and constant preparation for putting out the next fire. We need to be wary of the unspoken assumptions in the language of manufacturing processes, such as “lean,” which may cause us to forget that clinical decisions involve much larger and much less controllable wagers. Distinguishing waste from necessary redundancy is both difficult and in need of constant re-calibration. Doing this well requires a common focus and expert practitioners in all areas, both clinical and managerial. Lastly, it follows that the tacit knowledge of current employees has operational value today. We need to get past “FTE” thinking and realize that quality and safety of care depend on knowledgeable staff able to interact flexibly with each other and their patients, continuously and carefully.
6 January 2020
Cook RI: How Complex Systems Fail (Revision D [00.4.21]). Accessed 27 December 2019 at https://web.mit.edu/2.75/resources/random/How%20Complex%20Systems%20Fail.pdf.