Theory, Philosophy & Justification for Root Cause Analysis in Healthcare Organizations; Why Bother?

The move to conduct root cause analysis is largely motivated by a growing recognition that the complexity of health care and health care delivery drives the incidence of adverse events uncomfortably and unacceptably high (Brennan et al., 1991). It has been estimated that adverse events occur in nearly 4% of hospitalizations, and that 16% of these lead to permanently disabling injuries or death, i.e., sentinel events (Leape et al., 1991). It has been strongly argued that systems should be designed, and health care professionals trained, in ways that improve patient safety by reducing hazards in health care (Cook and Woods, 1994). These issues combine to the extent that the JCAHO sentinel event policy has been described as a "lawsuit kit for attorneys" (Healthcare Risk Management, 1998). Consistent with this, the National Patient Safety Foundation (NPSF) maintains as its philosophy that most errors result from faulty systems rather than human error; poorly designed processes put people in situations where errors are more likely to be made, so that those people are in essence "set up" to make errors for which they are not truly responsible. Root cause analysis is a set of processes by which the underlying causes of adverse outcomes may be identified, with the goal of preventing the recurrence of such events. The JCAHO has been explicit in defining the circumstances under which it requires that a root cause analysis be performed. We have, however, expanded that requirement within our practice to include a much broader scope of application. As an example, just within one large department for which we consulted, of the eight root cause analyses performed in 1998, only one was required by the current, published JCAHO Sentinel Event Policy.
It is interesting to note that in terms of impact both on patient care and on risk reduction (medical and economic), the root cause analysis required by JCAHO policy was among the least important. Needless to say, in order to resource and justify this expanded role for such analyses, it has become critical to identify an analytic process which is both efficient and effective. There have been several iterations of this process in our practice. There are many different processes by which root cause analyses are performed; the engineering and industrial risk management literature is rife with arguments for and against the different approaches. It is not the purpose of this writing to explore those differences. Comments pertinent to root cause analyses performed outside the health care industry will not distinguish among such approaches, but will address as much as possible those areas of commonality.

Root Cause Analysis in Health Care

One area of undisputed agreement is the observation that without strong support from upper management, root cause analyses will be performed in a perfunctory manner, with the singular purpose of meeting JCAHO regulatory requirements. To be effective, it must be accepted throughout the organization that the result of any given root cause analysis will be used for improvement, not for the assignation of blame. This is in keeping with the basic philosophy and tenets of continuous improvement in any area of endeavor. Because root cause analysis has been accepted for some two decades in industries other than health care, however, the level of acceptance by management and personnel is much greater in those industries than can reasonably be expected in health care organizations. Similarly, the value of this analytic procedure is already accepted in other industries and in government, being part and parcel of policies of the Departments of Energy and Transportation, the Nuclear Regulatory Commission, etc.
In the health care industry, root cause analysis is for the most part still viewed as yet another regulatory requirement, one which adds no value and is far from inexpensive. As a consequence, there is resistance to the performance of root cause analyses, resistance to learning how to perform them, and a lack of support at all levels for their effective use. Lack of familiarity with the pertinent literature from other industries compounds this systemic, generally passive-aggressive (though at times actively aggressive) attitude against root cause analysis in all its aspects. Regrettably, a passing familiarity with such literature will in fact increase this resistance, for two reasons. Among health care administrators, the fact that it is not uncommon to spend substantial sums of money on a single root cause analysis raises the question of cost-effectiveness. Among health care providers, the emphasis on human error in the root cause analysis literature of other industries raises the specter of blame, personal financial liability, and the National Practitioner Data Bank, the last having no equivalent in other industries. Non-practitioners tend to underestimate the real impact of Data Bank reporting, as well as practitioners' emotional reactions to the possibility of such reporting. In sum, even if the risk manager and/or continuous improvement personnel at a given health care facility are convinced of the value of appropriately performed root cause analyses, there are very difficult obstacles to their effective and acceptable performance. Clearly, education throughout the health care organization is the optimal means by which to address these problems. Unfortunately, a few days of training is not sufficient to turn attendees into effective analysis team leaders or facilitators. Books and manuals are not "living" guides, and with their use, translation to novel circumstances is extraordinarily difficult at best.
Even if the analysis is conducted well, there are no standards for reporting, excepting perhaps the JCAHO form ("A Framework for Conducting a Root Cause Analysis in Response to a Sentinel Event"), which has been denigrated by every non-health care root cause analysis consultant with whom we have spoken, though for reasons remarkable for their inconsistency.

Philosophy

There are critical philosophical differences between error reduction in other industries and in the health care industry. These differences are not universal, but they are very common. It has been our experience in discussions with root cause analysis experts in other industries that these differences are usually not appreciated, and at times are in fact considered antithetical to an understanding of how an effective root cause analysis should be approached and conducted. Significantly and similarly, we have seen no awareness of these differences in the literature pertaining to medical applications of root cause analysis. These philosophical differences affect both the process and the outcome of root cause analyses. We have identified three basic philosophical differences, concerning (1) blame, responsibility, and the emphasis on human error; (2) contributing versus causative factors; and (3) the degree of efficacy of corrective actions or solutions. It is significant to note at this juncture that the experts with whom we spent the greatest amount of time discussing these and related issues were representatives of firms offering software designed to facilitate the root cause analysis process. It is largely their responses which are reflected in the following paragraphs where experts' opinions are reported. Regarding the first of these, we offer an assertion made by a prominent expert in root cause analysis outside of the health care arena: "All sentinel events are the result of human errors that queue up in a particular sequence."
This writer has just guaranteed that any health care provider who reads this line will adamantly oppose any effort to institute root cause analytic processes, and has thereby undermined any provider, any hospital counsel, and any risk manager who is trying to gain the trust of his or her provider staff in such an endeavor. Whether the above quotation is accurate is irrelevant to its extremely negative emotional impact. That such comments are not uncommon in the root cause analysis literature means that very careful educational groundwork must be laid before health care personnel are even encouraged to read that literature; reality is not necessarily good if the recipient has not been adequately prepared to deal with it. Going further, in any industry, litigation over a sentinel event may result from the root cause analysis itself if a plaintiff secures the product of such an analysis. Personal liability, however, is a far greater risk in the health care industry than in other industries, and issues of personal fear are correspondingly more prominent. Regarding the validity of the above assertion, it is interesting to note that Lucian L. Leape, MD, one of the foremost proponents of root cause analysis in medicine, articulates his views thus: "Errors must be accepted as evidence of systems flaws, not character flaws" (Leape, 1994, 1997). In the area of risk management in general (not limited to health care), James Reason asserts, "Indeed it could be argued that for certain complex, automated, well-defended systems, such as nuclear power plants, chemical process plants, modern commercial aircraft and various medical activities (emphasis added), organizational accidents are really the only kind left to happen.
Such systems are largely proof against single failures, either human or technical.... Perhaps the greatest risk to these high technology systems is the insidious accumulation of latent failures, hidden behind computerized, "intelligent" interfaces, obscured by layers of management, or lost in the interstices between various specialized departments" (Reason, 1994). Cook and Woods (1994) present four distinct reasons that failures or accidents are attributed to human error, especially in "complex systems," when in fact this largely constitutes a misattribution. Moray (1994) asserts that "...the systems of which humans are a part call forth errors from humans, not the other way around." The foremost experts in risk management both within and outside the health care industry emphasize system failures and system-driven errors over direct human error, and the philosophy guiding the process of root cause analysis, be it manual or automated, should reflect this emphasis. Fears of criminal prosecution within the medical community are not without foundation. In California, for example, a handful of physicians have faced second-degree murder charges. "We need to make sure we find a way to prevent criminal prosecution of doctors from becoming a trend," says a California emergency physician who was acquitted of murder charges, and with them the possibility of 15 years to life in prison, stemming from his clinical decisions. Another doctor is standing trial for the death of a patient whose uterus he perforated during an abortion (Prager, 1998). These are extreme examples of how a sentinel event with a tragically poor outcome can affect physicians. They are also examples of how the health care provider can be crushed by a system that points to human error without considering the systems and process deficiencies which can likely be identified and corrected with a thorough and credible root cause analysis (JCAHO, 1996).
In our research into root cause analysis in the aviation, aerospace, transportation, electronics, security, and energy industries, we found a nearly ubiquitous underlying assumption that causative factors had to be (1) necessary and sufficient, (2) necessary but not sufficient, or (3) irrelevant. The notion that a factor could be neither necessary nor sufficient to cause an adverse event and yet still be of critical importance seemed, for the most part, to be an alien and totally unacceptable concept. Even after we presented several actual circumstances in which several factors combined to contribute to a sentinel event, the concept of contributing but non-causative factors was rejected by several of the consultants with whom we spoke. In fairness, however, this orientation was held most firmly by those practitioners who worked primarily within the confines of mathematical modeling as applied to root cause analysis. These tended to be the experts who were most data-bound in their considerations, and their approaches emphasized the use of factor weighting, cut scores, etc. While their approaches have substantial advantages in terms of mathematical objectivity, their flexibility in application to medical circumstances appeared to us to be limited. Of note is the fact that such rigid rejection of contributing factors is directly contrary to views expressed by the most recognized experts in the fields of human behavior and risk management (Grandjean, 1980; Norman, 1981, 1988; Reason, 1990). As Reason eloquently describes, "... a detailed examination of the causes of these accidents reveals the insidious concatenation of often relatively banal factors, hardly significant in themselves, but devastating in their combination" (Reason, 1994). Even less acceptable was the idea that a partial solution to an identified root cause was worth consideration and implementation.
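The distinction between causative and contributing factors can be made concrete with a toy weighted-factor model of the sort the data-bound consultants described. The factor names, weights, and cut score below are illustrative assumptions only, not values drawn from any real analysis:

```python
# Hypothetical sketch: no single factor is necessary or sufficient,
# yet a combination of "banal" contributing factors crosses the cut
# score at which an adverse event becomes likely. All values below
# are invented for illustration.

FACTOR_WEIGHTS = {
    "understaffed_shift": 0.30,
    "look_alike_packaging": 0.25,
    "interrupted_handoff": 0.30,
    "unfamiliar_pump_model": 0.25,
}
EVENT_THRESHOLD = 0.7  # assumed cut score for "event likely"

def combined_risk(present_factors):
    """Sum the weights of the contributing factors present."""
    return sum(FACTOR_WEIGHTS[f] for f in present_factors)

# Each factor alone stays well below the threshold (none is sufficient)...
assert all(w < EVENT_THRESHOLD for w in FACTOR_WEIGHTS.values())

# ...but three together exceed it, and removing any one of the three
# (so no factor is "necessary") still leaves other risky combinations.
trio = ["understaffed_shift", "look_alike_packaging", "interrupted_handoff"]
print(combined_risk(trio))  # 0.85, above the 0.7 cut score
```

Under such a model, every factor is "neither necessary nor sufficient," yet each is plainly worth corrective attention, which is precisely the point the strictly causal taxonomy fails to accommodate.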
It appears to be assumed that any root cause either can be "corrected" or is "non-correctable," though the exact terminology varied among consultant writers. Not only would we challenge this assumption in the health care arena, we would challenge it in all areas of application. The difficulty appears to reside in the recognized requirement to monitor the results of any corrective action implemented. With sentinel events, we are generally discussing very low frequency occurrences, which means that the rate of occurrence may be a relatively meaningless metric. Every occurrence is critical, is sentinel, and anything less than a complete correction is less than adequate, i.e., is perceived by certain of these consultants, and possibly by both internal and external customers, to be a failure. This perception, however, belies the underlying philosophy and guiding principles of continuous improvement: improvements are incremental and ongoing; perfection is targeted, but not attained. Regrettably, sentinel events occur with certain "acceptable" levels of incidence, though for most sentinel events which result in an actual adverse outcome, even one instance is indeed unacceptable. It is our goal to progressively reduce the frequency of all classes of adverse events, knowing that many will not be eliminated. This does not necessarily define a failure. We would argue that this applies both within and outside the health care industry.
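Why the rate of occurrence is such a weak monitoring metric for rare events can be sketched numerically. Assuming, purely for illustration, that sentinel events follow a Poisson process with a baseline rate of 1 per 10,000 admissions and that a corrective action halves that rate:

```python
import math

# Hypothetical sketch of why occurrence *rate* says little about
# rare events over short monitoring windows. The rates and admission
# volume are illustrative assumptions, not real incident data.

baseline_rate = 1 / 10_000        # events per admission (assumed)
improved_rate = baseline_rate / 2 # after a fix that halves the risk
admissions_per_quarter = 2_000

def p_zero_events(rate, n):
    """Poisson probability of observing zero events in n admissions."""
    return math.exp(-rate * n)

before = p_zero_events(baseline_rate, admissions_per_quarter)
after = p_zero_events(improved_rate, admissions_per_quarter)
print(f"P(no events in a quarter), before fix: {before:.3f}")  # ~0.819
print(f"P(no events in a quarter), after fix:  {after:.3f}")   # ~0.905
# Zero observed events is the most likely outcome either way, so a
# quarter's event count cannot distinguish the two rates; monitoring
# must rely on process measures rather than outcome frequency alone.
```

This is one reason partial corrections resist validation by outcome counts: the signal is simply too sparse over any practical monitoring period.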
Table of References

Brennan, T.A., Leape, L.L., Laird, N.M., Hebert, L., Localio, A.R., Lawthers, A.G., Newhouse, J.P., Weiler, P.C., & Hiatt, H.H. (1991). Incidence of adverse events and negligence in hospitalized patients: Results from the Harvard Medical Practice Study I. New England Journal of Medicine, 324, 370-376.

Cook, R.I. & Woods, D.D. (1994). Operating at the sharp end. In M.S. Bogner (Ed.), Human Error in Medicine. Hillsdale, NJ: Lawrence Erlbaum Associates, Inc.

Grandjean, E. (1980). Fitting the Task to the Man. London: Taylor and Francis.

Healthcare Risk Management (1998). Sentinel event policy changed, but it's still a 'lawsuit kit' for attorneys. Healthcare Risk Management, July 1998.

JCAHO (1996). Conducting a Root Cause Analysis in Response to a Sentinel Event.

Leape, L.L., Brennan, T.A., Laird, N.M., Lawthers, A.G., Localio, A.R., Barnes, B.A., Hebert, L., Newhouse, J.P., & Hiatt, H.H. (1991). The nature of adverse events in hospitalized patients. New England Journal of Medicine, 324, 377-384.

Leape, L.L. (1994). Preventability of medical injury. In M.S. Bogner (Ed.), Human Error in Medicine. Hillsdale, NJ: Lawrence Erlbaum Associates, Inc.

Moray, N. (1994). Error reduction as a systems problem. In M.S. Bogner (Ed.), Human Error in Medicine. Hillsdale, NJ: Lawrence Erlbaum Associates, Inc.

Norman, D.A. (1981). Categorization of action slips. Psychological Review, 88(1), 1-15.

Norman, D.A. (1988). The Psychology of Everyday Things. New York: Basic Books.

Prager, L.O. (1998). Keeping clinical errors out of criminal courts. American Medical News, March 16, 41(11), p. 1.

Reason, J.T. (1990). Human Error. Cambridge, England: Cambridge University Press.

Reason, J.T. (1994). Foreword. In M.S. Bogner (Ed.), Human Error in Medicine. Hillsdale, NJ: Lawrence Erlbaum Associates, Inc.
Sponsored by Medical Risk Management Associates, LLC. Copyright © 1998, 1999 MRMA, LLC. All rights reserved.