U.S. Department of Health and Human Services
Agency for Healthcare Research and Quality

This information is for reference purposes only. It was current when produced and may now be outdated. Archived material is no longer maintained, and some links may not work.


How Safe Is Our Health Care System?

A Systems Approach


John Eisenberg, M.D., M.B.A., Director, Agency for Healthcare Research and Quality (AHRQ), U.S. Department of Health and Human Services, Rockville, MD.

David Woods, Ph.D., Professor, Industrial and Systems Engineering, Institute for Ergonomics, The Ohio State University, Columbus, OH.

Dr. John Eisenberg stressed that little mistakes make big problems. Health care is a complex system with numerous interactions among people and with technology. Humans cannot be eliminated from the health care system, and humans will always make errors. Systems can and must be designed to prevent, detect, and handle human errors. Different types of errors are likely to require different solutions.

For example, systems abound in the automobile industry to prevent errors:

  • A bell rings if the headlights are on when the ignition is off.
  • A light comes on if a door is open.
  • The doors lock from the outside only with a key.
  • An airbag automatically inflates in a collision.
  • The brake must be pressed to shift from park into drive or reverse.

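The last item in the list above is a forcing function: the design makes the unsafe action impossible rather than merely warning about it. As a rough illustration only (the class and behavior here are a toy model, not drawn from any actual vehicle system), the brake-to-shift interlock can be sketched as:

```python
class Transmission:
    """Toy model of a shift interlock: a forcing function that
    prevents an error rather than merely warning about it."""

    def __init__(self):
        self.gear = "park"
        self.brake_pressed = False

    def press_brake(self):
        self.brake_pressed = True

    def release_brake(self):
        self.brake_pressed = False

    def shift(self, target):
        # The interlock: leaving "park" requires the brake to be pressed.
        if self.gear == "park" and target in ("drive", "reverse") \
                and not self.brake_pressed:
            return False  # shift refused; the unsafe action cannot occur
        self.gear = target
        return True
```

The point of the sketch is that the check lives in the system, not in the operator's memory: a distracted driver cannot bypass it by forgetting.
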
Warning systems that alert health care professionals to impending errors, like the warning systems found in today's cars, are needed to prevent errors in medical care. Studies show that seven specific systems failures account for 78 percent of adverse drug events in hospitals. All seven of these failures could be corrected by better information systems that detect and intercept errors. Examples include:

  • Drug knowledge dissemination.
  • Dose and identification checking.
  • Patient information availability.
  • Order transcription.
  • Allergy defense.
  • Medication order tracking.
  • Improved inter-service communication.
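
To make the idea concrete, two of the failure modes above (allergy defense and dose checking) can be caught by a simple check at the point of order entry. The sketch below is hypothetical: the drug names, dose limits, and function are illustrative and not drawn from any real formulary or clinical system.

```python
# Illustrative maximum single doses in milligrams (hypothetical values).
DOSE_LIMITS_MG = {"heparin": 10000, "warfarin": 10}

def check_order(drug, dose_mg, patient_allergies):
    """Screen a medication order against patient allergies and dose
    limits. Returns a list of warnings; an empty list means no
    problem was detected."""
    warnings = []
    # Allergy defense: block drugs the patient is known to react to.
    if drug in patient_allergies:
        warnings.append(f"ALLERGY: patient is allergic to {drug}")
    # Dose checking: flag doses above the configured limit.
    limit = DOSE_LIMITS_MG.get(drug)
    if limit is not None and dose_mg > limit:
        warnings.append(
            f"DOSE: {dose_mg} mg exceeds limit of {limit} mg for {drug}")
    return warnings
```

Such a check works only if the patient's allergy list and the drug knowledge base are current and available at the point of ordering, which is why the list above pairs the checks themselves with information availability and knowledge dissemination.
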

Two major initiatives have galvanized activity around medical errors at the Federal level.

First, a Quality Interagency Coordination Task Force (QuIC) has been established with representatives from the Department of Health and Human Services, Department of Labor, Department of Defense, Department of Veterans Affairs, Department of Commerce, Federal Trade Commission, National Highway Traffic Safety Administration, Office of Management and Budget, U.S. Coast Guard, Office of Personnel Management, and Federal Bureau of Prisons. The focus of QuIC is to coordinate interagency activities in health care quality, including measurement, implementation, information and computer systems, and public communication.

QuIC is addressing the errors agenda in several ways:

  • Creating a national focus to enhance the knowledge base on patient safety.
  • Setting performance standards and expectations for safety.
  • Implementing safety systems in health care organizations.

Second, AHRQ plans to establish a Center for Quality Improvement and Patient Safety (CQUIPS) to:

  • Fund research on medical errors.
  • Translate research findings into improved practices.
  • Educate patients, consumers, and healthcare providers about safety issues.
  • Coordinate collaboration by public and private sectors.

Reporting systems help practitioners identify and learn from errors. Such systems can also make information on errors available to the public. Two types of systems are proposed:

  • Mandatory systems to address serious, preventable, identifiable adverse events with no identification of patients or healthcare professionals.
  • Voluntary systems to solicit reports on near misses or close calls, with data kept confidential.

The QuIC suggested that Federal support be made available to States for their initiatives to reduce medical errors. Possible forms of support include:

  • Outreach to stakeholders, with AHRQ's User Liaison Program as a model.
  • Building awareness among the public, purchasers, and providers.
  • Publishing examples of successful error-reduction strategies.
  • A Patient Safety Clearinghouse.
  • A National Morbidity and Mortality Conference.
  • A Website where patients can report incidents.

States have significant roles to play as regulators, purchasers, providers, educators, and disseminators of public information:

  • As regulators, States accredit and license facilities and practitioners and enact confidentiality protections. Under their regulatory authority, States can also develop or expand error-reporting systems and analytic capacity within the State.
  • As purchasers, States can require plans serving State employees or Medicaid beneficiaries to have patient safety programs in place and to obtain accreditation that assesses those programs.
  • As providers, States can provide strong leadership and support interdisciplinary staff training on patient safety. States can also set examples by using technology or practices that have been demonstrated to reduce errors.
  • As supporters of education, States can influence patient safety through the development of curricula for health professional schools. These curricula could include information on error recognition and reduction and the design of training programs on analyzing error data.
  • Finally, in their role as disseminators of public information, States can provide information on the safety practices used by health care organizations, errors leading to death or serious injury, and steps patients can take to improve their own safety. Throughout their educational initiatives, States should stress the importance of viewing medical errors as system failures rather than as occasions for assigning blame.

David Woods addressed why people make mistakes and what we can do about it. Complex systems fail because of the combination of multiple small failures, each individually insufficient to cause an accident. These failures are latent in the system, and their pattern changes over time. Practitioners who interact directly with the monitored process work at what is known as the sharp end of the system. Competing demands, dilemmas, conflicts, and uncertainty are the central features of operations at the sharp end. Work at the sharp end inevitably involves competing demands for production and for failure-free performance. There is also a strong pull toward action at the sharp end: practitioners are convinced that doing something is better than doing nothing at all.

The resources for and constraints on practitioners' technical work arise from the institutional, management, regulatory, and technological features at the blunt end of the system. Although often not the focus of investigations, features of the blunt end contribute significantly to performance and outcome.

When a failure occurs at the sharp end, the investigation normally stops once a human error is found. This leads to sterile incident collections, and learning halts. Post-incident reviews generally identify human error as the cause of failure. Knowledge of the outcome creates a hindsight bias: the assumption that the path to failure should have been foreseeable, although it was not foreseen.

Organizational reactions to failure focus on human error. The typical reactions are "blame and train," sanctions, and new regulations, rules, and technology. These interventions increase complexity and introduce new forms of failure, and the cycle repeats itself.

People make safety. Improving safety depends on understanding the details of technical work, how success is usually achieved, and how failure sometimes occurs. Effective change follows.




