U.S. Department of Health and Human Services
Agency for Healthcare Research and Quality

Emergency Preparedness

This information is for reference purposes only. It was current when produced and may now be outdated. Archive material is no longer maintained, and some links may not work. Persons with disabilities having difficulty accessing this information should contact us; let us know the nature of the problem, the Web address of the material you want, and your contact information.


Studies evaluate hospital disaster drill tools

The Joint Commission requires hospitals to put their emergency management plans into action by conducting two disaster drills each year. Because there is no validated method to assess hospital disaster preparedness, University of California, Los Angeles (UCLA), Medical Center researcher Amy H. Kaji, M.D., M.P.H., and colleagues examined three assessment methods during a November 2005 regional disaster drill in Los Angeles to determine how well they worked together. In a second study, Dr. Kaji's team used a drill evaluation tool that the Johns Hopkins University Evidence-based Practice Center developed under a contract with the Agency for Healthcare Research and Quality (AHRQ). Both studies were funded in part by AHRQ (HS13985).

Kaji, A.H., Langford, V., and Lewis, R.J. (2008, September). "Assessing hospital disaster preparedness: A comparison of an on-site survey, directly observed drill performance, and video analysis of teamwork." Annals of Emergency Medicine 52(3), pp. 195-201 and 201.e1-201.e12.

The three methods to assess hospital emergency plans measured different aspects of preparedness for the six (of 17) hospitals that agreed to take part in the study. Before the drill, disaster coordinators completed on-site and e-mailed surveys that addressed items such as their emergency plan, training, communication, and supplies. On the day of the simulated disaster—an explosion at a public event—crews videotaped participants from the six hospitals so that a research group, MedTeams, could assess their teamwork skills, including problem solving, structure, and communications. Finally, 32 fourth-year medical student observers rated participants' performance with an evaluation tool. The authors found that the on-site survey addressed material and staff concerns. AHRQ's drill evaluation tool examined those items but also delved into communication and teamwork. The correlation between the drill evaluation tool and the video analysis was the strongest, most likely because both scrutinize teamwork and communication issues. The researchers suggest their findings could be useful in developing a single tool to assess hospital preparedness that reflects teamwork, communications, surge capacity, supplies, and equipment.

Kaji, A.H., and Lewis, R.J. (2008, September). "Assessment of the reliability of the Johns Hopkins/Agency for Healthcare Research and Quality hospital disaster drill evaluation tool." Annals of Emergency Medicine 52(3), pp. 204-210 and 210.e1-210.e8.

In this study, the researchers used the Los Angeles disaster drill as an opportunity to evaluate the AHRQ drill evaluation tool itself. The tool identifies zones of action during disasters (command, triage, treatment, and decontamination), and observers use it to rate participants' performance in those zones. Two hundred items from the tool were coded as indicating better versus worse preparedness. The authors found the internal reliability of the tool to be high, which suggests its underlying construct may be valid. However, evaluations varied widely among observer pairs. This variation could reflect observers' lack of training, their unfamiliarity with disaster response, or ambiguity in the items they were asked to score, and it indicates that either revision of the tool or more in-depth user training is needed.





AHRQ Advancing Excellence in Health Care