U.S. Department of Health and Human Services
Agency for Healthcare Research and Quality

State Healthcare Quality Improvement Workshop: Tools You Can Use to Make a Difference

This information is for reference purposes only. It was current when produced and may now be outdated. Archive material is no longer maintained, and some links may not work. Persons with disabilities having difficulty accessing this information should contact us; let us know the nature of the problem, the Web address of what you want, and your contact information.

Please go to for current information.

On January 17, 2008, Marguerite Barrett provided Hands-on Tool Training for the State Snapshots at the State Healthcare Quality Improvement Workshop. This is the text version of the event's slide presentation. Please select the following link to access the slides: (PowerPoint® file, 645 KB; PDF File, 65 KB; PDF Help).

Slides: 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10

Slide 1: Hands-on Tool Training: State Snapshots

State Healthcare Quality Improvement Workshop:
Tools You Can Use to Make a Difference
January 17-18, 2008

Marguerite Barrett, M.Sc.,
Thomson Healthcare (Medstat)

At the top of the slide are the logos for the Department of Health & Human Services and AHRQ. The Department of Health & Human Services logo is an artistic image of an eagle with the outlined profile of faces. The AHRQ logo reads, "AHRQ — Agency for Healthcare Research and Quality: Advancing Excellence in Health Care."

This presentation uses a template with a blue background and a header with the AHRQ and Department of Health & Human Services logos on the left.


Slide 2: Overview

  • State Snapshots
    -State-level measures from NHQR
    -Technical challenges

  • Maine Quality Forum
    -Hospital-level measures
    -Technical challenges scaling the State Snapshot application to the hospital level


Slide 3: Technical Challenges of the NHQR State Snapshots

  1. Given the wide range of state-level measures in the NHQR, how do we summarize the information?

  2. How do we present the material to a general audience?

  3. How do we build a Web environment that is easily updated as material changes?


Slide 4: How do we summarize the NHQR measures?

  • Decide which NHQR measures belong with each composite
    -Overall, type of care, setting of care, care by clinical area
  • Classify state performance for each measure
    -Calculate all-state and regional averages
    -Determine if the state is statistically better than average, average, or worse than average
  • Score state across measures in a composite
    -1 point for each NHQR measure that was better than average.
    -0.5 point for each NHQR measure that was at average.
    -0 points for each NHQR measure that was worse than average.
    -Sum points and divide by the number of measures
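The scoring rule above can be sketched in a few lines of Python. This is a minimal illustration, not the actual NHQR code: the classification of each measure as "better," "average," or "worse" is assumed to have been done already, and the function and variable names are hypothetical.

```python
# Hypothetical sketch of the composite scoring rule on this slide.
# Each measure's classification ("better", "average", "worse") is an
# assumed input; the NHQR's statistical classification step is not shown.

POINTS = {"better": 1.0, "average": 0.5, "worse": 0.0}

def composite_score(classifications):
    """Sum per-measure points and divide by the number of measures."""
    if not classifications:
        raise ValueError("composite has no measures")
    total = sum(POINTS[c] for c in classifications)
    return total / len(classifications)

# Example: a composite of four measures
score = composite_score(["better", "average", "worse", "better"])
print(score)  # (1 + 0.5 + 0 + 1) / 4 = 0.625
```

A composite score thus always falls between 0 (worse than average on every measure) and 1 (better than average on every measure).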


Slide 5: How do we present the material?

  • Composite-specific information
    -Graphic "speedometers" to display current and baseline score for all-state and regional comparison
    -Best performing states table
  • Measure-specific information
    -Data tables "behind" composite speedometers
    -Strongest/weakest measures
    -Ranking table on selected measures


Slide 6: How do we build a Web environment that is easily updated?

  • Web pages are data driven.
  • XML files contain measure titles, composite names, state-level data, and information on associated graphic files.
  • Template web page displays appropriate content based on selection of state and composite measure.
  • Text and graphic updates are made by updating XML files.
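The data-driven approach described above can be illustrated with Python's standard-library XML parser. The XML layout here is invented for illustration; the real State Snapshots files, and their element and attribute names, are assumptions.

```python
import xml.etree.ElementTree as ET

# Hypothetical XML layout: element and attribute names are illustrative
# only, not the actual State Snapshots schema.
xml_data = """
<composite name="Overall Health Care Quality">
  <state abbr="ME" score="0.62" meter="me_overall.gif"/>
  <state abbr="VT" score="0.71" meter="vt_overall.gif"/>
</composite>
"""

def render(xml_text, state_abbr):
    """Fill a page template from the XML for the selected state and composite."""
    root = ET.fromstring(xml_text)
    for state in root.iter("state"):
        if state.get("abbr") == state_abbr:
            return "{composite}: {abbr} scored {score} (graphic: {meter})".format(
                composite=root.get("name"), **state.attrib)
    return None  # state not found in this composite's data file

print(render(xml_data, "ME"))
```

Because the page content and graphic file names live entirely in the XML, an update cycle only touches data files; the template page itself never changes.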


Slide 7: Technical Challenges of the MQF Web Site

  • Reporting by hospital has unique challenges
    -Small N (hospitals)
    -Small n (cases)
  • Statistical test within an individual measure — not possible
  • Statistical test for difference within composite — only possible for some composites


Slide 8: MQF Web Site — Clinical

  • QIO Clinical Measures
    -Composites for heart disease, pneumonia, preventing infections
    -Statistical test (logistic regression) for difference within composite

To the right of the bullet points is an image of a performance meter for overall heart disease care, with the following caption: Hospital X is an Average Performer in Overall Heart Disease Care.


Slide 9: MQF Web Site—Nursing

  • Nursing Data
    -No statistical test for differences
    -Data tables with rates for the hospital and the average of similar hospitals


Slide 10: MQF Web Site—Consistency

  • Consistency of performance meter on 8 clinical and 2 nursing measures reported by most Maine hospitals
  • Ranks each measure as in the best or worst 10% of all hospitals

Below the bullet points is a pie chart showing the consistency of performance across ten measures of general hospital care. The pie chart does not specify percentages, and is split into three different portions that indicate: average in 70% of all measures in yellow, best in 20% of measures in green, and worst in 10% of measures in red.
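The shares shown in the pie chart can be computed by tallying the per-measure classifications, as sketched below. The category names and the classification step itself are assumptions for illustration; only the 70/20/10 split comes from the slide.

```python
from collections import Counter

# Hypothetical sketch: given a classification of each of the 10 reported
# measures, compute the shares shown in the consistency pie chart.
# Category names ("best", "average", "worst") are assumed.

def consistency_shares(categories):
    """Return each category's share of the hospital's reported measures."""
    counts = Counter(categories)
    n = len(categories)
    return {cat: counts.get(cat, 0) / n for cat in ("best", "average", "worst")}

# Matches the chart's split: average on 7 measures, best on 2, worst on 1
measures = ["average"] * 7 + ["best"] * 2 + ["worst"]
shares = consistency_shares(measures)
print(shares)  # {'best': 0.2, 'average': 0.7, 'worst': 0.1}
```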




