Improving Performance in Practice (Text Version)

Slide presentation from the AHRQ 2009 conference.

On September 15, 2009, Darren DeWalt made this presentation at the 2009 Annual Conference. Select to access the PowerPoint® presentation (2.6 MB).

Slide 1

Slide 1. Improving Performance in Practice

Improving Performance in Practice

On the road to a large scale system to improve outcomes for populations of patients

DeWalt DA, McNeill J, Stanford K, Rome M, Margolis P

Funded by the Robert Wood Johnson Foundation


Slide 2

Slide 2. Outline


  • IPIP purpose and design
  • Intervention and evolution
  • Data source and evaluation methods
  • Results
  • Interpretation


Slide 3

Slide 3. IPIP Program Purpose

IPIP Program Purpose

  • Align efforts and motivate action across primary care specialties and all levels of the health care system to transform care delivery
  • Assist practices in re-designing care
    • Initial focus on diabetes and asthma
    • Spread to preventive services and other conditions
  • Improve outcomes for populations of patients


Slide 4

Slide 4. IPIP:  A Multi-Level Improvement Effort

IPIP: A Multi-Level Improvement Effort

An image of 5 concentric circles is shown. From the outer ring inward, they are labeled "National," "State," "Network," "Practice," and "Patient."


Slide 5

Slide 5. IPIP Design - Practice Level

IPIP Design-Practice Level

  • Local improvement networks
  • Data sharing for learning
  • QI support for improvement networks and individual practices through quality improvement coaches
  • State leaders, network leaders, and IPIP experience led to an evolution in how these were operationalized
  • This evolution and variation give us the opportunity to learn about different effects


Slide 6

Slide 6. Practice Coaching

Practice Coaching

  • Onsite assessment of current systems
  • Onsite teaching and technical assistance
    • Team formation and practice engagement
    • QI methods (Model for Improvement)
    • Information systems advice
    • Measures and reporting
    • Interpretation of performance data
    • Recommended changes in care delivery
    • Improvement ideas
  • Linkage with collaborative improvement programs


Slide 7

Slide 7. Other Elements of Practice Support (Context)

Other Elements of Practice Support (Context)

  • The IPIP intervention is multifaceted
  • Other factors could affect improvement as much as or more than coaching style or content
    • Collaborative improvement efforts
    • Practice selection
    • External motivators and incentives
    • External leadership
    • Recommended practice design changes


Slide 8

Slide 8. Objective


  • To evaluate outcomes of the IPIP improvement effort for three states in their first year


Slide 9

Slide 9. Comparison


  Prototype Year: States A and B. Second Year: State C.

  • Practice Selection
    • State A: Practices signed up (media campaign)
    • State B: Practices were "hand picked" (buddies, cooperative...)
    • State C: Practices were recruited for the PCMH pilot
  • Collaboration
    • State A: No group work
    • State B: Evening meeting 3 times per year
    • State C: Breakthrough Series
  • Financial Incentives
    • State A: None
    • State B: $2,000 to report data
    • State C: Dramatic incentives (e.g., $28-95K/FTE and payer mix for PCMH)
  • Prepared Registry
    • State A: No
    • State B: No
    • State C: Yes
  • Consulting
    • States A, B, and C: QIC support
  • Topic
    • States A, B, and C: Focus on diabetes or asthma


Slide 10

Slide 10. Measures


  • Process measures (e.g., % with DM with eye exam)
  • Outcome measures (e.g., % with DM with BP<130/80)
  • Implementation
    • Rated on scale of 0-5
      • Registries
      • Protocols
      • Templates
      • Self-management support
      • Overall


Slide 11

Slide 11. Example Rating System

Example Rating System

  • 0 - No activity: No activity on registry adoption or use.
  • 1 - Selected: Practice has chosen a registry, but not yet begun using it.
  • 2 - Installed: Practice has registry installed on a computer, set up a template, entered demographic data on patients of interest (e.g., diabetes) or has a process outlined to systematically enter the data.
  • 3 - Testing workflow: Practice is testing process for entering clinical data into registry; not yet using the registry to help with daily care of patients.
  • 4 - Patient management: All clinical data is entered into the registry and practice is using the registry daily to plan care for patients and is able to produce consistent reports on population performance.
  • 5 - Full integration: Registry is kept up to date with consistent, reliable processes. Practice has checks and monitors registry processes. Practice uses registry to manage entire patient panel (population).
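
As an illustration only, a rating rubric like the one above can be stored as a simple lookup table keyed by the numeric rating; the minimal Python sketch below copies the labels from this scale, but the data structure and function are hypothetical, not part of IPIP's actual tooling.

# Sketch: the registry implementation rating scale from this slide as a lookup
# table a coach report might use (illustrative assumption, not IPIP's software).

REGISTRY_RATING_LABELS = {
    0: "No activity",
    1: "Selected",
    2: "Installed",
    3: "Testing workflow",
    4: "Patient management",
    5: "Full integration",
}

def describe_rating(rating):
    """Map an integer 0-5 coach rating to its label from the scale above."""
    return REGISTRY_RATING_LABELS[rating]

print(describe_rating(4))  # -> "Patient management"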


Slide 12

Slide 12. Data Source

Data Source

  • Practice reports own performance
  • Establish baseline
    • Takes a number of months for data quality to stabilize
    • Baseline is taken at the first stable data point (biases toward null)
  • Assume no improvement if a practice never achieves a stable baseline (biases toward null)
  • States A and B started February 2007
  • State C started June 2008


Slide 13

Slide 13. Image: 6 line charts as examples of one practice's data

Image: 6 line charts. Here is an example of one practice's data, which demonstrates why we had to identify a time when their reporting system became stable. In the upper left-hand corner, you can see the count of diabetes patients, which is relatively stable from the very beginning. But look at their other measures. The pattern shown for A1C>9 is impossible, as is the one for A1C<7. They were clearly still sorting out their measurement system, either getting data entered into their registry or EHR or figuring out the right queries to run. For our analysis, we started their data in November 2008 instead of July 2008.
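
The slides do not state the exact rule used to decide when a practice's reporting had become stable. Below is a minimal sketch, assuming monthly percentages and a simple heuristic that treats large month-to-month swings as evidence the measurement system is still settling; the tolerance, run length, function name, and example values are assumptions for illustration, not IPIP's documented method.

# Hedged sketch of one way to choose a "first stable month" for a practice's
# self-reported measure. The rule and numbers below are illustrative assumptions.

def first_stable_month(monthly_values, tolerance=20.0, run_length=3):
    """Return the index of the first month that begins a run of `run_length`
    consecutive months whose month-to-month changes stay within `tolerance`
    percentage points; return None if the series never stabilizes."""
    for start in range(len(monthly_values) - run_length + 1):
        window = monthly_values[start:start + run_length]
        jumps = [abs(b - a) for a, b in zip(window, window[1:])]
        if all(j <= tolerance for j in jumps):
            return start
    return None  # treated as "never stable"

# Made-up series: early months swing implausibly, later months settle.
a1c_above_9 = [80.0, 5.0, 60.0, 12.0, 14.0, 13.0, 12.5]
print(first_stable_month(a1c_above_9))  # -> 3 in this hypothetical example

Practices flagged as never stable would be counted as showing no improvement, consistent with the bias-toward-null treatment described on Slide 12.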


Slide 14

Slide 14. Analysis


  • Compare percent of practices with specified absolute improvement
    • >10% improvement in process measures
    • >5% improvement in outcome measures
  • Calculate average change in performance per month (see the sketch after this list)
    • Allows us to take into account different amounts of time per practice
    • Based on difference between first stable month and final month
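
To make the calculation concrete, here is a rough Python sketch of the two summaries described above: the share of practices exceeding an absolute improvement threshold, and the mean change per month between the first stable month and the final month. The data layout, practice names, and example values are hypothetical, not IPIP data.

from statistics import mean

# Thresholds from Slide 14 (absolute percentage-point improvement).
PROCESS_THRESHOLD = 10.0
OUTCOME_THRESHOLD = 5.0  # outcome measures would use this threshold instead

def absolute_change(series):
    """Change from the first stable month to the final month (percentage points)."""
    return series[-1][1] - series[0][1]

def change_per_month(series):
    """Average change per month, adjusting for differing follow-up per practice."""
    (first_month, first_value), (last_month, last_value) = series[0], series[-1]
    months = last_month - first_month
    return (last_value - first_value) / months if months else 0.0

def percent_improved(practices, threshold):
    """Percent of practices whose absolute improvement exceeds the threshold."""
    improved = sum(1 for s in practices.values() if absolute_change(s) > threshold)
    return 100.0 * improved / len(practices)

# Hypothetical eye-exam rates (a process measure): {practice: [(month_index, percent), ...]},
# restricted to months at or after each practice's first stable month.
eye_exam = {
    "practice_1": [(0, 20.0), (6, 35.0)],
    "practice_2": [(0, 25.0), (10, 28.0)],
    "practice_3": [(0, 30.0), (12, 55.0)],
}

print(percent_improved(eye_exam, PROCESS_THRESHOLD))          # share improving by >10 points
print(mean(change_per_month(s) for s in eye_exam.values()))   # mean improvement per month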


Slide 15

Slide 15. Results: Description of Practices

Results: Description of Practices

  Measure: State A / State B / State C
  • EHR: 12% / 58% / 63%
  • Improvement Experience: 24% / 25% / 38%
  • Median Number of Providers: 4 (range: 1-11) / (range: 1-61) / (range: 2-65)


Slide 16

Slide 16. Time to Data Stability

Time to Data Stability

  • State A: 16 practices
    • Never stable: 5 practices
    • Achieved stable reporting by June 2008: 11 practices
      • Time to achieve: mean 5.7 months; median 6.0 months
    • Maximum months of analysis: 16
  • State B: 12 practices
    • Never stable: 2 practices
    • Achieved stable reporting by June 2008: 10 practices
      • Time to achieve: mean 8.8 months; median 10 months
    • Maximum months of analysis: 16
  • State C: 24 practices
    • Never stable: 2 practices
    • Achieved stable reporting by June 2008: 22 practices
      • Time to achieve: mean 3.8 months; median 4 months
    • Maximum months of analysis: 12


Slide 17

Slide 17. Baseline Performance

Baseline Performance

  Measure: State A / State B / State C
  • % Attn to Nephropathy: 44.3 / 61.1 / 60.3
  • % Foot Exam: 36.4 / 55.4 / 46.1
  • % LDL test: 66.1 / 74.3 / 68.0
  • % Flu Vacc: 23.1 / 43.4 / 44.9
  • % Eye Exam: 20.2 / 32.3 / 25.3
  • % A1C < 9: 72.5 / 88.9 / 70.6
  • % BP < 130: 38.1 / 40.2 / 41.5
  • % BP < 140: 64.2 / 60.2 / 65.0
  • % LDL < 100: 46.9 / 39.8 / 39.0
  • % LDL < 130: 59.8 / 61.3 / 55.7


Slide 18

Slide 18. Percent of Practices with greater than 10 percent Improvement

Percent of Practices with >10% Improvement

Image: Bar chart shows the percent of practices with >10% improvement in each process measure (Foot Exam, Attention to Nephropathy, Smoking Cessation, LDL Test, Flu Vaccination, and Eye Exam) for States A, B, and C. State C shows the highest percentage of improving practices across all measures.


Slide 19

Slide 19. Percent of Practices with greater than 5 percent Improvement

Percent of Practices with >5% Improvement

Image: Bar chart shows the percent of practices with >5% improvement in each outcome measure (A1C <9, BP <130, LDL <100) for States A, B, and C.


Slide 20

Slide 20. Mean Percent Improvement Per Month

Mean Percent Improvement Per Month

Image: Bar chart shows the mean percent improvement per month in each process measure (Foot Exam, Attention to Nephropathy, Smoking Cessation, LDL Test, Flu Vaccination, and Eye Exam) for States A, B, and C. State C shows the greatest improvement across all measures.


Slide 21

Slide 21. Mean Percent Improvement Per Month

Mean Percent Improvement Per Month

Image: Bar chart shows the mean percent improvement per month in each outcome measure (A1C <9, BP <130, LDL <100) for States A, B, and C.


Slide 22

Slide 22. A look under the hood

A look under the hood

Image: This chart demonstrates how we are using the coaches' reports to assess our progress. The y-axis shows time in months, with June 2008 at the bottom and June 2009 at the top. The darker the blue, the higher the implementation rating; for example, the darkest blue corresponds to a rating of four and a half or five on the implementation scale, while the very light blue is around zero to one. We want to see more and more dark blue as you move from the bottom to the top of the chart. These charts represent State C only, because we were not collecting data this way when States A and B started. You can see a reasonable trajectory of implementation of self-management support and registry implementation and use over one year. A lot of practices still have a lot more to do; this work takes time. Putting this in perspective, I think the modest improvements in process and outcome measures may be reasonable during the first year. I believe it is unrealistic to assume that one can engage in this work for a year and be done.


Slide 23

Slide 23. Limitations


  • Using data collected and reported by practices
    • Coaches often spent a lot of time on data reporting
  • Time to stable data led to an underestimate of improvement
  • Statistical tests do not take advantage of repeated measures analysis (sorting out those models now)


Slide 24

Slide 24. Interpretation


  • Magnitude of improvement in process measures is similar to or greater than improvement seen in Health Disparities Collaborative evaluations
  • State C had more consistent improvement across measures, but the differences are not staggering at this point
  • Design of the practice support may affect results
    • Collaborative learning
    • Clear expectations
    • Payment


Slide 25

Slide 25. Where does this lead?

Where does this lead?

  • IPIP is creating a system for improving improvement
  • Variation provides opportunity for significant learning about systems required to drive improvement
    • Move toward more controlled variation
  • Now close to 250 practices nationwide
    • Growth of the program will offer more statistical power
  • With stable ongoing reporting, the data analysis will become easier and more robust
  • Any single intervention will have a modest effect
    • Need to combine elements of practice support


Slide 26

Slide 26. Acknowledgements


Funded by the Robert Wood Johnson Foundation

  • American Board of Medical Specialties
  • American Board of Pediatrics
  • American Board of Family Medicine
  • American Academy of Pediatrics
  • American Academy of Family Physicians

States of

  • Colorado
  • Michigan
  • Minnesota
  • North Carolina
  • Pennsylvania
  • Washington
  • Wisconsin


Slide 27

Slide 27. Comparison to Other Results

Comparison to Other Results

Image: 11 line graphs are shown. Comparison of performance vs. years in intervention.


Slide 28

Slide 28. Self-Management Support Rating Scale

Self-Management Support Rating Scale

  • 0 - No activity: No activity on self management support.
  • 1 - Materials on hand: Practice has obtained patient education materials and handouts to support self-management.
  • 2 - Roles assigned: Practice has completed a plan for providing self-management support that includes all of the elements indicated in the change package. Staff roles and responsibilities are clearly delineated.
  • 3 - Testing workflow: Practice is actively testing its process for self-management support. All staff involved in self-management support have undergone appropriate training. Patient goal setting and systematic follow-up are being implemented in at least part of the practice.
  • 4 - Implementation 70%: Self-management support is consistently offered. The practice documents self-management goals for patients in the chart or registry, and this is performed across the entire practice. Monitoring of reliability is occurring.
  • 5 - Implementation 90%: Patients consistently have self-management goals documented, follow-up system is reliable, staff are comfortable providing self-management support. Ongoing monitoring ensures the process is carried out consistently for all patients.


Slide 29

Slide 29. Simplified Change Package

Simplified Change Package

  • Registry to identify patients prior to visit
  • Templates for planned care (e.g., visit planner)
  • Protocols to standardize care
    • Standard Protocols
    • Nursing Standing Orders
    • Defined Care team roles
  • Self-management support strategies


Slide 30

Slide 30. IPIP National Key Driver Diagram

IPIP National Key Driver Diagram

  • Goals (by January 1, 2010)
    • 350 new practices participating
    • 90,000 new patients in denominators
    • Increase in clinical process measures
    • Improvement in clinical outcome measures

Key Drivers & Interventions

  • Accountable leadership focused on health outcomes
    • Communicate high expectations at all levels
    • Use multiple communication methods
    • Use structured participatory process for setting population-based goals and targets
    • Enumerate and describe entire population of practices
    • Plan for sustainable leadership
    • Develop leaders' improvement skills
  • Partnerships that promote health care quality
    • Partners assume responsibility for outcomes
    • Link to hospitals, public health organizations, quality organizations and others for resources, expertise, data
    • Access to administrative data (e.g. hospitalizations)
  • Attractive motivators and incentives
    • Maintenance of Certification
    • CME
    • Engage payers in design of rewards (e.g. Pay for Performance)
    • NCQA recognition
  • Measure performance and share data
    • Routine performance measurement
    • Transparency of comparative data
    • Standardized measures and definitions
    • Promote and support the effective use of registries
  • Active participation in an organized quality improvement effort
    • Create enduring collaborative improvement networks
      • Promote practice teams that improve rapidly ("super improvers")
      • Share best practices in clinical and process improvements
      • Promote peer-to-peer communication
      • Ongoing cross-organizational and state learning
    • Provide tools and information that promote evidence-based best practices
    • Share knowledge and improve QI support


Slide 31

Slide 31. Goals for IPIP Performance

Goals for IPIP Performance

Measure Goal (%)
DMPctA1CAbove9 5
DMPctBPBelow130 70
DMPctBPBelow140 90
DMPctEyeExam 80
DMPctFluVacc 80
DMPctFootExam 90
DMPctLDLUnder100 70
DMPctLDLUnder130 90
DMPctMicroalb 90
DMPctSmokCess 90
DMPctWithLDL 90
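
For illustration, the sketch below encodes the goal table above and checks hypothetical performance values against it, treating DMPctA1CAbove9 as a lower-is-better measure and the rest as higher-is-better; the sample performance numbers and function name are assumptions, not IPIP data or code.

# Goals copied from the table above; the sample performance values are made up.
GOALS = {
    "DMPctA1CAbove9": 5, "DMPctBPBelow130": 70, "DMPctBPBelow140": 90,
    "DMPctEyeExam": 80, "DMPctFluVacc": 80, "DMPctFootExam": 90,
    "DMPctLDLUnder100": 70, "DMPctLDLUnder130": 90, "DMPctMicroalb": 90,
    "DMPctSmokCess": 90, "DMPctWithLDL": 90,
}
LOWER_IS_BETTER = {"DMPctA1CAbove9"}  # percent with A1C above 9 should fall to the goal

def meets_goal(measure, value):
    """True if the reported percentage satisfies the IPIP goal for that measure."""
    goal = GOALS[measure]
    return value <= goal if measure in LOWER_IS_BETTER else value >= goal

sample_performance = {"DMPctA1CAbove9": 12.0, "DMPctEyeExam": 25.3, "DMPctFootExam": 91.0}
for measure, value in sample_performance.items():
    print(measure, value, "meets goal" if meets_goal(measure, value) else "does not meet goal")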


Slide 32

Slide 32. IPIP Data Flow

IPIP Data Flow

Image: This chart summarizes the IPIP data flow. The top stream is the practice report: practices send us their numerators and denominators for each measure, and practices should use that information to help drive their improvements. The lower stream is the quality improvement coach report, which documents the activities of the improvement team: are they implementing the recommended changes? We combine these data to try to understand which changes are most effective and which aspects of team function help predict success.
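
As a concrete illustration of combining the two streams, the short sketch below joins a hypothetical monthly practice report (numerators and denominators per measure) with a hypothetical coach report (implementation ratings) by practice and month; all field names, record layouts, and values are assumptions, not the actual IPIP schema.

# Illustrative sketch of joining the two IPIP data streams described on this
# slide (field names and record layouts are assumptions, not the real schema).

practice_reports = [
    {"practice": "P01", "month": "2009-06", "measure": "DMPctEyeExam",
     "numerator": 48, "denominator": 160},
    {"practice": "P02", "month": "2009-06", "measure": "DMPctEyeExam",
     "numerator": 90, "denominator": 200},
]

coach_reports = [
    {"practice": "P01", "month": "2009-06", "registry_rating": 4, "smgmt_rating": 3},
    {"practice": "P02", "month": "2009-06", "registry_rating": 2, "smgmt_rating": 1},
]

# Index coach reports by (practice, month) so each performance record can be
# paired with the team's implementation ratings for the same month.
ratings = {(r["practice"], r["month"]): r for r in coach_reports}

for report in practice_reports:
    key = (report["practice"], report["month"])
    rate = 100.0 * report["numerator"] / report["denominator"]
    coach = ratings.get(key, {})
    print(report["practice"], report["measure"], round(rate, 1),
          "registry:", coach.get("registry_rating"),
          "self-mgmt:", coach.get("smgmt_rating"))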


Slide 33

Slide 33. Total Number of Diabetes Patients July 2009

Total Number of Diabetes Patients July 2009

Image: Graph shows the Total Number of Diabetes Patients in July 2009.

Current as of December 2009
Internet Citation: Improving Performance in Practice (Text Version). December 2009. Agency for Healthcare Research and Quality, Rockville, MD.

