Archive: U.S. Department of Health and Human Services
Archive: Agency for Healthcare Research and Quality

This information is for reference purposes only. It was current when produced and may now be outdated. Archive material is no longer maintained, and some links may not work. Persons with disabilities having difficulty accessing this information should contact us; let us know the nature of the problem, the Web address of what you want, and your contact information.


Evaluation of Phase I Demonstrations of the Pharmacy Quality Alliance (Text Version)

Slide presentation from the AHRQ 2009 conference.

On September 16, 2009, Laura Pizzi made this presentation at the 2009 Annual Conference. Select to access the PowerPoint® presentation (657 KB).

Slide 1


Evaluation of Phase I Demonstrations of the Pharmacy Quality Alliance

AHRQ Annual Conference
September 16, 2009

Laura Pizzi, PharmD, MPH
Associate Professor and Director,
Doris N. Grandon Center for Health Economics
and Outcomes Research
Jefferson School of Population Health
Philadelphia, Pennsylvania USA

Telephone: (215) 955-1159 Email:

Note: The timelines presented herein are dependent on OMB approval.

Slide 2



  • Evaluation of PQA Phase I Demonstrations:
    • Provides independent external evaluation
    • Facilitates within- and across-site comparisons
  • Core evaluation team members:
    • Joyce McMahon, PhD - Project Director (CNA)
    • Elizabeth Schaefer, MA - Associate Project Director (CNA)
    • Laura Pizzi, PharmD, MPH - Associate Project Director (Jefferson)
    • Erin Thomson, MPH - Research Analyst (Jefferson)
  • AHRQ Project Officer: Linda Greenberg, PhD

Slide 3


Key Evaluation Questions

  • What is the most effective way to collect and aggregate data on the 15 pharmacy performance measures and from the consumer questionnaire about pharmacy services?
  • What challenges, issues, and technical problems were encountered in creating and populating the template reports? How were they resolved?
  • How could the Demonstration sites have strengthened their measurement efforts?
  • How well were the reports understood by users (i.e., staff pharmacists, pharmacy managers)?
  • How can the PQA report template and the reporting process be improved (e.g., user-friendliness, comprehension, ability to act)?
  • What are the operational costs and non-financial burdens encountered in collecting the data, generating reports, and using the performance data?

Slide 4


Early Focus

  • Establish lines of communication (AHRQ, PQA, Demonstration sites)
    • Monthly teleconferences between AHRQ, CNA, Jefferson, and PQA
    • Monthly teleconferences with project leaders at each of the 5 sites to monitor progress
  • Primary data requirements
    • Instruments: Interview Guide and Pharmacist Survey
    • Both require OMB approval

Slide 5


Evaluation Components

  • Verification of findings via secondary data analysis (data already collected during the Demonstration itself):
    1. Performance reports (claims data provided by the sites)
    2. Pharmacy consumer survey (survey data provided by Avatar)
  • New information gathered via primary data:
    1. Monthly teleconferences
    2. Qualitative on-site interviews of Demonstration project staff
    3. Quantitative paper survey of field pharmacists

Slide 6


Interviews of Demonstration Staff

  • With the help of site Project Leaders, the CNA team has identified six (6) interviewees per site, representing specific job functions:
    1. Demonstration project leadership
    2. Pharmacy operations / management
    3. Analytics management responsible for oversight of performance report analyses
    4. Analytics staff assigned to complete the performance reports
    5. Information technology (IT) staff responsible for developing and/or coordinating Inter- or Intranet components of the project
    6. Senior management (executive leadership)

Slide 7


Interviews of Demonstration Staff, continued

  • Conducted in person by a two-person interview team from Jefferson
  • Duration of each interview will be approximately 1 hour
  • Total duration of each site visit will be 2 business days
  • Approximate timeline:
    • Mid-February to mid-May 2009: CNA team worked with Project Leaders to identify interviewees
    • September 2009: Site visits scheduled
    • October 2009: Site visits conducted

Slide 8


Survey of Field Pharmacists

  • Pharmacist sample currently being obtained with the help of site Project Leaders
  • Sampling parameters
    • Inclusion criterion: pharmacists are required to have participated in the PQA Phase I Demonstration project as recipients of the performance report
    • Sample size: 100 field pharmacists per site, except for sites where fewer than 100 pharmacists participated
  • Survey will require approximately 30 minutes to complete

Slide 9


Survey of Field Pharmacists: Timeline

  • Mid-July to October, 2009: CNA team works with Demonstration sites to identify sample. Also sends sampled pharmacists a formal letter* to:
    1. Inform them that they have been selected as participants for AHRQ-funded PQA evaluation
    2. Explain the purpose of the survey and anticipated time required for completion
    3. Assure them of confidentiality and encourage participation
  • Mid-October: CNA team sends the survey by mail to sampled pharmacists. The mailing will include:
    1. A cover letter* to remind pharmacists of the purpose of the survey and anticipated time requirements, and assure confidentiality
    2. The survey instrument, for completion
    3. A postage-paid envelope for returning the completed survey
  • Late October: CNA team sends a reminder letter* prompting pharmacists to complete and return the survey
*Letters will be sent on AHRQ letterhead and signed by the Project Officer, with a reinforcement message sent by email to the pharmacists by their Demonstration Project Leader(s)

Slide 10


Evaluation Constructs: Primary Data

Constructs and items covered by the Interview Guide for Demonstration Staff and the Pharmacist Survey:

  1. Respondent Characteristics
    • Name
    • Length of time in current position
  2. Organizational Background
    • Perceived importance of quality measurement within the organization
    • Existing quality measurement initiatives within the organization/agency
    • Quality measurement personnel (training, credentials, full-time equivalents) in the organization
    • Decision-making process surrounding quality measurement (which measures, which disease states, which accreditation organizations)
  3. Organizational Resources
    • Personnel qualifications and time required for data collection, aggregation, and analysis of the 15 pharmacy performance measures and the consumer questionnaire
    • Additional resources required (e.g., training, software, equipment, or tools)
    • Role of organizational leadership in supporting measures

Slide 11


Evaluation Constructs: Primary Data (continued)

Constructs and items covered by the Interview Guide for Demonstration Staff and the Pharmacist Survey:

  4. Measurement Methodology
    • What data sources were used and why
    • Method employed to collect data on the 15 pharmacy performance measures
    • Method employed to select the sample
    • Method employed to disseminate the consumer questionnaire (mail, telephone, or mail with telephone follow-up)
    • How and by whom data were analyzed
    • Time required for data collection (both in real time and in man-hours)
  5. Performance Measure Evaluation
    • Participants' perceptions regarding each of the 15 PQA quality measures as defined using NQF Measure Evaluation Criteria: importance, scientific acceptability, usability, feasibility
    • Perceptions and response rates pertaining to the pharmacy consumer survey
  6. Dissemination Process
    • Whether and how field pharmacists and other personnel were made aware of measures
    • Whether there were implementation partners or other collaborators

Slide 12


Evaluation Constructs: Primary Data (continued)

Constructs and items covered by the Interview Guide for Demonstration Staff and the Pharmacist Survey:

  7. Incentives/Penalties
    • Incentives to promote pharmacy staff participation
    • Negative consequences for pharmacy staff non-participation or non-completion
  8. Usability of Performance Reports
    • Overall usability of performance reports
    • Factors that facilitated or enabled the usability of the performance reports
    • Data or measures that do not exist but would be useful
    • Participants' overall reaction to the performance report
    • Key learnings: about medication quality measurement, about quality measurement in general, and about consumers' experiences and assessments of pharmacy plans and services

Slide 13


Evaluation Constructs: Primary Data (continued)

Constructs and items covered by the Interview Guide for Demonstration Staff and the Pharmacist Survey:

  9. Perceptions Regarding Pharmacy Quality Measurement
    • Pharmacists' perceived barriers and beliefs about pharmacy quality measurement
    • Readiness to engage in pharmacy quality measurement, in general
    • Perceived self-efficacy (extent to which pharmacists feel they can improve pharmacy quality)
  10. Future Directions
    • Recommendations to improve the report template and reporting process
    • Next steps/thoughts regarding the Phase II Demonstration:
      • Whether the organization will continue to promote existing measures, modify them, and/or introduce new ones
      • Whether the organization plans to continue use of the pharmacy consumer survey
      • Whether the organization will change its dissemination approach

Slide 14


Pilot Test of Evaluation Tools

  • The goal of pilot testing was to obtain input for refining the interview guide and pharmacist survey in terms of both content and process
  • Pilot test was completed February 2009 (prior to OMB submission)
    • Conducted with pharmacy staff from the Jefferson Health System (JHS)
      • JHS delivers both inpatient and outpatient pharmacy services through approximately 80 full-time pharmacists, with care delivered at two hospitals (totaling approximately 800 beds) on the main campus in Center City, Philadelphia
      • JHS operates 3 outpatient pharmacies, which are open to the public and are operationally similar to typical retail pharmacies

Slide 15


Pilot Test of Evaluation Tools, continued

  • Participants from the pilot site were provided with the training materials developed by one of the Demonstration sites, plus a mock performance report based on an actual de-identified pilot site report
  • There were 2 pilot test cohorts:
    • Cohort 1 (5 individuals) participated in testing the Demonstration staff interview guide:
      1. Analytics/IT management and staff
      2. Pharmacy department management
      3. Clinical pharmacy staff
      4. Quality improvement
      5. Senior management (executive leadership)
    • Cohort 2 (5 practicing pharmacists) participated in testing the pharmacist survey

Slide 16


Tasks Following Data Collection

  • Analyze all information
    • Primary data collected: interviews and pharmacist survey
    • Secondary data acquired: consumer survey and claims data
    • Information collected via teleconferences with PQA and sites
  • Prepare case studies
    • Concentrate on both within-case and cross-case studies
    • Cross-case studies will focus on generalizability of findings
  • Share findings with AHRQ, PQA, and Demonstration sites
  • Publication
    • Primary manuscript focused on evaluation findings (pooled data)
    • Submitted to major health policy or quality journal
Current as of December 2009
Internet Citation: Evaluation of Phase I Demonstrations of the Pharmacy Quality Alliance (Text Version). December 2009. Agency for Healthcare Research and Quality, Rockville, MD.



