

Ambulatory Care Quality Alliance: Invitational Meeting

Report of the Data Sharing and Aggregation Workgroup

David Kibbe, American Academy of Family Physicians
George Isham, HealthPartners

David Kibbe said the Data Sharing and Aggregation Workgroup has developed key principles for data sharing and aggregation, proposed a National Health Data Stewardship Board to set standards and operating rules for data aggregation for quality improvement, and proposed pilot projects to address key questions and methods of data sharing and aggregation.

Kibbe noted that there were a number of groups around the country that could be considered data aggregators. We're trying to be inclusive with respect to what these organizations do, he said. Kibbe also noted that the work of several workgroups was beginning to converge on issues of data collection and aggregation. The construct is working well, he said.

Kibbe set out the issues on the table for discussion at the AQA meeting:

  • Review and endorse the revised Data Sharing and Aggregation Principles for Performance Measurement and Reporting.3
  • Review and discuss the National Health Data Stewardship Board document.
  • Discuss whether and how to align the National Health Data Stewardship Board document with the recommendations of the Institute of Medicine (IOM) report, Performance Measurement: Accelerating Improvement (notably IOM's recommendation for a National Quality Coordination Board).

He also noted that the workgroup had taken concrete action on a new set of tasks: proposed pilot projects.

Next, Kibbe announced that a new subgroup on health information technology had been formed. He said the topic came up in a steering committee meeting because the issues being discussed will connect to health information technology. We don't want to be late in coming to the discussion about health information technology, he said.

Data Sharing and Aggregation Principles

The discussion on data sharing and aggregation consisted of a comment about the focus to date on getting data together and getting them out. Science and measures change over time, said the participant. Even if we do everything perfectly tomorrow, we will need a mechanism to keep us aligned in the future for commonality of reporting. Is that built into the data aggregation functions? she asked. In response, Kibbe referred her to the principles regarding the National Health Data Stewardship Board.

Motion: To approve the revised Data Sharing and Aggregation Principles for Performance Measurement and Reporting.

Result: The motion was adopted.


National Health Data Stewardship Board

George Isham referred participants to the working draft document for a National Health Data Stewardship Board. He said that the workgroup had hoped to have a final draft for approval—but that was before the December IOM report was released. He said the workgroup needed to consider how the report relates to the AQA workgroup's own activities.

Isham referred to the IOM report's executive summary. He highlighted the IOM's key recommendations for achieving a National Performance Measurement and Reporting System:

  • That Congress establish a National Quality Coordination Board (NQCB) with seven key functions.
  • That NQCB have structural independence, substantive expertise drawn from the public and private sectors, contract authority, standards-setting authority, financial strength, and external accountability.
  • That local innovation be encouraged in pursuit of national goals for improving quality, and that local efforts be aligned with national goals.
  • That NQCB promulgate measure sets building on the work of key public and private sector organizations.
  • That NQCB formulate and pursue a research agenda that supports the development of a national system for performance measurement and reporting, and that Congress provide funding to carry out that research agenda.
  • That AHRQ work with stakeholders to help identify complementary investment strategies.

In addition, Isham reviewed the seven key functions the IOM recommended for the NQCB:

  • Specify the purposes and aims of American health care.
  • Establish short- and long-term national goals for improving the health care system.
  • Designate and develop (if needed) standardized performance measures for evaluating the performance of providers, and monitor progress toward these goals.
  • Ensure the creation of data collection, validation, and aggregation processes.
  • Establish public reporting methods responsive to the needs of all stakeholders.
  • Identify and fund a research agenda for the development of new measures to address gaps.
  • Evaluate the impact of performance measurement on pay for performance, quality improvement, public reporting, and policy levers.

Isham noted that some of the IOM's recommendations overlap with the AQA's recommendations, but that the IOM's overall scope was much broader.

Discussion

The discussion opened with a comment about the usefulness of the IOM report, as it focuses on both funding and a research agenda. The participant asked, however, who would fund the front-end measurement development. In response, Carolyn Clancy said that the IOM report did not appear to address the intersection of public and private funding for quality.

The same participant then asked how the AQA's proposed board and the IOM's proposal relate. Do they overlap? he asked, adding that he believed the two proposals would need to be merged into a single board addressing the combined functions.

In response, Isham said that his workgroup was working hard on its own draft document, but was willing to bring its recommendations to the IOM. He indicated that the workgroup still needed to do a thoughtful side-by-side comparison of the two proposals. Clancy added that it was not clear whether the IOM report's proposed board would set specific standards or whether others would do that.

One participant said that the IOM's proposed board seemed to be very broad in scope. She pointed to the word "coordinating" in the title, and said it seemed that the board would coordinate Federal government activities, and activities between the public and private sectors in order to get a coherent approach to measures and to reporting across the health care system.

In addition, she said that the report implies that there needs to be a stable group to steer a course across administrations, and that it also recognizes that measures should be linked to the Centers for Medicare & Medicaid Services (CMS) payment system. She said the coordinating board could not do all these activities itself, but would have to look to the private sector or someone else to carry out some of them.

The amount of money needed to set this up won't happen anytime soon, said one participant. Are there parts of our own needs that should move forward prior to the establishment of this board?

A participant representing CMS said he would recommend going slow on this, noting that his agency believes this is a complicated area. We're struggling internally with how to develop an integrated data strategy, he said. We talk about transparency, but this report hasn't been subjected to widespread discussion. He added that the AQA was one of many forums where it was legitimate to debate the recommendations, and that it was important not to assume that the recommendations are either the best or the only options. He also cautioned that it wasn't clear that the Office of the Secretary thinks the IOM's recommendations even fit into the Department of Health and Human Services' overall strategy.

Clancy agreed, adding that the AQA needed to address resource issues in order to generate congressional interest. She also stressed the need to address funding and costs.

We're in a good position, said Isham, as we have a proposal to work with that takes into account funding and a national infrastructure.

Kibbe, meanwhile, reminded participants about the process that led to the concept of a National Health Data Stewardship Board. He said the workgroup started formulating principles for data aggregation and then considered putting the principles into practice.

Kibbe noted that one idea that arose was to create a national aggregator, but that the idea "went down in flames" within the workgroup. The consensus, he said, was that data aggregation is both a public and a private sector responsibility. He added that the IOM's recommendations hark back to a government-centered approach.

One of the important tasks facing us and the Hospital Quality Alliance (HQA), observed one participant, is the reality that we all have to do things slightly differently. That's a real leadership task, she said. She added that, at least initially, the AQA was the best entity to accomplish this.

Another participant said that a wide range of opinions would start to surface as Congress evaluates the IOM report. She suggested that it would be a mistake for the AQA to stop its work while that process was ongoing, recommending instead that the AQA move forward. She also suggested the need for AHRQ to reach out to HQA to gauge HQA's interest in joining some of the ambulatory care pilot projects. Another participant also recommended moving forward.

One participant said he wasn't sure how the IOM's proposed NQCB would operate in relation to the many individual aggregators. Will this board set guidelines, do research to inform its processes, and set standards for risk aggregation? he asked. In response, Isham said that there were a lot of outstanding questions that would need to be discussed going forward.

Proposed Pilot Projects

George Isham reminded participants about the goals for the pilot projects:

  • Measure individual physician, group, and system performance.
  • Aggregate data from multiple sources.
  • Generate reports following each of the AQA workgroup principles and parameters documents.
  • Address the questions framed by the AQA Proposed Pilot Projects document.
  • Generate real-time implementation experience and lessons learned.

AHRQ's Nancy Wilson reviewed the criteria for the selection of pilots:

  • Assess clinical quality, cost of care, and patient experience.
  • Understand structural capacity as a covariate in assessing physician performance.
  • Collect and aggregate Medicare claims data and private sector data from multiple sources, and, where possible, Medicaid data.
  • Explore both existing and new methods for collecting, submitting, and sharing data from physicians' medical practices.
  • Leverage the experience of existing aggregation efforts.
  • Disseminate measurement information.

The deadline for responses was December 29. Wilson said the workgroup had looked at 12 applications and come up with three to five communities that had existing active coalitions and met several other criteria as well. If we had unlimited funding, we would fund them all, she said.

The workgroup proposed starting by funding three sites. The three grantees are Massachusetts Health Quality Partners, the California Cooperative Healthcare Reporting Initiative, and the Wisconsin Collaborative for Healthcare Quality.

Now, said Wilson, we need to do in-depth work to understand the needs of the three sites and to figure out how to do this work most efficiently.

Discussion

The discussion opened with a question about how the pilots would be run. In response, Nancy Wilson noted that while the organizations chosen had good skill sets, there was a lot of work to be done moving forward. She stressed that the aim of the pilot projects was to learn as much as possible as quickly as possible to inform ongoing initiatives (such as CMS' initiative on voluntary physician reporting). To do this, we had to select organizations that already had infrastructures in place, she said.

One participant noted that funding constraints limited the AQA's initial activities to three pilots and wondered whether it would be possible for the AQA to endorse other efforts in principle. In response, Carolyn Clancy said that it wasn't clear what an AQA endorsement would mean. She added that it is ideal when the scores cluster, but that they had not this time. She asked participants for their sense of whether more diversity was needed with regard to the pilot projects.

David Kibbe warned against seeking perfection at the expense of moving forward. He stressed that an effective data-sharing model requires useful data. The big problem from a physician and aggregator perspective, he said, is that we cannot get a really useful data set without combining data from different health plans. Kibbe said that the combination of government and private data in one data set would be a big step forward. We're going with the organizations that are almost there in terms of disseminating information across the country, he said.

Kibbe also noted that there was consensus around the idea that billing data alone were insufficient for assessing quality and efficiency. With electronic records and the Web, he said, we have new opportunities. We need to test some of these ideas and learn from them.

One participant discussed the need to distinguish between AHRQ and Centers for Medicare & Medicaid Services pilot projects and those undertaken by the AQA. We also need to answer the questions AQA thinks are important, he said. He noted that there were 12 questions with 83 sub-questions, and he suggested revising the questions to provide guidance on physician performance aggregation efforts. He also suggested that, as there were more than just the 12 applicants doing this kind of work, AQA should go out to the various organizations and see what they're doing and whether they would be willing to share their results.

Clancy wondered how, if a physician group wants to play by the same rules, AQA could support that and make it meaningful. She suggested that more work needed to be done before AQA votes on any endorsements.

I'm not clear what we would be voting on, said one participant. What happens when you bring in Medicare and Medicaid data? he asked. Is it clear that every pilot site will answer these questions in language we can read? He also asked what would happen if the reports didn't follow these principles.

Regarding next steps, Wilson noted that representatives from the chosen sites would meet with a technical advisory group. That group, she said, would work through which tasks should be handled at the site level and what the sites should be doing. She noted that it wasn't yet clear which tasks would be done at all sites and what else the individual sites would take on. Wilson noted that there were also questions about costs and funding.

One participant stressed that there was a push to expand the pilot projects beyond the three initially selected sites and that the workgroup was reaching out for additional funding.

Motion: To make clear that the suggested questions to be answered by the pilots are intended to provide guidance, not to serve as a directive.

Result: The motion was adopted.


Motion: To add geographic diversity within the pilot projects if and when more funding becomes available.

Result: The motion was adopted.


Isham then asked participants for final remarks. The discussion quickly turned to the process used to select sites. Although you're being purposely secretive, said one person, we want to know that the selection process was thoughtful and systematic. Wilson pointed out that the materials before participants included scoring methods, and another workgroup member stressed that the top three sites had clearly differentiated themselves from the rest.

Without revealing who sites four, five, and six are, said one participant, it would be useful to know what the material difference was between these sites and the three selected. She noted that this could speak to a systematic gap in capability that should be considered in future discussions.

There were also a couple of questions about how and when the awards would be publicly announced. Although the AQA meeting is public, a workgroup member requested that people not speak to reporters until a formal announcement was made. We want the biggest splash possible, he said. Clancy also stressed that it was important to release the information in a way that people will understand.

We're working with "blinding speed" to get our document together, added Isham.

Another person noted that the top score on the grid was 45 and asked how the top three sites had scored. Clancy replied that the top three scored between 42 and 45.

One participant said that while she heartily endorsed the selection criteria, she wasn't sure she would have come to the same conclusions. She pointed out that the three sites selected had been at work for some time while other initiatives were just getting underway.

Motion: To endorse the criteria for selecting pilot sites.

Result: The motion was adopted with several abstentions.


2006 Goals

David Kibbe highlighted his workgroup's goals for the coming year:

  • Pilot test data sharing and aggregation principles.
  • Evaluate the National Health Data Stewardship Board proposal in relation to the IOM's recommendations.
  • Monitor the outcomes of the Phase I pilot projects.
  • Discuss Phase II pilot projects.

Kibbe then asked if there should be any other objectives added to the list.

Discussion

Carolyn Clancy said that it would be useful to allow input from the pilot projects to refine the goals of the National Health Data Stewardship Board. Another participant agreed, suggesting that the results from the pilot projects could also inform the AQA's analysis of the IOM report. Kibbe suggested that the latter goal needed to be taken up by the steering committee, as it was too broad for his workgroup.

Another participant asked Kibbe if he would also look at roll-up measures in 2006. Yes, replied Kibbe.

One participant warned against assuming that the IOM report was the gold standard for comparison. Another suggested that AQA could perform many of the same tasks, and that these efforts would be well informed by having standards to follow.

Shouldn't AQA send a message that we believe there should be industry standards, and that people should consider how they are reporting information based on these standards? asked one participant. In response, Clancy stressed that a number of diverse organizations (including physician organizations and consumer groups) were at the table together at the same time. Kibbe added that perhaps a media or public relations campaign was needed to start to talk about what the AQA is doing on data aggregation.

There was some discussion about how to publicize the AQA's work. Suggestions included publishing an interim report and creating an AQA first anniversary packet of materials.

Carolyn Clancy said that how to communicate about the AQA's work needed to be on the agenda for discussion at the next meeting. She then thanked participants for their attendance and closed the meeting.


3. A copy of the revised Data Sharing and Aggregation Principles for Performance Measurement and Reporting, along with the other documents discussed by the Workgroup on Data Sharing and Aggregation, is available at http://www.ambulatoryqualityalliance.org/january12meeting/datasharing.


Current as of March 2006




Internet Citation:

Ambulatory Care Quality Alliance: Invitational Meeting, Summary. March 2006. Agency for Healthcare Research and Quality, Rockville, MD. http://www.ahrq.gov/qual/aqamtg.htm


 
