Hospitals encountered major problems collecting quality data electronically to meet the requirements of Meaningful Use stage 1, according to a report prepared for the American Hospital Association (AHA). Unless there are significant policy changes, the report warned, those difficulties will continue in MU stage 2.

This is not the first time the AHA has complained about the quality reporting criteria for Meaningful Use. Commenting on the proposed rule for MU stage 2 in April 2012, the AHA told the Centers for Medicare and Medicaid Services (CMS) that hospitals had had "significant difficulty" using EHRs to do quality reporting. But this report goes into much greater detail about what those difficulties were and shows that even institutions with considerable EHR experience had big problems.

Researchers interviewed executives and operational personnel from four hospitals to gather information for the report. They included large and small, urban and non-metropolitan facilities that had implemented their EHRs five to 10 years earlier.

The surveyed hospitals had high hopes for electronic quality reporting. In federal programs alone, they were required to report on 90 different measures, and the ability to generate quality data automatically out of clinical workflow promised to decrease that burden. But such was not to be.

"Based on the experiences of the hospitals in this case study, the current approach to automated quality reporting does not yet deliver on the promise of feasibility, validity and reliability of measures or the reduction in reporting burden placed on hospitals," the report said.

In Meaningful Use stage 1, hospitals were required to report on 15 electronic clinical quality measures (eCQMs) that together encompassed more than 180 data elements. The known challenges going into the project included:

-- Modification of existing measures without robust testing to determine whether all the necessary data were available in the EHRs;

-- Known errors in the eCQMs as specified by CMS;

-- Lack of a mature e-specification development and updating process.

As they began their implementations, the hospitals discovered that much of the required data had not been captured in the required format. To capture the data so that the quality measures could be automatically populated, the facilities' IT departments and vendors had to modify the EHRs, in some cases adding new structured fields. "The inflexibility of the eCQM reporting tools" supplied by EHR vendors was partially responsible for this onerous task, the researchers said.

Organizations with integrated systems that used the same database across departments had less trouble than those in which there was little or no interoperability across departmental systems. In the latter case, staff had to manually enter data from other systems into the system being used to extract data for the eCQMs.

Across all the institutions, 80% of the effort "entailed changes to hospital workflow solely to capture eCQM data," the report noted. This didn't contribute to patient care and diverted efforts from other important hospital initiatives.

Despite concerted efforts, none of the organizations were able to validate their results fully. Some hospitals were able to perform technical validations to verify that all the data required by the eCQM reporting tool could be captured in a discrete format in the EHR. But they were unable to verify the extent to which clinicians had entered the discrete data used by the eCQM reporting tools.
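The "technical validation" the hospitals performed amounts to a completeness check: confirm that every data element an eCQM needs exists as a discrete, structured field in the record. A minimal sketch of that idea in Python (the field names here are hypothetical, not taken from any actual eCQM specification):

```python
# Hypothetical sketch of a technical validation step: verify that every
# data element an eCQM requires is captured as a discrete (structured)
# field in an EHR record. Element names below are illustrative only.
REQUIRED_ELEMENTS = {
    "admission_time",
    "discharge_disposition",
    "vte_prophylaxis_given",
}

def missing_elements(ehr_record: dict) -> set:
    """Return the required eCQM data elements that are absent
    from the record or present only as empty/unstructured values."""
    return {name for name in REQUIRED_ELEMENTS
            if ehr_record.get(name) is None}

record = {
    "admission_time": "2013-05-01T10:00",
    "discharge_disposition": "home",
    # "vte_prophylaxis_given" was never entered as structured data
}
gaps = missing_elements(record)
```

A check like this can confirm the fields *can* hold the data, which is exactly as far as the hospitals got; it cannot tell you whether clinicians actually entered the discrete values, which is the validation gap the report describes.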

Three of the organizations used a "staff-intensive and unsustainable concurrent review process to encourage documentation directly by nurses or order-entry by physicians," the report said. "The accuracy of the data used for eCQM calculation is dependent on staff review of entries in the EHR and manual data input … Organizations either spent considerable time in re-work to revise and validate the eCQM measurement process … or chose to ignore the results in favor of those derived from the chart-abstracted versions of the measures."

The report made several policy recommendations:

-- Slow the pace of the transition to electronic quality reporting, starting in MU stage 2, with fewer but better-tested measures.

-- Make EHRs and eCQM reporting tools more flexible so data capture can be aligned with workflow and interoperable so that data can be shared across departmental systems.

-- Test eCQMs for reliability and validity before adopting them in national programs.

-- Provide clear guidance and tested tools to support hospitals' transition to increased electronic quality reporting requirements. Among other things, the report suggested, the government should create a reliable, validated crosswalk from SNOMED-CT, the language used in the eCQMs, to the new ICD-10 diagnostic codes.
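The crosswalk the report calls for is, at bottom, a validated lookup from one code system to another: each SNOMED-CT concept used in an eCQM maps to one or more ICD-10 codes. A minimal sketch in Python, assuming a simple table-based mapping (the two entries shown are illustrative pairings, not a validated clinical map):

```python
# Sketch of a SNOMED-CT -> ICD-10 crosswalk as a lookup table.
# The entries are illustrative examples, not a validated mapping;
# a real crosswalk would carry thousands of reviewed concept pairs.
SNOMED_TO_ICD10 = {
    "44054006": ["E11.9"],  # Type 2 diabetes mellitus
    "38341003": ["I10"],    # Essential (primary) hypertension
}

def crosswalk(snomed_code: str) -> list:
    """Return the ICD-10 codes mapped to a SNOMED-CT concept ID,
    or an empty list if the concept is unmapped."""
    return SNOMED_TO_ICD10.get(snomed_code, [])
```

One reason the report wants this built and validated centrally: a SNOMED-CT concept can map to zero, one, or several ICD-10 codes, so every hospital inventing its own table invites exactly the inconsistency the eCQMs were meant to eliminate.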

The release of the report coincides with the AHA's statement to the Senate Finance Committee urging the Department of Health and Human Services to extend the timelines for Meaningful Use stage 2.

With these findings and the barrage of reports CMS is receiving regarding an extension of Meaningful Use Stage 2, I wonder how long it will take for the agency to realize that the healthcare system may not be ready for this yet. With the problems and kinks of MU Stage 1 still not completely worked out, I don't think we are ready to take on MU Stage 2 and its new set of requirements.
