Healthcare industry questions data integrity

Clinicians, technologists 'define things differently'.

Healthcare industry members have called for better data about hospital admissions, processes and outcomes to inform the design of future emergency departments.

According to Paul Barach of the US Center for Health Design, poor data integrity hindered the development of e-health systems.

“The problem is clinicians and researchers and IT people define things differently,” he told the Digital Hospital Design Conference in Brisbane this week.

“That’s a big challenge. It means that it’s very difficult to synchronise and standardise timetables, and it’s no surprise that – as in other places – people game the system.”

Barach observed that ‘data fudging’ occurred in hospitals, where staff reported on ambiguous situations, such as how long ago a patient was likely to have hit their head.

In addition, policy makers, clinicians, and technologists had different definitions of metrics like bed occupancy, and there was “a bit of distress going on about how the data will be used”.
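The article does not spell out how those occupancy definitions differ, but a minimal sketch (with hypothetical figures) shows how two common conventions can diverge on the same ward and day: a midnight census counts beds occupied at one instant, while an averaged calculation counts occupied bed-hours across the whole day.

```python
# Two common ways to compute bed occupancy for a hypothetical 100-bed ward
# on one day. The figures are illustrative, not from the article.

BEDS = 100

# Definition 1: midnight census -- beds occupied at a single point in time.
occupied_at_midnight = 80
midnight_occupancy = occupied_at_midnight / BEDS  # 0.80

# Definition 2: daily average -- occupied bed-hours / available bed-hours.
# Same-day patients admitted and discharged between censuses raise this
# figure without ever appearing in the midnight count.
occupied_bed_hours = 80 * 24 + 30 * 8  # 80 overnight patients + 30 stays of 8h
average_occupancy = occupied_bed_hours / (BEDS * 24)  # 0.90

print(f"midnight census: {midnight_occupancy:.0%}")
print(f"daily average:   {average_occupancy:.0%}")
```

A hospital reporting the first figure and a policy maker expecting the second would disagree by ten percentage points while both "telling the truth".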

Barach highlighted the issue of “DRG (diagnosis-related group) creep” in the US, where hospitals were paid for treating patients in accordance with the group their illnesses fell within.

DRGs were introduced in 1983; in the following years, hospitals reported treating an increasing proportion of cases in DRGs that attracted more funding.

Barach argued that the issue was not one of honesty, but of ambiguity, which “leads to a whole series of misunderstandings”.

Clinicians tended to prioritise interacting with patients over reporting, so by the time they reported a case it might be minutes or hours later, he noted.

“Memory is flawed … people interpret facts very, very differently but they’re all telling the truth. Truth is very context-dependent,” he said.

Fudging in Australia

Few conference attendees admitted to ‘data fudging’, but several acknowledged that it occurred and posed a problem.

Last month, the NSW Bureau of Health Information (BHI) announced that it would exclude emergency department waiting times from its September 2011 report due to concerns with how data was recorded.

“A bureau analysis suggests there may be differences across hospitals in the way emergency department information is recorded,” it stated at the time.

“Such differences in emergency department data could make it difficult to accurately compare hospitals.”

BHI’s decision was welcomed by Sally McCarthy, president of the Australasian College for Emergency Medicine, who noted that emergency department statistics had become politically charged.

“There’s some aspects of the data that’s likely to be so unreliable as to cause the BHI to look at the whole data set and say, ‘how can we improve the reliability and consistency’,” Dr McCarthy said.

“That’ll be welcomed by clinical staff and emergency departments because it supports more accurate and transparent data.”

Data to support emergency ‘pit crews’

Accurate data also underpinned futuristic emergency departments described by Deloitte consultant Katerina Andronis and David Hansen of the Australian e-Health Research Centre.

The centre, funded by the CSIRO and Queensland Government, was developing software for forecasting how many patients a hospital would likely need to admit in the next hour, day, or week.

The so-called Patient Admission Prediction Tool (PAPT) relied on estimates of a hospital’s ‘optimum occupancy level’, calculated with data provided by the hospitals.

It combined techniques used in hotel management to enable hospitals to manage resources including beds, staff, and surgery scheduling.
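The article gives no implementation detail for PAPT, but a forecaster of this kind could be sketched as a simple historical-average model: estimate expected arrivals for a given hour from past counts for the same weekday and hour. All names and figures below are illustrative assumptions, not PAPT's actual method.

```python
from collections import defaultdict
from statistics import mean

# Illustrative sketch of an admission forecaster in the spirit of PAPT:
# average historical arrival counts keyed by (weekday, hour). This is an
# assumed approach for illustration, not the tool's real algorithm.

class AdmissionForecaster:
    def __init__(self):
        # (weekday, hour) -> list of observed admission counts
        self.history = defaultdict(list)

    def record(self, weekday, hour, admissions):
        """Record the number of admissions seen in one historical hour."""
        self.history[(weekday, hour)].append(admissions)

    def predict(self, weekday, hour):
        """Forecast admissions for the given weekday/hour from history."""
        counts = self.history.get((weekday, hour))
        return mean(counts) if counts else 0.0

# Hypothetical usage: three past Mondays at 10:00 saw 12, 15 and 9 admissions.
forecaster = AdmissionForecaster()
for n in (12, 15, 9):
    forecaster.record("Mon", 10, n)

print(forecaster.predict("Mon", 10))  # 12.0 expected admissions
```

Rostering beds and staff against such an expectation, rather than against yesterday's queue, is the hotel-style resource management the article describes.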

Andronis called for IT systems that would enable emergency teams to operate efficiently and effectively – like a “pit crew in racing”.

Those included well-defined protocols and checklists, a standard set of tests and care bundles, and electronic medical records to support clinicians’ decision making.

As a result, she hoped hospitals would be “patient-centric”, with patients spending less time in waiting rooms, being moved around less, and feeling safer.

According to the Center for Health Design’s Dr Barach, accurate hospital data required stakeholders to be more transparent, agree on definitions, and be held accountable for adhering to those definitions through incentives or education programs.

“I think a hospital is one of the most complex spaces in all of society … that means it’s going to take many, many years to get this right,” he said.

“The building blocks are standardisation of definitions, a culture of truth, a culture of transparency, and engagement of the users by respecting them.”