(This article also appears in the August issue of EMS Insider. EMS Insider, the premier publication for EMS managers, supervisors, chiefs and medical directors, is a must-have resource for the critical, accurate information EMS leaders need. The monthly publication offers quality investigative reporting, exclusive articles, management tips and the very latest news on legislative issues, grants, current trends and controversies. For more about how to become an Insider, go to www.jems.com/ems-insider.)

What value does EMS provide to patient outcomes? Everyone has their own ideas and anecdotal evidence, but how many EMS systems can actually prove their worth? In the brave new world of healthcare reform, EMS agencies will need hard data that not only measures their performance, but is used as the basis for the development of consistent quality improvement and strategic planning. As sensible as that sounds, it still remains a rarity throughout most of EMS.

The notion of measuring quality in EMS isn’t new. In 2006, the Institute of Medicine published a report, “Emergency Medical Services at the Crossroads,” recommending the development of “evidence-based performance indicators that can be nationally standardized so that statewide and national comparisons can be made.”1

The challenge has been—and continues to be—the lack of uniformity in data collection and inability to agree on specific performance indicators.

Setting the Bar for the State
Recently, California became the first state to establish a statewide, standardized set of core performance measures that will be used to examine an EMS system and evaluate the treatment of an identified patient condition.

What’s new is the scale, notes Dan Smiley, chief deputy director of the California Emergency Medical Services Authority (EMSA) and one of the project leaders. “We knew we wanted data, but we had to find a standard [metrics] system to measure systems across California. We feel this is a good first pass,” he says.

The purpose of the California EMS System Core Quality Measures project is to increase the accessibility and accuracy of prehospital data. In turn, the data can be used for public, policy, academic and research purposes to facilitate EMS systems evaluation and improvement. By creating a benchmark for performance, EMS agencies in California can not only measure the quality of care provided by the system, but also determine if the recommended treatments are providing the best results for the patients based on their condition or illness.

Eventually, California consumers will be able to compare EMS systems, regardless of where they live, based on timely response, initiation of evidence-based interventions and transport to an appropriate hospital, among other measurements.

The California HealthCare Foundation, a nonprofit that works as a catalyst to improve healthcare in the state, allocated grant money to assist EMS agencies in their efforts to apply the data in ways that ultimately improve the quality of patient care delivered.

Interestingly, the California standards measure process data rather than outcome indicators—an important distinction for quality improvement. Process data assesses the individual steps providers take in patient care. For example, process measurements log whether or not the EMS crew provided aspirin to the patient with chest pain. Outcome indicators, on the other hand, evaluate the change in the health status of the patient. Did the patient survive the cardiac event?
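The distinction above can be made concrete with a short sketch. This is purely illustrative—the field names and sample records are assumptions, not EMSA's actual data schema—but it shows how the same set of chest-pain runs yields a process measure (was the recommended step taken?) and an outcome indicator (what happened to the patient?).

```python
# Hypothetical EMS run records for chest-pain calls.
# Field names are illustrative assumptions, not the EMSA data dictionary.
runs = [
    {"id": 1, "aspirin_given": True,  "survived": True},
    {"id": 2, "aspirin_given": False, "survived": True},
    {"id": 3, "aspirin_given": True,  "survived": False},
    {"id": 4, "aspirin_given": True,  "survived": True},
]

# Process measure: how often did providers perform the recommended step?
aspirin_rate = sum(r["aspirin_given"] for r in runs) / len(runs)

# Outcome indicator: how did the patients' health status change?
survival_rate = sum(r["survived"] for r in runs) / len(runs)

print(f"Aspirin administration (process): {aspirin_rate:.0%}")  # 75%
print(f"Survival (outcome): {survival_rate:.0%}")               # 75%
```

Note that the two rates can move independently: a system can score well on process measures while outcomes lag, which is why the article treats the distinction as important for quality improvement.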

Project History
EMSA is responsible for oversight of EMS in California. One area of focus is data collection and evaluation, which gives EMSA the authority to implement quality improvement guidelines. However, it doesn’t have statutory authority to regulate agencies and can’t force an EMS system to submit to the guidelines it creates.

In August 2011, EMSA hired Howard Backer, MD, MPH, FACEP, as director. “He immediately observed that EMSA had a profound lack of data and information to describe EMS systems within California,” Smiley says. Less than six months after Backer’s arrival, he approached the California HealthCare Foundation with a grant proposal to improve EMSA’s data collection system.

The grant funded an investigation of core measures for hospitals and an examination of what could be realistically expected from EMS systems. Smiley took on the responsibility of identifying the metrics to describe EMS system performance. Using a set of quality indicators developed through a project by the National Quality Forum and the Performance Measures for EMS, published by the National Highway Traffic Safety Administration (NHTSA), he created a standard set of core measures.

A task force of leaders made up of medical directors, prehospital providers and local EMS agencies representing private and public systems, was created in the fall of 2011 to make recommendations.

Establishing the core measures prior to the first task force meeting gave the team a kick-start, Smiley says. “We were under a one-year time frame to get this project completed. We didn’t have time to go through the normal deliberative process of starting from scratch,” he says.
The task force accepted 80% of the core measures Smiley proposed. “These were realistic, low-hanging fruit for core measures that everyone could get behind,” he says. “We wanted to create a culture and change how we do business to ensure that local EMS is looking at a standard set of care metrics to see how they are doing—whether systems are improving or not.”

The final list, published in the California EMS System Core Quality Measures in March 2013, includes:

Remarkably, the entire project from concept to completion took less than two years. A formal data system profile was identified; short- and long-term data improvement plans were established; time frames were set for achieving reliable measures; and core measures were defined and submitted to the National EMS Information System (NEMSIS) for evaluation. To introduce the project to EMS agencies, three data workshops were held across the state.
EMSA is currently in the process of reviewing data submitted for a one-year period, from April 1, 2012, to April 30, 2013. Preliminary data is scheduled to be released this month.

Los Angeles County EMS
Los Angeles County EMS, one of the largest EMS systems in the nation, will be submitting data to EMSA. The system includes more than 18,000 certified EMS personnel employed by fire departments, law enforcement, ambulance companies, hospitals and private organizations providing prehospital care to more than 10 million residents and visitors.

According to EMSA task force member and LA County EMS Director Cathy Chidester, RN, the request for data wasn’t a huge undertaking for her agency because they already had a history of collecting data for a quality improvement program. Although most of the metrics required by EMSA are being collected by her agency, a couple of modifications were needed to fulfill the request.

“We’re excited the state did this initial step. We support the state’s effort in getting this data. It’s a long time coming,” she says. One of the benefits is to provide supervisors with oversight of the quality of patient care provided in the field. “Not all departments have the luxury of that,” she says.

As more systems transition to electronic patient care reports, she hopes that data collection will be simplified. The key, she says, is data management. “It’s very complicated,” she notes. “Managing the sheer amount of data received is no easy task. At LA County EMS, 600,000 data forms are collected annually. A total of 10 to 15 people assist in one way or another in the management of the data collected.”

One advantage of having good data is that hospitals are more willing to share their data if there’s a reciprocal reporting opportunity. “Hospitals are beginning to see EMS as healthcare providers. It’s a real eye-opener [for them to see that] what is done in the field affects patient outcomes,” she says. “They’re starting to see that now.”

Barriers & Limiting Factors
Smiley says the core measures are meant to be simple and achievable, and to have a direct impact on improving EMS systems. Much of what’s measured is already being done; it simply needed to be benchmarked for compliance.

Still, not all EMS agencies are currently capable of gathering the data needed to do an effective job of benchmarking and improving systems. For the first two years of the project, EMSA will request that EMS agencies only submit retrospective data they glean from patient care reports. “We can move to be more sophisticated once we give local EMS agencies six months to a year to understand what we are asking of them,” he says.

The biggest challenge facing the task force is converting to NEMSIS 3.0. Smiley says the specifications will need to be changed to comply with the new data dictionary beginning in January 2014. “It’s going to be costly and time consuming,” he says.

EMSA still faces concerns from providers who fear that publicly reported data could reflect negatively on their system. “We feel strongly that transparency is critical, especially as it relates to EMS,” Smiley says. Some private EMS agencies perceive this kind of transparency as a potential risk to their contracts, while some fire departments have expressed concern that, for the first time, they may be held accountable. “We are taking the approach that this is the way the rest of healthcare is going. It’s the wave of the future. Why is EMS special?” he says.

Rather than being afraid of what the data may reveal, Smiley says he’s encouraged. “I think we have a great story to tell. The data ultimately will prove that. Unfortunately, [EMS agencies] have not been data-driven as a whole.”

Because EMSA won’t receive valid data from all 32 California EMS agencies this year, there will be no statistical significance to trend data, although he hopes to identify some overall trends. “We won’t be able to say in a definitive way what it means,” he says.

A wrinkle still to be worked out is the duplication of records. In some EMS systems, data is provided from both the fire service responders and ambulance transport agency for the same patient. The task force is examining ways to create a single record for each patient. “Our philosophy is to try to capture the continuum of care to the hospital,” Smiley says.
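One simple way to think about the deduplication problem above is record linkage on a shared incident identifier. The sketch below is an assumption-laden illustration—the task force's actual matching rules, field names and identifiers are not described in the article—but it shows how a fire first-responder record and an ambulance transport record for the same patient might be folded into a single record covering the continuum of care.

```python
# Illustrative only: keys and fields are hypothetical, not EMSA's linkage method.
fire_records = [
    {"incident": "2013-0001", "agency": "fire", "on_scene": "08:02"},
]
transport_records = [
    {"incident": "2013-0001", "agency": "ambulance",
     "depart_scene": "08:19", "destination": "County General"},
]

def merge_by_incident(fire, transport):
    """Fold duplicate records for the same incident into one patient record."""
    merged = {}
    for rec in fire + transport:
        key = rec["incident"]
        combined = merged.setdefault(key, {"incident": key, "sources": []})
        combined["sources"].append(rec["agency"])
        for field, value in rec.items():
            if field not in ("incident", "agency"):
                # First agency to report a field wins; later duplicates are ignored.
                combined.setdefault(field, value)
    return list(merged.values())

records = merge_by_incident(fire_records, transport_records)
print(records)  # one combined record drawing on both agencies
```

In practice a shared incident number rarely exists across agencies, which is why real record linkage typically falls back on probabilistic matching of times, locations and patient identifiers—a much harder problem, and part of why Smiley calls this a wrinkle still to be worked out.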

Eventually, individual patient data will be matched with outcome information from the hospital. “Less than one-third of EMS agencies have patient outcome data for cardiac arrests. That’s a huge policy issue for us,” he says.

Summary
Smiley doesn’t have high hopes for quality data or lots of trend reporting this year. Retrospective data has its limitations. “It’s highly variable,” he says. “While some of the data is good, some of it is not as valid as it should be.” Instead, according to Smiley, the goal for this first year is to identify data collection capabilities. “The numbers are less important this year,” he says. It’s more important to know who can submit data and what they can report. “That’s where we are at right now.”
As the task force receives feedback, the plan is to refine the questions for next year. In two years, they will be collecting prospective data.

Comparing EMS systems across the state is a significant accomplishment. “We need to have this critical information to make decisions about our systems,” Smiley says. “We all say how important data and metrics are for clinical care, yet how many are actually doing it?”
A copy of the California EMS System Core Quality Measures and the NHTSA National Quality Forum and the Performance Measures for EMS can be found in the document repository on the EMS Insider website at www.emsinsider.com.