
Original Contribution

Measure Hunting

John Erich
December 2010

   Everyone likes to think they put patients first. But in reality, few aspects of most EMS systems are truly designed with the patient foremost in mind.

   "In EMS, the patient has never really been the center of any of its design," says consultant Dave Williams, PhD, late of Fitch & Associates and now working with the Institute for Healthcare Improvement. "And there's not a lot of evidence to say what's the right thing in most situations. So most of the design you see in the industry has been developed around personal experience or opinion."

   Williams spent several years studying patient-centrism in the field--what it is and what inhibits it--as part of his doctoral research. He described his findings at the Pinnacle EMS Leadership & Management Conference in July 2010.

Defining Patient-Focused

   What constitutes a patient-centric system? Through review of scientific journals, trade publications and historical documents, Williams identified 15 defining factors--six related to system design, nine to operational practices:

System design features

  • Public intervention--Training and equipping laypersons to deliver needed actions (e.g., CPR, PAD);
  • No call screening--9-1-1 calls are triaged but not screened;
  • Demand-based deployment--Managing resources to correspond to demand;
  • ALS--A paramedic responds to every call;
  • Full service--Systems provide both emergency response and routine services like interfacility transports;
  • Alternative transport destinations--Patients can go to clinics, doctors' offices and other facilities when they're more appropriate than EDs.

Operational practice features

  • Response time reliability--Calls are dependably answered in a timely way;
  • Reduced call time--Especially for things like trauma, stroke and STEMI;
  • Balanced scorecard--Taking measures that yield a broad view of an organization's overall performance;
  • Outcome-based performance measurement--Measures demonstrating a system is achieving the results it aspires to;
  • Customer satisfaction measurement--Evaluating what patients think of their experiences;
  • Quality improvement--Gauging the effectiveness of changes to care or processes;
  • Economic efficiency--High-quality service at the lowest possible cost;
  • Preparedness--For terrorist acts, natural disasters and other large-scale events;
  • EMS health monitoring--Helping patients through proactive assessment and care (e.g., evaluation of risks in homes, wellness and medication-compliance checks, etc.).

   Williams then zoomed in on five EMS system types--fire department, third service, private, hospital-based and public utility model--as case studies, reviewing internal data and documents and quizzing leaders about the obstructions they faced in these 15 areas. Their answers spanned a predictably broad gamut but were grouped into 38 general categories.

   The top five resulting obstacles were:

  • Cost/funding;
  • Data measurement;
  • Process and outcome focus (i.e., defining what a system wants to achieve, then figuring out how to go about it);
  • Systems view or design (i.e., interdependence on other entities); and
  • Public information and education.

   Cost/funding and process/outcome focus appeared among the top five obstacles to both the system design and the operational practice features, suggesting they are the two primary overall obstacles to designing more patient-centric EMS systems.

Process and Outcome

   Cost is a more complex concept than you might think, and Williams had some important findings in this area. But the issue of process and outcome focus can be even harder to wrap your mind around, though it's relevant to every EMS system out there.

   Determining exactly what you want to accomplish, and then how to accomplish it in an evidence-based, data-driven way, isn't always easy.

   "The biggest issue is that people just don't measure things," says Williams. "And when they do measure them, they measure for judgment, not learning. In the vast majority of systems I've seen, if they show me data of any kind, it's almost always in a table. It's a fixed number, generally not very frequent--monthly, quarterly, annually, whatever--and rarely presented in a way where it provides much knowledge. Rarely do I see a run chart or a Shewhart chart. People really can't learn from data like that.

   "That's the first thing: You have to measure stuff," Williams adds. "When you start measuring, all of a sudden you have something to go at. Then the second piece is looking at that and saying, 'How do I change the process to change the numbers? How do I improve both the outcome and its reliability?'"

   There are places where EMS does this pretty well. One is response times. They're easy to measure, even in component segments, and having that data facilitates interventions to compress those segments and squeeze overall times down. On the clinical side, another is cardiac arrest. Survival rates are a measurable, objective outcome, and the process for improving them has been well established.
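
   As an illustration of measuring response times in component segments, the brief sketch below derives call-processing, turnout and travel intervals from CAD timestamps. The record layout and field names are hypothetical, not drawn from any particular CAD system.

from datetime import datetime

# Hypothetical CAD record for one call (field names are illustrative only)
call = {
    "call_received":  "2010-12-01 08:14:02",
    "unit_dispatched": "2010-12-01 08:14:55",
    "unit_enroute":   "2010-12-01 08:15:40",
    "unit_on_scene":  "2010-12-01 08:22:10",
}

def seconds_between(start, end):
    fmt = "%Y-%m-%d %H:%M:%S"
    return (datetime.strptime(end, fmt) - datetime.strptime(start, fmt)).total_seconds()

# Component segments of the overall response interval
segments = {
    "call processing": seconds_between(call["call_received"], call["unit_dispatched"]),
    "turnout":         seconds_between(call["unit_dispatched"], call["unit_enroute"]),
    "travel":          seconds_between(call["unit_enroute"], call["unit_on_scene"]),
}

for name, secs in segments.items():
    print(f"{name}: {secs:.0f} s")
print(f"total response: {sum(segments.values()):.0f} s")

   Run over a whole CAD export rather than one record, the same arithmetic shows which segment is eating the clock, and therefore where an intervention can compress the total.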

   Williams contrasts the gaudy survival rates for witnessed v-fib arrests in places like Seattle/King County, WA, and Rochester, MN, with the minuscule numbers of some other major urban areas.

   "That difference isn't from having less-sick people," he notes. "It's purely process. Everyone has Lifepaks with defibrillators and drugs and people who know CPR. But Rochester and Seattle spent a lot of time designing their processes and improving their measures. They really focused on that."

   In his research, Williams found that the numbers and types of measures people used to guide management of their systems were "strikingly small" and failed to cover the whole breadth of a system's performance. People don't know what to measure to get that fuller picture, or to yield truly relevant insights that can benefit patients.

   The solution to that is pretty simple: Measure something anyway. That'll help you start figuring it out.

   "I tell people, 'I don't care if there's a proven measure. Go try and measure something, and you'll learn from doing that,'" Williams says. "As soon as you start measuring stuff, if you're thoughtful in what you're measuring and how you look at the data, it empowers you to ask additional questions and get engaged in answering stuff. It's a huge opportunity to get better."

Getting Started

   Williams concluded with half a dozen recommendations to help systems become more patient-centric:

  • Quantify a reasonable cost--Respondents felt the cost of implementing patient-centric features was too high. Systems must determine what reasonable costs are, which factors make systems more or less expensive, and which add quality.
  • Change the funding model--Rather than paying per transport, fund systems for readiness and/or per patient treated.
  • Create a research consortium--This would focus on identifying performance outcomes systems can achieve, quantifying the costs of service and factors that increase cost but not quality, and learning what processes produce the best results. Changes are easier to sell when you know they'll result in improvement.
  • Data measurement to improve and sustain--Systems need the right data measures to help guide improvement and sustain outcomes.
  • QI as an operational strategy--Without data measured over time, we can't know the performance of key processes or if changes result in improvement.
  • Document and share learning--Leaders must document and share objective, data-supported cases of their successful and unsuccessful practices for one another's benefit.

   Those wondering where to start can begin with the National Association of State EMS Officials' recommended performance measures, available at www.nasemsd.org/Projects/PerformanceMeasures. And then, ultimately, focus on what matters. Focus on areas where EMS can make a difference.

   "Look at your CAD data and the big buckets of call types," says Williams. "In most cases the three big buckets are cardiac, respiratory and trauma, with everything else falling into a fourth bucket. Aim for the stuff that's going to make the biggest difference. Measure first, get data, get a baseline, go back and look at it over time in a run chart--24 months would be ideal--and determine where you are. Then you're able to turn around and say, 'What can we do to improve this process and watch the data change?'"
