Using Provider Feedback to Improve Efficiency and Utilization of the Flatiron Assist EMR-Based Pathways Tool
Christopher D’Avella, MD, assistant professor of medicine at the University of Pennsylvania Abramson Cancer Center, spoke to the Journal of Clinical Pathways about his institution’s use of the Flatiron Assist pathways program, and how the improvements made to the tool as a result of provider feedback increased the efficiency and utilization of the program.
Transcript
Christopher D'Avella, MD: My name's Christopher D'Avella. I'm an assistant professor of medicine at the University of Pennsylvania at the Abramson Cancer Center, and I'm a specialist in thoracic oncology.
Please give some background about your study and what prompted you to undertake it.
Dr D’Avella: Sure. Well, clinical pathways are an entity that has been in medicine for a long time, and specifically in cancer, there's been a lot of literature about clinical pathways. What the evidence has shown is that they reduce treatment variation and improve the quality of care. Over the past decade, there have been some additional studies that have actually shown that clinical pathways specific to cancer patients can improve patient outcomes and actually reduce cost. The issue with clinical pathways is convincing providers to use them.
There's a perceived impact on workflow. If you're in the middle of a busy clinic, accessing a clinical pathway sometimes can slow you down, and that has an impact on workflow. Pathway adoption has been lacking because of this issue. What interested me was that we know pathways work, we know that they're better for patients, and we know that standardization of care provides the best quality of care, but it's something that really isn't happening.
That's when we talked to Flatiron Health about their pathway program, which is a pathway called Flatiron Assist, which is an EMR embedded pathways tool, and started having conversations about a potential collaboration for a pilot in thoracic oncology at the Abramson Cancer Center.
Can you briefly describe how the study was conducted?
Dr D’Avella: Penn and Flatiron Health have had a longstanding relationship over the years, and Flatiron Health has Flatiron Assist, which I mentioned is this EMR-embedded pathways tool that's linked to treatment ordering. Most institutions that use Epic as their EMR use a program called Beacon, which is how we order treatments, chemotherapy infusions or immunotherapy or hormonal therapy for patients. The advantage of Flatiron Assist is that it's linked to treatment ordering.
The idea is that it does not impede workflow because you have to order the treatments and it's linked to the treatment ordering and it's actually in Epic. We started discussions about the potential for a pilot in thoracic oncology, and we decided that we would pilot the use of Flatiron Assist in Epic and created this agreement. And then part of the pilot was actually to look at a collaboration to try to improve the technology of Flatiron Assist. The biggest criticism of clinical pathways in these EMR tools is that people don't use them because they impede workflow.
The idea was that Flatiron Health would take feedback from clinicians at Penn using Flatiron Assist to try to improve the program and make it more seamless with workflow. Initially, we created Penn-specific clinical pathways for all thoracic oncology patients, which included non-small cell lung cancer patients and small cell lung cancer patients. We were able to get Flatiron Assist integrated into our EMR, loaded our pathways, loaded all of our regimens, and then showed providers how to use Flatiron Assist, trained them, and started the pilot.
What we were looking for in the pilot was a couple different things. One was the frequency of use. We call that engagement. How many people are actually using Flatiron Assist? And then we were also looking at things like times of sessions and trying to improve performance.
What were the key findings of your study?
Dr D’Avella: The study ran from August 2021 to December 2022, and over that time we looked at a couple different things. We first had a meeting together, and after piloting the software for a couple months, we decided that the areas we had to address in terms of improving performance were threefold. One was to improve the speed of the app and reduce the number of clicks, which makes workflow easier for providers using the tool.
Two was to reorganize the tool keys and prompts, and three was to create this feature that's called a hot button, which essentially you click on one button. It's a common clinical scenario. For instance, in non-small cell lung cancer, metastatic adenocarcinoma, that's biomarker negative. Very common. We created this hot button where you could just press one button and it would go right to treatment ordering. And then of course, we looked at overall adoption or engagement, how many people were ordering or using Flatiron Assist.
The results of the study showed a couple different things. When we looked at how many people were using the tool over time, our baseline showed that initially only about 29% of providers were using Flatiron Assist, and that number peaked at 74% in April of 2022. Essentially what that meant was our baseline, in terms of the number of orders placed through Flatiron Assist over the total number of thoracic orders placed at the cancer center, was about 30%, and that increased all the way to about 74%.
We were very satisfied with that, and we were able to increase the number of physicians and providers using this over time. And then there were a couple metrics we looked at in terms of trying to improve the performance of the app. When we look at the session length, so how quick it is to use Flatiron Assist, initially in August of 2021 when the pilot started, the median session length for using Flatiron Assist to order was about 106.4 seconds. That went all the way down to 39.9 seconds in December of 2022, which saved about a minute per use.
We were able to significantly improve performance in terms of how quickly people could use the app. We also looked at things like who was using the hot button and what percentage of treatment orders used the hot button. That feature was introduced in August of 2022, halfway through the pilot, and its use increased over time from 27.6% to about 41.8% in November of 2022. Peak performance for session length, as we noted, was 38.2 seconds, which was in November of 2022.
And then overall, what we were able to calculate was the number of new treatments ordered per provider per week across the cancer center, and that was about 1.7 new treatments. If you look at the data in regard to how long providers use Flatiron Assist, that works out to about a minute and a half of extra work per week with optimal use of Flatiron Assist. Overall, what we concluded was that by using provider feedback with this EMR-based tool, we were able to improve how quick it was.
We were able to improve its performance. And by doing that, we were able to increase the amount of people that were using it. We found this to be a very successful approach in terms of trying to increase the utilization of this EMR-based tool over the time of the pilot.
Looking ahead, what potential impact do you hope your findings will have on improving the efficiency and usage of EMR-based pathway tools?
Dr D’Avella: I think that this study is going to have a big impact. As I mentioned initially, usage of clinical pathway tools is typically pretty variable and depends on the institution and the circumstances, but adoption of these EMR-based tools is still pretty lacking when you look at the numbers in previous studies that have been done. I think this study will make an impact and show that if you work hand in hand with a company to make a more provider-friendly EMR-based tool, you can actually increase the number of people that use it.
And by taking clinicians and asking them how the app can perform better with their workflow, you can actually make significant improvements to the app using that feedback. I think feedback is key and also partnership is key among the people that are using the app and the people that make the app.
Is there anything else you would like to add?
Dr D’Avella: I think what I would say is just, I think this is the way of the future. Clinical pathways are not going away. We know that ASCO has endorsed them. We know that NCCN has endorsed them. We know that they're good for patients. I think that using apps like this, getting provider feedback, as I mentioned, I think is the way of the future. I think our plan is to take what we learned from this pilot and be able to try to take it to other disease sites.
There are also talks about the future, and I know that there are some pathways that are adapted to things like radiology or surgical options. I do see this as something we're going to be thinking about expanding in the future. I think the lesson learned from this is that working hand in hand with a company that has this technology is really the best way to go.