Simulation Is Important for Interventional Training: It Needs Our Attention and Funding
Why should we consider the use of simulation in interventional procedural training?
We need to embrace simulation as an initial training step for new fellows. There must be a better way than the current model of “see one, do one, teach one”, especially in the very early phase of training. We owe it to our patients not to expose them to the steep learning curves most fellows go through before becoming reasonably proficient, or to the risks of more experienced operators learning how to handle rare but important situations such as complications. Until recently, there has been no other way to do it. It is time to make an argument for simulation.1
I am in my early forties, so I grew up in the generation where video games and technological innovation were a baseline feature of normal life. My father is also a cardiologist and has his private pilot’s license. A great deal of his flight training was done on simulators. Seeing what the aviation industry has been able to achieve, it is hard to imagine that we should not be headed in the same direction. In the early 1990s, there was a focus on developing the “feel” of simulation training in medicine, but it was obviously very basic. The first time I saw a physical simulator for procedures, even though that first-generation model was very primitive, it all clicked. It was obvious at that point that simulation is something the whole cardiovascular community needs to invest in and refine, so that it can be the foundational basis for how we train people.
Beyond simulation technology itself, what are some of the roadblocks?
There are a few schools of thought in simulation training. One school of thought basically tries to establish granular competencies, i.e., you build an entire simulation model to test one learning point. That was the initial philosophical direction taken by many simulation educators. But a lot of what interventional cardiologists do is actually not a singular, granular teaching point, but a hundred different muscle-memory and intellectual branch-point decision-making steps all put together. When you try to prove, using current-generation simulators, that educational progress can be made at that degree of complexity, it is hard to establish scientifically in randomized trials. It is easier to build a granular competency model, and we have good data that this improves specific proficiencies, but the scope of what the simulator can train at that point becomes very narrow.
Then there is the willingness to invest in the machines. The machines cost $100,000-$150,000 at baseline, and more advanced versions can run up to a quarter of a million dollars. Given the lack of fiscal support for education in general in United States training, where do you go for that kind of budgetary ask? Whenever you approach someone and say, I want to buy a simulator, I want to train, they always ask, what is your return on investment? What is the evidence that shows it improves training? There are small studies here and there in the literature, but it is difficult to translate that into a strong argument for a fiscal return on investment. We are caught in a Catch-22: there isn’t enough of a body of evidence or syllabus development, so economies of scale don’t exist. If all fellowships around the country required high exposure to simulators, and perhaps a machine on site, the price of the machines would definitely come down.
Is simulation being utilized differently outside the U.S.?
In Germany, simulation training in interventional cardiology is not an official requirement, but it is a well-embraced, unofficial one. There are three regional centers. The German Society of Cardiology has a working group on this subject that has laid out much of the foundational work and has had a very polished program running for more than 15 years. Fellows from around the country either self-fund their training or their programs help fund it. They come to the center and spend two days doing very intensive fundamentals of cardiac catheterization and coronary angiography. There are different levels of training, such as percutaneous coronary intervention (PCI) training, femoral training, and radial training, and the centers run about 40+ courses per year. It is a model for us to aspire to in the U.S., but I suspect that two days of exposure is still not enough, and that is one of the bigger questions that remains unanswered: how much exposure, exactly, does a fellow need on a simulator in order to show that it helps with their training? In addition, it has not been clearly established what level of mentorship is needed while the fellow is practicing on the simulator. Obviously, fellows do best with an attending next to them while they are learning, but the manpower demand for something like that would be phenomenal. Can you imagine if every cardiology trainee needed six hours of supervised simulator time in their first two months in order to reach a certain threshold? Where do you find attendings with an appropriate skill set who can devote that amount of time to each fellow? You also need people who are appropriately trained to be good educators on simulators. Our center here at the University of Arizona has a director of ER simulation, who completed an ER simulation fellowship. It might be time to start cardiology simulation fellowships.
Fellows would spend a whole year learning how to be a simulation educator, understanding the tools, building syllabuses, and embedding themselves in their residency training. An ER resident has to spend 3-4 hours on intubation and code simulations every quarter. Many internal medicine residencies do something similar. How do you build up a competent workforce that can facilitate this type of training? It requires fellowships to grow a cadre of people who are well-versed, proficient, and have used simulators in all sorts of environments and situations. Much of the leadership actually hasn’t come from cardiology, but from vascular surgery and interventional radiology. They have done a much better job of organizing themselves to support simulation and create syllabuses. Cardiology has lagged behind. The models built by vascular surgery and interventional radiology, however, are less complicated than what cardiology requires; the problem with the heart is that it moves and there is a complex downstream effect at play. If I want to model a dissection after I dilate a stent, the heart has to respond appropriately: there needs to be a change in blood pressure, ST elevation needs to occur, and the patient needs to start going into shock. These kinds of models are much more difficult to build compared to the more stationary stenting of a right renal artery. The kidneys are not moving and there are no acute, minute-to-minute responses that have to be built into the system.
What have you been doing with simulation training at the University of Arizona?
We are still in the early stages, but Dave Fortuin from Mayo Arizona and I have been running a joint single-day simulation course quarterly on a statewide basis for fellows for the last eight years. We have tried to bring in as much procedural variety as possible. Manning the courses has been difficult; much of the educational help has actually come from industry, because the interest and ability to teach on simulators is very limited. Industry has been critical in this entire effort, because many of the simulator modules on the market right now are specific commissions for industry, and are not available on regular commercial platforms. It is a big question as to whether these modules should also be available for general purchase. For example, Edwards Lifesciences has a transcatheter aortic valve replacement (TAVR) simulator. They commissioned Medical Simulation Corp, an LLC that has since been acquired by Mentice, to build them a proprietary TAVR module for the Sapien valve. Access has been through the TAVR training courses, which is typical for simulation regarding particular devices, including Watchman (Boston Scientific) and CoreValve (Medtronic). The electrophysiology (EP) field is similarly complicated. There are good simulation models for cardiac resynchronization therapy (CRT) and ablation, and good transseptal models. But access is very limited and primarily industry driven. Industry is funding this development without good coordination. Companies may employ consultant physicians, but often the module development is siloed within the company itself, and there are no good standardized guidelines for module development.
Typically, for commercial purchase, a center will buy the physical simulator and then choose the digital modules to be included, which affects the overall cost. You might select carotid stenting or PCI modules, for example, or both. If companies could sell their individual device training modules as part of a package or singly, it might make them more excited about developing modules for real-world use and give them a mechanism to provide educational grants. Right now, companies have representatives bringing simulators all around the country. Every time someone wants to use a simulator, they have to send it along with a person, because they can’t send a $100,000 machine and just have people play on it. Putting an appropriate methodology in place will help greatly with economies of scale.
Do you envision regional training centers or would each hospital have its own?
Both models are acceptable. Obviously, the ideal model is one where the simulator is embedded into the daily operations of the cath lab and everyone can use it. This is another important concept. Why should we limit simulation training to physicians? It should be part of cath lab staff on-boarding. All of that depends on how real-world and expansive this representation can be. Right now, as a whole, we are probably 60% of the way there in terms of simulators representing reality. In Germany, Philips has worked with Mentice to allow you to plug the Mentice simulator into a Philips table. You take the output from the actual C-arm and plug it into the simulator. It works on the table just as if you had a real patient. That same degree of portability and integration is required in order to come as close as possible to real life. A tiny fraction of simulators will allow you to upload a patient’s computed tomography (CT) scan into the simulator and then work on that anatomy. This is absolutely the goal. The issue is funding. We need much more financial support. Right now, it is doable in isolated centers. Carotid stenting, for example, has cerebral CT models that are easier to pull into a simulator. Importing a moving heart model into the digital space is not easy, and there are technical issues to overcome in order to do so cleanly. What is the influence of calcium? Of angles? That granularity is still not developed enough for us to achieve realistic models imported from actual patients. The ultimate goal is to bring real patient information onto the simulator to prepare for the procedure: choosing guides, figuring out optimal bypass graft engagement, determining catheter use, and so on.
So simulation could play a broader educational role than only training fellows?
Simulation does have a role for more advanced users in exposure to and training on procedures that are novel and unusual, because there is no other way to obtain good exposure volume. We need to determine how to use simulation to maintain skills. How do we use it to continue to credential operators? Could we take someone who has worrying performance or outcome statistics, put them on the simulator, and try to find their educational and competency gaps? These are difficult questions to ask, because belief in the fidelity of simulators is not widespread right now, but these questions are what we need to be discussing, especially considering our low-volume providers. A good number of interventional cardiologists in the United States perform <50 PCIs per year.2 Procedure counts should also take lifetime volume into account. An operator who has done 9,000 cases with good outcomes but did 20 procedures this year is a different story. But we do want to understand whether operators have kept up with current thinking on how to handle a perforation, for example, or no reflow. These are skills that can be tested within a simulation, above and beyond the standard current methods of evaluation, i.e., board recertification, CathSAP modules, etc. There is a huge disconnect between answering multiple-choice questions and what you actually do in action. Simulation provides probably the only mechanism by which we can accurately test performance metrics.
Additionally, one of the most important characteristics of a good interventionalist is their ability to respond to complications that can be very rare. Simulation may provide a good method for repeat exposure to complications to ensure that all the reflexes are built in and that all the permutations of responses are very familiar even to very experienced operators.
Can you tell us more about the one-day simulation courses you run with Dr. Fortuin?
The fellow reviews tend to always be extremely positive, with the basic statement that we need more of these courses. What we have struggled with is the quantification of the effect. How do we know it actually makes fellows better? That is the reason why the whole field of simulation has lost the interest of interventional cardiology. We expose the fellows, they tell us it is great, they tell us that they have learned a lot from working with the simulators, and uniformly say that it should be more embedded within their training. We have developed individual syllabuses here and there, but we haven’t formalized them, because of the struggle to understand exactly how it should be done. But do we really need to wait for all the studies to come in the next several years? Is simulation the right thing to pursue? Absolutely. Can we not build the literature in parallel to developing the product? We are wasting years. We have to get past the disinterest at this juncture.
How are cardiology societies approaching simulation training?
All society statements strongly endorse the use of simulators in cardiovascular training. However, none provide guidance as to the form of simulation training, its frequency, or a syllabus. There is a Society for Cardiovascular Angiography and Interventions (SCAI) simulation committee nested within the larger SCAI education committee. We are trying to regenerate interest, but the inertia is very large. There is an American College of Cardiology (ACC) simulation workgroup, similarly nested within the ACC educational committees, but for the most part, these groups tend to primarily have a presence at the national meetings, when there is a simulation center present for learning. As far as I can tell, even in Europe, there is no unified group developing a syllabus.
In terms of the actual technology, the feel and rhythm of the procedures, how close is simulation at present?
Probably 60% of the way there. Let’s say we put a 0.035-inch wire into the machine. It pushes up into mechanical rollers that capture the three-dimensional movement: the roll, the yaw, the pitch, the forward/backward speed, and all these aspects. Within the digital model, as the wire encounters objects, the machine applies braking so the operator can feel physical resistance. The simulator has lasers that measure the diameter of the inserted device and tell the system what that device is supposed to be. One of the problems is that in real life we work in wet environments, but in simulations, that is not possible. It’s going to be an uphill battle to get toward full 100% real-life fidelity, but there are many other aspects on which simulation can provide education and testing, such as decision-making. Perhaps we have a Medina 1,1,0 bifurcation lesion. There are several dimensions that can still be tested in that scenario. We need to slow down the obsession with getting the physical “feel” exactly perfect before committing to any other efforts. Decision-making, complication recognition, and adherence to guidelines are all educational aspects that need to be built upon now. For example, all simulators on the market have inadequate and outdated pharmacopeias. Our work must focus on getting the correct medications on board and on building physiological response models that make sense, which are a big component of decision-making during a case. A patient has been on rivaroxaban and here we are doing an ST-elevation myocardial infarction. What drug should you give? These testing points, when approached singularly, are easy to think about, but when you are physically engaged in treating a patient, it is like juggling and riding a bicycle at the same time, or doing math equations and dancing. Can you make the right intellectual decisions while physically at work in a stressful situation?
We have been talking about procedural simulation, but simulation in terms of testing decision-making is another interesting aspect and an important component of procedural competency. How well do you achieve flow performance in a setting of extreme crisis and stress? We don’t address that in any way whatsoever during training. Look, the patient is in v-fib. Are you responding? With every minute that passes, the chance of survival drops. What are you doing? Have you made the right decisions? Simulation is perfect for training and testing these types of scenarios. What we should be aiming for is the fighter pilot mentality: ok, this is a problem, respond now or die. Simulation might help us understand team dynamics as well, if we can simulate the inclusion of more and more people into the room. Let’s say there is an exact replica of all the guide catheters and wires sitting on the shelves around the operator, and the simulation is such that mechanical circulatory support has to be placed immediately. That’s when simulation starts to get really interesting.
Have you utilized robotic angioplasty with simulation training?
Corindus has a model allowing you to hook it up to a simulator. During our courses, we have taken a Corindus CorPath, hooked it up to a simulator, and let people play and see what it feels like. Corindus is a natural fit.
What about the rise of video games? Will that affect simulation learning?
Yes! A simulated environment actually allows us to gamify learning. We know that when you gamify learning, retention and interest increase significantly. Can you imagine if, in the simulation environment, there was a crowd-based international ranking of performance? You might see that Dr. Kwan Lee, an interventional fellow at Emory, has clocked in 32 hours and done 7 of these specific scenarios, responded successfully in 32% of cases, the survival rate of his patients is “x”, his fluoro time is “x”, and his stent sizing and accurate placement of stents is “x”. You have a global leaderboard and group fellows by team — Emory fellows vs Cedars-Sinai fellows. It would be hugely popular, fun, and would focus people on the metrics. Program directors could easily determine, for example, why fellows are doing so poorly on remembering to give heparin. Taiwan has run a simulation Olympics, not for cardiology but for other types of training. Teams show up on stage, there is scoring, and it is hilarious to watch.
Any final thoughts?
Philosophically, pursuing simulation is the correct path. We have to generate interest. We have pitched ideas for sessions to all the national cardiovascular societies and meetings to discuss how exactly interventional cardiology should proceed, but there has been no interest. We can’t get onto any of the meeting agendas. I want to challenge the meeting organizers to reconsider their lack of interest. Simulation may not be fully ready, but the promise is there. We need to have sessions talking about it at a national level, in order to continue to make progress in a field that will undoubtedly be the foundational basis of how we train and re-certify in the future.
Disclosure: Dr. Lee reports a consultant role for Mentice.
Dr. Kwan Lee can be contacted at klee@shc.arizona.edu.
1. Lee KS. Monkey see, monkey . . . should really do, before being set loose. Off-Script. TCT 2017. TCTMD.com. Available online at https://www.tctmd.com/news/script-monkey-see-monkey-should-really-do-being-set-loose. Accessed October 15, 2019.
2. Kern MJ, et al. Conversations in Cardiology: what is the annual volume requirement for a PCI operator? Cath Lab Digest. 2017 Dec; 25(12): 6-10.