A road map to evidence-based practice

Imagine that your beloved dog, Rocky, became ill suddenly and required a visit to the animal ER. He received a thorough assessment of his medical history and his current condition. After a few moments, the vet turned to you and said, “Rocky’s life is at risk due to this very serious illness, but I don’t pay much attention to current research. I don’t have time to read journals and the research isn’t very relevant to what we do here. I can treat him with the methods I learned in school three decades ago, but I’m not certain that he will return to good health.” Would you find another vet to care for Rocky?

Now imagine that we are talking about an ill family member or loved one—your parent, child, spouse, sibling, or best friend. Wouldn’t you expect the highest standard of care for them? Don’t you want them to receive treatment that favors the best odds for success?

In the treatment of substance use disorders, we know that there are varying degrees of success based on treatment modalities and techniques, the clinical complexities of patients, the treatment environment, and patients’ social support networks, among other considerations in the treatment equation. When we talk about providing “quality treatment,” one way to support quality is to tie it to outcomes. Given that research has offered evidence of what works best for whom and under what circumstances, don’t we owe it to our patients to offer the highest-quality treatment?

Evidence-based practices (EBPs) are grounded in sound research demonstrating that they are more likely than other practices to yield positive clinical outcomes for patients. External scientific evidence (i.e., research) plus expert consensus are often used to determine whether a treatment practice qualifies as evidence-based. A growing literature over the past 10 to 15 years has examined the use of EBPs in substance use disorder treatment as well as in the treatment of mental illness.

EBPs are typically manualized to increase standardization in their execution and to maintain fidelity across practitioners and programs. This assists providers in conducting outcomes research, as many practices build in some type of measurement component. As more providers evaluate outcomes of EBPs and share their results, the practices' evidence is strengthened. Another emerging benefit is that, particularly when financial resources are scarce, funders may choose to reimburse providers who include EBPs in treatment and may limit financial support to providers who do not.

Opponents of evidence-based practice, however, are quick to cite the expense of implementation. EBPs can be costly in both time and money, depending upon training requirements and the cost of materials. Since many practices are manualized and some are copyrighted, original manuals and supporting materials will likely need to be purchased, if not for each practitioner then for every program. Training fees vary based on who trains staff: the EBP's developer is often available to provide training, or the developer may have a staff of approved trainers available at a reduced cost. These trainings can last a few days, but full implementation may require ongoing paid booster or consulting sessions. If a provider organization lacks internal program evaluation capabilities, an additional outside consulting fee may be warranted to monitor and maintain EBP fidelity.

Selection process

Choosing an evidence-based practice should be a multifaceted endeavor. At Gateway Rehabilitation Center, our selection and implementation process was thoughtful, strategic and enduring. We attempted to leave no stone unturned during our planning phase. Simply stated, we did our homework.

We concentrated on finding an EBP that was consistent with our treatment philosophy. While we wanted to stay true to the heart and soul of Gateway, we also sought a practice that would allow us to integrate state-of-the-art treatment adjuncts (in our case, a new medication-assisted treatment program). We searched for EBPs through the Substance Abuse and Mental Health Services Administration's (SAMHSA's) National Registry of Evidence-Based Programs and Practices (NREPP), hoping to find a few with outcome variables congruent with our routine outcome evaluation.

Also, since we maintain that addiction is a disease, we hoped to consider a few practices that address the biological, psychological, social and spiritual aspects of addiction, treatment and recovery. We concluded that Twelve Step Facilitation (TSF; www.nrepp.samhsa.gov/ViewIntervention.aspx?id=358) offered the best fit for our organization.

Our decision to adopt TSF as a core practice was not made hastily—it took us approximately one year. Building on our background research with NREPP, we followed the Addiction Technology Transfer Centers' (ATTCs') The Change Book: A Blueprint for Change1 to guide us through the process. We acknowledged at the outset that any EBP adoption would be a heavy undertaking, but we never lost sight of our goal.

Training more than 200 clinicians across two states in four divisions comprising 20 locations would not be an easy task. The opening of a new Research and Training Institute gave Gateway the entity it needed to focus on TSF while managing the day-to-day practicalities of implementation, such as training trainers and coordinating training sessions.

The Change Book outlines 10 steps to developing a sustainable change initiative, and we followed each one. Steps one through three identify a problem, organize a team to address the problem, and identify the desired outcome. Problem identification resulted from both anecdotal evidence and portions of a clinical training needs assessment. This assessment consisted of several standardized measures of training needs and organizational readiness to change, and was administered to all clinicians in person. Two major findings were a lack of standardization across treatment sites and clinicians' reports of using EBPs in practice without having been trained on them.

For example, in one large program, we found that approximately half of the therapists and counselors used cognitive-behavioral therapy (CBT) as a primary treatment modality, with about one-third of that group reporting no formal training on the practice. Although we have always been a 12-Step based provider organization, therapists also reported using contingency management and other motivational approaches, as well as certain treatments consistent with trauma-informed care. We built a guiding coalition, led by the vice president of research and clinical training, that included representatives from clinical programs, quality and compliance, and human resources.

In step four (assess the organization), we took a broad inventory of our assets and limitations. Areas of focus included mission appropriateness, organizational structure (i.e., who would be responsible for sustaining TSF within and across sites), and the education and experience of staff. In step five, we assessed our clinician audience for potential barriers to EBP adoption by adding the Evidence-Based Practice Attitude Scale (EBPAS)2 to our training needs assessment, looking at areas such as perceived burden and lack of support from a clinical supervisor.
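For readers curious about what summarizing such a survey might look like, here is a minimal sketch of aggregating Likert-style attitude items into subscale means. The subscale names follow the published EBPAS, but the item-to-subscale mapping and the responses shown are illustrative placeholders, not the actual instrument content or our data.

```python
from statistics import mean

# Hypothetical mapping of item IDs to the four published EBPAS
# subscales; the real instrument's item wording is not reproduced here.
SUBSCALES = {
    "appeal":       ["q1", "q2", "q3", "q4"],
    "requirements": ["q5", "q6", "q7"],
    "openness":     ["q8", "q9", "q10", "q11"],
    "divergence":   ["q12", "q13", "q14", "q15"],
}

def ebpas_subscale_means(responses):
    """Average one clinician's 0-4 Likert responses within each subscale."""
    return {name: mean(responses[item] for item in items)
            for name, items in SUBSCALES.items()}

# Illustrative responses for one clinician (not real data).
clinician = {f"q{i}": score for i, score in enumerate(
    [3, 4, 3, 2, 1, 2, 1, 4, 3, 4, 3, 1, 0, 2, 1], start=1)}
print(ebpas_subscale_means(clinician))
```

Summaries like these can be rolled up by site or program to flag where attitudes such as perceived burden may warrant extra attention before rollout.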

Steps six and seven round out the essential components needed to implement the action plan. We reviewed the literature early on and knew that a “training of trainers” model would be the best fit, given the organization's geographic spread and clinical staff volume. We included a page in the clinical training needs assessment where clinicians could note whether they were interested in volunteering to become a trainer and, if so, which practices interested them. This served as a way for us to advance the skills of Gateway's existing workforce, which was one aim of the new Research and Training Institute.

The final steps, eight through 10, are the action steps for implementation, evaluation and revision, all of which work toward sustaining the EBP. They are detailed below.

Making it happen

We contacted Joseph Nowinski, PhD, who graciously agreed to train our group of 12 trainers. All trainers were educated as clinicians and worked directly with patients or in a management role. Nowinski spent two days with us in July 2013, providing instruction on TSF practice, training methods, and adherence monitoring. While studying the TSF handbook3, training team members began implementing elements of TSF informally in their own programs. We continued working with Nowinski by phone until we felt confident that we could train in various treatment milieus (inpatient, outpatient, long-term residential).

Our next step was to coordinate training days at each of our locations, as well as sessions at our corporate office for staff members who were off work during their on-site training day. Nowinski continued to provide telephone booster sessions while we rolled out TSF training, which took approximately six months (August 2013 to February 2014) across the system. The total cost of the trainers' training and materials for the entire system was an estimated $8,000.

In the meantime, much work was happening behind the scenes. The Research and Training Institute published “EBP briefs” and the results of the clinical training needs assessment on our intranet as a way to disseminate important information during implementation. Education is essential to staff buy-in, and we wanted to be certain that we were communicating clearly about our intentions to integrate TSF into the curriculum and debunking EBP myths along the way. The institute began to have a presence at regularly scheduled monthly meetings with clinical directors and managers so that open discussion could occur about our early TSF struggles and successes. We also revised sections of our electronic medical record (EMR) so that clinicians could document TSF topics and related activities consistently and correctly.

Research staff began the process of adherence (fidelity) monitoring by visiting one program per site per month. Fidelity monitoring occurs in two phases. The first involves general observation of TSF topic selection, a quick examination of adherence to the checklist, and a review of materials used. The second will involve generating a fidelity score for each staff member utilizing TSF.
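To make the second phase concrete, here is a minimal sketch of one way a per-clinician fidelity score could be computed: the proportion of observed checklist items rated adherent, pooled across sessions. The checklist items, ratings, and data structures are hypothetical stand-ins, not Gateway's actual TSF adherence checklist or scoring method.

```python
from dataclasses import dataclass

@dataclass
class SessionObservation:
    clinician: str
    # Maps a checklist item (e.g., "set 12-Step topic") to whether
    # the observer rated it as adherent in this session.
    ratings: dict

def fidelity_score(observations):
    """Proportion of adherent checklist ratings across all sessions (0-1)."""
    ratings = [ok for obs in observations for ok in obs.ratings.values()]
    return sum(ratings) / len(ratings) if ratings else 0.0

# Illustrative usage: two observed sessions for one clinician.
sessions = [
    SessionObservation("clinician_a", {
        "set 12-Step topic": True, "reviewed recovery tasks": True,
        "assigned meeting attendance": False}),
    SessionObservation("clinician_a", {
        "set 12-Step topic": True, "reviewed recovery tasks": False,
        "assigned meeting attendance": True}),
]
print(f"Fidelity: {fidelity_score(sessions):.0%}")  # Fidelity: 67%
```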

Another aspect of measurement is the inclusion of the Alcoholics Anonymous Affiliation Scale (AAAS), which can be found in the TSF manual. The AAAS assesses the number of meetings attended and the degree of involvement in those meetings. Note that for our purposes, the instrument applies to all 12-Step meetings, not just AA. We incorporated the AAAS into our ongoing follow-up evaluations with patients so that we can track changes as individuals move across the continuum of care. We expect to see TSF's lasting effects throughout the follow-up period, since our data demonstrate a significant relationship between 12-Step involvement and reduced substance use.
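As an illustration of how AAAS-style data can be tracked across follow-up waves, the sketch below computes a simple composite of attendance and involvement at each time point. The 0-to-1 scaling, field names, and equal weighting are assumptions made for illustration; the actual AAAS items and scoring are given in the TSF manual.

```python
from dataclasses import dataclass

@dataclass
class FollowUp:
    months_post_discharge: int
    meetings_attended: int    # meetings in the recall window
    involvement_items: int    # involvement items endorsed (e.g., has a
                              # sponsor, reads 12-Step literature)

def affiliation_composite(fu, max_meetings=30, n_involvement_items=7):
    """Toy 0-1 composite: equal weight to attendance and involvement."""
    attendance = min(fu.meetings_attended, max_meetings) / max_meetings
    involvement = fu.involvement_items / n_involvement_items
    return round((attendance + involvement) / 2, 2)

# Illustrative patient trajectory across the continuum of care.
waves = [FollowUp(1, 20, 3), FollowUp(6, 12, 5), FollowUp(12, 8, 6)]
for fu in waves:
    print(fu.months_post_discharge, affiliation_composite(fu))
```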

Barriers and reduction strategies

We encountered several barriers over the past two years, and developed solutions to address them so that our project would not be derailed.

Our first barrier related to the size of our staff: How would we train more than 200 clinicians within a reasonable time frame and capture newly hired therapists? Our implementation plan included training at each site and ongoing sessions at the corporate office. We worked closely with the human resources department to identify new hires, and we maintained contact with clinical supervisors to coordinate attendance at upcoming training sessions. Staff were given adequate time to try TSF informally, and we asked for feedback, creating a continuous quality improvement loop as we moved forward.

Our next barrier involved staff resistance to adoption, and TSF not being utilized in sessions. Perhaps one of the biggest ironies with EBP adoption in drug and alcohol treatment is that while we expect our patients to change, it is challenging to change ourselves. Although many staff members have reported an appreciation for the ease with which TSF integrates into their practice, acknowledging the structure it provides in both individual and group settings, others state that it stifles the fluidity of a session, noting that following a checklist is “too restrictive.”

Verbal and written summary reports to clinical managers paved the way for the development of monthly coaching sessions, where specific trainers were matched with specific treatment sites. Coaching has been shown to be an effective strategy when combined with other behaviorally oriented techniques.4 The coaches and other representatives from clinical management and the Research and Training Institute travel to treatment locations to present 90-minute sessions consisting of open dialogue about implementation challenges, role play opportunities, and the sharing of examples of successful implementations at other sites. Without coaching, many of our clinicians, particularly the younger staff, would struggle with group facilitation, and our expectations for adherence to TSF could not be realized.

Another barrier involved staff reports of patients getting bored with TSF material (“TSF is repetitive”). We were uncertain whether these statements were valid or offered further evidence of staff resistance. To keep TSF feeling “new” for both patients and staff, we created a space on our intranet for in-session and take-home treatment assignments matched to TSF topic. For example, the TSF folder called “People, Places, and Routines” contains assignments on triggers and identifying high-risk situations.

Another barrier involved staff “faking it” during fidelity monitoring. It became obvious rather quickly when certain staff members attempted to fake their way through TSF during a monitoring session. In addition to reviewing some of the standard TSF tasks with them, we inquired about the TSF handbook and asked to see examples of TSF-related assignments. We reported these findings back to managers, who then became more open about why TSF was not being utilized at their sites.

Finally, we encountered a lack of use of certain TSF topics (specifically, those related to conjoint or family sessions). An organization-wide focus on family involvement in treatment, combined with the new resources on our intranet, has increased awareness of using TSF during family programming. We are continuing to examine trends in topic selection during fidelity monitoring, and discussions concerning the breadth of topics are an ongoing focus of coaching visits.

Effort pays off

Implementing an evidence-based practice is not an easy process. We learned important lessons along the way—lessons that apply to any change initiative. A solid implementation plan influences sustainability, and that plan requires commitment, champions for the practice who can inspire others, and an ability to accurately and ethically utilize data to drive decision-making and interpret results.

Combining a “best fit” EBP with the belief that staff members have the passion and expertise required to launch and sustain the practice will generate the energy required for success. EBPs are costly in time and money, but if they serve to advance treatment and recovery, they are well worth the investment.

Cara M. Renzelli, PhD, is Vice President of Research and Clinical Training at Gateway Rehabilitation Center, based in western Pennsylvania. She leads the Kenneth S. Ramsey, PhD, Research and Training Institute at Gateway.

References

1. Addiction Technology Transfer Center (ATTC) National Office. The Change Book: A Blueprint for Change (2nd ed.). Kansas City, Mo.: University of Missouri–Kansas City; 2010.

2. Aarons GA. Mental health provider attitudes toward adoption of evidence-based practice: the Evidence-Based Practice Attitude Scale (EBPAS). Ment Health Serv Res 2004;6:61-74.

3. Nowinski J, Baker S. The Twelve-Step Facilitation Handbook: A Systematic Approach to Recovery From Substance Dependence. Center City, Minn.: Hazelden Publishing; 2003.

4. Herschell AD, Kolko DJ, Baumann BL, et al. The role of therapist training in the implementation of psychosocial treatments: a review and critique with recommendations. Clin Psychol Rev 2010;30:448-66.

Timeline of Gateway's EBP implementation

August 2012-October 2012: Training needs assessment development and pilot testing

November 2012-December 2012: Training needs assessment data collection

January 2013-March 2013: Training needs assessment analyses; results dissemination; EBP selection

April 2013-June 2013: Joseph Nowinski, PhD, contacted; training of trainers planned

July 2013: Nowinski on site

August 2013-November 2013: Trainers practice Twelve Step Facilitation (TSF) in their programs; telephone booster sessions with Nowinski; systemwide training sessions scheduled and materials ordered

December 2013-February 2014: Systemwide TSF training and integration into existing treatment curriculum

May 2014-present: Fidelity monitoring

March 2015-present: Monthly coaching sessions; ongoing TSF training for new hires
