
Feature

Incorporating Real-World Evidence In the Approval Processes

December 2022

Health care stakeholders, including regulatory bodies and payers, are increasingly incorporating real-world evidence into their approval processes, prompting the need for formal guidance.

In 2021, the Food and Drug Administration (FDA) issued 4 draft guidances on the use of real-world data (RWD) and real-world evidence (RWE) to inform regulatory decision-making on the safety and effectiveness of drugs and biologics. The guidances cover issues related to data sources, data standards, and regulatory considerations. The FDA defines RWD and RWE as follows:

  • RWD: data relating to patient health status and/or the delivery of health care routinely collected from a variety of sources; examples include electronic health records, claims/billing data, product/disease registries, patient-generated data, and data gathered from other sources such as mobile devices.
  • RWE: clinical evidence about the usage and potential benefits or risks of a medical product derived from analysis of RWD; can be generated by different study designs or analyses such as randomized trials, retrospective or prospective observational studies, pragmatic trials, and large simple trials.

These guidances are one more step by the FDA to build on its mandate, established in a provision of the 2016 21st Century Cures Act, to create a framework for an RWE program to evaluate the potential use of RWE in regulatory decision-making. For investigators and other stakeholders interested in incorporating RWD and RWE into their drug approval submissions, the hope is that the final guidances, once published, will bring some clarity on what the FDA looks for when reviewing submissions that include RWD and RWE. A final procedural guidance issued by the FDA in September 2022 encourages sponsors and applicants to identify certain uses of RWD/RWE in their submission cover letters.

Until then, drug developers, pharmaceutical companies, and others already incorporating RWD/RWE in their study designs continue to wade through the many uncertainties about using these kinds of data for establishing drug or biologic safety and efficacy.

“This is a relatively new field and can be daunting,” said Robbert Zusterzeel, MD, PhD, MPH, vice president of regulatory science and strategy, Woebot Health, a startup company focused on using digital therapeutics and behavioral health products to provide mental health solutions. “It is really important for people to keep working with the FDA to make sure they are aligned in how these data can be incorporated.”

Dr Zusterzeel worked for almost 10 years at the FDA in the Center for Drug Evaluation and Research (CDER) and, prior to his current position at Woebot, was senior director for US regulatory science and strategy at IQVIA, a global contract research organization and founding member of the Real World Evidence Alliance. From his vantage point, having worked in both the regulatory space and industry, he underscored the need for industry to engage with the FDA early and often in its attempts to incorporate RWD and RWE into study designs. “If you do the right thing from the outside and come up with a defensible plan on how to incorporate RWE in your submission, you can drive innovation on the inside,” he said.

Getting It Right

For investigators newly interested in this space, or for those already trying to incorporate RWD and RWE in their study designs, people working in the field offer a few tips to get things as right as possible in meeting the FDA’s expectations.

One thorny area is data standards. RWD, as defined by the FDA, refers to data routinely collected on a patient’s health status or the delivery of their care via sources such as electronic health records (EHRs), medical claims, and product and disease registries. These types of data are very different from the data, such as case report forms, used to conduct clinical trials, and they don’t always match clinical trial data fields and standards, said Rachele Hendricks-Sturrup, RWE research director at the Duke-Margolis Center for Health Policy RWE Collaborative. For example, she said, the data collected on patients with heart failure participating in a clinical trial would differ from the data collected on patients during their treatment for heart failure in the real world.

“When we think about data standards and RWD and RWE that would be acceptable to regulators, that largely depends on the clinical research question and whether RWD can fully address, or help address, that question,” she said, adding that if RWD can address the question, it falls to regulators to work with medical product developers to determine the requisite data standards for that research question or case.

Dr Zusterzeel noted the FDA currently seems most comfortable using RWD and RWE in drug approval submissions for rare diseases, where this type of evidence is often the only data available, and for oncology, where registries and other data sources have a track record of collecting high-quality RWD.

As it accepts, and discusses with drug manufacturers, the use of global data for other disease areas, the FDA is expanding into how to use RWD and RWE in clinical areas where those data are more complicated. Illustrating the messiness of using RWD for a disease such as diabetes, for example, Dr Zusterzeel pointed out that the definition of diabetes can vary among hospitals and clinics, so incorporating RWD into a drug approval submission requires ensuring that a common definition of diabetes is used.

Unlike a clinical trial designed to prospectively examine an outcome using predefined terms (for example, one definition of diabetes based on a given glucose cutoff point), incorporating RWD into a trial requires an intentional strategy to ensure that a common definition of a disease state (like diabetes) is used. “If you run a trial and pull the RWD together, how do you know you are including all patients who meet a common bar defining diabetes?” asked Dr Zusterzeel.

This is the question the FDA will want answered and that investigators need to spell out in the proposals they submit to the agency. Dr Zusterzeel said that typically the drug manufacturer or investigator comes up with the definitions used for the data sources. Another common approach is for a consortium of academic institutions, nonprofits, and drug manufacturers to work together to establish a common definition of, for example, diabetes that can be used as the gold standard.
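To make the idea concrete, the sketch below shows one way an analyst might apply a single, prespecified disease definition across pooled RWD before any outcome analysis. It is a minimal illustration only; the column names, the HbA1c cutoff, and the diagnosis-code prefixes are assumptions for the example, not FDA requirements or a method described by the sources quoted here.

```python
# Illustrative sketch: applying one prespecified diabetes definition to pooled
# real-world data. Column names, thresholds, and codes are hypothetical
# assumptions for this example, not regulatory requirements.
import pandas as pd

# Definition fixed and documented before any outcome analysis
HBA1C_THRESHOLD = 6.5                      # percent; assumed cutoff for this example
DIABETES_ICD10_PREFIXES = ("E10", "E11")   # type 1 and type 2 diabetes code families


def meets_common_definition(row: pd.Series) -> bool:
    """Return True if a patient record satisfies the shared diabetes definition."""
    has_lab = pd.notna(row["max_hba1c"]) and row["max_hba1c"] >= HBA1C_THRESHOLD
    has_code = str(row["primary_dx_code"]).startswith(DIABETES_ICD10_PREFIXES)
    return has_lab or has_code


def build_cohort(records: pd.DataFrame) -> pd.DataFrame:
    """Filter pooled RWD (EHR, claims, registry extracts) to one consistent cohort."""
    mask = records.apply(meets_common_definition, axis=1)
    return records[mask].copy()


if __name__ == "__main__":
    # Hypothetical pooled extract with one row per patient from different sources
    pooled = pd.DataFrame({
        "patient_id": [1, 2, 3],
        "source": ["ehr", "claims", "registry"],
        "max_hba1c": [7.2, None, 6.1],
        "primary_dx_code": ["E11.9", "E10.9", "I10"],
    })
    cohort = build_cohort(pooled)
    print(cohort[["patient_id", "source"]])
```

The specific cutoff matters less than the discipline it illustrates: the definition is agreed on and documented up front, so every patient pooled from EHRs, claims, or registries can be shown to meet the same bar.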

Regardless of the approach, Dr Zusterzeel emphasized that incorporating RWD in the approval process for a drug submission requires the same scientific rigor as a submission using clinical trial data.

As part of this rigor, he underscored one practice to stay away from: mining your own data and then submitting the results to the FDA. “A lot of people still see RWD, such as claims data or electronic health records, as a way to generate a hypothesis, but then try to use that data to get a drug approved,” he said. “What the FDA does not want you to do is look at the RWD first, assess what looks good, and then do a final analysis on it to try and get your drug approved.”

One helpful tip for avoiding problems with a submission is to engage with the FDA early in the application process. Andrew Barnhill, head of policy, Global Legal at IQVIA, underscored the importance of talking to FDA staff before submitting a proposal for drug approval. “I think one of the most common errors, particularly for those new to clinical research, is that they submit their application for approval and then set up a meeting with the FDA staff,” he said. “By then it is too late because you’ve already prepared your submission materials with certain assumptions.”

Dr Zusterzeel agreed. He suggested first finding a partner with an understanding of RWD, writing a strong proposal before analyzing the data, discussing it with FDA staff, and then revising as necessary prior to submission. It is critical, he said, to have that discussion with the FDA before analyzing the full data set; analyzing the data first will be considered cherry-picking, and the results will generally be biased. “Do not execute your RWE study and then apply for FDA approval without first discussing your RWE plan with the FDA because historically that has never worked,” he said.

Why Use RWD in Drug Submissions?

Although many people think one main driver of incorporating RWD is to speed up drug approval, Mr Barnhill said that is only a tangential driver. “The motivating factor is the recognition that you want a variety of evidence in front of your regulators to help them better understand clinical effectiveness,” he said.

For example, collecting observational data over a certain period to observe clinical outcomes in a treated population can be useful to inform regulatory decisions, according to Ms Hendricks-Sturrup. “Observational insights from RWD are helpful for making the case that it is possible that a certain population may benefit from a new therapeutic label that has the potential to be more effective than existing therapies,” she said.

Managed care companies, and payers in particular, can leverage the RWD they routinely gather (eg, medical claims data) to support regulatory decision-making, Ms Hendricks-Sturrup said. “Payers will have to work with managed care providers to ensure that the data is relevant, of sufficient quality, and substantive to the research question in regulatory submissions,” she said. [For a discussion of the role of RWD/RWE in coverage and reimbursement decisions by payers, see “Real-World Evidence: Bridging Gaps in Evidence to Guide Payer Decisions.”]

One emerging use of RWD is as a control arm of a clinical trial, which alleviates the need for an active control arm, according to Ms Hendricks-Sturrup. Among the references she cited for a description of the methodology used to incorporate RWD in such a study, as well as a broader description of the use of RWD and RWE in regulatory drug submissions, are a white paper published by the Duke-Margolis Center for Health Policy entitled Understanding the Need for Non-Interventional Studies Using Secondary Data to Generate Real-World Evidence for Regulatory Decision Making, and Demonstrating Their Credibility, and research by Shirley Wang and colleagues at Harvard, including a 2022 study entitled Assessing and Interpreting Real-World Evidence Studies: Introductory Points for New Reviewers.
