The Complementary Relationship Between Quantitative and Qualitative Research Methods in Enhancing Understanding of Treatment Decisions, Outcomes, and Value Assessment
Introduction
In the July 2021 edition of the Journal of Clinical Pathways, May and colleagues argue that qualitative research can both complement and augment quantitative research methods by providing important, “on-the-ground” context and perspectives in the assessment of treatment value and outcomes.1 We agree with this argument, and firmly believe that the best approach to addressing complexity in health care innovation and care delivery is evidence generated through a mix of both quantitative and qualitative research methods.
Frequently, quantitative research methods are used to illuminate how detrimental a disease might be in terms of health outcomes and associated costs, what effect a treatment might have on the disease in question, and how much we should be willing to pay for that treatment.
However, quantitative methods are limited in that such approaches are often only able to illuminate the part of a problem that can be told numerically and cannot fully account for on-the-ground realities. Qualitative methods can help to fill this gap by providing valuable first-hand perspectives and by placing quantitative study findings into context. Similarly, developing research questions for quantitative studies requires a sound understanding of underlying issues, which can be augmented by qualitative approaches.
Health Research Has Relied on a Variety of Quantitative Methods
Quantitative research methods answer a broad range of important questions in health research. Taking COVID-19 as a contemporary example, quantitative research methods have been used to address key epidemiologic concerns that include disease incidence, prevalence, and transmission2 while also considering the overall economic burden of disease3 and examining questions related to pricing of vaccines and therapies.4,5 Additionally, researchers have relied on quantitative methods to study the impact of policies aimed at limiting the spread of COVID-19, including testing, mask mandates, lockdowns, and other social distancing measures.6-8
Among quantitative methods, retrospective database studies, economic/econometric approaches, and simulation models are most frequently used to generate insights. Retrospective database studies or claims analyses often are used to provide real-world evidence on cost and use of therapies, as well as efficacy and safety outside of the clinical trial setting.9 By deploying retrospective study methods, researchers can quickly collect and analyze large, diverse data sets that include longitudinal follow-up reporting.10 However, retrospective studies using administrative claims databases, such as the MarketScan Database,11 or large, nationally representative survey data, such as the University of Michigan’s Health and Retirement Study,12 also pose various methodological challenges and limitations that include lack of control over quality and inconsistent collection or reporting of data.13,14
Similarly, economic studies have been used to assess the impacts of governmental policies on population health outcomes or to evaluate the effectiveness of different health programs and interventions.15 For instance, a recent study by Lyu and Wehby examined whether statewide face mask mandates decreased the spread of COVID-19 and found that mandates significantly slowed the daily growth rate of cases, an effect the authors estimated may have averted more than 200,000 cases by late May 2020.16 Economic and econometric studies allow researchers to use formal analytic techniques to make evidence-based decisions.17 Nevertheless, these studies typically do not take subjective experiences into consideration, which can limit the overall evaluation of real-world effects.18
Lastly, simulation models have been used to estimate and project uncertain long-term effects of programs, treatments, and interventions at a population level.19 In the context of COVID-19, Jahn and colleagues used a dynamic agent-based population model to compare different vaccination strategies to maximize the impact of Austria’s limited vaccine supply.20 The authors found that in order to minimize hospitalizations and deaths related to COVID-19, elderly and vulnerable populations should be prioritized for vaccination until further vaccines are available. These methods allow researchers to explore various scenarios in a structured way.21 Even so, simulation methods cannot be used to accurately evaluate all scenarios, especially in situations where key parameters are unknown or highly uncertain.22
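To make the agent-based approach concrete, the following toy simulation (an illustrative sketch written for this discussion, not the model used by Jahn and colleagues; every parameter, from contact rates to fatality risks, is hypothetical) compares an elderly-first vaccination strategy against uniform allocation of a limited daily vaccine supply:

```python
import random

def simulate(prioritize_elderly, n_agents=5000, n_days=120, daily_doses=25, seed=0):
    """Toy agent-based epidemic model. Each agent has an age group and a
    state (S)usceptible, (I)nfected, (R)ecovered, (V)accinated, or (D)ead;
    elderly agents face a much higher fatality risk if infected.
    Returns total deaths. All parameters are hypothetical."""
    rng = random.Random(seed)
    # 20% of agents are elderly; seed the outbreak with 10 infections
    agents = [{"elderly": i < n_agents // 5, "state": "S"} for i in range(n_agents)]
    for a in rng.sample(agents, 10):
        a["state"] = "I"
    deaths = 0
    for _ in range(n_days):
        # Allocate the day's limited doses: elderly-first or at random
        susceptible = [a for a in agents if a["state"] == "S"]
        if prioritize_elderly:
            susceptible.sort(key=lambda a: not a["elderly"])
        else:
            rng.shuffle(susceptible)
        for a in susceptible[:daily_doses]:
            a["state"] = "V"
        # Transmission: each infected agent contacts 3 random agents
        infected = [a for a in agents if a["state"] == "I"]
        for _ in infected:
            for contact in rng.sample(agents, 3):
                if contact["state"] == "S" and rng.random() < 0.06:
                    contact["state"] = "I"
        # Resolution: infections end in recovery or death
        for a in infected:
            if rng.random() < 0.2:
                fatality = 0.05 if a["elderly"] else 0.002
                if rng.random() < fatality:
                    a["state"] = "D"
                    deaths += 1
                else:
                    a["state"] = "R"
    return deaths
```

Running `simulate(prioritize_elderly=True)` and `simulate(prioritize_elderly=False)` with the same seed allows the two strategies to be compared under identical conditions; real models such as the one used by Jahn and colleagues calibrate these parameters to observed epidemiologic data.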
Quantitative Research Approaches Generate Actionable Insights for Policymakers and Health Care Decision-Makers
Quantitative research methods are used frequently to guide decision-making in health policy and health care. For instance, for initial approval of a drug or therapy, regulatory agencies such as the US Food and Drug Administration (FDA) and the European Medicines Agency (EMA) consider analyses of clinical trial data on safety and efficacy end points. Post-approval, regulatory attention often shifts to analyses of real-world evidence to demonstrate how drug efficacy and safety measures hold up outside of the clinical trial setting. Lastly, insurance companies and payer organizations rely on cost-effectiveness and budget impact analyses to determine coverage and reimbursement practices for treatments, and to gauge whether a treatment’s benefits in averted health consequences outweigh its costs.
Cost-effectiveness analyses often use quantitative research methods to evaluate the value and pricing of interventions. In the United Kingdom, the National Institute for Health and Care Excellence (NICE) quickly released detailed guidelines for managing COVID-19.23 In the United States, the Institute for Clinical and Economic Review (ICER) performed a cost-effectiveness analysis in 2020, determining that the cost-effective price for remdesivir as a treatment for COVID-19 would be between $4,580 and $5,080 per treatment course.24
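The arithmetic behind such value-based price assessments is straightforward. The sketch below (our own illustration with hypothetical numbers; these are not ICER's remdesivir inputs or model, and the function names are ours) computes an incremental cost-effectiveness ratio and the drug price at which that ratio exactly meets a willingness-to-pay threshold:

```python
def icer(cost_new, cost_old, qaly_new, qaly_old):
    """Incremental cost-effectiveness ratio: extra cost per QALY gained
    when switching from the comparator to the new treatment."""
    return (cost_new - cost_old) / (qaly_new - qaly_old)

def value_based_price(nondrug_cost_new, cost_old, qaly_new, qaly_old, wtp_per_qaly):
    """Drug price at which the ICER equals the willingness-to-pay threshold.
    nondrug_cost_new: all non-drug costs on the new treatment arm.
    Derived by setting (nondrug_cost_new + price - cost_old) / dQALY = WTP."""
    return wtp_per_qaly * (qaly_new - qaly_old) - (nondrug_cost_new - cost_old)

# Hypothetical example: new therapy adds 1.0 QALY at $40,000 extra cost
ratio = icer(cost_new=60_000, cost_old=20_000, qaly_new=1.5, qaly_old=0.5)
price = value_based_price(nondrug_cost_new=10_000, cost_old=20_000,
                          qaly_new=1.5, qaly_old=0.5, wtp_per_qaly=50_000)
```

In practice, agencies embed this calculation in full decision models with discounting, multiple health states, and sensitivity analyses, but the threshold logic is the same.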
Similarly, government agencies such as the Canadian Agency for Drugs and Technologies in Health (CADTH) and NICE often base reimbursement decisions on comprehensive health technology assessments (HTAs) that usually contain economic evaluations of the intervention.25
Lastly, the FDA recently has begun considering modeling and simulation methodologies when reviewing and approving drugs and has advocated for their use in predicting clinical outcomes.26 Among other things, the use of physiologically-based pharmacokinetic (PBPK) simulation models has increased in recent years to address uncertainties remaining around clinical trial data, though acceptance by the FDA has been varied.27
The Importance of Moving Toward Integrated, Mixed-Methods Research Approaches
Quantitative studies are the gold standard for establishing causality and drawing conclusions that can be generalized beyond the population studied. However, quantitative research methods also often lack contextual considerations and perspectives that can be further informed by qualitative approaches. To overcome potential limitations of either research approach, we strongly believe that qualitative and quantitative research methods are best used in concert.
For example, as previously noted, HTAs are commonly used to estimate whether a drug or therapy can be considered “value for money.”28,29 However, within the context of rare diseases or rapidly emerging health crises such as the COVID-19 pandemic, reliance on empirical data alone is often insufficient, as data may be too sparse to inform population health decisions or may not be collected quickly enough to guide decision-making in a crisis.
According to ICER, conducting cost-effectiveness analyses for rare diseases such as B-cell lymphoma, hemophilia A, and spinal muscular atrophy can be particularly challenging due to the small number of patients afflicted, a lack of available treatments or treatment alternatives, and limited long-term empirical data.30,31 To overcome some of these challenges in the United Kingdom, NICE relied heavily on expert judgment to fill evidence gaps, though processes for eliciting expert assessments have varied across NICE’s guidance-making programs.32 In the context of COVID-19, quantitative research methods have provided highly valuable insights as outlined above, but key questions remain in areas where empirical data is not yet available, or where data collection has not kept pace with the demand for evidence to guide policy decisions. To reference a few examples, qualitative research designs have been used to study vaccine hesitancy among communities of color,33,34 as well as the impact of COVID-19 on family caregivers.35
True mixed-methods research that borrows from both qualitative and quantitative research methods can serve as a solution to this problem: qualitative methods can be used to generate hypotheses and develop research questions, while quantitative methods can be used to confirm hypotheses and further solidify qualitative research findings by using a larger, more diverse sample.
Expert elicitation methods such as the Delphi method or the Sheffield Elicitation Framework (SHELF) offer structured ways to combine qualitative and quantitative approaches to address gaps in empirical knowledge.36-38 Typically, both methods systematically elicit expert knowledge about uncertain parameters and quantities through interviews with, or questionnaires administered to, key subject matter experts. Results from individual assessments are then aggregated in the form of subjective probability distributions. Both methods follow strict protocols to avoid respondent bias and to ensure high levels of objectivity in eliciting key parameters in areas devoid of empirical evidence.36
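The aggregation step can be illustrated with a simple equal-weight linear opinion pool (a minimal sketch under strong simplifying assumptions of our own: triangular expert distributions and equal weights, neither of which the Delphi method or SHELF prescribes):

```python
import random
import statistics

def pool_experts(expert_params, n_draws=10_000, seed=0):
    """Equal-weight linear opinion pool over expert judgments.

    Each expert supplies (low, mode, high) bounds for an uncertain
    quantity, treated here as a triangular distribution. Sampling draws
    an expert at random, then a value from that expert's distribution,
    which is equivalent to averaging the experts' distributions.
    Returns the pooled mean and a 5th-95th percentile interval."""
    rng = random.Random(seed)
    draws = []
    for _ in range(n_draws):
        lo, mode, hi = rng.choice(expert_params)  # pick an expert at random
        draws.append(rng.triangular(lo, hi, mode))
    cuts = statistics.quantiles(draws, n=100)  # 99 percentile cut points
    return statistics.mean(draws), (cuts[4], cuts[94])

# Hypothetical: two experts' judgments of a 2-year survival probability
mean, (p5, p95) = pool_experts([(0.10, 0.30, 0.50), (0.20, 0.40, 0.70)])
```

The resulting pooled distribution can then feed directly into a decision model as the prior for a parameter that lacks empirical data.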
In a recent application of the Delphi method, Graf and colleagues systematically elicited how US payers evaluate trade-offs between costs, as well as both clinical and real-world evidence of efficacy and safety in the context of a rare disease, hemophilia A.39 The study concluded that US payers strongly preferred treatments with known efficacy and safety, and well-understood costs over newer treatments with uncertain long-term effects for patients with hemophilia A, providing important insights into payer decision-making practices in areas with limited empirical evidence and high levels of uncertainty.
Using the SHELF method, Cope and colleagues systematically elicited survival rates and related uncertainty for children and young adults with relapsed or refractory acute lymphoblastic leukemia, 2 to 5 years after receiving chimeric antigen receptor T-cell (CAR-T) therapy.40 Individual expert estimates were combined with observed data using time-to-event parametric models that accounted for experts’ uncertainty, producing an overall distribution of survival over time. The authors found that a combination of clinical trial data and expert judgments substantially improved the precision of extrapolated survival curves when compared with naïve extrapolation methods based solely on clinical trial data.
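A stylized version of this combination, assuming constant (exponential) hazards throughout, can be sketched as follows. This is our own simplification: Cope and colleagues used more flexible parametric time-to-event models and weighted the two sources by their uncertainty, whereas the fixed blend weight here is purely illustrative.

```python
import math

def trial_hazard(events, person_years):
    """Constant (exponential) hazard rate estimated from trial follow-up."""
    return events / person_years

def expert_hazard(survival_at_t, t_years):
    """Hazard implied by an expert's estimate of survival at t years,
    inverting S(t) = exp(-h * t)."""
    return -math.log(survival_at_t) / t_years

def blended_survival(t, h_trial, h_expert, w_trial=0.5):
    """Extrapolated survival at time t from a weighted blend of the
    trial-based and expert-implied hazards. The 50/50 default weight
    is an assumption made for illustration only."""
    h = w_trial * h_trial + (1 - w_trial) * h_expert
    return math.exp(-h * t)

# Hypothetical inputs: 30 deaths over 200 person-years of trial follow-up,
# and an expert who judges 5-year survival to be 50%
h_t = trial_hazard(events=30, person_years=200.0)
h_e = expert_hazard(survival_at_t=0.50, t_years=5.0)
s_5yr = blended_survival(5.0, h_t, h_e)
```

Even this crude blend shows the mechanism: expert judgment anchors the tail of the curve where trial follow-up has run out, while the trial data anchor the early portion.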
However, while expert elicitation methods combine both qualitative and quantitative research methods to provide valuable information in areas that lack long-term, empirical evidence, some limitations persist. Most importantly, small sample sizes often prevent the generalizability of results beyond the expert population consulted.37 Additionally, any assessments that rely heavily on individual judgments can be subject to various sources of bias, including anchoring and adjustment bias (eg, providing a specific point estimate or reference in the question, thereby influencing experts), the availability heuristic (eg, experts might overstate the likelihood of an event if they recently experienced it), range–frequency compromise (eg, when provided with a range of options, experts might apportion responses evenly), and overconfidence (eg, experts might provide confidence intervals that are too narrow).36
In an example mixed-methods study, Yelverton and colleagues interviewed both patients with HIV and physicians about the use of antiretroviral therapy (ART), and then conducted quantitative ranking exercises to develop priority areas for shared decision-making between patients and their providers.41 Similarly, Barber and colleagues conducted interviews with pediatric hypodontia patients and their parents to inform the development of key treatment attributes for a discrete choice experiment.42 Specifically, interviews were used to assess what factors both patients and their caregivers value in potential treatment alternatives, and how they evaluate trade-offs between these attributes.
Conclusion
Many health research challenges require a combination of quantitative and qualitative research approaches. The importance of quantitative approaches to generate empirical data and to detect patterns and trends cannot be overstated. Yet, a quantitative approach alone has its limitations. Quantitative approaches as we have described here can only be strengthened by a mixed-methods approach that serves as a continual feedback loop on the data generated and offers new avenues for exploration. A mixed-methods approach allows for a continued reflection on where there could be gaps in the data collected.
References
1. May SG, Roach M, Murphy R. The importance of qualitative research in enhancing understanding of treatment decisions, outcomes, and value assessment. J Clin Pathways. 2021;7(4):31-33.
2. Schneider KA, Ngwa GA, Schwehm M, Eichner L, Eichner M. The COVID-19 pandemic preparedness simulation tool: CovidSIM. BMC Infect Dis. 2020;20(1):859. doi:10.1186/s12879-020-05566-7
3. Cutler DM, Summers LH. The COVID-19 pandemic and the $16 trillion virus. JAMA. 2020;324(15):1495-1496. doi:10.1001/jama.2020.19759
4. Cerda AA, García LY. Willingness to pay for a COVID-19 vaccine. Appl Health Econ Health Policy. 2021;19(3):343-351.
5. Campbell JD, Whittington MD, Rind DM, Pearson SD. Alternative pricing models for remdesivir and other potential treatments for COVID-19. Updated November 10, 2020. Accessed July 9, 2021. http://icer.org/wp-content/uploads/2020/11/ICER-COVID_Updated_Report_11102020.pdf
6. McCombs A, Kadelka C. A model-based evaluation of the efficacy of COVID-19 social distancing, testing and hospital triage policies. PLoS Comput Biol. 2020;16(10):e1008388. doi:10.1371/journal.pcbi.1008388
7. Borchering RK, Viboud C, Howerton E, et al. Modeling of future COVID-19 cases, hospitalizations, and deaths, by vaccination rates and nonpharmaceutical intervention scenarios - United States, April-September 2021. MMWR Morb Mortal Wkly Rep. 2021;70(19):719-724. doi:10.15585/mmwr.mm7019e3
8. IHME COVID-19 Forecasting Team. Modeling COVID-19 scenarios for the United States. Nat Med. 2021;27(1):94-105. doi:10.1038/s41591-020-1132-9. Published correction appears in Nat Med. 2020;26(12):1950
9. Motheral B, Brooks J, Clark MA, et al. A checklist for retrospective database studies--report of the ISPOR Task Force on Retrospective Databases. Value Health. 2003;6(2):90-97. doi:10.1046/j.1524-4733.2003.00242.x
10. Stein JD, Lum F, Lee PP, Rich WL 3rd, Coleman AL. Use of health care claims data to study patients with ophthalmologic conditions. Ophthalmology. 2014;121(5):1134-1141. doi:10.1016/j.ophtha.2013.11.038
11. IBM. IBM MarketScan Research Databases. Accessed July 29, 2021. https://www.ibm.com/products/marketscan-research-databases/databases
12. University of Michigan Institute for Social Research. The Health and Retirement Study. Accessed July 19, 2021. https://hrs.isr.umich.edu/about
13. Pan L, Fergusson D, Schweitzer I, Hebert PC. Ensuring high accuracy of data abstracted from patient charts: the use of a standardized medical record as a training tool. J Clin Epidemiol. 2005;58(9):918-923. doi:10.1016/j.jclinepi.2005.02.004
14. Tofthagen C. Threats to validity in retrospective studies. J Adv Pract Oncol. 2012;3(3):181-183.
15. Scott JW, Schwartz TA, Dimick JB. Practical guide to health policy evaluation using observational data. JAMA Surg. 2020;155(4):353-354. doi:10.1001/jamasurg.2019.4398
16. Lyu W, Wehby GL. Community use of face masks and COVID-19: evidence from a natural experiment of state mandates in the US. Health Aff (Millwood). 2020;39(8):1419-1425. doi:10.1377/hlthaff.2020.00818
17. Tolpin HG. Overview of CBA and CEA methods. Drug Inf J. 1988;22(3):281-289.
18. Curtis D. Patient experience - the ingredient missing from cost-effectiveness calculations. Patient Prefer Adherence. 2011;5:251-254. doi:10.2147/PPA.S20243
19. Cassidy R, Singh NS, Schiratti PR, et al. Mathematical modelling for health systems research: a systematic review of system dynamics and agent-based models. BMC Health Serv Res. 2019;19(1):845. doi:10.1186/s12913-019-4627-7
20. Jahn B, Sroczynski G, Bicher M, et al. Targeted COVID-19 vaccination (TAV-COVID) considering limited vaccination capacities-an agent-based modeling evaluation. Vaccines (Basel). 2021;9(5):434. doi:10.3390/vaccines9050434
21. Caro JJ, Möller J. Advantages and disadvantages of discrete-event simulation for health economic analyses. Expert Rev Pharmacoecon Outcomes Res. 2016;16(3):327-329. doi:10.1586/14737167.2016.1165608
22. Eldabi T, Irani Z, Paul RJ, Love PED. Quantitative and qualitative decision‐making methods in simulation modelling. Manage Decision. 2002;40(1):64-73. doi:10.1108/00251740210413370
23. National Institute for Health and Care Excellence. COVID-19 rapid guideline: managing COVID-19. Updated August 10, 2021. Accessed August 12, 2021. https://www.nice.org.uk/guidance/ng191
24. Institute for Clinical and Economic Review (ICER). ICER provides first update to pricing models for remdesivir as a treatment for COVID-19. Published June 24, 2020. Accessed July 29, 2021. https://icer.org/news-insights/press-releases/updated_icer-covid_models_june_24
25. Angelis A, Lange A, Kanavos P. Using health technology assessment to assess the value of new medicines: results of a systematic review and expert consultation across eight European countries. Eur J Health Econ. 2018;19(1):123-152. doi:10.1007/s10198-017-0871-0
26. Morrison T. How Simulation Can Transform Regulatory Pathways webinar. August 9, 2018. Accessed July 29, 2021. https://www.fda.gov/science-research/about-science-research-fda/how-simulation-can-transform-regulatory-pathways
27. Zhang X, Yang Y, Grimstein M, et al. Application of PBPK modeling and simulation for regulatory decision making and its impact on US prescribing information: an update on the 2018-2019 submissions to the US FDA’s Office of Clinical Pharmacology. J Clin Pharmacol. 2020;60 Suppl 1:S160-S178. doi:10.1002/jcph.1767
28. Henshall C, Schuller T; HTAi Policy Forum. Health technology assessment, value-based decision making, and innovation. Int J Technol Assess Health Care. 2013;29(4):353-359. doi:10.1017/S0266462313000378
29. Sorenson C, Drummond M, Kanavos P. Ensuring value for money in health care: the role of health technology assessment in the European Union. Accessed July 29, 2021. https://www.euro.who.int/__data/assets/pdf_file/0011/98291/E91271.pdf
30. Ollendorf DA, Chapman RH, Pearson SD. Evaluating and valuing drugs for rare conditions: no easy answers. Value Health. 2018;21(5):547-552. doi:10.1016/j.jval.2018.01.008
31. Chapman R, Kumar V, Samur S, Zaim R, Segel C, Pearson S. Value assessment methods and pricing recommendations for potential cures: a technical brief. Published August 6, 2019. Accessed July 29, 2021. https://icer.org/wp-content/uploads/2020/10/Valuing-a-Cure-Technical-Brief.pdf
32. Peel A, Jenks M, Choudhury M, et al. Use of expert judgement across NICE guidance-making programmes: a review of current processes and suitability of existing tools to support the use of expert elicitation. Appl Health Econ Health Policy. 2018;16(6):819-836. doi:10.1007/s40258-018-0415-5. Published correction appears in Appl Health Econ Health Policy. 2019;17(2):263-264
33. Nephew LD. Systemic racism and overcoming my COVID-19 vaccine hesitancy. EClinicalMedicine. 2021;32:100713. doi:10.1016/j.eclinm.2020.100713
34. Strully KW, Harrison TM, Pardo TA, Carleo-Evangelist J. Strategies to address COVID-19 vaccine hesitancy and mitigate health disparities in minority populations. Front Public Health. 2021;9:645268. doi:10.3389/fpubh.2021.645268
35. Kilaberia T, Bell J, Bettega K, Mongoven J, Kelly K, Young H. Impact of the COVID-19 pandemic on family caregivers. Innov Aging. 2020;4(Suppl 1):950. doi:10.1093/geroni/igaa057.3475
36. O’Hagan A. Expert knowledge elicitation: subjective but scientific. Am Stat. 2019;73(suppl 1):69-81. doi:10.1080/00031305.2018.1518265
37. Khodyakov D, Grant S, Barber CE, Marshall DA, Esdaile JM, Lacaille D. Acceptability of an online modified Delphi panel approach for developing health services performance measures: results from 3 panels on arthritis research. J Eval Clin Pract. 2017;23(2):354-360. doi:10.1111/jep.12623
38. Grigore B, Peters J, Hyde C, Stein K. A comparison of two methods for expert elicitation in health technology assessments. BMC Med Res Methodol. 2016;16:85. doi:10.1186/s12874-016-0186-3
39. Graf M, Tuly R, Harley C, Pednekar P, Batt K. Understanding the evolution of coverage policies for prophylaxis treatments of hemophilia A without inhibitors: a payer Delphi panel [published online ahead of print, 2021 Apr 12]. J Manag Care Spec Pharm. 2021;27(8):996-1008. doi:10.18553/jmcp.2021.20600
40. Cope S, Ayers D, Zhang J, Batt K, Jansen JP. Integrating expert opinion with clinical trial data to extrapolate long-term survival: a case study of CAR-T therapy for children and young adults with relapsed or refractory acute lymphoblastic leukemia. BMC Med Res Methodol. 2019;19(1):182. doi:10.1186/s12874-019-0823-8
41. Yelverton V, Ostermann J, Hobbie A, Madut D, Thielman N. A mixed methods approach to understanding antiretroviral treatment preferences: what do patients really want? AIDS Patient Care STDS. 2018;32(9):340-348. doi:10.1089/apc.2018.0099
42. Barber S, Bekker H, Marti J, Pavitt S, Khambay B, Meads D. Development of a discrete-choice experiment (DCE) to elicit adolescent and parent preferences for hypodontia treatment. Patient. 2019;12(1):137-148. doi:10.1007/s40271-018-0338-0