Author: Dr. Rijo Mathew Choorakuttil, Director and Founder, AMMA Center for Diagnosis and Preventive Medicine, Kochi, Kerala.
The field of radiology and imaging has advanced rapidly with improvements in technology, the ability to provide interventions, cross-disciplinary collaborations, and the use of digital media. Many of these advances are based on clinical, diagnostic, or technological research, as well as on the observations of those practising radiology and imaging on the ground. Research and patient care can, to a large extent, be considered mirror images, with improvements in one leading to improvements in the other. However, the two sides of the mirror do not always communicate with each other, as the divide between patient care and research, in their conventional forms, is widening.
There are several reasons for the widening chasm between patient care and research. Medical care aims to provide care and promote the well-being of individuals in the immediate sense, while health care research aims to produce knowledge that can be generalized to a larger population and often focuses on a future derived from the past. A diagnostician or doctor focuses on the person in front of them and is committed to improving outcomes for that particular person. A researcher, on the other hand, considers individual interests subordinate to the general interests of the community. Clinical or health care research aims for outcomes that benefit the majority (or the average) and does not necessarily focus on the outliers. For patient care, however, an outlier is still a patient who needs the best possible care and solutions. Another major factor in the divide is the increasing complexity and sophistication demanded of research. This includes the increasingly rigorous (and possibly narrower) standards that must be met, including complex statistical and technical measures (besides peer snobbery). It is not always possible for health care practitioners who deal with heavy patient loads to dedicate the time necessary to meet these complex, sophisticated standards.
Evidence-Based Medicine, Patient-Centred Medicine, and Practice-Based Evidence are some of the approaches that have sought to bridge this divide. We will explore each of these in more detail in subsequent posts; for now, we will explore how research and practice can work in synergy for better outcomes.
The generation of evidence (for example, that something works or does not work, that something is or is not associated with a condition of interest, or that one intervention is better than another) is a cornerstone of both research and patient care. The quality, reliability, validity, and strength of the evidence are important. Typically, evidence is generated through two approaches: 1) a top-down approach, where evidence is generated at higher-end academic institutes (often based on a consensus of experts) and trickles down (for example, the generation of guidelines that are disseminated for practice), and 2) a bottom-up approach, where evidence is generated at the individual level, mostly in the field, in response to local issues, and leads to a larger study that seeks to validate the findings.
The current “levels of evidence” pyramid places observational studies at the bottom and assigns them a lower level of evidence. The evidence is assigned more strength as one progresses up the pyramid, through synopses of evidence, systematic reviews, evidence-based summaries, clinical practice guidelines, and system- or computer-generated algorithms based on large data sets. The current system implies that the best evidence lies at the higher levels and that observational studies provide a lower level of evidence. Thus, if you are searching for evidence, it is worthwhile to start the search at the higher levels.
This approach, however, has a limitation. It presumes that the quality of the studies that contribute to a systematic review or meta-analysis is always high. In reality, the quality of evidence from a systematic review or meta-analysis depends on the quality of the individual studies that make up the analysis. A higher level of evidence cannot be obtained if the observational studies at the bottom of the pyramid are not methodologically strong. Thus, it makes sense to focus on ensuring that the base of the pyramid is structurally and methodologically sound.
We also need to consider how the evidence can translate into routine practice. This is an important step that is not always considered. We cannot evaluate the impact of applying the evidence if it is not applied to a larger population. There are several aspects to the application of evidence, and an understanding of the different, although interrelated, thinking styles may help.
In clinical practice, one mostly needs a fast, intuitive process (intuitive, but based on prior experience) to find an immediate solution. This approach draws heavily on prior experiences (although not always in an obvious manner), identifies familiar elements, and acts. Needless to say, the action depends heavily on the training received as well as on the understanding and assimilation of prior experiences, and it has the potential for large inter-individual differences. A course of action is initially fixed in a rapid, intuitive manner, and evidence is then sought to back it up.
A researcher, however, is used to a slower, logical style of thinking that takes facts as its foundation. The primary step is the collection, understanding, analysis, and synthesis of information before any plan is made. Decisions are meant to be evidence-based and rational.
A marriage of both approaches is ideal. Both require training, preferably from the early stages of the medical curriculum itself, if not earlier.
One of the major considerations, if not the most important, is the presentation of the evidence. Is the evidence easily understandable, easily translatable, and easily or intuitively applicable to a patient who needs care? The evidence will not be applied if a busy practitioner has to spend hours trying to understand it, if it is hidden among layers of mathematics and formulas such that the practitioner has to take additional statistics courses just to understand what the results mean, or if the practitioner has to perform several complex measurements or calculations to know whether the evidence applies to the patient.
How does one bridge this?
For a practitioner, there are several barriers to the integration of evidence or best research into practice. These include, but are not limited to:
- There are so many research journals and research articles. How do I find the time, in between my clinical practice, to find the appropriate articles?
- Many journals and articles are accessible only on payment of a fee. How do I know whether the ones I pay for are actually useful? I have to make a decision based on an abstract written by the authors. How reliable is that?
- The medical curriculum does not place a premium on training in research methods, research ethics, or the critical evaluation of research studies, including a basic understanding of statistical methods. No dedicated framework for research is provided other than the mandatory dissertation experience. Training on newer advances in clinical applications abounds; however, training that updates practitioners on newer advances in research methods is rare.
- Most research questions are based on experiences in higher-end institutes and do not always reflect the concerns or needs of primary or secondary level care providers and their patients. Can I afford to spend time poring over countless studies trying to find the evidence that can help me? And when I find the evidence, is it presented in a manner that I can use?
- Forums, journals, and conferences that focus on sharing information about crises related to individual patients, or on observational studies, are dwindling as the focus shifts to randomized clinical trials, systematic reviews, and meta-analyses. These higher levels of evidence cannot be generated unless the base of the pyramid is strengthened. That requires shortening the distance between “experiments” and “observations”: treating observations as experiments, and improving documentation at the individual level. Pragmatic single-patient trials, or n-of-1 studies, are equally valuable, and they can be done by any practitioner.
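To make the n-of-1 idea concrete, here is a minimal sketch of how a single-patient trial might be analysed. All of the data, labels, and scores below are hypothetical and invented for illustration: a patient alternates between treatments A and B across six periods, and an exact permutation test asks whether the observed difference in mean outcome scores could plausibly have arisen by chance.

```python
# Illustrative n-of-1 (single-patient) trial analysis.
# Hypothetical pain scores (0-10, lower is better) from six
# alternating treatment blocks; all values are invented.
import itertools
import statistics

periods = ["A", "B", "A", "B", "A", "B"]   # alternating treatment blocks
scores = [6.0, 4.0, 7.0, 3.5, 6.5, 4.5]    # hypothetical outcome per block

def mean_difference(labels, values):
    """Mean outcome under A minus mean outcome under B."""
    a = [v for lab, v in zip(labels, values) if lab == "A"]
    b = [v for lab, v in zip(labels, values) if lab == "B"]
    return statistics.mean(a) - statistics.mean(b)

observed = mean_difference(periods, scores)

# Exact permutation test: relabel the six blocks in every way that
# keeps three A-blocks and three B-blocks, and count how often the
# mean difference is at least as extreme as the one observed.
count = 0
total = 0
for combo in itertools.combinations(range(len(scores)), 3):
    labels = ["A" if i in combo else "B" for i in range(len(scores))]
    total += 1
    if abs(mean_difference(labels, scores)) >= abs(observed):
        count += 1

p_value = count / total
print(f"observed mean difference (A - B): {observed:.2f}")  # 2.50
print(f"permutation p-value: {p_value:.3f}")                # 0.100
```

With only six blocks the test has limited power, which is why real n-of-1 designs favour more alternations; the point here is only that the analysis is simple enough for any practitioner to run. Such trials suit stable, chronic conditions and treatments whose effects are reversible between periods.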
Moving forward, the key concerns are:
- How do we generate locally relevant evidence that can answer locally relevant questions and in the present?
- How do we present evidence in a manner that can be understood and applied by everyone?
- How do we train practitioners in the basic steps of research methods? We do not need complex solutions for most problems, so training in the basic steps is, by itself, very useful.
- How do we share our experiences, in an open-access manner, with others in the field, so that our experience and knowledge can be refined further through feedback and collaborative interaction?
Research cannot integrate with practice if these concerns are not addressed. A top-down approach is only as strong as the work in the field, especially if one is looking for research and clinical care to have a larger impact on community or population health.