Ten years ago, while doing my master's in health promotion, I applauded David St George, Consultant/Senior Lecturer in Clinical Epidemiology and Public Health Medicine at the Royal Free Hospital in London, when he said: “The time has come for complementary medicine to turn away from the need to obtain legitimacy from orthodox medicine by adopting its paradigm and research methods.”
Since the 1970s, when Professor George Engel, Ivan Illich and others questioned the reductionist biomedical model of assessing health and its focus on the treatment of symptoms, there has been a polarised debate on how evidence should be used in clinical practice. In fact, it would appear chiropractic sceptics only recently realised there was a debate, having assumed that quantitative studies and the randomised controlled trial were the only acceptable evidence to demonstrate the efficacy of a clinical intervention.
Unfortunately, the biomedical wing of the chiropractic profession shared this view, not understanding the difference between assessing the effectiveness of a drug and assessing the effectiveness of an intervention that depends on the skill of the practitioner, such as surgery or a chiropractic adjustment. The BCA presented their “plethora of evidence” to Simon Singh and made the chiropractic profession a laughing stock. Interventions relying on the skill of the practitioner are better suited to qualitative investigation, and there needs to be a review of how evidence is interpreted before the “tyranny of evidence” that David Sackett warned against disrupts clinical progress.
Sceptics like Edzard Ernst, Simon Singh, Blue Wode (the editor of EBM First) and David Colquhoun are really getting their knickers in a “twitter” because the College of Medicine believes there is a “pressing need for a wide-ranging review of the nature of evidence required to inform clinical practice, and intends to commission such a review”. This is what the College of Medicine has recently published on its website.
I would urge all interested parties to get on Twitter and follow these “sceptics” to see where their narrow understanding of “scientific evidence” comes from.
Perspectives on Evidence
From the College of Medicine: http://www.collegeofmedicine.org.uk/perspectives-evidence
Professor Andrew Miles & Dr Michael Loughlin, Science Advisory Council
The College of Medicine believes there is a pressing need for a wide-ranging review of the nature of evidence required to inform clinical practice, and intends to commission such a review.
Evidence is information that provides justification for a belief or action. It gives us a reason to adopt a belief (e.g. stomach ulcers are caused by H. pylori bacteria) or choose a certain course of action (such as prescribing inhaled steroids to patients with asthma).[3]
For information to become evidence, you must have:
• a description of the background situation – the context in which the problem occurs
• a clear explanation for regarding certain features of that context as problematic
• an inquiry with the goal of addressing the problem.[4]
Evidence can be acquired by careful observation, but the facts never speak for themselves[1]. Observation is an activity carried out by people in conjunction with other activities – questioning, categorising and hypothesising – in order to frame broad pictures of the world against which these activities acquire meaning. Information is not evidence without a theoretical structure determining which inferences are valid and which are not [1,4].
We recognise features of the world as problematic because we value certain outcomes over others. Illness is problematic because it is bad, and attempts to reduce the value-laden aspect of this claim to a purely descriptive or empirical one are unconvincing, just as statistical analyses of what is usual and unusual do not capture the meaning of the claim that high intelligence is in itself a good thing and low intelligence a bad thing.[4]
Similarly, we characterise evidence as strong or weak depending on how ‘convincing’ it is, but it is crucial to note that ‘convincing’ is not a descriptive or empirical term. It implies the exercise of judgement. It concerns not what people do believe but what they should believe – a person can ignore the evidence, but to do so is irrational. Evidence is an evaluative concept: a person lacking the ability to make evaluative commitments could have no reason to choose any belief or course of action over any other.
What this means for clinical practice
Deciding what is good evidence involves making value-judgements. Human reasoning is a value-laden process. So the concept of ‘objectivity’ as a purely empirical, neutral inquiry is not a sustainable model of good practice or the proper use of evidence. This false assumption can lead researchers – and those using scientific research – astray.[4]
There are alternative definitions of objectivity that are an essential component of sound reasoning and good practice.[4] Objectivity seen as the ability to recognise a plurality of reasonable positions on a given question, including ones based on theoretical and evaluative assumptions different to your own, is an essential skill of the good practitioner.[3] The clinical encounter is an encounter between people,[5] and a reasonable person recognises that alternative positions may be taken by colleagues and patients. He or she engages with colleagues or patients to understand those positions.[2]
The idea that there can be any one type or source of evidence that is always better than any other seems difficult to sustain, for two reasons. First, the vast range of contexts that can be reasonably interpreted as problematic for human health and well-being. Second, the diversity of values and frameworks that can come into play in identifying and characterising those problems.[1,2] We make evaluative and theoretical assumptions whether we own up to them or not. So an intellectually honest approach to evidence and reasoning will involve being as reflective and open as possible about one’s own assumptions.[3]
Without a recognition of the relevant context, theory and reasoning applied to a particular problem, EBM is merely Information-based medicine.[2] These frameworks are necessary to bridge the gap between information and conclusion.[6]
There may be such broad agreement about the undesirable nature of certain physical conditions that it is absurd to question whether they are serious problems. Even so, it is a serious error to ignore or deny the significance of the theoretical framework around that problem.[4] This can lead to a refusal to accept that there is scope for reasonable disagreement about which theoretical frameworks or values to adopt.[1] If we are to avoid dogmatism then we need to remain aware that even our most common and entrenched pictures of the world and our place within it are in principle open to revision [3] – just as we have realised that stomach ulcers are caused by bacteria, not acid burning the lining of the stomach.
The skills of acquiring and utilising information from systematic research are essential but the proper use of evidence goes beyond this. We need to cultivate reflective, well-rounded practitioners, who are able to make intellectually defensible judgements about a range of theoretical and value perspectives. They must be aware of the possibilities of different ways of characterising situations and the nature of health problems. Only then do we move beyond promoting know-how to promoting wisdom in practice.[7]
This is a complex challenge. It means good professional education requires awareness of a range of approaches.[8] This requires a fundamental shift in our way of thinking about the relationships between evidence, information from scientific research, expertise, reasoning, clinical knowledge and good practice: a new definition of medical practice as a human activity, and of the clinical encounter as a relationship between people, acknowledging the beliefs and value judgements these people bring to the consultation.[2,5,6]
The difficulty of the challenge is not a reason to shy away from it.[9] The College of Medicine will commission a full-scale review on the nature of knowledge for clinical practice. We plan to create an international group of distinguished scholars, and will publish its findings in order to engage professionals in this important debate. The review will be led by Andrew Miles, Professor of Clinical Epidemiology and Social Medicine at the University of Buckingham, and co-ordinated from that institution, entering its planning stage in January 2011. Further information on the methodologies to be employed and the expert members of the review group will be available at that time.
1. Loughlin, M (2009) The basis of medical knowledge: judgement, objectivity and the history of ideas, Journal of Evaluation in Clinical Practice 15 (6) 935-40
2. Miles, A, Loughlin, M & Polychronis, A (2008) Editorial Introduction and Commentary: “Evidence-based health care, clinical knowledge and the rise of personalised medicine”, Journal of Evaluation in Clinical Practice 14 (5) 621-49
3. Loughlin, M (2002) Ethics, Management and Mythology, Radcliffe Medical Press, Oxon.
4. Loughlin, AJ (1998) Alienation and Value-Neutrality, Ashgate, Aldershot
5. Henry, S (2010) Polanyi’s tacit knowledge and the relevance of epistemology to clinical medicine, Journal of Evaluation in Clinical Practice 16 (2) 292-7
6. Tonelli, MR (1998) The philosophical limits of evidence-based medicine, Academic Medicine 73 (12) 1234-40
7. Maxwell, N (2004) Is Science Neurotic? Imperial College Press, London
8. Miles, A (2009) On a Medicine of the Whole Person: away from scientistic reductionism and towards the embrace of the complex in clinical practice, Journal of Evaluation in Clinical Practice 15 (6) 941-9
9. Tonelli, MR (2010) The challenge of evidence in clinical medicine, Journal of Evaluation in Clinical Practice 16 (2) 384-9