Life

Meet future leaders of evidence based medicine

By:
Peter J Gill,
Helen Macdonald

In the run-up to Evidence Live 2016, we asked health science students, junior doctors, and early career researchers to write about projects or innovative ideas that they have been part of and that deal with some of the conference’s main themes (see box). Below are the top five submissions we received. Their authors will gain free places to attend the conference in Oxford between 22 and 24 June 2016.

At the Evidence Live conference, researchers, doctors, and professionals working with evidence at different stages of the healthcare chain learn about important issues in healthcare. The programme is designed to showcase the most innovative ideas, processes, and best practices that form the foundations of an evidence based approach. Several sessions focus exclusively on students and junior doctors, including a networking event and leadership showcase. Discounted tickets are still available (http://evidencelive.org/).

Improving the quality of research evidence

Aaron Dale, third year graduate entry medical student, University of Oxford, and member of the Centre for Evidence-Based Medicine Outcome Monitoring Project (COMPare)

Outcome switching is a major problem in clinical trial reporting that can distort the evidence on which clinical decisions are based. It occurs when researchers fail to report the outcomes they originally set out to measure and swap them for new outcomes that they did not initially consider. The webcomic xkcd offers the best demonstration of why this is a problem.[1]

By leaving out the fact that outcome switching has taken place, journals can make interventions appear better than they actually are, which misinforms doctors and risks considerable harm to patients.

Outcome switching is highly prevalent in clinical trials[2]; however, prevalence studies alone have not eradicated the problem. This led us to ask: how can we, as medical students, make a difference? Our hypothesis was simple: if we could hold individual journals to account for specific cases of outcome switching and engage them in open discussion, we would begin to understand why the problem still persists and, in doing so, help to correct it.

Between October and November 2015 the COMPare team (five medical students in collaboration with three researchers in evidence based medicine) monitored all trials for outcome switching in the top five general medical journals. We compared the prespecified outcomes in the trial protocol or registry entry with those reported in the journal publication. We recorded the number of missing prespecified outcomes and the number of new outcomes silently added, and wrote a letter of correction to journals for each trial containing misreported outcomes. We also published all our results, raw data, and letters to journals on our website.[3]
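The comparison step described above can be sketched as a simple set difference between the prespecified and reported outcome lists. This is an illustrative sketch only, not the COMPare team's actual code, and the outcome names used are hypothetical.

```python
# Illustrative sketch (not COMPare's actual code): compare a trial's
# prespecified outcomes with those reported in the publication.

def compare_outcomes(prespecified, reported):
    """Return outcomes missing from the report and outcomes silently added."""
    pre = set(prespecified)
    rep = set(reported)
    missing = sorted(pre - rep)  # prespecified but never reported
    added = sorted(rep - pre)    # reported but never prespecified
    return missing, added

# Hypothetical example with made-up outcome names.
missing, added = compare_outcomes(
    prespecified=["all-cause mortality", "hospital admission"],
    reported=["hospital admission", "quality of life score"],
)
print(missing)  # ['all-cause mortality']
print(added)    # ['quality of life score']
```

In practice the hard part is matching outcome wording between a registry entry and a paper, which usually requires human judgment rather than exact string comparison.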

We have already instigated change in the way The BMJ reviews clinical trials submitted for publication. The journal recently published a correction on the REEACT trial based on our letters, and they now require the original trial protocol to be submitted with the corresponding paper to help editors compare prespecified and reported outcomes.[4] We are also engaged in discussions with the New England Journal of Medicine, Annals of Internal Medicine, The Lancet, and the Journal of the American Medical Association in the hope that we can improve clinical trial reporting. You can read our blogs detailing the responses we have received from these journals on our website.

Librarians to the rescue

Transforming the communication of evidence for better health

Imagine a world where doctors have the superpower to give quick evidence based answers to patients’ concerns about treatments.

In reality, the pressure to answer these questions can be overwhelming. You scratch your head and search medical journals, working through one study after another while trying to decide what is best for your patient, unsure exactly how to do the search[1] or which database or key terms to use.[2]

With this in mind, our research team based in Malaysia decided to provide an evidence retrieval service to assist 20 primary care physicians. Over the course of a month, doctors working at a primary clinic in Kuala Lumpur could submit any clinical questions through WhatsApp or our website (evidence2u.com). We wanted to find out what types of questions doctors had about treatments and how this service could assist their clinical practice. Our team searched for studies to find the most up to date papers with the highest level of evidence to help inform the conversations doctors had with patients about treatments. We sent links to these papers back to the doctors via WhatsApp and email.

Over this period, 37 questions were submitted (we answered 34), and it took an average of 22 hours to respond to each query. Participants were asked to rate the service when they received the answer to their questions. Most said the service was satisfactory in terms of overall experience and quality of answers; only the time taken to respond was rated unsatisfactory, by 18.9% of respondents.

On the basis of this feedback, this pilot has shown us that an evidence retrieval service can have a positive impact on clinical practice. We hope the service will be expanded to include more primary care physicians and librarians in the future.

Claims made in scientific press releases need greater accountability

Transforming the communication of evidence for better health

Liam Shaw, second year PhD student, University College London

Researchers are fond of blaming journalists for poor medical reporting. Some of the blame may lie slightly closer to home, however. In December 2014, researchers at Cardiff University published a study analysing the quality of scientific press releases.[1] They found many press releases exaggerated studies’ claims, which was reflected in subsequent media reporting.

The Cardiff researchers didn't name the papers they analysed. But as Ben Goldacre pointed out in a linked editorial in The BMJ, it would be possible to use their published data to identify the “academics and institutions associated with the worst exaggerations and publish their names online.”[2] I decided to take on this challenge.

The project took longer than the spare afternoon estimated by Ben. Extracting information from the Matlab database was more complicated than I had anticipated and the Cardiff team conceded to me via email that the data format was “not ideal.”

I put all my code on GitHub and made a spreadsheet with the assessments of each paper.[3 4] One of my favourite studies was from 2011, about ultrasound treatment for leg ulcers, with a press release claiming that “laughter is the best medicine.” NHS Choices reported on the resulting misleading media coverage, but only the Cardiff study data made it clear that the press release was the source of the exaggeration.[4]

In my analysis I focused on bad press releases, but good ones do exist. Sometimes press releases even link to the scientific paper, but papers rarely link back, allowing researchers to dodge responsibility for the quality and accuracy of press releases. Linking to press releases from journal articles would embarrass scientists with bad press releases and stop them passing the buck to press officers. A culture of accountability for the claims made in press releases would reduce the risk of the public being misled into making decisions that are bad for their health.

Can a telemedicine programme improve outcomes for teenagers with inflammatory bowel disease?

Translating evidence into better quality health services

Anke Heida, PhD student at the University Medical Center Groningen, the Netherlands

Telemedicine is often seen as a cost effective way to help patients with long term chronic conditions monitor their health remotely. Despite this promise, telemedicine solutions are often implemented without a solid evidence base for their efficacy.

In the Netherlands, monitoring of teenagers with inflammatory bowel disease (IBD) is traditionally done during scheduled visits, but this is usually when most patients report full disease control. IBD care could be more efficient if imminent relapses were recognised at home and if patients were seen at times of clinical need. In our study, we hypothesised that IBD-live (a telemedicine monitoring programme) could lead to fewer disease flares and lower costs.

As part of an ongoing multicentre trial, teenagers aged 10 to 19 years with IBD were randomly assigned to either IBD-live (web based monitoring) or usual care (three monthly visits to the outpatient clinic). Teenagers assigned to IBD-live have fewer scheduled visits with their IBD team and are monitored at home using a “flarometer,” a questionnaire that records disease activity and faecal calprotectin measurements to estimate the probability of relapse.

The frequency of flarometer measurements and treatment advice depends on the preceding risk stratification. The primary outcome is time to relapse; secondary outcomes include cost effectiveness and quality of life.

At the time of writing, we have evaluated our primary outcome in 120 of 180 patients with a mean follow-up of 10 months. In this interim analysis we found no difference in time to relapse or number of relapses between web based monitoring and usual follow-up care. We expect use of IBD-live to be associated with lower costs of disease management, but what this study has shown us so far is that web based monitoring strategies such as IBD-live need to show true value before being implemented in clinical practice.

Herding orthopaedic surgeons to create fracture packs

Transforming the communication of evidence for better health

On a busy morning our fracture clinic provides orthopaedic care for up to 50 patients. During these appointments, decisions between non-operative and surgical management are made. This is a tough skill for junior doctors to learn because of the speed at which senior surgeons make these decisions.

This presents a challenge for everyone: how can we disseminate the best evidence to help manage referrals to the fracture clinic, boost training, and support shared informed decision making between our team and our patients?

Enter the Kingston Hospital Fracture Pack, which is an online guide to the emergency management of 12 common orthopaedic injuries. It contains hyperlinks to the most up to date meta-analyses and systematic reviews, alongside the expert opinion of the hospital’s orthopaedic consultants. The clinical summaries are written using accessible language and linked to open access resources to encourage patients to use them as well. For emergency department doctors, the pack helps with initial management and highlights some of the current controversies and new evidence around orthopaedic care. The pack is updated every 12 weeks by the registrars for each subspecialty and any additions to the pack are presented at our regular governance meetings before they are added.

The most revealing insight from this process came from bringing senior orthopaedic surgeons together to see how they justify their clinical decisions to one another and how those decisions compare with the evidence.

Peter J Gill, paediatric resident at The Hospital for Sick Children, University of Toronto and honorary fellow at the Centre for Evidence-Based Medicine, University of Oxford; Helen Macdonald, clinical editor