Irrelevant, misdirected, inappropriate, or unnecessary. Reading the list of contents in some lesser known journals, or the abstracts at a conference, you wonder what some studies really add. Sir Iain Chalmers (The Lind Initiative), who opened the Society of Primary Care conference in Bristol, called it waste. He said that we need to focus on uncertainty about the effects of treatment, and he suggested a strategy that begins with a systematic review; if no one knows the answer, then we need a trial. Frustrated by our failure to address important uncertainties, he wondered whose priorities research addressed. Many are unsound: one purpose is to decorate CVs; “money talks,” so research follows funding; and he also spoke about “the divisive effect of reductionist snobbery.” But ask patients what they consider important priorities and you may get different answers. He encouraged us to consider the substantial waste in medical research, to recognise that all research should look to clarify uncertainties (and that patients have died because this hasn’t been done), that evaluative research should become part of treatment, and that we need to engage with the public and patients.
There is uncertainty in the prognosis of many cancers. Willie Hamilton (Peninsula Medical School) pointed out the importance of prognostic indicators in diseases like prostate cancer, where the outcomes are uncertain, but suggested such indicators are less important in conditions where the outlook is rather more bleak, such as lung cancer. But prognostic studies are by definition historical, as they are based on previously published data which may now be decades old. Furthermore, they may not include the effect of treatment, and more recent treatments may change the prognosis for a patient consulting you today. Tom Fahey, leader of the research group in RCSI (Dublin), pointed out that the patients in the study under discussion, which addressed prostate cancer, had been treated, so their prognostic scores were more useful. Nevertheless, it was a useful reminder that many systematic reviews and prognostic studies may be out of date simply because of the nature of the evidence collected.
Post-publication peer review is really useful, but there are few opportunities for open dialogue. It is even less likely to occur at a conference, where most presentations are of preliminary findings, with abstracts submitted before complete analysis of a dataset. Liam Smeeth (LSHTM, London) presented the findings of a paper that had already been published in the BMJ, and the discussion afterwards was a superb illustration of the value of post-publication peer review.
Experts were able to dissect issues of coding, concerns about residual confounding, and dates of inclusion (eg how the date of diagnosis and death could occur on the same day). As an editor closely involved in the decision to publish this paper, I found it fascinating to hear this discussion and how it might help future researchers. Research doesn’t end with the last full stop; the meaning of work evolves and develops with its interpretation. We try to capture this dialogue at the BMJ through our rapid responses, but I wonder if we could create a more dynamic process.
The law of unintended consequences… I spotted a presentation bang on one of my research interests. It was a great piece of work, elegantly presented and a valuable addition to current knowledge. I asked a question, hoping to clarify and add some precision to the message. I heard afterwards that the research team interpreted my question as a major blow to the likelihood of publication in the BMJ. But I only asked the question because I was really interested. Funny how things can be (mis)interpreted.
Domhnall MacAuley is primary care editor, BMJ