
Getting physical

I had meant to move on from this issue about the complexity of biomechanics and the quality of the research questions we ask … but then last night, in her comment, Quin drew my attention to a series of articles entitled “Biomedical research – increasing value, reducing waste” that were published in the Lancet in June (these are free to download but you need to register first – also free). They make fascinating reading. If you think I was a bit grumpy and cynical in what I wrote the week before last then you should have a look at what these guys are saying! (The articles are a bit heavy going; an easier and more entertaining alternative is Ben Goldacre’s book Bad Science, which has been around long enough that you can pick it up for 1p on Amazon and just pay the postage.)

The series marks the twentieth anniversary of an article, The scandal of poor medical research, written by the statistician Doug Altman in the BMJ, which was perhaps the first public recognition of the poor quality of much clinical research, with the tag line “we need less research, better research and research done for the right reasons”. In another commemorative piece, Medical research still a scandal, Richard Smith, who was editor of the BMJ at the time, laments how little has changed despite, perhaps, a wider awareness of the problems.

One of the responses to Altman’s original article which appealed to me has been given the title Theory must drive experiment. In it the author (a JA Morris from the Royal Lancaster Infirmary) attributes the problem of poorly formulated research questions to a failure of clinical scientists to develop an underlying theoretical basis for their experimental observations. This has always puzzled me as well and, over the years, I’ve come to the conclusion that, as someone who trained originally as a physicist, I’ve got a markedly different view of the world to many of my colleagues who trained in medicine or health sciences.

Arthur Eddington, “… do not put too much confidence in experimental results until they have been confirmed by theory”

As a physicist I expect to understand the results of my experiments and to be able to align them with an underlying theory. An understanding of that underlying theory then generates new research questions. My knowledge continues to develop by the continued construction and refinement of the underlying theory (I’m sure there is a posh name for this in the philosophy of science). Taken to an extreme this results in Eddington’s warning to the physicist not to “put too much confidence in experimental results until they have been confirmed by theory”. Whilst at face value this sounds like an injunction against publishing experimental data, it is actually a plea for careful consideration of the results in the light of the underlying theoretical framework and a refinement of that framework if necessary.

Ernest Rutherford, “When it comes to science there is physics and there is stamp collecting”

I wouldn’t claim this as a unique skill of the physicist. Whilst over a hundred years ago Rutherford could quite reasonably(?) claim that “all science is either physics or stamp collecting”, the world has moved on. The rise of anatomy, physiology and particularly biochemistry since that time means that there are now underlying theoretical frameworks that we can use to explain the results of clinical and health sciences research.

We very rarely do though, and I think this is partly attributable to the education of doctors and allied health professionals being rooted in an earlier era. It wasn’t so very long ago that most results of clinical research were, effectively, beyond explanation. There was no point trying to fit those results into any underlying theoretical framework because the basic principles of that framework had not been established. Knowledge in the clinical domain was essentially phenomenological – a knowledge of what happens rather than why it happens. Education then becomes a matter of teaching the facts rather than the underlying principles that link those facts. Of course, if you don’t have an underlying theory to work from then you are going to find it much more difficult to generate sensible questions to drive the next generation of research. This is exactly the point that Morris was making, and it links to my post from last week.

As we move out of that era, though, we’ve got to put a much heavier emphasis on developing the underlying theoretical basis of our subject and using this to drive our research questions. Which leads me to my highlight of the ESMAC-SIAMOC conference, which was Adam Shortland’s keynote talk, “The neuromuscular prerequisites of normal walking and the early loss of ambulation in cerebral palsy”. In it he reviewed what is now known about neurophysiological development and laid out a conceptual framework that explains much of what we observe in cerebral palsy and also provides a platform to generate new research questions … but then if you look at his CV you’ll see that he trained first as a physicist as well!


Publishing one paper to point out faults in another

This post is prompted by a discussion we had internally about a paper co-authored by one of my colleagues at the University (Dall et al. 2013). It was written as a response to an earlier paper (Tudor-Locke et al. 2011) based on data from the National Health and Nutrition Examination Survey (NHANES) for 2005-2006, which reported how many steps people took in each minute epoch as measured by an activity monitor. The authors assumed this was a measure of cadence and came to the conclusion that:

Self-selected walking at 100+ steps/min was a rare phenomenon in this large free-living sample of the U.S. population, but study participants did accumulate 30 min/day at cadences of 60+ steps/min.

This is simply wrong. Whether the number of steps taken during any minute represents cadence will depend on whether the person has been walking for that full minute. Take a person who is recorded as taking ten steps in one minute. This could come from someone who has walking difficulties and walked continuously for a minute but took only ten steps, at a true cadence of 10 steps per minute. In this case steps per minute epoch is equal to cadence. Equally it could come from someone who had no difficulty walking and who walked ten steps at a cadence of 120 steps per minute but only for five seconds within the minute. In this case, which will be far more common than the first, cadence and steps per minute epoch are quite different. Recordings of 100 steps per whole minute are rare not because people walk with a slow cadence but because it is actually very rare that we walk continuously for a whole minute (Orendurff et al. 2008). If you want to define a threshold value for cadence, as was the original intention of Tudor-Locke et al., then you actually have to find some way of recording true cadence and not the number of steps per whole minute.
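To make the arithmetic explicit, here is a minimal sketch in Python (hypothetical numbers only – this is not the NHANES processing or the activPAL algorithm) showing how the same epoch count of ten steps can arise from two very different true cadences.

```python
# Minimal sketch: steps registered in a 60 s epoch for a walking bout
# of a given true cadence and duration (hypothetical numbers only).

def steps_in_epoch(cadence_steps_per_min, walking_seconds, epoch_seconds=60):
    """Steps counted in one epoch; any walking beyond the epoch is ignored."""
    walking_time = min(walking_seconds, epoch_seconds)
    return cadence_steps_per_min * walking_time / 60.0

# Slow walker, walking continuously for the whole minute:
print(steps_in_epoch(10, 60))   # 10.0 steps – epoch count equals cadence

# Brisk walker, but only a five-second bout within the minute:
print(steps_in_epoch(120, 5))   # 10.0 steps – epoch count of 10, true cadence of 120
```

The epoch count only equals cadence in the first case; in the second, treating it as cadence understates the true stepping rate by a factor of twelve.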

I think the issues are clear cut so far, but then what should our response be? Malcolm and his colleagues had access to data collected with their activPAL device that would allow both true cadence and total number of steps per minute (step accumulation, as they call it) to be calculated, and they demonstrated convincingly, but rather unsurprisingly, that the two are quite different. The published paper (Dall et al. 2013) makes a very interesting read – but should we have to go to this effort? Are there more effective ways of simply telling people they are wrong?

Writing a letter to a journal’s editor is one option, but it always feels to me as if there is a time window on this – that the letter should really be submitted fairly soon after an article has been published. I’m not very good at keeping up with the current literature, but when I’m working on a particular topic I will often read the relevant articles, both recent and not so recent, quite critically. Working like this, it is often some time after publication that I read things that concern me. A combination of my own inertia and the feeling that I am too late prevents me from doing any more about it.

Maybe I’m wrong in this – maybe we should feel free to use this route at any time that a mistake becomes apparent. Certainly this route ensures that the corrective letter is recorded in the same journal and under the same title as the original article, and modern databases are becoming better at flagging this. A disadvantage of the approach of Dall et al. is that the new article is in a different journal and published under a completely different title. In this case it has been published in a more technical journal (Medicine and Science in Sports and Exercise) which is unlikely to be read (or even searched) by readers of the original article (in the journal Preventive Medicine).

This wouldn’t be a problem if this were an isolated incident, but biomechanics is a complex subject and I suspect that there are many more published mistakes and misconceptions than anyone in the field would want to acknowledge. In the worst case (again more common than we’d want to admit) published mistakes and misconceptions are adopted uncritically by other teams and, before you know it, what started off as an erroneous paper becomes first a series of erroneous papers and then a tried and trusted method. I’d see the use of the CMC (Kadaba et al. 1989) as a measure of the repeatability of gait data as an example – buy my book and read the appendix if you want to know more!

The situation is exacerbated by the number of people who are involved in biomechanics as a secondary discipline. Some readers (and occasionally authors!) are not in a position to judge whether a method is valid or not. Does this increase the onus on those of us within the community who are aware of problems with specific papers to be more proactive in drawing people’s attention to them?


Dall, P. M., McCrorie, P. R., Granat, M. H., & Stansfield, B. W. (2013). Step Accumulation per Minute Epoch Is Not the Same as Cadence for Free-Living Adults. Med Sci Sports Exerc.

Kadaba, M. P., Ramakrishnan, H. K., Wootten, M. E., Gainey, J., Gorton, G., & Cochran, G. V. (1989). Repeatability of kinematic, kinetic, and electromyographic data in normal adult gait. J Orthop Res, 7(6), 849-860.

Orendurff, M. S., Schoen, J. A., Bernatz, G. C., Segal, A. D., & Klute, G. K. (2008). How humans walk: bout duration, steps per bout, and rest duration. J Rehabil Res Dev, 45(7), 1077-1089.

Tudor-Locke, C., Camhi, S. M., Leonardi, C., Johnson, W. D., Katzmarzyk, P. T., Earnest, C. P., & Church, T. S. (2011). Patterns of adult stepping cadence in the 2005-2006 NHANES. Prev Med, 53, 178-181.