Heel Strike Uses Less Energy Than Midfoot Strike

There was a time, a few years ago, when research into footstrike and biomechanics and running shoes seemed fresh and exciting. We were all eager to put old assumptions to the test and try out new ideas. These days, it feels like the bloom is off the rose. Clear, unambiguous answers about the "best" way to run have – surprise, surprise – remained elusive, and I get the sense that many people have decided they have enough information to make up their minds, and no longer need to keep asking new questions.

One of the main criticisms of studies that try to compare midfoot (and/or forefoot – the two categories are lumped together in the study, and I'll do the same for the rest of this post) strikes with rearfoot strikes is that they often involve asking people from one category to force themselves to run in an unfamiliar way for a short period of time, then drawing conclusions based on this limited sample. Alternatively, other studies assemble separate groups of midfoot and rearfoot strikers, but with such diverse characteristics that it's hard to draw any conclusions from the comparison. The strength of the new Spanish study is in addressing these concerns: they assembled two very homogeneous groups of sub-elite distance runners, with 10 midfoot strikers and 10 rearfoot strikers. The groups were very closely matched in age, running experience, training volume, BMI, VO2max, HRmax, and fitness, with average half-marathon bests of 1:10:59 and 1:10:21 (all runners in both groups had run between 1:06 and 1:14, and all had completed a half-marathon in the six weeks prior to the study).

The main finding of the study is that, at three different submaximal paces, the rearfoot strikers were more economical (i.e. burned less energy) than the midfoot strikers. At 8:47/mile, they were 5.4% more economical; at 7:26/mile they were 9.3% more economical; and at 6:26/mile they were 5.0% more economical, though the difference was no longer statistically significant at the fastest speed. This isn't a huge surprise (several other studies have reached similar conclusions), but it's an interesting data point for this particular set of subjects.

The question now is: why? Rodger Kram, the director of the University of Colorado's Locomotion Lab, suggests that, while midfoot strike allows you to store more energy in the "springs" of your ankle, it also requires more metabolic energy to generate the higher required muscle tension. The conclusion from a presentation at ASB by Allison Gruber, Brian Umberger and Joseph Hamill of UMass-Amherst was that these two factors balance each other out in the gastrocnemius, but the extra energy requirements dominate in the soleus (those are the two muscles that make up the calf).

All this is interesting stuff that could give rise to all sorts of debates and discussions – so feel free to fire away in the comments. But first, I'd like to highlight a more minor point that I find particularly intriguing. Check out this graph showing the step length and step rate of the runners:

One unfortunate thing is that they don't break out the results for the two groups – the reason being that the results were essentially identical between midfoot and rearfoot strikers. Still, I would have liked to see the actual data, with individual variation shown.

The reason this is intriguing is that there's a common set of associations that links minimalism to midfoot/forefoot striking to shorter stride/higher cadence to lower loads on the joints. And certainly, if you take a habitually shod runner and ask them to run barefoot, it makes sense that they'll shorten their stride as they adjust to the lack of cushioning. But in this particular group of runners, at least – a very fit, experienced and well-trained group – their footstrike pattern had no bearing on their cadence across the full range of paces. It's tempting to assume that these runners had enough experience that they'd already converged on an efficient cadence, independent of the effects of footstrike. Of course, that may not be true of less experienced runners.

I'd also like to note that, consistent with every single study done with runners of every ability, cadence increases as a function of speed: the faster they run, the quicker their steps. For these particular very fast and very fit runners, they hit 180 steps per minute (i.e. step rate of 3.0 Hz) at about 5 m/s, or 5:22 per mile. By (not so) incredible coincidence, that's also roughly where I hit 180 steps per minute. If you tried to force them to run at 180 when they were jogging at 9:00/mile (on the left of the graph), they'd be uncomfortable – and less efficient. Again, runners with differing levels of experience (and ability and body size) will have different cadence curves, but it will always be a curve rather than a straight horizontal line: the faster you go, the quicker your steps.
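For anyone who wants to check the unit conversions above for themselves, here's a quick sketch (purely illustrative arithmetic, not anything from the study): converting a speed in m/s to a per-mile pace, and a cadence in steps per minute to a step rate in Hz.

```python
# Illustrative unit conversions for the cadence discussion above.
# Not from the study -- just the arithmetic behind "5 m/s = 5:22/mile"
# and "180 steps/min = 3.0 Hz".

METERS_PER_MILE = 1609.344

def pace_per_mile(speed_mps: float) -> str:
    """Convert a speed in m/s to a min:sec-per-mile pace string."""
    seconds = METERS_PER_MILE / speed_mps
    minutes, sec = divmod(round(seconds), 60)
    return f"{minutes}:{sec:02d}"

def step_rate_hz(steps_per_minute: float) -> float:
    """Convert a cadence in steps/min to a step rate in Hz."""
    return steps_per_minute / 60.0

print(pace_per_mile(5.0))   # 5 m/s -> 5:22 per mile
print(step_rate_hz(180))    # 180 steps/min -> 3.0 Hz
```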