Could you shed some light on how runners get national rankings in the ultra trail running world? Is there a set of races these rankings are based on? If someone is cited as “nationally ranked” what does that really mean? Just curious as to how it works and can’t find a straight answer on the web.

There’s certainly no unified trail ultra ranking system, especially not one maintained during the course of the year. The closest thing would be Ultrarunning Magazine’s annual rankings: (2007 results). The voting is done by a panel of ultrarunners, but it tends to be skewed in certain directions, as the voters are designated by the magazine’s editors (I believe).

Ultrarunning Magazine also annually publishes a long list of the fastest times at particular distances (50k, 50 miles, 100k, and 100 miles, I think). The list includes the top performance by every runner who breaks a set time standard at the given distance. I guess anyone on that list would be “nationally ranked,” but that could be disingenuous for two reasons. First, one could rank high on the 50k list by running a flat, loop road course, while many top ultrarunners don’t run such races. Second, the list is very deep, going maybe two or three hundred spots for a given distance. Someone ranked 268th for their gender at 100 miles in a given year is certainly stretching the truth by calling themselves nationally ranked, given the relatively small number of folks who run a 100 in a given year. It would be even worse if their time came from a fast 100 like Rocky Raccoon.

There you have some off-the-cuff thoughts on the subject. Surely, some will feel that ultrarunning shouldn’t involve rankings and whatnot, but as other ultrarunners (including some of the top runners) are interested in such things, why not discuss it? Here are a few questions to ponder:

Should there be any ultrarunner rankings?

What group is best qualified to develop or implement rankings (i.e., Ultrarunning Mag, USATF, an individual’s website with internet voting)? What are the limitations/problems/advantages of each group?

Would a race series or point system for races be an appropriate method on which to base such rankings?

Is there any point in, or possibility of, a rolling ranking as opposed to a year-end ranking?

If there were a definitive ranking system, how many runners deep should it go?

About how many elites, national-level, or regional-level ultrarunners do you think there are?

Alright, go for a run, get thinking, and then share your thoughts on iRunFar.com!

Love this subject, and love the mathematical possibilities of figuring it out, such as building a weighted scoring system for each race that accounts for course elevation, altitude, temperature, and miscellaneous conditions such as rain, snow, mud, river crossings, etc., and then applying some sort of weight to the race finishing times. There should also be a minimum number of races run in the year to make the list for each of the different ultra classes (50k, 50M, 100M, best overall). Great topic!
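To make the idea above concrete, here is a minimal sketch of what such a weighted scoring system might look like. Everything here is an illustrative assumption: the weights, the "ideal" temperature, and the function names (`difficulty_factor`, `race_points`) are invented for this example, not an established formula.

```python
# Hypothetical sketch of a weighted race-scoring system.
# All weights below are made-up assumptions for illustration only.

def difficulty_factor(elevation_gain_ft, avg_altitude_ft, avg_temp_f, conditions):
    """Combine course attributes into a single difficulty multiplier (1.0 = flat and easy)."""
    factor = 1.0
    factor += elevation_gain_ft / 50_000                 # e.g., 10,000 ft of gain adds 0.2
    factor += max(avg_altitude_ft - 5_000, 0) / 40_000   # penalty for altitude above 5,000 ft
    factor += abs(avg_temp_f - 55) / 200                 # deviation from an assumed "ideal" 55°F
    factor += 0.05 * len(conditions)                     # flat bump per condition (mud, snow, ...)
    return factor

def race_points(finish_time_hours, winning_time_hours, difficulty):
    """Score a performance: closeness to the winning time, scaled by course difficulty."""
    return round(100 * (winning_time_hours / finish_time_hours) * difficulty, 1)

# A mountainous 100 with snow and river crossings vs. a flat road 50k:
mountain_100 = difficulty_factor(33_000, 11_000, 50, ["snow", "river crossings"])
road_50k = difficulty_factor(500, 500, 55, [])
print(race_points(28.0, 24.0, mountain_100))  # mid-pack mountain finish scores well
print(race_points(3.6, 3.0, road_50k))        # a faster relative road finish scores lower
```

The point of the sketch is that a slower absolute finish on a hard mountain course can out-score a faster relative finish on a flat course once difficulty is weighted in, which addresses the "flat, loop road course" problem mentioned earlier. Picking the actual weights is exactly the part best argued over beers.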

I'm in on the beers and pizzas. I'm working on the Speedgoat of the Year. I've compiled the 100 mile winners already, and there is no overall standout across multiple races, just a few incredibly, ridiculously fast performances. A weighted scoring system would be awesome, but I won't be the one figuring that one out. We should leave that one up to the Hardrock committee; they are very good at that stuff. I think we need more than a few beers, perhaps 54 days of beers, and we could figure it out! Cheers!

I think it makes for good fodder (i.e., discussion and pondering over a few beers... again, the beer thread), and though subjective, it generates some interest. Seems anyone who follows ultrarunning could throw out a list of names that would be fairly similar to other lists, with standout performances for a particular year pushing someone up near the top. Interesting how the elite marathoners now have essentially a grand slam of the big races, with the winner getting a nice chunk of change. Aside from seeing who puts together a dominant year or performance, though, from a fan's perspective it's great fun to see a deep field go at it on the same course, same day (as the same trail race from year to year with temps and conditions can be so different).

I'm with Karl on the beers and pizzas, and also on the fact that I'm not the guy to figure out any stinkin' algorithm. We can leave that to the rocket scientists at Los Alamos. That said, a weighted system would be great with, perhaps, a top-5 of the weighted events at each distance based on the previous year's times vs. depth of field. A quick look at a typical year (that is, a year when WS was not canceled) would probably have the following as the top "weighted" runs (with some editorial variations for my pleasure):

50K – Way Too Cool (who runs more than one of these a year anyway!)

50M – Zane Gray, American River, White River, San Juan Solstice, JFK

100K – Miwok, Waldo (not enough races to go 5 deep)

100M – Western States and Leadville in the "Runner's Division" and Hardrock and Wasatch in the "Mountain Division". And Vermont, because it's a Grand Slam race and a good one for anyone who wants to "cherry pick" a 100 mile win. Honorable mention: Angeles Crest, because it's old skool.

Fun topic!

AJW

There are several challenges that keep ultra rankings from being anything more than subjective entertainment (e.g., college football rankings). Difficulties in objectivity arise in three areas: 1) different race lengths, 2) different levels of difficulty at each length, and 3) athletes spread out over many different races. I think there's nothing wrong with issuing rankings under the current system, but their limitations must be recognized. If there is ever going to be a conclusive and objective ranking system, then certain races need to be designated to count towards a yearly championship.