Things looked rosy for Cambridge last month. Yes, the university may have lost pole position in the world university rankings to nerd’s paradise MIT. But in taking second place, three slots clear of its great rival Oxford and two ahead of UCL, it reaffirmed its status as the UK’s leading light in higher education.

Or did it? Today’s world rankings paint a different picture. Cambridge only manages seventh place, while Oxford clambers up to joint-second. UCL is a mere 17th. And what of MIT? Much lauded for its apparently peerless technological research last month, it now gazes up longingly at first-placed California Institute of Technology.

The obvious reason for these discrepancies is the use of different ranking systems. Today’s Times Higher Education tables are a different beast to last month’s QS World University Rankings. Although nominally answering the same question, they don’t share a methodology, a data set or indeed a winner.

Rather than argue over which is right, UK universities should perhaps just be glad that the widely respected Shanghai Ranking is less well-known on these shores – none of our universities come close to ending Harvard’s 10 years at the top of that list.

So, where can prospective students turn for answers? The simple truth is that there is no such thing as a definitive table. But in fact the wildly differing outcomes of these tables make them more, not less, useful. The key is in knowing how to interpret them.

The ‘Shanghai Ranking’, for example, originated in 2003 with Chinese government backing. It was designed to provide a global benchmark against which Chinese universities – enjoying billions in state and private investment – could assess their progress. It is a remarkably stable list, relying on long-term factors such as the number of Nobel Prize-winners a university has produced and the number of articles published in the journals Nature and Science.

But this narrow focus comes with drawbacks. China's priority was for its universities to “catch up” on hard scientific research. So if you’re looking for raw research power, it’s the list for you. If you’re a humanities student, or more interested in teaching quality? Not so much.

Likewise, the THE and QS rankings have their idiosyncrasies. Between 2004 and 2008 they were one and the same, before THE broke away to form its own tables. To THE, the previous QS methodology – largely based on surveys – seemed too volatile for rankings used in everything from government policy to shaping institutions’ reputations.

“Our rankings stand up to more academic scrutiny,” says THE rankings editor Phil Baty. “We produce high-end rankings which are used by governments around the world. And we’re the only global rankings that take teaching seriously.”

Predictably, QS says good riddance. “We’ve always been clear we’re aimed at prospective international students,” says Danny Byrne of QS. “Our rankings are easy to understand and of direct relevance to students – we’re unique in asking potential employers what they think of the universities, for example.

“THE cater for their audience – they’re a trade publication, and have an academic readership.”

The rankings differ in how they collect data, too. QS’s rankings are reputation-driven, with 50 per cent of an institution’s score derived from surveys. And while THE does some reputation surveying, sending invitation-only questionnaires to a limited number of institutions around the world, QS opts for quantity to achieve reliability, mass-mailing some 46,000 academics before weighting the results to counteract regional bias.

Similarly, while QS canvasses academics' opinions on research but not teaching quality – reasoning that academics aren’t qualified to comment on the level of the latter at rival institutions – THE has five different measures of teaching quality. Indeed, teaching quality makes up a third of an institution’s THE score.

“Rankings are a useful source of information that wouldn’t otherwise be available – but they don’t make your decision for you,” says QS’s Byrne. “It’s about knowing what they do, and applying them intelligently. More is more.”