Discussion and analysis of international university rankings and topics related to the quality of higher education.
Anyone wishing to contact Richard Holmes without worrying about ending up in comments can go to rjholmes99@yahoo.com

Monday, March 05, 2007

Top Universities Ranked by Research Impact

The THES – QS World Universities Rankings, and its bulky offspring, the Guide to the World’s Top Universities (London: QS Quacquarelli Symonds), are strange documents, full of obvious errors and repeated contradictions. Thus, we find that the Guide contains student-faculty ratio data completely different from the figures used in the top 200 rankings published in the THES, even while it talks about how robust such a measure is. Also, the Guide provides, for each of the top 100 universities, a figure for research impact, that is, the number of citations divided by the number of papers. In other words, it indicates how interesting other researchers found each institution’s research. These figures completely undermine the credibility of the “peer review” as a measure of research expertise.

The table below is a re-ranking of the THES top 100 universities for 2006 by research impact and therefore by overall quality of research. This is not by any means a perfect measure. For a start, the natural sciences and medicine do far more citing than other disciplines, and this might favor some universities more than others. Nonetheless, it is very suggestive, and it is so radically different from the THES-QS peer review and the overall ranking that it provides further evidence of the invalidity of the latter.

Cambridge and Oxford, ranked second and third by THES-QS, only manage to achieve thirtieth and twenty-first places for research impact.

Notice that, in comparison to their research impact scores, the following universities are overrated by THES-QS: Imperial College London, Ecole Normale Superieure, Ecole Polytechnique, Peking, Tsing Hua, Tokyo, Kyoto, Hong Kong, Chinese University of Hong Kong, National University of Singapore, Nanyang Technological University, Australian National University, Melbourne, Sydney, Monash, Indian Institutes of Technology, Indian Institutes of Management.

The following are underrated by THES-QS: Washington University in St Louis, Pennsylvania State University, University of Washington, Vanderbilt, Case Western Reserve, Boston, Pittsburgh, Wisconsin, Lausanne, Erasmus, Basel, Utrecht, Munich, Wageningen, Birmingham.

The number on the left is the ranking by research impact, i.e. the number of citations divided by the number of papers. The number to the right of the universities is the research impact. The number in brackets is the overall ranking in the THES-QS 2006 rankings.
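The re-ranking described above is a simple calculation: divide each university's citation count by its paper count and sort in descending order. A minimal sketch in Python, using entirely hypothetical citation and paper counts (not the actual Guide data), looks like this:

```python
# Sketch of the research-impact re-ranking: impact = citations / papers,
# then sort descending by that ratio. All figures below are invented
# for illustration, not taken from the Guide.

def research_impact(citations, papers):
    """Citations per paper."""
    return citations / papers

# Hypothetical (citations, papers) counts.
universities = {
    "University A": (120_000, 10_000),  # impact 12.0
    "University B": (90_000, 6_000),    # impact 15.0
    "University C": (50_000, 5_000),    # impact 10.0
}

reranked = sorted(universities,
                  key=lambda name: research_impact(*universities[name]),
                  reverse=True)
print(reranked)  # ['University B', 'University A', 'University C']
```

Note how a university with fewer total citations (B) can still outrank one with more (A) once output volume is divided out, which is exactly why this re-ranking diverges from rankings driven by reputation or sheer size.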

Thursday, March 01, 2007

THES-QS Bias Chart

LevDau has been kind enough to reproduce my post "More Problems with Method" and to add a couple of very interesting graphs. What he has done is to calculate a bias ratio: the number of THES-QS reviewers reported on the topuniversities site divided by the number of highly cited researchers listed by Thomson ISI. The higher the number, the more biased the THES-QS review is towards that country; the lower the number, the more biased it is against. Some countries do not appear because they had nobody at all on the highly cited list.
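The ratio itself is straightforward. A minimal sketch, with invented counts rather than LevDau's actual figures, might look like this:

```python
# Sketch of the bias ratio: THES-QS reviewers for a country divided by
# that country's count of ISI highly cited researchers. All numbers
# here are illustrative, not the real data.

def bias_ratio(reviewers, highly_cited):
    """Ratio > 1 suggests the review over-represents a country relative
    to its research strength; < 1 suggests under-representation.
    Returns None when there are no highly cited researchers, since the
    ratio is undefined (such countries are dropped from the chart)."""
    if highly_cited == 0:
        return None
    return reviewers / highly_cited

print(bias_ratio(300, 100))  # 3.0  -> review biased towards this country
print(bias_ratio(50, 200))   # 0.25 -> review biased against
print(bias_ratio(40, 0))     # None -> country not shown at all
```

The zero-denominator case corresponds to the countries omitted from the graphs: with no one on the highly cited list, no ratio can be computed for them.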

If we chose a less rigorous definition of research expertise, such as the number of papers published rather than the number of highly cited researchers, then the bias might be somewhat reduced. It would certainly not, however, be removed. In any case, if we are talking about a gold standard of ranking, then the best researchers would surely be the most qualified to judge the merits of their peers.