Crooked Timber

The Basic Knowledge Project at the Arché Research Centre at the University of St Andrews will be hosting a major conference on 13-14 June. The conference theme will be Scepticism. There will be four 30-minute talks delivered by graduate students, each followed by 20 minutes for questions. Please send your paper by e-mail to Dylan Dodd by 15 March 2009. Papers should be prepared for blind refereeing. You will be notified of acceptance by 15 April.

And submissions are due January 15th for the Rutgers-Princeton grad student conference. Papers should be submitted by email to prconf2009@gmail.com. Papers should be no more than 4,000 words and preceded by an abstract of no more than 150 words. Papers should be formatted for blind review. Notification of acceptance will be sent no later than February 9th, 2009. More information is available here.

Apparently, in their year-end list of “reasons to love New York”, New York Magazine’s reason number 56 is <a href="http://nymag.com/news/articles/reasonstoloveny/2008/52897/">Because New York Has Become a World Capital of Philosophy</a>. I suppose being just a city publication they can only really mention NYU, Columbia, and CUNY, and not Rutgers and Princeton, which are no farther from the city than UC Irvine is from UCLA.

If you’ve been around awhile, think of the 2 or 3 rookie “stars” from your year(s) on the job market (the ones who got most and best interviews and offers), and then ask if these people are the most influential members of their cohort. I suspect that for many of us, this exercise does not engender strong confidence in the profession’s predictive acumen. (Perhaps this is why some major programs avoid hiring junior.)

I don’t want to get into the pros and cons of hiring junior, especially while Rutgers is searching, but I thought this was an interesting experiment.

Here’s one data point. When I was first on the market, 10 years ago, the person who seemed to have the most interviews was Jonathan Schaffer. And the person who has been (deservedly) the most influential from my cohort has been … Jonathan Schaffer. Now Jonathan didn’t do too well through the interview/fly-out process that year, and certainly didn’t get the most offers. So I think we end up with a mixed verdict here. The evidence from the 1998/99 hiring season is that philosophers look relatively prescient when reading files, but less so when interviewing people and/or hearing job talks.

Robbie Williams has a detailed discussion of the results, looking in particular at which way of parsing the rankings matches most closely with Leiter’s rankings.

One other point Robbie makes is important. “[W]ho ever thought that there’s a linear ordering to capture in the first place?” As someone who has made a career out of arguing that probabilities aren’t linearly ordered (as many people agree), and that truth values aren’t linearly ordered either (which everyone thinks is crazy), I should agree. Philosophers are much too quick to assume that a comparative induces a linear ranking in general, and this case is no exception. But of course both probabilities and truth values, although they don’t induce linear orderings, do have top values. So while you shouldn’t take any of these tables too seriously, you should remember that St Andrews is at the top.

Slightly more seriously, I do think The One True RAE Ranking of Philosophy Programs does tell us something important. If you focus on averages, UCL is overall best I think, followed closely by St Andrews, then King’s, then it gets a little messy. (I know St Andrews is tied with UCL on GPA, but I’d prefer their profile to ours.) If you focus on total quality, obviously Oxford is first, then Cambridge HPS, then it gets messy, with King’s, Leeds, St Andrews and Sheffield in the mix. I think there’s something to be said for taking both perspectives, the averages and the totals, seriously. And if you do that, it’s easy to see St Andrews as doing particularly well, because it is in the mix in both categories. I do think the One True Ranking understates how well some departments (esp. UCL) did, but I think it’s worth noting the overall judgment of the RAE panel that St Andrews (like King’s) maintained a spectacular batting average while being a large-ish department.

In non-philosophy news, John Gardner produced a similar series of tables for law. In law it seemed the judgments were easier. LSE was best on averages, Oxford was best once you factored in size. And the Borda count method had them, happily, coming out exactly tied.

For what it’s worth, I agree with Robbie’s comment that we should be cautious about drawing ordinal rankings from the RAE data.

The point I want to make here is the obvious one that some of that information will be more useful for certain purposes than others, although all of those purposes could aptly be described as “assessments of research quality”.

For instance, a student wondering where to go to get the most great discussions with the most great researchers may be uninterested in the difference between department A, which has a large amount of 4* output and nothing else, and department B, which has a similar amount of 4* output plus a large amount of 1* output. If so, she should not take averaging scores too seriously.

To university admins, however, the difference between A and B might be extremely important for determining things like how much research bang for their salary buck they are getting. They should take averages seriously.

Various different kinds of ordinal information might be obtained from the RAE data which might be useful for different purposes (though like almost everyone else I don’t think the methods used to formulate it were perfect), but I don’t think we should assume any ordinal ranking is best (or even approximately best) for all (or even most) research-quality-assessment purposes.

UPDATE: I say this as someone who claims affiliation of one kind or another with a number of British departments, and is feeling neither particularly thrilled nor particularly dispirited about any of their performances!

I’m (slowly) writing the entry on David Lewis for the Stanford Encyclopaedia. Here’s the tentative table of contents.

1. Convention and Linguistic Meaning
2. Counterfactuals
3. Philosophy of Mind
4. Modal Metaphysics
5. Everything Else

The last section could do with a snappier title. But the idea is that I start with the two early books, and the papers that build directly on those books (esp “Languages and Language” and “Time’s Arrow”). Then I look at what I think of as the three big themes of Lewis’s career. These are (a) his theory of mind, (b) his reductionism about the nomic (and related topics), (c) modal realism and its consequences for metaphysics, especially modal metaphysics.

The problem is that this leaves out quite a lot. For instance, it leaves out practically everything from “Papers in Philosophical Logic” and “Papers in Ethics and Social Philosophy”. But I do think that trying to find another theme on a par with those three would amount to shoehorning material into a category in which it doesn’t quite fit. (Not that the three themes are entirely distinct.)

But that doesn’t mean I shouldn’t say anything about the rest of Lewis’s career. So I was wondering what I should focus on outside those five sections. To that end, I did a crude Google Scholar search to see which of Lewis’s papers are most cited. The full results are below the fold, but the top 15 is a little surprising.

1. Counterfactuals (1688 citations)
2. On the Plurality of Worlds (1444)
3. Convention: A Philosophical Study (1423)
4. Scorekeeping in a Language Game (879)
5. General Semantics (862)
6. Causation (555)
7. Adverbs of Quantification (552)
8. New Work for a Theory of Universals (458)
9. Elusive Knowledge (384)
10. Probabilities of Conditionals and Conditional Probabilities II (378)
11. Attitudes de dicto and de se (377)
12. Counterfactual Dependence and Time’s Arrow (342)
13. Psychophysical and Theoretical Identifications (336)
14. Counterpart Theory and Quantified Modal Logic (334)
15. Parts of Classes (314)

Note that the “II” in “Probabilities of Conditionals and Conditional Probabilities” is misleading. Google Scholar thinks that the two papers with roughly this title are just one paper, and it has merged their citations together.

That the books are up top is no surprise; books generally do much better than papers on Google Scholar. And to some extent it isn’t a surprise that older papers lead the way; they have had more time to collect citations. But the showing of the language papers, and in particular the formal semantics papers, is quite stunning. I think I follow Lewis, and Lewisiana, quite closely, and I can’t recall the last time I saw someone cite “General Semantics”, for instance. So maybe this isn’t the best measure of the importance and influence of the various works.

I was celebrating the RAE results last night with a couple of Belhavens, so I didn’t immediately write up a story about them. And this morning I see that in the comments thread over on Brian Leiter’s blog there are a lot of different proposals for how to summarise those results. Here are some of the natural proposals.

First, we could just rank the departments by the percentage of research activity that is category 4, i.e. world-leading. The results are then as below. (I’m just listing top 10s, including ties, and rounding up ties.)

1. University College London
2. University of St Andrews
3. King’s College London
3. University of Oxford
3. Cambridge HPS
3. University of Sheffield
3. LSE
8. University of Bristol
8. University of Reading
10. Cambridge Philosophy
10. University of Stirling
10. University of Nottingham
10. Birkbeck College

Or we could look at the departments with the most research activity that’s either in category 4 (world leading) or category 3 (internationally excellent). Since I regard myself as more an internationally excellent person than a world leading person, I like this kind of measure.

1. University College London
1. University of St Andrews
1. King’s College London
1. University of Reading
1. University of Essex
6. University of Sheffield
6. University of Stirling
8. University of Oxford
8. Cambridge HPS
8. LSE
8. University of Bristol
8. Cambridge Philosophy
8. University of Leeds
8. University of Edinburgh
8. Middlesex University
8. Queen’s University Belfast

But this doesn’t discriminate between the 4s and the 3s, and it doesn’t include the value that 2s (recognised internationally) or 1s (recognised nationally) add. So some people, e.g. The Guardian, have summarised the result as a GPA. That’s the sum, as n ranges from 1 to 4, of n times the percentage of work that’s category n. By that measure we get:

1. University College London
1. University of St Andrews
3. King’s College London
3. University of Reading
3. University of Sheffield
6. University of Stirling
6. University of Oxford
6. Cambridge HPS
6. LSE
10. University of Essex
10. University of Bristol

But that seems to me to undersell the difference between 4s and 3s, which is more important for many purposes than the difference between 2s and 3s, or 1s and 2s. If you want a ‘top-weighted’ GPA, the thing to do is, for each department, to work out the sum, as n ranges from 1 to 4, of 2^n times the percentage of research activity in category n. If you do that, the rankings are:

1. University College London
2. University of St Andrews
3. King’s College London
4. University of Sheffield
5. University of Reading
6. University of Oxford
6. Cambridge HPS
6. LSE
9. University of Bristol
10. University of Stirling
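A quick sketch of these two averaging scores, in Python. The quality profiles here are invented for illustration, not actual RAE figures, and I’m reading the ‘top-weighted’ weight as 2^n (simply doubling each weight would leave the plain-GPA ordering unchanged, so the weights must grow faster than linearly for the two rankings to differ). The point is that two departments can tie on plain GPA while the top-weighted score separates them.

```python
def gpa(profile):
    """Plain GPA: sum of n times the fraction of work in category n, n = 1..4."""
    return sum(n * profile.get(n, 0.0) for n in range(1, 5))

def top_weighted_gpa(profile):
    """Top-weighted GPA: sum of 2**n times the fraction in category n."""
    return sum(2 ** n * profile.get(n, 0.0) for n in range(1, 5))

# Hypothetical profiles (fractions of research activity per category).
dept_a = {4: 0.5, 2: 0.5}   # half 4*, half 2*
dept_b = {3: 1.0}           # everything 3*

print(gpa(dept_a), gpa(dept_b))                            # 3.0 3.0 (tied)
print(top_weighted_gpa(dept_a), top_weighted_gpa(dept_b))  # 10.0 8.0
```

Department A’s half-share of 4* work is what pulls it ahead on the top-weighted measure, which is exactly the behaviour the measure is meant to have.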

But many people have been complaining that these averaging methods discount the merits that large deep departments have. (They are also possible to ‘game’ by not submitting the work of all faculty members, but I don’t see much evidence of that happening in the report. If anyone does see evidence of it, I’d be interested to know.) So a lot of people have been advocating that we multiply the number of faculty (who had work submitted) by the numbers generated above. For instance, this is the ranking of number of faculty submitted times percentage of research activity in category 4.

1. University of Oxford
2. Cambridge HPS
3. University of St Andrews
4. King’s College London
5. University College London
6. University of Sheffield
7. University of Leeds
8. University of Bristol
9. Cambridge Philosophy
10. LSE

And this is the ranking of number of faculty submitted times percentage of research activity in either category 3 or 4. Here is where we start to see Leeds (prominent employer of St Andrews and Rutgers grads, I hasten to add) doing well.

1. University of Oxford
2. Cambridge HPS
3. University of Leeds
4. King’s College London
5. University of St Andrews
6. University of Sheffield
7. Cambridge Philosophy
8. University College London
9. University of Warwick
10. University of Edinburgh

Here is the table for GPA times faculty submitted. Again, Leeds does well. (And note that Cambridge HPS, though undoubtedly a great department, is in no small part a great history department. It’s not quite an apples-to-apples comparison with other philosophy programs.)

1. University of Oxford
2. Cambridge HPS
3. University of Leeds
4. King’s College London
5. University of St Andrews
6. University of Warwick
7. University of Sheffield
8. Cambridge Philosophy
9. University College London
10. University of Edinburgh

Finally, here is faculty submitted times the ‘top-weighted’ GPA I described above. It actually doesn’t change a great deal from the overall GPA.

1. University of Oxford
2. Cambridge HPS
3. University of Leeds
4. King’s College London
5. University of St Andrews
6. University of Sheffield
7. University College London
8. Cambridge Philosophy
9. University of Warwick
10. University of Bristol

There are a lot of different ways to measure how well departments did from the RAE then. It would be nice to have a meta-score, some way of tabulating these differing metrics. The simplest thing to do, and I think the most widely employed approach in cases like this, is something like a Borda Count. We’ll give departments 1 point for each first place finish, 2 points for each second place, and so on.

Because it was easier to do things this way in Excel, I’ve ‘rounded down’ ties. So if 3 departments finish tied for 3rd, they’ll each get 3 points, rather than the 4 they should probably get. But I don’t think this changes things substantially.
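A minimal sketch of this scoring scheme in Python, with invented scores for illustration. Ties are handled the way just described: each department’s position is 1 plus the number of departments that strictly beat it, so tied departments all receive the best (‘rounded down’) position in their group, and lower Borda totals are better.

```python
def positions(scores):
    """Map each department to its position in one ranking: 1 plus the
    number of strictly better scores. Ties share the best position."""
    return {d: 1 + sum(1 for s in scores.values() if s > scores[d])
            for d in scores}

def borda_total(rankings):
    """Sum each department's positions across several rankings.
    Lower totals are better (1 point per first place, and so on)."""
    totals = {}
    for ranking in rankings:
        for dept, pos in ranking.items():
            totals[dept] = totals.get(dept, 0) + pos
    return totals

# Invented scores on two metrics, higher = better.
metric_1 = {'A': 3.2, 'B': 3.2, 'C': 2.9}
metric_2 = {'A': 40, 'B': 55, 'C': 30}

r1 = positions(metric_1)   # {'A': 1, 'B': 1, 'C': 3}
r2 = positions(metric_2)   # {'A': 2, 'B': 1, 'C': 3}
print(borda_total([r1, r2]))   # {'A': 3, 'B': 2, 'C': 6}: B wins
```

Note that because A and B tie on the first metric, both get position 1 there and C skips straight to position 3, which is the ‘rounded down’ convention from the text.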

Compiling the results in this way produces what I like to call The One True RAE Ranking of Philosophy Programs. It is,

1. University of St Andrews (24 points)
2. King’s College London (26 points)
3. University of Oxford (27 points)
4. Cambridge HPS (31 points)
5. University College London (33 points)
6. University of Sheffield (41 points)
7. University of Leeds (67 points)
8. Cambridge Philosophy (73 points)
9. University of Bristol (76 points)
10. LSE (77 points)
10. University of Reading (77 points)

As feels intuitively correct when eyeballing the numbers, St Andrews comes out on top.

The result is fairly resilient over changes in methodology. That is, if you sum the results using some incorrect method, rather than generating The One True RAE Ranking of Philosophy Programs, you’ll still often get St Andrews on top.

In particular, if you just look at research activity in category 4, do a ranking of departments by percentage in category 4, then do a ranking of departments by faculty size times percentage in category 4, then do a Borda count to combine the two rankings, you get UCL first, St Andrews second. If you do the same thing with categories 3 and 4, you get King’s first, St Andrews second. If you do the same thing with GPA, you get St Andrews first. And if you do the same thing with what I was calling ‘weighted’ GPA, you get St Andrews tied for first with King’s and Oxford. So I think the methodology is reasonably sound.

Obviously over the days, weeks, months and years ahead, you’ll read many people trying to claim that their department really comes out best from the RAE. But if you’ve read this far, you’ll know better. The One True RAE Ranking of Philosophy Programs has St Andrews first.

All of this leads to a natural question. What’s the right beer to celebrate Rutgers’ success with?

I was reading through a typically interesting Jonathan Wilson column on formations when I was taken aback by a gratuitous reference to Wittgenstein, complete with Stanford Encyclopaedia link. Later in that same paragraph there is a reference to Kuhn, without so much as a name-drop. Not what you normally expect in the football pages. But it’s a great article, even with the gratuitous philosophy references.

The deadline for paper submissions to the inaugural Compass Interdisciplinary Virtual Conference has been extended to January 14. If you are interested in submitting a paper to this, more details are available here.

I’ll talk about this more in the morning, but for now I just wanted to note St Andrews’ great performance in philosophy in the 2008 Research Assessment Exercise. St Andrews had the second highest percentage of their submitted work ranked in the top category, behind only University College London, and ahead of Oxford, among others.

I’ll say more about various ways of reading the data tomorrow, but one conclusion I’ll be drawing is that Leeds comes out of the project looking pretty good. That’s well deserved of course; Leeds is a great department. But it’s also nice from a St Andrews perspective to see the most prominent hirer of St Andrews grads in recent years be ranked so highly. Hopefully other departments who hire St Andrews grads in upcoming years will also do well!