Thanks to a link sent to me by Bob Gordon, I just discovered a useful list of philosophy resources created by Anthony Cole of the Warwick School of Law. There is a ranking of philosophy journals, HPS programs, and much more. There are also rankings of philosophy departments around the world based on publication quality and quantity. Unsurprisingly, the results are quite different from those of the (reputation-based) Philosophical Gourmet Report.
Some aspects of Cole’s results are odd and appear to derive from idiosyncrasies in his methodology, which unfortunately is not explained. For instance, he seems to give the same weight to publications in any language, even though non-English publications generally have only a marginal impact on the international community. Still, his rankings seem useful to me, especially if taken with a grain of salt.
These lists strike me as completely off the wall, and not even worth looking at.
PhD programs: Miami is better than Pittsburgh? Boulder is better than Princeton? The University of Dallas is better than Ohio State, Washington U., Indiana, etc???? UCLA is level 11, below all of these places?
Journals: Metaphysica is a top-tier journal? Philosophy of Science and Phil Studies are tier 5? Below the Graduate Faculty Philosophy Journal? Really? Studies in HPMP is tier 8?? Below Zygon, etc.?
Or how about the HPS list: Rutgers is the 67th best place for HPS??? Behind the Technical University of Braunschweig???
These are not idiosyncrasies. I’ve only picked out some especially egregious but otherwise representative examples. This is noise. Without even knowing the “methodology” of how this was generated, who would consider these lists informative?
Damn, that is odd. Someone hit the crack pipe before making that list.
Hi, I was just pointed to the mention of the list here (thanks, Gualtiero, for pointing it out), so I thought I would make a couple of short responses, just to clarify some things. Most of these comments are already made in the textual introductions on the website, but I’m happy to reiterate them.
With the HPS list, it is important to keep in mind what is being evaluated. puzzled’s surprise at the low ranking of Rutgers is perhaps due to a confusion between the Rutgers philosophy department, which has several internationally recognised philosophers of science, and the Program in History of Science, Technology, Medicine and the Environment, which does not, and which is the one being evaluated. The list only addresses specialist programs, not philosophy departments with philosophers of science; those are noted in the regular list. I would, though, also emphasise the more general disclaimers I make on that specific page. I have serious reservations about that list myself, and try to make clear that readers should take it with a grain (or, in this case, rather more than a grain) of salt.
The journals I’m more comfortable defending, as I think it’s a more reliable list. “Metaphysica” I expect to fall in subsequent years. One thing to remember is that journals almost always have a very strong first year or two, as the editors draw in contributions from friends and so on; the journal then settles at a more realistic level. I try to eliminate that problem by leaving out journals that have not been published for at least five years, but sometimes a journal seems to benefit from being new anyway. Metaphysica, I think, is one of those, and probably Sensus Communis as well. I would expect both to remain strong journals, but it would be dishonest of me to ignore the results generated just because I didn’t agree with them. Instead, I figure that if there is the occasional odd result, the list is still informative as long as the oddities are only occasional; and if you look at one highly ranked journal but don’t find it worth reading after all, the only harm you’ve suffered is that you’ve read a couple of articles you might not otherwise have read. Again, assuming the list is overwhelmingly more useful than “odd”, it still seems to me justifiable.
Again, though, it is important to read and take seriously the introductory comments. Certainly Phil Studies publishes a good number of important articles – more than you might expect from a “level 5” journal. But as noted on the page, the list also takes into account the size of the journal – and Phil Studies also publishes an awful lot of articles that, while fine, are decidedly not standouts. Were it trimmed down, publishing only the best articles, it would without question be ranked near the top.
I’ll just break for a second, and continue below.
To continue: some of puzzled’s comments on the journals also seem simply to reflect his/her interests. The Graduate Faculty Philosophy Journal is in fact enormously highly regarded by “continental” philosophers, though most likely not by those who read Philosophy of Science and Phil Studies. I recently received an e-mail from another site visitor suggesting I had ranked Process Studies and Philosophy East and West unfairly low. Putting together this kind of “overall” list is difficult, as it really makes little sense to say whether Philosophy of Science is better or worse than the Graduate Faculty Philosophy Journal; they do entirely different things. So the hope is simply that the list represents a useful guide to things you might find helpful. If you don’t do continental philosophy, you should ignore all the continental journals, no matter how highly ranked.
Similarly, on the same rationale, specialised journals unavoidably appear lower in the list than their quality might suggest, as they are less likely to be of interest to the broad variety of people visiting the page. But hopefully the list accurately reflects strengths within a grouping. For example, logic journals are all ranked fairly low, but if you were unfamiliar with the field and looking for a strong logic journal, I believe the stronger journals are ranked higher than the weaker ones. Again, no general list can accommodate specific interests, and some of my own favorite journals are ranked significantly lower than I would put them, as my interests don’t always align with the philosophical mainstream.
Finally, on the departments. As I note on the page, the list takes into account not just famous names, but also diversity of specialisms and size relative to faculty strength. So Miami benefits from being a small department with an exceedingly strong faculty covering a diverse range of areas. Most rankings probably wouldn’t put it as high because its faculty members tend to work slightly outside the mainstream of their areas (Thomasson’s metaphysics concentrates on fictional entities; Siegel does philosophy of education; Haack, Slote and McGinn are each very highly regarded but again approach their subjects a little unconventionally). That’s certainly something for potential grad students to consider (job prospects are best when you come from a mainstream department), but not something that justifies ignoring the department’s strength.
And yes, if you’re Catholic, with a particular interest in the history of philosophy, then Dallas has an excellent small department. If you’re not, then it’s probably not the place for you.
And one more break; just a little bit more to come.
These journal rankings are just horribly uninformed. It’s pretty rare to find someone taking on such a task with literally no idea what he’s doing.
One more point on the departments: I do specifically flag UCLA as a department not realistically represented in the list. The reason is simple: it has a significant number of highly regarded faculty who simply haven’t published much lately, and the list is based on recent publications. That David Kaplan, for example, hasn’t published in ten years doesn’t mean he’s stopped thinking, but it does tend to reduce his impact in any publication-based survey. This is why I try to emphasise that these lists should only be seen as partial information. A publication-based list can give useful information you won’t get in a reputational survey (e.g. Miami’s main hires have come only in the last few years, so it’s reasonable to say its reputation lags behind its strength), but publications certainly aren’t everything (despite what many university administrators might think).
Finally, on Gualtiero’s point about non-English publications: that is something I am still working on. I don’t simply give equal weighting to all publications regardless of language, but I also don’t think it’s reasonable to downgrade publications simply because they are not in English. There is some excellent work done in languages other than English, even if it often has little impact outside its home language. So I’ve tried to adopt an approach that recognises that, though it remains a work in progress.
Anyway, sorry for the length of this response, but I thought the detail might be helpful. I explain on the page why I try to limit the information I give about the methodology, although it may well be that I currently limit it too much. The page is an ongoing effort, with methodology adjustments considered and adopted regularly, so I wanted to clarify the points above, and I’d be happy to hear anyone’s thoughts.
Tony