Wednesday, July 11, 2007

Journal Rankings

In the last few weeks, philosophy bloggers have posted their various evaluations of the European Science Foundation's journal rankings. For a partial list of the journals and bloggers' comments, see Leiter Reports, Lemmings, Feminist Philosophers, Gone Public, and Brooks Blog.

Although analytic philosophers seem to be mostly satisfied with the list as a rough cut, continental and feminist philosophers have raised some concerns that it reinforces a popular but not universally accepted standard for what counts as the best philosophical writing.

John McCumber has made several important points:
1. that the European Science Foundation has a questionable agenda in ranking journals in the humanities.
2. that simple rankings invite misuse by tenure committees looking for easy-to-make and easy-to-enforce judgments about research quality.
3. that although the list's purpose may be in part to promote rigorous (blind) peer review, not all the journals in the A-List are peer-reviewed in the standard way.
4. that the list is based on journals' "reputations," but it does not say how those reputations have been evaluated. Indeed, McCumber points out that the analytic focus of the A-Listed journals coincides with the analytic research interests of the committee that compiled the list.

There are a number of reasons to be wary of ANY simple ranking scheme.
1. Lists serve the status quo.
Any list is bound to favor older, well-established journals that publish papers across a range of areas in philosophy. These journals have well-earned reputations. But they are less likely to publish papers that push philosophy in new directions: papers that are multi- or cross-disciplinary, that challenge analytic methods and styles, that deepen newer paradigms such as environmental and feminist frameworks, or that apply philosophy to practical problems, as in medical ethics. Most specialty journals, and all specialty journals that are not analytic, are classified as B-List. But the criterion that a journal not specialize says nothing about the quality of the papers it publishes.
2. Many of the journals on the list are not double-blind reviewed, and, as McCumber points out, a decision about whether or not to publish a paper needs to be blind, and preferably double-blind. The way major philosophy journals are editorially reviewed in this country simply reinforces the dominance of the old over the young, and is, I think, a major reason why there are so few new ideas in American philosophy compared to other disciplines.
3. Lists can serve the interests of administrators looking to deny tenure, but they cannot so easily be used in the interests of younger members of the profession. The list is not helpful as a guide to where to submit papers, because it does not reflect considerations such as acceptance rates, moratoriums on submissions, the time it takes to get a response from the journal, reputations within particular sub-disciplines of philosophy, and citation rates.

Indeed, it is not clear why this or any other list would be superior to a simple citation ranking of the kind used in the sciences (though even that type of ranking is subject to all the faults above and is too often misused).

My final thought about journals is a question raised by my friend Jim Johnson, who asked why the prominent philosophy journals are not published by professional organizations. I don't know the answer. In political science, two prominent journals are published by professional organizations. This is also common in the sciences, where both general-interest journals (like Science) and specialty journals (like Ecology) are published by organizations (AAAS and ESA, respectively). When journals are published by organizations, they have a tighter connection to the organization's membership and a direct duty to be responsive to the profession, particularly to its younger members. (Although no doubt this accountability causes headaches for journal editors!)
