New Site Collects and Standardizes Philosophy Journal Information


The Philosophy Journal Insight Project (PJIP) “aims to provide philosophy researchers with practical insights on potential venues for publication.”

Its main offering is a spreadsheet that provides information about journals’ subject matter, word limits, type of peer review, open access status, rankings, impact information, acceptance rates, review times, reviewing quality, and so on.

Put together by Sam Andrews, a philosophy PhD student at the University of Birmingham, the site, which he says is a work in progress, currently has information about 47 journals.

You can check it out here.

8 Comments

Who's Keeping Time?
10 months ago

This is a really cool resource and must have taken a lot of pretty tedious/painstaking work. Thanks so much Sam!

I was wondering if you could explain your methodology for estimating the acceptance rate for journals. For want of a better tool, I (a grad student headed for the market) use the APA journal survey data to try to get a sense of journal turnaround times. But this data is pretty dubious for many reasons, including reporting bias and specific data that simply don’t pass the smell test (Synthese accepts over a third of the papers it receives–wow!). It would be great if someone could provide an accurate accounting of both acceptance rates and–perhaps more importantly–times to decisions. But it just seems so hard with the data that’s out there.

Two asides:
(i) For Sam: unless I’m missing something, the journal-reported and estimated acceptance rates for Ethics (4% and 4.98%, respectively) don’t seem quite right, since they report lower acceptance rates over the last few years here and here.
(ii) For the profession/into the void: Why is it not standard practice to do what Ethics and Phil Review do, and publish acceptance rates and turnaround times?

Sam Andrews
Reply to  Who's Keeping Time?
10 months ago

Thanks for the feedback!

With the data available, it is challenging to produce accurate estimates of acceptance rates. The APA surveys provide the next best thing, but you’re right to think that the numbers are questionable. Jonathan Weisberg has a great post about it that concludes, ‘Accepted submissions are overrepresented in the survey. Acceptance rates estimated from the survey will pretty consistently overestimate the true rate—in many cases by a lot.’ I’ve improved on the APA figures by checking them against the acceptance rates provided by journals and then adjusting (lowering) overall averages accordingly.
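Roughly, the adjustment works like this (a simplified Python sketch with made-up journal names and figures; the real spreadsheet involves more steps):

    # For journals that publish an official acceptance rate, measure how far
    # the APA-survey estimate overshoots it, then scale the survey-only
    # estimates down by the average overshoot.
    survey_rates = {"Journal A": 0.20, "Journal B": 0.12, "Journal C": 0.30}
    reported_rates = {"Journal A": 0.10, "Journal B": 0.05}

    ratios = [reported_rates[j] / survey_rates[j] for j in reported_rates]
    correction = sum(ratios) / len(ratios)  # about 0.46 with these numbers

    # Prefer the journal's own figure where it exists; otherwise scale down.
    adjusted = {j: reported_rates.get(j, r * correction)
                for j, r in survey_rates.items()}
    # Journal C's survey estimate of 30% comes down to roughly 14%.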

For aside (i): I found that Ethics states on their instructions-for-authors page that ‘Our overall acceptance rate for original articles is about 4%’ and used that figure (for anyone interested, here are the sources for all of the journal-reported acceptance rates).
I did not know they released stats in an article from the editors and will update the spreadsheet – thanks! But this is exactly the kind of problem the project aims to address. Without a universal standard for how journal stats are presented, it’s anyone’s guess where (if anywhere) you’re going to find them.

For aside (ii): Completely agree – Phil Review and Ergo are the only journals I know of that have real-time stats pages, and it’d be great to see more!

Jack
10 months ago

This is of tremendous value, though I hope managing it can become more or less automated, if only because Sam surely has other things to do, and it’s not his responsibility to take care of this for everybody else.

One issue I might raise, though, is that reporting the Leiter rankings in this way gives them an appearance of objectivity. I can see why they are helpful to include, and I’m not sure what the solution is. Maybe the heading could be changed to “perceived rankings” or simply “recent Leiter survey rankings”, or you could set the page’s default view so that this metric isn’t visible.

Why? Because otherwise we risk treating rankings as fixed things. X is a 2nd tier journal, Y is a 3rd tier, etc. This becomes a cycle, reinforcing patterns of perception.

Thanks again to Sam for this amazing resource, especially for those in an early career position.

Sam Andrews
Reply to  Jack
10 months ago

Hi Jack, thanks for the feedback!

First, I’ve already automated large parts of the data collection and analysis for the spreadsheet – for example, updating all of the stats drawn from the APA journal surveys takes about 10 minutes. It is more tedious for the information taken directly from journal websites, but the data there doesn’t change that often. Overall, the bulk of the heavy lifting is complete and maintaining the project from here shouldn’t be that time-consuming.

Second, I agree that the rankings shouldn’t appear to convey objectivity and appreciate you raising this worry and making suggestions for alternative headings. As such, I’ll change the column’s heading from ‘Blog Rankings’ to ‘Recent Blog Rankings’.

Jack
Reply to  Sam Andrews
10 months ago

Hi Sam,

That sounds really excellent. Well done on setting it up so well. Also, I just wanted to say: sorry if, in raising the concern about admin time costs, I sounded a little discouraging of your contributions. That certainly wasn’t my intention!

And that seems like a good approach: ‘Recent Blog Rankings’.

Thanks again for setting this up!

Alyssa Timin
10 months ago

Thanks very much for creating this. It’s helpful for those of us managing journals, as well!

too bad
10 months ago

Does anyone else find that the spreadsheet never loads? (Maybe it’s just been overwhelmed with hits since it was posted here?)

Björn Lundgren
10 months ago

May I suggest that you consider adding information about the opportunity to submit replies, whether the journal considers replies to papers published elsewhere, and the word limit for replies?