Best PhD Programs in Philosophy (guest post by Carolyn Dicey Jennings)


The following is a guest post* by Carolyn Dicey Jennings, associate professor of philosophy and cognitive science at the University of California, Merced, and the principal investigator of the project Academic Placement Data and Analysis (APDA).

A version of this post first appeared at the APDA site.


Best PhD Programs in Philosophy
by Carolyn Dicey Jennings

What is the best PhD program in philosophy?

On one way of thinking about it, this question doesn’t make much sense. Philosophy is many different things and philosophers can be many different types of people, so the best program will depend on the person in question and the type of philosophy in question.

Nonetheless, our project has been able to gather enough data from 135 programs to provide some guidance on this question, updating previous guidance. Importantly, this post doesn’t assume a single metric of comparison, which distinguishes it from unilateral rankings of philosophy programs. It also provides a combination of subjective and objective measures, while rankings typically rely on one or the other.

All of the below is based on data available on the running tally and the linked blog posts, with anonymized “raw” data available on the home page. The only exception is the information on clusters, which comes from a paper currently under review. (See the end of the post for more information on the clusters.) Worth noting is that the past success of these programs may be less predictive than usual, given the unknown impact of COVID-19 on academic institutions.

Below, programs are listed with asterisks to indicate whether they show up in multiple sections. One asterisk (*) means that they show up under “highest recommendation,” two (**) mean that they show up under “highest diversity,” and three (***) mean that they show up under “highest job placement,” with the asterisk sets listed in that order and separated by spaces, and with the set corresponding to the current section omitted.

To start, do current students and recent graduates recommend the program to prospective graduate students?

Highest Recommendation*

15 programs have an average student recommendation that rounds to “definitely would recommend” (5/5). (Two programs, Harvard University and the University of Oxford, have an average rating of 4.5 when rounded to the tenth but 4 when rounded to the nearest integer, and so are not included.) The average across all programs is “somewhat likely” to recommend (4.0). The 15 programs are ordered below by topical cluster and then by average rating (highest first).

The topical clusters are based on recent work with Pablo Contreras Kallens and Dan Hicks that uses different sources of data to divide graduate programs into groups (resulting in the clusters Core Analytic, Metaphysics Focus, Ethics Focus, Mind Focus, Philosophy of Science, Pluralist, Historical Focus, and Core Continental). The majority of these 15 programs are in the Core Analytic cluster (which also has the largest share when all programs are divided into the 8 clusters). Of course, this is a rough guide to the overall topical focus of a program: some programs may fit their cluster less well than others, and many students and advisors in each program are likely to work on other topics. The cluster may nonetheless give prospective students a sense of fit.

Core Analytic

University of California, Berkeley *** (5)
Australian National University *** (4.9)
University of Massachusetts Amherst *** (4.9)
University of Southern California *** (4.9)
Massachusetts Institute of Technology ** *** (4.8)
Rutgers University *** (4.7)
University of North Carolina, Chapel Hill ** *** (4.6)
University of Michigan ** (4.5)

Metaphysics Focus

Saint Louis University ** (4.7)

Ethics Focus

Georgetown University ** (4.5)

Philosophy of Science

Carnegie Mellon University (4.8)
University of California, Irvine (LPS) ** *** (4.7)
University of Cambridge (HPS) ** *** (4.5)
University of Pittsburgh (HPS) *** (4.5)

Historical Focus

University of California, Riverside ** *** (4.7)

Of these, four are in the top 15 for percentage women, excluding non-binary or unknown gender (MIT, UNC, Georgetown, Cambridge HPS); four are in the top 15 for percentage persons of color (Michigan, Saint Louis, Cambridge HPS, Irvine LPS); and one is in the top 15 for percentage first-generation college students (Riverside). (Whenever programs in 15th and later places shared the same rounded value, the later program was also included, so some groups may have more than 15 total programs.) Most of the programs have top 15 placement rates into permanent academic jobs, top 15 placement rates into permanent academic jobs in PhD-granting programs, or top 15 average salary. The highest recommended programs in the clusters not listed above are:

Mind Focus

University of Maryland, College Park (4.3)

Pluralist

University of Hawai’i (4.2)

Core Continental

University of Oregon ** (4.3)

Highest Diversity**

Philosophy has recently begun to address its lack of diversity, but most programs still lack diversity among their graduate students relative to PhD programs in other fields. The average across programs is 30% women, 20% persons of color, and 22% first-generation college students. Yet the following programs appear to be doing better on this issue. It is important to know that our data on race/ethnicity and socioeconomic status are incomplete, since these rely on self-report (gender is initially assigned based on first name). Thus, only programs with at least 5 respondents are included, and these respondents may not be representative of the program as a whole. With that in mind, the cluster with the largest share of these 42 programs is Ethics Focus (“Ethics”; 11 programs, 26%), followed by Core Analytic (“Analytic”; 9 programs, 21%) and Core Continental (“Continental”; 9 programs, 21%).

Multiple Forms of Diversity
Five programs are in the top 15 for two or more forms of diversity: percentage women (Stony Brook, Waterloo, Cambridge HPS, DePaul), percentage persons of color (Stony Brook, Waterloo, Cambridge HPS, UConn), or percentage first-generation college students (Stony Brook, DePaul, UConn). The numbers listed after the cluster name are the percentages for each form of diversity, in that order (women / persons of color / first-generation), with NA for cases in which fewer than 5 respondents provided this information.

Stony Brook University *** (Continental; 41/43/40)
University of Waterloo (Metaphysics; 50/43/NA)
University of Cambridge (HPS) * *** (Science; 47/33/NA)
DePaul University (Continental; 43/0/60)
University of Connecticut (Analytic; 30/33/43)

Gender Diversity
An additional thirteen are in the top 15 for percentage women, listed in descending order of that percentage.

Vanderbilt University (Ethics; 52/21/20)
University of Oregon (Continental; 49/18/17)
Michigan State University (Ethics; 48/12/NA)
Massachusetts Institute of Technology * *** (Analytic; 45/26/NA)
Institut Jean Nicod (Mind; 44/25/NA)
University of Arkansas (Analytic; 44/NA/NA)
University of Chicago (CHSS) *** (Science; 42/20/NA)
University of Memphis (Continental; 42/31/NA)
Emory University *** (Continental; 41/21/10)
Georgetown University * *** (Ethics; 41/17/13)
University of North Carolina, Chapel Hill * *** (Analytic; 41/9/22)
University of Washington (Ethics; 41/8/NA)
York University (Mind; 41/22/NA)

Racial and Ethnic Diversity
An additional twelve are in the top 15 for percentage persons of color, listed in descending order of that percentage.

Binghamton University (Ethics; 35/54/20)
University of California, Santa Cruz (Historical; 37/50/NA)
Arizona State University *** (Ethics; 34/43/13)
The New School (Continental; 28/43/0)
University at Albany (Analytic; 25/40/NA)
Southern Illinois University (Pluralist; 21/40/NA)
Pennsylvania State University (Continental; 40/38/13)
Stanford University *** (Metaphysics; 31/38/33)
University of California, Irvine (LPS) * *** (Science; 26/38/7)
Saint Louis University * *** (Metaphysics; 20/36/25)
Harvard University *** (Analytic; 29/35/13)
University of Michigan * (Analytic; 37/34/14)

Socioeconomic Diversity
An additional twelve are in the top 15 for percentage first-generation college students, listed in descending order of that percentage.

University of California, Riverside * *** (Historical; 24/21/83)
Katholieke Universiteit Leuven (Continental; 26/20/78)
Western University (Ethics; 31/0/60)
University of Cincinnati (Mind; 27/23/57)
Bowling Green State University (Ethics; 33/7/50)
Purdue University (Continental; 22/12/50)
Syracuse University (Analytic; 25/22/45)
University of Utah (Ethics; 24/23/44)
University of Missouri (Analytic; 17/31/43)
McGill University (Pluralist; 33/15/40)
University of California, Irvine (Ethics; 24/24/40)
University of Minnesota Twin Cities (Ethics; 39/22/38)

Highest Job Placement***

Placement success can be measured in multiple ways. Included here are permanent academic placement rate, permanent academic placement rate into PhD-granting programs, and average salary. The average values for these across programs are 43%, 12%, and $69,719, respectively. For the first two, only 2012-2020 graduates are included, and data were updated by our own team over the past 18 months (see the blog posts for more detailed information on each program, including when a program was updated). Permanent academic placement rate is the total number of 2012-2020 graduates currently in a permanent academic position divided by the total number of such graduates in some form of academic position or with unknown position (excluding those in nonacademic positions). Worth noting is that salary information is self-reported and may not be representative, and that those in nonacademic positions are included in average salary (and tend to have significantly higher salaries). The cluster with the largest share of these 32 programs (setting aside the one with unknown cluster) is Core Analytic (“Analytic”; 13 programs, 41%), followed by Ethics Focus (“Ethics”), Philosophy of Science (“Science”), and Historical Focus (“Historical”), with 4 programs each (13%).
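
To make that arithmetic concrete, here is a minimal sketch of the placement-rate calculation just described. The field names and example records are hypothetical illustrations, not APDA's actual data format.

```python
# Minimal sketch of the placement-rate calculation described above.
# The field names and example records are hypothetical, not APDA's data schema.

graduates = [
    {"year": 2015, "status": "permanent academic"},
    {"year": 2018, "status": "temporary academic"},
    {"year": 2019, "status": "nonacademic"},
    {"year": 2020, "status": "unknown"},
]

# Denominator: 2012-2020 graduates in some academic position or with unknown
# position, excluding those in nonacademic positions.
cohort = [g for g in graduates
          if 2012 <= g["year"] <= 2020 and g["status"] != "nonacademic"]

# Numerator: those currently in a permanent academic position.
permanent = [g for g in cohort if g["status"] == "permanent academic"]

placement_rate = len(permanent) / len(cohort)  # 1 / 3 in this toy example
print(f"Permanent academic placement rate: {placement_rate:.0%}")  # 33%
```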

All Around Placement
Three programs have top 15 permanent academic placement, top 15 permanent academic placement into PhD-granting programs, and top 15 average salary. The numbers listed after the cluster name correspond to each type of placement value, in that order (permanent academic placement / placement into PhD-granting programs / average salary).

University of California, Irvine (LPS) * ** (Science; 92%/62%/$105,111)
Harvard University ** (Analytic; 71%/38%/$79,611)
Yale University (Historical; 68%/34%/$90,900)

Top Academic Placement
Four additional programs have top 15 permanent academic placement and top 15 permanent academic placement into PhD-granting programs. “Unknown” means that the cluster is unknown due to insufficient data; “NA” means that fewer than 5 respondents provided salary information.

University of Southern California * (Analytic; 77%/40%/$62,667)
University of Pittsburgh (HPS) * (Science; 75%/50%/$71,091)
University of Sydney (Unknown; 67%/54%/NA)
Massachusetts Institute of Technology * ** (Analytic; 61%/39%/$68,357)

Elite Placement
Two additional programs have top 15 percentage permanent academic placement into PhD-granting programs and top 15 average salary.

University of California, Berkeley * (Analytic; 59%/38%/$91,933)
University of Arizona (Ethics; 56%/31%/$87,917)

Permanent Academic Placement

Eight more have top 15 percentage permanent academic placement.

Catholic University of America (Continental; 83%/8%/NA)
Stanford University ** (Metaphysics; 71%/26%/$66,429)
Baylor University (Analytic; 65%/3%/$62,679)
Arizona State University ** (Ethics; 63%/13%/NA)
University of California, Riverside * ** (Historical; 63%/13%/$71,667)
University of Virginia (Metaphysics; 63%/13%/$51,167)
Northwestern University (Historical; 62%/8%/$75,408)
Emory University ** (Continental; 61%/0%/$72,222)

Permanent Academic Placement into PhD-Granting Programs

Six more have top 15 percentage permanent academic placement into PhD-granting programs.

University of Cambridge (HPS) * (Science; 58%/53%/NA)
New York University (Analytic; 58%/37%/$76,500)
Princeton University (Analytic; 57%/34%/$77,900)
University of Chicago (CHSS) ** (Science; 50%/30%/NA)
Rutgers University * (Analytic; 57%/29%/$70,643)
University of California, Los Angeles (Analytic; 55%/29%/$75,000)

Highest Salary

Finally, ten more have top 15 average salary.

Duke University (Mind; 53%/0%/$104,620)
University of Chicago (Historical; 46%/24%/$102,955)
University of Calgary (Ethics; 33%/0%/$93,096)
Australian National University * (Analytic; 44%/23%/$88,255)
Cornell University (Analytic; 50%/18%/$86,302)
Columbia University (Pluralist; 54%/26%/$84,444)
University of Massachusetts Amherst * (Analytic; 43%/0%/$83,000)
University of North Carolina, Chapel Hill * ** (Analytic; 58%/19%/$82,786)
Stony Brook University ** (Continental; 50%/3%/$81,643)
Georgetown University * ** (Ethics; 60%/9%/$79,400)

The above covers 64 of the 135 programs reviewed over the past 18 months. In 2017 APDA reviewed the same programs with approximately the same three sets of criteria. How much overlap is there between these?

In the case of student recommendation, nine programs that were in the top 15 in 2017 are still in the top 15 today: University of California, Berkeley; Australian National University; Massachusetts Institute of Technology; Rutgers University; University of Michigan; Georgetown University; University of California, Irvine (LPS); University of Pittsburgh (HPS); and University of California, Riverside. Six programs are new to the top 15 (replacing Harvard, Wisconsin, Yale, Oxford, Colorado, and Baylor): University of Massachusetts Amherst; University of Southern California; University of North Carolina, Chapel Hill; Saint Louis University; Carnegie Mellon University; and University of Cambridge (HPS).

In the case of diversity, 12 of the 17 programs listed above as top 15 for percentage women were also in the top 15 in 2017: Vanderbilt University; University of Waterloo; University of Oregon; Michigan State University; University of Cambridge (HPS); Massachusetts Institute of Technology; DePaul University; University of Memphis; University of Chicago (CHSS); University of North Carolina, Chapel Hill; Stony Brook University; and University of Washington. Five programs new to the list are Institut Jean Nicod, University of Arkansas, Georgetown University, Emory University, and York University (replacing Arizona State, Temple, University of York, Kingston University, and University of New Mexico).

In the case of placement, only five of the top 15 for permanent academic placement from this year were in the top 15 in 2017: University of California, Irvine (LPS); University of Pittsburgh (HPS); Baylor University; University of California, Riverside; and University of Virginia. Ten are new: Catholic University of America, University of Southern California, Stanford University, Harvard University, Yale University, University of Sydney, Arizona State University, Northwestern University, Massachusetts Institute of Technology, and Emory University (replacing Cincinnati, U Florida, Rutgers, Tennessee, Indiana, Oregon, Georgetown, Princeton, and UNC).

Do any of these lists line up with the top 15 programs listed in the Philosophical Gourmet Report (a reputational survey in which select faculty judge the faculty of a select number of programs)? The top 15 listed for the PGR worldwide in 2017/2018 were, in order: NYU, Oxford, Rutgers, Princeton, Michigan, Pittsburgh, Yale, MIT, USC, Columbia, Harvard, Stanford, Berkeley, UCLA, and Toronto. Most of these show up in at least one of the lists above, with the exception of Oxford and Toronto. Contrasting our top 15 student recommendations with the PGR top 15, those missing from the PGR include, in order: Australian National University; University of Massachusetts, Amherst; University of North Carolina, Chapel Hill; Saint Louis University; Georgetown; Carnegie Mellon University; University of California, Irvine (LPS); University of Cambridge (HPS); and University of California, Riverside. Of course, these lists are tracking different qualities in graduate programs, so we should not be surprised by any lack of overlap. A distinction of our project is its focus on the student perspective and student outcomes, rather than judgments about faculty by other faculty. Similar things may be said of other rankings, such as the QS ranking, which is based on “academic reputation, employer reputation and research impact”.

Finally, some of you may want to know a little more about the clusters, and why a program is in one or another of them. The clusters are based on the areas of specialization of a program’s graduates as well as keywords selected by both past graduates and current students as best describing their program. Various statistical techniques were then used to determine natural clusters of programs based on these data (paper under review); a rough illustrative sketch of this kind of grouping follows the list below. One might understand programs in a cluster as being more likely to represent some areas of philosophy (+) and less likely to represent others (-):

  • Core Analytic is + Philosophy of Mind & Epistemology but – Continental & Historical Philosophy
  • Metaphysics Focus is + Metaphysics & Logic but – Ethics, German Philosophy & Phenomenology
  • Ethics Focus is + Applied Ethics but – Historical Philosophy, Philosophy of Language & Metaphysics
  • Mind Focus is + Naturalist/Empirical & Cognitive Science but – Historical Philosophy, Metaphysics & Epistemology
  • Philosophy of Science is + Decision Theory, Social Science & Philosophy of Science but – Ethics & Philosophy of Mind
  • Pluralist is + Education, Pragmatism & Aesthetics but – Epistemology, Contemporary Philosophy & Philosophy of Mind
  • Historical Focus is + Early Modern & German Philosophy but – Metaphysics, Logic & Meta-Ethics
  • Core Continental is + Phenomenology & Continental but – Analytic, Metaphysics & Logic
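
For readers curious about the general mechanics, here is a minimal sketch of how programs could in principle be grouped by shared keywords. This is not the statistical method of the paper under review; the program names, keywords, and two-cluster setting are hypothetical illustrations.

```python
# Illustrative sketch only: a generic way to cluster programs by topical
# keywords, not the specific method used in the paper under review.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.cluster import KMeans

# Hypothetical keyword lists of the sort respondents select for each program.
program_keywords = {
    "Program A": "metaphysics logic epistemology mind",
    "Program B": "phenomenology continental german historical",
    "Program C": "applied ethics political philosophy",
    "Program D": "continental phenomenology aesthetics pragmatism",
}

# Represent each program as a vector of keyword counts.
vectorizer = CountVectorizer()
X = vectorizer.fit_transform(program_keywords.values())

# Group the programs into clusters (2 here for the toy example;
# the APDA work describes 8 clusters).
kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
for program, label in zip(program_keywords, kmeans.labels_):
    print(f"{program} -> cluster {label}")
```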

Comments welcome.


33 Comments
Gareth Rhys Pearce
3 years ago

Interesting data 🙂 Have you been able to filter the placement record for value-added? A high placement record doesn’t necessarily indicate a quality PhD program; it could easily indicate one that, for whatever reason, attracts a high number of PhD students who are already tracking towards a job in academia. If students have to go off of imperfect information when applying for doctoral programs (likely), factors like prestige or name-recognition probably count for quite a bit. These are factors that don’t necessarily correlate with having a good doctoral program.

Carolyn Dicey Jennings
Reply to  Gareth Rhys Pearce
3 years ago

No, but I am not sure how we could do that. Do you have ideas on this?

ahem
3 years ago

Best PhD programs in the anglophone worlds*

I fixed the title for you, you’re welcome!

Carolyn Dicey Jennings
Reply to  ahem
3 years ago

The title was intended to be a little cheeky in spirit, but I take your point.

R
3 years ago

Question about the “recommendation” ranking: I am in one of the programs mentioned as most highly recommended here. I don’t think I’ve heard of anyone who was asked whether they would recommend the program. Is the data based on input from people who have graduated, or have gotten jobs, or just past graduate students? Since whether students are happy in a program is something that can change drastically within the span of a few years, I wonder if the data represents how current students feel. In my program, I think this kind of change has occurred in the last few years.

Carolyn Dicey Jennings
Reply to  R
3 years ago

Hi R,
The data are based on both current students and recent graduates. You will be able to get a bit more information on your program by going to the Running Tally (http://placementdata.com:8182/phd-programs-running-tally/) and clicking on the name of your program. If you go to the section on ratings, you can see how many people responded and whether or not there is a correlation between graduation year and rating (for programs with more than 10 ratings). Some programs do have a negative correlation, such that more recent graduates give lower ratings. Current students are not included in this, since they don’t have a graduation date. If you want more information on how we found and contacted graduates and students, see our 2018 report: https://www.dropbox.com/s/exv5szay8t6c3o3/diversity-inclusivity-survey.pdf?dl=0 If you don’t think you are in the database and would like to be, email [email protected] and we will set up a link for you.

Richard Russell Wood
3 years ago

An “outsider’s” observation. MIT and CMU’s rankings? They had such tiny departments a half century ago, iiinm/iirc.
Reflective of the changing focus and content of the discipline? Somewhere a scientist is smirking.

John Fischer
3 years ago

I’d be interested in an amalgamation of the top 15 in this survey/set of surveys and the PGR ranking. Could one simply add together the two numerical overall rankings? Of course, the lower would be better (as in golf).

Carolyn Dicey Jennings
Reply to  John Fischer
3 years ago

It would probably make better sense to add together the ratings, since they are both out of 5. But then APDA’s data, which doesn’t currently have a known topical bias, would be skewed towards PGR’s known topical bias (e.g. http://placementdata.com:8182/the-philosophical-gourmet-report-and-placement/). Better would be if PGR managed to seek experts across all areas of philosophy first, because then it would be a more accurate representation of the faculty view of things (which might be nicely compared/contrasted to the student view of things).

efghi
3 years ago

This data must be inaccurate to some extent. UC Santa Cruz is not a highly diverse department in terms of its graduate students. It regularly has entire cohorts of white male students. As of now, there are a total of four students of color out of a graduate population of roughly 30 or so.

This misrepresents the department and makes it seem like a very hospitable department in terms of graduate culture for minorities, which it is not in my experience as a graduate student.

AD
Reply to  efghi
3 years ago

A department can be very hospitable to minority students even if it has very few of them. So, perhaps you mean that there are other ways in which it isn’t very hospitable to minority students.

Carolyn Dicey Jennings
Reply to  efghi
3 years ago

Hi efghi,
I don’t know if you saw this, from above: “Important to know about what follows is that our data are incomplete on race/ethnicity and socioeconomic status since these rely on self-report (gender is initially assigned based on first name). Thus, only programs with at least 5 respondents are included, and these respondents may not be representative of the program as a whole.” Another point worth noting, from the running tally: “POC refers to the percentage of current students and past graduates who identify as “Chicanx/Latinx/Hispanic,” “American Indian or Alaskan Native,” “Asian,” “Black or African American,” “Pacific Islander,” or “Two or More Races” divided by the total number of current students and past graduates who provided us with race/ethnicity information.” It may not always be obvious to you that someone identifies as a POC. Someone might, for example, identify as both White and Chicanx/Latinx/Hispanic. But if you already had those things in mind, I am sorry to hear that this data, while accurate, may not be representative. This is harder for smaller programs, like UCSC, and those that have fewer respondents.

frustrated philosopher
Reply to  Carolyn Dicey Jennings
3 years ago

Isn’t it irresponsible and misleading to publish things when you are reporting data that is misleading at best (because the data is not representative)? This seems especially important in things like reporting on racial and ethnic make up of a graduate student body, since presumably prospective graduate students have the most use for your data. Maybe just don’t publish things for which you don’t have good data backing?

(FWIW last I checked the placement data seemed pretty wrong too, but maybe you’ve fixed that now–surely it doesn’t matter if that data (or this data!) is “accurate”–what matters is that it is complete and/or representative, if you are going to use it to make comparative claims that you are then going to publish on a popular blog and elsewhere.)

I recognize the need to supplement (or replace?) the philosophical gourmet’s oversized influence here, but I’m (still) really worried about this project.

Carolyn Dicey Jennings
Reply to  frustrated philosopher
3 years ago

You don’t know that the data is not representative. Claiming that it isn’t, or that the data are wrong, seems irresponsible to me, fwiw. And it is starting to feel like propaganda since it comes up each year from people who often haven’t even checked the data. I think our project has a good, responsible process and has made good decisions. We are not perfect, but that isn’t a reasonable standard. If you know of specific errors, or have ideas for how to run the project better, let me know.

AD
Reply to  Carolyn Dicey Jennings
3 years ago

Wait, are you saying we should trust studies as long as we don’t know that the data is not representative? That seems completely backwards: the study should show that its data is representative; we shouldn’t presume that it is until proven otherwise. It’s a study, after all!

Maybe your point is that we have reasons to think that the data is representative and that no one has given us reason to doubt them. If that’s your point, that’s fine. But the other claim seems obviously false.

Carolyn Dicey Jennings
Reply to  AD
3 years ago

Hi AD,
I don’t know why you think I said that/think that.

Frustrated philosopher
Reply to  Carolyn Dicey Jennings
3 years ago

Hi Carolyn, I was referring to what you yourself said—that the data, while accurate, may not be representative. I’m not propagandizing. I’m remembering the disastrous Pluralist’s Guide to Philosophy’s recommendations about women-friendly programs. I’m worried that you’re putting something out there that reports data that is not representative. Since you yourself admitted that it is not, I don’t think I have to waste my time searching through your data to see what is going on with it. (FWIW I think the idea behind your project is great, and that we genuinely do need this kind of alternative. But I am worried given what you said, especially about the diversity-related data.)

Carolyn Dicey Jennings
Reply to  Frustrated philosopher
3 years ago

Hi Frustrated,
Saying that something may not be representative is not the same as saying that it isn’t; efghi didn’t give enough reason for me to believe that the data are not representative, due to the reasons I provide in response to efghi above, which is why I responded as I did. Again, I did not “admit” that the data are not representative; in fact, I have reason to think the data set is representative overall. The warning in the blog post is due to the inclusion of small programs. Small programs have small numbers of graduates, and so normal variation means that their data are less likely to be representative of long term trends. I could just not include small programs, but I don’t find that to be a satisfactory solution to this problem.

efghi
Reply to  Frustrated philosopher
3 years ago

I can confirm that, over several years and despite being a small program, a student body in which 4 out of roughly 25-30 total students consider themselves POC is not a diverse graduate student body.

I’m glad you are confident in your data. But I go to school here and know my colleagues. The department, in my opinion and that of some others, is not hospitable to minorities, and part of that is made visible by the sheer numbers. It is problematic when this is something a department needs to work on and they are instead being applauded for it. Moreover, as frustrated philosopher suggests, it sends minority students the wrong message about how welcome they will be in a department given its culture and dynamics.

Syd Johnson
3 years ago

Excellent work. Thanks for doing it.

Carolyn Dicey Jennings
Reply to  Syd Johnson
3 years ago

Thanks, Syd!

Devin
3 years ago

For diversity, wouldn’t the diversity of the faculty on these dimensions also be relevant? I would be curious to know how schools rank on that metric.

Nicolas Delon
Reply to  Devin
3 years ago

Thanks for your work, Carolyn. I’m also curious about this. Some programs are not scoring well here but have relatively diverse faculty. Diversity of faculty is also more durable; student diversity could be fleeting and is a lagging indicator.

Carolyn Dicey Jennings
Reply to  Nicolas Delon
3 years ago

Oh, sorry…I originally posted a response but it got lost in the system because I posted too many responses at once. We did faculty on the last one (https://infogram.com/philosophy-phd-programs-graduate-ratings-placement-profiles-and-diversity-profiles-1g4qpzlrokwq21y), but I decided not to do it this time. Basically, last time an RA did it by looking at faculty photos and guessing (the RA also looked at rank, AOS, and a number of other things, recording it all in a spreadsheet). There are lots of issues with this method, as you can imagine. Better would be to make a survey for faculty, so that it is self-reported. But that requires a substantial addition/change to the project, and I just didn’t have the resources. In the future, it might be a good idea to add faculty to the database and to add this as a question for them, as I agree it would give another, perhaps better metric on racial/ethnic diversity.

Brian K
3 years ago

I’m familiar with the program at the Catholic University of America and can say quite confidently that it shouldn’t be classified as “Continental.” “Historical” is probably the best classification.

Carolyn Dicey Jennings
Reply to  Brian K
3 years ago

Here are the keywords provided about CUA by current students and past graduates (“Select from this list up to 5 keywords that you would associate with this program.”):
Medieval 3
Ancient 2
German 2
Historical 2
Metaphysics 2
Phenomenology 2
Continental 1
Early Modern 1
Epistemology 1
Islamic 1
Political 1
History and Philosophy of Science 1

We did not put programs into clusters by hand, but used statistical methods to sort programs into clusters and then tried to understand what those clusters were after the fact, providing labels. One thing we noticed about the Historical Focus cluster is that a positive predictor of it was “Analytic.” It may be that CUA has a historical focus (apparent from above) but that it is closer to those in the Core Continental cluster than those in the Historical Focus cluster because it has some Continental/Phenomenology and no mention of Analytic.

Carolyn Dicey Jennings
Reply to  Carolyn Dicey Jennings
3 years ago

(Clusters were based on the keywords above as well as the AOS of past graduates and current students, but I didn’t collect the latter information right now for reasons of time.)

JTD
3 years ago

Two of the three diversity categories don’t generalize well for international comparisons. At least, this follows given the following two assumptions:

1. A graduate program usually mainly caters to local students (which usually means students within the country the program is located in, but could also mean students in the nearby region).

2. A reasonable aspiration for diversity is demographic representation (i.e., having the demographics of the intake resemble the demographics of the general population that the program is serving).

The problem is this. Suppose that Country A has historically made higher education accessible and thus a high percentage of the older generation went to university. By contrast, Country B has historically made it inaccessible and thus a lower percentage of the older generation went to university. When comparing a university from A with one from B we might find that the B-university has more first-generation college students. But this doesn’t show that the B-university is doing better with socio-economic diversity. Indeed, the A-university might have a generous scholarship system for those who are disadvantaged which ensures that the percentage of first-generation college students in their intake matches the percentage in the relevant target population, whereas the percentage in B-university’s intake may fall substantially behind the relevant target population.

This also applies to racial and ethnic diversity. Suppose that in country A people of color are 10% of the population and country B they are 45% of the population. A university in A with a 15% intake of people of color might be doing a great job in terms of this kind of diversity. A university in B with a 25% intake of people of color might be doing a not so great job. Yet the measure used here would say that the university in B is doing much better!

At a quick glance this problem seems to be more than theoretical. When we compare the demographics of the US with those of various European and Australasian countries whose universities were included in the datasets, we see the kinds of differences that would make a difference in the way I suggest above. Furthermore, these differences are such that US universities will end up being rated better on these diversity categories than they actually are (when compared to universities outside the US). Finally, there seems to be a fairly straightforward fix to this problem. Rather than working with raw percentages, you could work with the ratio, in each category, of the intake percentage to the national percentage.
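
To make that fix concrete, here is a toy calculation (the numbers are invented for illustration, not taken from the APDA data):

```python
# Toy illustration of the suggested fix: compare each program's intake
# percentage to the national baseline instead of comparing raw percentages
# across countries. All numbers are invented for illustration.

programs = {
    "University in Country A": {"intake_poc_pct": 15, "national_poc_pct": 10},
    "University in Country B": {"intake_poc_pct": 25, "national_poc_pct": 45},
}

for name, p in programs.items():
    ratio = p["intake_poc_pct"] / p["national_poc_pct"]
    print(f"{name}: representation ratio = {ratio:.2f}")

# Country A's university scores 1.50 and Country B's scores 0.56, reversing
# the ordering that the raw percentages (15% vs. 25%) would suggest.
```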

Carolyn Dicey Jennings
Reply to  JTD
3 years ago

This is very helpful, thank you. I will try to incorporate your suggestion, or something like it.

Karl Martin Adam
3 years ago

Thank you for doing this research! The more kinds of data incoming grad students have access to the better, and I wish I would have had something like this in addition to the PGR when I went through the process of figuring out which of the long list of programs to apply to.

I have a question about placement data. Why do you report the percentage of people with academic placement only out of graduates with academic or unknown placement, as opposed to out of all graduates? If I think back to what I would have wanted to know as an undergrad, it was the chance of getting an academic job as well as the chance of getting a job of some kind. Excluding people who tell you that they’re not unemployed from the data radically raises the percentage shown as having academic placement, I presume.

Also, not including incoming first years (because I don’t know them yet), my program, UNC, has over twice the number of POC as shown in your data (I can think of 8 grad students who identify as people of color off the top of my head, out of thirty-some students who are rising second year and above). Perhaps you could supplement your data by asking students what they know about the gender and race of the grad student body in addition to their own identity? There are lots of problems with that metric, but there are also lots of problems with only self-reports.

Carolyn Dicey Jennings
Reply to  Karl Martin Adam
3 years ago

Hi Karl,

The project collects information on all graduates. It excludes only those currently in nonacademic jobs from the denominator of the placement rates. This is due to the idea that those people should not be assumed to have desired an academic job. So the placement rate is something closer to how many people placed in permanent academic jobs of those who wanted such jobs. (Around 5% of graduates are in nonacademic jobs, in our database, and around 91% desire an academic job.)

On race/ethnicity: we considered asking placement officers, but worried that it would violate FERPA. I think the same may be true of asking other graduates about their peers. I plan to send out the survey again this summer, which will allow for updates and more complete data on this next year.

Jonathan
3 years ago

This is definitely an interesting and worthwhile project, and I’m sure that many prospective graduate students will find this very useful. But I have to say, there are some things about it that I find a bit puzzling. In particular, I’m baffled by many of the clusters assigned to these departments. While there are some cases where it’s very clear how it should be labeled, there are many others for which the cluster tag completely misses the mark. For example: Stanford as “Metaphysics”; UC Riverside as “Historical”; UC Irvine as “Ethics”. The “pluralist” tag doesn’t seem to be correlated at all with departments that are genuinely pluralistic. In many cases, the cluster tags seem to lag behind the actual focus of the department by a decade or more. And in many cases where the cluster tag doesn’t misrepresent the target department, it’s completely uninformative (for instance, there’s very little distinguishing many of the “core analytic” departments from many of those sorted into other specialties). I also share the concerns that others have raised about whether or not much of this data is representative. I suppose that this might be connected with my first worry, since I’d imagine that a more representative sample would also yield more consistent cluster groupings. I hope that this doesn’t come off as overly critical, since gathering and compiling information like this is no easy task. But I’m concerned that the graduate student surveys are not particularly representative, and that in many cases they are either misleading or just plain wrong.

Jeff
3 years ago

Hi, thank you for offering this research! I think it would be helpful if we could see a breakdown of each category by cluster, not just overall. For instance, I have no interest in core analytic programs, but I am interested in historical and continental. This gets tricky, granted. Many elite programs in ancient philosophy have departments that skew analytic overall. I know that’s the trend of the discipline to begin with, but it still colours things.