Philosophers: Disappointingly Normal


Philosophers are sometimes thought of as expert thinkers, more rational and less prone to errors in reasoning than others. Whether this is true, though, is an empirical question, and it is one that several researchers have taken up over the past decade or so. Their findings regularly show that philosophers, like the ordinary folk, are susceptible to various cognitive biases. A recent study by Eric Schwitzgebel (Riverside) and Fiery Cushman (Harvard) hammers home the point, but in a way that even the authors admit to finding surprising. It’s called “Philosophers’ Biased Judgments Persist Despite Training, Expertise and Reflection.” Ouch. From the abstract:

We examined the effects of framing and order of presentation on professional philosophers’ judgments about a moral puzzle case (the “trolley problem”) and a version of the Tversky & Kahneman “Asian disease” scenario. Professional philosophers exhibited substantial framing effects and order effects, and were no less subject to such effects than was a comparison group of non-philosopher academic participants. Framing and order effects were not reduced by a forced delay during which participants were encouraged to consider “different variants of the scenario or different ways of describing the case”. Nor were framing and order effects lower among participants reporting familiarity with the trolley problem or with loss-aversion framing effects, nor among those reporting having had a stable opinion on the issues before participating in the experiment, nor among those reporting expertise on the very issues in question. Thus, for these scenario types, neither framing effects nor order effects appear to be reduced even by high levels of academic expertise.

In short, there was an “across-the-board failure to find evidence for philosophical expertise and reflection in moderating biased moral judgment.”
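To make the loss-aversion framing concrete, here is a minimal sketch in Python of the classic “Asian disease” case. The specific figures (600 at risk, 200 saved, the 1/3 probability) are Tversky & Kahneman’s published numbers, not details drawn from the Schwitzgebel–Cushman abstract. The point of the sketch is that the two frames describe exactly the same outcomes, which is why divergent responses to them count as a framing effect rather than a substantive disagreement:

```python
# The classic "Asian disease" case (Tversky & Kahneman 1981): 600 people
# are at risk, and subjects choose between a certain program and a risky
# one, described either in terms of lives saved or in terms of lives lost.

TOTAL_AT_RISK = 600

# "Gain" frame: Program A saves 200 for sure; Program B saves all 600
# with probability 1/3 and no one with probability 2/3.
gain_frame = {
    "A": 200,                          # certain: 200 saved
    "B": (1 / 3) * 600 + (2 / 3) * 0,  # risky: 200 saved in expectation
}

# "Loss" frame: Program C lets 400 die for sure; Program D lets no one
# die with probability 1/3 and all 600 die with probability 2/3.
loss_frame = {
    "C": TOTAL_AT_RISK - 400,                             # 200 saved
    "D": TOTAL_AT_RISK - ((1 / 3) * 0 + (2 / 3) * 600),   # 200 in expectation
}

# All four options have the same expected number of lives saved; the
# frames differ only in wording. Preferring A in the gain frame but D in
# the loss frame (the typical pattern) is the framing effect.
assert gain_frame["A"] == gain_frame["B"] == loss_frame["C"] == loss_frame["D"]
print("Expected lives saved under every option:", gain_frame["A"])
```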

What to do about findings like these? Well, if philosophers want “philosopher” to truthfully mean something like “expert thinker,” then perhaps we need to revamp certain parts of our philosophical training. Logic has long been the core element, supplemented over time with aspects of decision theory and statistical reasoning. We may need to go further, not only educating philosophers about cognitive biases (mere familiarity with them, the study suggests, makes little difference), but also training them in various debiasing techniques.

Going even further, since it has been suggested that “ambient and contextual factors may create high risk situations that dispose decision makers to particular biases” and that “fatigue, sleep deprivation and cognitive overload appear to be important determinants,” perhaps certain life management skills should be thought of as part and parcel of philosophical training. Maybe the ancients were onto something.

UPDATE: An article in Harvard Business Review (get over it) describes several strategies for overcoming biases in business contexts: make three estimates, think twice, use premortems, take an outside view, seek advice on objectives, cycle through objectives, use joint evaluation, and use the vanishing-options test. Are any of these applicable in philosophical contexts? Are there analogues to these for philosophical problems? I can recall one paper that suggests a specific debiasing strategy in moral philosophy: “The Reversal Test: Eliminating Status Quo Bias in Applied Ethics” by Nick Bostrom and Toby Ord (a sketch of it appears below). Are there others out there? I mean besides basic stuff like Chapter 2 of On Liberty.
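For what it’s worth, the Reversal Test has a simple enough structure that it can be stated almost mechanically. Here is a hedged sketch of it in Python; the function name and inputs are my own illustration of the idea, not Bostrom and Ord’s formulation:

```python
# A sketch of the Reversal Test from Bostrom & Ord's "The Reversal Test:
# Eliminating Status Quo Bias in Applied Ethics." The idea: if you judge
# that increasing some parameter (say, human lifespan) would be bad AND
# that decreasing it would be bad, you owe an argument that the status quo
# value is at or near a local optimum; absent one, status quo bias is a
# live suspect. Names and inputs below are illustrative, not the paper's.

def reversal_test(increase_judged_bad: bool,
                  decrease_judged_bad: bool,
                  status_quo_defended_as_optimal: bool) -> str:
    """Flag possible status quo bias in a pair of directional judgments."""
    if increase_judged_bad and decrease_judged_bad:
        if status_quo_defended_as_optimal:
            return ("No bias flagged: both changes are rejected, but the "
                    "current value is independently defended as optimal.")
        return ("Suspect status quo bias: both directions of change are "
                "rejected without an argument that the status quo is optimal.")
    return "Test not triggered: at least one direction of change is accepted."

# Example: an objector who condemns cognitive enhancement must, by parity,
# explain why a matching cognitive diminishment is also bad in a way that
# doesn't simply privilege the current level.
print(reversal_test(increase_judged_bad=True,
                    decrease_judged_bad=True,
                    status_quo_defended_as_optimal=False))
```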
