Public School Curriculum Denies Moral Facts


Our public schools teach students that all claims are either facts or opinions and that all value and moral claims fall into the latter camp. The punchline: there are no moral facts. And if there are no moral facts, then there are no moral truths.

The inconsistency in this curriculum is obvious. For example, at the outset of the school year, my son brought home a list of student rights and responsibilities. Had he already read the lesson on fact vs. opinion, he might have noted that the supposed rights of other students were based on no more than opinions. According to the school’s curriculum, it certainly wasn’t true that his classmates deserved to be treated a particular way — that would make it a fact. Similarly, it wasn’t really true that he had any responsibilities — that would be to make a value claim a truth. It should not be a surprise that there is rampant cheating on college campuses: If we’ve taught our students for 12 years that there is no fact of the matter as to whether cheating is wrong, we can’t very well blame them for doing so later on.

That’s Justin McBrayer (Fort Lewis College), in today’s “The Stone” in The New York Times. The piece is interesting, not just for laying out those elements of typical and Common Core curricula that apparently lead to a significant inconsistency and later problems with student behavior, but also as an example of the potential value of bringing philosophical reasoning into public policy decisions.

33 Comments

Joel
9 years ago

I have noticed this as well. Another irksome implication (given that the ‘or’ is taken in this context to be exclusive) is that no one has the two essential ingredients for knowledge – true belief. What an embarrassment.

Greg Littmann
9 years ago

Why are schools teaching that there are no moral facts? While there are powerful arguments that there are no moral facts, the public disagrees and there is no consensus in academia. So why is it being taught in schools as fact rather than mere opinion?

Stephen Clark
9 years ago

See C. S. Lewis, *The Abolition of Man* (1946) – it’s a longstanding problem that (some) educationalists purvey bad philosophy without ever admitting that it is philosophy at all.

Bob Kirkman
9 years ago

I offered the following comment in reply to the full post at http://opinionator.blogs.nytimes.com/2015/03/02/why-our-children-dont-think-there-are-moral-facts/ .

I’ve long chafed at the language of “fact” v. “opinion”, which is grounded in a very particular, limited view of human cognition. In my own ethics courses, I work actively to undermine the distinction, focusing instead on considered judgment . . . or even more narrowly, on consideration itself. (See http://wp.me/p5Ag0i-6M )

With this, I think I’ve had some success in overcoming the easy relativism my students often bring with them, introducing them to the standards and methods of critical ethical inquiry – and to the idea that one moral claim might be more plausible than another – none of which reduces to a finite set of “moral facts.”

I take more serious issue with Professor McBrayer’s causal argument: because students are taught the difference between fact and opinion, for example, they cheat in college. There is at best a correlation here, which does not prove causality.

That the fact/opinion distinction is built into Common Core is as likely to be an effect as a cause of wider patterns in American culture, and the alleged increase in cheating may just as much be the effect of institutional arrangements and competitive pressures on students (regarding which see http://wp.me/p5Ag0i-6Q ).
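
To make the correlation-versus-causation worry concrete, here is a toy sketch in Python using entirely made-up variables (no real curriculum or cheating data is involved): a single unobserved “cultural” factor drives both curriculum adoption and cheating, producing a sizeable correlation between the two even though neither influences the other.

```python
# Hypothetical illustration of a spurious correlation via a common cause.
# None of these variables are real measurements; only the pattern matters.
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

cultural_pressure = rng.normal(size=n)                # unobserved common cause
curriculum = cultural_pressure + rng.normal(size=n)   # driven by culture, not by cheating
cheating = cultural_pressure + rng.normal(size=n)     # driven by culture, not by curriculum

# Curriculum never appears in the cheating equation, yet the two correlate.
print(np.corrcoef(curriculum, cheating)[0, 1])        # roughly 0.5

# Subtracting out the common cause removes the association entirely.
print(np.corrcoef(curriculum - cultural_pressure,
                  cheating - cultural_pressure)[0, 1])  # roughly 0
```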

Ben
9 years ago

“The inconsistency in this curriculum is obvious. For example, at the outset of the school year, my son brought home a list of student rights and responsibilities. Had he already read the lesson on fact vs. opinion, he might have noted that the supposed rights of other students were based on no more than opinions.”

The inconsistency is hardly obvious. There are obviously facts about social expectations, norms, disciplinary procedures, etc. What is not obvious is that these norms and expectations are grounded in absolute fact.

Duncan Richter
9 years ago

I wish McBrayer hadn’t written this: “many of the things we once “proved” turned out to be false.” It’s true, no doubt, but surely none of the things we once proved (as opposed to “proved”) turned out to be false. I don’t see proof as “a feature of our mental lives”. Other people have noted other shortcomings above. Even so, I think the piece is more right than wrong, and brings attention to a real problem.

Eric
9 years ago

This curriculum is much older than Common Core (although I suppose it may be newly implemented in some districts as a result of CC?). I know of a dozen or more elementary schools across four Midwestern states that have been teaching this for nearly a decade.

Six years ago or so, when I was in graduate school, my then-seven-year-old nephew asked me to help him with one of these Fact vs. Opinion worksheets. Even apart from the worries expressed in the article, the whole curriculum was rather confused; one thing that still sticks in my head to this day is that several statements classified as Opinions should, by the curriculum’s own lights, count as “Facts about Opinions” (for instance, statements like “More people like chocolate ice cream than vanilla ice cream”, which could be shown to be true via survey, were labeled as Opinion).

Abraham Graber
9 years ago

I start all of my introductory level courses with a short lesson on logic. Whenever I define statements as “sentences that are either true or false” and ask for an example of a sentence that is *not* a statement, some student mentions “opinions.” Though the Stone article does not explicitly make the point, most of my students are taught that opinions are neither true nor false. This is, of course, immediately self-defeating, since the claim that “opinions are neither true nor false” cannot be “tested or proven” and, as such, is an opinion, and thus neither true nor false.

This is, I think, the most obvious and devastating problem with the supposed fact vs. opinion distinction. It makes my blood boil that students are being taught a philosophical view that is so obviously a non-starter. Has whoever is putting together this pedagogy never bothered to take five minutes to think about what they’re telling students?

Anon
9 years ago

“I wish McBrayer hadn’t written this: ‘many of the things we once ‘proved’ turned out to be false.’ It’s true, no doubt, but surely none of the things we once proved (as opposed to ‘proved’) turned out to be false.”

Perhaps his phrasing is a little misleading (though he was careful to scare quote ‘proved’), but the point is still necessary, since the point is to distinguish between proof and the belief that something is proven, between fact and assertion of fact. The point is that facts and proofs are always, in practice if not in principle, matters of opinion: to assert that something is factual or proven is to believe it so and, consequently, to possibly be mistaken.

The ideological function of our culture treating facts and values as opposites is to deny us the moral right to question an authority’s claims about facts (e.g., yellow cake, WMDs), while also denying us the moral right to question an authority’s moral authority (sure, you believe torture’s wrong, but it’s all a matter of opinion, so shut up and play with this neato phone).

So, it’s necessary to remind everyone that fact and opinion overlap in assertion. And that ignoring this always serves to prop up given authorities.

NCBD
9 years ago

Philosophers are really on the front lines of this–we have to deal with the consequences of these sloppy distinctions. (One question I often get: ethics is all opinions, so how can grades even be assigned?) Part of the way I handle it in my class is to talk about the difference between opinions and arguments (claims supported by reasons). I find this is very effective. Getting them to construct good arguments is hard, but at least they understand what they are trying to do. Getting them to question their naive relativism has been much harder.

Jamie Dreier
9 years ago

I am pretty sure ‘fact’ in this context doesn’t mean what philosophers mean by it. It means more like ‘publicly establishable truth’ or ‘publicly known truth’.

While it’s a bit irritating, philosophers shouldn’t get too upset when non-philosophers don’t use words the same way we do.

Chris
9 years ago

I am in my early 30s now and definitely remember these kinds of lessons in my middle school grammar books. In the same workbook where you had to circle the subject and verb of a sentence, or distinguish declarative and interrogative sentences, would be pages where you had to put “F” or “O” for fact or opinion.

Now that McBrayer points out the philosophical implications of these exercises, I definitely agree that they are problematic!

I suppose the key questions with regard to middle school instruction are: what skill is being taught in these exercises, and how can the exercises be repackaged to avoid divisive philosophical or religious assumptions?

Andrew Flynn
9 years ago

I find the following claims, found in some of the worksheets, distinctly disturbing: “Pizza is not a healthy food to eat. Answer: Opinion. Explain: What one person views as healthy might be unhealthy to another.” “Eating fast food isn’t bad if you only eat it once a week. Answer: Opinion.”

So, to be clear, we are teaching our children that claims about human health are not truth-apt. And we wonder why the anti-vaccine movement has so much traction.

JDRox
9 years ago

Yes, but one can still believe publicly established truths. The problem is that even if we grant the curriculum its definitions, many opinions are facts.

JDRox
9 years ago

On the one hand, I agree that the inconsistency isn’t blatant, but I do think it is there. Can you (Ben) say a bit more about what your criticism is? What does it mean for something to be “grounded in absolute fact”?

Anon
9 years ago

The comments on the piece at the NYT website seem to confirm that philosophers are fighting a totally losing battle, even with respect to those who read the Times:

“He is arguing, in a dishonest form, that public schools should be teaching moral facts. Of course moral facts is code for the Ten Commandments.” (currently the second most popular comment)
“Mr. McBrayer needs to take a few science classes to get his head straight on this issue. That’s a fact.” (currently second)
“The phrase “moral facts” is deliberately provocative; it surely derives not from an ethical perspective but a religious perspective.” (currently fifth)
“I think for the purpose of a second grader, the definitions are adequate. For people who wish to study and live their lives in the pursuit of philosophy, splitting the definitions of words over and again it may not be. But then that’s why philosophers outside the academic world are called baristas.” (currently seventh)

I’m going to go cry in my office with the door closed now!

Avi
9 years ago

If the fact vs. opinion distinction is both bad philosophy and a morally harmful myth, we need to ask whether inculcating “good morals” in young people requires giving them good philosophy or whether it requires giving them good myths, or perhaps something else entirely. Putting aside whether it is good philosophy (I think it isn’t), is talk of “moral facts” a beneficial myth? Teaching philosophy rather than myths to young people would probably involve a critique of the fact vs. opinion distinction. Taking courses in moral philosophy, however, is no guarantee of becoming a morally good person and sometimes has the opposite effect.

Ben
9 years ago

JDRox – What does it mean for something to be “grounded in absolute fact”? Well, let us imagine you and I disagree about whether or not there are any camels in Mexico. We have different perspectives, but common sense would hold that one of our perspectives accords with the way things are, i.e. that there is some non-perspectival feature of the world that endorses one of our perspectives – namely, the presence or absence of camels in Mexico (general worries about truth aside). The question is, can a moral disagreement (about, say, the wrongness of abortion) be similarly resolved by appeal to some non-perspectival features of the world?

If so, then we might say that one of our views is grounded in absolute fact. This is what I meant. In the quote from the article, McBrayer seems to me to suggest that it is inconsistent to have a moral perspective, while supposing that that perspective has no such ‘absolute’ endorsement. This is far from obvious to me. Consider a Hobbesian picture, for example – we all might share a moral perspective not because it makes Really True claims about right and wrong, but simply because it is to our mutual advantage to do so.

O. Pine
9 years ago

‘Eric’ points out above that the fact/opinion distinction (in the form of a ‘fact vs. opinion’ distinction) is obviously much older than the Common Core requirement, and he is right. It should also be pointed out that McBrayer’s complaints have been raised before, even by philosophers – Perry Weddle’s “Fact from Opinion”, for example, mentions some difficulties with and inconsistencies in the distinction as it was discussed and taught way back in 1985. It is good to see the complaints reaching a wide audience, however, since Weddle’s odd little article was published, I think, in a journal of informal logic, and nobody ever reads those.
As far as I have been able to determine (I have been looking into this as a side project for the past few months), versions of the ‘distinction’ have been part of our elementary and high school curriculum since at least the 50’s, so I sincerely doubt that any recent insensitivity to the wrongness of cheating is a result of Common Core implementation, or of any other particular recent educational policy, unless we are meant to believe that American students have somehow ‘gotten better’ over time, and are now actually learning what we are trying to teach them.
A more plausible scenario, I suppose, would be that teachers are ‘getting better’ at teaching the distinction, and there is some reason to believe that educational thought in the 80’s and 90’s, influenced by post-modernism and social constructionism, was keen to endorse teaching truth-relativism-friendly content. I am not sure, however, that it will be easy to point fingers at anyone in particular.
In my opinion, the ‘fact vs. opinion’ distinction has, in fact, a much more interesting although poorly understood history, and my suspicion is that versions of the distinction as we now know it began to be taught in elementary and high schools as a reading skill sometime after it began to be taught in journalism schools as a writing skill, but I have no solid evidence of this.
Here is what seems to be the case: Journalists do tend to believe that distinguishing fact from opinion is a valuable journalistic skill, and not just because they want to provide the public with ‘the facts’, unclouded by ‘mere opinion’. The nature of American defamation law may be partly responsible for this common position, since defamation law has sometimes invoked a distinction between ‘statements of fact’ and ‘statements of opinion’, with statements of opinion being ‘protected’ under the law. Journalists, then, have a practical interest in separating fact from ‘mere opinion’: the first step in a defamation case has often been to determine whether what was said is a ‘statement of fact’, and so ‘possibly false’, and newspapers can easily avoid lawsuits by defending what they publish as merely a ‘statement of opinion’.
This legal formulation of the distinction, in terms of different types of ‘statements’, is frequently (and often inconsistently) used in primary school materials, and a common justification for teaching the distinction involves equipping students with the ability to separate fact from opinion in news coverage of recent events, which leads me to believe that the influence trickles down, from law, through journalism, into the classroom.
Like most stuff that trickles downward, the distinction is somewhat gross, of course, and if I remember correctly, the people at the top are starting to notice. I believe the Supreme Court has recently denounced the fact vs. opinion distinction as ‘spurious’, and suggested that its use be discontinued when determining issues of accountability in defamation cases.
If these pseudo-historical speculations are correct, and the ‘fact vs. opinion’ distinction presented in the elementary and high school curriculum has (if I can change metaphors here) deep roots both in journalism and law, clearing up the widespread confusion is going to be extraordinarily difficult. The fact vs. opinion distinction is believed (by the teachers I have questioned) to be very useful as a means of effectively tracking dozens of other undeniably important distinctions (e.g., fact/value, belief/preference, explanation/justification, proof/evidence, argument/inference, justified/unjustified belief, controversy/common ground, etc.), and convincing them that it is not only ineffective, but potentially very harmful to the individual and to society as a whole (which it is!), will take more than an opinion piece, especially since we philosophers have no single distinction to replace it with.
Try telling elementary school teachers that what they ought to be required to do is teach a room of screaming kids 13 different, complicated distinctions, and see what sort of looks you get.

Chris
9 years ago

I guess I would propose a three-fold distinction:

(1) Empirical claims (to be verified by the sciences)
(2) Religious and philosophical claims (can be true or false, but would be justified by non-scientific methods–philosophical argument, logical analysis, religious experience or lack thereof, etc.)
(3) Preferences (desires and matters of taste)

I think the distinction is relevant to middle school language students, because it tells students how to justify claims and what kind of justification to expect from others.

Andrew Moon
9 years ago

One thing that irks me is the presumptive attitude expressed by some of the commenters who clearly are not trained philosophers. I am not saying that such commenters should not criticize or ask questions, or that McBrayer is right about everything. It’s the dismissive “this guy is OBVIOUSLY making mistake X” attitude toward someone who is clearly their epistemic superior on the issue.

Here are some relevant things a trained philosopher could do:

– Distinguish between an opinion and the proposition that one has an opinion toward. (There is the OPINION that the dress is yellow and the PROPOSITION that the dress is yellow)
– Have at least a general description of a proposition (even if we have different theories of propositions, everybody will agree that propositions are truth-bearers and are objects of belief and knowledge)
– Be able to distinguish between a rational belief, a true belief, and a proven belief (and be able to easily give instances where you have one but not the others).
– Easily know the difference between knowledge and mere true belief and be able to give examples where you have one but not the other.
– At least question the mutual exclusivity of options (as he points out, one can have an opinion that p AND it might also be a fact that p).
– See implications that anybody who’s taken an Intro Ethics class should be able to see (e.g., “If there are no moral facts, then it is not a fact that the Holocaust was bad.”)

These are distinctions and points that anybody who’s had a decent grad. school philosophy education will be able to discern pretty easily. In fact, I hope that my Intro students can make at least most of these distinctions.

If training in philosophy doesn’t do anything, then there is little reason for many of us to have busted our butts in grad. school. But as nearly anybody who has gone to grad. school in a decent program will tell you, we clearly did grow and progress in our philosophical abilities and knowledge. Many in the public do not seem to appreciate that spending 5+ years in graduate school honing one’s thinking skills, learning logic, learning the views of great thinkers throughout history, dialoguing w/ other professors, fellow grad. students, and nonphilosophers (students), and doing philosophy constantly, I mean, EVERY DAY, would give someone skills and knowledge of the subject that they might not have. We have our papers torn to shreds by our professors and then are told, “Write it again.” We must present papers before audiences of really smart and well-read people who will ask hard questions about the slightest possible holes in our arguments. The public just doesn’t know how much LEARNING happens when one undergoes these experiences.

Their attitudes, in my mind, betray a complete ignorance that there is something to be learned in philosophy grad. school. How else could they so boldly and easily dismiss him?

Of course, this is only one of many problems to identify in the comments section. Like most, I am saddened and feel overwhelmed by it. But I guess the way forward is to start with one person at a time, do our best to teach our classes well, and portray philosophy (and what we do in grad. school) in a positive and informative manner to our nonphilosophy friends.

David Wallace
9 years ago

I think some people are being a bit unfair on the philosophical cogency of the “fact vs opinion” distinction. From the description given, it sounds basically like simplified logical positivism, minus any discussion of analytic truth, and with the emotivist theory of ethics that the positivists generally used. It’s way too quick to suppose that this position is trivially self-undermining or that emotivist ethics trivially entails moral nihilism.

Of course, we shouldn’t be teaching schoolchildren logical positivism as unchallenged orthodoxy (or probably at all) given that it’s been so widely rejected – but it was rejected for fairly subtle reasons, not blindingly obvious ones.

JDRox
9 years ago

This seems apropos: http://econlog.econlib.org/archives/2012/06/there_is_no_arb.html

“Real subjects have no arbiter.”

Nick
9 years ago

I agree with O.Pine, #19, that the fact/opinion distinction as taught by schools is likely taken from our judicial system, where the Supreme Court appeals to just such a distinction for purposes of determining whether speech is defamatory. I am neither lawyer nor legal scholar, but a quick perusal of the literature suggests — in agreement with Jamie Dreier, #11 above — that, roughly, facts are claims for which there exist cogent/strong appeals to testimony, while opinions are claims for which there exists no such cogent/strong appeal. Appeals to testimony are understood quite broadly, to include appeals to the testimony of our senses as well as more familiar appeals to expert testimony and testimony from those in positions that authorize their likely knowledge. Even if, as O.Pine suggests, the Supreme Court no longer considers the fact/opinion distinction legally helpful, surely we all can agree that there is a logically significant distinction between strong appeals to testimony and weak ones — and so we further can agree that there is a logically significant distinction between fact and opinion, understood as claims that conclude such respective arguments.

A fact/opinion distinction grounded upon strong/weak appeals to testimony makes sense of comments that, for philosophers, seem absurd (as reported in some prior comments here). Facts can change, because the strength of appeals to testimony can change (argument by appeal to testimony is non-monotonic) — and so arguing that facts never change amounts to assuming that we can never have better evidence than we have now. There are no moral facts, because there are no established moral authorities (due to pervasive disagreement) — and so arguing in favor of moral facts amounts to arguing that there is, in fact, some established moral authority (and, in the US context, a charitable interpretation is that the authority is the Pope, the Ten Commandments, or some other Christian source). Pizza’s being healthy is an opinion, because (I suppose) our nutritional experts espouse mixed testimony about pizza’s health. (Note that this has nothing to do with truth-aptness.)

I would encourage philosophers to search for more charitable interpretations of the fact/opinion distinction as used in our schools. I wonder, too, whether others have considered explicating the distinction as I have suggested (in terms of appeals to testimony) and, if so, how closely that tracks school homework answers and such. Some bothersome aspects of some comments in this thread, and comments I’ve seen elsewhere when philosophers discuss this issue, are the eager pleasure philosophers take in dismissing the fact/opinion distinction as a bit of nonsense without considering options for more charitable interpretation, the glee expressed in taking vague or borderline cases to undermine the distinction entirely, and the ease with which discussion of the fact/opinion issue is replaced by discussions of much more abstruse and academic issues (positivism, postmodernism, emotivism, “absolute” facts, …). If we think of comments as sketches of undergraduate term paper ideas for an assignment geared to making sense of the fact/opinion distinction as used in education (perhaps part of a philosophy of education course…), I would not award good grades to many.

A teacher writes
9 years ago

A few things to bear in mind:
1. One of the most important, and difficult, tasks high-school teachers face is getting kids to understand (a) the importance of supporting their assertions with arguments and evidence and (b) what counts as evidence.
2. All teachers make outrageous ad hoc distortions and simplifications for the purpose of instilling methods. (“Remember, Timmy, in this essay I want only facts, not opinions. You remember the difference?” [points to shiny poster])
3. Many teachers, and very many authors of curricula, are quite stupid, and so easily come to take their own ad hoc simplifications literally.
4. Dealing with controversial moral topics in the classroom is very hard and sometimes career-endangering. Many teachers avoid this difficulty by dividing moral issues into (a) things about which there can be no discussion because there is one self-evident and obligatory answer and no room for opinion, and (b) things about which there can be no discussion because it’s all just a matter of opinion and all opinions are equally valid and deserve respect and please don’t report me to your priest/congressman/lawyer/talk-radio host.

Bob Kirkman
9 years ago

“4. Dealing with controversial moral topics in the classroom is very hard and sometimes career-endangering. Many teachers avoid this difficulty by dividing moral issues into (a) things about which there can be no discussion because there is one self-evident and obligatory answer and no room for opinion, and (b) things about which there can be no discussion because it’s all just a matter of opinion and all opinions are equally valid and deserve respect and please don’t report me to your priest/congressman/lawyer/talk-radio host.”

This point about “controversial topics” suggests a still more charitable interpretation of the fact/opinion distinction: it’s rooted in a kind of misplaced democratic pluralism.

Now, the question: Is there a way to help second graders to understand – without just confusing them – that some claims are well established and widely accepted (e.g., “the earth is round”) while reasonable people may disagree about some other claims (e.g., “the Affordable Care Act should be repealed and replaced with something else”), but that the latter set of claims have cognitive content and may be the subject of critical inquiry and reasoned debate?

I suspect there is language clear and accessible enough in which to do that. At the very least, I think I have found some ways to bring students in my college-level ethics classes around to such an understanding. In one version, from long ago, I started with a “fact/opinion” worksheet, followed by a worksheet with examples of the very same opinion expressed in various ways – some supported by argument, others backed by mere assertion, or insults, or emotion, or bad arguments – and asked students to rank them . . . then to derive the basis of their ranking.

A teacher writes
9 years ago

There certainly are ways. In my experience, formal debate is a very useful tool. The process of reasoning is turned into a rule-governed competitive game. If you want to win, you have to grasp the difference between an assertion and evidence, and to think carefully about what kinds of evidence can support different kinds of claim. It’s important that students also get to participate, as judges or audience members, in weighing up the arguments. The teacher can manipulate the format as necessary without being seen to pick sides or impose their prejudices.
Truly teaching usually works much better than mere telling. Trying to explain subtle distinctions, or getting students to memorise and regurgitate textbook definitions, just leads to trouble. If you just say, e.g., ‘your emotions are not evidence’, weak students will feel belittled and provoked, and the smart-alecs will pick holes in your categories. Let them try to argue from pure emotion in a debate and learn the limitations.

Paul Hammond
9 years ago

Andrew Moon @21: I agree that the comments are incredibly disheartening for the reasons that you cite. It would be a lot easier to defend McBrayer, though, if he didn’t make what seems like (to me, someone who did go to philosophy grad school for 5 years) an obviously ridiculous argument when he said: “[in the world beyond grade school] consistency demands that we acknowledge the existence of moral facts. If it’s not true that it’s wrong to murder a cartoonist with whom one disagrees, then how can we be outraged?”

I understand that that’s probably a synecdoche for a complete argument that isn’t stupid, but as it is it precisely begs the question against expressivism. If philosophers want members of the public to take our expertise seriously, it might help to be a little more considered and less polemical and superior than Prof. McBrayer is. It might not, of course, and then we can all go back to crying in our offices, but if we want people not to think we’re smug, we could make a special effort not to be smug.

Joshua Knobe
9 years ago

The discussion here thus far has focused on fairly traditional metaethical issues, but McBrayer also makes a claim that goes beyond what philosophers have traditionally done in metaethics. Specifically, one of the central points of his piece is that people’s moral relativism is in part the product of their education.

Now, this seems to be a straightforward empirical claim. It is a claim about the causal origins of people’s moral relativism. The proper way to assess this claim, it seems, would be to look for empirical data about moral relativists.

Over the past few years, there has actually been a surge of empirical work on moral relativism, and a lot is now known about the topic. To give just one example, studies consistently find that people who see morality as relative tend to be more tolerant, while those who believe in objective moral truths tend to be more intolerant. For example:

http://www.uvm.edu/~phildept/documents/M-EP-Wright-background2.pdf
http://leeds-faculty.colorado.edu/mcgrawp/pdf/Goodwin.Darley.2012.pdf

These data provide some evidence to indicate that an attitude of tolerance leads to relativism while intolerance leads to objectivism. (An alternative hypothesis, however, would be that the relativism is causing the tolerance rather than the other way around.)

Similarly, studies show a correlation whereby people who are high in the trait of ‘openness to experience’ tend to be more inclined to be moral relativists:

http://141.14.165.6/users/cokely/Feltz_%26_Cokely_2008_ProcCS.pdf

Finally, studies show that people who get the correct answers on certain logic problems are more likely to be relativists, while those who get the wrong answers to those problems are more likely to believe in objective moral facts:

http://link.springer.com/article/10.1007/s13164-009-0013-4/fulltext.html

An obvious interpretation of this last result would be that people’s ability to think logically in certain respects leads to an increased inclination toward moral relativism. (Here again, a number of different interpretations are possible.)

In saying all this, I don’t at all mean to suggest that there is something wrong with the hypothesis that people’s public school education leads them to be more relativist. That is certainly a plausible hypothesis, and one worth investigating. The point is simply that these are difficult empirical questions, worthy of serious and systematic research, and it certainly wouldn’t be possible for us to resolve them just by speculating based on anecdotal evidence.

Dave Baker
9 years ago

Hi Josh,

“studies consistently find that people who see morality as relative tend to be more tolerant, while those who believe in objective moral truths tend to be more intolerant”

Do these studies control for the degree of religiosity of subjects? Because my first thought is, of course moral absolutists are on average more intolerant: they’re more likely to be religious conservatives, because conservative religions teach moral absolutism, and conservative religious beliefs are a leading cause of intolerance.

I would expect that atheist (or e.g. Unitarian or Reform Jewish) moral absolutists would be about as tolerant on average as atheist (or Unitarian or Reform Jewish) moral relativists.
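
For readers unsure what “controlling for religiosity” would involve, here is a minimal statistical sketch with simulated data; the variable names, effect sizes, and the assumption that objectivism has no direct effect are all invented for illustration and are not taken from the studies under discussion. The point is only that a confound can make one variable look predictive of another until the confound enters the regression.

```python
# Hypothetical illustration of confounding and statistical control via least squares.
import numpy as np

rng = np.random.default_rng(1)
n = 5_000

religiosity = rng.normal(size=n)
# Invented assumption: religiosity raises both objectivism and intolerance,
# while objectivism itself has no direct effect on intolerance.
objectivism = 0.8 * religiosity + rng.normal(size=n)
intolerance = 1.0 * religiosity + rng.normal(size=n)

def ols(y, *predictors):
    """Least-squares fit with an intercept; returns [intercept, slopes...]."""
    X = np.column_stack([np.ones_like(y), *predictors])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta

# Without the control, objectivism appears to predict intolerance...
print(ols(intolerance, objectivism))               # slope on objectivism is clearly positive
# ...but with religiosity in the model, that slope collapses toward zero.
print(ols(intolerance, objectivism, religiosity))  # slope on objectivism is near 0
```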

Joshua Knobe
9 years ago

Dave,

Very good point. Those studies don’t control for religiosity, but other studies find, surprisingly, that religiosity actually does not itself predict objectivism.

Also, my faith in the media was somewhat restored by this wickedly entertaining Slate article, which responds to the NYT piece with a review of the experimental metaethics literature:
http://www.slate.com/articles/health_and_science/science/2015/03/does_common_core_teach_children_to_be_immoral_as_justin_mcbrayer_says_meta.html

Dave Baker
9 years ago

That result (religion doesn’t predict absolutism) really flies in the face of common sense. I wonder what definition of religiosity they’re using for these studies. Perhaps their definition does something like count a Reform Jewish person who goes to synagogue once a week as equally religious as an Evangelical Christian who goes to church once a week.

It would still be surprising, though, because my sense is that conservative religious types also go to church more often, etc., than non-fundamentalist types.

I thought the Slate article was a little over-zealous, since it granted that there is empirical evidence to back up most of McBrayer’s claims, yet referred to his article as “total crap.” (Dustin Locke pointed this out to me on Facebook.)

Joshua Knobe
9 years ago

Dave,

Very good point about religiosity. I checked back, and you are actually completely right. The key result is from Experiment 3 in the Goodwin & Darley paper (http://leeds-faculty.colorado.edu/mcgrawp/PDF/Goodwin.Darley.2008.Objectivity.pdf). They checked whether religiosity predicts objectivism after controlling for belief that ethics is grounded in God, but they do not report the result for religiosity when you don’t control for grounding in this way. My apologies for the error.

Then, regarding the Slate article, what you say is very reasonable, but I wonder if you might agree that there is a perspective from which what the author says is also reasonable. He seems to think that the most important thing is not which conclusion you arrive at but rather whether your inquiry is guided in the right way by the empirical data. So if you try your best to take the data into account but happen not to get the right answer, he would presumably think that you still got the most important thing right. Conversely, if you ignore all the data but just happen to get the right answer, he would think that you still got the most important thing wrong. Does that make sense?