A few years ago, a meta-analysis of studies about whether colleges do a good job of teaching critical thinking revealed “no differences in the critical-thinking skills of students in different majors.”
At the time, I wrote “it appears that philosophers lack good empirical evidence for what I take to be the widespread belief that majoring in philosophy is a superior way for a student to develop critical thinking skills.”
Perhaps this is not so surprising: course subject matter may make some contribution to the skills students acquire, but more important to skill development, it seems, is how the material is taught. And as with other disciplines, there is quite a bit of variability in how critical thinking is taught in philosophy courses, and some methods may work better than others.
One method that now has some empirical evidence counting in its favor is argument-visualization. A new study published in Nature’s education journal, Science of Learning, concludes that having students learn how to map arguments “led to large generalized improvements in students’ analytical-reasoning skills” and in their ability to write essays involving arguments.
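To give a rough sense of what argument mapping involves: an argument map treats a conclusion as a node in a tree, with branches for the claims offered in support of it (and, where relevant, objections to it). The sketch below is purely illustrative — it is not code from the study, which had students build maps in a visual platform rather than programmatically — but it shows the underlying structure such a map encodes.

```python
from dataclasses import dataclass, field

@dataclass
class Claim:
    """A node in an argument map: a statement plus the claims for and against it."""
    text: str
    supports: list = field(default_factory=list)    # claims offered in support
    objections: list = field(default_factory=list)  # claims offered against

def render(claim, indent=0, marker=""):
    """Return the map as an indented outline: '+' marks support, '-' an objection."""
    lines = [" " * indent + marker + claim.text]
    for s in claim.supports:
        lines += render(s, indent + 2, "+ ")
    for o in claim.objections:
        lines += render(o, indent + 2, "- ")
    return lines

# A classic example, mapped as a conclusion with two supporting premises:
conclusion = Claim(
    "Socrates is mortal",
    supports=[
        Claim("All men are mortal"),
        Claim("Socrates is a man"),
    ],
)
print("\n".join(render(conclusion)))
# Socrates is mortal
#   + All men are mortal
#   + Socrates is a man
```

The point of the visual exercise, as described in the study, is that making these support relations explicit forces students to articulate exactly how each premise bears on the conclusion.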
The study, “Improving analytical reasoning and argument understanding: a quasi-experimental field study of argument visualization,” was conducted by Simon Cullen (Princeton), Judith Fan (Stanford), Eva van der Brugge (Melbourne), and Adam Elga (Princeton), and builds upon previous work on argument mapping that Professor Cullen has written about here.
The current study improves on previous research, its authors say, by being a controlled study, by using a well-known, reliable test of analytical reasoning, and by having students train “using real academic texts, rather than the highly simplified arguments often used in previous research.”
The study took place during a seminar-based undergraduate course, in which students worked in small groups, with intermittent faculty assistance, to construct visualizations of arguments in philosophical texts. Using LSAT Logical Reasoning forms as pre- and posttests, the researchers assessed the performance and improvement of the seminar students and a control group. They found that seminar students performed better on the posttest than they had on the pretest, and that their degree of improvement exceeded that of the control students. The researchers also had students in both groups write essays, and found that, in comparison with the control group, seminar students better understood the relevant arguments, structured their essays more effectively, and presented their arguments more accurately.
Below is an example of an argument-visualization exercise. You can read the study here. If you are interested in trying out argument-mapping, you may be interested in MindMup, a free online argument-mapping platform Professor Cullen helped develop (discussed here).