Do you use Turnitin or SafeAssign in your courses to help deter and catch plagiarism? It turns out such software is not very good, reports Inside Higher Ed. Here are the results of a recent test conducted by Susan E. Schorn, a writing coordinator at the University of Texas at Austin:
Out of a total of 37 sources, the software fully identified 15, partially identified six and missed 16. That test featured some word deletions and sentence reshuffling — common tricks students use to cover up plagiarism.
Such results are consistent with her findings from a study conducted eight years earlier:
Of the 23 sources used in ways that faculty members would consider inappropriate in an assignment, Turnitin identified only eight, but produced six other matches that found some text, nonoriginal sources or unviewable content. That means the software missed almost two-fifths, or 39.13 percent, of the plagiarized sources.
We’re paying instructors less, we’re having larger class sizes, but we’re able to find money for this policing tool that doesn’t actually work. It’s just another measure of false security, like having people take off their shoes at the airport.
My university uses SafeAssign, and I have my undergraduates submit their papers through it as a preliminary screening measure. Sometimes I or my TAs catch passages that the software misses. I usually find a few cases of plagiarism a year, and I doubt we catch them all.
Like many other professors, I aim for writing prompts that don’t lend themselves to easy plagiarism, but, as the Internet grows, that is getting harder and harder to do. Comments about how you thwart plagiarism in your own classes would be welcome.
(Once, most of a student’s paper was marked by SafeAssign as plagiarized. I confronted the student with the accusation. He repeatedly and strenuously denied plagiarizing. Further investigation revealed I was mistaken. He hadn’t plagiarized. He had simply bought the paper from someone who had.)