Turning in intellectual property

Turnitin.com controversy sparks legality questions

Alyssa Higgins

Students nationwide have begun to refuse the policy of turning in written work to Turnitin.com, a site that checks for plagiarism.

Story by Maddie Anderson, staff writer

Every student is familiar with Turnitin, a for-profit software service designed to detect plagiarism in student-submitted work. But as the site expands and is integrated into more and more schools, students and teachers alike are questioning the software's validity and legality.

The software is set up so that all submitted work is added to a database and cross-checked for plagiarism against 22 million student papers, online sources and electronic journals. Students around the country are pushing back, objecting to Turnitin's automatic addition of their essays to this database and arguing that it infringes on their intellectual property. Some have gone as far as petitioning against their school's use of the site; at McLean High School in McLean, Virginia, one such petition collected more than 1,190 student signatures.

“It irked a lot of people because there’s an implication of assumed guilt,” Ben Donovan, a senior at McLean High, said. “It’s like if you searched every car in the parking lot or drug-tested every student.”

Questions surrounding the validity and efficacy of Turnitin’s detection ability are spreading well beyond McLean High. Susan Schorn, a writing coordinator at the University of Texas at Austin, has been studying the accuracy of Turnitin’s plagiarism detection since 2007.

To test the software’s dependability at detecting plagiarism, Schorn submitted six essays containing work copied from 23 sources, including textbooks, syllabi, Wikipedia and free online essays. Of these 23 sources, Turnitin flagged only eight inappropriate uses of text outright, along with six other matches to non-original sources. Even counting both sets of matches, the software missed nine of the 23 plagiarized sources, nearly two-fifths.

“We say that we’re using this software in order to teach students about academic dishonesty, but we’re using software we know doesn’t work,” Schorn said. “In effect, we’re trying to teach them about academic dishonesty by lying to them.”

Susan Lang, a Texas Tech professor of rhetoric and technical communication and director of first-year writing, reached similar conclusions. In 2009, Lang conducted her own research into Turnitin’s abilities and found that the software frequently flagged phrases that weren’t actually plagiarized while missing copied text, producing frequent false positives and false negatives.

“Given the exponential explosion of information quantities on the Internet, as well as the constant shifting and posting of content, the potential is always there for missed or inaccurate detection,” Lang wrote. “However, one would think that in eight years, Turnitin would have been able to improve their performance with false negatives — that doesn’t seem to be the case.”

As the site continues to expand and see more frequent use, Schorn says, questions about the software’s effectiveness have taken a back seat to the convenience of quick grading.

“The real ethical question is how you can sell a product that doesn’t work to a business — the sector of higher education — that is really strapped for cash right now,” Schorn said. “We’re paying instructors less, we’re having larger class sizes, but we’re able to find money for this policing tool that doesn’t actually work.”