Presentation 23 June 2010. 4th International Plagiarism Conference, Northumbria University. The advent and use of digital technologies, which open up a plethora of useful and credible information for use by students, at the same time expose the risks of uncritical and unacknowledged use of other people's work. Institutions have met these concerns with the implementation of electronic detection systems. The situation has moved very quickly, from the introduction of the UK national licence for Turnitin in 2002/3 to the present situation, where this software is used by over 95% of Higher Education Institutions. Electronic detection of plagiarism is one of the most widespread technologies used in education, and the evidence base for its use is only just beginning to yield results. This paper will examine the evidence to date for the effects of plagiarism detection systems. It is based on an HEA-funded review, 'Dealing with plagiarism in the digital age', which is available online at http://evidencenet.pbworks.com/Dealing-with-plagiarism-in-the-digital-age.
How effective are electronic plagiarism detection systems and does it matter how you use them? Reviewing the evidence
1. How effective are electronic plagiarism detection systems and does it matter how you use them? 4th International Plagiarism Conference 23 June 2010 Dr Jo Badge (@jobadge) School of Biological Sciences University of Leicester
6. Effectiveness Cross-comparison reviews mostly focus on usability. Live testing with scoring for detection rates has been carried out by Debora Weber-Wulff, whose tests rate SafeAssign above Turnitin in terms of detection rates.
7. Mode of use: prevention Long term effects Risk / benefit perceptions Punishment as education
21. Davis & Carroll, 2009. Reductions in: amount of plagiarism (45.5%); over-reliance on one source (45.5%); citation errors (62%); insufficient paraphrasing (38%). Percentages = proportion of final drafts showing a reduction, where n = 66 (over three years, 2007-2009).
23. Acknowledgements Higher Education Academy University of Leicester Teaching Enhancement Forum GENIE CETL Dr Nadya Yakovchuk Dr Jon Scott
Editor's Notes
How effective are electronic plagiarism detection systems and does it matter how you use them? Reviewing the evidence. Badge, J. (http://bit.ly/eDetection)
http://bit.ly/eDetection
This presentation is the result of a literature review carried out in November 2009 and funded by the Higher Education Academy. The full report is available on the evidencenet wiki: http://bit.ly/eDetection
All the references used in the review were added to a social citation site, CiteULike: http://www.citeulike.org/group/11256
This group is open and dynamic; new papers relating to the theme of electronic detection can be added.
Electronic detection systems for plagiarism only detect text that matches other sources; they do not detect plagiarism without human interpretation.
Turnitin: in the UK since 2003
EVE: essay verification system, now EVE2
CopyCatch: predominantly collusion only (the Open University has a custom version to compare against course texts and selected sites)
For a full list see http://delicious.com/jobadge/electronic_detection+software
Software systems available include:
1. Turnitin: example of a Turnitin report and a preview of the update coming in August 2010.
2. SafeAssign: now incorporated into Blackboard.
3. CopyCatch: more focussed on collusion/copying from set texts (used by the Open University for assignments and by UCAS to scan personal statements).
4. WCopyFind: collusion only, or internet copying if URLs of sites are submitted for comparison. 18 students left the University of Virginia after WCopyFind led to the investigation of 158 students for plagiarism. An early system developed by an academic (Lou Bloomfield) in Virginia (last updated Oct 2009).
Image URLs:
Turnitin preview: http://submit.ac.uk/static_jisc/ac_uk_tii_static_what_is_new_writecycle2_4.html
WCopyFind: http://plagiarism.phys.virginia.edu/WCopyfind_2.7.html
CopyCatch Gold: http://cflsoftware.com/?page_id=42
Debora Weber-Wulff gave parallel session 3 yesterday (22 June 2010) about testing systems: http://plagiat.htw-berlin.de/software/2008/
Tests were run on standardised pieces of work, created and copied in different ways. Three major tests so far: 2004, 2007 and 2008.
Long-term use (Culwin, 2006)
Data from three academic years differed only slightly when data from first-year submissions was compared internally. However, there is a decrease in the non-originality of work from first-year to third-year submissions as students progress through their university careers.
Culwin points out that there are many reasons why this could happen, including variance in the detection systems themselves. This is an important point when looking at baseline measures for comparison: for example, as the Turnitin database grows, the noise (false positives, or a high number of small matches) increases, so using straight percentages of non-originality from Turnitin for judgements is very misleading.
Image: screenshot from paper, Culwin, 2006
Badge (2007): deterrent effect
In our initial small-scale trial, the detection rate was 2.06% (n = 2 out of 97 submissions tested), representing 18% of the total cases of plagiarism detected that year. In the first full year of use, the detection rate rose to 2.73% (n = 14 out of 513 submissions tested), representing 40% of the total number of cases detected. In the second year of use, both the total number of cases and the detection rate fell to 0.94% (n = 10 out of 1060 submissions tested), representing 71% of the total number of cases detected.
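The detection rates above are simple percentages of flagged submissions out of submissions tested; as a minimal sketch, the figures can be recomputed from the counts reported in the text (the helper function name is ours, not from the study):

```python
# Recomputing the detection rates reported in Badge (2007).
# The (detected, submitted) counts come from the text; the
# function name is an illustrative choice, not from the paper.

def detection_rate(detected, submitted):
    """Percentage of tested submissions flagged for plagiarism."""
    return round(100 * detected / submitted, 2)

rates = {
    "initial trial": detection_rate(2, 97),     # 2.06%
    "first year":    detection_rate(14, 513),   # 2.73%
    "second year":   detection_rate(10, 1060),  # 0.94%
}
print(rates)
```

Note that the percentage of the year's total detected cases (18%, 40%, 71%) is a separate ratio, reflecting how much of each year's caseload came through the software rather than through markers.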
Risk/benefit perceptions by students: Woessner, 2004
An interesting use of risk/benefit analysis to decide what level of penalty would deter students from plagiarising.
Punishment as education (Johnson, 2004; Sutherland-Smith & Carr, 2005; Bennett 2005)
Free/paid systems aimed at students include:
Free: Viper http://www.scanmyessay.com/
Paid: WriteCheck (provided by Turnitin) http://www.writecheck.com/static/home.html
Single paper upload is $4.95 (up to 5K words) or bulk purchase $49.95 (up to 200K words, or 40 paper credits).
Very few institutions in the UK offer complete open access (resubmission in Turnitin limited to once every 24 hours)
Tutor-released evidence of plagiarism resulting in a reduction of plagiarism in subsequent submissions
University of Illinois at Urbana-Champaign. Two classes of first-level politics students had two essays to write. Tutors gave verbal and written warnings against plagiarism and used EVE for detection on the first essay. Plagiarised essays were marked down; since marking was on a grading curve, this had the effect of lowering the mean, so some honest students received higher grades than they would otherwise have done. Tutors fed this information back to students prior to the second essay submission and answered questions about plagiarism. Only one student submitted a paper containing plagiarism for the second essay; it turned out she had missed the class with the feedback and had not read a student newspaper article about it, as a new part-time job was keeping her busy.
Tutor gives generic feedback about penalties awarded on one assignment prior to submission of a second
Ledwith (2008) taught a class of 205 first-year engineering students in Ireland, which already utilized peer review to ease the burden of marking whilst providing students with individual feedback on their work. Of six summative peer-reviewed assignments, four were paper based and two were submitted electronically. The electronic submissions were scanned by Turnitin for non-originality. Students were told of fixed penalties that would be applied to work above certain thresholds of non-originality, and they were informed (as a group) of the generalized results of the detection software after their first electronic submission. The amount of non-original text detected in the second assignment was statistically significantly less than in the first (Ledwith & Risquez, 2008). (Amount of plagiarism = Turnitin score above 25%.)
Image: screenshot from paper, Ledwith & Risquez, 2008
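The fixed-penalty scheme described above can be sketched as a simple threshold rule: work whose Turnitin non-originality score exceeds the 25% threshold attracts a fixed mark deduction. This is an illustrative paraphrase under stated assumptions; the function name and the specific penalty value are ours, not from Ledwith & Risquez (2008):

```python
# Illustrative sketch of a fixed-penalty rule for non-originality
# scores. THRESHOLD matches the 25% figure in the text; PENALTY is
# an assumed value for illustration only, not taken from the paper.

THRESHOLD = 25  # % non-originality above which work counts as plagiarised
PENALTY = 10    # assumed fixed mark deduction (illustrative)

def apply_penalty(mark, tii_score):
    """Deduct a fixed penalty if the Turnitin score exceeds the threshold."""
    if tii_score > THRESHOLD:
        return max(0, mark - PENALTY)
    return mark
```

The design point is that the penalty is announced in advance and applied mechanically above a published threshold, which is what makes the generic group feedback after the first assignment an effective deterrent for the second.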
Students rated their peers' performance significantly lower when the peer review system in Turnitin was used to correct the assignments (over paper corrections), suggesting that the use of this technology altered their perceptions of the standards of the work.
Image: screenshot from paper, Ledwith & Risquez, 2008
Tutor-supported evidence of plagiarism given to students in colour, with resubmission offered as an option
Used Turnitin and Ferret to produce a combined report on the likely originality of student work, highlighting matching words in colour. This was fed back during a tutorial with tutor-supported interpretation. The express aim of tutors was NOT to coach students on how to beat Turnitin by changing matching words alone. Resubmission was allowed if plagiarism was detected, but the mark was capped at a pass. Resource intensive for staff.
Image: screenshot from paper, Barrett & Malcolm, 2006
Tutor-supported access to Turnitin reports
Students wrote a draft of a 3000-word assignment (Master's level dissertation preparation) and worked in a one-to-one tutorial with a tutor to look at the originality report for the draft. Resubmissions for the final draft showed marked reductions in poor practices.
http://bit.ly/eDetection
This presentation is the result of a literature review carried out in November 2009 and funded by the Higher Education Academy. The complete synthesis is available on the evidencenet wiki.