Adventures in Technology Enhanced Learning @ UoP

Tag: turnitin

Similarity scoring is a secondary consideration for online assessment…

Similarity scoring should be a secondary consideration for online assessment. Much more important factors, from my point of view, are ease of marking for academics; access to quality feedback for students; and innovative authentic assessment workflows.

Turnitin is close to monopolising the market for similarity scoring of student papers, but many assessment platforms already use Turnitin or Urkund as plug-in services to provide similarity scoring.

Where should we be focusing our effort at UoP?

As an institution, one of our strengths lies in quiz/question-based assessments. This is particularly the case in the Science and Technology faculties. We have a mature, sophisticated platform in Moodle to deliver these types of assessment, and a deep level of staff expertise across the organisation, which has developed further throughout the pandemic.

The risk factors for UoP include the need to increase capacity for online exams (or move some of our assessment types onto an external platform at peak periods) and the ability to innovate in terms of essay/file-based assessments.

From what I can see, Turnitin has stagnated in terms of assessment innovation in recent years and has not yet improved service reliability at key assessment periods by migrating its platform to a service like AWS. This has been promised repeatedly but not yet delivered.

This is potentially why we saw growth in Moodle assignment and quiz usage during the pandemic, rather than a big increase in Turnitin usage: trust in the reliability of the service and the flexibility of the functionality.

So where could we focus our effort to improve the assessment tools for educators and students to gain the most benefits?

Innovative assessment workflows

Posing a long-form question to a student and easily marking the finished product should be a simple process – and it is on platforms such as Turnitin. However, we are increasingly adapting our assessments to be more authentic: assessments that more closely match how students will operate in the workplace. This often requires more sophisticated workflows and mechanisms, which should still be straightforward for academics to engage with and make sense of if they are to be successful. 

Traditional paper-based exams (potentially bring your own device)

During the pandemic staff were forced to transition away from paper-based exams. Many exams were instead delivered as coursework or window assignments (e.g. a 2hr assignment within a 24hr window) or as question-based quiz exams. When exam halls are available again staff may revert to previous paper-based solutions. After all, we know how these work, and paper doesn’t need charging or a stable wifi connection. However, we can harness this forward momentum with a platform dedicated to supporting timed essay assignments on students’ own devices or University machines. Several platforms offer functionality for students to download assignments at the start of an exam with no need for an internet connection until it’s time to submit at the end. This could represent a robust, safe exam experience that more closely matches how students study today. Who handwrites for three hours any more? I’d be willing to bet most students don’t.

There are challenges with BYOD (bring your own device) particularly around charging and ensuring student machines are reliable. Many of these challenges can be solved with a small stock of fully charged devices, which can be swapped out to students when needed. Chromebooks are ideal online exam devices for this very reason, due to their long battery life and simple configuration. 

Assessment feedback

Workflows such as “feedback before grades” can help students better engage with their feedback, but better access to feedback for students in a variety of places is also key.

Services that offer a holistic view of assessment feedback, or the ability to extract these comments via API so we can build our own views, are increasingly valuable. This functionality will enable key staff such as personal tutors or learning support tutors to view student feedback as a whole (rather than in silos) to spot key areas to help students improve their academic work.
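As a rough sketch of what "building our own views" could look like, the snippet below aggregates feedback comments into a per-student, per-unit view. The record format is hypothetical, assumed purely for illustration, and does not reflect any real Turnitin or Moodle API.

```python
# Sketch: aggregate a student's feedback across units into one holistic view.
# The data shape is an assumption -- it imagines an export of
# (student, unit, comment) records, not a real Turnitin/Moodle API response.
from collections import defaultdict

def feedback_by_student(records):
    """Group comment records into {student: {unit: [comments]}}."""
    view = defaultdict(lambda: defaultdict(list))
    for rec in records:
        view[rec["student"]][rec["unit"]].append(rec["comment"])
    return view

records = [
    {"student": "s123", "unit": "EDU101", "comment": "Strong argument"},
    {"student": "s123", "unit": "EDU202", "comment": "Cite more sources"},
]
view = feedback_by_student(records)
print(dict(view["s123"]))
```

A personal tutor could then scan one student’s feedback across all units in a single place, rather than opening each assignment in its silo.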

To round out where I started with this post, providing similarity checking is an important part of modern assessment – but it is a problem that has already been solved, multiple times.

If we make assessment more authentic, more flexible and more collaborative there will be less need for plagiarism detection because students will be demonstrating more of the attributes we want them to leave University with. I accept this is perhaps an overly idealistic viewpoint, as there are a lot of students to assess each year, but this is more reason to explore flexible assessment solutions that can make the lives of academics and students a bit easier.

Online assessment in the time of Covid

In pre-Covid times, exams delivered via Moodle were limited by the availability of suitable physical spaces. Exam rooms represented a bottleneck to the number of students taking exams concurrently.

For the last year, we’ve used Moodle (and integrated platforms) to deliver the majority of our teaching and assessment online.

A visualisation of the online assessment mix at the University of Portsmouth:

Diagram of how the Assignments and the Exams overlap during assessment period

In May 2020 many academics who had previously planned to deliver paper-based exams had to quickly adapt and deliver online assessments. In some cases, these required students to scan or take pictures of their work and upload these to assignments (Moodle or Turnitin) for marking. 

In recent months, newer platforms to handle this workflow and ease the marking burden for academics have been developed – platforms such as Turnitin Gradescope and CrowdMark. These platforms leverage the similarities in students’ answers so academics can mark many answers at once. When time allows, we hope to be able to evaluate these platforms in more detail.

In the diagram above you can see “Assignments under exam conditions” as the meeting point between traditional essays and restricted online exams. This year we have seen a big growth in this area as academics move from paper-based written exams to time-restricted assignments. An obvious caveat here is that these haven’t been conducted under true exam conditions and so are best described as open book exams. Many digital assessment platforms now include various types of proctoring and would be able to handle remote time-restricted essays (and other assessment types) securely. There are, however, a number of ethical issues to be considered with online proctoring, and we need to proceed cautiously here. 

As a University, I feel we should also be looking to expand our capacity for online assessment: over the next decade we will probably see the end of paper-based exams in favour of typed essay papers delivered online, due in part to student expectations.

Academics have had a year to adapt to exams in lockdown, and many have discovered the benefits of Moodle quizzes, which offer automatic marking. (Note that Moodle is excellent at delivering scientific and mathematical exam questions as well as longer coursework assignment submissions.) Generally speaking, the Technology and Science and Health faculties deliver the majority of our Moodle quiz-based exams, and the number of exams has grown significantly during lockdown. Many academics don’t want to go back to paper.

In Technology Enhanced Learning we oversee online exams and assessments in terms of supporting and evaluating the digital tools and making sure Moodle can handle the number of exams thrown at it. The number of online exams has increased substantially over the last year, all funnelled into two exam windows. As a team we work closely with colleagues in IS to provide more capacity in Moodle and with timetabling to ensure the exams are evenly distributed to avoid terminal peaks of concurrent users, providing a stable Moodle platform for all users.

Without the bottleneck of physical exam rooms, the January 2021 exams were initially weighted in favour of academic requests: exams earlier in the day, and only the first week of the exam window used, to maximise available marking time. Unfortunately, this translated into a scenario that would have presented a significant number of terminal peaks of concurrent users on Moodle. Members of TEL worked closely with the central timetabling unit to level out these peaks and, with the exception of one or two slow points, we delivered a successful exam window in January.

In advance of the May/June exams, we have gone further and set hard parameters around how many exams (quizzes) or timed assignments (Turnitin or Moodle assignments) can be timetabled in any given time slot. We’d like to thank CTU for their tireless effort to make this happen. It wasn’t an easy job to manage all the necessary requirements, but it has given us an exam timetable that looks like the image below. This work really is invaluable to the University, given how much effort assessment represents for students, academics and support staff.

A screenshot of one week of the exam timetable: days, dates and sessions, split into assignments and exams, with the total number of students expected to be in Moodle during each period
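The capacity check behind those hard parameters can be sketched as follows. The slot names, student numbers and the concurrency cap are all illustrative assumptions, not the University’s actual timetable data or Moodle limits.

```python
# Sketch of a timetable capacity check: flag slots where the expected
# number of concurrent Moodle users exceeds a cap. The cap and the
# bookings below are hypothetical figures for illustration only.
from collections import defaultdict

MAX_CONCURRENT_USERS = 2000  # assumed platform cap per slot

def check_slots(bookings):
    """bookings: list of (slot, expected_students). Return slots over the cap."""
    totals = defaultdict(int)
    for slot, students in bookings:
        totals[slot] += students
    return {slot: n for slot, n in totals.items() if n > MAX_CONCURRENT_USERS}

bookings = [
    ("Mon 09:30", 800), ("Mon 09:30", 900), ("Mon 09:30", 600),
    ("Mon 14:00", 700), ("Tue 09:30", 1200),
]
print(check_slots(bookings))  # {'Mon 09:30': 2300}
```

In practice this is what levelling out the peaks means: moving bookings out of any flagged slot until every slot’s total sits under the cap.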

Our increasing reliance on online assessment means, I think, that we should investigate new technologies to support that function. Platforms such as Gradescope or CrowdMark could help relieve the marking burden; one of many platforms, such as Wiseflow, Mettl or Inspera, could provide extra exam capacity (with the functionality to proctor exams, if that were something the University wanted to do). Moodle, with its advanced quiz and assignment capabilities, would continue to play a key role.

I believe we will get through this coming assessment period well, but as our reliance on online assessment grows so must our technologies to support it. 

As a University the Covid-19 pandemic has been a driver for the uptake of online learning and assessment. As a University community, we need to harness this positive momentum and diversify our offering of assessment platforms to support students and staff.

Image credit: Photo by MayoFi on Unsplash

Guest Blogger: Mary Watkins – Using Turnitin to increase consistency in assessment marking and feedback

The School of Education and Childhood Studies (SECS) has used electronic submission since 2015, with all students asked to submit their artefacts through Turnitin, Moodle Assignments or Moodle Workshops. SECS reviewed its electronic submission process at the end of 2016/17 and looked for ways to improve the process for students as well as markers.

Student feedback suggested that changes could be made to improve the consistency of feedback across units as well as the transparency of grades and the ways in which grades could be improved in the future.

Using Rubrics

Joy Chalke (Senior Lecturer) and Chris Neanon (Associate Head Academic) worked with Mary Watkins (Senior Online Course Developer) to create an electronic version of the School’s new grade criteria/mark sheet for each level of study, from level four undergraduates through to level seven postgraduates. The electronic version was created using a Turnitin rubric which reflected the four key areas against which students’ submissions are marked:

  1. research,
  2. content,
  3. argument and analysis,
  4. organisation, style and presentation.

The new Turnitin rubrics help to ensure that all markers evaluate student work against the same set of criteria, in this case based on these four key areas, and therefore respond to the students’ request for consistent marking.

Having attached the rubric to all Turnitin dropboxes, markers were able to indicate to students areas of strength as well as areas for improvement in the artefact submitted, and link these to the sections on the rubric. Upon receiving this feedback, students have been able to use the rubric to see how improvements to their research, content, argument, and organisation will increase their grade in future submissions.

Using QuickMarks

Joy and Mary also worked together to create sets of ‘QuickMarks’, a tool in Turnitin that allows markers to drag and drop comments onto students’ submissions, reducing the need for online markers to type the same comments repeatedly.

Joy drafted the comments for each level of study, which Mary converted into corresponding sets of QuickMarks. Markers were then able to drag and drop relevant comments from the set onto students’ submissions, amending or adding to the comment(s) as appropriate.

A pilot was conducted in two units with large numbers of markers on the UG programmes, and trialled in a unit on a PG programme. SECS will be monitoring this process throughout 2017/18 and using feedback from students as well as staff to improve the way marking is conducted and feedback is given to our students.

Turnitin – What’s in a number?

The University of Portsmouth uses the Turnitin service to provide facilities for plagiarism detection, online marking and as a development tool for academic writing, although most users are interested in one thing: a number.

Contained within the Originality Report is a Similarity Score out of 100, which many users wrongly believe to be a plagiarism score with a magic number at which it can be conclusively determined whether plagiarism has or has not occurred. The problem is that this figure can be manipulated, there may be mitigating circumstances and, let us not forget, the system is not perfect either: there will be some margin for error.

Crudely speaking, the Similarity Score is the percentage of words in your document that matched text from other documents Turnitin searched against. A shorter assignment with a direct question, and consequently a more concise correct answer, may well see a higher score than a longer assignment with more scope to include diverse material.
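That crude calculation can be sketched as follows. This is a simplification for illustration, not Turnitin’s actual algorithm, which weighs overlapping matches in more sophisticated ways.

```python
# Crude approximation of a similarity score: the percentage of a
# document's words that matched text in other sources. This is a
# simplification, not Turnitin's real matching algorithm.
def similarity_score(total_words: int, matched_words: int) -> int:
    if total_words == 0:
        return 0
    return round(100 * matched_words / total_words)

# The same 150 matched words weigh far more in a concise answer
# than in a long essay:
print(similarity_score(500, 150))   # 30
print(similarity_score(3000, 150))  # 5
```

This is why comparing raw scores across assignments of very different lengths can be misleading.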

The number of students in your class, and whether the assignment has been set in previous years (or at different institutions), may limit the scope for truly original material. That’s not to say a very high score is necessarily acceptable, but it does mean the content may not be unique for genuine reasons. An assignment based upon group work is also a recipe for a higher-than-usual Similarity Score, since students are likely to be working from the same research, data and figures and will in all likelihood draw the same conclusions.

What does Turnitin check an assignment against? There are stored student papers in both a global central repository and the University of Portsmouth’s own repository (where we might store more sensitive documents). Turnitin also searches against material found on the internet and can check journals, periodicals and publications. Personally, I would check against everything: if a source is available, use it.

Turnitin offers several filters which may be toggled, for example whether to include or exclude bibliographic references. Personally, I cannot think of a reason why you would want to include bibliographic references in the Similarity Score, as citing sources is a requirement of good academic writing. That said, if the assignment were a lab report and references were not expected, it might be safer to include bibliographic references, just in case the Turnitin software incorrectly identified a bibliography and consequently excluded all of the text that followed.

You can also toggle quoted material. Quotes would not normally be considered within a plagiarism report, although the volume of them may indicate a lack of original content from the author. Where quoted material is excluded from the Originality Report, Turnitin helpfully points out when more than 15% of the paper is quoted material.

The final filter is for small matches. Matches of 3-4 words are usually rather inconsequential, and you may also have longer phrases that appear repeatedly throughout the assignment; the ‘exclude small matches’ filter stops these from being repeatedly matched and skewing the Similarity Score. Personally, I use all the filters, excluding bibliographic references, quoted material and small matches; I can always turn them back on later when reviewing a paper if I am suspicious.

So after searching against all of the available material, excluding bibliographic references, quoted material and small matches, what is the magic number? Well, the magic number is… the number at which you become suspicious of course!

Finally, to wrap up this post, and just in case a concerned student has stumbled across this blog post, I would like to emphasise that if they know they have not deliberately plagiarised then they have nothing to worry about. If they are concerned that they have used another source and may not have referenced it properly, guidance is available from the Academic Skills Unit (https://kb.myport.ac.uk/Article/Index/12/4?id=2747).

 

Email: academicskills@port.ac.uk

Telephone: +44 (0)23 9284 3462

Or, visit the Academic Skills Unit in person during our opening hours:

Third floor Reception, The Nuffield Centre

St Michael’s Road

Portsmouth

PO1 2ED

Turnitin – Multiple Markers

*Currently we have had to disable this feature for some standard functionality to work, we will look at reactivating it as soon as a more stable version is available.*

Turnitin, as we all know, allows students to submit their work electronically and get a ‘similarity report’ – a comparison of the submitted work against a vast database of existing papers and websites. Academics have access to the similarity reports, which can be a great help in cases where they suspect a student might have committed plagiarism. Turnitin, through features such as comment banks and drag-and-drop comments, also works well for marking work electronically.

While we have been using Turnitin at Portsmouth for many years, the interface has changed somewhat; it’s now called Feedback Studio.

Feedback Studio has a much cleaner interface than the classic version of Turnitin, and it now works on mobile devices without needing to install the Turnitin app (which is only available on iPad).

The newest feature to become available is Multiple Markers, which is currently in beta. Multiple Markers helps with second marking: a marker’s initials are placed next to any comment or QuickMark that has been placed into the document. As you can see from the image, there are three comments here: two from the first marker (with initials PQ; you can see the bubble comment and QuickMark added to the text) and one from the second marker (with initials TL; the initials are placed next to a bubble comment). Any plain text comments or strikethroughs are not initialled.

Multiple Markers is a great feature for academics who need the ability to share marking or do second marking, while students can quickly and easily see where different markers have annotated their work.

© 2024 Tel Tales
