Along with colleagues from about 30 different UK universities I attended the Remaking Marking: Electronic Management of Assessment conference held on 4 September 2018 at the University of Reading. I came back feeling confident that, here at Portsmouth, we are developing the electronic management of assessment (EMA) in a reasonable way. Our use of Moodle gives us an element of flexibility that some institutions, using other VLEs, are lacking. Furthermore, the drivers for implementing EMA and people’s hopes for this approach appear to be the same here as everywhere else in the sector. On the other hand, academics have some legitimate concerns – DSE worries; offline marking; having to scroll through documents; the functionality and usability of marking tools; the need to take account of disciplinary differences – and these are shared across the sector, Portsmouth included.
One worrying aspect of the conference was the number of institutions that had tried to implement a marks integration project – and failed. It seems strange that data held in one electronic system (the VLE) cannot readily be transferred to another electronic system (the Student Record System (SRS)), but this seems to be the case. It is particularly strange given that the data we want to transfer – student marks – is so important; surely we shouldn’t have to rely on an intermediate stage in which humans can introduce error? We thought we’d cracked the problem here, several years ago, but the proposed solution was not guaranteed to be sustainable at the SRS end. Perhaps the new SRS will open up new possibilities for us.
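To make the problem concrete, here is a minimal sketch of the kind of validation step a marks-transfer script could perform, assuming the VLE can export marks as CSV and the SRS holds a list of known student IDs. Everything here – the column names, the ID format, the 0–100 mark range – is a hypothetical illustration, not any institution's actual integration.

```python
import csv
import io

# Hypothetical CSV export from the VLE gradebook: one row per
# student, with columns "student_id" and "mark". In practice this
# would be read from a file, not an in-memory string.
vle_export = io.StringIO(
    "student_id,mark\n"
    "UP123456,68\n"
    "UP654321,105\n"
    "UP000000,55\n"
)

# Student IDs known to the Student Record System (hypothetical).
srs_student_ids = {"UP123456", "UP654321"}

valid_rows, errors = [], []
for row in csv.DictReader(vle_export):
    mark = int(row["mark"])
    if row["student_id"] not in srs_student_ids:
        errors.append(f"Unknown student ID: {row['student_id']}")
    elif not 0 <= mark <= 100:
        errors.append(f"Mark out of range for {row['student_id']}: {mark}")
    else:
        valid_rows.append((row["student_id"], mark))

print(valid_rows)  # rows safe to transfer to the SRS
print(errors)      # rows flagged for human attention
```

Even a simple automated check like this catches exactly the errors – mistyped IDs, impossible marks – that a manual re-keying stage can introduce unnoticed.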
Some interesting discussions centred around:
- the use of rubrics, and whether (and how) they should be used more widely;
- the use of shared QuickMarks – should a central department or section provide a library of generic QuickMarks, containing links to high-quality support resources, for use across a faculty or institution?
- the increased use of audio feedback – research suggests that students appreciate audio feedback, but only in certain cases.
One suggestion I found particularly interesting came from Dr Rachel Maxwell of the University of Northampton, whose team thought it important to manage student expectations regarding EMA. In particular, they found that students didn’t have a clear idea of what the assessment process entails at university level.
A student-generated illustration of the assessment process in place at the University of Northampton. (Credit: Katie May Parsons, all rights reserved)
Header image taken from the Remaking Marking: Electronic Management of Assessment conference (2018) poster, Birds Migrating in Formation (accessed 12 November 2018). Thank you to the University of Reading for letting us use their poster.
The School of Education and Childhood Studies (SECS) has used electronic submission since 2015, with all students asked to submit their artefacts through Turnitin, Moodle Assignments or Moodle Workshops. We reviewed our electronic submission process at the end of 2016/17 and looked for ways to improve it for students as well as markers.
Student feedback suggested that changes could be made to improve the consistency of feedback across units as well as the transparency of grades and the ways in which grades could be improved in the future.
Joy Chalke (Senior Lecturer) and Chris Neanon (Associate Head Academic) worked with Mary Watkins (Senior Online Course Developer) to create an electronic version of the School’s new grade criteria/mark sheet for each level of study, from level four undergraduates through to level seven postgraduates. The electronic version was created using a Turnitin rubric reflecting the four key areas against which students’ submissions are marked:
- research,
- content,
- argument and analysis,
- organisation, style and presentation.
The new Turnitin rubrics help to ensure that all markers evaluate student work against the same set of criteria, in this case based on these four key areas, and therefore respond to the students’ request for consistent marking.
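As a rough illustration of how a criterion-based rubric turns marker judgements into a grade, here is a minimal sketch in Python. The criterion names follow the four key areas described above, but the band labels, the indicative scores and the equal weighting are all assumptions for the sake of the example, not the School's actual scheme.

```python
# Band labels mapped to indicative scores (hypothetical values).
BANDS = {"excellent": 85, "good": 65, "satisfactory": 50, "weak": 35}

# The four key areas from the School's grade criteria; the equal
# weighting implied by a plain average is an assumption.
CRITERIA = [
    "research",
    "content",
    "argument and analysis",
    "organisation, style and presentation",
]

def overall_mark(selections):
    """Average the band scores a marker selected, one per criterion."""
    missing = [c for c in CRITERIA if c not in selections]
    if missing:
        raise ValueError(f"No band selected for: {missing}")
    return sum(BANDS[selections[c]] for c in CRITERIA) / len(CRITERIA)

marker_choices = {
    "research": "good",
    "content": "excellent",
    "argument and analysis": "good",
    "organisation, style and presentation": "satisfactory",
}
print(overall_mark(marker_choices))  # 66.25
```

Because every marker selects from the same bands against the same criteria, two markers who agree on the quality of each area will arrive at the same mark – which is precisely the consistency students asked for.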
Having attached the rubric to all Turnitin dropboxes, markers were able to indicate to students areas of strength as well as areas for improvement in the artefact submitted, and link these to the sections on the rubric. Upon receiving this feedback, students have been able to use the rubric to see how improvements to their research, content, argument, and organisation will increase their grade in future submissions.
Joy and Mary also worked together to create sets of ‘QuickMarks’, a tool in Turnitin that allows markers to drag and drop comments on to students’ submissions and therefore reduces the need for online markers to type the same comments repeatedly.
Joy drafted a set of comments for each level of study, which Mary converted into corresponding QuickMark sets in Turnitin. Markers were then able to drag and drop relevant comments from the set onto students’ submissions, amending or adding to the comment(s) as appropriate.
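The structure of such a shared library can be sketched very simply: reusable comments grouped by level of study, each linked to a support resource, with room for the marker's own amendment. All labels, comment text and URLs below are illustrative placeholders, not the School's actual QuickMarks.

```python
# A sketch of a per-level QuickMark library. Each entry pairs a
# short reusable comment with a link to a support resource.
QUICKMARKS = {
    "level_4": {
        "referencing": {
            "comment": "Check your citations against the referencing guide.",
            "resource": "https://library.example.ac.uk/referencing",
        },
        "structure": {
            "comment": "Use signposting sentences to link your paragraphs.",
            "resource": "https://study.example.ac.uk/academic-writing",
        },
    },
}

def apply_quickmark(level, label, amendment=""):
    """Fetch a reusable comment and optionally amend it, as a marker
    would after dragging it onto a submission."""
    entry = QUICKMARKS[level][label]
    comment = entry["comment"]
    if amendment:
        comment += f" {amendment}"
    return f"{comment} (See: {entry['resource']})"

print(apply_quickmark(
    "level_4", "referencing",
    "Page numbers are needed for direct quotes.",
))
```

Keeping the comments in one shared set means every marker on a unit gives the same baseline advice and points students to the same support resources, while the amendment step preserves room for individual feedback.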
A pilot was conducted in two units with large numbers of markers on the UG programmes, and trialled in a unit on a PG programme. SECS will be monitoring this process throughout 2017/18, using feedback from students as well as staff to improve the way marking is conducted and feedback is given to our students.