Adventures in Technology Enhanced Learning @ UoP

Tag: marking

Remaking Marking Conference

Along with colleagues from about 30 different UK universities I attended the Remaking Marking: Electronic Management of Assessment conference held on 4 September 2018 at the University of Reading. I came back feeling confident that, here at Portsmouth, we are developing the electronic management of assessment (EMA) in a reasonable way. Our use of Moodle gives us an element of flexibility that some institutions, using other VLEs, are lacking. Furthermore, the drivers for implementing EMA and people’s hopes for this approach appear to be the same here as everywhere else in the sector. On the other hand, academics have some legitimate concerns – DSE worries; offline marking; having to scroll through documents; the functionality and usability of marking tools; the need to take account of disciplinary differences – and these are shared across the sector, Portsmouth included.

One worrying aspect of the conference was the number of institutions that had tried to implement a marks integration project – and failed. It seems strange that data held in one electronic system (the VLE) cannot readily be transferred to another electronic system (the Student Record System (SRS)), but this seems to be the case. It is particularly strange given that the data we want to transfer – student marks – is so important; surely we shouldn’t have to rely on an intermediate stage in which humans can introduce error? We thought we’d cracked the problem here, several years ago, but the proposed solution was not guaranteed to be sustainable at the SRS end. Perhaps the new SRS will open up new possibilities for us.

Some interesting discussions centred around:

  • the use of rubrics, and whether (and how) they should be used more widely;
  • the use of shared QuickMarks – should a central department or section provide a library of generic QuickMarks, containing links to high-quality support resources, for use across a faculty or institution?
  • the increased use of audio feedback – research suggests that students appreciate audio feedback, but only in certain cases.

One suggestion I found particularly interesting came from Dr Rachel Maxwell, University of Northampton. Her team found it important to manage student expectations regarding EMA; in particular, they found that students didn't have a clear idea of what the assessment process entails at university level.

A student-generated illustration of the assessment process in place at the University of Northampton. (Credit: Katie May Parsons, all rights reserved)  

Header image taken from the Remaking Marking: Electronic Management of Assessment Conference (2018). Birds Migrating in Formation 
(Accessed: 12th November 2018). Thank you to The University of Reading for letting us use their poster.


Keeping everyone happy – tricky but not impossible

Anonymous or blind marking is an important part of the assessment and feedback process. For a student it ensures work is marked fairly without bias. However, there is an equally valuable requirement for academics and support staff to be able to identify students who have yet to submit their assignment and may be in need of additional support.

In the paper-based past, this was a relatively easy task. Students submitted assignments with cover sheets which could be easily removed by administrators. Assignments were tracked and handed to academics for blind marking.

Online assessment technologies such as Turnitin and the Moodle assignment match up quite closely to the workflow of paper-based assessment, but with a few extra tools to help academics. There is no longer a need for students to identify themselves within their assignments, as we know who they are when they log into Moodle. In fact, by the letter of the law, a student can be penalised for adding their name to an assignment. In reality, though, some departments still require students to provide a cover sheet in their assignment, which invalidates the blind marking setting in their Moodle or Turnitin assignment. My guess is that the motivation for identifying students is to help them and ensure they don't miss their deadlines. I'd be genuinely interested to hear the reasons for the need for cover sheets in the comments below.

What if there were a way for all the assessment stakeholders to get what they need while still preserving anonymity? Luckily, there is now a way to do this in Moodle.

On each UoP Moodle unit you will find a new report under Course Administration > Reports > Course Submissions.

When an assignment is live, course administrators and Online Course Developers can see a submission status, Turnitin paper id (or Moodle participant number), provisional grade and identifying information for each student in a cohort or group. This is all the information they need to keep an eye on the process and transfer grading information to student records later on. With a bit of extra magic, lecturers get to see a subset of this information, including the identifying student information and a submission status, even when an assignment is anonymised. For academics there is no link between a submission status and a specific submission; that link is only released after the post date. Coupled with a release threshold, which prevents anyone guessing who's who, the report attempts to keep everyone happy.
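To make the idea of a release threshold concrete, here is a minimal sketch of the logic in Python. This is purely illustrative: the function name, data shapes and threshold value are assumptions, not Moodle's actual implementation. The point is simply that if a group is too small, knowing who has and hasn't submitted would let a marker deduce whose anonymous script is whose, so statuses are withheld.

```python
# Hypothetical sketch of a release threshold for anonymised marking.
# The threshold value and data structures are illustrative assumptions,
# not taken from Moodle's source code.

RELEASE_THRESHOLD = 5  # assumed minimum group size before statuses show


def statuses_for_academic(cohort, threshold=RELEASE_THRESHOLD):
    """Return {student name: submission status} visible to an academic,
    or None when the group is too small to protect anonymity."""
    if len(cohort) < threshold:
        return None  # withhold: statuses would deanonymise submissions
    return {s["name"]: s["status"] for s in cohort}


# A two-student group falls below the threshold, so nothing is released.
cohort = [
    {"name": "Student A", "status": "submitted"},
    {"name": "Student B", "status": "not submitted"},
]
print(statuses_for_academic(cohort))
```

In a larger cohort the same call would return the full status mapping, since no individual submission can be singled out from the group.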

Here’s an idea of what the report looks like in practice.


In the near future we plan to allow staff to download the data from the course submissions report to a spreadsheet making it easier to transfer to student records.

I’d be interested to hear if this makes online assessment a little easier. Feel free to share your thoughts in the comments box at the bottom of this page. If you find the report useful you may find the new assessment course format helps you out too. A short introduction video is available here:

“Anonymous” image courtesy of Luciano Castello CC:

© 2024 Tel Tales
