Adventures in Technology Enhanced Learning @ UoP


Great feedback is essential

Wouldn’t it be great if students could read the feedback they’ve received for their assignment, write a short reflection on what they could do to improve (perhaps also identifying what they’d like to receive feedback on next time round) and then see their grade? 

Our current online assessment tools (Turnitin and Moodle Assignment) don’t allow us to do this. Luckily, we know of an assessment tool that does – and it has many other modern assessment and feedback mechanisms too.

I’m passionate about helping improve assessment feedback for students. It’s one of the things I’ll be working on in my new secondment as a Senior Lecturer in Digital Learning & Innovation. On Mondays, Tuesdays and Wednesdays I’ll be working between the TEL and AcDev teams to help coordinate projects to better support academics, Online Course Developers and students with a focus on digital education. In particular, I’ll be working to help get a small pilot off the ground for EdWord – a fantastic new assessment tool that promises to address many of the requirements of modern assessment and feedback. If you’re interested in taking part in this pilot please let me know.

In addition, I’ll also be helping to establish an online staff community alongside the APEX programme featuring special interest groups. This will be a great place to make contact with like-minded staff from other faculties and exchange ideas.

Tom Langston and I will be creating a support mechanism for Online Course Developers who are interested in completing their CMALT portfolio and who might be interested in taking part in future elearning projects with TEL.

I’ll also be doing a bit of lecturing on the Research Informed Teaching programme, which I’m looking forward to. So this will be a busy year for me!

Please get in touch if you’ve got any ideas or projects we can help you with. Both the TEL and AcDev teams would appreciate your feedback as we work to ensure we’re offering the services that will provide value to you and your students (you can reach me on ext. 3194).

Image credit: https://commons.m.wikimedia.org/wiki/File:Paper_Plane_Vector.svg

 

Assessment online – Are we past the “hand-in” date?

Introduction

In eLearn, we have just reached the end of the exam period with our faculties intact (excuse the pun) and with very little drama (which is not normally the case). The sight of nervous students queuing up outside of Spinnaker for an exam inside a gym hall brought all those memories of dread I had experienced nearly 20 years ago flooding back.

When I think about how much has changed in the teaching landscape in terms of the integration of technology into teaching, as well as the diverse ways in which people attend university, I can’t help but feel this method of summative assessment is rather antiquated.

This could very easily turn into a blog about the nature of summative assessment, which I wrote far too many assignments about in 2004 as part of my teaching degree. I don’t want this to become a virtual trip down memory lane, though, but rather a means to highlight what is different now and what future possibilities exist.

The wonder of Turnitin

With my teacher hat firmly still on my head, I couldn’t be more positive about this technology when it comes to marking, having lived through the late nights devoted to marking never-ending piles of papers. True, it has its faults, and the late nights may merely have been transferred from pen and paper to in front of a screen, but it has so many facets designed to make the experience easier for both marker and student. You can’t help but feel its implementation has been a large step forward in the progression of assessment. Being able to customise and apply QuickMarks across assignments prevents the numerous occasions on which “RTQ” would have to be written. Copying and pasting comments, or highlighting text to link directly to aspects of a rubric, are seemingly small things that would otherwise take hours when going through the work of 90 students – and that is before you give the personalised feedback that moves learning on.

The student gets a rich visual experience that can be accessed on any device, and feedback is so easily obtainable and downloadable that it can only promote reflective practice. The hand-in process has changed dramatically too: the long line outside the faculty admin office, bound assignment in hand, is a thing of the past, and work can now be submitted from bed with a cuppa. Don’t get me wrong, you will still get students who leave it until the last minute, and those who have perhaps been a little too influenced by other sources within their writing, but nevertheless a snapshot of this process in 2019 vastly differs from 2009 and is a world away from 1999. The same can’t be said for the end-of-year exam.

Quizzes – More than just for daytime TV

Perhaps it is slightly unfair to portray university examinations as solely desk-based, given the increase in exams being carried out online using Moodle Quiz. The Quiz tool is far more powerful and robust than people perhaps realise. Yes, you can use it to create multiple-choice “pop quizzes” for the end of a topic or to elicit prior conceptions at the start of something new, but it can also be used to build 100-question, essay-based behemoths that include a variety of question types. Safe Exam Browser allows a quiz to be taken under true exam restrictions, and the ease with which times and restrictions can be customised makes online exams far more accessible than their paper-based counterparts. Claro Reader software can be used to overlay colours and intuitively applies text-to-speech (dependent on how the exam has been written, of course!). The possibility of including images or video within an exam assessment not only opens up a wealth of ways to question but leads me on to my next point.

The Audiovisual Essay

I was very fortunate to witness a presentation from the inspiring Dr Catherine Grant, who spoke about the concept of The Audiovisual Essay in Film & Moving Image Studies. I would certainly recommend visiting the website, which explores the concept in great detail and includes some amazing examples and relevant research that has been undertaken on the subject. For those who are unfamiliar with this form, it is essentially the expression of critical, analytical and theoretical work using the resources of audiovisuality (images, sound and video in montage). I begrudge trying to pigeonhole the genre further, but it truly flies in the face of sitting in a hall for three hours writing an English Literature exam. While it lends itself to creative, historical, visually rich courses and cannot be applied across the board, the premise of it being a “different” way to demonstrate understanding is valid.

Final Thoughts

This brings us back to assessment types and again perhaps explains the shift towards a greater emphasis on coursework-based assessment models. That, in my eyes, is a different debate; this blog is exploring whether sitting in a hall to carry out an end-of-year assessment still has a place in modern university life. You have to question how many opportunities, over their time in Higher Education, students get to sit at a desk for a considerable time and demonstrate their understanding in that way. Are we providing students with a rather unnatural medium by which to demonstrate their understanding? Does that in turn affect their ability to reach their true potential? Particularly as the end-of-year summative assessment is the culmination of the blood, sweat and tears of their learning journey, do we not owe it to the learner to reassess the way we make this final assessment? The flip side of this is to give students more exam practice and opportunities, but is this a direction we want to go in? To me that seems to be a practice that would be looking in the rear-view mirror, when I would argue we should have our eyes on the road ahead.

 

Featured Image:

Image by PublicDomainPictures from Pixabay

Online Exams in Moodle

We’ll start to see a lot of online Moodle exams from Monday 29th April.

At this time of year a lot of effort goes into preparing for the formal exam weeks to begin. Academics and Online Course Developers are creating questions and testing exam quizzes. The TEL team are testing the Moodle infrastructure and exam reporting, and working with IS to ensure we have enough server resources at key times during the exam period. This year the formal examination weeks will run from Monday 13th May 2019 through to Friday 7th June 2019. It’s worth making a note of these dates in your diary but, as I’ll highlight in this blog post, you’ll probably want to treat the period as starting from Monday 29th April.

What constitutes an online Moodle exam?

We encourage staff to tag Moodle-based exams with the ‘exam-official’ tag and to set appropriate dates. This information, along with data provided by Online Course Developers, helps us build a picture of when exams are happening. It means we can spot ‘pinch points’: times when we expect a lot of concurrent exam attempts, which could mean more server resources need to be made available. In short, if you don’t tag your assessments, we’ll find it difficult to guarantee a seamless exam experience for your students.
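To give a sense of what that tag-driven picture enables, here is a minimal sketch of how exam windows might be scanned for pinch points. It is written in Python purely for illustration; the unit codes, cohort sizes and the concurrency threshold are invented, and this is not our actual monitoring code.

```python
from datetime import datetime, timedelta

# Hypothetical data: quizzes tagged 'exam-official', with their open/close
# windows and expected cohort sizes (all values invented for illustration).
exams = [
    {"unit": "U12345", "opens": datetime(2019, 5, 13, 9, 30),
     "closes": datetime(2019, 5, 13, 11, 30), "students": 180},
    {"unit": "U23456", "opens": datetime(2019, 5, 13, 10, 0),
     "closes": datetime(2019, 5, 13, 12, 0), "students": 240},
    {"unit": "U34567", "opens": datetime(2019, 5, 14, 14, 0),
     "closes": datetime(2019, 5, 14, 16, 0), "students": 95},
]

SLOT = timedelta(minutes=30)
THRESHOLD = 300  # assumed concurrency level that would prompt extra server resources


def pinch_points(exams, slot=SLOT, threshold=THRESHOLD):
    """Count expected concurrent attempts per time slot and flag the busy ones."""
    counts = {}
    for exam in exams:
        t = exam["opens"]
        while t < exam["closes"]:
            counts[t] = counts.get(t, 0) + exam["students"]
            t += slot
    return {t: n for t, n in sorted(counts.items()) if n >= threshold}


for slot_start, attempts in pinch_points(exams).items():
    print(f"{slot_start:%a %d %b %H:%M}: ~{attempts} concurrent attempts")
```

Without the ‘exam-official’ tag (and sensible open/close dates) an exam simply never appears in a picture like this, which is why untagged assessments are so hard to support.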

Tagging also means you can run your online exams in one of our supported secure exam browser environments (Safe Exam Browser, for which FAQs are available here, or Chromebooks in secure exam mode). This means students can’t easily access other websites during their exam, which makes invigilation a lot easier.

How many online exam attempts were there last year?

During the official exam period last year (13.05.2018–07.06.2018) there were 5,426 exam attempts that we all helped support. However, this figure isn’t the full picture. In the two weeks preceding the exam period, a further 2,737 official online exam attempts took place. That’s quite a staggering number of exam attempts happening outside of the time-frame that staff are focused on supporting, and something we should all be mindful of this year. Monday 29th April is the start date for your diaries.

Exams change freeze

IS and TEL enforce a number of Moodle change freezes throughout the academic year. These are periods when no updates can be made to Moodle. We have one in place during the exam period to ensure changes don’t inadvertently interrupt exams. The exam change freeze this year will run from Monday 29th April through to Friday 7th June (inclusive).

How do we monitor whether exams are going well?

The TEL team gather data from a number of sources to help monitor what’s happening through the exam period. We combine it into a real-time dashboard of where exams are happening and how Moodle is performing. You can see a picture below from last year.

Exam Analytics Dashboard

We rely on support from a number of sources to produce this dashboard. Lead Online Course Developers have their finger on the pulse of when exams will happen and of any last-minute amendments. In the very near future we’ll be asking again for your help to populate a spreadsheet with the exams you know about. We combine this information with data from student records, which is thorough but lacks local knowledge. We also use Google Analytics, the Moodle database and server infrastructure reporting to keep an eye on how our systems are performing. This combined data is extremely useful for spotting pinch points and monitoring how things are running, but it’s not as effective as people in faculties, such as exam invigilators and Online Course Developers, making sure all is running well and reporting any issues they encounter to elearn@port.ac.uk.

A new exam theme for Moodle

It’s worth mentioning at this point that we’re close to final testing of an updated version of the Moodle exam theme. This is a stripped down version of our theme intended for use in exams. We’ll provide more information on this in the next few weeks. It will look very similar to the existing exam theme but will be a bit closer to our regular theme in terms of question layout and styles.

Thank You

We just wanted to say a big thank you to everyone involved in making sure the exam period goes well. It’s very much a big team effort, and all the work we put in ensures students get as stress-free an experience as possible. I’m sure they appreciate it; I know the team in TEL certainly do.

If you have any questions about online exams, please get in touch at elearn@port.ac.uk

 

Thank You Image

Image credits: Photo by Lip on Unsplash

Remaking Marking Conference

Along with colleagues from about 30 different UK universities I attended the Remaking Marking: Electronic Management of Assessment conference held on 4 September 2018 at the University of Reading. I came back feeling confident that, here at Portsmouth, we are developing the electronic management of assessment (EMA) in a reasonable way. Our use of Moodle gives us an element of flexibility that some institutions, using other VLEs, are lacking. Furthermore, the drivers for implementing EMA and people’s hopes for this approach appear to be the same here as everywhere else in the sector. On the other hand, academics have some legitimate concerns – DSE worries; offline marking; having to scroll through documents; the functionality and usability of marking tools; the need to take account of disciplinary differences – and these are shared across the sector, Portsmouth included.

One worrying aspect of the conference was the number of institutions that had tried to implement a marks integration project – and failed. It seems strange that data held in one electronic system (the VLE) cannot readily be transferred to another electronic system (the Student Record System (SRS)), but this seems to be the case. It is particularly strange given that the data we want to transfer – student marks – is so important; surely we shouldn’t have to rely on an intermediate stage in which humans can introduce error? We thought we’d cracked the problem here, several years ago, but the proposed solution was not guaranteed to be sustainable at the SRS end. Perhaps the new SRS will open up new possibilities for us.

Some interesting discussions centred around:

  • the use of rubrics, and whether (and how) they should be used more widely;
  • the use of shared QuickMarks – should a central department or section provide a library of generic QuickMarks, containing links to high-quality support resources, for use across a faculty or institution?
  • the increased use of audio feedback – research suggests that students appreciate audio feedback, but only in certain cases.

One suggestion I found particularly interesting came from Dr Rachel Maxwell of the University of Northampton. At Northampton they thought it important to manage student expectations regarding EMA. In particular, they found that students didn’t have a clear idea of what the assessment process entails at university level.

A student-generated illustration of the assessment process in place at the University of Northampton. (Credit: Katie May Parsons, all rights reserved)  

Header image taken from the Remaking Marking: Electronic Management of Assessment Conference (2018). Birds Migrating in Formation 
(Accessed: 12th November 2018). Thank you to the University of Reading for letting us use their poster.

 

Guest Blogger: Mary Watkins – Using Turnitin to increase consistency in assessment marking and feedback

The School of Education and Childhood Studies (SECS) has used electronic submission since 2015, with all students asked to submit their artefacts through Turnitin, Moodle Assignments or Moodle Workshops. SECS reviewed our electronic submission process at the end of 2016/17 and looked for ways we could improve the process for students as well as markers.

Student feedback suggested that changes could be made to improve the consistency of feedback across units as well as the transparency of grades and the ways in which grades could be improved in the future.

Using Rubrics

Joy Chalke (Senior Lecturer) and Chris Neanon (Associate Head Academic) worked with Mary Watkins (Senior Online Course Developer) to create an electronic version of the School’s new grade criteria/mark sheet for each level of study, from level four undergraduates through to level seven postgraduates. The electronic version was created as a Turnitin rubric reflecting the four key areas against which students’ submissions are marked:

  1. research,
  2. content,
  3. argument and analysis,
  4. organisation, style and presentation.

The new Turnitin rubrics help to ensure that all markers evaluate student work against the same set of criteria, in this case based on these four key areas, and therefore respond to the students’ request for consistent marking.

Having attached the rubric to all Turnitin dropboxes, markers were able to indicate to students areas of strength as well as areas for improvement in the artefact submitted, and link these to the sections on the rubric. Upon receiving this feedback, students have been able to use the rubric to see how improvements to their research, content, argument and organisation will increase their grade in future submissions.

Using QuickMarks

Joy and Mary also worked together to create sets of ‘QuickMarks’, a tool in Turnitin that allows markers to drag and drop comments on to students’ submissions and therefore reduces the need for online markers to type the same comments repeatedly.

Joy drafted the comments for each level of study, which Mary then converted into corresponding sets of QuickMarks. Markers were then able to drag and drop relevant comments from the set onto students’ submissions, amending or adding to the comment(s) as appropriate.

A pilot was conducted in two units with large numbers of markers on the UG programmes and trialled in a unit on a PG programme. SECS will be monitoring this process throughout 2017/18 and using feedback from students as well as staff to improve the way marking is conducted and feedback is given to our students.

Keeping everyone happy – tricky but not impossible

Anonymous or blind marking is an important part of the assessment and feedback process. For a student it ensures work is marked fairly without bias. However, there is an equally valuable requirement for academics and support staff to be able to identify students who have yet to submit their assignment and may be in need of additional support.

In the paper-based past, this was a relatively easy task. Students submitted assignments with cover sheets which could easily be removed by administrators. Assignments were tracked and handed to academics for blind marking.

Online assessment technologies such as Turnitin and Moodle Assignment match up quite closely to the workflow of paper-based assessment, but with a few extra tools to help academics. There is no longer a need for students to identify themselves within their assignments, as we know who they are when they log into Moodle. In fact, by the letter of the law, a student can be penalised for adding their name to an assignment. In reality, though, some departments still require students to provide a cover sheet in their assignment, which invalidates the blind marking setting in their Moodle or Turnitin assignment. My guess is that the motivation for identifying students is one of trying to help them and ensure they don’t miss their deadlines. I’d be genuinely interested to hear the reasons for the need for cover sheets in the comments below.

What if there was a way for all the assessment stakeholders to get what they need and still preserve anonymity? Well, luckily there is now a way to do this in Moodle.

On each UoP Moodle unit you will find a new report under Course Administration > Reports > Course Submissions.

When an assignment is live, course administrators and Online Course Developers can see a submission status, Turnitin paper ID (or Moodle participant number), provisional grade and identifying information for each student in a cohort or group. This is all the information they need to keep an eye on the process and transfer grading information to student records later on. With a bit of extra magic, lecturers get to see a subset of this information, including identifying student information and a submission status, even when an assignment is anonymised. For academics there is no link between a submission status and a specific submission; that link is only released to the academic after the post date. Coupled with a release threshold, which prevents anyone guessing who’s who, the report attempts to keep everyone happy.
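To make the logic of the report a little more concrete, here is a minimal sketch of the kind of role-based filtering described above. It is illustrative Python only, not the actual report code: the field names, role labels and the threshold value of five are assumptions.

```python
# Illustrative sketch only: field names, roles and the threshold value are
# assumptions, not the actual implementation of the Course Submissions report.

RELEASE_THRESHOLD = 5  # assumed minimum submissions before statuses are shown to academics

rows = [
    {"student": "Student A", "status": "Submitted", "paper_id": 101234, "grade": 62},
    {"student": "Student B", "status": "No submission", "paper_id": None, "grade": None},
    {"student": "Student C", "status": "Submitted", "paper_id": 101240, "grade": 58},
]


def report_view(rows, role):
    """Return the columns each role is allowed to see while marking is anonymous."""
    if role in ("course_admin", "online_course_developer"):
        # Full picture: status, paper ID / participant number, provisional grade.
        return rows
    if role == "academic":
        submitted = sum(1 for r in rows if r["status"] == "Submitted")
        if submitted < RELEASE_THRESHOLD:
            # Too few submissions: showing named statuses could reveal who's who.
            return [{"student": r["student"], "status": "Hidden"} for r in rows]
        # Names and statuses only; no link to a specific (anonymous) submission.
        return [{"student": r["student"], "status": r["status"]} for r in rows]
    return []


for row in report_view(rows, "academic"):
    print(row)
```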

Here’s an idea of what the report looks like in practice.


In the near future we plan to allow staff to download the data from the course submissions report to a spreadsheet making it easier to transfer to student records.

I’d be interested to hear if this makes online assessment a little easier. Feel free to share your thoughts in the comments box at the bottom of this page. If you find the report useful, you may find that the new assessment course format helps you out too. A short introduction video is available here.

Attributions
“Anonymous” image courtesy of Luciano Castello CC: www.flickr.com/photos/luccast85/6250260580

Turnitin – What’s in a number?

The University of Portsmouth uses the Turnitin service to provide plagiarism detection and online marking facilities, and as a development tool for academic writing, although most users are interested in one thing – a number.

Contained within the Originality Report is a Similarity Score out of 100, which many users wrongly believe to be a plagiarism score with a magic number at which it can be conclusively determined whether plagiarism has or has not occurred. The problem is that this figure can be manipulated, there may be mitigating circumstances and, lastly, let us not forget that the system is not perfect either – there will be some margin for error.

Crudely speaking, the Similarity Score is the percentage of words in your document that matched text from other documents that Turnitin searched against. For example, if 300 words of a 2,000-word essay match other sources, the score will be 15. A shorter assignment with a direct question, and consequently a more concise correct answer, may well see a higher score than a longer assignment with more scope to include diverse material.

The number of students in your class, and whether the assignment has been set in previous years (or at different institutions), may limit the scope for truly original material. That’s not to say a very high score is necessarily acceptable; however, it does mean that the latest content may not be unique for genuine reasons. An assignment based upon group work is also a recipe for a higher than usual Similarity Score, since students are likely to be working from the same research, data and figures and will in all likelihood draw the same conclusions.

What does Turnitin check an assignment against? There are stored student papers in both a global central repository and the University of Portsmouth’s own repository (where we might store more sensitive documents). Turnitin also searches against material found on the internet and can check journals, periodicals and publications. Personally, I would check against everything: if the service is available, use it.

Turnitin offers several filters which may be toggled, for example whether to include or exclude bibliographic references. Personally, I cannot think of a reason why you would want to include bibliographic references in the Similarity Score, as citing sources is a requirement of good academic writing. That said, if the assignment were a lab report and references were not expected, then it might be safer to include bibliographic references, just in case the Turnitin software incorrectly identified a bibliography and consequently excluded all of the text that followed. You can also toggle quoted material: quotes would not normally be considered within a plagiarism report, although the volume of them may indicate a lack of original content from the author. Where quoted material is excluded from the Originality Report, Turnitin helpfully points out when more than 15% of the paper is quoted material. The final filter is for small matches. Matches of three or four words are usually rather inconsequential, and you may also have longer phrases that appear repeatedly throughout the assignment; the ‘exclude small matches’ filter stops these from being repeatedly matched and skewing the Similarity Score. Personally, I use all the filters, excluding bibliographic references, quoted material and small matches – I can always turn them back on later when reviewing a paper if I am suspicious.
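To illustrate why the score moves when these filters are toggled, here is a back-of-the-envelope sketch in Python. The word counts are invented and the real Turnitin calculation is more sophisticated; this only shows the basic idea of matched words as a percentage of the document.

```python
def similarity_score(total_words, matched_words, excluded_words=0):
    """Crude illustration: matched words as a percentage of the document.

    excluded_words are matches removed by the bibliography, quote or
    small-match filters. The real Turnitin calculation is more nuanced;
    this only shows why the score moves when filters are toggled.
    """
    assessed_matches = max(matched_words - excluded_words, 0)
    return round(100 * assessed_matches / total_words)


total = 2000          # words in the essay (invented figures)
matched = 500         # words matched against other sources
bibliography = 160    # matched words sitting in the reference list
quotes = 100          # matched words inside quotation marks

print(similarity_score(total, matched))                         # 25 - no filters applied
print(similarity_score(total, matched, bibliography))           # 17 - references excluded
print(similarity_score(total, matched, bibliography + quotes))  # 12 - references and quotes excluded
```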

So after searching against all of the available material, excluding bibliographic references, quoted material and small matches, what is the magic number? Well, the magic number is… the number at which you become suspicious of course!

Finally, to wrap up, and just in case a concerned student has stumbled across this blog post, I would like to emphasise that if they know they have not deliberately plagiarised then they have nothing to worry about. If they are concerned that they have used another source and may not have referenced it properly, then guidance is available from the Academic Skills Unit (https://kb.myport.ac.uk/Article/Index/12/4?id=2747).

 

Email: academicskills@port.ac.uk

Telephone: +44 (0)23 9284 3462

Or, visit the Academic Skills Unit in person during our opening hours:

Third floor Reception, The Nuffield Centre

St Michael’s Road

Portsmouth

PO1 2ED

Skills4StudyCampus – online study skills support

Many students arrive at the University without the study skills they need to achieve their academic goals. With the support of the Academic Skills Unit (ASK), students are able to enhance their skills by attending workshops and one-to-one sessions, as well as by receiving paper handouts. However, the University has also invested in a licence for the Palgrave Macmillan resource Skills4StudyCampus, which is available to our students online.

Skills4StudyCampus is an interactive online tool that helps students prepare for studying at university level and develop their study skills. There are six modules: Getting ready for academic study; Reading and note-taking; Critical thinking skills; Writing skills; Exam skills; and Time management.

Skills4StudyCampus Moodle site ‘Skills4Study@Portsmouth’

Students can access this resource by logging into Moodle and selecting the site Skills4Study@Portsmouth from the ‘Useful Sites’ drop-down menu.

Students are then free to actively participate in activities as and when they need to. The modules include: diagnostic tests to help students recognise areas in which they need to improve their skills; self-assessment tasks to help them gain a deeper understanding of their knowledge and skills; interactive activities to help reinforce the skills and knowledge they have learnt; and module assessments to test understanding of what has been learnt. Students can also use the My Journal feature, which allows them to make notes and reflect on their learning.

The modules have been designed to suit different types of learners and to support students with accessibility issues.

Embed Skills4StudyCampus into your Moodle course units
Skills4StudyCampus modules / sections can now be easily embedded directly into a Moodle course unit using the External tool activity.

To find out more on how to embed sections of this resource into your Moodle course unit, log into Moodle and select ‘Staff Help Site’ from the ‘Help Sites’ drop-down menu. You will then find a section that will provide you with guidance along with the generated embedding links.

Making online exams work for you

When it comes to online exams there are a number of questions that cause headaches for support staff and academics. Where am I going to find the time to create all the questions? How do I make sense of all these settings in a Moodle Quiz? How can I keep an eye on so many students during the exam itself?

The simple answer to all these questions is normally to speak to the right people. Your first port of call, if you’re interested in getting started with online assessment, is your friendly Faculty Online Course Developer (or the central eLearn team), who will be happy to advise or point you in the right direction.

Moodle is of course not the only tool for conducting online exams, but it is very good at handling large groups of students who are attempting many questions all at the same time. These questions generally have a right or wrong answer, and most can be automatically marked. Essay questions can also be posed, but these will require manual grading. Many students these days have difficulty writing by hand for three hours, so if your exam is heavily essay-based you might want to investigate a tool such as DigiExam, which allows students to type their answers (contact the eLearn team for more information about DigiExam).

A tremendous amount of question-writing effort has already been made at UoP by staff across faculties. There are close to a million questions already in Moodle: most created directly by staff, but a significant percentage imported from existing Word documents, shared by colleagues in other departments or institutions, purchased from commercial suppliers or imported from older systems. You don’t always have to start from scratch, as many academics already have treasure troves of questions that can be adapted or imported.
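As a hypothetical example of reusing existing material, a few lines of Python can convert a list of multiple-choice questions into Moodle’s GIFT import format, which the question bank accepts directly. The question content and file name below are invented for illustration, and your Online Course Developer can advise on the import itself.

```python
# Minimal sketch: convert multiple-choice questions into Moodle's GIFT import
# format, ready to import via the question bank. Question content is invented.

questions = [
    {
        "title": "Capital cities 1",
        "text": "What is the capital of France?",
        "correct": "Paris",
        "wrong": ["Lyon", "Marseille", "Nice"],
    },
]


def to_gift(q):
    """Render one multiple-choice question as a GIFT-format block."""
    answers = [f"={q['correct']}"] + [f"~{w}" for w in q["wrong"]]
    return f"::{q['title']}::{q['text']} {{\n" + "\n".join(answers) + "\n}"


with open("exam_questions.gift", "w", encoding="utf-8") as f:
    f.write("\n\n".join(to_gift(q) for q in questions))
```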

Once you have the questions you wish to pose, your next step will be setting up the quiz that will deliver them. This annotated PDF of typical Moodle exam settings walks you through the various quiz settings (many of which are set to the optimum value by default). Your Faculty Online Course Developer will be able to help out here, and can also assist with the important job of testing the quiz or exam.
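For orientation, the sketch below shows the general shape of the settings a closed-book exam quiz typically ends up with. The keys are descriptive labels rather than Moodle’s internal field names, and the values are assumptions for illustration rather than a recommended standard; always check against the annotated PDF and your Online Course Developer’s advice.

```python
# Illustrative only: the typical shape of closed-book exam quiz settings.
# Keys are descriptive labels, not Moodle's internal field names.
exam_quiz_settings = {
    "opens": "2019-05-13 09:30",        # when the quiz becomes available
    "closes": "2019-05-13 11:45",       # when the quiz stops accepting attempts
    "time_limit": "2 hours",            # per-attempt time limit
    "attempts_allowed": 1,              # a single attempt per student
    "shuffle_answers": True,            # shuffle options within each question
    "navigation": "free",               # or "sequential" to prevent back-tracking
    "review_during_attempt": False,     # no marks or feedback shown until after the post date
    "quiz_password": "set on the day",  # prevents early or unsupervised starts
}
```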

By this point you’ll have a working, thoroughly tested Moodle quiz that you could use for a summative assessment. As a member of staff you’ll have gone through a process of familiarisation. It’s important that you allow your students the same familiarisation with the online exam process (what to expect on exam day, how the software works and so on), not to mention any administrative staff and moderators who will be involved. It’s advisable to schedule some mock exam sessions well in advance of your first exam so your students are fully prepared when it comes to the real thing. Although it’s by no means compulsory, Safe Exam Browser (SEB) can be leveraged here. SEB is a web browser, available on all student PCs, which locks students down to a single Moodle quiz and prevents them from accessing other websites or resources. SEB will help you keep an eye on large groups of students and be certain they are concentrating on the task at hand. Take a look at these Safe Exam Browser FAQs if it’s something you might be interested in. DCQE also have a set of 30 Chromebooks which can be locked down into exam mode, potentially turning any wifi-enabled room into an exam room. More information, along with the Chromebook booking form, can be found here.

Hopefully this blog post has sparked your enthusiasm for giving online exams a go. The keys to success are (i) getting in touch with your Faculty Online Course Developer, who can help you at various points along the way, and (ii) starting with non-critical familiarisation exercises which give you room to find the edges of online assessment. It’s fair to say that you will have to dedicate a bit of time at the start to creating quiz questions, but the downstream benefits of online assessment can be significant.

Some useful resources

eAssessment at the University of Portsmouth

Quiz support materials for staff

Quiz questions examples and templates

DigiExam

Image credits: https://pixabay.com/p-1828268/?no_redirect
