Adventures in Technology Enhanced Learning @ UoP


Looking in the mirror with reflective portfolios in WiseFlow

Hey there, fellow exhausted souls!

Can you believe it? We’re finally coming towards the end of the academic year, and boy, has it been a fun ride! Our WiseFlow pilot has gone from strength to strength as we support academics through a new assessment process. More importantly, we have successfully run two separate assessments using our innovative approach of using WiseFlow as a reflective portfolio – the first use case of its kind that we know about! We’ve grown, learned, and potentially discovered an exciting prospect for the future of reflective portfolios at Portsmouth University, so let’s take a moment to reflect on the journey we’ve been on.

You may have read our previous blog post, “Unlocking the power of WiseFlow: Transforming ePortfolio assessments”, where we discussed the possibilities of using WiseFlow as a viable reflective portfolio platform and the benefits a reflective portfolio approach brings. For students, it helps develop their metacognitive skills and self-awareness as learners over a period of time. Academics, on the other hand, can use reflective portfolios to assess students’ learning outcomes in a more comprehensive and authentic manner. This is all part of our wider WiseFlow pilot to provide one integrated assessment platform that serves our current (and future) assessment needs within Portsmouth University, which Mike Wilson spoke to us about recently on our podcast – you can listen here.

Teach Well and Research-Informed Teaching

This year we ran two reflective portfolios within WiseFlow as part of our pilot project – to test the water and find out if this was even possible. The first was within our Research-Informed Teaching module, which supports early career academics to apply their learning about educational enhancement to their own contexts, through reflection and innovation. Students draw together higher education policy, research methods and educational developments to build their knowledge for their future work. Secondly, we ran a reflective portfolio in our new level seven Teach Well: Principles to Practice module, a professional development route for those in roles related to supporting student learning. Students in this module embark on a pedagogical journey through three pillars of practice for teaching well in higher education, gaining the confidence to critically evaluate learning and design approaches and reflecting on what it means to teach well across different modes of study. We recently caught up with Maria Hutchinson, who runs this module, in our podcast series – if you missed this one, you can listen here.

We’ve worked closely with these academics and our support teams to develop reflective portfolios for these modules that can serve as a summative assessment vehicle: one that is intuitive for learners, yet versatile enough to encompass a broad range of tools through which the course learning outcomes can be demonstrated in an engaging and meaningful way.

What the students said…

Following the submission of reflective portfolios into WiseFlow, we sent out a survey to participants to gain their feedback and views.  Some of the headline figures are detailed below…

  • 90% of students found the WiseFlow reflective portfolio easy to navigate
  • 90% of students agreed that a reflective portfolio suited this type of assessment (compared with traditional essay-based assessment methods)
  • 82% of students felt their own students would enjoy using a reflective portfolio in WiseFlow
  • 71% of students enjoyed the interactive assessment methods, such as histograms and voice recorders
  • We received multiple comments about the clear instructions that were given on how to access and use WiseFlow, as well as its reliability and stability as a platform. Many users also commented positively on the functionality that WiseFlow offered compared to previously used portfolio solutions.

Students also commented on…

  • Whether there was a need to add another system to Portsmouth University’s available assessment platforms – “There are too many platforms for submitting the work, Moodle, ePortfolio, WiseFlow, it is really confusing and frustrating that is necessary to learn how to use different platforms for different modules.”
  • The lack of formatting transfer from applications such as Word, when copying and pasting into WiseFlow – “Transfer of formatted MS Word document to WiseFlow could be improved. Currently, the document format is lost during the cut & paste process which then requires more effort to re-format within the WiseFlow portal.”
  • Better integration between Moodle and WiseFlow – “I’d like to see direct access from Moodle”.

The data presented highlights the positive reception of WiseFlow as a reflective portfolio solution among students. That such a high percentage of students recognised the suitability of a reflective portfolio as an assessment method, compared with traditional essay-based approaches, and praised its usability is a really positive sign. The positive feedback on the interactive assessment methods further emphasises how the question bank of a traditional FlowMulti assessment can be adapted and used in an innovative way.

However, students raised some concerns, such as the frustration of managing multiple assessment platforms at the university, indicating a need for better integration. This all links to our Digital Success Plan to (re)design robust assessments that meet the needs of the diverse student population within a blended and connected setting, and to incorporate a robust specialist end-to-end assessment platform. Our aims in the project were to make it easier for academics to design assessments, easier for students to find their assessments and feedback, and to support staff by reducing the manual work around assessments for academics. During the next stage of the pilot project, integration with our current systems is a top priority and will alleviate these challenges. Furthermore, the lack of formatting transfer from applications like Word to WiseFlow was highlighted as an area for improvement. These critical comments provide valuable insights for further refining and optimising the WiseFlow system.

The evidence is clear to see – WiseFlow can provide a viable solution for reflective portfolios, and with a bit of refinement it could be excellent.

What the staff said…

It was also vital to us that we gathered feedback from our academic staff.  

  • 100% of staff agreed that WiseFlow allowed them to develop their assessment in ways that were not previously possible
  • All staff agreed the WiseFlow reflective portfolio allowed them to fully cover learning objectives and meet the needs of their students
  • We received multiple comments about the platform’s speed, intuitive nature and search functionality, which made the verification/moderation process seamless. Staff also commended the accuracy of the rubrics for grading, and said the new interactive elements made them rethink how they could better use this type of functionality in the future.

Staff also commented on…

  • Comparisons to previously used portfolio platforms – “Historically the module used [another portfolio system] which was really clunky and didn’t work well at all. I really liked that Wiseflow could be scrolled across (as opposed to clicking through each page) and the layout was great”
  • Design elements within the marking interface – “It would have been useful to have had the comment box movable (I work with two screens and being able to drag the box to another screen to write on would have been a nice touch – several times I had to keep opening and closing the box as I wasn’t able to see the text underneath it)”
  • Having more time to explore the platform – “I did not feel I had enough time to play before it went live for students, but this was not WISEflow’s fault – it was just timing”. 

In all honesty, we’ve been blown away by our staff feedback. The unanimous agreement that WiseFlow enables new possibilities for assessment development speaks very highly of this solution and its potential to enhance the teaching and learning experience for students at Portsmouth University. The potential to create authentic assessments through the use of reflective portfolios is exciting. The accuracy of the grading rubrics was also very highly commended – giving students a greater chance of achieving a clear and defined target, and making academic decision-making easier, fairer and more accurate. In terms of developmental areas, the movement of the comment box is a fair point – we’ve heard from other academics about the size of the comment box before – hopefully something that WiseFlow’s New Marker Journey will alleviate.

Where do we go from here?

As we raised in our first blog post, the reflective portfolio solution in WiseFlow is far from perfect, but with a few simple tweaks it could become very appealing. Sadly, some of these are out of our hands and lie within the code of the platform. As a project team we’ve learnt a lot over the course of this assessment, including developmental areas we have highlighted for the future.

The single biggest limiting factor when using a reflective portfolio is the file upload question type. This is limited to twelve files of no more than 10MB each – multiple file upload questions can be used, but each will still carry these limits. We have approached WiseFlow about this for development purposes; however, we have yet to see any significant movement on removing the limit. Its removal would put WiseFlow in an incredibly powerful position to offer another “string to their bow” in terms of assessment choice, and would truly open up the use of reflective portfolios within the platform. Sadly, with this limit in place, using reflective portfolios with some faculties, such as our Creative and Cultural Industries (where students regularly upload large .psd files, CAD files, HD video and high-quality audio), is just not a viable option. Creative students will often build a “portfolio career” and we would love to work with them on developing reflective portfolios, but this limit stops us. Until it is removed, careful consideration must be given at the planning stage of an assessment as to whether the reflective portfolio is the correct solution.

Further to this, other limitations must be considered – for example, once the reflective portfolio is live for students to complete, it cannot be altered, changed or adapted. During the pilot, we’ve worked extensively with academics and our support teams to iron out any issues prior to release. Careful planning and consideration must take place in the authoring phase of an assignment, which is then rigorously checked prior to release – in the same way an exam would be. This has worked at a small scale, but we would need to ensure appropriate support mechanisms are in place at a larger scale.
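To make those limits concrete, here is a minimal sketch – our own illustration, not a WiseFlow feature – of the kind of pre-flight check a course team could run at the planning stage, to see whether a candidate set of files would fit within a single file upload question:

```python
# Illustrative pre-flight check (our sketch, not a WiseFlow feature):
# validate a candidate set of portfolio files against the limits above,
# so problems surface at the planning stage rather than at submission.
from pathlib import Path

MAX_FILES = 12        # files allowed per file upload question
MAX_SIZE_MB = 10      # maximum size per file, in megabytes

def check_portfolio_files(paths: list[str]) -> list[str]:
    """Return human-readable problems; an empty list means all clear."""
    problems = []
    if len(paths) > MAX_FILES:
        problems.append(f"{len(paths)} files supplied; only {MAX_FILES} fit in one question.")
    for p in map(Path, paths):
        if not p.exists():
            problems.append(f"{p} not found.")
            continue
        size_mb = p.stat().st_size / (1024 * 1024)
        if size_mb > MAX_SIZE_MB:
            problems.append(f"{p.name} is {size_mb:.1f} MB; the limit is {MAX_SIZE_MB} MB.")
    return problems

for issue in check_portfolio_files(["reflection.pdf", "teaching_video.mp4"]):
    print(issue)
```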

Our student feedback gave us valuable insight into the process of using WiseFlow. Although reflective portfolios save every 10 seconds, if a student deletes a file or a piece of text and exits the platform, it cannot be recovered. Over the duration of the assessments that took place, we encountered one reported instance of this. We also had some reports of formatting that would not copy across from Word documents. Again, we approached WiseFlow about this, and the recommendation is to copy and paste plain text from Word and finish the styling in the WiseFlow text editor. Although this workaround works, formatting that copied across would make students’ work transfer much more easily – particularly for those who write in external documents before copying into the platform at the last minute (like myself). In terms of progression beyond WiseFlow, we’d love for students to be able to take their work from the platform and have the ability to store it themselves or share it beyond the WiseFlow platform. Currently, there is no solution to this. A “zip folder” containing all uploaded files and exports of any answers entered into WiseFlow would be a great starting point. Again, we’ve put the idea forward to WiseFlow, but have yet to see any movement on it.
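To show what we mean, here is a rough sketch of the kind of export bundle we have in mind – entirely hypothetical, since WiseFlow currently offers nothing like it, and all the file and field names are invented:

```python
# Entirely hypothetical: a sketch of the "zip folder" export we proposed,
# bundling a student's typed answers and uploaded files into one archive
# they can keep or share beyond the platform.
import json
import zipfile
from pathlib import Path

def export_portfolio(answers: dict[str, str], uploads: list[str], out: str) -> None:
    with zipfile.ZipFile(out, "w", zipfile.ZIP_DEFLATED) as zf:
        # Typed answers are stored as a single JSON document...
        zf.writestr("answers.json", json.dumps(answers, indent=2))
        # ...and every uploaded file is copied into an uploads/ folder.
        for path in uploads:
            zf.write(path, arcname=f"uploads/{Path(path).name}")

export_portfolio({"q1_reflection": "This term I learned..."}, [], "portfolio_export.zip")
```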

Where do we take our pilot now?

Although there are risks with using a reflective portfolio solution in WiseFlow, the prospect and potential gain of this authentic assessment are exciting. We’ve taken the plunge and proven the concept works, highlighting potential development areas which we really hope get some traction – and we’d like to think WiseFlow will be open to listening to these developmental ideas. As for our pilot project as a whole, we move into a second phase with a continued focus on reflective portfolios, but also on other areas of assessment we have struggled with in the past, such as large file submissions. We are actively developing a wealth of training and support, and working with local teams to ensure staff feel confident using the systems.

We continue to have a waiting list of academics wanting to work with us to develop reflective portfolios in WiseFlow. I find myself meeting with academics on a weekly basis to discuss potential projects and reflective portfolio solutions in their disciplines. So far, we’ve done no real advertising; this interest has been created by word of mouth and by those who have used it as students. We are keen to share our experiences with other universities in WiseFlow user groups, who are actively keen to explore and learn about our innovative approach. However, we need to be open and honest about the limitations that this solution has at the moment. Collectively, we might hold enough power to make change happen, but until that point, caution must be taken before embarking on a reflective portfolio, to ensure it is the correct fit for the assessment.

The potential of this solution is game-changing, not just for us, but for a lot of other Higher Education institutions across the world.

The future of reflective portfolios in WiseFlow is exciting – keep watching this space.  

Chris

Credit Image: Photo by MidJourney 

Guest Blogger: Co-Creating Expectations with Vevox

Introduction by Tom:

I was asked by Vevox (a company we work closely with that facilitates audience response) to run the first session in their autumn webinar series. I was happy to do this, and you can watch the recording of the session on YouTube.

After the session, Joe from Vevox asked if I would mind someone writing a blog post about the session. I was flattered and said of course. Dr Rachel Chan from St Mary’s University in Twickenham wrote her blog and shared it with me, and I asked her if we could re-publish it here on Tel Tales. She was happy to let us use it… so this blog is a short reflection from Rachel after attending my webinar on “Co-Creating Expectations with Vevox”.

Co-creation Blog

St Mar's logoMy name is Rachel Chan, I am a Senior Lecturer – Clinical Specialist Physiotherapist teaching on the BSc in Physiotherapy at St Mary’s University in Twickenham. Throughout my academic career, I have always been hugely committed to Teaching and Learning. I recently listened to a talk by Tom Langston from the University of Portsmouth about co-creation and thought it might be valuable to write a short blog to share some of his key messages.

Tom began by asking us a question: ‘What is co-creation?’ We were all on the right track; people suggested things like ‘student partnership’, ‘collaboration’ and ‘support’. Bovill and colleagues (2016) define it as ‘…when staff and students work collaboratively with one another to create components of curricula and/or pedagogical approaches.’ Great – so where does it work? Tom showed us that co-creation can work in many areas of pedagogy, including setting expectations, assessment criteria, curriculum content and assessment design. I was already sold by this point, but there are many less obvious benefits to adopting co-creation in your pedagogical practice.

  1.  It enables you to better meet expectations (the students’ expectations of you, your expectations of the student and more subtly but equally important, the students’ expectations of each other). An important tip Tom shared was setting these expectations as early as possible so that everyone knows the playing field from day 1.
  2. It facilitates a dynamic approach to your teaching practice, encouraging you to reflect on what you do and allowing you to evolve as an educator. CPD in action!
  3. It gives the students a voice – of course, it is impossible to accommodate all of their suggestions, and no one is suggesting that you do. Phew! But listening to students, and showing them that you will try to accommodate some of their suggestions, opens the channels of communication – they know that you care and that you have heard them. This is SO important.

The idea of co-creation may make some educators feel anxious and, in some areas, it will be easier to implement than others (assessment design may be more challenging, for example), but you can and should start small. Bovill and Bulley have created a ladder that models co-creation: it shows a dictated curriculum at the bottom and an anarchic level, with students in control, at the top (https://eprints.gla.ac.uk/57709/1/57709.pdf). Tom wasn’t suggesting you aim too high, but believes adopting some co-creation in your practice will have huge benefits for all.

How can you adopt this principle of co-creation? There are many ways to successfully include co-creation in your teaching, such as using an EVS (electronic voting system) to make quizzes or simply creating a collaboration space to stimulate discussions with students.

My take-home message…Step 1. always try to engage your students in your teaching, and perhaps more importantly…Step 2. respond to that engagement. Thanks, Tom, I am inspired!

If you have any questions or would like to know more about co-creation, please contact Tom at:  tom.langston@port.ac.uk

Using video (Panopto) feedback to encourage student engagement with assessment

This blog is written by Tom Langston, Digital Learning and Teaching Specialist, and Dr Jo Brindley, Course Leader for the Academic Professional Apprenticeship (APA). The APA is a course for early career academics within the University, providing them with ideas, support and guidance on developing their teaching skills. The course is constantly evolving in how it is delivered, as it was designed to highlight best practices and current ideas within Higher Education. The course, as you will see, decided to innovate and deliver feedback using Panopto, providing the opportunity for academics (as students) to experience a range of feedback types and engage with non-traditional forms of feedback on assessment.

Tom: 

While the University has always had the option of recording videos, it wasn’t until Panopto, and its integration within Moodle, that I considered using video as a tool for feedback. That’s not to say others hadn’t done it: I know people like Philip Brabazon have been giving audio feedback (not video, I know), which has been positively received.

So with that being said, I recently graduated from the first cohort of the Academic Professional Apprenticeship and have since started working closely with the course team. I was asked to help mark a few assignments, and before I undertook this I asked if I could deliver the feedback as video. Jo was keen to see how this would go and decided to do her feedback in the same way (I think she just needed an excuse, and possibly a safety net, to do it).

We planned how we would approach the feedback and decided that Panopto would give us the easiest way to implement it. It let us put our faces on screen, so we could raise points that needed investigation in a non-confrontational and open way. The ability to record the screen with the assignment and the marking criteria displayed at the same time helped us show how we mapped our thoughts and marking to the submission.

This was my first experience as a marker, which might make my view a little unrepresentative. Having never marked scripts in the traditional way (either pen and paper or on-screen), giving video feedback felt a more comfortable way to mark, as I knew what I was saying would not be misinterpreted. Now, the argument here might be that it is a “quick” option; however, being new to marking, I actually did both. First, I worked through the submissions and wrote down my feedback about each section; then I marked each one again on camera, reading back what I had noted the first time.

When we were devising this marking process, we made the conscious decision not to worry about being “perfect” or going back to re-record mistakes. We wanted it to be as conversational as possible so it felt natural and genuine. Not everyone is going to want to be on camera, but the same can be achieved with the audio feedback I mentioned earlier.

The other nice feature of Panopto is its ability to track views. It is possible to see how much of a video someone has watched and how long they spent reviewing the material. For me, the eye-opening part was that for many of the submissions I marked, the student watched the first few minutes of the introduction and then skipped through the bulk of the video until they got to the feedback for the final, more heavily weighted part of the submission – a literature review, similar to that written in a journal article. They skipped much of the feedback surrounding the reflective elements of the portfolio they had created as part of the submission. I found this interesting, as these were the areas where I felt most people needed more work, compared to the final section, which was more similar to research work they may have previously undertaken.

With this in mind, I would still provide the detailed feedback that I did, as I only marked a few submissions and not every student will approach their feedback in the same way. It is something I will review each time, though: it would be an element to discuss at the start of the assessment process with the students, to find out what they might value – whether they just want a grade for certain sections, or a detailed breakdown across the whole assignment. These conversations would be key in helping students to engage: if they are asking for a certain level of feedback, they will hopefully then investigate each area accordingly.

Jo:

The impact of the pandemic has been a catalyst to try out new ways of working and I was excited by the suggestion from Tom that we try out using a screencast as a way of providing more personalised feedback as part of the assessment process. 

I have historically used audio feedback, and I know that this was always positively received by learners, so the opportunity to use Panopto was one I was keen to experiment with. For me, one of the benefits was that the screencast enabled four views: the assessment artefact itself, the marking criteria, the marker and the associated captions. This felt like a really robust way of delivering feedback, as it was easy to link the marking criteria with the submitted assessment on screen, which helped the learners join the dots regarding the award of marks.

As Tom has said, we took a conversational approach to the feedback, but this didn’t make the feedback provision a swift process, as planning and note-taking were also required. I think this was useful to Tom as it was his first time marking: when we met for calibration after marking the same submissions at the start of the process, these notes assisted our conversation. As I progressed through the bulk of the marking I started to utilise the pause facility, which I think made the process quicker (fewer notes to capture), and this didn’t seem to affect the overall quality of engagement with the feedback.

I was pleased with the approach and quality of feedback provided. There was definitely more scope to work on feed-forward and we will be providing feedback in this way during the next assessment diet.  Comments from the External Examiner around this approach were very positive. 

Credit Image: Photo by Przemyslaw Marczynski on Unsplash 

Similarity scoring is a secondary consideration for online assessment…

Similarity scoring should be a secondary consideration for online assessment. Much more important factors, from my point of view, are ease of marking for academics; access to quality feedback for students; and innovative authentic assessment workflows.

Turnitin are close to monopolising the market on similarity scoring of student papers, but many assessment platforms already use Turnitin and Urkund as plugin services to provide similarity scoring.

Where should we be focusing our effort at UoP?

As an institution, one of our strengths lies in quiz/question-based assessments. This is particularly the case in the Science and Technology faculties. We have a mature, sophisticated platform in Moodle to deliver these types of assessments, and a deep level of staff expertise across the organisation, which has developed further throughout the pandemic.

The risk factors for UoP include the need to increase capacity for online exams (or diversify some of our assessment types onto an external platform at peak periods) and the ability to innovate in terms of essay/file-based assessments.

From what I can see, Turnitin has stagnated in terms of assessment innovations in recent years and has not yet improved service reliability at key assessment periods by migrating its platforms to a service like AWS. This has been promised repeatedly but not yet delivered.

This is potentially a reason why we saw growth in Moodle assignment and quiz usage during the pandemic rather than a big increase in Turnitin usage (trust in the reliability of the service and flexibility of the functionality).

So where could we focus our effort to improve the assessment tools for educators and students to gain the most benefits?

Innovative assessment workflows

Posing a long-form question to a student and easily marking the finished product should be a simple process – and it is on platforms such as Turnitin. However, we are increasingly adapting our assessments to be more authentic: assessments that more closely match how students will operate in the workplace. This often requires more sophisticated workflows and mechanisms, which should still be straightforward for academics to engage with and make sense of if they are to be successful. 

Traditional paper-based exams (potentially bring your own device)

During the pandemic, staff were forced to transition away from paper-based exams. Many exams were instead delivered as coursework or window assignments (e.g. a 2hr assignment within a 24hr window) or as question-based quiz exams. When exam halls are available again, staff may revert to previous paper-based solutions. After all, we know how these work, and paper doesn’t need charging or a stable wifi connection. However, we can harness this forward momentum with a platform dedicated to supporting timed essay assignments on students’ own devices or University machines. Several platforms offer functionality for students to download assignments at the start of an exam, with no need for an internet connection until it’s time to submit at the end. This could represent a robust, safe exam experience that more closely matches how students study today. Who handwrites for three hours any more? I’d be willing to bet most students don’t.

There are challenges with BYOD (bring your own device) particularly around charging and ensuring student machines are reliable. Many of these challenges can be solved with a small stock of fully charged devices, which can be swapped out to students when needed. Chromebooks are ideal online exam devices for this very reason, due to their long battery life and simple configuration. 

Assessment feedback

Workflows such as “feedback before grades” can help students better engage with their feedback, but better access to feedback for students in a variety of places is also key.

Services that offer a holistic view of assessment feedback, or the ability to extract these comments via API so we can build our own views, are increasingly valuable. This functionality will enable key staff such as personal tutors or learning support tutors to view student feedback as a whole (rather than in silos) to spot key areas to help students improve their academic work.
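As a sketch of what “building our own views” might look like, here is a minimal example. The endpoint and field names are invented – no particular platform’s API is implied – but the pattern of pulling every comment and grouping it per student is the key idea:

```python
# Illustrative only: the base URL, endpoint and field names below are
# invented for this sketch; no specific platform's API is implied.
from collections import defaultdict
import requests

BASE_URL = "https://assessment.example.ac.uk/api/v1"   # hypothetical service

def feedback_by_student(course_id: str, token: str) -> dict[str, list[str]]:
    """Pull every feedback comment for a course and group it per student,
    giving a tutor one holistic view instead of per-assignment silos."""
    resp = requests.get(
        f"{BASE_URL}/courses/{course_id}/feedback",
        headers={"Authorization": f"Bearer {token}"},
        timeout=30,
    )
    resp.raise_for_status()
    grouped: dict[str, list[str]] = defaultdict(list)
    for item in resp.json():          # assumed: a list of comment records
        grouped[item["student_id"]].append(item["comment"])
    return grouped
```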

To round out where I started with this post, providing similarity checking is an important part of modern assessment – but it is a problem that has already been solved, multiple times.

If we make assessment more authentic, more flexible and more collaborative there will be less need for plagiarism detection because students will be demonstrating more of the attributes we want them to leave University with. I accept this is perhaps an overly idealistic viewpoint, as there are a lot of students to assess each year, but this is more reason to explore flexible assessment solutions that can make the lives of academics and students a bit easier.

Online assessment in the time of Covid

In pre-Covid times, exams delivered via Moodle were limited by the availability of suitable physical spaces. Exam rooms represented a bottleneck to the number of students taking exams concurrently.

For the last year, we’ve used Moodle (and integrated platforms) to deliver the majority of our teaching and assessment online.

A visualisation of the online assessment mix at the University of Portsmouth:

[Diagram: how assignments and exams overlap during the assessment period]

In May 2020 many academics who had previously planned to deliver paper-based exams had to quickly adapt and deliver online assessments. In some cases, these required students to scan or take pictures of their work and upload these to assignments (Moodle or Turnitin) for marking. 

In recent months, newer platforms have been developed to handle this workflow and ease the marking burden for academics – platforms such as Turnitin’s Gradescope and CrowdMark. These platforms leverage the similarities in students’ answers so academics can mark many answers at once. When time allows, we hope to evaluate these platforms in more detail.

In the diagram above you can see “Assignments under exam conditions” as the meeting point between traditional essays and restricted online exams. This year we have seen a big growth in this area as academics move from paper-based written exams to time-restricted assignments. An obvious caveat here is that these haven’t been conducted under true exam conditions and so are best described as open book exams. Many digital assessment platforms now include various types of proctoring and would be able to handle remote time-restricted essays (and other assessment types) securely. There are, however, a number of ethical issues to be considered with online proctoring, and we need to proceed cautiously here. 

As a University, I feel we should also be looking to expand our capacity for online assessment: over the next decade we will probably see the end of paper-based exams in favour of typed essay papers delivered online, due in part to student expectations.

Academics have had a year to adapt to exams in lockdown, and many have discovered the benefits of Moodle quizzes for exams that offer automatic marking. (And note that Moodle is excellent at delivering scientific and mathematical exam questions, as well as longer coursework assignment submissions.) Generally speaking, the Technology and the Science and Health faculties deliver the majority of our Moodle quiz-based exams, and the number of exams has grown significantly during the lockdown. Many academics don’t want to go back to paper.

In Technology Enhanced Learning we oversee online exams and assessments in terms of supporting and evaluating the digital tools and making sure Moodle can handle the number of exams thrown at it. The number of online exams has increased substantially over the last year, all funnelled into two exam windows. As a team we work closely with colleagues in IS to provide more capacity in Moodle and with timetabling to ensure the exams are evenly distributed to avoid terminal peaks of concurrent users, providing a stable Moodle platform for all users.

Without the bottleneck of physical exam rooms, the January 2021 exams were initially weighted in favour of academic requests to have exams earlier in the day and to use only the first week of the exam window, to maximise available marking time. Unfortunately, this translated into a scenario that would have presented a significant number of terminal peaks of concurrent users on Moodle. Members of TEL worked closely with the central timetabling unit to level out these peaks and, with the exception of one or two slow points, we all delivered a successful exam window in January.

In advance of the May/June exams, we have gone further and set hard parameters around how many exams (quizzes) or timed assignments (Turnitin or Moodle assignments) can be timetabled in any given time slot. We’d like to thank CTU for their tireless effort to make this happen. It wasn’t an easy job to manage all the necessary requirements, but it’s given us an exam timetable that looks like the image below. This really is invaluable work for the University, when assessment represents so much effort by students, academics and support staff.

[Screenshot: one week of the exam timetable, showing days, dates and sessions, the assignments and exams in each slot, and the total number of students expected to be in Moodle during each period]
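To give a feel for the hard parameters idea, here is a minimal sketch – our illustration, not CTU’s actual tooling, and the concurrency cap is an invented number – that sums the expected students in each slot and flags any slot that would breach the cap:

```python
# Minimal sketch of the "hard parameters" idea: sum expected students per
# timetable slot and flag any slot that would breach a concurrency cap on
# the Moodle platform. The cap below is an invented illustrative number.
from collections import defaultdict

MAX_CONCURRENT_STUDENTS = 2000   # hypothetical platform cap

def find_overloaded_slots(bookings: list[tuple[str, int]]) -> dict[str, int]:
    """bookings: (slot, expected_students) pairs -> slots over the cap."""
    totals: dict[str, int] = defaultdict(int)
    for slot, students in bookings:
        totals[slot] += students
    return {s: n for s, n in totals.items() if n > MAX_CONCURRENT_STUDENTS}

print(find_overloaded_slots([
    ("Mon 09:30", 1200), ("Mon 09:30", 950),   # two exams sharing a slot
    ("Mon 14:00", 800),
]))
# {'Mon 09:30': 2150} -> move one exam to level out the peak
```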

Our increasing reliance on online assessment means, I think, that we should investigate new technologies to support that function. Platforms such as Gradescope or CrowdMark could help relieve the marking burden; one of the many platforms such as WiseFlow, Mettl or Inspera could provide extra exam capacity (with the functionality to proctor exams, if that was something the University wanted to do). Moodle, with its advanced quiz and assignment capabilities, would continue to play a key role.

I believe we will get through this coming assessment period well, but as our reliance on online assessment grows, so must the technologies that support it.

For the University, the Covid-19 pandemic has been a driver for the uptake of online learning and assessment. As a University community, we need to harness this positive momentum and diversify our offering of assessment platforms to support students and staff.

Credit Image: Photo by MayoFi on Unsplash 

Situational judgement assessment

So I think it is fair to say that Covid-19 has thrown us all into having to think outside the box when it comes to delivering the usual, more ‘normal’, course assessment types. This couldn’t be more true for our academics within the Paramedic Cert HE course.

A few weeks ago, I was asked to support a colleague within this team to help deliver an online assessment that would replicate what would have been a practical examination on adult and paediatric resuscitation. 

With Covid not going away anytime soon, and practical assessment dates looming, we had a very short space of time to develop an online assessment that would best simulate ‘hands-on’ resuscitation scenarios.

In response to the need to deliver an alternative assessment, Jane Reid – the Course Leader for the Paramedic Cert HE course – wrote four situational judgement assessments; each scenario had sequential situations, and each situation had a series of situational judgement questions to test a student’s knowledge of the actions to take within the ever-changing scenario.

Situational judgement assessment has been used in healthcare for years. It allows participants to experience scenarios as close to real life as possible, without risk (in our case, Covid), enabling them to identify, in order, their responses to given situations.

[Image: three coloured text boxes displaying the storyboard steps]

Storyboarding a situational judgement test:

  1. Scenario – Includes detailed descriptive text, containing key information that sets the scene. This can also include images, audio or video to further illustrate the scenario.
  2. Situation 1 – Content that builds on the initial scenario; it contains the next layer of information relating to the scenario at hand and includes the first set of situational judgement questions.
  3. Situation 2 – Content that builds on the previous situation and includes the next set of situational judgement questions… and so on until the end of the scenario.

There were many considerations that had to be made whilst developing this assessment type – mainly to keep the assessment as authentic as possible. For example:

  • providing media to set the scene; 
  • keeping the narrative on track – ‘time’ is of the essence in any resuscitation scenario, so it was important to include timely details within the situations; and 
  • replicating the quick thinking that would be required in a real-life situation by using the sequential format in the quiz, so that students had to take notes or work from memory, as they couldn’t return to previous situations to guide them.

The student experience was another really important factor in delivering this assessment – most of the students may never have experienced this type of examination. It was essential to provide clear and consistent instructions to guide them through the process. Before the main assessment, we also created a formative version of the quiz so that students could familiarise themselves with what was expected.

We used a Google Document, with tables, to structure the content in the development stage, and a Moodle Quiz activity (multiple choice) to build and deliver the assessment, which worked very well. The feedback from both students and examiners has been really positive, with scope for also using this assessment as a CPD exercise for practitioners.
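For anyone wanting to try something similar, here is a rough sketch of the build step: modelling the storyboard as data and emitting questions in Moodle’s GIFT import format. The clinical content below is invented for illustration; GIFT is a standard Moodle question import format:

```python
# Illustrative sketch: model the storyboard as data and emit Moodle
# GIFT-format multiple-choice questions for import. The clinical content
# here is invented; only the GIFT syntax itself is standard Moodle.
scenario = {
    "title": "Adult resuscitation",
    "situations": [
        {
            "stem": "You arrive on scene; the patient is unresponsive "
                    "and not breathing. What is your first action?",
            "correct": "Start chest compressions",
            "wrong": ["Wait for backup to arrive", "Move the patient"],
        },
    ],
}

def to_gift(scenario: dict) -> str:
    lines = []
    for i, s in enumerate(scenario["situations"], start=1):
        answers = [f"={s['correct']}"] + [f"~{w}" for w in s["wrong"]]
        lines.append(f"::{scenario['title']} Situation {i}:: "
                     f"{s['stem']} {{ {' '.join(answers)} }}")
    return "\n\n".join(lines)

print(to_gift(scenario))  # import the output into a Moodle question bank
```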

Interest has also been shown by academics at other universities, along with Ambulance Trust managers, who wish to explore this particular method of assessment. The methods for assessing the learning of resuscitation have seen little evolution from the traditional OSCE format; therefore, this format, created for a small group of students, may well develop over time.

Developing this was by no means easy given the time constraints. However, it is a great example of an alternative assessment that has been developed from creative thinking during the lockdown.  

Peerwork – a tool for group feedback.

Recently Coventry University released a new plugin for Moodle around the idea of group and peer feedback. A colleague highlighted the new tool to me and at first glance I thought it looked like a promising solution to one of the requirements many academics have while running group work: the ability for students to score the contribution of individuals within the group and provide either public or anonymous feedback to group members.

Currently Moodle provides various options to support group work and peer learning, because Moodle HQ realises that these approaches hold an important place in the arsenal of many academics. Firstly, Moodle provides a generic framework for creating groups – these can then be allocated to an activity (such as discussion boards, wikis or group assignment submissions).

Secondly, and with a greater focus on the use of peer learning, Moodle provides the Workshop tool.

While groups can be added and used within the Workshop, the idea is predominantly that students add a submission. The submission is then allocated to a specified number of their peers, who then grade and provide feedback on it.

If you haven’t used the Workshop tool in anger, here is a quick overview of how to use it as a peer-assessment tool:

  1. All students submit their work (traditionally this will be an essay, but it could be work in some other format).
  2. The work is allocated to the other students. This can be scheduled and automated if required.
  3. Every student marks the assessments they have been given (academics can also provide feedback, although this is not a requirement).
  4. Each student receives a final grade for the submission and a grade for their ability to assess the work (academics can overwrite grades should they feel the process has proven unfair).

This tool provides students with a fantastic opportunity to reflect on their own writing and work while comparing it to that of their peers. However, it does not allow a group to provide anonymous feedback to their peers on projects. To do this, academics currently have to find solutions outside of Moodle. The most notable option is TeamMates, which allows groups to give feedback on the overall project work and then score the engagement of the rest of the team throughout the project.

We now have a new Moodle-based solution! Peerwork, created by Coventry University, is a Moodle plugin that provides a peer feedback option for group work. You can learn more about this approach from the video they have produced:

While working through Peerwork, I was really impressed with its simplicity of set-up and use. I created the framework as an academic, but also completed the process as a student. Using multiple test accounts, I was able to understand how the process would work from both sides and see how the overall grade given to a group can be adjusted through the peer reviews of the work.

My only criticism was really just my initial misunderstanding of what the tool did (so not really a criticism of the system). When I uploaded a document as a student, it cascaded to the other members of the group. Not every student needs to upload a file: the tool is targeting the students’ feedback on their peers and on how the group worked throughout the project. The upload was almost a secondary consideration in the process.

Peerwork is not the Workshop reimagined. They are two very different tools, each serving a specific purpose.

The Workshop facilitates a student writing a piece of work and submitting it, with other students providing feedback on and evaluation of that work.

Peerwork allows groups to discuss, rank and analyse how the entire team worked together over the course of the project. The work is created by the team for evaluation by the academic, but the feedback given by the group on each member will directly affect the shared grade of the team.
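Peerwork’s exact calculation isn’t described here, so purely as an illustration, here is one common approach to this kind of peer moderation (WebPA-style scaling), where each member’s share of the normalised peer ratings scales the group grade up or down:

```python
# One common approach (WebPA-style scaling) to turning peer contribution
# ratings into individual grades. An illustration only: this is not
# necessarily Peerwork's exact formula.
def individual_grades(group_grade: float,
                      ratings: dict[str, dict[str, int]]) -> dict[str, float]:
    """ratings[rater][ratee] = contribution score given by rater to ratee.
    Each member's share of the normalised ratings scales the group grade."""
    totals = {member: 0.0 for member in ratings}
    for rater, given in ratings.items():
        rater_sum = sum(given.values()) or 1
        for ratee, score in given.items():
            totals[ratee] += score / rater_sum   # normalise each rater's scores
    mean = sum(totals.values()) / len(totals)
    return {m: round(group_grade * t / mean, 1) for m, t in totals.items()}

print(individual_grades(65.0, {
    "ann": {"ann": 3, "bob": 4, "cat": 3},
    "bob": {"ann": 3, "bob": 3, "cat": 4},
    "cat": {"ann": 4, "bob": 4, "cat": 2},
}))
# {'ann': 65.0, 'bob': 71.5, 'cat': 58.5} -> bob rated highest, cat lowest
```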

Peerwork is currently on a test installation of Moodle.

If you would like a demonstration to see whether it would fit your need, please contact tom.langston@port.ac.uk

Credit Image: Photo by John Schnobrich on Unsplash

Some comments on “The future of assessment”

The Curriculum Framework Specification document, which provides detailed precepts and guidance for the design, development and review of all new courses at the University, contains UoP’s policy on assessment. The policy’s authors made a conscious choice to call it an Assessment For Learning Policy: the policy advocates assessment for learning rather than assessment of learning. As the policy states, assessment for learning enables a culture in which: 

  • students receive feedback from academics and peers that helps them to improve their work prior to final/summative assessments; 
  • students understand what successful work looks like for each task they are doing; 
  • students become more independent in their learning, taking part in peer and self-assessment; 
  • formative assessment is, where possible, aligned to the module summative assessment, in order to facilitate cyclical feedback opportunities which will clarify expectations and standards for the summative assignment (e.g. the student’s exam or portfolio submission).

As the University considers how to implement its new five-year strategy, however, and how to meet its ambitious vision for 2030, might we need to rethink assessment? Not rethink the approach of assessing for learning, but look again at some of the details of how we assess?

The changing nature of assessment over the coming five-year period happens to be the subject of a recent publication from JISC: The Future of Assessment: Five Principles, Five Targets for 2025. This report, the output of a day-long meeting held in 2019, identifies five key aspects of assessment and the role that technology can play. The report argues that assessment should be (in alphabetical order, not order of importance):

  • Accessible – taking an inclusive approach to assessment is the ethical thing to do, of course, but we now have a legal requirement to meet certain accessibility standards. Digital technology can certainly help with accessibility. Contact DCQE if you would like further advice in this area. 
  • Appropriately automated – it hardly needs to be said that marking and feedback, although crucial elements of the assessment process, are time-consuming. Technology can help here too: it can automate the process and, if the assessment has been properly designed, give students the benefit of immediate feedback. Technology might also be used to improve the quality of feedback: in this regard, TEL is currently exploring the Edword platform.
  • Authentic – this is, I believe, a key area for the University to develop. How does it benefit students to make them sit down for three hours and hand write an essay under exam conditions? This doesn’t prepare them for the world beyond university. Surely it’s better to assess students’ ability to work in teams; display their knowledge in a realistic setting; use the digital skills they will undoubtedly need in the workplace?  
  • Continuous – in order to be successful in their chosen careers, our students will need to keep up with changes wrought by technology. So perhaps the most important skill we can teach our students is how to be independent, self-directed learners. An over-reliance on high-stakes, summative exams does not help. Of particular interest to me, in the JISC report, was the mention of using AI to personalise learning and assessment: the technology is not there yet, but it might come in the next few years. 
  • Secure – if we are going to assess a student then we need to know we are assessing the right student! For a long time the focus in HE has been on detecting and deterring plagiarism. Nowadays, though, we also face the threat of essay mills and contract cheating. Once again technology can play a role: data forensics, stylistic analysis tools and online proctoring platforms can help tackle the problem. Such tools are best used, however, in a culture that promotes academic integrity: we should use technology to help promote a sense of academic community rather than to “catch the bad guys”.

The five principles identified by the JISC working group seem to me to be realistic and practical. They are also, if I’m being honest, slightly unambitious. I think mixed-reality technology, for example, opens up many opportunities to develop assessment for learning. But perhaps that is more for a 2030 vision than a 2025 strategy.   

Credit Image: Needpix.com

Engaging students with online assessment feedback

An Exploration Project

Technology Enhanced Learning and Academic Development are leading an exploration project centred around engaging students with online assessment feedback. We’re specifically exploring an assessment platform called Edword.

It’s worth mentioning that we’re taking a more scientific approach to this project; you could almost imagine it as an education lab experiment.

Academics and educational technologists within our team have evaluated the functionality and advanced workflows that Edword offers, and we think it offers some real, tangible benefits to students and staff. The platform has been designed on pedagogically sound principles – that’s really what’s most exciting about it. I’ll demonstrate some examples of these in action later in this post.

It’s not enough that we’re excited about a new assessment tool though. We need to explore and test whether our students and staff actually do experience a benefit from using Edword when compared to one of our existing assessment platforms such as Turnitin or the Moodle assignment.

In order for me to explain what Edword allows us to do, I need to explain what’s missing from our existing assessment systems. 

Current feedback workflow

Turnitin / Moodle assignment

Assessment graded → student sees grade → end of workflow

When an online assignment is handed back to a student via Moodle or Turnitin, students see their grade immediately, before they’ve had a chance to read any inline or summary feedback added by their lecturer. The grade is often seen by students as the end point of their assessment, when really it is their entry point to the next stage of their course. What we actually want students to engage with is the meaningful and constructive feedback their academics have produced for them, which will help them improve their next piece of work. Unfortunately, many students don’t read their assessment feedback and miss out on its benefits.

Edword has a ‘lock grade’ feature, which means students can’t see their grade until after they’ve read their feedback and, potentially, also submitted a reflection on how they will put the feedback into practice. In this way, Edword supports the feed-forward model of good academic practice.

The Edword workflow looks more like this:

Edword workflow

Assignment is graded → student reads feedback → student writes reflection on feedback → student sees grade → student improves on next assignment
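Edword’s internals aren’t public, so this is just a sketch of the gating logic that workflow implies – the grade stays locked until the student has read their feedback and submitted a reflection on it:

```python
# Sketch only: Edword's internals aren't public. This models the gating
# the workflow implies: the grade stays locked until the student has
# read their feedback and submitted a reflection on it.
from dataclasses import dataclass

@dataclass
class GradedSubmission:
    grade: float
    feedback: str
    feedback_read: bool = False
    reflection: str = ""

    def read_feedback(self) -> str:
        self.feedback_read = True
        return self.feedback

    def submit_reflection(self, text: str) -> None:
        self.reflection = text

    def reveal_grade(self) -> float:
        if not (self.feedback_read and self.reflection):
            raise PermissionError("Read your feedback and reflect on it first.")
        return self.grade

sub = GradedSubmission(grade=68.0, feedback="Strong argument; cite more sources.")
sub.read_feedback()
sub.submit_reflection("Next time I will broaden my literature search.")
print(sub.reveal_grade())   # 68.0 - unlocked only after engaging with feedback
```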

We also hope the feedback provided within Edword will be more engaging. Academics can enrich inline feedback with learning objects such as videos or H5P interactive activities. Rather than the flat, text-based feedback comments within Turnitin and Moodle, feedback in Edword helps students understand the mistakes they are making, along with an immediate way to re-test their knowledge. The platform supports assessment for learning concepts.

 

[Image: an H5P interactive learning object embedded into assessment feedback in Edword]

Edword records how long a student spends engaging with their feedback and allows students to rate the usefulness of the feedback they receive. These metrics are presented to staff as a way to evaluate how engaged students are and which feedback comments could be improved from a student perspective. 

We will make Edword available to staff and students during teaching block two with an on-boarding event for staff happening in early February. If you would like to take part in the project or ask some questions, please get in contact:

Mike Wilson

Ext. 3194

michael.wilson@port.ac.uk

A video introduction to Edword can be found here.
