Adventures in Technology Enhanced Learning @ UoP

Tag: online assessment

WiseFlow – Looking in the mirror with reflective portfolios in WiseFlow

Hey there, fellow exhausted souls!

Can you believe it? We’re finally coming towards the end of the academic year, and boy, has it been a fun ride!  Our WiseFlow pilot has gone from strength to strength as we support academics through a new assessment process.  More importantly, we have successfully run two separate assessments using our innovative approach of using WiseFlow as a reflective portfolio – the first use case of this we know about!  We’ve grown, learned, and potentially discovered an exciting prospect for the future of reflective portfolios at Portsmouth University, so let’s take a moment to reflect on the journey we’ve been on. 

You may have read our previous blog post on “Unlocking the power of WiseFlow: Transforming ePortfolio assessments”, where we discussed the possibilities of using WiseFlow as a viable reflective portfolio platform and the benefits a reflective portfolio approach brings.  For students, this helps develop their metacognitive skills and self-awareness as learners over a period of time.  Academics, on the other hand, can use reflective portfolios to assess students’ learning outcomes in a more comprehensive and authentic manner.  This is all part of our wider WiseFlow pilot to provide one integrated assessment platform that serves our current (and future) assessment needs within Portsmouth University, which Mike Wilson spoke to us about recently on our podcast – you can listen here.

Teach Well and Research-Informed Teaching

This year we ran two reflective portfolios within WiseFlow as part of our pilot project – to test the water and find out if this was even possible. The first was within our Research Informed Teaching module, which supports early career academics to apply their learning in educational enhancements to their own contexts, through reflection and innovation.  Students draw together higher education policy, research methods and educational developments to build their knowledge for their future work.  Secondly, we ran a reflective portfolio in our new level seven Teach Well: Principles to Practice module, which is a professional development route for those in roles related to supporting student learning. Students in this module embark on a pedagogical journey through three pillars of practice for teaching well in higher education, gaining the confidence to critically evaluate learning and design approaches and reflecting on what it means to teach well across different modes of study.  We recently caught up with Maria Hutchinson, who runs this module, in our podcast series – if you missed this one, you can listen here.

We’ve worked closely with these academics and our support teams to develop reflective portfolios for these modules that can be used as a summative assessment vehicle which is both intuitive for learners and versatile enough to encompass a broad range of tools which enable the course learning outcomes to be demonstrated in an engaging and meaningful way.

What the students said…

Following the submission of reflective portfolios into WiseFlow, we sent out a survey to participants to gain their feedback and views.  Some of the headline figures are detailed below…

  • 90% of students found the WiseFlow reflective portfolio easy to navigate
  • 90% of students agreed that a reflective portfolio suited this type of assessment (compared with traditional essay-based assessment methods)
  • 82% of students felt their own students would enjoy using a reflective portfolio in WiseFlow
  • 71% of students enjoyed the interactive assessment methods, such as histograms, voice recorders etc. 
  • We received multiple comments about the clear instructions that were given on how to access and use WiseFlow, as well as its reliability and stability as a platform.  Many users also commented positively on the functionality that WiseFlow offered compared to previously used portfolio solutions. 

Students also commented on…

  • If there was a need to add another system to Portsmouth University’s available assessment platforms – “There are too many platforms for submitting the work, Moodle, ePortfolio, WiseFlow, it is really confusing and frustrating that is necessary to learn how to use different platforms for different modules.”
  • The lack of formatting transfer from applications such as Word, when copying and pasting into WiseFlow – “Transfer of formatted MS Word document to WiseFlow could be improved. Currently, the document format is lost during the cut & paste process which then requires more effort to re-format within the WiseFlow portal.”
  • Better integration with Moodle and WiseFlow – “I’d like to see direct access from Moodle”. 

The data presented highlights the positive reception of WiseFlow as a reflective portfolio solution by students. It is a really positive sign that such a high percentage of students recognised the suitability of a reflective portfolio as an assessment method, compared with traditional essay-based approaches, and praised its usability. The positive feedback on the interactive assessment methods further emphasises the adaptability of the question bank in a traditional FlowMulti assessment to be used in an innovative way. 

However, some concerns were raised by students, such as the frustration of managing multiple assessment platforms at the university, indicating a need for better integration. This all links to our Digital Success Plan to (re)design robust assessments to meet the needs of the diverse student population within a blended and connected setting and incorporate a robust specialist end-to-end assessment platform. Our aims in the project were to make it easier for academics to design assessments, easier for students to find their assessments and feedback, and to reduce the manual work around assessments for academics and support staff.  During the next stage of the pilot project, integration with our current systems is a top priority and will alleviate these challenges.  Furthermore, the lack of formatting transfer from applications like Word to WiseFlow was highlighted as an area for improvement. These critical comments provide valuable insights for further refining and optimising the WiseFlow system.

The evidence is clear to see – WiseFlow can provide a viable solution for reflective portfolios, and with a bit of refinement, it could be excellent. 

What the staff said…

It was also vital to us that we gathered feedback from our academic staff.  

  • 100% of staff agreed that WiseFlow allowed them to develop their assessment in ways that were not previously possible
  • All staff agreed the WiseFlow reflective portfolio allowed them to fully cover learning objectives and meet the needs of their students
  • We received multiple comments about the speed of the platform, its intuitive nature and its search functionality, which made the verification/moderation process seamless.  Staff also commended the accuracy of the rubrics for grading and how new interactive elements made them rethink how they could better use this type of functionality in the future.

Staff also commented on…

  • Comparisons to previously used portfolio platforms – “Historically the module used [another portfolio system] which was really clunky and didn’t work well at all. I really liked that Wiseflow could be scrolled across (as opposed to clicking through each page) and the layout was great”
  • Design elements within the marking interface – “It would have been useful to have had the comment box movable (I work with two screens and being able to drag the box to another screen to write on would have been a nice touch – several times I had to keep opening and closing the box as I wasn’t able to see the text underneath it)”
  • Having more time to explore the platform – “I did not feel I had enough time to play before it went live for students, but this was not WISEflow’s fault – it was just timing”. 

Honestly, we’ve been blown away by our staff feedback.  The unanimous agreement that WiseFlow enables new possibilities for assessment development speaks very highly of this solution and its potential to enhance the teaching and learning experience for students at Portsmouth University.  The potential to create authentic assessments through the use of reflective portfolios is exciting.  The accuracy of the grading rubrics was also very highly commended – giving students a greater chance of achieving a clear and defined target, and making academic decision-making easier, fairer and more accurate.  In terms of developmental areas, the movement of the comment box is a fair point – we’ve heard from other academics about the size of the comment box before – hopefully something that WiseFlow’s New Marker Journey will alleviate. 

Where do we go from here?

As we raised in our first blog post, the reflective portfolio solution in WiseFlow is far from perfect, but with a few simple tweaks it could become very appealing. Sadly, some of these are out of our hands and lie within the code of the platform.  We’ve learnt a lot during the duration of this assessment as a project team, including developmental areas we have highlighted for the future.  

The single biggest limiting factor when using a reflective portfolio is the file upload question type.  This is limited to twelve files of no more than 10MB each – multiple file upload questions can be used, but each will still have these limits.  We have approached WiseFlow about this for development purposes; however, we have yet to see any significant movement on removing the limit.  Removing it would put WiseFlow in an incredibly powerful position to offer another “string to their bow” in terms of assessment choice, and would truly open up the use of reflective portfolios within the platform.  Sadly, with this limit in place, using reflective portfolios with some faculties, such as our Creative and Cultural Industries Faculty (where students regularly upload large .psd files, CAD files, HD video, high-quality audio and so on), is just not a viable option.  Creative students will often build a “portfolio career” and we would love to work with them on developing reflective portfolios, but this limit stops us.  Until it is removed, careful consideration must be taken at the planning stage of an assessment as to whether the reflective portfolio is the correct solution.

Further to this, other limitations must be considered – for example, once the reflective portfolio is live for students to complete, it cannot be altered, changed or adapted.  During the pilot, we’ve worked extensively with academics and our support teams to iron out any issues prior to release. Careful planning and consideration must take place in the authoring phase of an assignment, which is then rigorously checked prior to release – in the same way an exam would be.  This has worked at a small scale, but we would need to ensure appropriate support mechanisms are in place at a larger scale.  
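To make that planning-stage check concrete, here is a minimal sketch of how a course team might sanity-check a planned submission against the limits described above (12 files, 10MB each). The file names and sizes are purely illustrative, not real module data:

```python
# Hypothetical planning-stage check against the upload limits described
# above (12 files per upload question, 10 MB per file).
MAX_FILES_PER_QUESTION = 12
MAX_FILE_SIZE_MB = 10

def portfolio_fits_limits(planned_files):
    """Return (ok, reasons) for a list of (name, size_mb) tuples."""
    reasons = []
    if len(planned_files) > MAX_FILES_PER_QUESTION:
        reasons.append(
            f"{len(planned_files)} files exceeds the "
            f"{MAX_FILES_PER_QUESTION}-file limit"
        )
    for name, size_mb in planned_files:
        if size_mb > MAX_FILE_SIZE_MB:
            reasons.append(
                f"{name} ({size_mb} MB) exceeds the "
                f"{MAX_FILE_SIZE_MB} MB per-file limit"
            )
    return (not reasons, reasons)

# Illustrative creative-arts submission plan: two files blow the limit
plan = [("reflection.pdf", 2), ("teaching_clip.mp4", 480), ("artwork.psd", 150)]
ok, reasons = portfolio_fits_limits(plan)
```

A check like this makes the point quickly: typical creative media files fail the per-file limit outright, which is exactly why the reflective portfolio isn’t yet viable for those disciplines.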

Our student feedback gave us valuable insight into the process of using WiseFlow.  Although reflective portfolios save every 10 seconds, if a student deletes a file or a piece of text and exits the platform, it cannot be recovered.  Over the duration of the assessments, we encountered one reported instance of this. We also had some reports of formatting that would not copy across from Word documents.  Again, we approached WiseFlow regarding this, and the recommendation is to copy and paste plain text from Word and finish the styling in WiseFlow’s text editor.  Although this works, having formatting that copies across would make students’ work translate much more easily – particularly for those who write in external documents before copying into the platform at the last minute (like myself). In terms of progression beyond WiseFlow, we’d love for students to be able to take their work from the platform and store it themselves or share it beyond WiseFlow.  Currently, there is no solution to this.  A “zip folder” containing exports of all inputted answers and uploaded files would be a great starting point.  Again, we’ve put the idea forward to WiseFlow, but have yet to see any movement on it.  
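The “zip folder” export we’re suggesting could work something like the sketch below. To be clear, WiseFlow offers no such feature today – the function, folder layout and data shapes here are all invented to illustrate the idea:

```python
# Sketch of the suggested "zip folder" portfolio export.
# Everything here is hypothetical: WiseFlow has no such export today.
import zipfile
from pathlib import Path

def export_portfolio(answers, uploads, out_path):
    """Bundle text answers and uploaded files into a single zip.

    answers: dict mapping question title -> answer text
    uploads: iterable of paths to files the student uploaded
    """
    with zipfile.ZipFile(out_path, "w", zipfile.ZIP_DEFLATED) as zf:
        # Each written answer becomes a small text file
        for title, text in answers.items():
            zf.writestr(f"answers/{title}.txt", text)
        # Uploaded evidence files are copied in under uploads/
        for f in uploads:
            zf.write(f, arcname=f"uploads/{Path(f).name}")
```

The attraction of this shape is that the student leaves with one self-contained archive they can store, share or re-host anywhere, with no dependency on the platform.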

Where do we take our pilot now?

Although there are risks with using a reflective portfolio solution in WiseFlow, the prospect and potential gain of this authentic assessment are exciting.  We’ve taken the plunge and proven the concept works, highlighting potential development areas which we really hope gain some traction – and we’d like to think WiseFlow will be open to listening to these ideas.  As for our pilot project as a whole, we move into a second phase with a continued focus on reflective portfolios, but also some other areas of assessment we have struggled with in the past, such as large file submissions.  We are actively developing a plethora of training and support, and working with local teams to ensure staff feel confident using the systems.  

We continue to have a waiting list of academics who want to work with us to develop reflective portfolios in WiseFlow. I find myself meeting with academics on a weekly basis to discuss potential projects and reflective portfolio solutions in their disciplines.  So far, we’ve done no real advertising – this interest has come from word of mouth and from those who have used it as students. We are keen to share our experiences with other universities in WiseFlow user groups, who are actively keen to explore this and want to learn about our innovative approach. However, we need to be open and honest about the limitations that this solution has at the moment. Collectively, we might hold enough power to make change happen, but until that point, caution must be taken before embarking on a reflective portfolio to ensure it is the correct fit for the assessment.

The potential of this solution is game-changing, not just for us, but for a lot of other Higher Education institutions across the world.

The future of reflective portfolios in WiseFlow is exciting – keep watching this space.  

Chris

Image credit: Photo by MidJourney 

WiseFlow ePortfolio – Unlocking the power of WiseFlow: Transforming ePortfolio assessments

In the digital age, traditional paper-based portfolios have given way to ePortfolios, harnessing a powerful way to showcase a student’s work that demonstrates their learning, progress, reflections and achievements, over a period of time. ePortfolios are increasingly becoming popular in education as they offer several benefits to both students and academics.

For students, ePortfolios provide an adaptable platform to showcase their learning journey, including their best work and reflections on when it didn’t go quite to plan, and draw on evidence from a range of sources whether that be PDFs, images, videos, audio snippets or written text. This process helps students develop their metacognitive skills and self-awareness as learners over a period of time.  Academics, on the other hand, can use ePortfolios to assess students’ learning outcomes in a more comprehensive and authentic manner. In turn, this allows academics to gain insights into students’ thought processes, identify their strengths and weaknesses, and provide targeted feedback. Additionally, ePortfolios allow academics to track students’ progress and provide evidence of their achievements.

Using ePortfolios also builds several skills, including digital literacy, communication and critical thinking – all of which are vital in the modern workplace. Students have to select, curate, and present their work in a clear and engaging manner. They also have to reflect on their learning process and map this to learning outcomes. These skills are crucial for success in the modern workplace, where digital communication and collaboration are essential. With a background in teaching vocational courses for 12 years at Further Education level, I’ve seen first-hand the impact and outcomes of effective ePortfolio use for both students and academics. 

At Portsmouth University, we have struggled to find a solid ePortfolio solution. We currently use a popular open-source ePortfolio platform that allows students to create and share their digital portfolios. While the platform has several benefits, including flexibility, customizability, and integration with other systems, it also faces some challenges. One major issue is its user interface, which can be overwhelming and confusing for some users – particularly in the setup stage of having to import the portfolio into your own profile. This process often leads to a lot of technical issues and puts up an immediate barrier to entry for those not tech-savvy. Additionally, the learning curve for using the platform can be steep, and it may take some time for users to become familiar with all the features and functionalities. However, despite these challenges, academics and students value the use of the ePortfolio system on offer and the benefits this provides.  

We are currently coming towards the end of the first stage of a pilot with a new system: WiseFlow. This is a cloud-based digital end-to-end exam and assessment platform that supports the assessment and feedback lifecycle for students, assessors and administrators. It’s fair to say that staff feedback about the WiseFlow pilot has been overwhelmingly positive. As a core project team, we’ve had the pleasure of working with academic teams to support students with innovative assessments in WiseFlow, across a range of disciplines. This all links to our Digital Success Plan to (re)design robust assessments to meet the needs of the diverse student population within a blended and connected setting and incorporate a robust specialist end-to-end assessment platform. Our aims in the project were to make it easier for academics to design assessments, easier for students to find their assessments and feedback, and to reduce the manual work around assessments for academics and support staff.  All of which WiseFlow seems to have delivered. 

Within the pilot, we wanted to really push the boundaries of WiseFlow – utilising a wide range of assessment types to really test if WiseFlow can become the go-to platform for assessments at Portsmouth University.  One of the big challenges for us was to find an ePortfolio solution that is user-friendly and adaptable across a range of disciplines, as well as providing a versatile feedback loop where students could receive formative feedback on their work from assessors and develop ideas prior to final submission. After we put this challenge to the team at WiseFlow, they came back with a solution.

Image: block arrows showing the timeline of a flow ePortfolio for Online Course Developers, the eLearn Team, students and academics.

Traditionally, a FlowMulti (just one of the many ‘flow types’ WiseFlow offers for assessment) would be used for open/closed book multiple-choice exams, where the participants fill out a provided multiple-choice test.  However, the team at WiseFlow suggested we could utilise this functionality to use as a bespoke ePortfolio solution.

Using a FlowMulti allowed us to replicate the layout and design of current ePortfolios as well as allow us to adapt the setup to truly take ePortfolios to the next level. To create the feedback loop, we allowed assessors early access to the work, early release of feedback to students, and students to submit unlimited times before the deadline.  The portfolios could be easily updated year-on-year, were inviting for students to engage with, and could be authored by multiple academics at the same time. This seemed like the perfect solution.  

After testing, adapting and re-testing, we felt this solution offered a totally new level of ePortfolio compared to our current offering. The ability to re-purpose traditional multiple-choice questions allowed us to push the boundaries of assessment like never before. The only limitation is our own creativity to adapt and repurpose these. We put together a showcase PGCert portfolio to share our findings with academics; they immediately fell in love with the platform, and we started working together to develop a portfolio to run within the pilot.

“As a course team, we are incredibly excited about the flexibility that the Wiseflow ePortfolio has to offer. Working with the project team we have been able to design a summative assessment vehicle which is both intuitive for learners and versatile enough to encompass a broad range of tools which enable the course Learning Outcomes to be demonstrated in an engaging and meaningful way.”  Dr Joanne Brindley, Academic Practice Lead & Senior Lecturer in Higher Education.

Images: screenshots of the pages participants see – one asking them to reflect on their skills, one showing an empty drop box for uploading their activity, and one asking them to type a reflective statement on using technology to support learning.

We are now in the “participation” phase of two ePortfolios – one for the Research Informed Teaching module and one for the new Level 7 Teach Well: Principles to Practice professional development module.  We have had great experiences re-designing pre-existing portfolios to really push the boundaries of what is possible in WiseFlow. We’ve added interactive elements by turning traditional questions and approaches on their head – such as using a histogram for reflection, allowing students to visually reflect on skillsets pre- and post-observation. We’ve given students freedom of choice in assessment by integrating a voice recorder into the portfolio, and by utilising existing platforms that integrate with the WiseFlow portfolio. Really, the only limitation is our own imagination.  

“We teach the PG Cert Higher Education so our students are staff. The platform is incredibly user-friendly for both staff and students. We used it for ePortfolio as the last platform created lots of complaints, whereas this platform has led to lots of compliments.  The staff members spoke highly of the platform and I believe, many have asked to be part of the pilot next year due to their positive experience.”  Tom Lowe, Senior Lecturer in Higher Education

There has been overwhelmingly positive feedback from academics and students regarding the usability and functionality of WiseFlow as an ePortfolio solution.  Through word of mouth and first-hand experiences from early career academics, particularly those studying on the Research Informed Teaching module, the platform’s potential in enhancing their own teaching has become widely recognised.  I remember being invited to one of Tom’s lectures to showcase the platform to his students who would be using it, and the response was overwhelming. Staff were excited to use this as students and saw the immediate potential for their own teaching. It is always a good sign of a new innovation when there is an immediate benefit to both staff and students that can be applied instantly in the classroom. Essentially, we now have a waiting list of academics who want to work with us to develop ePortfolios in WiseFlow – with no advertising at all, purely from those who have used it as students. We believe that when this is advertised, we will see a huge influx of academics wanting to use it. We have also spoken to other universities in WiseFlow user groups, who are actively keen to explore this and want to learn about our innovative approach. The potential of this solution is game-changing, not just for us, but for other Higher Education institutions. 

However, using an innovative approach and essentially turning a quiz assignment on its head does not come without drawbacks that need to be considered before academics embark on an ePortfolio solution within WiseFlow.  There is currently a 12-file limit, set at 10MB per file, when students upload files into the portfolio. Although it is great that students can do this, it does not lend itself to modern file sizes or some of our subject areas (for example, our Creative and Cultural Industries Faculty, where students regularly upload large .psd files, CAD files, HD video, high-quality audio and so on). In our initial pilot, we haven’t encountered the file size issue – but it’s worth considering whether this is the correct way to proceed with an assessment. The limit on the number of files is also a concern: some students in our pilot have reached the 12-file upload limit. There are workarounds, such as storing files in a Google Drive folder and sharing the link, or combining multiple files into one, but these defeat the purpose of an ePortfolio as an all-encompassing system. Perhaps a better approach would be an overall upload limit with a defined combined file size.  The final consideration is that once the ePortfolio is live, we cannot make changes.  We’ve worked extensively with academics and our support teams to iron out any issues prior to release, but again, this is important for academics to understand. Careful planning and consideration must take place in the authoring phase of an assignment, which is then rigorously checked prior to release – in the same way an exam would be. Despite these drawbacks, we’re actively in discussions with WiseFlow about developing this and hope to make progress in the near future. 

The future of ePortfolios in WiseFlow is exciting, and we can’t wait to see how they will continue to be developed across the University. The ability to adapt and transform ePortfolios will open up new doors for our students and academics to really develop the ways in which students can showcase their knowledge and understanding. We’re hoping for a successful run of ePortfolio use within our pilot and looking forward to developing new ideas as we move into the future.  

Until next time. Watch this space.

Chris.

Similarity scoring is a secondary consideration for online assessment…

Similarity scoring should be a secondary consideration for online assessment. Much more important factors, from my point of view, are ease of marking for academics; access to quality feedback for students; and innovative authentic assessment workflows.

Turnitin are close to monopolising the market on similarity scoring of student papers, but many assessment platforms already use Turnitin and Urkund as plugin services to provide similarity scoring.

Where should we be focusing our effort at UoP?

As an institution, one of our strengths lies in quiz/question-based assessments. This is particularly the case in the Science and Technology faculties. We have a mature, sophisticated platform in Moodle to deliver these types of assessments and a deep level of staff expertise across the organisation, which has developed further throughout the pandemic.

The risk factors for UoP include a need to increase capacity for online exams (or diversify some of our assessment types onto an external platform at peak periods) and the ability to be able to innovate in terms of essay/file-based assessments.

From what I can see, Turnitin has stagnated in terms of assessment innovations in recent years and has not yet improved service reliability at key assessment periods by migrating their platforms to a service like AWS. This has been promised repeatedly but not yet delivered.

This is potentially a reason why we saw growth in Moodle assignment and quiz usage during the pandemic rather than a big increase in Turnitin usage (trust in the reliability of the service and flexibility of the functionality).

So where could we focus our effort to improve the assessment tools for educators and students to gain the most benefits?

Innovative assessment workflows

Posing a long-form question to a student and easily marking the finished product should be a simple process – and it is on platforms such as Turnitin. However, we are increasingly adapting our assessments to be more authentic: assessments that more closely match how students will operate in the workplace. This often requires more sophisticated workflows and mechanisms, which should still be straightforward for academics to engage with and make sense of if they are to be successful. 

Traditional paper-based exams (potentially bring your own device)

During the pandemic, staff were forced to transition away from paper-based exams. Many exams were instead delivered as coursework or window assignments (e.g. a 2hr assignment within a 24hr window) or as question-based quiz exams. When exam halls are available again, staff may revert to previous paper-based solutions. After all, we know how these work, and paper doesn’t need charging or a stable wifi connection. However, we can harness this forward momentum with a platform dedicated to supporting timed essay assignments on students’ own devices or University machines. Several platforms offer functionality for students to download assignments at the start of an exam with no need for an internet connection until it’s time to submit at the end. This could represent a robust, safe exam experience that more closely matches how students study today. Who handwrites for three hours any more? I’d be willing to bet most students don’t.

There are challenges with BYOD (bring your own device) particularly around charging and ensuring student machines are reliable. Many of these challenges can be solved with a small stock of fully charged devices, which can be swapped out to students when needed. Chromebooks are ideal online exam devices for this very reason, due to their long battery life and simple configuration. 

Assessment feedback

Workflows such as “feedback before grades” can help students better engage with their feedback, but better access to feedback for students in a variety of places is also key.

Services that offer a holistic view of assessment feedback, or the ability to extract these comments via API so we can build our own views, are increasingly valuable. This functionality will enable key staff such as personal tutors or learning support tutors to view student feedback as a whole (rather than in silos) to spot key areas to help students improve their academic work.
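As a rough illustration of the holistic view described above, here is a small sketch that groups a student’s feedback comments by module so a personal tutor could read them in one place. The record structure and module codes are invented for this sketch – a real assessment platform API would define its own schema:

```python
# Sketch: aggregate a student's feedback comments across modules so a
# personal tutor can read them in one place, rather than in silos.
# The record shape and module codes below are hypothetical.
from collections import defaultdict

def feedback_by_module(comments):
    """Group raw feedback records into a module -> comments view."""
    view = defaultdict(list)
    for c in comments:
        view[c["module"]].append(f'{c["assignment"]}: {c["comment"]}')
    return dict(view)

# Invented example records, as if fetched from an assessment platform API
records = [
    {"module": "U12345", "assignment": "Essay 1",
     "comment": "Strong argument; cite more widely."},
    {"module": "U12345", "assignment": "Exam",
     "comment": "Good method, check units."},
    {"module": "U67890", "assignment": "Portfolio",
     "comment": "Reflections are developing well."},
]
tutor_view = feedback_by_module(records)
```

Even a simple grouping like this makes recurring themes visible across modules – which is the point of extracting comments via API rather than leaving them locked inside each assignment.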

To round out where I started with this post, providing similarity checking is an important part of modern assessment – but it is a problem that has already been solved, multiple times.

If we make assessment more authentic, more flexible and more collaborative there will be less need for plagiarism detection because students will be demonstrating more of the attributes we want them to leave University with. I accept this is perhaps an overly idealistic viewpoint, as there are a lot of students to assess each year, but this is more reason to explore flexible assessment solutions that can make the lives of academics and students a bit easier.

Online assessment in the time of Covid

In pre-Covid times, exams delivered via Moodle were limited by the availability of suitable physical spaces. Exam rooms represented a bottleneck to the number of students taking exams concurrently.

For the last year, we’ve used Moodle (and integrated platforms) to deliver the majority of our teaching and assessment online.

A visualisation of the online assessment mix at the University of Portsmouth:

Image: diagram of how assignments and exams overlap during the assessment period.

In May 2020 many academics who had previously planned to deliver paper-based exams had to quickly adapt and deliver online assessments. In some cases, these required students to scan or take pictures of their work and upload these to assignments (Moodle or Turnitin) for marking. 

In recent months, newer platforms have been developed to handle this workflow and ease the marking burden for academics – platforms such as Turnitin's Gradescope and CrowdMark. These platforms leverage the similarities in students' answers so academics can mark many answers at once. When time allows, we hope to evaluate these platforms in more detail.
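The core idea behind those platforms can be illustrated with a toy sketch: identical (or normalised) short answers get bucketed together, so one marking decision applies to the whole group. Real platforms use far more sophisticated matching, including handwriting recognition; this is just the concept, with made-up answers.

```python
# A toy illustration of answer-grouping: normalise each short answer and
# bucket identical ones together, so the marker grades each bucket once.
from collections import defaultdict

# Hypothetical short answers to "What is the acceleration due to gravity?"
answers = {
    "s1": "9.81 m/s^2",
    "s2": "9.81 m/s^2 ",
    "s3": "g = 9.81 m/s^2",
    "s4": "9.8",
}

def group_answers(submissions):
    """Bucket answers that are identical after trivial normalisation."""
    groups = defaultdict(list)
    for student, answer in submissions.items():
        groups[answer.strip().lower()].append(student)
    return groups

for answer, students in group_answers(answers).items():
    print(f"{answer!r}: mark once, apply to {len(students)} student(s)")
```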

In the diagram above you can see “Assignments under exam conditions” as the meeting point between traditional essays and restricted online exams. This year we have seen a big growth in this area as academics move from paper-based written exams to time-restricted assignments. An obvious caveat here is that these haven’t been conducted under true exam conditions and so are best described as open book exams. Many digital assessment platforms now include various types of proctoring and would be able to handle remote time-restricted essays (and other assessment types) securely. There are, however, a number of ethical issues to be considered with online proctoring, and we need to proceed cautiously here. 

As a University, I feel we should also be looking to expand our capacity for online assessment: over the next decade we will probably see the end of paper-based exams in favour of typed essay papers delivered online, driven in part by student expectations.

Academics have had a year to adapt to exams in lockdown, and many have discovered the benefits of Moodle quizzes, which offer automatic marking. (Moodle is also excellent at delivering scientific and mathematical exam questions, as well as longer coursework assignment submissions.) Generally speaking, the Technology and the Science and Health faculties deliver the majority of our Moodle quiz-based exams, and the number of exams has grown significantly during lockdown. Many academics don't want to go back to paper.

In Technology Enhanced Learning we oversee online exams and assessments in terms of supporting and evaluating the digital tools and making sure Moodle can handle the number of exams thrown at it. The number of online exams has increased substantially over the last year, all funnelled into two exam windows. As a team we work closely with colleagues in IS to provide more capacity in Moodle and with timetabling to ensure the exams are evenly distributed to avoid terminal peaks of concurrent users, providing a stable Moodle platform for all users.

Without the bottleneck of physical exam rooms, the January 2021 exams were initially weighted in favour of academic requests to have exams earlier in the day and to use only the first week of the exam window, maximising available marking time. Unfortunately, this translated into a scenario that would have produced a significant number of terminal peaks of concurrent users on Moodle. Members of TEL worked closely with the central timetabling unit to level out these peaks and, with the exception of one or two slow points, we delivered a successful exam window in January.

In advance of the May/June exams, we have gone further and set hard parameters around how many exams (quizzes) or timed assignments (Turnitin or Moodle assignments) can be timetabled in any given time slot. We’d like to thank CTU for their tireless effort to make this happen. It wasn’t an easy job to manage all the necessary requirements but it’s given us an exam timetable that looks like the image below. This really is invaluable work to the University when assessment represents so much effort by students, academics and support staff.

A screenshot of a week of the exam timetable, showing days, dates and sessions, split into assignments and exams, with the total number of students expected to be in Moodle during each period
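The capacity check behind that timetable can be sketched in a few lines of Python. The exam data, slot granularity and cap below are illustrative, not real Portsmouth figures: given scheduled exams (start slot, length in slots, student count), sum the concurrent Moodle users per slot and flag any slot that breaches the cap.

```python
# A rough sketch of levelling an exam timetable: compute concurrent users
# per slot and flag slots over a capacity cap. All numbers are illustrative.
exams = [
    {"name": "BIO101", "start": 0, "slots": 2, "students": 300},
    {"name": "ENG202", "start": 1, "slots": 1, "students": 450},
    {"name": "MAT150", "start": 1, "slots": 2, "students": 400},
]
CAP = 1000  # assumed maximum concurrent users the platform handles comfortably

def concurrency(schedule, n_slots):
    """Return total concurrent students for each timetable slot."""
    load = [0] * n_slots
    for exam in schedule:
        for slot in range(exam["start"], exam["start"] + exam["slots"]):
            load[slot] += exam["students"]
    return load

load = concurrency(exams, 3)
for slot, users in enumerate(load):
    flag = "OVER CAP" if users > CAP else "ok"
    print(f"slot {slot}: {users} concurrent users ({flag})")
```

In this toy example slot 1 breaches the cap, so one of the three exams would need to move – exactly the kind of levelling the timetabling work above achieved.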

Our increasing reliance on online assessment means, I think, that we should investigate new technologies to support that function. Platforms such as Gradescope or CrowdMark could help relieve the marking burden; one of the many platforms such as Wiseflow or Mettl or Inspera could provide extra exam capacity (with the functionality to proctor exams if that was something the University wanted to do). Moodle, with its advanced quiz and assignment capabilities, would continue to play a key role.

I believe we will get through this coming assessment period well, but as our reliance on online assessment grows, so must the technologies that support it.

For the University, the Covid-19 pandemic has been a driver for the uptake of online learning and assessment. As a University community, we need to harness this positive momentum and diversify our offering of assessment platforms to support students and staff.

Image credit: Photo by MayoFi on Unsplash

Situational judgement assessment

So I think it is fair to say that Covid-19 has forced us all to think outside the box when the usual, more 'normal', types of course assessment delivery aren't possible. This couldn't be more true for the academics on our Paramedic Cert HE course.

A few weeks ago, I was asked to support a colleague within this team to help deliver an online assessment that would replicate what would have been a practical examination on adult and paediatric resuscitation. 

With COVID not going away anytime soon, and practical assessment dates looming, we had a very short space of time to develop an online assessment that would best simulate ‘hands-on’ resuscitation scenarios.

As a response to the need to deliver an alternative assessment, Jane Reid – the Course Leader for the Paramedic Cert HE course – wrote four situational judgement assessments; each scenario had sequential situations, and each situation had a series of situational judgement questions to test a student's knowledge of the actions to take within the ever-changing scenario.

Situational judgement assessment has been used in healthcare for years. It allows participants to experience as close to real-life scenarios as possible, without risk (in our case COVID), enabling them to identify, in order, their responses to given situations.

Three different coloured text boxes displaying the steps

Storyboarding a situational judgement test:

  1. Scenario – Includes detailed descriptive text, containing key information that sets the scene. This can also include images, audio or video to further illustrate the scenario.
  2. Situation 1 – Content that builds on the initial scenario; it contains the next layer of information relating to the scenario at hand and includes the first set of situational judgement questions.
  3. Situation 2 – Content that builds on the previous situation and includes the next set of situational judgement questions… and so on until the end of the scenario.
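The storyboard above maps naturally onto a simple data structure. This sketch is purely illustrative – the field names and scenario text are invented, and in practice this was built as a sequential Moodle quiz rather than code – but it shows the shape: one scene-setting scenario, then an ordered list of situations, each carrying its own questions.

```python
# A minimal sketch of the storyboard as data: one scenario, a sequence of
# situations, each with its own judgement questions. Names are illustrative.
scenario = {
    "title": "Adult resuscitation",
    "scene": "You arrive at a house where a 54-year-old male has collapsed...",
    "situations": [
        {"text": "The patient is unresponsive and not breathing normally.",
         "questions": ["What is your first action?",
                       "Who do you ask bystanders to contact?"]},
        {"text": "Two minutes of CPR have been completed.",
         "questions": ["What do you do next?"]},
    ],
}

def walk(s):
    """Present situations strictly in order -- students cannot go back."""
    steps = []
    for i, situation in enumerate(s["situations"], start=1):
        steps.append((f"Situation {i}", len(situation["questions"])))
    return steps

print(walk(scenario))  # [('Situation 1', 2), ('Situation 2', 1)]
```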

There were many considerations that had to be made whilst developing this assessment type – mainly to keep the assessment as authentic as possible. For example:

  • providing media to set the scene; 
  • keeping the narrative on track – ‘time’ is of the essence in any resuscitation scenario, so it was important to include timely details within the situations; and 
  • replicating the quick thinking process that would be required in a real-life situation by using the sequential format in the quiz, so that students had to take notes or work from memory as they couldn’t return to previous situations to guide them.

The student experience was another really important factor in delivering this assessment – many of these students may never have experienced this type of examination before. It was essential to provide clear and consistent instructions to guide them through the process. Before the main assessment, we also created a formative version of the quiz so that students could familiarise themselves with what was expected of them.

We used a Google Document, with tables, to structure the content at the development stage, and a Moodle Quiz activity (multiple choice) to build and deliver the assessment, which worked very well. The feedback from both students and examiners has been really positive, and there is scope to use this assessment as a CPD exercise for practitioners.

Interest has also been shown by academics at other universities, along with Ambulance Trust managers, who wish to explore this particular method of assessment. The methods for assessing the learning of resuscitation have seen little evolution from the traditional OSCE format, so this format, created for a small group of students, may well develop over time.

Developing this was by no means easy given the time constraints. However, it is a great example of an alternative assessment that has been developed from creative thinking during the lockdown.  

Engaging students with online assessment feedback

An Exploration Project

Technology Enhanced Learning and Academic Development are leading an exploration project centred around engaging students with online assessment feedback. Specifically, we're exploring an assessment platform called Edword.

It's worth mentioning that we're taking a more scientific approach to this project – you could almost imagine it as an education lab experiment.

Academics and educational technologists within our team have evaluated the functionality and advanced workflows that Edword offers, and we think it brings some real, tangible benefits to students and staff. The platform has been designed around pedagogically sound principles – that's really what's most exciting about it. I'll demonstrate some examples of these in action later in this post.

It’s not enough that we’re excited about a new assessment tool though. We need to explore and test whether our students and staff actually do experience a benefit from using Edword when compared to one of our existing assessment platforms such as Turnitin or the Moodle assignment.

In order for me to explain what Edword allows us to do, I need to explain what’s missing from our existing assessment systems. 

Current feedback workflow

Turnitin / Moodle assignment

Assessment graded, student sees grade, end of workflow

When an online assignment is handed back to a student via Moodle or Turnitin, the student sees their grade immediately, before they've had a chance to read any inline or summary feedback added by their lecturer. The grade is often seen as the end point of the assessment – a student's entry point to the next stage of their course. What we actually want students to engage with is the meaningful and constructive feedback their academics have produced for them, which will help them improve their next piece of work. Unfortunately, many students don't read their assessment feedback and miss out on its benefits.

Edword has a 'lock grade' feature, which means students can't see their grade until after they've read their feedback and, potentially, also submitted a reflection on how they will put their feedback into practice. In this way, Edword supports the feed-forward model of good academic practice.

The Edword workflow looks more like this:

Edword workflow

Assignment is graded, student reads feedback, student writes reflection on feedback, student sees grade, student improves on next assignment
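That workflow can be sketched as a tiny state machine. To be clear, this is our reading of the 'lock grade' idea rendered in illustrative Python – it is not Edword's actual implementation, and the class and method names are invented.

```python
# A toy sketch of the 'lock grade' workflow: the grade stays hidden until
# the student has read their feedback and submitted a reflection on it.
class LockedGradeAssignment:
    def __init__(self, grade, feedback):
        self._grade = grade          # hidden until feedback is engaged with
        self.feedback = feedback
        self.feedback_read = False
        self.reflection = None

    def read_feedback(self):
        self.feedback_read = True
        return self.feedback

    def submit_reflection(self, text):
        if not self.feedback_read:
            raise RuntimeError("Read your feedback before reflecting on it.")
        self.reflection = text

    @property
    def grade(self):
        if not (self.feedback_read and self.reflection):
            raise RuntimeError("Grade is locked until feedback is engaged with.")
        return self._grade

a = LockedGradeAssignment(68, "Good argument; tighten your referencing.")
a.read_feedback()
a.submit_reflection("Next time I will check references against the style guide.")
print(a.grade)  # 68
```

The ordering is the point: the grade property simply refuses to answer until the feedback and reflection steps have happened, which is the feed-forward sequence in the workflow above.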

We also hope the feedback provided within Edword will be more engaging. Academics can enrich inline feedback with learning objects such as videos or H5P interactive activities. Rather than the flat, text-based feedback comments within Turnitin and Moodle, feedback in Edword helps students understand the mistakes they are making and gives them an immediate way to re-test their knowledge. The platform supports assessment-for-learning concepts.

 


A H5P interactive learning object within feedback in Edword

Edword records how long a student spends engaging with their feedback and allows students to rate the usefulness of the feedback they receive. These metrics are presented to staff as a way to evaluate how engaged students are and which feedback comments could be improved from a student perspective. 

We will make Edword available to staff and students during teaching block two with an on-boarding event for staff happening in early February. If you would like to take part in the project or ask some questions, please get in contact:

Mike Wilson

Ext. 3194

michael.wilson@port.ac.uk

A video introduction to Edword can be found here

New year, blank page, fresh start!

The festive season can be a hectic time for us all, rushing here, there and everywhere, feeling that you're constantly racing against the clock to get things done at home and at work. The last-minute dash to the shops to buy your loved ones gifts for Christmas, the big food shop to get the all-important ingredients for your Christmas dinner (the most eagerly awaited meal of the year!) and the work deadlines that need to be met before everyone breaks up for annual leave – it can be relentless and can easily become overwhelming.

At this busy time, it’s important to take time out not only to recharge your batteries but also to take stock of what you’ve achieved over the last year, time to reflect and to look ahead to the coming year. The beginning of a new year brings with it a fresh start and blank page for all our hopes for 2020.

‘Tomorrow is the first blank page of a 365-page book. Make it a good one’. - Brad Paisley

So, before we start thinking about the Tel team's hopes for 2020, and all the things we want to achieve and work on, I would like to spend some time practising 'reflection' (for tips on reflection visit my post: Through the mirror – learning through reflection) and look at what we worked on in 2019. It's easy to forget what we've achieved in a year, which is why it's so important, when actively reflecting, to record your achievements and the things you would like to improve on in some way or another. Our blog site, Tel Tales, is a great way of keeping a record of the Tel team's work over the year, and contributing to the site helps us take stock of what we have learnt, whether good or bad, collaboratively.

A whistle-stop tour of a year in the life of the Tel team

Accessibility was at the forefront of our minds in 2019. We wanted to ensure staff were aware that we, as a university, have a legal requirement to provide accessible content to our students. We developed Moodle Baseline and, to help with accessibility for digital content, we installed a plugin called Ally into Moodle. We looked at Grackle for accessible Google Docs and Slides, Automatic Media Transcription and how we could convert Print-based booklets to accessible online resources.

We attended conferences such as the Remaking Marking Conference, Digifest 2019, an Adobe/Higher Times forum called Making digital literacy a pillar of education, the 19th EAIR Forum, the TED Global Conference and Wonkfest 2019.

We looked at Scenario Based Learning, Wikipedia, Online Exams in Moodle, Videos in Higher Education, Content Capture and Digital Badges. We revisited topics such as Copyright and the all-important informative Did you Know? posts for Moodle. Moodle was upgraded to 3.7. We explored the increasing interest in Audiovisual in Education.

Assessments and feedback were also running topics of 2019. We looked at the different types of online assessments we had to offer and looked at feedback and shifting culture in the way we provided feedback to our students by ‘feeding forward’ to students using Edword.

We started to think about 5G and how this may affect us, by looking at Education 4.0 and Natalie 4.0. We looked at the pedagogy behind practice in learning and how visual note taking such as doodling could aid concentration and memory.

We revisited our social media platforms and looked at ways we could revamp them a little to increase our followers on Instagram and Twitter. So far this has worked and we are now advertising Tel and AcDev workshops, which again has helped increase the number of attendees.

We made time to reflect and looked at the importance of reflection. We explored change particularly in our current climate at the university and how we react to change. We looked at technology and our mental health and wellbeing.

Our regular guest blogger, Adrian Sharkey, kindly contributed to Tel Tales to tell us all about the new LinkedIn Learning, and Julian Ingle told us all about his writing retreats.

Stuart Sims and Andy Clegg joined the AcDev team and I returned from maternity leave.

We said a fond farewell to Jerry, who was seconded to IS.

Finally, we finished the year by looking at Three Useful Apps in Learning and Teaching. The 12 apps of Christmas had a make-over and was revamped into 12 days of Christmas – Learning and Teaching Advent Calendar.

Phew! I think that was everything! Surprising, when you do take the time to reflect, just how much you can achieve in one year. As you can see, for us it’s quite a lot!!

What’s in store for 2020?

As you may be aware, we have had a few internal promotions within the team: Mike Wilson has been seconded to a Senior Lecturer in Digital Learning & Innovation role for part of the working week, remaining Senior Educational Technologist for Tel for the rest of the week. Shaun Searle has been seconded into Jerry Collingswood's post as a Senior Educational Technologist, and Tom Cripps has been seconded to back-fill Mike's role when he is working with AcDev.

The new year will therefore welcome Jo Fairwood, seconded to Shaun’s original role on eLearn as an OCD, and Abigail Lee seconded to Tom’s OCD role. 

Content capture will continue to grow organically and, following a staff–student consultation, the university will be introducing a Content Capture policy in 2020.

Accessibility will continue to be a focus for the Tel team. We will also be looking at best practice when it comes to external examiners.

We will continue to explore Edword, an online platform that will allow richer feedback to students. And we will be working much more closely with the AcDev team in 2020, which we are very much looking forward to!

We intend to carry on blogging away – so please, when you have a spare five minutes, check in with us to see what we’ve been up to. And feel free to follow us on our social media platforms! Please leave comments – we appreciate any feedback you have for us.

Finally – we are always looking for guest bloggers! So if you have something of interest that you would like to share on Tel Tales then please contact me at marie.kendall-waters@port.ac.uk.

So from myself and all the team, we hope you all have a very happy 2020 – keep exploring!

 


Case Study – Gill Wray

The Shorthand Units

Gill Wray, an academic member of staff in the School of Social, Historical and Literary Studies within the Faculty of Humanities and Social Sciences, is responsible, amongst other things, for the Journalism Shorthand units. I've been talking to her about some of the interesting elements she has implemented on her units for students, with the help of the Faculty's Online Course Developers, Scott, Joe and Daren.

The Journalism Shorthand units run in the first and second years as a core requirement, teaching shorthand to those taking a Journalism course. As part of her teaching, Gill has been involved with the development of some interesting interactive elements on her Moodle site.

I think this sort of work is worth highlighting to others as it shows how Moodle can be much more than just a repository for work, and handouts. Moodle allows an incredible amount of flexibility in terms of what content you can make available for students – it doesn’t just have to be downloadable PDF revision sheets!

The Test Your Shorthand WebApp

The 'Test Your Shorthand' app for practising shorthand knowledge has been around for a while, though, due to problems with audio playback in an older version, it has recently been rebuilt as a responsive web app so that it remains functional across a variety of devices and screen sizes.

The app, which you can see in the screenshots here, offers three difficulty levels for testing a student's shorthand knowledge. Choosing one starts a short multiple-choice shorthand quiz tuned to the selected difficulty. The app also provides a series of shorthand 'outlines' (the squiggles that form the core part of journalistic shorthand) as a revision aid, as well as 10 different voice recordings to practise note-taking on. The audio is offered at 100, 110 and 120 words per minute – perfect for a student learning to record what they hear.

The app is available as part of the Shorthand Year One Moodle site, and is offered as a supplement to the existing course content, which includes videos that are timed to release to students each week, and also other more traditional worksheet activities.

Digraph Train

Gill’s Shorthand site also includes The Digraph Train. When I asked her why she had added this interactivity to her Moodle site she said:

“One of the main challenges has been the inability of some students to recognise that digraphs ‘sh’, ‘ch’, ‘th’ and ‘wh’ make specific sounds.  We therefore produced a very simple ‘early learning’ style visual in the form of a moving train with carriages adding letters one at a time. There is audio as each carriage joins the train. This helps students understand how two letters come together to make a particular sound.”

The Digraph Train was produced by Gill, with the help of the Online Course Developers in the School, using a software package called Articulate Storyline. When I spoke with Joe Wright, who was responsible for the project, about why he chose Storyline he said:

“I chose to use Storyline because I found it gave me all the tools that would fulfil the task in hand. It is a great e-learning package which you can use to create unique projects using triggers and timing. It’s simple to use as it uses an interface similar to the Microsoft packages which makes it very easy to navigate, to add animations, images and sound to the project. Gill told me that the students found the end result to be very engaging”.

It's worth mentioning that both these projects took time, and required skills that are not available in every faculty. If you have an idea for something you want to create, but don't know where to start, visit your Online Course Developers first – more often than not they'll be happy to help. If you think your idea might benefit students (or staff) in a faculty other than your own, Technology Enhanced Learning would also be happy to work with you to get your idea off the ground.

Highlighting your own creative and innovative use of Moodle is difficult. There is no University-wide platform, no place a member of staff can go and say 'hey! I helped make this and I think it's good!' Case studies like this are our way of putting good work out there for people to see. Currently, both of these projects are available only to students studying the Shorthand units on Journalism courses.

Keeping everyone happy – tricky but not impossible

Anonymous or blind marking is an important part of the assessment and feedback process. For a student it ensures work is marked fairly without bias. However, there is an equally valuable requirement for academics and support staff to be able to identify students who have yet to submit their assignment and may be in need of additional support.

In the paper-based past, this was a relatively easy task. Students submitted assignments with cover sheets, which could be easily removed by administrators. Assignments were tracked and handed to academics for blind marking.

Online assessment technologies such as Turnitin and the Moodle assignment match the paper-based workflow quite closely, with a few extra tools to help academics. There is no longer a need for students to identify themselves within their assignments, as we know who they are when they log into Moodle. In fact, by the letter of the law, a student can be penalised for adding their name to an assignment. In reality, though, some departments still require students to provide a cover sheet with their assignment, which invalidates the blind marking setting in their Moodle or Turnitin assignment. My guess is that the motivation for identifying students is to help them and ensure they don't miss their deadlines. I'd be genuinely interested to hear the reasons for cover sheets in the comments below.

What if there was a way for all the assessment stakeholders to get what they need and still preserve anonymity? Well luckily there now is a way to do this in Moodle.

On each UoP Moodle unit you will find a new report under Course Administration > Reports > Course Submissions.

When an assignment is live, course administrators and Online Course Developers can see a submission status, Turnitin paper ID (or Moodle participant number), provisional grade and identifying information for each student in a cohort or group. This is all the information they need to keep an eye on the process and transfer grading information to student records later on. With a bit of extra magic, lecturers get to see a subset of this information – the identifying student information and a submission status – even when an assignment is anonymised. For academics there is no link between a submission status and a specific submission; that link is only released to them after the post date. Coupled with a release threshold, which prevents anyone guessing who's who, the report attempts to keep everyone happy.
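The two views can be sketched in a few lines of Python. This is our interpretation of the report's logic rendered as illustrative code, not the actual Moodle report, and the threshold value, names and field layout are all invented: administrators see full detail, academics see only names with a decoupled submission status, and nothing is shown until enough submissions exist to stop anyone inferring who wrote what.

```python
# A sketch of an anonymity-preserving submissions report (illustrative only).
RELEASE_THRESHOLD = 5  # assumed minimum submissions before statuses show

submissions = [
    {"student": "Ana", "paper_id": "TII-9913", "submitted": True, "grade": 62},
    {"student": "Ben", "paper_id": "TII-4407", "submitted": True, "grade": 55},
    {"student": "Cal", "paper_id": None, "submitted": False, "grade": None},
]

def admin_view(rows):
    """Course admins: identity, paper id, status and provisional grade."""
    return [(r["student"], r["paper_id"], r["submitted"], r["grade"])
            for r in rows]

def academic_view(rows, threshold=RELEASE_THRESHOLD):
    """Academics: identity + status only, decoupled from any submission,
    and withheld entirely below the release threshold."""
    if sum(r["submitted"] for r in rows) < threshold:
        return []  # withheld to preserve anonymity
    return [(r["student"], r["submitted"]) for r in rows]

print(academic_view(submissions))  # [] -- only 2 submissions, below threshold
```

Note how the academic view never pairs a name with a paper ID or grade – that pairing lives only in the admin view until the post date.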

Here’s an idea of what the report looks like in practice.

Click image for full size version

In the near future we plan to allow staff to download the data from the course submissions report to a spreadsheet making it easier to transfer to student records.

I’d be interested to hear if this makes online assessment a little easier. Feel free to share your thoughts in the comments box at the bottom of this page. If you find the report useful you may find the new assessment course format helps you out too. A short introduction video is available here:

Attributions
“Anonymous” image courtesy of Luciano Castello CC: www.flickr.com/photos/luccast85/6250260580


© 2024 Tel Tales
