This blog is written by Tom Langston, Digital Learning and Teaching specialist, and Dr Jo Brindley, Course Leader for the Academic Professional Apprenticeship (APA). The APA is a course for early-career academics within the University and provides them with ideas, support and guidance on developing their teaching skills. The course is constantly evolving in how it is delivered, as it was designed to highlight best practices and current ideas within Higher Education. As you will see, the course team decided to innovate and deliver feedback using Panopto, providing the opportunity for academics (as students) to experience a range of feedback types and engage with non-traditional forms of feedback on assessment.
While the University has always had the option of recording videos, it wasn’t until Panopto, and its integration within Moodle, that I considered using video as a tool for feedback. That’s not to say others hadn’t done it: I know people like Philip Brabazon have been providing audio feedback (not video, I know), which has been positively received.
So, with that being said: I recently graduated from the first cohort of the Academic Professional Apprenticeship and have since started working closely with the course team. I was asked to help mark a few assignments, and before I undertook this I asked if I could deliver the feedback as a video. Jo was keen to see how this would go and decided to do her feedback in the same way (I think she just needed an excuse, and possibly a safety net, to do it).
We planned how we would approach the feedback and decided that Panopto would give us the easiest way to implement it. It allowed us to appear on screen, so we could raise points that needed investigation in a non-confrontational and open way. The ability to record the screen with the assignment and the marking criteria displayed at the same time helped us show how we mapped our thoughts and marking to the submission.
This was my first experience as a marker, which might make my impressions a little unrepresentative. Having never marked scripts in the traditional way (either pen and paper or on-screen), giving video feedback felt a more comfortable way to mark, as I knew what I was saying would not be misinterpreted. The argument here might be that it is a “quick” option; however, being new to marking, I actually did both. First, I worked through the submissions and wrote down my feedback on each section; then I marked each one again on camera, reading back what I had noted the first time through.
When we were devising this marking process, we made the conscious decision not to worry about being “perfect” and going back to rerecord mistakes. We wanted the feedback to be as conversational as possible so it felt natural and genuine. Not everyone will want to be on camera, but the same can be achieved with the audio feedback I mentioned earlier.
The other nice feature of Panopto is its view tracking: it is possible to see how much of a video someone has watched and how long they spent reviewing the material. For me, the eye-opening part was that, for many of the submissions I marked, the student watched the first few minutes of the introduction and then skipped through the bulk of the video until they reached the feedback on the final, higher-weighted part of the submission: a literature review, similar to that written in a journal article. They skipped much of the feedback on the reflective elements of the portfolio they had created as part of the submission. I found this interesting, as these were the areas I felt most people needed to work on, compared with the final section, which was closer to research work they may have previously undertaken.
With this in mind, I would still provide the same detailed feedback, as I only marked a few submissions and not every student will approach their feedback in the same way. It is something I will review each time, though: at the start of the assessment process I would discuss with the students what they might value, whether they just want a grade for certain sections or a detailed breakdown across the whole assignment. These conversations would be key in helping students engage, as students who have asked for a certain level of feedback will hopefully then investigate each area accordingly.
The pandemic has been a catalyst for trying out new ways of working, and I was excited by Tom’s suggestion that we try using a screencast as a way of providing more personalised feedback as part of the assessment process.
I have, historically, used audio feedback, and I know it was always positively received by learners, so the opportunity to use Panopto was one I was keen to experiment with. For me, one of the benefits was that the screencast enabled four views: the assessment artefact itself, the marking criteria, the marker and the associated captions. This felt like a really robust way of delivering feedback, as it was easy to link the marking criteria with the submitted assessment on screen, which helped the learners join the dots regarding the award of marks.
As Tom has said, we took a conversational approach to the feedback, but this didn’t make the feedback provision a swift process, as planning and note-taking were also required. I think this was useful to Tom as it was his first time marking: when we met for calibration after marking the same submissions at the start of the process, these notes assisted our conversation. As I progressed through the bulk of the marking, I started to use the pause facility, which I think made the process quicker (fewer notes to capture) and didn’t seem to affect the overall quality of engagement with the feedback.
I was pleased with the approach and the quality of feedback provided. There was definitely more scope to work on feed-forward, and we will be providing feedback in this way during the next assessment diet. Comments from the External Examiner on this approach were very positive.