Adventures in Technology Enhanced Learning @ UoP


Personal Tutoring Project

As part of the OfS-funded project Raising Awareness, Raising Aspirations (RARA), staff from a number of teams – Technology Enhanced Learning (TEL), Information Systems, Academic Development, and the Academic Skills Unit – joined forces to develop a platform, website and learning resources to support tutors and tutees in the personal tutoring process.

RARA, a collaborative project between the University of Sheffield, King’s College London and the University of Portsmouth, investigated the extent to which an enhanced personal tutoring system might help reduce the attainment gap for Black and Minority Ethnic (BME) students and those from lower socioeconomic groups. The project had its roots in research (Cousin and Cureton, 2012; Thomas, 2012; Mountford-Zimdars, 2015) which suggested that such a system could reduce the attainment gap, ‘based on evidence that the personal tutor can play a particularly important role in the academic integration of BME students and students from lower economic groups.’

We used an agile project methodology, drawing on skills, experience and knowledge from across the University. From the start we wanted to create a space for tutors, in consultation with tutors. From these consultations, it was clear that understanding of the personal tutoring role varied across the university. Many personal tutors felt they were not equipped with the knowledge they needed to fulfil their role to the best of their ability, and this was especially true of those new to teaching.

TEL’s main project deliverable was to lead on creating staff- and student-based personal tutoring resources. In August 2018 we launched the website Personal Tutoring @ UoP for tutors and those that support this process. Since this initial launch TEL have been working to develop the site further – a new, more extensive version of the site will launch in February. The site provides information about the personal tutoring role, developing tutees, supporting and signposting tutees, and training resources.

Personal tutoring @ UoP Website

 

TEL have also developed student-facing resources within Learning at Portsmouth – a student website to support transition into higher education. As well as online provision, we also developed a paper-based guide for all first-year, campus-based students to be given at their first tutorial session.

Burke et al. (2016) found that academic staff play a key role in how students construct their feelings about capability, which ultimately lead to success or failure in higher education.

The guides include information for students on how to develop themselves whilst at Portsmouth and also provide contact details of services across the University and their faculty to support them in their studies and in times of personal difficulty.

The end of the two-year RARA project was marked by our University’s first personal tutoring conference for academic staff, and the launch of a RARA personal tutoring toolkit. As an institution we are now well on our way to implementing the recommendations made in the 2019 RARA Report. Student and staff feedback has been positive – the website has not only had an impact at Portsmouth but has formed part of a national toolkit for personal tutors. The resources have been presented at conferences and have received positive feedback on the clarity of their design. Looking to the future, TEL will continue to work with colleagues across the institution in the development of work in this area so that as an institution we can help tackle the attainment gaps that are prevalent nationally in higher education.

References

Cousin, G., & Cureton, D. (2012). Disparities in Student Attainment (DISA). York: HEA.

Mountford-Zimdars, A., Sabri, D., Moore, J., Sanders, S., Jones, S., & Higham, L. (2015). Causes of Differences in Student Outcomes. Higher Education Funding Council for England (HEFCE). Accessed July 23. www.hefce.ac.uk/pubs/rereports/Year/2015/diffout/Title,104725,en.html

Thomas, L. (2012). Building student engagement and belonging in Higher Education at a time of change: final report from the What Works? Student Retention & Success programme. London: Paul Hamlyn Foundation.

 

Guest Blogger: Simon Brookes – A content capture policy for the University

For several years, the University has provided staff with the technology to record video and/or audio for the purpose of extending teaching and learning activity beyond the confines of the classroom. This has included the provision of limited lecture capture technology in some large lecture theatres, as well as providing access to software that allows staff to produce learning materials on their computers.

The content captured in these ways is of particular use for revision purposes, for scrutinising difficult concepts, for students with caring responsibilities, and for students for whom English is not their first language.

In response to growing demand from our student body, the University has been investigating the possibility of expanding the availability of content-captured materials. This investigation was co-ordinated by the Content Capture Working Group, and included a full consultation with all University staff and students, via online surveys, as well as in-depth discussions at a series of “town hall” meetings, which were attended by staff and students from across the University.

This consultation informed the development of a Content Capture Policy, which is now available here, in draft format, for further scrutiny. The Policy aims to promote inclusivity and increase the accessibility of our teaching whilst reducing potential barriers to learning. Implementing these adjustments, which would benefit all students, should reduce the need for individual adjustments, promote good practice and maximise learning opportunities. If you would like to provide feedback, please email it to Harriet Dunbar-Morris, Dean of Learning and Teaching, at DeanLandT@port.ac.uk.

The Policy will go to the University’s Student Experience Committee, before making its way, via the University Education and Student Experience Committee, to Academic Council for final approval. It will then be published in time for the start of the next academic year.

We will be introducing staff development sessions prior to implementation. Please look out for these.

Image Credits: Photo by Forja2 Mx on Unsplash

Engaging students with online assessment feedback

An Exploration Project

Technology Enhanced Learning and Academic Development are leading an exploration project centred around engaging students with online assessment feedback. We’re specifically exploring an assessment platform called Edword.

It’s worth mentioning that we’re taking a more scientific approach to this project – you could almost imagine it as an education lab experiment.

Academics and educational technologists within our team have evaluated the functionality and advanced workflows that Edword offers, and we think it offers real, tangible benefits to students and staff. The platform has been designed around pedagogically sound principles – that’s really what’s most exciting about it. I’ll demonstrate some examples of these in action later in this post.

It’s not enough that we’re excited about a new assessment tool though. We need to explore and test whether our students and staff actually do experience a benefit from using Edword when compared to one of our existing assessment platforms such as Turnitin or the Moodle assignment.

In order for me to explain what Edword allows us to do, I need to explain what’s missing from our existing assessment systems. 

Current feedback workflow

Turnitin / Moodle assignment

Assessment graded, student sees grade, end of workflow

When an online assignment is handed back to a student via Moodle or Turnitin, students see their grade immediately, before they’ve had a chance to read any inline or summary feedback added by their lecturer. The grade is often seen by students as the end point of their assessment – an entry point to the next stage of their course. What we actually want students to engage with is the meaningful, constructive feedback their academics have produced for them, because this will help them improve their next piece of work. Unfortunately, many students don’t read their assessment feedback and miss out on its benefits.

Edword has a ‘lock grade’ feature which means students can’t see their grade until after they’ve read their feedback and potentially also submitted a reflection on how they will put their feedback into practice. In this way, Edword supports the feed-forward model of good academic practice.

The Edword workflow looks more like this:

Edword workflow

Assignment is graded, student reads feedback, student writes reflection on feedback, student sees grade, student improves on next assignment
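The gated workflow above can be sketched as a tiny state machine. This is an illustration of the concept only – the class and method names are invented for the sketch and do not reflect Edword’s actual implementation:

```python
# Illustrative sketch of a "lock grade" workflow: the grade stays hidden
# until the feedback has been read and a reflection submitted.
# All names here are hypothetical, not Edword's real API.

class LockedGradeAssignment:
    def __init__(self, grade, feedback):
        self._grade = grade
        self.feedback = feedback
        self.feedback_read = False
        self.reflection = None

    def read_feedback(self):
        # First gate: the student must open and read the feedback.
        self.feedback_read = True
        return self.feedback

    def submit_reflection(self, text):
        # Second gate: a reflection on how to act on the feedback.
        if not self.feedback_read:
            raise RuntimeError("Read your feedback before reflecting on it.")
        self.reflection = text

    @property
    def grade(self):
        # The grade is only released once both gates have been passed.
        if not (self.feedback_read and self.reflection):
            raise RuntimeError("Grade is locked until feedback is read "
                               "and a reflection is submitted.")
        return self._grade
```

Attempting to access `grade` before both steps raises an error; after reading the feedback and reflecting, the grade unlocks.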

We also hope the feedback provided within Edword will be more engaging. Academics can enrich inline feedback with learning objects such as videos or H5P interactive learning objects. Rather than the flat, text-based feedback comments within Turnitin and Moodle, feedback in Edword helps students understand the mistakes they are making, along with an immediate way to re-test their knowledge. The platform supports assessment-for-learning concepts.

 


A H5P interactive learning object within feedback in Edword

Edword records how long a student spends engaging with their feedback and allows students to rate the usefulness of the feedback they receive. These metrics are presented to staff as a way to evaluate how engaged students are and which feedback comments could be improved from a student perspective. 

We will make Edword available to staff and students during teaching block two with an on-boarding event for staff happening in early February. If you would like to take part in the project or ask some questions, please get in contact:

Mike Wilson

Ext. 3194

michael.wilson@port.ac.uk

A video introduction to Edword can be found here.

Accessibility: Investigating Automatic Media Transcription

Background

Accessibility is now an important aspect of digital learning. We need to take accessibility seriously, both to satisfy the needs of an increasingly diverse student body and to meet the requirements recently brought into law. Of course, digital learning often encompasses a wide variety of resources in a range of media. The challenge of bringing all these resources in line with regulations is considerable, on both a technical and an organisational level. Fortunately, technology can help to ease the burden, with a number of integrations available to help examine and fix content-related accessibility issues.

One particularly large challenge, and one that is particularly helped by the use of technology, is video. While it is possible to watch and transcribe a video manually, when faced with a library of nearly 8000 hours of video, the challenge becomes insurmountable! This is where technology can step in: it can automate the process and reduce the number of person-hours required.

For quite some time, YouTube has been able to automatically caption videos. In the past, however, the transcriptions produced by the algorithms have often been the subject of ridicule for the sometimes bizarre and hilarious interpretations. Thankfully things have moved on considerably, with increasingly advanced AI and machine learning helping to increase the reliability of computer transcription.

For the majority of our video content, we rely upon a home-spun system composed of a Wowza Streaming Media server and a custom-built front-end to manage content and metadata. While this system has the facility to allow subtitles to be added, it does not feature any way to automate the process of creating transcriptions. For this reason, we are currently investigating our options, with a view to either hosting our video content elsewhere or improving our current provision by implementing auto-transcription facilities.

The contenders

We have been investigating a few services to judge the accuracy of the transcription. We have tried each service with the same videos to see how accurately they can transcribe a variety of media content. Below are some details of three services we are currently examining.

Mozilla DeepSpeech

An open-source option that can be run on-premises, DeepSpeech requires a certain amount of technical skill in deploying and managing Linux servers. Being open-source and community-driven, the more effort you put in, the better the output will be. It allows you to train your own neural network, so it would theoretically be possible to improve transcription accuracy, although this may require a large investment of time and effort. As we are simply testing the out-of-box abilities, we have used the default models provided by the developers.

Google Speech to Text Engine

This is an API made available through the Google Cloud Platform. The service itself is used by YouTube to provide auto-transcriptions of uploaded videos. While using it through YouTube is free at the point of upload, utilising the API in your own projects can cause costs to rack up quickly (and remember that we have 8000 hours of video sitting on our servers, waiting to be transcribed). The pricing options are transparent, however, so we can easily calculate the cost of transcribing all of our existing content.

Amazon Transcribe

This cloud service is utilised by Amazon’s virtual assistant “Alexa” and works in a similar way to Google’s offering, with transcription charged based upon the number of seconds of content transcribed. The service is used by the content capture platform Echo360 to transcribe material. By our rough calculations, transcribing our 8000 hours of content through Amazon would be a little cheaper than through Google.
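Since both Google and Amazon charge per unit of audio, the comparison reduces to simple arithmetic. A sketch of the calculation, using placeholder rates rather than real vendor pricing (check the current Google Cloud and AWS pricing pages for actual figures):

```python
# Back-of-the-envelope cost comparison for transcribing our backlog.
# The per-minute rates below are invented placeholders, NOT real pricing.

HOURS_OF_VIDEO = 8000

def transcription_cost(hours, rate_per_minute):
    """Total cost of transcribing `hours` of content at a per-minute rate."""
    return hours * 60 * rate_per_minute

# Hypothetical per-minute rates (USD) for illustration only.
estimates = {
    "Google Speech-to-Text": transcription_cost(HOURS_OF_VIDEO, 0.024),
    "Amazon Transcribe": transcription_cost(HOURS_OF_VIDEO, 0.022),
}

for service, cost in sorted(estimates.items(), key=lambda kv: kv[1]):
    print(f"{service}: ${cost:,.2f}")
```

Even small per-minute differences compound quickly over thousands of hours, which is why the usage statistics discussed below matter so much.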

The results

Here are some example transcriptions of one short piece of video content:

Mozilla DeepSpeech

so wee al seend apisode of the dragon tf dend where the ontroprenel holks in with a really great idea good looking numbers the dragons e recing out their hands and then one of the dragons pipes up let see your contract and os soddenly ontrepenelox exposed because they thought they had a contra they don’t what they have iser some verbal understanding your colercial contracts are really important to you business mey should be kept clear concise so the point to add value when seeking in bestment wor in ed if you come to sellin a business also commercial contracts areningportant to the void conslote because both sides of the contract should now wot their obligations are a more their rights are

Google Speech to Text (through YouTube)

so we’ve all seen episodes of the Dragons Den where the entrepreneur walks in with a really great idea good-looking numbers the Dragons are eating out their hands and then one of the Dragons pipes up let’s see your contract and all the sudden the entrepreneur looks exposed because they thought they had a contract they don’t what they have is a some verbal understanding your commercial contracts are really important to your business they should be kept clear concise to the point to add value when seeking investment or indeed if you come to sell the business also commercial contracts are really important to avoid conflict because both sides of the contract should know what their obligations are and what their rights are

Amazon Transcribe

So we’ve all seen episodes of the Dragon’s Den, where the entrepreneur walks in with a really great idea, good looking numbers that dragons reaching out their hands. And then one of the dragons pipes up. Let’s see your contract over something. The entrepreneur let’s exposed because they thought they had a contract. They don’t. What they have is a some verbal understanding your commercial contracts of really important to your business. They should be kept clear, concise to the point. Add value when seeking investment, or indeed, if you come to sell the business. Also, commercial contracts are really important to avoid conflict because both sides of the contract should know what their obligations are, what their rights on.

Conclusion

As you can see from the output above, while the Mozilla software makes a good guess at a lot of the content, it also gets confused in other parts, inventing new words along the way and joining others together to form a rather useless text that does not represent what was said at all well. I’m sure its abilities will improve as more time is spent by the community training the neural network. However, Google and Amazon clearly have the upper hand – which is not surprising, given their extensive user bases and resources.

While Amazon Transcribe makes a very good attempt, even adding punctuation where it predicts it should appear, it is not 100% accurate in this case. Some words are misinterpreted and others are missing. However, in the main, the words that are confused are not essential to understanding the video.

Google Speech to Text makes the best attempt at transcribing the video, getting all words 100% correct, and even adding capital letters for proper nouns that it clearly recognises. There are options to insert punctuation when using the API, but this feature is not available in the YouTube conversion process.

From this (preliminary and admittedly small) test, it seems you get what you pay for: the most expensive service is the most accurate and the cheapest is the least accurate. Also, the headline cost of using Google Speech to Text on 8000 hours of video is not necessarily accurate. We need to remember that not all of this content is actively used: this is an accumulation of 8 years of content, and it’s possible that only a small fraction of it is still actually being watched. We now need to spend some time interrogating our video statistics to determine how much of the old content really needs to be transcribed. 

The best value compromise, if we choose to continue to host video ourselves, may be to transcribe all future videos and any that have been watched at some point in the last year. In addition, it should be possible to provide an ‘on-demand’ service, whereby videos are flagged by users as requiring a transcription at the click of a button. Once flagged, the video is queued for transcription and a few minutes later a transcription is made available and the user alerted.

Video title: Warner Goodman Commercial Contracts.
Copyright: Lynda Povey (Enterprise Adviser), nest, The University of Portsmouth.

Image Credit: Photo by Jason Rosewell on Unsplash

Great feedback is essential

Wouldn’t it be great if students could read the feedback they’ve received for their assignment, write a short reflection on what they could do to improve (perhaps also identifying what they’d like to receive feedback on next time round) and then see their grade? 

Our current online assessment tools (Turnitin and Moodle Assignment) don’t allow us to do this. Luckily we know an assignment tool that does – and it has many other modern assessment feedback mechanisms too.

I’m passionate about helping improve assessment feedback for students. It’s one of the things I’ll be working on in my new secondment as a Senior Lecturer in Digital Learning & Innovation. On Mondays, Tuesdays and Wednesdays I’ll be working between the TEL and AcDev teams to help coordinate projects to better support academics, Online Course Developers and students, with a focus on digital education. In particular, I’ll be working to help get a small pilot off the ground for Edword – a fantastic new assessment tool that promises to address many of the requirements of modern assessment and feedback. If you’re interested in taking part in this pilot, please let me know.

In addition, I’ll also be helping to establish an online staff community alongside the APEX programme featuring special interest groups. This will be a great place to make contact with like-minded staff from other faculties and exchange ideas.

Tom Langston and I will be creating a support mechanism for Online Course Developers who are interested in completing their CMALT portfolio and who might be interested in taking part in future elearning projects with TEL.

I’ll also be doing a bit of lecturing on the Research Informed Teaching programme, which I’m looking forward to. So this will be a busy year for me!

Please get in touch if you’ve got any ideas or projects we can help you with. Both the TEL and AcDev teams would appreciate your feedback as we work to ensure we’re offering the services that will provide value to you and your students (you can reach me on ext. 3194).

Image credit: https://commons.m.wikimedia.org/wiki/File:Paper_Plane_Vector.svg

 

Minerva – the university rethought?

On 25 June I attended an Adobe/Times Higher forum called “Making digital literacy a pillar of education”, along with representatives from 40 or so other HE institutions.

There was no disagreement at the forum about the recent recommendation from the DCMS Select Committee that digital literacy should sit alongside the “three Rs” as a fourth pillar of education. Everyone agreed that, as the pace of technological change quickens, employers are less interested in a student’s knowledge than in their personal qualities – and in particular their ability to engage in lifelong learning. But there was no consensus on how universities can best prepare their students for life in a world in which digital technology will play an increasingly important role.

Of the institutions present at the forum, undoubtedly the most innovative approach to Education 4.0 was that adopted by the Minerva Schools. Minerva built a first-year undergraduate curriculum from scratch, but rather than base the curriculum on subject-specific knowledge they built it around 81 “habits of mind” and “foundational concepts”. Students engage in cross-contextual learning activities in small-seminar format, all of which require or exercise the use of those foundational concepts. Through these activities students pick up subject knowledge, but they are assessed on how well they satisfy the foundational concepts.

In the first year of study the Minerva Schools’ students are based in San Francisco. Subsequently they spend time in Seoul, Berlin, London, Hyderabad, Buenos Aires and Taipei. Sounds terrific! (And expensive…) And all of this is made possible using digital technology – it’s a fundamental enabling technology for Minerva.

The Minerva Schools were able to take this approach because they were small and well resourced – and also because they were starting from scratch. It would be a huge task (probably an impossible one) for an existing university with thousands of students to change its curriculum in this way. But there might be elements of the approach that universities can adopt. It’s interesting that the Minerva project has recently opened its bespoke educational technology platform, called Forum, to partners: they claim that the platform, which was designed for use in a small-seminar format, can scale to support up to 400 students. It will be worth keeping an eye on this development.

Image Credit: Commons Wikimedia: The Greek Goddess Minerva

Digital Badges

This blog post links, indirectly, to my previous post on gamification. While gamification can help promote learning and engagement in a ‘fun’ way, digital badges can be used to reward and encourage learning. So – what are digital badges?

Digital badges are an excellent way to recognise student achievement and engagement. Badges can be awarded via Moodle on completion of an activity, for example, or after the attainment of a specific grade/mark in a quiz. Upon graduation, students can take their badges with them via Open Badges and export them to a ‘backpack’ service such as Badgr. The badges can also be linked to LinkedIn.

In the words of Dr Joanne Brindley (Senior Lecturer in Education): “The Academic Professional Apprenticeship is delivered via blended learning. As such, it was important to me that I was able to effectively monitor the engagement and development of the course members, on an individual basis. Digital badges were an obvious choice, as they would enable the course members to have the flexibility and autonomy to focus and work on tasks that met their individual learning needs, in a structured way, whilst also providing the course member and myself with an opportunity to gauge their personal progression.

The other benefit was the ability to identify and set the criteria for each badge. In this instance, the badges were designed to reflect the values, knowledge and areas of activity in the UK Professional Standards Framework (UKPSF). By using this approach I can be assured that, upon completion, the course members have engaged with the dimensions of the framework required for Fellowship.”

For those new to digital badges, here’s a quick ‘how-to’.

Before issuing a badge you first have to create it. A variety of tools are available to do this, from Adobe Illustrator to sites such as Accredible Badge Creator (https://www.accredible.com/badge-designer/) and Openbadges.me (https://app.openbadges.me/), both of which are free to use; of the two, Openbadges.me is the better.


Figure 1 – user interface for creating badges in Openbadges.me


Once a badge has been created, you will need to download it so it can be uploaded to Moodle for issuing to students.

 

Figure 2 – Moodle badge created using Open badges

 

In Moodle, badges are added via the ‘Administration’ menu, which is available by clicking on the cog icon at the top-right of your Moodle page and then clicking on ‘More’ to access the Badges section (see Figure 3 below), where you can manage and add new badges.


Figure 3 – Moodle badge manager

In the Badges section, choose ‘Add a new badge’. This will allow you to upload the image for the badge and set various basic details such as a description, issuer and badge expiry date. You can manage your badges here as well.

Once the badge has been added, you will need to set up the criteria for the awarding of the badge. To do this, click on ‘Manage badges’.

You can award badges according to one of three criteria: manual by role; course completion; activity completion. (Note: for activity completion you will need to make sure this is enabled via Course settings).

Badges are automatically issued once the set criterion (for example, achieving a certain grade in a selected quiz) has been met. Students can view the badges they have been awarded on the Moodle course page. You will just need to add a Latest Badges block from the ‘Add a block’ menu.

The Mozilla Backpack, which allowed badge recipients to ‘store’ their badges online, has now been replaced by Badgr (https://badgr.io/recipient/badges). Students need to download their badges from Moodle and then upload them to Badgr, so that any earned badges can be shared with, for example, an employer. At the moment Badgr is not linked to Moodle, but hopefully this integration will be added in the near future.
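Behind the scenes, what a backpack service such as Badgr stores for each badge is a small piece of JSON called an assertion. The sketch below shows a simplified Open Badges 2.0 assertion; the URLs and identifiers are invented placeholders, and the full specification defines further required and optional fields:

```python
# A simplified Open Badges 2.0 assertion. The URLs below are invented
# placeholders for illustration; real assertions point at a hosted
# BadgeClass and issuer profile so backpack services can verify them.
import json

assertion = {
    "@context": "https://w3id.org/openbadges/v2",
    "type": "Assertion",
    "id": "https://example.port.ac.uk/badges/assertions/123",
    "recipient": {
        "type": "email",
        "hashed": False,
        "identity": "student@example.port.ac.uk",
    },
    "badge": "https://example.port.ac.uk/badges/classes/ukpsf-a1",
    "verification": {"type": "hosted"},
    "issuedOn": "2020-01-15T00:00:00Z",
}

print(json.dumps(assertion, indent=2))
```

It is this metadata, baked into the badge image, that lets an employer or backpack service check who issued a badge, to whom, and when.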

Students can view their badges via the ‘Latest Badges’ block in Moodle (this can be added by anyone with editing rights on your Moodle Unit/Course page).

Figure 4 – badge details (including student details) accessed via the Latest Badges block

Clicking on the badge icon in the Latest Badges block opens up full details of the badge, such as recipient and issuer. From here the student can download the badge to add to their online backpack in Badgr. It is hoped that once Badgr is integrated, students will be able to add badges to their backpack directly from Moodle.

So, are badges worth having?

One concern often raised regarding the use of digital badges is that, well, aren’t they just a bit inappropriate at university level? Will university students take them seriously?

To overcome the possible scepticism of students, it is important to be clear what the badges are being used for. If badges form an element of gamification and/or are linked to assessment, then they are more likely to be seen as something worth acquiring, as argued by Samuel Abramovich (2016). The badges can be linked to the acquisition of specific skills required as part of an assessment, or awarded based on the achievement of a certain grade. Greater value can be placed on the badges if they are relatively difficult to obtain: there needs to be an element of genuine challenge before a badge can be earned. Badges can also be awarded for completion of a task – however, the effectiveness of this approach would seem to need further research. A project carried out by the University of Southampton (Harvey, 2017), in which Geography students were awarded badges as they completed course milestones, found that only 25% of students actually claimed badges as they progressed. The reason for this relatively low take-up was not clear.

Outside education, digital badges are now being used by companies such as Dell as part of their assessment programmes (see https://er.educause.edu/blogs/2016/9/digital-badges-and-academic-transformation). The more digital badges are recognised by major employers, the more likely students are to view them as worthwhile. Indeed, a recent case study (Anderson, L. et al. (2017). Open Badges in the Scottish HE Sector: The use of technology and online resources to support student transitions. Project report, July 2017. Universities of Dundee, Aberdeen and Abertay) found that 40% of students would value digital badges if employability were enhanced by their use. For lots more useful information on digital badges, visit

https://elearningindustry.com/guide-to-digital-badges-how-used

References

Abramovich, S. (2016). Understanding digital badges in higher education through assessment. On the Horizon, 24(1), 126–131. https://doi.org/10.1108/OTH-08-2015-0044

Harvey, F. (2017). Journal of Educational Innovation, Partnership and Change, 3(1). https://journals.gre.ac.uk/index.php/studentchangeagents/article/view/549

Photo by Melinda Martin-Khan on Unsplash


Scenario Based Learning

What is scenario based learning?

Jean Lave and Etienne Wenger, in their influential book Situated Learning (Lave and Wenger, 1991), argued that learning is most effective when it occurs within the context in which it will be used. Scenario based learning (SBL) is rooted in this idea. SBL, according to the definition provided by Massey University, “uses interactive scenarios to support active learning strategies such as problem-based and case-based learning”. The course developer creates a narrative – typically based on a complex, real-world problem – that the student works through and solves. SBL thus provides a safe yet realistic environment for the student to demonstrate their subject-specific knowledge and problem-solving skills. Furthermore, because SBL is often non-linear, it can provide numerous feedback opportunities to students, based on the decisions they make at each stage of the narrative.

I’m a course developer. When should I use scenario based learning?

Sometimes – in cases, for example, where students are required to make decisions and display critical thinking in complex situations – it can be difficult to provide realistic practice opportunities within the confines of a traditional course. In these cases, SBL comes into its own. Amongst countless other examples, SBL has been used successfully in engineering, nursing and business studies. It can be used to support both formative and summative assessment – but note that, for routine tasks that don’t require decision-making or critical thinking, there are more appropriate methods of assessment.

What tools can help me develop scenario based learning?

Moodle contains several tools that can be used to develop an SBL approach. The four tools I’d suggest for building a learning narrative are: Database, Workshop, Forum and Lesson. (This is only a suggestion. The most important thing is to connect various activities and reinforce student learning.)

Below is one model that would allow assessment to become a wider, more holistic process over the duration of a course. A range of short, targeted activities gives students time to research their next task and helps them develop their own learning profile.

Database -> Workshop -> Forum -> Lesson

  1. The student writes a short essay, based on their own experiences relating to a given task, and submits to the Database tool. Then, from these submissions, the academic allocates each student a different piece of work to mark/analyse.
  2. After assessing their assigned piece of work, each student submits their analysis to the Workshop tool (following criteria defined by the academic). The Workshop tool allows students to peer assess the submissions and get a final grade based on both their submission and their ability to assess others’ work.
  3. Once all this is finished, the Forum is used to get the students to discuss their experiences of the subject and how they could each improve certain aspects of their work.
  4. Lastly, the Lesson tool presents a high-risk situation to the student. The lesson can be developed to provide a realistic yet safe environment to explore the situation. The Lesson tool allows for either a branching or a linear format.
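As a rough illustration of step 2, the Workshop tool produces a final grade from two components: a grade for the student’s own submission and a grade for how well they assessed their peers. The weighting below is an assumption for illustration only – in Moodle the actual split is configured by the academic:

```python
def workshop_grade(submission_grade, assessment_grade,
                   submission_weight=0.8, assessment_weight=0.2):
    """Combine the two Workshop grade components into a final grade.

    The 80/20 split is illustrative, not a fixed Moodle default:
    the academic chooses the weighting when setting up the activity.
    """
    return (submission_weight * submission_grade
            + assessment_weight * assessment_grade)
```

For example, a student scoring 80 on their submission and 90 on their peer assessments would, under this weighting, receive a final grade of 82.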

Each phase offers the student the chance to reflect on what they have learned and offers them ideas on what they should now do with the new information and theory they have researched as part of the unit.

These various elements could be done one straight after the other, or spaced out over the course of a unit. My recommendation would be to allow time between each assessment, which would give students the chance to develop and learn from what they have previously done.

If you are considering SBL as a means of assessment and would like to discuss how you could implement it within your teaching, the TEL team runs a training session called “Facilitating Scenario Based Learning” – or you can contact me at tom.langston@port.ac.uk.

Reference

Lave, J., & Wenger, E. (1991) Situated Learning. Cambridge: CUP.

Image credits: Photo by Fancycrave on Unsplash


Accessibility of digital learning content at UoP

On 15 January 2019, following a two-month pilot, the TEL team switched on the Blackboard Ally plug-in across all modules on Moodle. In brief, if a lecturer has uploaded some digital course content to Moodle (typically Word documents, PowerPoint presentations or PDF files) then Ally permits students to download that content in an alternative format (electronic Braille, HTML, EPUB, tagged PDF, or MP3). This is great for accessibility, of course, but it is also an inclusive approach: any student, not just one with a particular need, might choose to download a Word document in MP3 format (to listen to on the go) or in EPUB format (to get the benefit of reflowable text on an e-reader). The TEL team will be providing students with more information about Ally over the coming weeks, but in this post I want to mention a feature of Ally that is of interest to authors of digital course content.

Ally generates an institutional report about the accessibility of course content on the institution’s VLE. So we now know what the most common accessibility issues are for the 38,462 course content files on Moodle. The top five are (drum-roll please):

  1. The document has contrast issues. Just under half of all documents (48%, to be precise) have contrast issues.   
  2. The document contains images without a description. Roughly 43% of all documents commit this accessibility sin.
  3. The document has tables that don’t have any headers. Just over a quarter (26%) of all documents have this issue; I suspect that the documents without this issue are simply those without tables.
  4. The document does not have any headers. This is a problem for 24% of documents.
  5. The document is missing a title. Again, 24% of documents have this problem.

The first four are classed as major accessibility issues; the fifth issue is classed as minor.
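Ally’s own checks are proprietary, but the contrast problem it flags is defined by the WCAG guidelines, which compute a contrast ratio from the relative luminance of the foreground and background colours. A minimal sketch of that standard calculation:

```python
# WCAG 2.x contrast checking: compute the relative luminance of each
# colour, then take the ratio (lighter + 0.05) / (darker + 0.05).
def _luminance(rgb):
    """Relative luminance of an sRGB colour given as 0-255 channels."""
    def channel(c):
        c = c / 255
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (channel(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    """WCAG contrast ratio, from 1:1 (identical) up to 21:1 (black on white)."""
    lighter, darker = sorted((_luminance(fg), _luminance(bg)), reverse=True)
    return (lighter + 0.05) / (darker + 0.05)
```

WCAG level AA requires a ratio of at least 4.5:1 for normal body text, which is the kind of threshold a document fails when it is flagged with a contrast issue.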

At first glance this seems shocking: about half of all documents suffer from a major accessibility issue to do with contrast. When we compare ourselves against other institutions, however, we learn that these issues seem to be common across the HE sector; indeed, we seem to be doing slightly better than many institutions. And the important thing is, now that we know what the issues are, we can start to address them. Over time, we should be able to drastically reduce the number of documents with these common – and easily fixable – problems.

One piece of good news: we have a relatively small number of documents that possess accessibility issues classed as severe. The most common severe issue at Portsmouth – just as it is at other universities – involves scanned PDFs that have not been put through OCR. There might be good, valid reasons why a scanned PDF has been used. But accessibility would certainly improve if authors minimised their use of such files.

Header image taken from Blackboard.com. Retrieved from https://www.blackboard.com/accessibility/blackboard-ally.html
(Accessed: 17 January 2019). Thank you to Ally for giving us permission to use their image.

Digital Experience Insights – a community of practice

I gave a talk on 14 November at the launch of JISC’s new Community of Practice in Digital Experience Insights. The JISC Insights service builds on their work with the Student Tracker – a survey of students’ experience of the digital environment. Portsmouth, as one of the initial pilot institutions for the Tracker, has more experience than most in using insights gained from the student survey.  

One of the key take-home messages from the event, at least for me, was that the issues we are grappling with here at Portsmouth are exactly the same issues with which other institutions are grappling. The event also provided a valuable sanity-check: the approaches we are taking are the same approaches that others are either taking, planning to take, or would like to take!

The graphic below shows one example of how the student digital experience at Portsmouth is not dissimilar to the student experience elsewhere. Students were asked to name an app they found particularly useful. The word cloud on the left shows the national response. Once the various types of VLE (Blackboard, Moodle, Canvas) are combined, the three most popular apps are: VLE, Google, YouTube. The word cloud on the right shows the Portsmouth response. The three most popular apps are: VLE, Google, YouTube. It’s the smaller words that carry the institutional flavour – and I think Portsmouth does extremely well in this regard; 93% of our students rate our digital provision as good or better.
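The aggregation step described above – collapsing the individual VLE products into a single category before counting – can be sketched in a few lines. The app names below are illustrative, not actual survey data:

```python
from collections import Counter

# Collapse the various VLE product names into one "VLE" category,
# then count responses to find the most popular apps.
VLE_PRODUCTS = {"blackboard", "moodle", "canvas"}

def top_apps(responses, n=3):
    """Return the n most frequently named apps, with VLEs combined."""
    normalised = ("VLE" if r.lower() in VLE_PRODUCTS else r for r in responses)
    return [app for app, _ in Counter(normalised).most_common(n)]
```

Run over a mixed list of responses, the three VLE products count as one app, which is exactly why “VLE” tops both word clouds once they are combined.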

Speech Bubbles

Helen Beetham gave one of the most interesting talks of the day. Helen gave an overview of a pilot into the digital experience of teaching staff. There were insufficient responses to publish statistically robust findings, but there were some interesting titbits in there. For example, students are much more positive than teachers about the digital environment. Is this because teachers are more critical? Or perhaps they have higher expectations of what a digital environment should look like? On the other hand, teachers are much more likely than students to want more digital technology in their courses. Are students more conservative when it comes to expectations around learning and teaching?

The majority of respondents to the staff survey identified themselves as early adopters – and yet about 50% never search online for resources; 84% of them are unsure of their responsibilities in relation to assistive technologies; 81% are uncertain when it comes to dealing with their own health and well-being in a digital environment; and 13% never update their digital skills. By far the biggest problem staff face in improving their digital teaching is – of course – lack of time to do so.

For the past three years Portsmouth has sought to understand the student experience of the digital environment, and we plan to run the JISC Insights service for a fourth year. But we could build a much richer picture by asking teaching staff as well as students. So this year we plan to run the staff-facing digital insights survey. More details to follow in 2019!

Credit image: Photo by Annie Spratt on Unsplash


© 2024 Tel Tales
