Adventures in Technology Enhanced Learning @ UoP


Guest Blogger: Mary Watkins – Using Turnitin to increase consistency in assessment marking and feedback

The School of Education and Childhood Studies (SECS) has used electronic submission since 2015, with all students asked to submit their artefacts through Turnitin, Moodle Assignments or Moodle Workshops. SECS reviewed its electronic submission process at the end of 2016/17 and looked for ways to improve the process for students as well as markers.

Student feedback suggested that changes could be made to improve the consistency of feedback across units as well as the transparency of grades and the ways in which grades could be improved in the future.

Using Rubrics

Joy Chalke (Senior Lecturer) and Chris Neanon (Associate Head Academic) worked with Mary Watkins (Senior Online Course Developer) to create an electronic version of the School’s new grade criteria/mark sheet for each level of study, from level four undergraduates through to level seven postgraduates. The electronic version was created using a Turnitin rubric reflecting the four key areas against which students’ submissions are marked:

  1. research,
  2. content,
  3. argument and analysis,
  4. organisation, style and presentation.

The new Turnitin rubrics help to ensure that all markers evaluate student work against the same set of criteria, in this case based on these four key areas, and therefore respond to the students’ request for consistent marking.

Having attached the rubric to all Turnitin dropboxes, markers were able to indicate to students areas of strength as well as areas for improvement in the submitted artefact and link these to the sections on the rubric. Upon receiving this feedback, students have been able to use the rubric to see how improvements to their research, content, argument, and organisation will increase their grade in future submissions.

Using QuickMarks

Joy and Mary also worked together to create sets of ‘QuickMarks’, a tool in Turnitin that allows markers to drag and drop comments on to students’ submissions and therefore reduces the need for online markers to type the same comments repeatedly.

Joy drafted comments for each level of study, which Mary converted into corresponding sets of QuickMarks. Markers were then able to drag and drop relevant comments from the set on to students’ submissions, amending or adding to the comment(s) as appropriate.

A pilot was conducted in two units with large numbers of markers on the UG programmes and trialled in a unit on a PG programme. SECS will be monitoring this process throughout 2017/18 and using feedback from students as well as staff to improve the way marking is conducted and feedback is given to our students.

UoP does the TESTA test! An introduction to the TESTA project

From January 2018 to September 2018, the University of Portsmouth will run the Transforming the Experience of Students through Assessment (TESTA) project. Initially involving 10 courses, the aim is to expand this pilot project to more courses and improve the quality of student learning by addressing programme-level assessment across the university.

What is TESTA?

TESTA, originally funded by the Higher Education Academy, is currently sustained by the University of Winchester and Southampton Solent University. TESTA aims to improve the quality of student learning through addressing programme-level assessment. Over 50 UK universities as well as universities in Australia, India and the USA have engaged with TESTA since its early project days (2009-2012). TESTA works with academics, students and managers – and for students, academics and managers – to identify study behaviour, generate assessment patterns to foster deeper learning across whole programmes, and promote assessment for learning.

Why TESTA?

Because:

  • There needs to be more consistency between modules, across programmes, and a greater emphasis on progressively developing students’ internalisation of programme-level standards, over time, rather than relying on documentation to specify criteria at the level of assignments or modules.
  • The programme view shifts perspectives: from figures/percentages and student experience surveys (e.g. NSS) to enhancement strategies; from ‘my’ unit to ‘our course’; from a teacher focus on module delivery to the student experience of the whole programme; and from individualistic modular design to coherent team design.
  • It engenders a team approach. The process enables the researcher to get to know the team and programme; it is a listening process and a valuing process. The team makes decisions based on data, knowledge and guidance.
  • It enhances curriculum design and pedagogy: (a) rebalancing formative and summative assessment, (b) making connections across modules, and (c) ensuring sequencing and progression of assessment across the programme. It also supports developing approaches to formative assessment, including more authentic assessment, and influences curriculum design (content load etc.).

What does TESTA involve?

The process involves mixed research methods in order to (a) explore various dimensions of the programme and (b) triangulate the data. The process for each course/programme includes a TESTA audit, an Assessment Experience Questionnaire, and student focus groups. It results in a programme case study report with a summary of findings, interpretations and recommendations, followed by an interactive workshop presenting this report.

What people say:

‘The value was to look at what we do from a scientific perspective and look at things objectively, and that is really enabling us to re-think how we do things. Because it’s driven by the project the staff are very willing and accepting of the data. I don’t think anybody, after we had the meeting with you guys, sat there and said “They’re talking absolute rubbish. What do they know?”’ (Programme Leader, Nursing)

‘I’ve found it useful to have a mirror held up, to give a real reflection. We talk about the “student voice”, but actually this has provided a mechanism, which isn’t part of the programme, which isn’t the evaluation’ (Programme Leader, Education)

‘TESTA has revealed some really interesting and, I believe, accurate information about our programme, approaches and student experience. The details of your report have enabled some really strong shifts. We would not have reached these conclusions otherwise and I feel that TESTA has had the desired effect of enabling us to think a little more progressively’ (Programme Leader, Dance)

‘Our very productive TESTA meeting has stimulated much discussion about how we can develop our modules to include more formative feedback and more engagement in large lectures. Some developments will be incorporated in the interim validation and others will influence our departmental policy on assessment and feedback for next year’ (Programme Leader, Psychology)

More information about TESTA and a variety of resources can be found at http://www.port.ac.uk/departments/services/dcqe and https://www.testa.ac.uk/

Any interested programme/course leaders can send an email to amy.barlow@port.ac.uk or melita.sidiropoulou@port.ac.uk

Image credits: Photo by Andrew Neel on Unsplash

Listening to the Student Voice | an Overview

The University of Portsmouth places the student experience at the centre of its philosophy and vision. The University’s vision as expressed in its education strategy 2012–2017 is: “To provide an excellent, inspiring and challenging educational experience underpinned by research, scholarship and professional and ethical practice, through which our students will be able to achieve personal, academic and career success”. Since the University strives to provide an excellent student experience, it creates and follows policies that promote ways in which such an experience can be facilitated. Such ways include teaching and other staff practices, support services, mechanisms that enable student participation in the shaping of University policies, student surveys, and other forms of feedback that allow the student voice to be heard.

In order to improve its standards, various teams are involved in undertaking research and conducting surveys. The Department for Curriculum and Quality Enhancement (DCQE) plays a major role in these activities. Other departments that are involved include the Academic Registry and the Graduate School. In addition to working with its people (staff and students), the University of Portsmouth often works closely with other institutions, the government, and bodies such as the Higher Education Statistics Agency (HESA) and the Higher Education Funding Council for England (HEFCE).

With both external and internal support and participation, the University of Portsmouth has conducted a number of student experience surveys over the last few years, including the:

  • annual National Student Survey (NSS);
  • biennial Postgraduate Research Experience Survey (PRES);
  • biennial University of Portsmouth Postgraduate Research Experience Survey (UPPRES);
  • biennial Postgraduate Taught Experience Survey (PTES);
  • International Student Barometer (ISB);
  • UK Engagement Survey (UKES);
  • MRes Postgraduate Research Experience Survey (MPRES);
  • JISC Student Digital Experience Tracker;
  • Unit Satisfaction Questionnaires (USQ); and the
  • University of Portsmouth Student Experience Survey (UPSES).

Furthermore, the University participates in various student experience projects, such as the Postgraduate Experience Project (PEP), and policy change projects that focus their efforts on the student experience, such as the Transform Project. These, among other surveys and projects, explore aspects of the student experience and educational excellence which revolve around the key areas of ‘teaching quality’, the ‘learning environment’, ‘student outcomes’, and ‘learning gain’ (as stated in the Teaching Excellence Framework). Overall, the University of Portsmouth promotes and achieves a student experience of a very high standard, which results in a number of desirable outcomes: it places the University high in the national rankings; encourages the pursuit and attainment of teaching and learning excellence; offers an equally rewarding experience to its staff; and contributes to the academic ethos that the University strives for.

The very existence of such a variety of student experience surveys and projects reflects the values that the University puts on a quality student experience – values that are upheld in the University’s policies. The high performance of the University – as presented in reports following these surveys and projects – as well as the subsequent action taken in response to such surveys demonstrate this. The University will continue to undertake research and conduct surveys in order to promote its values and strategies; provide first class educational opportunities to its students; improve its standards for and with society; develop the potential of its areas of strength; and gain a better understanding of areas in need of improvement.

Image credits: Photo by Japheth Mast on Unsplash

Turnitin – Multiple Markers

*We have currently had to disable this feature so that some standard functionality can work; we will look at reactivating it as soon as a more stable version is available.*

Turnitin, as we all know, allows students to submit their work electronically and get a ‘similarity report’ – a comparison of the submitted work against a vast database of existing papers and websites. Academics have access to the similarity reports, which can be a great help in cases where they suspect a student might have committed plagiarism. Turnitin, through features such as comment banks and drag-and-drop comments, also works well for marking work electronically.

While we have been using Turnitin at Portsmouth for many years, the interface has changed somewhat; it’s now called Feedback Studio.

Feedback Studio has a much cleaner interface than the classic version of Turnitin, and it now works in a mobile browser without needing to install the Turnitin app (which is only available on iPad).

The newest feature to become available is Multiple Markers, which is currently in beta. Multiple Markers helps with second marking: a marker’s initials are placed next to any comment or QuickMark that has been added to the document. As you can see from the image, there are three comments here: two from the first marker (with initials PQ; you can see the bubble comment and QuickMark added to the text) and one from the second marker (with initials TL; the initials are placed next to a bubble comment). Any plain text comments or strikethroughs are not initialled.

Multiple Markers is a great feature for academics who need the ability to share marking or do second marking, while students can quickly and easily see where different markers have annotated their work.
