Adventures in Technology Enhanced Learning @ UoP

Author: Melita Sidiropoulou

UoP does the TESTA test! An introduction to the TESTA project

From January 2018 to September 2018, the University of Portsmouth will run the Transforming the Experience of Students through Assessment (TESTA) project. The pilot initially involves 10 courses, and the aim is to expand it to more courses and to improve the quality of student learning by addressing programme-level assessment across the university.

What is TESTA?

TESTA, originally funded by the Higher Education Academy, is currently sustained by the University of Winchester and Southampton Solent University. TESTA aims to improve the quality of student learning through addressing programme-level assessment. Over 50 UK universities as well as universities in Australia, India and the USA have engaged with TESTA since its early project days (2009-2012). TESTA works with academics, students and managers – and for students, academics and managers – to identify study behaviour, generate assessment patterns to foster deeper learning across whole programmes, and promote assessment for learning.

Why TESTA?

Because:

  • There needs to be more consistency between modules and across programmes, and a greater emphasis on progressively developing students’ internalisation of programme-level standards over time, rather than relying on documentation to specify criteria at the level of individual assignments or modules.
  • The programme view shifts perspectives: from figures, percentages and student experience surveys (e.g. the NSS) to enhancement strategies; from ‘my’ unit to ‘our course’; from a teacher focus on module delivery to the student experience of the whole programme; and from individualistic modular design to coherent team design.
  • It engenders a team approach. The process enables the researcher to get to know the team and the programme; it is a listening and valuing process, and the team makes decisions based on data, knowledge and guidance.
  • It enhances curriculum design and pedagogy by (a) rebalancing formative and summative assessment, (b) making connections across modules, and (c) ensuring the sequencing and progression of assessment across the programme. It also supports the development of formative approaches, including more authentic assessment, and influences curriculum design (for example, content load).

What does TESTA involve?

The process uses mixed research methods in order to (a) explore various dimensions of the programme and (b) triangulate the data. For each course/programme it includes a TESTA audit, the Assessment Experience Questionnaire and student focus groups. The process results in a programme case study report with a summary of findings, interpretations and recommendations, followed by an interactive workshop presenting this report.

What people say:

‘The value was to look at what we do from a scientific perspective and look at things objectively, and that is really enabling us to re-think how we do things. Because it’s driven by the project the staff are very willing and accepting of the data. I don’t think anybody, after we had the meeting with you guys, sat there and said “They’re talking absolute rubbish. What do they know?”’ (Programme Leader, Nursing)

‘I’ve found it useful to have a mirror held up, to give a real reflection. We talk about the “student voice”, but actually this has provided a mechanism, which isn’t part of the programme, which isn’t the evaluation’ (Programme Leader, Education).

‘TESTA has revealed some really interesting and, I believe, accurate information about our programme/approaches/student experience. The details of your report have enabled some really strong shifts. We would not have reached these conclusions otherwise and I feel that TESTA has had the desired effect of enabling us to think a little more progressively.’ (Programme Leader, Dance)

‘Our very productive TESTA meeting has stimulated much discussion about how we can develop our modules to include more formative feedback and more engagement in large lectures. Some developments will be incorporated in the interim validation and others will influence our departmental policy on assessment and feedback for next year’ (Programme Leader, Psychology).

More information about TESTA and a variety of resources can be found at http://www.port.ac.uk/departments/services/dcqe and https://www.testa.ac.uk/

Any interested programme/course leaders can send an email to amy.barlow@port.ac.uk or melita.sidiropoulou@port.ac.uk

Image credits: Photo by Andrew Neel on Unsplash

BOS online research tool (available to all staff and students) | an overview

For its student and other surveys, the University of Portsmouth (UoP) uses a variety of tools and research platforms, including Bristol Online Surveys (BOS). In addition, further options are being investigated for future use across the university, such as the Qualtrics research platform, which is already used by some departments (e.g. the Department of Psychology). This article focuses on BOS, since it is already used by the university and is currently open to all UoP staff and students.

UoP holds a licence which allows its users to create an unlimited number of surveys for an unlimited number of respondents. BOS is an online survey tool designed for academic research, education and public sector organisations, and it is easy to use for creating online surveys. Run by JISC, BOS is used by over 300 organisations in the UK and internationally. BOS also allows multiple organisations to run the same survey simultaneously and form ‘Benchmarking Groups’ in order to answer common questions and run common surveys. UoP runs the following national surveys using BOS: the Postgraduate Research Experience Survey (PRES); the Postgraduate Taught Experience Survey (PTES); the JISC Digital Tracker; and the UK Engagement Survey.

BOS has a very comprehensive knowledge base at www.onlinesurveys.ac.uk/help-support/. A brief summary of the main survey functions, largely drawn from that website, is given below. The three main BOS functions are:

  • Creating a new survey
  • Distributing a survey
  • Analysing the survey data

Creating and designing a new survey

There are three ways to create a new survey:

  • Create a new survey from scratch.
  • Create a new survey by copying an existing survey.
  • Create a new survey by importing a survey structure.

1. Creating a new survey from scratch

To create your new survey:

  1. Click + Create new at the top left of the Dashboard.
  2. Enter a name for your new survey (you can change it later).
  3. Click Create survey.

This will take you straight to the Survey Builder where you can start adding pages to your survey.

2. Creating a new survey by copying an existing survey

To copy a survey:

  1. On your Dashboard, find the survey that you want to copy.
  2. Click on the Copy survey icon. This is found at the far right of the Dashboard.
  3. Enter a new survey name (you can change it later).
  4. Click Copy survey.

The new draft survey will appear at the top of your survey list (make sure that you have selected the DRAFT tick box at the top right of the Dashboard).

To share your survey with another user:

  1. Check that the person that you want to share the survey with has an active BOS user account.
    • You can only share a survey with another BOS user. If the person you want to share the survey with does not have a BOS user account, they will need to request one from the BOS account administrator at their institution.
  2. On your Dashboard, find the survey that you want to share and click on the View/Edit survey permissions icon (or, from the Design tab of your survey, click on Survey permissions in the left-hand menu).
  3. Any users who already have access to the survey are listed in the Survey permissions table, alongside their permission settings. To see your own permissions, click on + Show me at the top of the first column.
  4. Enter a user’s email address in the search box at the top of the table, and click Add user. (Note that the user has to be registered with this email address in BOS.) The user will be added to the table.
  5. Tick the relevant permission(s) and click Save.

Note: Survey access control settings and survey permissions are not copied along with the survey. You may need to set these up again.

Designing a new survey

To add a new question:

  1. Decide where you want to place your question. Adding a question into a blue area of the survey builder will add a new, independent question. Adding a question within another question (inside the brown box surrounding an existing question) will create a sub-question. Sub-questions are useful for following up a question to gather additional information and can be set up to be optional or mandatory depending on the respondent’s first answer.
  2. Click Add item. This will bring up a list of items that you can add to your survey.
  3. Select the type of question that you want to insert. The question editor will open.
  4. Type in the question text and format it using the tool bar, if required.
  5. Add links, images or embedded media to the question text, if required.
  6. Depending on the question type, you will also be able to add answer options and advanced options below the question text.
  7. Click Add question.

Your question will appear inside a box on the main survey builder page. Here you can:

  • Make changes to it by clicking on the Edit question icon.
  • Move, copy or delete it by using the Question actions icon.
  • Preview it by clicking on the Preview icon at the top of the page that the question appears on.

As an example of the question options available, a multiple-line free-text question can be configured in several ways. Furthermore, there is the option to convert a question into a different question type.

Distributing a survey

The Distribute tab gives you a variety of options covering: piloting your survey; launching your survey; distributing your survey URL; and survey access control.

Piloting your survey

The best way to check a survey before its official launch is to pilot a full version of it. This ‘dry run’ of your survey allows you to test all of its features, including data capture and reporting. It also means reviewers can test your survey without needing access to a BOS account.

Launching your survey

Before you launch a survey, it is important to make sure that it works properly. The best way to check this is to pilot your survey thoroughly; simple surveys should at least be proofread and tested using the survey preview. Certain things cannot be edited once you have launched your survey, so please ensure that you have checked it thoroughly before launching.

The survey preview allows you to see what your survey will look like, navigate through the survey like a respondent and answer questions without any data being saved. The survey preview also offers the option to print your survey or to save it as a PDF. You can access the survey preview at any point while creating or running your survey.

A variety of distribution options is available and the distribution settings offer flexibility.

Distributing your survey

Once your survey has been launched, you need to distribute the survey URL to your respondents so that they know the survey is open and how to access it. One common route is to email the link directly to respondents, as sketched below.
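Since distribution is based on sharing the survey URL, a straightforward approach is to email the link to a list of respondents. The short Python sketch below is purely illustrative and is not part of BOS: it assumes a hypothetical file of respondent addresses (respondents.txt), a placeholder survey URL and sender address, and an SMTP server reachable on localhost; adapt the details to your own setup.

```python
# Illustrative sketch only: email a survey URL to a list of respondents.
# Assumptions (not part of BOS): addresses are stored one per line in
# "respondents.txt", and an SMTP server is reachable at localhost:25.
import smtplib
from email.message import EmailMessage

SURVEY_URL = "https://yourinstitution.onlinesurveys.ac.uk/your-survey"  # placeholder

with open("respondents.txt") as handle:
    recipients = [line.strip() for line in handle if line.strip()]

with smtplib.SMTP("localhost", 25) as smtp:
    for address in recipients:
        message = EmailMessage()
        message["Subject"] = "Invitation to complete our survey"
        message["From"] = "surveys@example.ac.uk"  # placeholder sender address
        message["To"] = address
        message.set_content(
            "The survey is now open and will take a few minutes to complete.\n"
            f"You can access it here: {SURVEY_URL}\n"
        )
        smtp.send_message(message)
```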

Analysing a survey

When it is time to analyse your survey, you can do the following:

  • Access the survey responses
  • Filter the responses, and browse or exclude individual responses
  • Export the response data (see the sketch below)
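Once the responses have been exported, they can be analysed in any statistics package or programming language. The following Python sketch (using the pandas library) is illustrative only: it assumes a hypothetical CSV export named responses.csv containing a closed Likert-style column and a free-text column, and the column names are made up for the example.

```python
# Minimal sketch of summarising exported survey responses.
# Assumptions (illustrative, not from BOS): the export is a CSV file called
# "responses.csv" with a closed question column "Q1. Overall satisfaction"
# and a free-text column "Q2. Comments".
import pandas as pd

responses = pd.read_csv("responses.csv")

# Total number of responses received.
print(f"Total responses: {len(responses)}")

# Frequency counts for a closed (Likert-style) question.
print(responses["Q1. Overall satisfaction"].value_counts())

# Collect non-empty free-text comments for later qualitative coding.
comments = responses["Q2. Comments"].dropna()
comments.to_csv("comments_for_coding.csv", index=False)
```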

Final Remark

Any member of staff or student can request access to the BOS online survey tool by emailing studentsurveys@port.ac.uk. More information about BOS can be found at https://www.onlinesurveys.ac.uk/about/ and help articles are available at https://www.onlinesurveys.ac.uk/help-support/. BOS is not the only research tool the University of Portsmouth uses, and further tools are being investigated for future use. Overall, BOS is a useful tool for both qualitative and quantitative surveys.

Image credits: Photo by William Iven on Unsplash

Listening to the Student Voice | an Overview

The University of Portsmouth places the student experience at the centre of its philosophy and vision. The University’s vision as expressed in its education strategy 2012–2017 is: “To provide an excellent, inspiring and challenging educational experience underpinned by research, scholarship and professional and ethical practice, through which our students will be able to achieve personal, academic and career success”. Since the University strives to provide an excellent student experience, it creates and follows policies that promote ways in which such an experience can be facilitated. Such ways include teaching and other staff practices, support services, mechanisms that enable student participation in the shaping of University policies, student surveys, and other forms of feedback that allow the student voice to be heard.

In order to improve its standards, the University involves various teams in undertaking research and conducting surveys. The Department for Curriculum and Quality Enhancement (DCQE) plays a major role in these activities; other departments involved include Academic Registry and the Graduate School. In addition to working with its own staff and students, the University of Portsmouth often works closely with other institutions, the government, and bodies such as the Higher Education Statistics Agency (HESA) and the Higher Education Funding Council for England (HEFCE).

With both external and internal support and participation, the University of Portsmouth has conducted a number of student experience surveys over the last few years, including the:

  • annual National Student Survey (NSS);
  • biennial Postgraduate Research Experience Survey (PRES);
  • biennial University of Portsmouth Postgraduate Research Experience Survey (UPPRES);
  • biennial Postgraduate Taught Experience Survey (PTES);
  • International Student Barometer (ISB);
  • UK Engagement Survey (UKES);
  • MRes Postgraduate Research Experience Survey (MPRES);
  • JISC Student Digital Experience Tracker;
  • Unit Satisfaction Questionnaires (USQ); and the
  • University of Portsmouth Student Experience Survey (UPSES).

Furthermore, the University participates in various student experience projects, such as the Postgraduate Experience Project (PEP), and in policy change projects that focus their efforts on the student experience, such as the Transform Project. These surveys and projects, among others, explore aspects of the student experience and educational excellence that revolve around the key areas of ‘teaching quality’, the ‘learning environment’, ‘student outcomes’ and ‘learning gain’ (as stated in the Teaching Excellence Framework). Overall, the University of Portsmouth promotes and achieves a student experience of a very high standard, which results in a number of desirable outcomes: it places the University high in the national rankings; encourages the pursuit and attainment of teaching and learning excellence; offers an equally rewarding experience to staff; and contributes to the academic ethos that the University strives for.

The very existence of such a variety of student experience surveys and projects reflects the values that the University puts on a quality student experience – values that are upheld in the University’s policies. The high performance of the University – as presented in reports following these surveys and projects – as well as the subsequent action taken in response to such surveys demonstrate this. The University will continue to undertake research and conduct surveys in order to promote its values and strategies; provide first class educational opportunities to its students; improve its standards for and with society; develop the potential of its areas of strength; and gain a better understanding of areas in need of improvement.

Image credits: Photo by Japheth Mast on Unsplash
