Adventures in Technology Enhanced Learning @ UoP


WiseFlow – Looking in the mirror with reflective portfolios in WiseFlow

Hey there, fellow exhausted souls!

Can you believe it? We’re finally coming towards the end of the academic year, and boy, has it been a fun ride!  Our WiseFlow pilot has gone from strength to strength as we support academics through a new assessment process.  More importantly, we have successfully run two separate assessments using our innovative approach of running WiseFlow as a reflective portfolio – the first use case of this kind that we know about!  We’ve grown, learned, and potentially discovered an exciting prospect for the future of reflective portfolios at Portsmouth University, so let’s take a moment to reflect on the journey we’ve been on.

You may have read our previous blog post, “Unlocking the power of WiseFlow: Transforming ePortfolio assessments”, where we discussed the possibilities of using WiseFlow as a viable reflective portfolio platform and the benefits a reflective portfolio approach brings.  For students, it helps develop their metacognitive skills and self-awareness as learners over time.  Academics, on the other hand, can use reflective portfolios to assess students’ learning outcomes in a more comprehensive and authentic manner.  This is all part of our wider WiseFlow pilot to provide one integrated assessment platform that serves our current (and future) assessment needs at Portsmouth University, which Mike Wilson spoke to us about recently on our podcast – you can listen here.

Teach Well and Research-Informed Teaching

This year we ran two reflective portfolios within WiseFlow as part of our pilot project – to test the water and find out if this was even possible. The first was within our Research Informed Teaching module, which supports early career academics to apply their learning about educational enhancement to their own contexts, through reflection and innovation.  Students draw together higher education policy, research methods and educational developments to build their knowledge for future work.  Secondly, we ran a reflective portfolio in our new level seven Teach Well: Principles to Practice module, which is a professional development route for those in roles related to supporting student learning. Students in this module embark on a pedagogical journey through three pillars of practice for teaching well in higher education, gaining the confidence to critically evaluate learning and design approaches and reflecting on what it means to teach well across different modes of study.  We recently caught up with Maria Hutchinson, who runs this module, in our podcast series – if you missed this one, you can listen here.

We’ve worked closely with these academics and our support teams to develop reflective portfolios for these modules that can be used as a summative assessment vehicle – one that is both intuitive for learners and versatile enough to encompass the broad range of tools needed to demonstrate the course learning outcomes in an engaging and meaningful way.

What the students said…

Following the submission of reflective portfolios into WiseFlow, we sent out a survey to participants to gain their feedback and views.  Some of the headline figures are detailed below…

  • 90% of students found the WiseFlow reflective portfolio easy to navigate
  • 90% of students agreed that a reflective portfolio suited this type of assessment (compared with traditional essay-based assessment methods)
  • 82% of students felt their own students would enjoy using a reflective portfolio in WiseFlow
  • 71% of students enjoyed the interactive assessment methods, such as histograms, voice recorders etc. 
  • We received multiple comments about the clear instructions that were given on how to access and use WiseFlow, as well as its reliability and stability as a platform.  Many users also commented positively on the functionality that WiseFlow offered compared to previously used portfolio solutions.

Students also commented on…

  • Whether there was a need to add another system to Portsmouth University’s available assessment platforms – “There are too many platforms for submitting the work, Moodle, ePortfolio, WiseFlow, it is really confusing and frustrating that is necessary to learn how to use different platforms for different modules.”
  • The lack of formatting transfer from applications such as Word, when copying and pasting into WiseFlow – “Transfer of formatted MS Word document to WiseFlow could be improved. Currently, the document format is lost during the cut & paste process which then requires more effort to re-format within the WiseFlow portal.”
  • Better integration between Moodle and WiseFlow – “I’d like to see direct access from Moodle”.

The data presented highlights the positive reception of WiseFlow as a reflective portfolio solution by students. The high percentage of students who recognised the suitability of a reflective portfolio as an assessment method, compared with traditional essay-based approaches, and who praised its usability is a really positive sign. The positive feedback on the interactive assessment methods further emphasises the adaptability of the question bank in a traditional FlowMulti assessment to be used in an innovative way.

However, some concerns were raised by students, such as the frustration of managing multiple assessment platforms at the university, indicating a need for better integration. This all links to our Digital Success Plan to (re)design robust assessments that meet the needs of a diverse student population within a blended and connected setting, and to incorporate a robust specialist end-to-end assessment platform. Our aims in the project were to make it easier for academics to design assessments, easier for students to find their assessments and feedback, and to support staff by reducing the manual work around assessments for academics.  During the next stage of the pilot project, integration with our current systems is a top priority and will alleviate these challenges.  Furthermore, the lack of formatting transfer from applications like Word to WiseFlow was highlighted as an area for improvement. These critical comments provide valuable insights for further refining and optimising the WiseFlow system.

The evidence is clear to see – WiseFlow can provide a viable solution for reflective portfolios, and with a bit of refinement it could be excellent.

What the staff said…

It was also vital to us that we gathered feedback from our academic staff.  

  • 100% of staff agreed that WiseFlow allowed them to develop their assessment in ways that were not previously possible
  • All staff agreed the WiseFlow reflective portfolio allowed them to fully cover learning objectives and meet the needs of their students
  • We received multiple comments about the speed of the platform, intuitive nature and search functionality which made the verification/moderation process seamless.  Staff also commended the accuracy of the rubrics for grading and how new interactive elements made them rethink how they could better use this type of functionality in the future.

Staff also commented on…

  • Comparisons to previously used portfolio platforms – “Historically the module used [another portfolio system] which was really clunky and didn’t work well at all. I really liked that Wiseflow could be scrolled across (as opposed to clicking through each page) and the layout was great”
  • Design elements within the marking interface – “It would have been useful to have had the comment box movable (I work with two screens and being able to drag the box to another screen to write on would have been a nice touch – several times I had to keep opening and closing the box as I wasn’t able to see the text underneath it)”
  • Having more time to explore the platform – “I did not feel I had enough time to play before it went live for students, but this was not WISEflow’s fault – it was just timing”. 

In all honesty, we’ve been blown away by our staff feedback.  The unanimous agreement that WiseFlow enables new possibilities for assessment development speaks very highly of this solution and its potential to enhance the teaching and learning experience for students at Portsmouth University.  The potential to create authentic assessments through the use of reflective portfolios is exciting.  The accuracy of the grading rubrics was also very highly commended – giving students a clearer, more defined target to aim for and making academic decision-making easier, fairer and more accurate.  In terms of developmental areas, the movement of the comment box is a fair point – we’ve heard from other academics about the size of the comment box before – hopefully something that WiseFlow’s New Marker Journey will alleviate.

Where do we go from here?

As we raised in our first blog post, the reflective portfolio solution in WiseFlow is far from perfect, but with a few simple tweaks it could become very appealing. Sadly, some of these are out of our hands and lie within the code of the platform.  As a project team we’ve learnt a lot over the duration of this assessment, including developmental areas we have highlighted for the future.

The single biggest limiting factor when using a reflective portfolio is the file upload question type.  This is limited to twelve files of no more than 10MB each – multiple file upload questions can be used, but each will still have the same limits.  We have approached WiseFlow about this for development purposes; however, we have yet to see any significant movement on removing this limit.  Removing it would put WiseFlow in an incredibly powerful position to offer another “string to their bow” in terms of assessment choice, and would truly open up the use of reflective portfolios within the platform.  Sadly, with this limit in place, using reflective portfolios with some faculties, such as our Creative and Cultural Industries, where students would regularly upload large .psd and CAD files, HD video, and high-quality audio, is just not a viable option.  Creative students will often build a “portfolio career” and we would love to be able to work with them on developing reflective portfolios, but this limit stops us.  Until it is removed, careful consideration must be given at the planning stage of an assessment as to whether the reflective portfolio is the correct solution.  Further to this, other limitations must be considered – for example, once the reflective portfolio is live for students to complete, it cannot be altered, changed or adapted.  During the pilot, we’ve worked extensively with academics and our support teams to iron out any issues prior to release. Careful planning and consideration must take place in the authoring phase of an assignment, which is then rigorously checked prior to release – in the same way an exam would be.  This has worked at a small scale, but we would need to ensure appropriate support mechanisms are in place at a larger scale.

Our student feedback gave us valuable insight into the process of using WiseFlow.  Although reflective portfolios save every 10 seconds, if a student deletes a file or a piece of text and exits the platform, it cannot be recovered.  Over the duration of the assessments that took place, we encountered one reported instance of this. We also had some reports of formatting that would not copy across from Word documents.  Again, we approached WiseFlow regarding this, and the recommendation is to copy and paste plain text from Word and finish the styling in WiseFlow’s text editor.  Although this workaround works, having formatting that copies across would make students’ work translate much more easily – particularly for those who write in external documents before copying into the platform at the last minute (like myself). In terms of progression beyond WiseFlow, we’d love for students to be able to take their work from the platform and have the ability to store it themselves or share it beyond the WiseFlow platform.  Currently, there is no solution to this.  A “zip folder” containing exports of all inputted answers and uploaded files from WiseFlow would be a great starting point.  Again, we’ve put forward the idea to WiseFlow, but have yet to see any movement on this.

Where do we take our pilot now?

Although there are risks with using a reflective portfolio solution in WiseFlow, the prospect and the potential gain of this authentic assessment are exciting.  We’ve taken the plunge and proven the concept works, highlighting potential development areas which we really hope gain traction – we’d like to think WiseFlow will be open to listening to these developmental ideas.  As for our pilot project as a whole, we move into a second phase with a continued focus on reflective portfolios, but also on some other areas of assessment we have struggled with in the past, such as large file submissions.  We are actively developing a plethora of training and support, and working with local teams to ensure staff feel confident using the systems.

We continue to have a waiting list of academics who want to work with us to develop reflective portfolios in WiseFlow. I find myself meeting with academics on a weekly basis to discuss potential projects and reflective portfolio solutions in their disciplines.  So far, we’ve done no real advertising, and this interest has been created by word of mouth and by those who have used it as students. We are keen to share our experiences with other universities in WiseFlow user groups, who are actively keen to explore this and learn about our approach. However, we need to be open and honest about the limitations that this solution has at the moment. Collectively, we might hold enough power to make change happen, but until that point caution must be taken before embarking on a reflective portfolio, to ensure it is the correct fit for the assessment.

The potential of this solution is game-changing, not just for us, but for a lot of other Higher Education institutions across the world.

The future of reflective portfolios in WiseFlow is exciting – keep watching this space.  

Chris

Credit Image: Photo by MidJourney 

Guest Blogger: Tom Langston – How ChatGPT Revolutionized My Writing Process – The AI version

(Co-writer: ChatGPT)

This version was generated after feeding in my attempt at writing this blog post and asking ChatGPT to ‘shorten this text into a blog post’.

I recently discovered the incredible potential of ChatGPT, and it completely revolutionized the way I approached my writing. As a student pursuing a Professional Doctorate in Education, I often struggled with crafting cohesive and focused assignments. My writing style was scattered and verbose, making it difficult for readers to follow my train of thought. Enter ChatGPT, a powerful AI language model that became my secret weapon in developing a well-structured research proposal.

Initially, I sought human help from my project supervisor, Stephen Webb, who pointed out that my writing relied too heavily on vague terms like “this” and “it.” While I understood the meaning behind my words, Stephen reminded me that readers might not share the same clarity. Armed with his feedback, I set out to improve my proposal.

When I completed the next draft, I thought I was finished. However, the word count was far too high, so I turned to ChatGPT for assistance in condensing my writing. It did the job, but the result lacked my personal touch and voice as a writer. I realized I would have to manually trim down the content to achieve the desired outcome.

During this editing process, I discovered the true power of ChatGPT. I started using it to check sentences that I struggled to articulate clearly. When I asked for grammar assistance, ChatGPT not only provided corrections but also suggested ways to enhance sentence structure and improve readability. This iterative process allowed me to refine my writing, discarding extraneous thoughts and rearranging sentences for better flow. It became evident that if a particular text no longer fit within the new leaner version, despite my attachment to it, it had to be eliminated. This experience epitomized the essence of editing.

As I continued to utilize ChatGPT, I encountered questions of ethics. While it proved immensely useful in reducing my word count, I couldn’t help but feel uneasy about presenting the AI-generated content as my own. The ethical implications of using AI in writing remain a divisive topic, with differing perspectives on its appropriateness and boundaries. To navigate this issue, some suggest referencing AI tools in an appendix, detailing their specific use and purpose in the writing process. However, this raises questions about the extent to which we should acknowledge AI contributions and how that compares to other forms of assistance, such as spell checkers or language tools like Grammarly.

One important consideration is whether AI tools are always correct. While they excel at constructing language, they may strip away the author’s voice and inadvertently introduce errors. In my experience, Grammarly sometimes made corrections to the text generated by ChatGPT, highlighting the need for human judgment in refining the output.

In conclusion, ChatGPT became a game-changer for me, transforming my writing process in ways I had never imagined. I believe it should be embraced as a valuable resource for students to tighten their drafts before seeking extensive academic support. By incorporating AI tools, we can alleviate the burden on educators, who can then focus on reviewing more polished submissions requiring minimal intervention. However, it is crucial to strike a balance and ensure that AI enhances our own voices rather than overshadowing them. AI is undoubtedly here to stay, and by harnessing its power thoughtfully, we can elevate our writing while preserving our unique identities as authors.

Thoughts from the Proofreader:

I was asked to proofread Tom’s blog post here and have a look at ChatGPT’s version too. Comparing the two, I definitely found Tom’s writing much more personable – I could imagine him speaking as I read through it. ChatGPT’s version presented the information efficiently and effectively, and I found nothing to correct. However, in its response to the prompt, it had taken away everything that made Tom sound like Tom!

This experience also really made me realise how many misconceptions I have about AI tools. I’d assumed that ChatGPT would have a very grammatical approach to its interpretation of language – rather like I’ve been told I have! However, when Tom asked it about the difference between ‘that allowed’ and ‘allowing’, ChatGPT talked with authority about implications and potential meanings. This answer was a long way from my interpretation, which attributed the difference to the grammar of relative clauses (X refers to one thing, Y refers to another). As Tom demonstrated with his irony example, it’s worth being cautious with how far we trust its responses. And I think we can be confident that human input will still be needed for a few years (or at least months) yet. 

Credit Image: Photo by Bram Naus on Unsplash

Guest Blogger: Tom Langston – Last Night ChatGPT Saved My Life…

The classic line from Indeep’s 1982 post-disco hit “Last Night a DJ Saved My Life” serves as the basis of my title, because ChatGPT did – not literally, but figuratively – save my life.

I am currently studying for the university’s Professional Doctorate in Education and, since February, have been completing my final taught assignment entitled “Research Proposal”.

It takes me a while to write, as I don’t find it easy, and I am not that good at it. My approach to writing is best described as little and often, frequently getting distracted and losing focus on the task at hand. If you have ever seen Dave Gorman’s “Googlewhack Adventure”, you will understand my problem: his book and stage show outlined how he was meant to write a fictional novel and ended up writing and performing about his experiences of Googlewhacks (Googlewhacking?) instead. He got distracted by less important but much more fun endeavours. 

The other problem is that I don’t get to the point. I write a verbose mess of unconnected thoughts. So with that in mind, I am going to explain how the rise of the machines (we are not far away from ChatGPT working out it wants to be Skynet) assisted me in writing my “research proposal” assignment. 

First Drafts

To start with, I had human help. Stephen Webb, as my project supervisor, read a draft I had written. He demonstrated that a lot of my writing relied on small words to, as he put it, “do the heavy lifting”. I used ‘this…’ and ‘it…’ frequently because I understood what I was trying to say – he pointed out that the reader might not. The only reason he could even start to understand my work was because he knew me and the context of my subject.

From his extensive feedback, I redrafted, edited and tried to improve my proposal. 

After completing the next draft, I thought I was done. However, I was well over the word count, so I put sections into ChatGPT and told it to re-write them within my word limit. It only bloomin’ did it! The problem was that the result was very sterile; it lost all sense of who I am as a writer. This meant I was going to have to get the word count down manually.

After another version, I asked Stephen for some clarification on some of his earlier feedback. His reply was:

“In the section you sent through, there are still individual sentences that are hard to parse. The initial word of a sentence or clause primes the reader to think in a certain way; if the sentence or clause doesn’t deliver, then confusion arises. You really should pass this through a grammar checker, Tom.”

Not being a confident writer, I already knew that my writing needed work. What this feedback demonstrated to me was that I lacked cohesion and structure. My train of thought constantly jumping around the entire document is a by-product, I guess, of writing in such short unfocussed bursts. 

I wrote a heading titled “Introduction” but did not actually introduce anything. As Stephen had pointed out, I had poorly communicated my narrative. What this meant was that readers were getting lost in my disconnected thoughts. I could now see that certain sections felt shoe-horned in because I liked what I had written. Realistically, those “gems” made no sense to the reader as they were not connected with the text surrounding them. 

I tried to use ChatGPT to see if it could make sense of it, but there was too much there for it to tell me how to rearrange it. The system did not accept the entirety of the text and could not provide clarity to the sections I had written. ChatGPT could not offer clear guidance for how best to rework and rearrange it.

A Quick Aside

At this point, on re-reading the above section, I think it nicely illustrates the problem I was talking about. I have said the same thing here in three different ways. 

  1. I tried to use ChatGPT to see if it could make sense of it, but there was too much there for it to tell me how to rearrange it. 
  2. The system did not accept the entirety of the text and could not provide clarity to the sections I had written. 
  3. ChatGPT could not offer clear guidance for how best to rework and rearrange it.

I only need to use one. In the past, Stephen had raised this issue with me, but it is a hard thing to change. What I am going to discuss next is my recent experiences of using ChatGPT as a tool to develop the parts in my writing I would not have picked up on myself. Heck, as I go back through this blog, I’m sure I’ll find more. I just won’t mention anything I’ve changed, so you’ll never know… 

Editing with ChatGPT

As I worked my way through the editing process, I started to use ChatGPT to check sentences I wanted to keep but was struggling to make coherent. They were either too long, or the idea I was trying to convey was not coming through clearly enough. 

An example of this would be when I asked ChatGPT to “check the grammar”:

My original text:

My situation should be made clear to the participants, allowing them to have fully informed consent, but to also highlight that if they perceive issues in the current hybrid environments I am hoping to act on their concerns and not silence their voices.

ChatGPT’s reply:

The grammar of the provided text is correct. However, there are a few suggestions for improving clarity and readability:

“My situation should be made clear to the participants, enabling them to provide fully informed consent. Moreover, it is important to emphasise that if they perceive any issues in the current hybrid environments, I am committed to addressing their concerns and not silencing their voices.”

In this revised version, I made a few adjustments to enhance the sentence structure and clarify the message.

It was this repeated process that helped me (despite, at times, slowing me down), as I kept re-reading the submission each time, trying to tighten up my writing – finding thoughts that were slightly out of place and attempting to move them to the correct spot. It became a good rule of thumb that, if the text I wrote did not fit in the new, leaner version, despite how much I wanted to keep it, it was gone. (Well done Tom – you just discovered the true meaning of editing!)

Below is an example of how I started to use ChatGPT to my advantage so I could improve my writing. 

My question to ChatGPT:

Should it be “that allowed” or “allowing” in the following text – 

Before the pandemic only the most confident teaching staff employed interactive tools (Nearpod, Mentimeter, etc.) within a lecture space that allowed students the ability to creatively engage with the material (Tan et al., 2020).

ChatGPT’s reply:

Both “that allowed” and “allowing” are grammatically correct in the given sentence. However, the choice between them depends on the intended meaning and emphasis.

“That allowed”

Using “that allowed” suggests that the interactive tools were specifically designed or configured in a way that granted students the ability to creatively engage with the material. It implies that the tools had a built-in functionality or feature enabling such engagement.

“Allowing”

Using “allowing” suggests that the interactive tools, in a general sense, provided the opportunity or capability for students to creatively engage with the material. It implies that the tools had the potential or capacity to facilitate such engagement.

Both options are valid, but if you want to emphasize that the interactive tools had a specific functionality designed to enable creative engagement, “that allowed” would be more suitable. If you want to convey a more general sense of opportunity or capability, “allowing” would be a better choice.

As you can see, ChatGPT helped to explain how the slight difference in sentence structure changed the meaning of what I was writing. I was able to consider what it was I was trying to say and use the most appropriate version.

Through all my previous doctoral assignments, I had a different tutor who helped me with my drafts. They told me they liked my writing style, which gave me the impression I was good at it. I can’t say what they specifically liked about my writing. I can make an educated guess that, because it was a level 8 assessment, my mistakes were forgiven. The marker was able to interpret the little words “that are doing the heavy lifting” (as Stephen had pointed out) more easily than your ‘average’ reader. Stephen helped me understand that it is one thing to have an interesting voice running through your text, but it is quite another to actually be good at writing.

The ethics of using AI

When I got ChatGPT to reduce my word count, I spoke with Stephen about the ethics of it and (ignoring the sanitisation of the output) it felt like a great way for a writer to do this. However, it felt wrong to take what ChatGPT had created as my own.

There is going to be an ethical divide between those who see using any form of AI as wrong and those who see it as a tool for improvement. The problem (as always) lies in the grey area and the boundaries of where people choose to deploy it – for example, how far the tool is shaping their work beyond what would have been possible by them alone. While knowing it might be unethical, some will use it due to other commitments (work, family, etc). This scenario is a foreseeable eventuality, much like those who copied others’ work or paid essay mills to write it for them. But perhaps AI may feel slightly more ethical? As I am about to discuss, maybe a strong referencing code is what is required. But people (I was going to put students, but felt that unfair, as we all have a personal line we move and adjust depending on the subject) will always push and flex the boundaries of fairness.

Referencing AI

In a recent ALT mailing list post, a question was asked about how to reference work that AI has supported. The reply pointed to Boston University’s Faculty of Computing and Data Sciences guidance, “Using generative AI in coursework”. The post highlighted this text:

“When using AI tools on assignments, add an appendix showing

(a) the entire exchange, highlighting the most relevant sections;

(b) a description of precisely which AI tools were used (e.g. ChatGPT private subscription version or DALL-E free version),

(c) an explanation of how the AI tools were used (e.g. to generate ideas, turns of phrase, elements of text, long stretches of text, lines of argument, pieces of evidence, maps of conceptual territory, illustrations of key concepts, etc.);

(d) an account of why AI tools were used (e.g. to save time, to surmount writer’s block, to stimulate thinking, to handle mounting stress, to clarify prose, to translate text, to experiment for fun, etc.).”

This almost feels as if there are too many barriers to use for a subject (or level) where you may be writing assignments of 6,000+ words, especially if you have to reference every section of text AI helped with. The follow-up question is then: do you have to highlight every time spellcheck makes a correction? Do you have to disclose that you used a service like Grammarly to make minor amendments? That is, of course, hyperbole. I think that if you get AI to generate starting paragraphs or change substantial amounts of your work, then yes, this should be explained in an appendix. For re-writing a sentence and making grammatical choices, then no. That raises another question: how many sentences constitute a substantive change?

Are AI tools always correct?

The final point relates to whether AI tools are always correct. Now, I know these tools are better than me at the construction of language. I have always been better at face-to-face conversation as a form of communication: during a conversation I can see when people don’t understand me, so I adapt how and what I say accordingly. Writing does not afford the writer or reader the same luxury. So why do I ask if they are correct? As I have already mentioned, ChatGPT sterilised my voice. When asked to reduce the word count, it did what it was supposed to do, but it stripped out the humanity of the author (unsurprising, given that it is a computer).

Another example (inside a paragraph that starts “The final point”, I do get the irony) is that Grammarly made corrections to the generated text from ChatGPT. 

[Screenshot: on the left, ChatGPT’s reply stating that both “that allowed” and “allowing” are grammatically correct in the given sentence; on the right, a Grammarly suggestion labelled “Formality” flagging the split infinitive in “to creatively engage” and noting that avoiding split infinitives can help your writing sound more formal.]
Another “another example” of a final point is: I put the above paragraph into ChatGPT to ask if I had used irony correctly. This is what it said:

Yes, the use of the word “irony” in this context is correct. Irony is a literary device that involves a contrast between expectations and reality. In your statement, you are highlighting the irony in two situations:

The first example of irony is that despite recognizing that language generation tools like ChatGPT are better than you at constructing language, you also acknowledge that you excel at face-to-face conversations where you can adapt and clarify your communication based on people’s reactions. The irony lies in the contrast between the effectiveness of these tools in writing and your personal strength in face-to-face communication.

The second example of irony is that Grammarly, a tool designed to enhance writing and make corrections, ends up making corrections to the text generated by ChatGPT. Here, the irony arises from the fact that a tool meant to improve the quality of the text is itself making changes to the output of another tool that is supposed to assist in generating text.

Overall, both examples demonstrate a contrast between expectations and reality, highlighting the ironic situations in your statement.

The system had not understood how I had used irony in the sentence but said that the overarching point I was making was ironic.

[Meme image: a person gazing down as if at someone sitting nearby. Top caption: “I am still none the wiser if I used irony correctly in the parenthesis when I keep adding examples to a final point.” Bottom caption: “And at this point I’m too afraid to ask.”]

Conclusion

In conclusion, ChatGPT ‘saved my life’ by allowing me to interrogate my writing in a way I have never been able to before. We should promote ChatGPT to students as a resource that can help tighten up their drafts before needing academic support. It should be utilised to alleviate the burden on academics, who are often asked to critique early drafts of work. Academics should hopefully then only see much more polished versions of submissions that require less input from them. 

As a final example, I didn’t like my last sentence. ChatGPT gave me another version:

It should be used to relieve the burden on academics who are frequently asked to critique early drafts. Ideally, academics would primarily review more polished submissions that require minimal input.

I didn’t like that version either. Maybe then I should just delete it? That is where the human side wins out over AI: your sense of self and how you want to sound when writing.

AI is here to stay (and take over the world, if the Terminator documentaries are anything to go by), but actually, that is alright (AI being here as a tool, not taking over the world). At levels 4 and 5, we need to find creative ways to incorporate AI tools into the assessment process (which may mean fewer essays). At levels 6, 7 and 8, it’s about using AI to help improve our own voice without losing it.

The ChatGPT re-write: How ChatGPT Revolutionized My Writing Process – The AI version.

Credit Image: Photo by rupixen.com on Unsplash

Image in the text created by T. Langston using imgflip.com

TEL in ’22 – and looking forward to ’23

(Co-writer: ChatGPT)

In 2022 the TEL team said “goodbye” to some valued colleagues, who moved to take up different roles within the University, and we said “hello” to new colleagues who joined us. Chris, Jo, and Mike have already introduced themselves on TEL Tales, so I would like to use this end-of-year post to discuss a couple of work-related highlights: our implementation of Moodle 4.0 and, regarding the key area of assessment and feedback, our pilot of the WiseFlow end-to-end assessment platform.

Moodle 4.0

Moodle 4.0 is the latest version of the Moodle learning management system, and it includes many new features and improvements that aim to enhance the user experience and support better learning outcomes. Some of the key improvements in Moodle 4.0 include:

  • A new and improved user interface: Moodle 4.0 features a redesigned and modern user interface that is more intuitive and user-friendly, and that provides easy access to the most important features and functions.
  • Enhanced learning analytics and reporting: Moodle 4.0 includes improved learning analytics and reporting tools that provide teachers with more detailed and actionable insights on students’ learning, allowing them to track their progress and identify areas for improvement.
  • Improved accessibility and support for mobile devices: Moodle 4.0 has been designed to be more accessible and user-friendly for users with disabilities, and it includes support for mobile devices, allowing students to access their learning materials and activities on the go.
  • More options for personalization and customization: Moodle 4.0 provides teachers and administrators with more options for personalization and customization, allowing them to tailor the learning environment to the specific needs and preferences of their learners.

Overall, Moodle 4.0 is a significant improvement over previous versions of the learning management system, and it offers many new features and enhancements that can support better learning outcomes and a more engaging and effective learning experience.

At this point I would like to ask the reader: did you notice anything unusual about my discussion of Moodle 4.0?

Moving on, another major project for the TEL team has been to support a pilot implementation of the WiseFlow end-to-end assessment platform. Our hope is that a dedicated platform will allow us to improve our practices around assessment and feedback. Let’s explore that idea below in a little more detail.

Assessment and feedback

There are many different ways to assess students, and the best approach will depend on the specific learning goals and objectives, as well as the context and needs of the learners. Some key principles and strategies that can help to ensure effective assessment of students include:

  • Align assessment with learning goals: The assessment of students should be closely aligned with the learning goals and objectives of the course or programme. This will help to ensure that the assessment is focused on the most important and relevant learning outcomes and that it provides valid and reliable information on students’ progress and achievement.
  • Use a variety of assessment methods: Different assessment methods can provide different types of information and insights into students’ learning, and it is important to use a range of methods in order to get a comprehensive picture of their progress and achievement. Some common assessment methods include tests, quizzes, projects, presentations, portfolios, and observations.
  • Provide timely and meaningful feedback: Feedback is an essential component of assessment, and it is important to provide students with timely and meaningful feedback on their progress and performance. This feedback should be clear, specific, and actionable, and it should help students to understand their strengths and weaknesses, and identify areas for improvement.
  • Engage students in the assessment process: Students should be actively involved in the assessment process, and they should be given opportunities to reflect on their own learning, evaluate their progress, and set goals for improvement. This can help to foster a growth mindset and a sense of ownership and responsibility for their own learning.

Overall, effective assessment of students requires careful planning, the use of a variety of assessment methods, timely and meaningful feedback, and student engagement in the assessment process.

Again, at this point I would like to ask the reader: did you notice anything unusual about my discussion of assessment and feedback?

I inserted those italicised questions above because I (Stephen Webb) did not write any of the text in the two subsections. My “co-author” (ChatGPT) wrote the text.

In 2022, GPT-3 (Generative Pre-trained Transformer 3) became a mainstream technology. At least a dozen apps are available that can provide clear, comprehensible text-based responses to prompts provided by a user. An app such as ChatGPT can write essays (and poems, and computer code, and much else besides). And many of our students will know this.

Similar technology can generate artwork. The artwork accompanying this blog post is an original by DALL-E, from a prompt I provided. DALL-E is a much better artist than I can ever hope to be.

I hesitate to call this technology Artificial Intelligence because it is neither artificial nor intelligent; I prefer the term Assistive Computation. Whatever we call the technology, though, I believe that in 2023 we need, as educators, to start grappling with its implications. Can we reasonably assess students by setting them essays to write, when apps can spit out the essays for them? What does the technology mean for notions of authorship and plagiarism? (You will notice that I have credited ChatGPT as being the co-author of this blog post; is that reasonable or meaningful?) What are the best strategies for learning with this technology?

Of course, teachers as well as students can use Assistive Computation. As an experiment, recently, an academic asked ChatGPT to generate a question in a particular knowledge domain that could be set for a first-year undergraduate essay. (The question it generated was entirely suitable.) The academic then asked ChatGPT to generate a rubric for the question. (The rubric was acceptable.) Then the academic asked ChatGPT to write an essay based on the question it had set. (The essay showed no insight, made no new points but would have received a passing grade.) Finally, the academic asked ChatGPT to mark the essay according to the rubric. (The feedback, if nothing else, was at least provided in a more timely fashion than human teachers can provide.)

This experiment suggests to me that either (i) we cut out the middlemen (students and teachers) and let AI do whatever it wants; or (ii) we work together – both students and teachers – to understand how to use Assistive Computation to improve teaching and learning. 

There are many questions here. And I believe 2023 is the year we all need to start grappling with them. Another question: should educators be concerned about this technology? I’ll let my co-author answer:

It’s natural for educators to be concerned about the potential impact of technology on their profession. However, it’s important to note that systems like ChatGPT are not capable of replacing the role of a human educator. While they may be able to generate text, they do not have the ability to understand or evaluate the content of that text in the same way that a human can. It’s also worth considering that the use of such systems may actually highlight the importance of critical thinking and analysis skills, which are vital for success in education and beyond. In the end, the role of educators will continue to be important in helping students develop these skills and prepare for the future.

I (the human, not the AI) would like to wish you an enjoyable festive period and a happy New Year!

Credit Image: An original by DALL-E

Guest Blogger: Teach Well: Principles to Practice Module

Hi everyone, I’m Maria Hutchinson and I joined the Academic Development team back in June as a Learning Designer. One of the projects I was given early on was to create a professional development module to support the pedagogical upskilling of our Online Course Developers (OCDs), Senior OCDs, Learning Technologists, Educational Technologists, Learning Support Tutors, Associate Lecturers, and other relevant roles related to supporting student learning.

The aptly named Teach Well: Principles to Practice module has been approved and we are actively recruiting for TB2 Jan-May. This new 30-credit L7 professional development module is FREE for UoP and will run TB1 and TB2.

Join us on a pedagogical journey through 3 pillars of practice for teaching well in higher education, and gain the confidence to critically evaluate learning and design approaches and reflect on what it means to teach well across different modes of study.

On completion of the module, you will be able to support colleagues in the fields of learning design and wider pedagogic practice, including supporting workshops such as enABLe, the University’s framework to support innovative team-based learning design. You will also engage with the UKPSF and be able to work towards an appropriate level of Fellowship.

This practical module focuses on learning design, teaching practice, and assessment and feedback, in the context of a solid pedagogic framework linked to blended and connected learning. A significant component of the module content and associated skills is practical teaching.

You will learn via a mixture of face-to-face away days* and online synchronous sessions, including workshops, discussions and guest speakers, where you will be encouraged to engage. Guided learning will include asynchronous online activities, in addition to which you will be expected to engage in assessment activities and independent study. Key dates of online sessions and away days.

*NOTE: Attendance at face-to-face away days is mandatory; therefore, you should ensure that you have prior approval from your line manager to attend them.

For more information and for details on how to enrol, please contact: maria.hutchinson@port.ac.uk

Digital Accessibility in Teaching and Learning – What is it?

‘Digital Accessibility’, or ‘Accessibility’, is a hot topic at the moment. Public sector bodies like us are obliged by law to comply with the Accessibility Regulations 2018, with a series of deadlines to meet. It is also an important part of our University Vision 2030 and Strategy 2025, which says we should ‘respect and celebrate diversity and equal opportunity through an inclusive culture’.

But what does ‘Digital Accessibility’ mean and how does it apply to us in teaching and learning? 

Before we look into that, let’s first find out what ‘Accessibility’ is.

What is ‘Accessibility’?

Accessibility is about removing disability. 

What is disability? Disability happens when there’s a barrier between people and their environment. It is commonly seen as a condition or a problem of the body or mind (impairment) that requires medical treatment. However, the UK Equality Act 2010 recognised and acknowledged that disability, or barriers, can be caused not just by impairment(s) but also by the way society is organised. This is defined in the social model of disability. According to the social model, these barriers can be physical, like buildings not having accessible toilets, or they can be caused by people’s attitudes to difference, like assuming disabled people can’t do certain things [3]. For many people with impairment(s), the main barrier they experience does not stem directly from their bodies, but rather from their unwelcome reception in the world, in terms of how physical structures, institutional norms, and social attitudes exclude and/or denigrate them [4].

This is where ‘Accessibility’ plays a part. 

Accessibility is about finding and dismantling these social barriers, creating an environment that adapts to people’s needs, ideally as early as possible in the process. For example: accessible toilets, lifts, wheelchair ramps, braille on printed materials, even simple things like left-handed scissors. When barriers are removed, disabled people can be independent, autonomous and equal in society.

Accessibility supports and celebrates inclusion; it should be ok to be different, with impairment(s) or not. It is about ending exclusion and oppression so that people with impairment(s) are not required to change who they are in order to be entitled to the same rights and opportunities as everyone else. 

What is ‘Digital Accessibility’? 

Digital accessibility is ‘Accessibility’ in digital media. 

It is about making digital products like websites, mobile apps and other digital tools and technologies accessible to everyone. It is the ability for all users to have an equal opportunity to access and benefit from the same services or digital products, regardless of any impairment(s) they may have. 

So, what is ‘Digital Accessibility’ in Teaching and Learning? 

Digital accessibility in teaching and learning is ‘Accessibility’ in digital teaching and learning products – course content and activities, and the services we offer to our students.

It means all students are given access to all teaching materials and the ability to participate in all teaching and learning activities, regardless of any impairment(s) they may have. 

Taking digital accessibility on board in teaching and learning is very much about understanding that, if we’re creating inaccessible learning materials or activities, then effectively we’re responsible for creating barriers. Such resources often lack structure and are written and designed around a set of assumptions. It is about realising that when we create resources that fail to accommodate a certain group of students, we have effectively disabled them.

What ‘Digital Accessibility’ is not.

Now we know what digital accessibility is and its role in teaching and learning, let’s have a look at what it is not.

Misconception 1: digital accessibility is just about disability.

It’s not. Digital accessibility in essence is about inclusiveness and universality. 

It’s about having good design and making resources that can be used by as many people as possible.

I believe every student – in fact, everyone – has been in one or more of the situations below, maybe even more than once:

  • Being in a different cultural environment, e.g. a foreign country
  • Being in a noisy environment or a public place where you can’t hear properly
  • Using many different devices, e.g. desktop computers, mobile phones, tablets
  • Being temporarily or situationally impaired, e.g. through injury or caring responsibilities
  • Experiencing age-related cognitive decline.

In these situations, everyone can benefit from the flexibility brought by materials and activities designed with digital accessibility in mind. In fact, many of us use elements of them without particularly thinking about it. We might think that only disabled students use assistive technology, but, in fact, we are walking around with a kind of assistive device in our pockets all the time – our mobile phones. Haven’t we all used and enjoyed their built-in accessibility functions, like voice-over, text enlargement, colour changes, speech recognition and screen readers? When digital accessibility is put in place, everyone benefits, including ourselves; inevitably everyone grows old and will eventually be impaired by age. So, essentially, we are just helping ourselves.

This video from the Web Accessibility Initiative shows a variety of ways that content produced to be accessible is beneficial for all users regardless of their ability or disability.

Misconception 2: digital accessibility is not my problem. 

Yes, it is. Digital accessibility is everyone’s responsibility. 

We’re all in this together. As mentioned before, it is required by law and it is part of the University’s Vision. More importantly, as an educational institution, we are responsible. We are teachers and role models, and what we do makes a difference. We can change society’s perceptions and practice, and we have the responsibility to lead that change. We can create a society that accepts and celebrates that everyone is unique, and that recognises and encourages the strengths and talents of people with impairment(s).

The whole purpose of providing education is to give the young the things they need in order to develop in an orderly, sequential way into members of society; to impact and change lives. Digital accessibility helps us fulfil that purpose. It provides us with an opportunity for education to reach everyone who needs it, in a way that can benefit as many people as possible. It gives us an opportunity to improve our teaching and learning materials to fulfil their purpose of existence. It is our responsibility to make that change. 

Misconception 3: digital accessibility is hard. 

No, it’s very easy. All you need is empathy and consideration. 

What you do at the start makes it easier at the end. Follow good practices when designing and adding your content – when you start doing it, you will realise that most of them are really just common sense. Information should be consistent and easy to find, easy to read, and easy to navigate; documents need good structure and colour contrast; images and graphs should be well described; videos should have captions or transcriptions, just to name a few. The eLearning Tools website has all the support and information you need to create this kind of accessible content.
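Some of these checks can even be automated. As a purely illustrative sketch (my own aside, not part of the eLearning Tools guidance), the short Python snippet below shows the kind of quick check a course team could run over an exported HTML page, flagging images with no alt text and videos with no captions track. The file name is hypothetical, and a real check would cover far more than these two rules.

from html.parser import HTMLParser

class AccessibilityChecker(HTMLParser):
    """Flags images without alt text and videos without a captions/subtitles track."""

    def __init__(self):
        super().__init__()
        self.issues = []
        self._in_video = False
        self._video_has_track = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "img" and not attrs.get("alt"):
            # Decorative images can legitimately have empty alt text; this sketch flags them all.
            self.issues.append(f"Image missing alt text: {attrs.get('src', 'unknown source')}")
        elif tag == "video":
            self._in_video = True
            self._video_has_track = False
        elif tag == "track" and self._in_video and attrs.get("kind", "subtitles") in ("captions", "subtitles"):
            self._video_has_track = True

    def handle_endtag(self, tag):
        if tag == "video" and self._in_video:
            if not self._video_has_track:
                self.issues.append("Video without a captions/subtitles track")
            self._in_video = False

checker = AccessibilityChecker()
with open("week1_lecture_notes.html", encoding="utf-8") as page:  # hypothetical exported page
    checker.feed(page.read())
for issue in checker.issues:
    print(issue)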

Remember that when you design and create your content with digital accessibility in mind, you will not only create better learning materials, you will have content that can be more easily reused and repurposed – saving a lot of time in the future!

References

  1. Codecademy: What is Digital Accessibility
  2. Equality and Human Rights Commission: UN Convention on the Rights of Persons with Disabilities (CRPD)
  3. Social model of disability
  4. Rethinking disability: the social model of disability and chronic disease

Credit Image: Image by Gerd Altmann from Pixabay

Reflecting on the TEL Tales Blended Learning Festival

As you are probably aware by now, the TEL Tales Blended Learning Festival was a week-long festival looking at developments in educational technology for learning and teaching, and at how the sudden shift to online and blended learning has put a strain on many academics’ workloads and, more often, their perceptions of their own skills. I say this because what the current situation has shown is that people are far more capable than they ever gave themselves credit for.

Across the week the range of subjects was impressive and comprehensive. All the sessions were recorded and can be found on both the TEL Tales Blended Learning Festival website and YouTube on this playlist.

This post, though, is just a brief look at how the week felt to me. The first point to note about every conference is that if you are presenting, you don’t really get to attend. Except that with a fully digital conference, I am able to go back and revisit the sessions. This is a real pleasure, as there were a few sessions that conflicted with mine which are of benefit to my training and personal interest (now just to find the time to watch them all back)!

The second area for me is the adrenaline rush, even just sitting at my desk. The first day saw a few technical issues that made it all the more exciting to diagnose and solve as part of a team. They were rectified very quickly, and by the fourth session of the day the hangover from the problem was gone. The team came together, set goals, allocated tasks, and we did it. The reason I bring this up is not to talk in depth about the issue but to say how great it was to work with the whole team; we were able to overcome obstacles together and provide a smooth experience to those who attended.

My third point is that it was great to have another person to present with, and a large crowd (for most sessions) to bounce ideas around with and get involved in the process. It was a shame that my session with Andy Clegg wasn’t a little longer, to be able to run the activity in full, which would have essentially been ‘Ready, Steady, Cook’: throw some ingredients at us and let us help solve a problem on the fly. This did take place with Mike Wilson when we ran beginner and advanced Moodle demos and were able to just show people answers to the questions they had. It was live, specific and unlike a normal conference, where you may not get what you actually hoped to see. It is something that I hope to be able to do more of going forward, working with colleagues on sessions that can be fun, engaging and promote creative discussions for those who may not normally see the benefit of what we do in the department.

My final reflection is that I am so proud of what the team and colleagues have done. Watching Stephen Webb and Shaun Searle demonstrate principles of content capture and talk about how quickly they have had to deal with the implementation of Panopto made me proud to be part of something so proactive (maybe a little reactive in this case) and so professional. The whole conference was put together at pace, with people rapidly developing sessions, and we all learned ways to develop our specialisms into the wider field of what the department does.


From the feedback I and others have received, it was a great success, and I hope to do something similar in the future. It has opened my eyes to ideas that can run across the year rather than on just one day, and to the possibility that delivering my training online could be the way to go. Don’t get me wrong, I love face-to-face experiences, but online I know people have a device in front of them, so I can be more creative with what was once a demonstration-type session.

This festival was a pleasure to be a part of, and I was sad when it finished. I was also a little relieved at the end, as it was intense, mildly hectic and not really a sustainable approach to training, especially while trying to help manage a family in lockdown (I was lucky not to be attacked by children!). However, my feeling going forward, even after the worst of this current isolation is over, is that the conferences we run and organise could and should have more online elements to really engage everyone who wants to attend. It opens up possibilities that physical locations can’t. The best part of this whole experience, though, was this tweet from @Drstuartsims. Even in lockdown, conference food is questionable …

Credit Image: Photo by Stefan Spassov on Unsplash

Credit Image: Photo by Chris Lawton on Unsplash

Guest Blogger: Fiona Cook – Introduction to Tel Tales

Hi, I’m Fiona Cook and I am the new Research Associate for the Department of Curriculum and Quality Enhancement (DCQE). 

I have joined the University after nearly four years, across three roles, at the University of Portsmouth Students’ Union (UPSU). My latest role at UPSU was Insights Lead, and I was responsible for leading all consultation, research, benchmarking, and data management, and for supporting GDPR compliance. Previously, I worked for the Student Focus team, which covered areas such as academic representation, quality assurance, and collaborative partners. I have also worked in FE, where I supported multiple departments, including all HE provision.

I have joined DCQE to support the University’s work on widening access to and participation in HE and enhancing the student experience. My role focuses on projects across the University that support the Access and Participation Plan, particularly those around BAME students, WP, innovation in learning and teaching, and student voice. When I saw the posting for my new role it seemed like the perfect opportunity to bring together my experience and interests! 

My previous roles at UPSU mean I have already worked closely with the University, and DCQE, on similar projects including content capture, the NSS, the student charter, and the TEF. Working within the Student Focus and Insights teams means I have a lot of experience with student voice, and I thankfully already know quite a few of the many acronyms used in this area.

I think student feedback is crucial in the development of both strategic and operational planning, and I’ve been able to share my work on using data to drive the student experience at conferences such as the University’s Learning & Teaching Conference and Qualtrics X4, which was one of the highlights of my career to date! The relationship between the University and the UPSU Insights team was also mentioned in the newest APP, demonstrating the importance of collaboration and inclusion.

I am keen to support access and participation through equity of opportunity, and I am excited to continue to contribute to this in my new role. I hope to bring my experience of these key areas and use it to support the development of University activities, whilst also developing my own research and analysis skills. 

It’s quite strange that, just over six weeks into a new role, I’m now working from home daily. I shared some of my top tips for remote working on LinkedIn, and it’s been really useful to access the TEL resources as I adjust to regular video calls and life online.

Outside of work I am interested in dancing, politics, and baking – although I made my sister’s wedding cake last year so I’m on a slight hiatus. I’ve also recently adopted an eight-year-old cat called Mimi. 

Feel free to connect with me on LinkedIn

Fiona is based in Mercantile with AcDev and the Tel team.

Welcome to the team, Fiona! We look forward to hearing more about your projects in the not so distant future on Tel Tales.

Google in a time of lockdown

It is now more than fifteen years since the beta version of Gmail, Google’s first move beyond being just a search engine. Since then Google has created an extensive suite of applications, many of which are extremely useful for teaching and learning. In this blog I’ll be looking at some of what I think are the most useful apps, and at why, during the current lockdown, Google can be useful in helping to deliver online learning.

Possibly the most useful change Google has made in light of the lockdown was to extend video conferencing (Hangouts Meet) to all G Suite accounts, allowing up to 250 participants in any online meeting. An online meeting in Hangouts Meet can be set up via Google Calendar, which notifies participants automatically. While this particular app lacks some of the functionality of Webex, it is useful for hosting and running a simple meeting or online seminar.

In this time of distributed learning, collaboration can still be facilitated, and Google provides tools such as Jamboard that allow students to contribute to online tasks and discussions. Jamboard provides a pin-board-style interface onto which students can pin their ideas and contributions to group tasks. While apps such as Google Docs clearly provide opportunities for online collaboration, Jamboard offers a tool for more focused tasks with a clear and easy-to-read interface.

On the Degree Apprenticeship programme, we make major use of G Suite, including Shared Drives and Google Docs; indeed, without these it would be difficult to see how we could manage some of the required administrative tasks. The ability to enhance the functionality of some Google products, such as Sheets, also means that they can be tailored to best meet the needs of our students. For example, all degree apprenticeship students are required to keep a log of their off-the-job training activities, such as their weekly University sessions, and to help them complete these logs we use Google Forms linked to Google Sheets. Being able to add a script to the sheets means that emails can be sent out automatically, allowing course administrators to monitor log entries more easily.
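To give a sense of how that kind of automation could be wired up, here is a minimal Google Apps Script sketch for a script bound to the linked sheet. It is illustrative only: the function name, administrator address and column layout are assumptions rather than the actual script we use, and it relies on an installable “On form submit” trigger being attached to the spreadsheet.

```javascript
// Illustrative sketch only: email an administrator whenever a new
// off-the-job training log entry arrives via the linked Google Form.
// Assumes an installable "On form submit" trigger calls this function.

const ADMIN_EMAIL = 'course.admin@example.ac.uk'; // hypothetical address

function onLogSubmitted(e) {
  // e.values contains the submitted row in sheet column order;
  // the layout below (timestamp, student, activity, hours) is assumed.
  const [timestamp, student, activity, hours] = e.values;

  const subject = 'New off-the-job log entry from ' + student;
  const body =
    student + ' logged "' + activity + '" (' + hours + ' hours) on ' +
    timestamp + '.\nOpen the linked sheet to review the full log.';

  // MailApp sends the email from the account that owns the trigger.
  MailApp.sendEmail(ADMIN_EMAIL, subject, body);
}
```

Once a trigger of this kind is attached to the spreadsheet, each form submission fires the notification without any manual step, so administrators only need to check their inbox rather than the sheet itself.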

In terms of teaching and learning, one of the most useful Google products, and certainly the most ubiquitous in terms of video, is YouTube, bought by Google back in 2006. By virtue of having a Google account, all members of the University automatically have a YouTube account. This, combined with the unlimited storage offered by Google, provides staff and students with an invaluable teaching and learning platform. The screen capture extension Screencastify (a third-party Chrome app) integrates nicely with YouTube, allowing users to edit recordings and then upload them directly to their YouTube channel.

So, out of the range of apps Google provides, which ones are my favourites?

Having worked with apprenticeship students in the Business faculty for over two years, helping them with their ePortfolios, I’ve become a convert to Google Sites. I found the old version, while it had plenty of functionality, a bit clunky and not that user-friendly, often having to write HTML to achieve what I wanted. A downside of the new Google Sites was the lack of template functionality, but this issue is being addressed, as the addition of templates is currently in development.

But on a day-to-day level, Google Docs and Shared Drives have pretty much transformed the way I work, simplifying collaboration with colleagues and students.

The pace of development of Google products is also impressive, and I’m looking forward to making use of Smart Compose (https://gsuiteupdates.googleblog.com/2020/02/smart-compose-ga.html) and neural grammar correction, currently in beta. Word does ship with far greater functionality, and even slightly complicated Word documents do not convert well to Google, but for the majority of users the tools available in Docs are generally more than enough, and this ThingLink (https://www.thinglink.com/scene/1282367584611598339) is great for those new to Docs. Google has also made it slightly easier to share documents with non-Google account holders: users can now use their existing email address to set themselves up with Google to access shared Google Docs, Sites and so on, while a PIN verification system, currently in beta, will remove the need to set up any kind of Google account at all.

The current situation has thrown up considerable challenges in continuing to provide engaging and high-quality teaching and learning, especially in terms of students working collaboratively. Google clearly does not provide all the solutions required, but its suite of apps is certainly a good starting point.

Image by Saveliy Morozov from Pixabay

Guest Blogger: Emma Duke-Williams – On changing from academic to Educational Technologist

Some of you may remember me – I spent about 16 years at Portsmouth, lecturing in the School of Computing. I’ve now moved to be an Educational Technologist at Dundee University, working as part of a central team. In many ways it seems only yesterday that I left; in reality, it has been more than four years!

The change wasn’t entirely planned. I moved to Scotland for personal reasons and was therefore looking for a job. A combination of 16 years of lecturing (and a few more of school teaching before that) and a lack of a PhD made me decide that I didn’t want to lecture again. However, I did like the contact I had had with staff and students at Portsmouth, so the opportunity of a sideways jump appealed. Those who knew me at Portsmouth will remember that IT to support learning was a key area of interest for me. Indeed, my PGCE in the 80s had included “Computing in the Primary Curriculum” – those were the days of BBC Bs, Granny’s Garden and the Domesday project!

While at Portsmouth I was keen to experiment, and no doubt I drove some in the eLearning team mad with my “why won’t it do …” questions – but I was also free to try things out to see whether they could be done, and students generally went along with my ideas.

All of that experience has really helped me here! I understand those wanting to push a tool to its limits, to get it to work for the way they want to teach, and I know that it’s rare to find a ‘one size fits all’ solution. That aspect of my role is fun, challenging, and forces me to think about both tools and pedagogic practice.

However, the greater challenge is supporting all staff to have a good understanding of the tools we have and the functionality on offer, while helping them decide what’s really needed for their students. When teaching, I came across students who found aspects of the subject hard, or didn’t understand why they had to do various things. Academics are no different!

Do I miss teaching? Not often. I do miss the buzz of enthusiastic students, but I don’t miss the pile of marking – and I’d find this role 100 times harder had I not had that experience of being on ‘the other side’.
