Tel Tales

Adventures in Technology Enhanced Learning @ UoP

WiseFlow – Looking in the mirror with reflective portfolios in WiseFlow

Hey there, fellow exhausted souls!

Can you believe it? We’re finally coming to the end of the academic year, and boy, has it been a fun ride!  Our WiseFlow pilot has gone from strength to strength as we support academics through a new assessment process.  More importantly, we have successfully run two separate assessments using our innovative approach of WiseFlow as a reflective portfolio – the first such use case we are aware of!  We’ve grown, learned, and potentially discovered an exciting prospect for the future of reflective portfolios at Portsmouth University, so let’s take a moment to reflect on the journey we’ve been on.

You may have read our previous blog post, “Unlocking the power of WiseFlow: Transforming ePortfolio assessments”, where we discussed the possibilities of using WiseFlow as a viable reflective portfolio platform and the benefits a reflective portfolio approach brings.  For students, it helps develop their metacognitive skills and self-awareness as learners over a period of time.  Academics, on the other hand, can use reflective portfolios to assess students’ learning outcomes in a more comprehensive and authentic manner.  This is all part of our wider WiseFlow pilot to provide one integrated assessment platform that serves our current (and future) assessment needs within Portsmouth University, which Mike Wilson spoke to us about recently on our podcast – you can listen here.

Teach Well and Research-Informed Teaching

This year we ran two reflective portfolios within WiseFlow as part of our pilot project – to test the water and find out if this was even possible. The first was within our Research-Informed Teaching module, which supports early career academics in applying educational enhancements within their own contexts, through reflection and innovation.  Students draw together higher education policy, research methods and educational developments to build their knowledge for future work.  Secondly, we ran a reflective portfolio in our new level seven Teach Well: Principles to Practice module, a professional development route for those in roles related to supporting student learning. Students in this module embark on a pedagogical journey through three pillars of practice for teaching well in higher education, gaining the confidence to critically evaluate learning and design approaches and reflecting on what it means to teach well across different modes of study.  We recently caught up with Maria Hutchinson, who runs this module, in our podcast series – if you missed this one, you can listen here.

We’ve worked closely with these academics and our support teams to develop reflective portfolios for these modules that can serve as a summative assessment vehicle.  The result is intuitive for learners, yet versatile enough to encompass a broad range of tools, enabling the course learning outcomes to be demonstrated in an engaging and meaningful way.

What the students said…

Following the submission of reflective portfolios into WiseFlow, we sent out a survey to participants to gain their feedback and views.  Some of the headline figures are detailed below…

  • 90% of students found the WiseFlow reflective portfolio easy to navigate
  • 90% of students agreed that a reflective portfolio suited this type of assessment (compared with traditional essay-based assessment methods)
  • 82% of students felt their own students would enjoy using a reflective portfolio in WiseFlow
  • 71% of students enjoyed the interactive assessment methods, such as histograms, voice recorders etc. 
  • We received multiple comments about the clear instructions given on how to access and use WiseFlow, as well as its reliability and stability as a platform.  Many users also commented positively on the functionality WiseFlow offered compared to previously used portfolio solutions.

Students also commented on…

  • Whether another system needed to be added to Portsmouth University’s available assessment platforms – “There are too many platforms for submitting the work, Moodle, ePortfolio, WiseFlow, it is really confusing and frustrating that is necessary to learn how to use different platforms for different modules.”
  • The lack of formatting transfer from applications such as Word, when copying and pasting into WiseFlow – “Transfer of formatted MS Word document to WiseFlow could be improved. Currently, the document format is lost during the cut & paste process which then requires more effort to re-format within the WiseFlow portal.”
  • Better integration between Moodle and WiseFlow – “I’d like to see direct access from Moodle”.

The data presented highlights the positive reception of WiseFlow as a reflective portfolio solution by students. The high percentage of students who recognised the suitability of a reflective portfolio as an assessment method, compared with traditional essay-based approaches, and who praised its usability, is a really positive sign. The positive feedback on the interactive assessment methods further emphasises how adaptable the question bank of a traditional FlowMulti assessment can be when used in an innovative way.

However, some concerns were raised by students, such as the frustration of managing multiple assessment platforms at the university, indicating a need for better integration. This all links to our Digital Success Plan to (re)design robust assessments to meet the needs of the diverse student population within a blended and connected setting, and to incorporate a robust specialist end-to-end assessment platform. Our aims in the project were to make it easier for academics to design assessments, easier for students to find their assessments and feedback, and to support staff by reducing the manual work around assessments for academics.  During the next stage of the pilot project, integration with our current systems is a top priority and will alleviate these challenges.  Furthermore, the lack of formatting transfer from applications like Word to WiseFlow was highlighted as an area for improvement. These critical comments provide valuable insights for further refining and optimising the WiseFlow system.

The evidence is clear to see – WiseFlow can provide a viable solution for reflective portfolios, and with a bit of refinement it could be excellent.

What the staff said…

It was also vital to us that we gathered feedback from our academic staff.  

  • 100% of staff agreed that WiseFlow allowed them to develop their assessment in ways that were not previously possible
  • All staff agreed the WiseFlow reflective portfolio allowed them to fully cover learning objectives and meet the needs of their students
  • We received multiple comments about the speed of the platform, intuitive nature and search functionality which made the verification/moderation process seamless.  Staff also commended the accuracy of the rubrics for grading and how new interactive elements made them rethink how they could better use this type of functionality in the future.

Staff also commented on…

  • Comparisons to previously used portfolio platforms – “Historically the module used [another portfolio system] which was really clunky and didn’t work well at all. I really liked that Wiseflow could be scrolled across (as opposed to clicking through each page) and the layout was great”
  • Design elements within the marking interface – “It would have been useful to have had the comment box movable (I work with two screens and being able to drag the box to another screen to write on would have been a nice touch – several times I had to keep opening and closing the box as I wasn’t able to see the text underneath it)”
  • Having more time to explore the platform – “I did not feel I had enough time to play before it went live for students, but this was not WISEflow’s fault – it was just timing”. 

In all honesty, we’ve been blown away by our staff feedback.  The unanimous agreement that WiseFlow enables new possibilities for assessment development speaks very highly of this solution and its potential for enhancing the teaching and learning experience for students at Portsmouth University.  The potential to create authentic assessments through the use of reflective portfolios is exciting.  The accuracy of the grading rubrics was also very highly commended – giving students a greater chance of achieving a clear and defined target, and making academic decision-making easier, fairer and more accurate.  In terms of developmental areas, the movement of the comment box is a fair point – we’ve heard from other academics about the size of the comment box before – hopefully something that WiseFlow’s New Marker Journey will alleviate.

Where do we go from here?

As we raised in our first blog post, the reflective portfolio solution in WiseFlow is far from perfect, but with a few simple tweaks it could become very appealing. Sadly, some of these are out of our hands and lie within the code of the platform.  We’ve learnt a lot as a project team over the course of this assessment, including developmental areas we have highlighted for the future.

The single biggest limiting factor of the reflective portfolio is the file upload question type.  It is limited to twelve files of no more than 10MB each – multiple file upload questions can be used, but each carries the same limits.  We have approached WiseFlow about this for development purposes; however, we have yet to see any significant movement on removing this limit.  Removing it would put WiseFlow in an incredibly powerful position to offer another “string to their bow” in terms of assessment choice, and would truly open up the use of reflective portfolios within the platform.  Sadly, with this limit in place, using reflective portfolios with some faculties, such as our Creative and Cultural Industries (where students regularly upload large .psd files, CAD files, HD video, high-quality audio and so on), is just not a viable option.  Creative students will often build a “portfolio career” and we would love to work with them on developing reflective portfolios, but this limit stops us.  Until it is removed, careful consideration must be given at the planning stage of an assessment as to whether the reflective portfolio is the correct solution.  Other limitations must also be considered – for example, once the reflective portfolio is live for students to complete, it cannot be altered, changed or adapted.  During the pilot, we’ve worked extensively with academics and our support teams to iron out any issues prior to release. Careful planning and consideration must take place in the authoring phase of an assignment, which is then rigorously checked prior to release – in the same way an exam would be.  This has worked at a small scale, but we would need to ensure appropriate support mechanisms are in place at a larger scale.
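To make that planning-stage check concrete, here is a minimal sketch of the kind of pre-flight validation a course team could run over a folder of sample student files before committing to a file upload question. Everything below is hypothetical – WiseFlow exposes no such API, and the function simply checks local files against the limits quoted above (twelve files, 10MB each).

```python
# Hypothetical pre-flight check against WiseFlow's published upload limits
# (12 files per upload question, 10 MB per file). This runs entirely on
# local files at the assessment-planning stage; it does not touch WiseFlow.
from pathlib import Path

MAX_FILES = 12
MAX_BYTES = 10 * 1024 * 1024  # 10 MB per file

def check_upload_set(paths):
    """Return a list of human-readable problems; an empty list means the
    files would fit within a single file upload question."""
    problems = []
    if len(paths) > MAX_FILES:
        problems.append(f"{len(paths)} files exceeds the {MAX_FILES}-file limit")
    for p in paths:
        size = Path(p).stat().st_size
        if size > MAX_BYTES:
            problems.append(f"{p}: {size / 1_048_576:.1f} MB exceeds the 10 MB limit")
    return problems
```

Running a check like this early makes it obvious when – say, with large .psd files or HD video – a reflective portfolio simply isn’t the right vehicle yet.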

Our student feedback gave us valuable insight into the process of using WiseFlow.  Although reflective portfolios save every 10 seconds, if a student deletes a file or a piece of text and exits the platform, it cannot be recovered.  Over the duration of the assessments, we encountered one reported instance of this. We also had some reports of formatting that would not copy across from Word documents.  Again, we approached WiseFlow about this, and the recommendation is to copy and paste plain text from Word and finish the styling in WiseFlow’s text editor.  Although this workaround works, formatting that copied across would make students’ work translate much more easily – particularly for those who write in external documents before copying into the platform at the last minute (like myself). In terms of progression beyond WiseFlow, we’d love for students to be able to take their work from the platform and store it themselves or share it beyond WiseFlow.  Currently, there is no solution for this.  A “zip folder” containing exports of every inputted answer and all uploaded files would be a great starting point.  Again, we’ve put the idea forward to WiseFlow, but have yet to see any movement on it.
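To illustrate what we mean by a “zip folder” export, here is a rough sketch of the kind of bundle we have in mind. Nothing like this exists in WiseFlow today – the function below just archives a local folder of exported answers and uploads, as a student-side stopgap.

```python
# Hypothetical sketch of the proposed "zip folder" export: gather a
# student's exported answers and uploaded files into one portable archive
# they can keep or share beyond the platform.
import zipfile
from pathlib import Path

def bundle_portfolio(source_dir, zip_path):
    """Zip every file under source_dir into zip_path, preserving the
    folder structure, and return the archive path."""
    source = Path(source_dir)
    with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as zf:
        for f in sorted(source.rglob("*")):
            if f.is_file():
                zf.write(f, f.relative_to(source))
    return zip_path
```

A platform-side version of this, populated automatically at the end of an assessment, is essentially what we have proposed to WiseFlow.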

Where do we take our pilot now?

Although there are risks in using a reflective portfolio solution in WiseFlow, the prospect and potential gain of this authentic assessment are exciting.  We’ve taken the plunge and proven the concept works, highlighting potential development areas which we really hope gain some traction – and we’d like to think WiseFlow will be open to listening to these ideas.  As for our pilot project as a whole, we move into a second phase with a continued focus on reflective portfolios, but also on other areas of assessment we have struggled with in the past, such as large file submissions.  We are actively developing a plethora of training and support, and working with local teams to ensure staff feel confident using the systems.

We continue to have a waiting list of academics who want to work with us to develop reflective portfolios in WiseFlow. I find myself meeting with academics on a weekly basis to discuss potential projects and reflective portfolio solutions in their disciplines.  So far, we’ve done no real advertising – this interest has been generated by word of mouth and by those who have used it as students. We are keen to share our experiences with other universities in WiseFlow user groups, who are actively keen to explore this and learn about our innovative approach. However, we need to be open and honest about the limitations the solution has at the moment. Collectively, we might hold enough power to make change happen, but until that point, caution must be taken before embarking on a reflective portfolio, to ensure it is the correct fit for the assessment.

The potential of this solution is game-changing, not just for us, but for a lot of other Higher Education institutions across the world.

The future of reflective portfolios in WiseFlow is exciting – keep watching this space.  

Chris

Credit Image: Photo by MidJourney 

Guest Blogger: Tom Langston – How ChatGPT Revolutionized My Writing Process – The AI version

(Co-writer: ChatGPT)

This version was generated after feeding in my attempt at writing this blog post and asking ChatGPT to ‘shorten this text into a blog post’.

I recently discovered the incredible potential of ChatGPT, and it completely revolutionized the way I approached my writing. As a student pursuing a Professional Doctorate in Education, I often struggled with crafting cohesive and focused assignments. My writing style was scattered and verbose, making it difficult for readers to follow my train of thought. Enter ChatGPT, a powerful AI language model that became my secret weapon in developing a well-structured research proposal.

Initially, I sought human help from my project supervisor, Stephen Webb, who pointed out that my writing relied too heavily on vague terms like “this” and “it.” While I understood the meaning behind my words, Stephen reminded me that readers might not share the same clarity. Armed with his feedback, I set out to improve my proposal.

When I completed the next draft, I thought I was finished. However, the word count was far too high, so I turned to ChatGPT for assistance in condensing my writing. It did the job, but the result lacked my personal touch and voice as a writer. I realized I would have to manually trim down the content to achieve the desired outcome.

During this editing process, I discovered the true power of ChatGPT. I started using it to check sentences that I struggled to articulate clearly. When I asked for grammar assistance, ChatGPT not only provided corrections but also suggested ways to enhance sentence structure and improve readability. This iterative process allowed me to refine my writing, discarding extraneous thoughts and rearranging sentences for better flow. It became evident that if a particular text no longer fit within the new leaner version, despite my attachment to it, it had to be eliminated. This experience epitomized the essence of editing.

As I continued to utilize ChatGPT, I encountered questions of ethics. While it proved immensely useful in reducing my word count, I couldn’t help but feel uneasy about presenting the AI-generated content as my own. The ethical implications of using AI in writing remain a divisive topic, with differing perspectives on its appropriateness and boundaries. To navigate this issue, some suggest referencing AI tools in an appendix, detailing their specific use and purpose in the writing process. However, this raises questions about the extent to which we should acknowledge AI contributions and how that compares to other forms of assistance, such as spell checkers or language tools like Grammarly.

One important consideration is whether AI tools are always correct. While they excel at constructing language, they may strip away the author’s voice and inadvertently introduce errors. In my experience, Grammarly sometimes made corrections to the text generated by ChatGPT, highlighting the need for human judgment in refining the output.

In conclusion, ChatGPT became a game-changer for me, transforming my writing process in ways I had never imagined. I believe it should be embraced as a valuable resource for students to tighten their drafts before seeking extensive academic support. By incorporating AI tools, we can alleviate the burden on educators, who can then focus on reviewing more polished submissions requiring minimal intervention. However, it is crucial to strike a balance and ensure that AI enhances our own voices rather than overshadowing them. AI is undoubtedly here to stay, and by harnessing its power thoughtfully, we can elevate our writing while preserving our unique identities as authors.

Thoughts from the Proofreader:

I was asked to proofread Tom’s blog post here and have a look at ChatGPT’s version too. Comparing the two, I definitely found Tom’s writing much more personable – I could imagine him speaking as I read through it. ChatGPT’s version presented the information efficiently and effectively, and I found nothing to correct. However, in its response to the prompt, it had taken away everything that made Tom sound like Tom!

This experience also really made me realise how many misconceptions I have about AI tools. I’d assumed that ChatGPT would have a very grammatical approach to its interpretation of language – rather like I’ve been told I have! However, when Tom asked it about the difference between ‘that allowed’ and ‘allowing’, ChatGPT talked with authority about implications and potential meanings. This answer was a long way from my interpretation, which attributed the difference to the grammar of relative clauses (X refers to one thing, Y refers to another). As Tom demonstrated with his irony example, it’s worth being cautious with how far we trust its responses. And I think we can be confident that human input will still be needed for a few years (or at least months) yet. 

Credit Image: Photo by Bram Naus on Unsplash

Guest Blogger: Tom Langston – Last Night ChatGPT Saved My Life…

The classic line from Indeep’s 1982 post-disco hit “Last Night a DJ Saved My Life” serves as the basis of my title – because ChatGPT did, not literally but figuratively, save my life.

I am currently studying for the university’s Professional Doctorate in Education and, since February, have been completing my final taught assignment entitled “Research Proposal”.

It takes me a while to write, as I don’t find it easy, and I am not that good at it. My approach to writing is best described as little and often, frequently getting distracted and losing focus on the task at hand. If you have ever seen Dave Gorman’s “Googlewhack Adventure”, you will understand my problem: his book and stage show outlined how he was meant to write a fictional novel and ended up writing and performing about his experiences of Googlewhacks (Googlewhacking?) instead. He got distracted by less important but much more fun endeavours. 

The other problem is that I don’t get to the point. I write a verbose mess of unconnected thoughts. So with that in mind, I am going to explain how the rise of the machines (we are not far away from ChatGPT working out it wants to be Skynet) assisted me in writing my “research proposal” assignment. 

First Drafts

To start with, I had human help. Stephen Webb, as my project supervisor, read a draft I had written. He showed that a lot of my writing relied on small words to, as he put it, “do the heavy lifting”. I used ‘this…’ and ‘it…’ frequently because I understood what I was trying to say – he pointed out that the reader might not. The only reason he could even start to understand my work was that he knew me and the context of my subject.

From his extensive feedback, I redrafted, edited and tried to improve my proposal. 

After completing the next draft, I thought I was done. However, I was well over the word count, so I put sections into ChatGPT and told it to re-write them within my word limit. It only bloomin’ did it! The problem was the result was very sterile; it lost all sense of who I am as a writer. This meant I was going to have to get the word count down manually.

After another version, I asked Stephen for some clarification on some of his earlier feedback. His reply was:

“In the section you sent through, there are still individual sentences that are hard to parse. The initial word of a sentence or clause primes the reader to think in a certain way; if the sentence or clause doesn’t deliver, then confusion arises. You really should pass this through a grammar checker, Tom.”

Not being a confident writer, I already knew that my writing needed work. What this feedback demonstrated to me was that I lacked cohesion and structure. My train of thought constantly jumping around the entire document is a by-product, I guess, of writing in such short unfocussed bursts. 

I wrote a heading titled “Introduction” but did not actually introduce anything. As Stephen had pointed out, I had poorly communicated my narrative. What this meant was that readers were getting lost in my disconnected thoughts. I could now see that certain sections felt shoe-horned in because I liked what I had written. Realistically, those “gems” made no sense to the reader as they were not connected with the text surrounding them. 

I tried to use ChatGPT to see if it could make sense of it, but there was too much there for it to tell me how to rearrange it. The system did not accept the entirety of the text and could not provide clarity to the sections I had written. ChatGPT could not offer clear guidance for how best to rework and rearrange it.

A Quick Aside

At this point, on re-reading the above section, I think it nicely illustrates the problem I was talking about. I have said the same thing here in three different ways. 

  1. I tried to use ChatGPT to see if it could make sense of it, but there was too much there for it to tell me how to rearrange it. 
  2. The system did not accept the entirety of the text and could not provide clarity to the sections I had written. 
  3. ChatGPT could not offer clear guidance for how best to rework and rearrange it.

I only need to use one. In the past, Stephen had raised this issue with me, but it is a hard thing to change. What I am going to discuss next is my recent experiences of using ChatGPT as a tool to develop the parts in my writing I would not have picked up on myself. Heck, as I go back through this blog, I’m sure I’ll find more. I just won’t mention anything I’ve changed, so you’ll never know… 

Editing with ChatGPT

As I worked my way through the editing process, I started to use ChatGPT to check sentences I wanted to keep but was struggling to make coherent. They were either too long, or the idea I was trying to convey was not coming through clearly enough. 

An example of this would be when I asked ChatGPT to “check the grammar”:

My original question:

My situation should be made clear to the participants, allowing them to have fully informed consent, but to also highlight that if they perceive issues in the current hybrid environments I am hoping to act on their concerns and not silence their voices.

ChatGPT’s reply:

The grammar of the provided text is correct. However, there are a few suggestions for improving clarity and readability:

“My situation should be made clear to the participants, enabling them to provide fully informed consent. Moreover, it is important to emphasise that if they perceive any issues in the current hybrid environments, I am committed to addressing their concerns and not silencing their voices.”

In this revised version, I made a few adjustments to enhance the sentence structure and clarify the message.

It was this repeated process that helped me (despite, at times, slowing me down), as I kept re-reading the submission each time, trying to tighten up my writing – finding thoughts that were slightly out of place and attempting to move them to the right spot. It became a good rule of thumb that if text did not fit in the new, leaner version, despite how much I wanted to keep it, it was gone. (Well done Tom – you just discovered the true meaning of editing!)

Below is an example of how I started to use ChatGPT to my advantage so I could improve my writing. 

My question to ChatGPT:

Should it be “that allowed” or “allowing” in the following text – 

Before the pandemic only the most confident teaching staff employed interactive tools (Nearpod, Mentimeter, etc.) within a lecture space that allowed students the ability to creatively engage with the material (Tan et al., 2020).

ChatGPT’s reply:

Both “that allowed” and “allowing” are grammatically correct in the given sentence. However, the choice between them depends on the intended meaning and emphasis.

“That allowed”

Using “that allowed” suggests that the interactive tools were specifically designed or configured in a way that granted students the ability to creatively engage with the material. It implies that the tools had a built-in functionality or feature enabling such engagement.

“Allowing”

Using “allowing” suggests that the interactive tools, in a general sense, provided the opportunity or capability for students to creatively engage with the material. It implies that the tools had the potential or capacity to facilitate such engagement.

Both options are valid, but if you want to emphasize that the interactive tools had a specific functionality designed to enable creative engagement, “that allowed” would be more suitable. If you want to convey a more general sense of opportunity or capability, “allowing” would be a better choice.

As you can see, ChatGPT helped to explain how a slight difference in sentence structure changed the meaning of what I was writing. I was able to consider what it was I was trying to say and use the most appropriate version.

Through all my previous doctoral assignments, I had a different tutor who helped me with my drafts. They told me they liked my writing style, which gave me the impression I was good at it. I can’t say what they specifically liked about my writing. I can make an educated guess that, because it was a level 8 assessment, my mistakes were forgiven. The marker was able to interpret the little words “doing the heavy lifting” (as Stephen had put it) more easily than your ‘average’ reader. Stephen helped me understand that it is one thing to have an interesting voice running through your text, but it is quite another to actually be good at writing.

The ethics of using AI

When I got ChatGPT to reduce my word count, I spoke with Stephen about the ethics of it and (ignoring the sanitisation of the output) it felt like a great way for a writer to do this. However, it felt wrong to take what ChatGPT had created as my own.

There is going to be an ethical divide between those who see using any form of AI as wrong and those who see it as a tool for improvement. The problem (as always) lies in the grey area and the boundaries of where people choose to deploy it – for example, how far the tool is shaping their work beyond what would have been possible by them alone. While knowing it might be unethical, some will use it due to other commitments (work, family, etc.). This scenario is a foreseeable eventuality, much like those who copied others’ work or paid essay mills. But perhaps AI may feel slightly more ethical? As I am about to discuss, maybe a strong referencing code is what is required. But people (I was going to put ‘students’, but felt that unfair, as we all have a personal line we move and adjust depending on the subject) will always push and flex the boundaries of fairness.

Referencing AI

In a recent ALT mailing list post, a question was asked about referencing when AI was used to support work. The reply pointed to guidance from Boston University’s Faculty of Computing and Data Sciences, “Using generative AI in coursework”. The post highlighted this text:

“When using AI tools on assignments, add an appendix showing

(a) the entire exchange, highlighting the most relevant sections;

(b) a description of precisely which AI tools were used (e.g. ChatGPT private subscription version or DALL-E free version),

(c) an explanation of how the AI tools were used (e.g. to generate ideas, turns of phrase, elements of text, long stretches of text, lines of argument, pieces of evidence, maps of conceptual territory, illustrations of key concepts, etc.);

(d) an account of why AI tools were used (e.g. to save time, to surmount writer’s block, to stimulate thinking, to handle mounting stress, to clarify prose, to translate text, to experiment for fun, etc.).”

This almost feels as if there are too many barriers to use for a subject (or level) where you may be writing assignments of 6,000+ words – especially if you have to reference every section of text AI helped with. The follow-up question, then: do you have to highlight every time spellcheck made a correction? Do you have to disclose that you used a service like Grammarly to make minor amendments? That is, of course, hyperbole. I think that if you get AI to generate starting paragraphs or change substantial amounts of your work, then yes, this should be explained in an appendix. For re-writing a sentence or making grammatical choices, then no. But that raises another question: how many sentences constitute a substantive change?

Are AI tools always correct?

The final point relates to whether these tools are always correct. Now, I know they are better than me at the construction of language. I have always been better at face-to-face conversation as a form of communication: during a conversation I can see when people don’t understand me, and adapt how and what I say accordingly. Writing does not afford the writer or reader the same luxury. So why do I ask if they are correct? As I have already mentioned, ChatGPT sterilised my voice. When asked to reduce the word count, it did what it was supposed to do, but it stripped out the humanity of the author (unsurprising, given that it is a computer).

Another example (inside a paragraph that starts “The final point”, I do get the irony) is that Grammarly made corrections to the generated text from ChatGPT. 

[Screenshot: on the left, ChatGPT’s reply beginning ‘Both “that allowed” and “allowing” are grammatically correct in the given sentence’; on the right, a Grammarly suggestion headed ‘Formality’, proposing ‘to engage with the material creatively’ because the to-infinitive ‘to engage’ has been split by the modifier ‘creatively’, and noting that avoiding split infinitives can help writing sound more formal.]
Another “another example” of a final point is: I put the above paragraph into ChatGPT to ask if I had used irony correctly. This is what it said:

Yes, the use of the word “irony” in this context is correct. Irony is a literary device that involves a contrast between expectations and reality. In your statement, you are highlighting the irony in two situations:

The first example of irony is that despite recognizing that language generation tools like ChatGPT are better than you at constructing language, you also acknowledge that you excel at face-to-face conversations where you can adapt and clarify your communication based on people’s reactions. The irony lies in the contrast between the effectiveness of these tools in writing and your personal strength in face-to-face communication.

The second example of irony is that Grammarly, a tool designed to enhance writing and make corrections, ends up making corrections to the text generated by ChatGPT. Here, the irony arises from the fact that a tool meant to improve the quality of the text is itself making changes to the output of another tool that is supposed to assist in generating text.

Overall, both examples demonstrate a contrast between expectations and reality, highlighting the ironic situations in your statement.

The system had not understood how I had used irony in the sentence, but it said that the overarching point I was making was ironic.

This image is of a person standing, with their eyes focusing as if they're looking at someone else who might be sitting down, because their chin is tilted down a little. The wording across the top of the image reads: I am still none the wiser if I used irony correctly in the parenthesis when I keep adding examples to a final point. The wording along the bottom of the image reads: And at this point I'm too afraid to ask.

Conclusion

In conclusion, ChatGPT ‘saved my life’ by allowing me to interrogate my writing in a way I have never been able to before. We should promote ChatGPT to students as a resource that can help them tighten up their drafts before they need academic support. It could also alleviate the burden on academics, who are often asked to critique early drafts of work; academics should then only see much more polished versions of submissions that require less input from them.

As a final example, I didn’t like my last sentence. ChatGPT gave me another version:

It should be used to relieve the burden on academics who are frequently asked to critique early drafts. Ideally, academics would primarily review more polished submissions that require minimal input.

I didn’t like that version either. Maybe I should just delete it? That is where the human side wins out over AI: your sense of self, and how you want to sound when writing.

AI is here to stay (and take over the world if the Terminator documentaries are anything to go by), but actually, that is alright (AI being here as a tool, not taking over the world). At levels 4 and 5, we need to find creative ways to incorporate AI tools into the assessment process (which may mean fewer essays). At levels 6, 7 and 8, it’s about using AI to help improve our own voice without losing it.

The ChatGPT re-write: How ChatGPT Revolutionized My Writing Process – The AI version.

Credit Image: Photo by rupixen.com on Unsplash

Image in the text created by T. Langston using imgflip.com

S01E07 – Dr Lynn Gribble – Artificial Intelligence


In this special AI episode of the Tel Tales podcast, Associate Professor Lynn Gribble discusses the impact of artificial intelligence in higher education, and how assessment can be adapted to become more authentic for our students.

Associate Professor Lynn Gribble is an Education Focused academic in the School of Management and Governance at The University of New South Wales Sydney. Awarded an AAUT citation for her leadership and impact as a digital innovator, she has taught management to large classes of Master of Business Administration and Master of Commerce students for 15+ years and has pioneered the use of voice recordings, audience response platforms and learning analytics to personalise every interaction with her students, increasing both their engagement and learning outcomes. Lynn co-leads Communities of Practice in Online Learning and Innovation, and the 4Cs (A Strategic Approach to Impact) and is a Senior Fellow of the Advance HE UK.

You can read two recently published blog posts that Lynn has written on the impact of AI in higher education here…

Overconfident with ChatGPT and Generative AI – Time for our students to think again

https://www.education.unsw.edu.au/news-events/news/overconfident-chatgpt-and-generative-ai

Surviving the start of 2023 in the face of generative AI

https://www.education.unsw.edu.au/news-events/news/surviving-start-2023-generative-ai

You can subscribe to the Tel Tales podcasts on Spotify, Apple Podcasts or listen directly through the Tel Tales website.

Original audio created by Chris Wood for use with the Tel Tales podcast.

S01E06 – Mike Wilson – WiseFlow


In this final episode of Season 1 of the TEL Tales podcast, Mike Wilson discusses the WiseFlow pilot to provide one integrated assessment platform that serves our current (and future) assessment needs within Portsmouth University.  He explains how the platform makes it easier for academics to design assessments and for students to find their assessments and feedback, while reducing the manual work around assessments for academics and support staff.  Mike also shares some of the innovative practices the core project team have developed, along with some of our staff and student feedback about the project.  Below are some links to explore further…

UniWise Website

https://www.uniwise.co.uk/wiseflow

Portsmouth University – Digital success plan for Teaching and Learning 2021-2025

Digital success plan for Learning and Teaching

Got a story to share? If you’d like to appear on the podcast, please email chris.wood1@port.ac.uk and Chris will be happy to discuss this with you.

You can subscribe to the TEL Tales podcasts on Spotify, Apple Podcasts or listen directly through the TEL Tales website.

Original audio created by Chris Wood for use with the TEL Tales podcast.

S01E05 – Beth Hallissey – Simulation Technology / Gamification


As a teaching fellow within the dental academy, Beth discusses the importance of simulation technology and gamification for students in this episode of the TEL Tales podcast.  Hear how this is creating a sense of realism and immersing students in their studies.  You can find out more information about the dental academy below…

https://www.port.ac.uk/about-us/structure-and-governance/organisational-structure/our-academic-structure/faculty-of-science-and-health/dental-academy

Got a story to share? If you’d like to appear on the podcast, please email chris.wood1@port.ac.uk and Chris will be happy to discuss this with you.

You can subscribe to the TEL Tales podcasts on Spotify, Apple Podcasts or listen directly through the TEL Tales website.

Original audio created by Chris Wood for use with the TEL Tales podcast.

S01E04 – Tom Cripps – Accessibility


In this episode of the TEL Tales podcast, we discuss accessibility with Tom Cripps including why accessible content is vital for all students as well as share some simple and effective tips to improve accessibility for everyone. Below are links to some of the resources that Tom spoke about within this podcast…

General good link to accessibility information & Resources

https://web.dev/learn/accessibility/

Link to government information regarding accessibility legislation

https://www.gov.uk/guidance/accessibility-requirements-for-public-sector-websites-and-apps

WEB Aim Colour Contrast Checker

https://webaim.org/resources/contrastchecker/

SCULPT at Worcestershire County Council

https://www.worcestershire.gov.uk/digital-worcestershire/sculpt-accessibility

Got a story to share? If you’d like to appear on the podcast, please email chris.wood1@port.ac.uk and Chris will be happy to discuss this with you.

You can subscribe to the TEL Tales podcasts on Spotify, Apple Podcasts or listen directly through the TEL Tales website.

WiseFlow ePortfolio – Unlocking the power of WiseFlow: Transforming ePortfolio assessments

In the digital age, traditional paper-based portfolios have given way to ePortfolios, harnessing a powerful way to showcase a student’s work that demonstrates their learning, progress, reflections and achievements, over a period of time. ePortfolios are increasingly becoming popular in education as they offer several benefits to both students and academics.

For students, ePortfolios provide an adaptable platform to showcase their learning journey, including their best work and reflections on when it didn’t go quite to plan, and draw on evidence from a range of sources whether that be PDFs, images, videos, audio snippets or written text. This process helps students develop their metacognitive skills and self-awareness as learners over a period of time.  Academics, on the other hand, can use ePortfolios to assess students’ learning outcomes in a more comprehensive and authentic manner. In turn, this allows academics to gain insights into students’ thought processes, identify their strengths and weaknesses, and provide targeted feedback. Additionally, ePortfolios allow academics to track students’ progress and provide evidence of their achievements.

Using ePortfolios also builds several skills, including digital literacy, communication and critical thinking – all of which are vital in the modern workplace. Students have to select, curate, and present their work in a clear and engaging manner. They also have to reflect on their learning process and map this to learning outcomes. These skills are crucial for success in the modern workplace, where digital communication and collaboration are essential. With a background in teaching vocational courses for 12 years at Further Education level, I’ve seen first-hand the impact and outcomes of effective ePortfolio use for both students and academics. 

At Portsmouth University, we have struggled to find a solid ePortfolio solution. We currently use a popular open-source ePortfolio platform that allows students to create and share their digital portfolios. While the platform has several benefits, including flexibility, customizability, and integration with other systems, it also faces some challenges. One major issue is its user interface, which can be overwhelming and confusing for some users – particularly in the setup stage of having to import the portfolio into your own profile. This process often leads to a lot of technical issues and puts up an immediate barrier to entry for those not tech-savvy. Additionally, the learning curve for using the platform can be steep, and it may take some time for users to become familiar with all the features and functionalities. However, despite these challenges, academics and students value the use of the ePortfolio system on offer and the benefits this provides.  

We are currently coming towards the end of the first stage of a pilot with a new system: WiseFlow. This is a cloud-based digital end-to-end exam and assessment platform that supports the assessment and feedback lifecycle for students, assessors and administrators. It’s fair to say that staff feedback about the WiseFlow pilot has been overwhelmingly positive. As a core project team, we’ve had the pleasure of working with academic teams to support students with innovative assessments in WiseFlow, across a range of disciplines. This all links to our Digital Success Plan to (re)design robust assessments to meet the needs of the diverse student population within a blended and connected setting, and to incorporate a robust specialist end-to-end assessment platform. Our aims in the project were to make it easier for academics to design assessments, easier for students to find their assessments and feedback, and to reduce the manual work around assessments for academics and support staff.  All of which WiseFlow seems to have been able to deliver. 

Within the pilot, we wanted to really push the boundaries of WiseFlow – utilising a wide range of assessment types to test whether WiseFlow can become the go-to platform for assessments at Portsmouth University.  One of the big challenges for us was to find an ePortfolio solution that is user-friendly and adaptable across a range of disciplines, as well as providing a versatile feedback loop where students could receive formative feedback on their work from assessors and develop ideas prior to final submission. After putting this challenge to the team at WiseFlow, they came back with a solution.

Block arrows showing the timeline of a flow ePortfolio for Online Course Developers, eLearn Team, Students and Academics

Traditionally, a FlowMulti (just one of the many ‘flow types’ WiseFlow offers for assessment) would be used for open/closed book multiple-choice exams, where the participants fill out a provided multiple-choice test.  However, the team at WiseFlow suggested we could utilise this functionality to use as a bespoke ePortfolio solution.

Using a FlowMulti allowed us to replicate the layout and design of current ePortfolios as well as allow us to adapt the setup to truly take ePortfolios to the next level. To create the feedback loop, we allowed assessors early access to the work, early release of feedback to students, and students to submit unlimited times before the deadline.  The portfolios could be easily updated year-on-year, were inviting for students to engage with, and could be authored by multiple academics at the same time. This seemed like the perfect solution.  

After testing, adapting and re-testing, we felt this solution offered a totally new level of ePortfolio compared with our current offering. The ability to re-purpose traditional multiple-choice questions allowed us to push the boundaries of assessment further than ever before; the only limitation is our own creativity in adapting and repurposing them. We put together a showcase PGCert portfolio to present our findings to academics, who immediately fell in love with the platform, and we started working together to develop a portfolio to run within the pilot.

“As a course team, we are incredibly excited about the flexibility that the Wiseflow ePortfolio has to offer. Working with the project team we have been able to design a summative assessment vehicle which is both intuitive for learners and versatile enough to encompass a broad range of tools which enable the course Learning Outcomes to be demonstrated in an engaging and meaningful way.”  Dr Joanne Brindley, Academic Practice Lead & Senior Lecturer in Higher Education.

A screenshot of the page the participant will see, asking them to reflect on their skills. A screenshot of the page the participant will see, showing an empty drop box for the participant to upload their activity into. A screenshot of the page the participant will see, asking them to type up a reflective statement on using technology to support learning.

We are now in the “participation” phase of two ePortfolios – one for the Research Informed Teaching module and one for the new Level 7 Teach Well: Principles to Practice professional development module.  We have had great experiences re-designing pre-existing portfolios to really push the boundaries of what is possible in WiseFlow. We’ve added interactive elements, by turning traditional questions and approaches on their head – such as using a histogram for reflection, allowing students to visually reflect on skillsets pre- and post-observation. We’ve provided students freedom of choice with assessment by integrating a voice recorder into the portfolio and also utilising existing platforms to integrate into the WiseFlow portfolio. Really, the only limitation is our own imagination.  

“We teach the PG Cert Higher Education so our students are staff. The platform is incredibly user-friendly for both staff and students. We used it for ePortfolio as the last platform created lots of complaints, whereas this platform has led to lots of compliments.  The staff members spoke highly of the platform and I believe, many have asked to be part of the pilot next year due to their positive experience.”  Tom Lowe, Senior Lecturer in Higher Education

There has been overwhelmingly positive feedback from academics and students regarding the usability and functionality of WiseFlow as an ePortfolio solution.  Through word of mouth and first-hand experiences from early career academics, particularly those studying on the Research Informed Teaching module, the platform’s potential for enhancing their own teaching has become widely recognised.  I remember being invited to one of Tom’s lectures to showcase the platform to the students who would be using it, and the response was remarkable. Staff were excited to use this as students and saw the immediate potential for their own teaching. It is always a good sign of an innovation when there is an immediate benefit to both staff and students that can be applied instantly in the classroom. Essentially, we now have a waiting list of academics wanting to work with us to develop ePortfolios in WiseFlow – with no advertising at all, purely from those who have used it as students. We believe that when this is advertised, we will see a huge influx of academics wanting to use it. We have also spoken to other universities in WiseFlow user groups, who are actively keen to explore this and want to learn about our innovative approach. The potential of this solution is game-changing, not just for us, but for other Higher Education institutions. 

However, this innovative approach of essentially turning a quiz assignment on its head does not come without drawbacks that need to be considered before academics embark on an ePortfolio solution within WiseFlow.  There is currently a 12-file limit, set at 10MB per file, when students upload files into the portfolio. Although it is great that students can do this, it does not lend itself to modern file sizes or to some of our subject areas (for example, our Creative and Cultural Industries Faculty, where students would regularly upload large .psd and CAD files, HD video, high-quality audio, etc.). In our initial pilot we haven’t encountered this issue, but it’s worth considering whether this is the correct way to proceed with an assessment. The limit on the number of files is also a concern: some students in our pilot have reached the 12-file upload limit. While there are workarounds, such as storing files in a Google Drive folder and sharing the link, or combining multiple files into one, this defeats the purpose of an ePortfolio as an all-encompassing system. Perhaps a better approach would be an overall upload limit, with a defined combined file size.  The final consideration is that once the ePortfolio is live, we cannot make changes.  We’ve worked extensively with academics and our support teams to iron out any issues prior to release, but again, this is important for academics to understand. Careful planning and consideration must take place in the authoring phase of an assignment, which will then be rigorously checked prior to release – in the same way an exam would be. Despite these setbacks, we’re actively in discussions with WiseFlow about developing this functionality and hope to make progress in the near future. 
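The difference between the two policies above can be sketched in a few lines. This is a purely illustrative Python example (the helper functions and the 120MB combined cap are our own hypothetical choices, not WiseFlow behaviour, whose limits are enforced within the platform itself):

```python
# Hypothetical upload validators comparing the current per-file policy
# with the proposed combined-size policy. Illustrative only.

def check_per_file(sizes_mb, max_files=12, max_file_mb=10):
    """Current policy: at most 12 files, each no larger than 10MB."""
    return len(sizes_mb) <= max_files and all(s <= max_file_mb for s in sizes_mb)

def check_combined(sizes_mb, max_total_mb=120):
    """Proposed policy: any number of files, up to a combined total."""
    return sum(sizes_mb) <= max_total_mb

# Thirteen small files: rejected under the per-file policy purely on
# count, yet only 26MB in total - fine under a combined cap.
uploads = [2] * 13
print(check_per_file(uploads))   # False - exceeds the 12-file limit
print(check_combined(uploads))   # True - well under the combined cap
```

The sketch shows why a combined cap feels fairer: a student with many small artefacts is blocked today on file count alone, while a student with twelve 10MB files can already upload more data in total.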

The future of ePortfolios in WiseFlow is exciting, and we can’t wait to see how they will continue to be developed across the University. The ability to adapt and transform ePortfolios will open up new doors for our students and academics to really develop the ways in which students can showcase their knowledge and understanding. We’re hoping for a successful run of ePortfolio use within our pilot and looking forward to developing new ideas as we move into the future.  

Until next time. Watch this space.

Chris.

S01E03 – Maria Hutchinson – Teach Well


In this episode of the TEL Tales podcast, we discuss the brand-new Teach Well: Principles to Practice level 7 professional development module, developed exclusively here at Portsmouth University.  Maria discusses the module itself, some of the activities participants undertake, and how completing this module is upskilling our staff at Portsmouth University and leading to progression. To find out more information about the Teach Well: Principles to Practice module or to enquire about taking the course, please contact maria.hutchinson@port.ac.uk

Got a story to share? If you’d like to appear on the podcast, please email chris.wood1@port.ac.uk and Chris will be happy to discuss this with you.

You can subscribe to the TEL Tales podcasts on Spotify, Apple Podcasts or listen directly through the TEL Tales website.

Original audio created by Chris Wood for use with the TEL Tales podcast.

S01E02 – Shaun Searle – Learning Glass


In this episode of the Tel Tales podcast, we discuss the Learning Glass here at Portsmouth University and the impact it is having on teaching and learning.  To book a session to use the Learning Glass, please email elearn@port.ac.uk and Shaun will be happy to discuss your needs and arrange a recording session.

https://learningglass.eu/

Got a story to share? If you’d like to appear on the podcast, please email chris.wood1@port.ac.uk and Chris will be happy to discuss this with you.

You can subscribe to the Tel Tales podcasts on Spotify, Apple Podcasts or listen directly through the Tel Tales website.

Original audio created by Chris Wood for use with the Tel Tales podcast.


© 2024 Tel Tales
