Learning Technologies Awards 2017
On 29th November, the Park Plaza Westminster Bridge Hotel in London hosted an event celebrating the very best in learning technologies from across the world. I was lucky enough to be there, seated alongside Andy Kirke from Sheffield Hallam University in an 800-strong audience eagerly awaiting the results. We both had a vested interest in the outcome, as our joint 'Innovation in Paramedic Training' submission had been shortlisted in the 'Best learning technologies project - UK public and non-profit sector' category.
This year's event was the biggest to date, with entries from some 30 countries across the globe. To have been shortlisted was an achievement in itself, but to have lifted the bronze award is a fantastic result and testament to the terrific work at Sheffield Hallam University and a technology we're really rather fond of - PebblePad. Below you'll find an overview of the award submission, outlining the challenge, the solution and the benefits of this transformative approach. Enjoy.
Left to right: Sara Pascoe, Andy Kirke, Debbie Holmes & Deborah Frances White.
Exceeding expectations
Sheffield Hallam University needed an effective way to move away from traditional clinical assessment for its first- and second-year undergraduate paramedic trainees. A key requirement was to support assessment at scale whilst maintaining quality standards. The team turned to PebblePad's Personal Learning Space and eportfolio platform for a solution: a structured online workbook that allows trainees to upload video evidence of their capabilities and reflect on their own performance before submitting for formal assessment. The results have exceeded expectations. The new model has not only achieved the team's goal of scalable assessment, but has also reduced costs and motivated trainees to prepare better for their assessments.
The challenge
The initial clinical capabilities of trainee paramedics on undergraduate university courses are assessed via simulation activities, in which trainees carry out medical procedures, typically on manikins, whilst being observed by a qualified examiner. These assessments, known as Objective Structured Clinical Examinations (OSCEs), are commonplace in healthcare training. The examiners observing and assessing trainee performance may be members of the course delivery team, associate lecturers or medical practitioners.
With four groups of 70 trainees each undertaking multiple OSCEs, and only five staff available to manage the assessment, course leaders faced a challenge in finding enough examiners for these live activities. With numbers increasing year on year, the live observation process was becoming unsustainable. Because of the pressure on assessor availability, trainees were required to complete several OSCEs back to back in a simulation suite. There was little or no time for individuals to reflect on their performance of any single activity or learn from the experience, resulting in missed learning opportunities for trainees.
Additionally, the University had no lasting record of the live performance other than the assessors' notes, so if a trainee challenged an assessment result, reviewing the outcome could be problematic. It was also evident that being watched closely in a high-stakes observation caused some trainees exam stress, which inevitably meant some candidates did not perform at their best.
The solution
The course leaders of the BSc Paramedic Science course evaluated the level one and level two OSCEs, carried out in the first two years of the course, and concluded that these did not need to be observed live in person. Video evidence of the activity was appropriate for these initial OSCEs. To manage the video submissions effectively the course team created a structured workbook in PebblePad for students to complete. The workbook includes placeholders for students to add videos of each of their OSCEs and an area for them to reflect upon their own performance. The videos can also be peer reviewed within PebblePad, providing an opportunity for formative feedback for a first attempt at a skill before formal assessment by an appropriate staff member.
An example of a digital PebblePad workbook *
Students record video on their own mobile devices and upload it against the relevant competencies using the PebblePad app (PebblePocket). This provides the additional benefit of compressing the video on upload.
In the initial rollout, trainees were required to upload three videos for each OSCE - the first being reviewed and reflected upon by the student, the second reviewed by a peer and the third reviewed by an assessor. The three videos were created at different stages in the course to allow the trainees to practise their skills in light of their own reflections and the feedback of peers. This helped ensure the final submitted video met the assessment requirements.
An example of peer review in a shared PebblePad workbook *
The OSCEs are performed in a simulation suite where stations are set up for each specific examination task. To support trainee practice and combat variations in technique, trainees could view exemplar videos of each skill, recorded by the project team, to help them understand what is required for each assessment. An assessment deadline was set, and trainees could book a time to use the simulation suite, without requiring supervision, whenever they were ready to complete the OSCEs.
After the deadline, assessors could review the submitted digital workbooks and provide feedback on each element. Any inexperienced assessors could seek the advice of those more familiar with the assessment process as they reviewed the video evidence. This was particularly helpful where submissions did not meet the required standard or were viewed as borderline. As the whole assessment process was documented, and iterations of trainee competency could be tracked, there was a significant reduction in the number of appeals against failed assessments.
Once trainees complete this part of the course, the resulting digital workbooks can be linked to a larger full course eportfolio which tracks their long-term assessment and progress toward professional status. This eportfolio can also be shared with practice-based educators to enable them to review the capabilities of the trainees as they begin to work in practice.
Completed course templates & workbooks can be embedded in personal and career portfolios *
In summary
The project has surpassed the expectations of the project team by not only providing a sustainable model for the future, but also enhancing the learning process for the trainees, improving their performance, and reducing the cost of the assessment process.
PebblePad is used extensively across a range of healthcare disciplines. If you'd like to learn more about how it is used, we've made a free health education resource pack available below. It contains: an eportfolio comparison checklist to help you compare PebblePad's features and functionality against other eportfolio platforms; our Implementation Toolkit, based on our extensive experience (and validated by advice published by Jisc), to guide you through the steps required for implementation success; and a digital pack containing a range of health education case studies.
* Product examples included in this post
Examples of PebblePad functionality have been included purely to illustrate the processes outlined within this post - they do not represent the actual workbooks used by the team at Sheffield Hallam University.