Reading Time: Less than 5 mins.
Synopsis: Higher Education is a fast-paced and dynamic sector in which institutions must constantly adapt to their students’ needs. Accelerated by the Covid-19 pandemic, institutions have to be more agile and responsive than ever. In this guest blog, Cardiff University’s Student Engagement Manager, Eleanor Mayo-Ward, reflects on her experience of creating an innovative and refreshing approach to capturing the student voice during the pandemic using ‘Cardiff Pulse’.
As at many other institutions, Covid-19 has definitely shone a spotlight on student voice at Cardiff University. We needed a better way of having conversations with our students, one that was more agile and responsive.
Before the pandemic, we relied very much on traditional surveys – the National Student Survey, the Postgraduate Taught Experience Survey and the Postgraduate Research Experience Survey – and we had relatively low scores in some of these metrics, which we don’t shy away from. But the changes in circumstances brought about by Covid meant that we had to work differently and be more adaptive.
Moving to a More Agile Approach
Where we are today stems from a wider piece of work the University started in January 2020 on its student voice practices: how we can better listen to our students, and what technology that might involve. The pandemic hitting in March 2020 obviously affected our timelines, but we still needed to be more timely, agile and responsive in our student voice mechanisms, and we received HEFCW funding which allowed us to think differently. Explorance had the solutions; we began working with them in late February 2021 and, having built it in Blue, launched ‘Cardiff Pulse’ in March.
We called it Pulse, not a survey, because we wanted to frame it as a conversation starter with students. It was a monthly series of questions, running from March to June, open for seven days to all 30,000 students, through which we could pick up any discussions that needed to happen with them. We kept it short, sharp and snappy with six or fewer questions. At least one question we kept consistent over time so we had a baseline; others we would adapt each month – asking a question in an earlier month and then following up to see if things had improved, or asking any follow-up questions that arose. We wanted it to be agile, not ‘you tell us what you think we need to know and we will give you a response’, and this was a big culture change.
We have 26 student champions, recruited each year and paid to work with us on various projects, and they helped us to develop Pulse as we went along. Each month we would work with the champions to build the question set and to check it all made sense to students. The questions were generally the same for all students, although sometimes we asked routed questions, mostly to split undergraduates and postgraduates. The student champions also helped us to promote Pulse to students, including via a video.
Integrating Cardiff Pulse with Blackboard
The best thing we did was integrate Pulse with our virtual learning environment (VLE), Blackboard. In March we received 2,205 responses (7.33%), but once we integrated with the VLE our response rate in April rose to 6,628 responses (20%). That is the highest number of responses (though not the highest percentage) we have ever achieved in one of our surveys. The results were reported at University and School level, and we had a key contact in each school to manage the results and the response to students. We also made sure we shared the data with our Students’ Union, who had access to every monthly report and could share it with their student reps.
Another important aspect was building a communications cycle around Pulse to promote it and close the loop, because telling students what we have achieved is one of the core issues we face. Ultimately, Pulse is the first mechanism that allows us to receive and deal with feedback in a timely fashion. Previously we were running the large national surveys that don’t report until the end of the year, so we were not able to make changes quickly whilst students were still interested in those issues.
Did it work? Well, our NSS Student Voice scores dropped by 9% in 2021, but so did the sector’s and Wales’s. However, we have had some very positive feedback from our students and Students’ Union about Pulse and about our shift towards a more agile approach to student voice. We don’t know what those scores would have been had we not run Pulse – they could have been significantly worse, and I would like to think Pulse did not have a detrimental effect – but we are continuing it now and into the next year.
Integration with the VLE has worked really well – it is so easy for students to access – and the fact that there are so few questions means they can fill them in quickly. The limited size of the question set, keeping consistent questions (adapted as needed) over time to give us a baseline and support longitudinal analysis, and not using the term ‘survey’ also helped. Having dedicated staff to look at results and respond to students allowed us to capture some of that local-level change. Students seeing communication from senior staff, including the Pro Vice-Chancellor, really helped to get buy-in.
There were challenges, of course, and the time for implementation and staff engagement beforehand was limited, as we launched just one month after agreeing the brief with Explorance. However, it did show that as a university we can work rapidly when it is needed and when our students need us to. It was really good to show that old institutions like ours can make big changes when the environment is right, and working at pace – driven by Covid – was a new culture for Cardiff.
Creating a New Culture at Cardiff University
We also learned that whilst the monthly cycle was agile and allowed us to have a good conversation with students, consulting on questions in such a short turnaround was hard, and the cycle was too quick for us to close the loop effectively. We didn’t have enough time between gathering students’ feedback and the next cycle to properly implement the changes they asked for. This was compounded by the fact that if the loop is not closed, students lose interest quickly. They want to see that change is happening, and we saw a drop-off in response rate towards the summer. Some of this can be explained by the time of year, exams and so on, but some of it also comes down to us and the closing of the loop.
Going forward, we are operating on a seasonal cycle instead of a monthly one, with at least six weeks between each Pulse so we can really act on the students’ feedback. We are also going to explore Blue’s functionality to tailor more of our data, either through routed questions or by creating separate definitions.
Cardiff is a beginner in this game, with lots to learn, but is an example of trying something different in an old institution. We knew we needed to do something bold to create a step change and overcome some of our cultural challenges, and trying and failing fast has helped us get there quicker. Significantly, however, this is also part of a bigger university plan to tackle student voice – and that plan has allowed us to think differently and with more agility.