The Learning University

In this week’s blog post, Paul Redford discusses the importance of evaluating university-wide initiatives.

It is clear that reflection and research evidence are key to effective practice. We encourage reflection in our students, as we know it is related to academic success, and in our staff, as we know it is related to teaching and learning quality (Bolton & Delderfield, 2018). We also encourage both students and staff to be research informed and to critically evaluate evidence. But to what extent are we reflective about the policies and practices undertaken within our institutions, and are our decisions informed by research evidence? As educators and educational developers we are interested in learning, but to what extent are the organisations we work in “learning organisations”?

It is easy to fall into the trap of developing initiatives (often devised by well-informed people with good data) without spending sufficient time reflecting on, evaluating and gathering research evidence about their impact and effectiveness. Externally funded projects require an evaluation of impact, but this is not always the case with internal practices such as university initiatives and changes in practice. At UWE (as at most universities in the current Covid era) we recently undertook a large project to ensure that our students “started well”: that they were prepared for the uncertainties and unusual nature of learning during the forthcoming year (and beyond), confident in their abilities, and well connected to other students and staff through a sense of community and belonging. The project was university-wide and involved all aspects of the institution, including programme teams, timetabling, wellbeing services, the library, learning technologists, communication and marketing, the Students’ Union and more. Of course, these services work together at the start of every academic year, but this year the project was more ambitious, aiming at a higher level of co-ordination and impact. The resulting outcome was an extended induction for all programmes across the institution, which impacted upon academic calendars and student start dates, and meant a delayed return to campus. This was also a challenging time for most people at the institution, in particular programme teams, who were managing a seismic and urgent shift to online learning. Alongside this new start to the year, the university also implemented an equally large evaluation of the project. A collective reflection. A research-informed evaluation.

The evaluation of this project was just as ambitious and involved as the “starting well” project had been. It took an evidence-based practice approach, identifying all the stakeholders impacted by the process and ensuring their voices were heard and included. This was quite an undertaking, particularly at a time when the “new start” project had come to an end and staff were facing the unpredicted challenges of shifting to online delivery within a complex and volatile HE environment. Having said that, engagement with the evaluation was excellent. The evaluation involved:

Staff:

  • Interviews with all faculty leads at a senior level, conducted by a senior team. 
  • A series of workshops available for all members of the institution. 
  • A series of focused workshops for targeted groups such as learning technologists and timetabling. 
  • A survey for all staff, covering the experience of programme teams as well as those delivering the project, such as the library and other central services. This was completed by over 350 staff members and included extensive quantitative and qualitative feedback. 

Students:

  • A series of focus groups for students at all levels to discuss their experiences. 
  • A student survey, aimed at all students, about their experience of the project as a whole and of particular elements, such as programme sessions, independent learning sessions and central sessions. This was completed by over 3,000 students across all faculties in the institution. 
  • Engagement data from different sources, such as viewing figures for videos and numbers of log-ons. 

The results of the workshops, focus groups and surveys were collated into a 40+ page evaluation document (not including more than 100 pages of qualitative responses from the staff and student surveys). The quantitative and qualitative results were analysed around core themes, as well as by examining the specific experiences of key groups (such as faculties and professional services).

The student data gave us valuable insight into the student experience of the initiative, as well as into whether it helped onboard our students. Importantly, we were able to discuss and frame ideas about how we can better prepare students for entry into HE. The data also gave us an opportunity to set our own students’ experiences against national surveys conducted at the same time. The staff data allowed us to reflect on the staff experience of not only the outcome, but also the process.

Overall, this evaluation gave us evidence of the impact of the “starting well” project at an institutional level. It facilitated conversations that could be informed by evidence about the prevalence of student and staff experiences. Crucially, we were able to avoid the situation where the loudest voice can sound like the voice of truth rather than one experience among many. We could draw evidence-informed conclusions about where our initiatives had worked and where they had not; who they worked for and what challenges there were for others; the extent to which the “starting well” project was successful, and particular areas of failure. The data was, of course, not homogeneous, with variations in views and experiences, which is to be expected in such a complex and widespread undertaking. However, we are now well informed about the key challenges and have informed ideas about how to address these issues in future incarnations of the project. We can now implement an adapted system, informed about the key issues, understanding better who needs to be involved, and with evidence about what needs to be delivered. We know more about what students valued and didn’t, and where they engaged and didn’t. Overall, the evaluation of impact has given us a valuable opportunity to reflect, to understand the impact and to improve our offering going forward.

Although the evaluation included many elements specific to this year, its framing was about growth and development rather than performance. Using evaluation as a tool for reflection and development allows us to foster a sense of psychological safety, where feedback is encouraged, engaged with and acted upon. The evaluation was not about apportioning blame or praise, but about understanding, developing and improving.

As may be clear by now, I have said nothing about the results. The review was not about judging the “success” or “failure” of the project; instead, it was about what we can learn and how this shapes future actions. The results are important, but much more important is the learning.

Paul Redford, Associate Director of Academic Practice, UWE Bristol

Contact: paul2.redford@uwe.ac.uk 

Reference

Bolton, G. with Delderfield, R. (2018). Reflective Practice: Writing & Professional Development (Fifth Edition). London: Sage. 

HE Policy Round-up of 2020

Rather than write our own policy round-up of 2020, I’m taking this opportunity to plug WONKHE (pronounced ‘wonky’), an independent organisation that provides commentary on the latest developments in the higher education sector for those who work in universities and for anyone interested and engaged in higher education policy, people and politics.

WONKHE offers daily and weekly updates on HE policy straight to your email inbox, with additional material via its website and podcast. UWE Bristol is currently a subscriber, so if you are a member of staff here you can sign up for free.

Today’s recommended read is Debbie McVitty’s review of 2020: Seven things the HE sector learned in 2020 – and what universities should prepare for in 2021. Enjoy!
