Dynamic Course Evaluation


How Hasselt University (UHasselt) converted existing course evaluations into micro surveys

FOCUS AREA

Course evaluation

TYPE OF INSTITUTION

University

TARGETED STUDENTS

+1,500


54%

Avg. response rate

+92%

Avg. completion rate

8.7/10

Student easiness

The Motivation

To improve student survey engagement and motivation, and to build even more proactivity into quality assurance.

The Solution

A custom StudentPulse course evaluation setup based on six quality criteria and three micro surveys every quartile.

...we were aiming to optimise the relevance and length of our surveys - and to act more often and more quickly on our students' input...

Context

Over the years, Hasselt University has used different student surveys to assess educational quality across the university, evaluating the quality of teaching, specific courses and the programmes in general.


Since the fall semester of 2021, StudentPulse has been used to convert existing surveys into a micro-survey setup for three of the university’s faculties, allowing a comparison between the traditional setup and a new approach to student feedback.

The Challenge

In their existing setup, Hasselt University experienced difficulties when surveying specific courses and programmes (course evaluations). Response rates and student easiness proved to be an issue, and programme staff needed a more flexible, customisable and usable platform that did not compromise the ability to compare data with other programmes and faculties. By implementing StudentPulse, the aim was to:

  1. Increase student engagement with educational quality assurance.
  2. Motivate students to fill in questionnaires related to educational quality in order to obtain reliable data.
  3. Support university staff in improving the quality of teaching during the semester, not just at the end.
  4. Compare results across courses, programmes and faculties, while still allowing a customised feedback setup within each of these.

The Solution

The main challenge for Hasselt University was to increase student survey engagement and motivation, to disseminate feedback in an easily interpretable way, and to make sure feedback was used to improve students' experience.

“In using StudentPulse we were aiming to optimise the relevance of the questions asked, diminish the length of the questionnaires, update to a more user-friendly application, and implement a shorter and more transparent feedback loop with linked actions.

This way we hoped to increase the response rate, to be more actionable, and to react quickly when problems or challenges arise.”

Franne Schepers

Quality Assurance, Hasselt University

Involving the organisation

To align the new setup with organisational needs, an internal analysis of the different quality assurance tools used at Hasselt University was conducted amongst students and staff before implementing StudentPulse, pinpointing the strengths and weaknesses of the institution’s QA instruments. A selection of faculties and study programmes was then made.

When the implementation started, the selected faculties were involved in choosing questions within six different quality criteria. Together with the faculties, the Quality Assurance Team then decided upon the number of micro surveys (pulses) during the semester, the number of courses to be included, and which colleagues would be involved.

Involving the faculties made them familiar with the new approach even before the first feedback was collected. More importantly, it ensured that the setup was adjusted to their needs, increasing the likelihood of not just collecting but also using student feedback more often and more efficiently throughout the semester.

Building a custom StudentPulse-framework

StudentPulse comes with a framework of 11 verified student experience drivers and 60 verified questions, but in the case of Hasselt University it was decided to implement a custom framework instead. This allowed the institution to build their existing six quality criteria into the platform, making it possible to compare StudentPulse-data with faculties not using StudentPulse, as well as comparing new data with previous years’ data.

The framework was set up so that quality criteria were kept consistent across all participating courses, whereas differentiation of questions was allowed down to the programme level.

A micro-distribution strategy

To reach the most students in the most efficient way, professors were notified with a survey link and QR code and reminded to have students fill out the surveys during class. At the same time, surveys were posted on the institution's learning management system.

The process of collecting feedback is currently being automated and integrated, but the idea of engaging professors remains at the core of the future distribution strategy, serving as an important prerequisite for building a strong feedback loop between students and staff.


Tracking survey participation, completion and easiness

When student survey engagement and motivation are put at the top of the agenda, so is the need to measure these concepts. To do so, student response rates, completion rates and easiness scores were reported throughout the semester. The data-collection KPIs all yielded significant improvements across programmes, but more importantly they uncovered the effect of different approaches to survey distribution and setup. These inputs have been used to define the institution's future best practices for working with student feedback.
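The three KPIs above are straightforward ratios over a pulse's participation data. A minimal sketch of how they could be computed, with hypothetical figures (StudentPulse computes these internally; the function and numbers here are illustrative only):

```python
# Hypothetical KPI computation for a single pulse.
# "Easiness" scores are assumed to be student ratings on a 0-10 scale.

def pulse_kpis(invited, started, completed, easiness_scores):
    """Return (response rate, completion rate, average easiness) for one pulse."""
    response_rate = started / invited        # share of invited students who opened the survey
    completion_rate = completed / started    # share of starters who finished it
    avg_easiness = sum(easiness_scores) / len(easiness_scores)
    return response_rate, completion_rate, avg_easiness

# Example pulse: 120 students invited, 65 started, 60 completed.
rr, cr, ease = pulse_kpis(120, 65, 60, [9, 8, 9, 10, 8])
print(f"response {rr:.0%}, completion {cr:.0%}, easiness {ease:.1f}/10")
```

Reported per pulse rather than per semester, these ratios make it possible to see how a change in distribution (for example, in-class QR codes versus LMS links) shifts participation within weeks instead of at the end of term.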

Distribution

Survey links pushed through the learning management system and via in-class QR codes.

Micro survey (pulse) frequency

Before, during and after each quartile.

Questions per pulse

5-7, plus follow-ups on sceptical student answers

Main KPIs

Response rate, completion rate, student easiness, comment likelihood

The results

More feedback

More students reply on a continuous basis, and more students complete the surveys.

More improvement suggestions

Students' comment likelihood has increased, and their improvement suggestions are automatically linked to each subdriver of student experience.

Faster feedback cycle

Surveys are taken multiple times throughout the educational period, and staff have the opportunity to adapt their classes to the feedback they receive from students.

Feedback overview

Micro surveys conducted

+100

Courses targeted

+30

Student answers

+15,000

Improvement suggestions

+500

Staff members involved

200