EUROSTUDENT 8 field phase – lessons and challenges from Estonia
Decreasing response rates are a challenge that many researchers face today when gathering data with questionnaires. The same challenge has arisen in the context of the EUROSTUDENT survey. Estonia began to participate in the EUROSTUDENT project in its 4th round, and since then the response rates had constantly decreased: from 15% in E:IV to 8% in E:VII. However, in E:8 we managed to reverse the trend, achieving a response rate of 11%. What did we do to accomplish it?
As the think tank Praxis had carried out the EUROSTUDENT fieldwork in Estonia several times before, we were in a good position to build our strategy on the lessons we had learnt in previous rounds. There were four key components in our E:8 field phase strategy.
Right timing is not the most novel aspect to highlight, but its importance is still worth emphasising. Based on our experience, the survey should start sufficiently far ahead of the exam period (by the end of April at the latest in the Estonian context, as exams usually start in the second half of May) so that students have enough time to fill out EUROSTUDENT's meticulous questionnaire without worrying about having to prepare for exams at the same time.
Direct contacting by the researchers instead of higher education institutions (HEIs) is another key element. In Estonia, students' personal data (e-mail addresses etc.) are not gathered and stored by any central public institution (ministries); only HEIs have the data. Therefore, there are two options for inviting students to participate in the EUROSTUDENT survey: (a) to ask HEIs to contact students, or (b) to request students' e-mail addresses from the HEIs and have the research team (Praxis, in our case) send the invitations to students directly, i.e. to use a direct contacting approach. Based on our experience from E:VII, direct contacting results in a higher response rate: response rates were more than three times higher in HEIs whose students we (Praxis) contacted than in HEIs that contacted their students themselves. Therefore, in E:8 we made an effort to obtain students' e-mail addresses from as many HEIs as possible. Although this was a time-consuming process, involving extensive communication with all 18 Estonian HEIs and an application for permission from the Data Protection Inspectorate, the result was successful: we obtained students' e-mail addresses from almost all Estonian HEIs (only two small HEIs decided not to share their students' e-mail addresses with us).
A systematic and evidence-based approach to contacting students and inviting them to participate in the survey is another central feature of our field phase strategy. We used the following key elements to motivate students to participate:
(a) carrying out a lottery (with the kind sponsorship of bookstore gift cards, nature hiking gift cards, and travel-themed goody bags),
(b) creating e-mail invitations informed by behavioural science, and
(c) sending many (6!) reminders.
As the e-mails were created by our behavioural scientist, we could not resist running a mini-experiment to test which e-mail leads to a higher response rate. Further testing is needed before final conclusions can be drawn, but there is reason to assume that both the structure and the subject line of the e-mail matter: response rates were higher in the group that received an e-mail with the prize draw information at the beginning and a shorter subject line that referred explicitly to the lottery, compared with the group whose e-mail mentioned the prizes further down and had a longer subject line that was less explicit about the prize draw.
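For readers who would like to run a similar mini-experiment, the sketch below shows one way the difference between two e-mail variants could be checked for statistical significance, using a simple two-proportion z-test. This is only an illustration: the group sizes and response counts are placeholders, not the actual figures from our experiment.

```python
# Minimal sketch: comparing response rates of two e-mail variants
# with a two-proportion z-test. The counts below are placeholders
# for illustration, not the real EUROSTUDENT figures.
from statsmodels.stats.proportion import proportions_ztest

invited = [5000, 5000]   # students invited with variant A / variant B (hypothetical)
responded = [620, 540]   # completed questionnaires per variant (hypothetical)

z_stat, p_value = proportions_ztest(count=responded, nobs=invited)

rate_a, rate_b = (r / n for r, n in zip(responded, invited))
print(f"Variant A response rate: {rate_a:.1%}")
print(f"Variant B response rate: {rate_b:.1%}")
print(f"Two-proportion z-test: z = {z_stat:.2f}, p = {p_value:.3f}")
```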
Finally, visual appeal is important, which brings us to the last feature of our strategy. We kept information about the ongoing survey constantly visible by creating and sharing attractive banners on social media and other channels, and by asking HEIs and the coordinating ministry to do the same.
In sum, while we managed to reverse the trend of decreasing response rates, we cannot say which component of the described strategy was the most successful, as we could not measure their individual impact. Whereas some components (e.g. right timing) are universally helpful, others probably depend on the context (e.g. whether researchers can and should contact the students directly, or whether this is something the HEIs or the ministry should do). In Estonia, the described combination seems to work very well and will hopefully provide some inspiration for other countries and their researchers as well.