
Unlikely Tools for Unlikely Times

‘Give a person a hammer and everything begins to look like a nail’ is a saying that captures a familiar cognitive bias: our tools shape how we see the world. When the pandemic struck, we were forced to trade in our ‘hammers’ for tools better suited to remote work.

At Busara, the shift forced us to think critically about how we work – particularly how we carry out research. This led to the development of the Busara Online toolkit, a process of taking stock of the tools available and working out how best to use them in the emerging remote environment.

What we learnt using IVR for remote data collection

It was around this time that Interactive Voice Response (IVR) technology caught our eye. IVR uses a computer-generated voice to ask survey questions over the telephone. Because no human interviewer is involved, it allowed us to eliminate interviewer bias, and, in an accent bias study in Kenya, to measure just how much impact a researcher’s accent has on the data they collect.
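
For readers unfamiliar with the mechanics, here is a minimal sketch of a single IVR survey question. The article does not name the platform we used; this example uses Twilio’s Python helper library with a Flask webhook purely for illustration, and the route names and question wording are hypothetical.

```python
# A minimal sketch of one IVR survey question: a computer-generated voice
# reads the question and the caller answers on the keypad.
# Illustrative only; Twilio is not necessarily the platform used in the study.
from flask import Flask
from twilio.twiml.voice_response import VoiceResponse, Gather

app = Flask(__name__)

@app.route("/question", methods=["POST"])
def question():
    resp = VoiceResponse()
    # Collect a single keypad digit and post it to the /answer route.
    gather = Gather(num_digits=1, action="/answer", method="POST")
    gather.say("How many people live in your household? "
               "Press a number from 1 to 9.")
    resp.append(gather)
    # If the caller presses nothing, replay the question.
    resp.redirect("/question")
    return str(resp)
```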

Beyond this, IVR offers an array of benefits, including systematizing data collection, minimizing the costs of in-person research implementation, and giving respondents flexibility over when they respond. As with all tools, however, IVR also presented some unique challenges. In this article, we compile lessons learned for any researcher who decides to turn to IVR to gather data.

To improve user engagement with IVR, these are the factors we found worth thinking over:

  • Train your respondents on how to engage with the platform – Preparing respondents on what to expect, and on which options to choose for specific responses, is crucial to avoid confusion when engaging an IVR platform. This training should also stress why accurate inputs matter and carefully guide respondents on how to enter their responses correctly.
  • Allow repetition of questions and instructions – Participants may need to hear some questions or instructions more than once, whether because of background noise or simply for clarity. It is important to reserve a key that a respondent can press to have a particular question or instruction repeated (see the sketch after this list).
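
Continuing the illustrative Twilio/Flask sketch from earlier, a repeat option might look like the following, with key 9 (an arbitrary choice for this example) reserved for replaying the question:

```python
# Sketch of a "repeat" key: pressing 9 replays the question instead of
# recording an answer. Illustrative only; the key choice is arbitrary.
from flask import Flask, request
from twilio.twiml.voice_response import VoiceResponse

app = Flask(__name__)

def save_response(digit: str) -> None:
    # Hypothetical stub; a real study would persist the answer to a database.
    print("recorded answer:", digit)

@app.route("/answer", methods=["POST"])
def answer():
    resp = VoiceResponse()
    digit = request.form.get("Digits", "")  # the key the caller pressed
    if digit == "9":
        # Reserved repeat key: replay the question without saving anything.
        resp.redirect("/question")
    else:
        save_response(digit)
        resp.redirect("/next-question")  # hypothetical next step in the survey
    return str(resp)
```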

Here’s what to look out for if you want to improve your study participants’ uptake of IVR:

  • Make sure you oversample – The absence of human interaction in IVR can lead to low conversion rates. This can happen for many reasons: respondents may be indisposed, disinterested, or uncomfortable answering certain questions.

When we made follow-up calls to some of the respondents, they readily provided answers to previously unanswered questions. This suggests that the lack of human interaction may cause attrition in IVR studies, a finding echoed in a study conducted to assess the feasibility of IVR in providing self-care services to patients.

If you want to avoid human interaction altogether, keep your IVR questionnaire limited to only the essential questions.

  • Use nudges to increase interactions with IVR – In a one-way IVR setup, respondents must contact the platform themselves in order to take part in a survey. When we used IVR, we found that sending regular gain-framed reminders to respondents increased interactions with the platform. We also found it helpful to emphasize that respondents must answer all questions in order to be remunerated.
  • Share the costs with participants upfront – Some of the respondents we reached during follow-up said they had dropped off because they lacked the airtime to continue the call. Interactions increased after the project team sent messages telling each participant how much airtime they would need before engaging the IVR platform (see the reminder sketch after this list).
  • Consider the ideal IVR setup based on your study’s needs – Decide whether you need a one-way system (only researchers can prompt respondents, or only respondents can call in) or a two-way system (both researchers and respondents can prompt the IVR), what costs callers will incur, and which languages callers can select on the system. We also advise sharing the IVR number in a written, accessible form such as a text message.
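
As a rough sketch of the nudges and cost messages described above, the snippet below sends a gain-framed SMS reminder that also states the airtime cost upfront. It uses Twilio’s REST client purely as an example; the credentials, phone numbers, and wording are placeholders, not the study’s own.

```python
# Sketch of a gain-framed SMS reminder that states the airtime cost upfront.
# All identifiers below are placeholders for illustration.
from twilio.rest import Client

client = Client("ACCOUNT_SID", "AUTH_TOKEN")  # placeholder credentials

def send_reminder(phone_number: str, ivr_number: str, airtime_cost: str) -> None:
    # Gain-framed: leads with the reward, then states the cost of the call.
    body = (
        f"Complete our short phone survey and receive your airtime reward: "
        f"dial {ivr_number}. The call needs about {airtime_cost} of airtime."
    )
    client.messages.create(body=body, from_="+254700000000", to=phone_number)
```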

To improve the quality of the data you get from IVR, you should:

  • Build data quality checks upfront – IVR requires automatic data checks to limit data-entry issues. While using IVR for our study, we noticed a number of respondents who either mistyped responses or entered implausible figures for sensitive questions touching on monthly income, income expectations, and number of children.

These issues led to time and effort being spent on data cleaning before we could report correct averages and trends. For future use of IVR, it is important to create automatic checks that prevent respondents from entering out-of-range responses (see the range-check sketch after this list).

  • Pilot your study multiple times to optimize user interactions and usability – Have an internal team test-run the IVR platform in the early stages. This will give you a sense of the respondents’ experience and go a long way toward making sure your instrument is research-ready.
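
A minimal sketch of such upfront range checks follows. The field names and bounds are hypothetical examples for the kinds of questions mentioned above, not the actual limits used in our study.

```python
# Sketch of automatic range checks on keyed-in answers, so implausible values
# are rejected at entry time instead of surfacing during data cleaning.
# Field names and bounds are hypothetical examples.
VALID_RANGES = {
    "monthly_income_kes": (0, 1_000_000),
    "expected_income_kes": (0, 1_000_000),
    "number_of_children": (0, 20),
}

def validate(field: str, raw_digits: str) -> int | None:
    """Return the parsed value if plausible, else None so the IVR can re-prompt."""
    if not raw_digits.isdigit():
        return None  # mistyped input, e.g. the caller pressed * or #
    value = int(raw_digits)
    low, high = VALID_RANGES[field]
    return value if low <= value <= high else None
```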

In addition to all the above, you can control your costs by capping the number of responses. While snowball sampling might be crucial to increasing IVR interactions, the airtime spent financing an IVR exercise can exceed your budget if the number of respondents reaching the system is not capped in advance. Ideally, you should recruit a specific number of respondents who fit your selection criteria and allow only them to interact with the system, minimizing airtime usage (see the sketch below).
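
A minimal sketch of such a cap, in the same illustrative Twilio/Flask setup: once a pre-set quota of completed responses is reached, new callers hear a closing message instead of the survey. The quota value and wording are placeholders.

```python
# Sketch of capping responses to control airtime costs. Once the quota is
# reached, new callers are politely turned away. Illustrative values only.
from flask import Flask
from twilio.twiml.voice_response import VoiceResponse

app = Flask(__name__)

MAX_RESPONSES = 500  # hypothetical budget-driven cap
completed = 0        # in production this counter would live in a database

@app.route("/entry", methods=["POST"])
def entry():
    resp = VoiceResponse()
    if completed >= MAX_RESPONSES:
        resp.say("Thank you for your interest. This survey is now closed.")
        resp.hangup()
    else:
        resp.redirect("/question")  # hand off to the survey flow
    return str(resp)
```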

On the whole, using IVR as a research tool inevitably requires test runs and learning by doing. But in a world locked down by COVID-19 restrictions, this tool helped us reach respondents using nothing more than a basic phone, respondents who might otherwise have been bypassed in favor of more advanced digital research tools.

