By: Zara Ali, Luz Azlor del Valle, Erin Fletcher, Julius Josephat and Taylor Salisbury
Part 2 of 2
[Editor’s Note: R4D has been collaborating with the Government of Tanzania to increase access to amoxicillin dispersible tablets (amox DT) — the first-line pediatric treatment for pneumonia — with the goal of reducing the number of children dying from the disease every year. Since 2015, R4D has supported the government in mobilizing financial and market shaping support for amox DT. In 2017, R4D also conducted a health facility survey to determine whether the amox DT the government procured was actually reaching health facilities, where it could then be prescribed to children who needed it. This year, R4D and EDI Global have been working on a follow-up survey to understand trends around the availability and stocking of amox DT and other essential medicines at frontline public health facilities. The survey, which began in March 2020, was retooled to be fully remote as a result of the COVID-19 pandemic. This is the second post in a two-part series on lessons learned from this remote surveying process. Read the first post on challenges here.]
In the first post in this series, we described some of the challenges we faced in a rapid pivot to remote surveying during the COVID-19 pandemic with our partners in Tanzania. Despite those challenges, we learned a tremendous amount. We think these lessons are applicable across contexts, so we’re sharing them in the hope of informing not only our subsequent data collection efforts, but others’ as well.
Here, we summarize several key takeaways from the process and provide suggestions for remote surveys going forward.
1) Pilot, pilot, pilot.
Teams at both organizations conducted a series of piloting exercises that surfaced many of these challenges and allowed us to devise strategies to overcome them. We conducted a pilot over the phone, but had also previously conducted an in-person exercise that imitated the phone survey, allowing us to observe how respondents might react to each new instruction, the preparation document, and so on. The piloting exercises also confirmed that the survey would take, on average, about 20 minutes.
2) Remote training should be planned, interactive, and flexible.
Remote facilitation and training present some of the biggest challenges to starting a new project. While we were previously accustomed to gathering together in a big room, we had to switch to videoconferencing and phone calls. We endeavored to keep the groups small, to keep the sessions short and pertinent, and to introduce remote mock interviews and practical training sessions with staff listening in and sharing the data. During the training, the protocols and the process of contacting participants were reviewed and practiced several times. For a more comprehensive discussion of this topic, see EDI Global’s blog post on how to conduct an effective remote training.
3) The relevant authorities can help to build trust with respondents, but require early and consistent attention.
In any survey, building trust is key, but that need is magnified over the phone in interactions with the authorities who have to approve the work. Ideally, the person contacting the authorities will already be known to them and have a problem-solving attitude. When speaking directly with respondents, interviewers successfully built rapport by patiently explaining the project and the data collection activities. The persistence of the interviewers and good rapport with the authorities resulted in the cooperation of 98.8% of the facilities sampled.
4) Phone surveys may lead to higher, unexpected costs for respondents, who should be compensated accordingly.
Sharing the instructions document with respondents also influenced the compensation we provided. For in-person interviews, there was no need for compensation incentives because the respondents were government employees. In the remote round, however, respondents were compensated for the costs they incurred: printing, internet data usage, and phone battery use. The compensation was key to a high response rate, as many hesitant respondents perceived it as an incentive.
5) An attempted phone call can feel too minimal to record, but tracking and monitoring every attempt to mark progress is key to success.
Keeping a clear list of the respondents attempted, reached, and completed is paramount to the success of the project. The coordination team also played an important role in that regard by monitoring the daily progress of interviewers and confirming with them the list of interviews that remained to be completed.
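The daily tracking described above can be as simple as a per-respondent log of attempt outcomes. A minimal sketch in Python (all names and statuses here are hypothetical illustrations, not the survey team’s actual tooling):

```python
from collections import defaultdict

# Hypothetical outcomes an interviewer might log after each call attempt.
NO_ANSWER, RESCHEDULED, COMPLETED = "no_answer", "rescheduled", "completed"

class CallLog:
    """Tracks call attempts per respondent and summarizes daily progress."""

    def __init__(self):
        # Maps a respondent identifier to the list of outcomes logged so far.
        self.attempts = defaultdict(list)

    def record(self, respondent_id, outcome):
        """Log the outcome of one call attempt."""
        self.attempts[respondent_id].append(outcome)

    def summary(self):
        """Counts of respondents attempted, reached, and completed,
        plus the list still remaining — the figures a coordination
        team would review each day."""
        attempted = len(self.attempts)
        reached = sum(
            1 for outcomes in self.attempts.values()
            if any(o != NO_ANSWER for o in outcomes)
        )
        completed = sum(
            1 for outcomes in self.attempts.values()
            if COMPLETED in outcomes
        )
        remaining = [
            rid for rid, outcomes in self.attempts.items()
            if COMPLETED not in outcomes
        ]
        return {"attempted": attempted, "reached": reached,
                "completed": completed, "remaining": remaining}
```

For example, a respondent who didn’t answer on the first try but completed the survey on the second counts as both reached and completed, while one who only rescheduled stays on the remaining list for follow-up.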
6) Where possible, employ skilled and experienced data collectors, even if it takes longer.
For this project, we selected data collectors who were highly qualified staff with years of experience in remote data collection. Having a small team of skilled interviewers over a longer period was preferable to a larger group over a shorter time, because tracing respondents turned out to be very time consuming, especially towards the end. This was apparent in productivity, which decreased from an average of 12 interviews per day during the first week to fewer than five toward the end, when the remaining respondents were trickier to reach and required more attempts.
7) Learning and communication can increase response rates.
Though we had a starting list of phone numbers, experience quickly showed that some did not work, and interviewers found alternative ways to reach the right people: contacting nearby facilities, former staff, and local authorities. Some points in the attempt protocols were modified when a best practice emerged; for example, calling the same phone number several times in a row turned out to be effective, as the respondent would contact the interviewer when he or she became available.
On the Horizon
Though clearly not without its challenges, the pivot to remote data collection and the subsequent survey succeeded thanks to careful planning at the outset, and we owe much to the patience and persistence of our team of interviewers. Undergirding all of these challenges was the specter of poor communications networks: dropped calls, interrupted calls, the inability to gather for a training — all of these challenges meant that any solution would require a lot of work.
Whether due to network issues or outdated phone numbers, the fact is that the majority of calls didn’t go through: in 61% of facilities, we had to try more than one of the numbers in our database before we got the intended respondent on the line, and only 38% of calls resulted in a survey. Once we were able to make contact, we did our best to schedule around the work of our respondents — people who were busy keeping a taxed public health system in motion in the middle of a pandemic. Interviewers were flexible, and when calls were interrupted by more pressing health facility matters, they scheduled a time to call the respondent back. All told, 52% of facilities required more than one conversation with the respondent in order to ultimately complete the survey.
Our plan is to conduct this survey six more times, but we are revisiting that strategy, too. Our goal of constructing a panel data set over many years was designed to help us understand seasonality of pneumonia medication use — if there is any — as well as maintain relationships over time. The pandemic, however, has changed the calculus. Tanzania has remained relatively open, so should we consider restarting an in-person survey to get our full facility information? What are the ethical implications of doing so? If it doesn’t open up or we decide to keep going remotely, what are the ethical implications of regularly taking up personnel time? Should we change the frequency of the survey to account for COVID in some way? How? What are the things we might learn from such a change?
The pandemic is quickly shaping and reshaping lives and work around the world, and if nothing else, we are learning to be flexible and adaptable. There will be many decisions to make moving forward, likely pivots to the design, and new challenges to face, as we settle into this new normal — and then the one that comes after the pandemic. And we will continue to face them with aplomb, devising new strategies together to move forward.