By: Zara Ali, Luz Azlor del Valle, Erin Fletcher, Julius Josephat and Taylor Salisbury
Part 1 of 2
[Editor’s Note: R4D has been collaborating with the Government of Tanzania to increase access to amoxicillin dispersible tablets (amox DT), the first-line pediatric treatment for pneumonia, with the goal of reducing the number of children dying from the disease every year. Since 2015, R4D has supported the government in mobilizing financial and market shaping support for amox DT. In 2017, R4D also conducted a health facility survey to determine whether the amox DT the government procured was actually reaching health facilities, where it could then be prescribed to children who needed it. This year, R4D and EDI Global have been working on a follow-up survey to understand trends around the availability and stocking of amox DT and other essential medicines at frontline public health facilities. The survey, which began in March 2020, was retooled to be fully remote as a result of the COVID-19 pandemic. This post is the first in a two-part series on the challenges and lessons learned of this remote surveying process; it is also available on R4D’s site.]
On March 16, 2020, Tanzania reported its first official case of COVID-19. It was inevitable in the context of an accelerating pandemic, but it still made our hearts sink.
We had spent the previous weeks assisting staff from both our organizations (EDI Global and R4D) in returning to their homes in Dar es Salaam, the UK, and the US. We had canceled team retreats and all travel for the foreseeable future under the threat of closing borders and uncertainty around the safety of transportation. Though much of the world was just beginning to contemplate a long lockdown and indefinite work from home, it became immediately apparent that we would need to halt an ongoing in-person health facility survey in Tanzania. Despite the uncertainty around the pandemic, we knew that our responsibility as ethical researchers meant we could not send people all over the country to ride public transportation and go in and out of health facilities.
The problem was, we still really wanted the information the survey was supposed to provide.
For the past year, EDI Global and R4D had been working together in partnership with the Government of Tanzania on a follow-up to a health facility survey we conducted in 2017 to track the availability of lifesaving pediatric pneumonia medications in Tanzania. Our teams had been busy in Tanzania, as well as in the US and the UK: revising protocols, writing proposals to ethics boards, liaising with the Ministry of Health and other Government of Tanzania departments, forming contracts with consultants and data collectors, disseminating approval letters to facilities around the country. We were getting excited about the prospect of getting a fuller answer to the question: does R4D’s technical and financial assistance to the Government of Tanzania around pediatric amoxicillin result in lives saved? And our donor wanted to explore whether such a strategy could be cost-effective.
We had planned to do this through a hybrid in-person and remote model. In March 2020, we would conduct an in-person survey at selected health facilities, to be followed with a series of seven remote survey rounds over 2020 and 2021 using a much shorter questionnaire. In planning this work, we knew we needed a less intensive, less expensive way to get the information we needed, as compared to the long, detailed health facility survey conducted in 2017. After three survey rounds, we had identified a few key variables of interest and hoped to gather them in a more efficient way, one that allowed us to estimate key parameters but imposed less on respondents who were busy at health facilities, one that could be analyzed quickly and turned around to our government partners to inform policy and practice.
When the first COVID-19 case was confirmed in Tanzania, we were only in our second week of in-person surveying, so we pulled the survey out of the field with fewer than 40% of health facilities contacted. We had been planning for remote work going forward, but our plans for remote survey deployment and success hinged on that first in-person round. Our initial contact was to be in person, with physical letters handed to facility in-charges, accompanied by the chorus of greetings and trust-building that comes with meeting a new colleague and survey respondent. We had intended to build a foundation and a process that respondents could rely on and return to when someone called them on the phone later. We had phone numbers for the remaining facilities, but would respondents even pick up the phone? Would they trust we were who we said we were?
We also had very little help from the literature. Remote surveys are not new, but most of the literature on them concerns individuals and households, which are notoriously difficult to contact. Our survey is unusual in that its respondents are trained health facility staff, and our partnership with the government means that working with us and answering our questions is part of the respondent’s job. But folks are still busy, and without that in-person contact, we did not know which factor would dominate: professional obligation or a busy schedule.
Not to be deterred, we decided to accelerate our first remote round and hope that folks would pick up the phone. After a series of tri-continental online training sessions with expert enumerators and staff calling in from home, we launched the remote survey on May 12th. By the end of the first week, we had contacted 84.9% of the facilities in our sample. Aside from network issues and the occasional reluctant respondent, the numbers rang through and folks willingly answered our questions. Our success grew, day by day, phone call by phone call. By June 16, the last day of surveying, we had reached all but three of the 251 health facilities sampled and had collected medicine availability data from 248 of them, fully completing 98.8% of the facilities in the sample.
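For readers who want to check the arithmetic behind the rates quoted above, here is a quick, purely illustrative sketch using the counts reported in this post:

```python
# Completion figures for the remote round (counts as reported in this post).
sampled = 251            # health facilities in the sample
completed = sampled - 3  # all but three facilities were fully completed (248)

print(f"{completed / sampled:.1%}")  # → 98.8%
```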
The extraordinary response rate was not the result of luck, but of significant planning and patience. We met with several challenges along the way and share the most salient ones here, along with some lessons learned, in the hopes of informing other researchers looking to conduct remote surveys.
First, we outline the challenges we faced, dividing them into three major groups: (1) challenges due to the nature of the project itself, (2) challenges from transitioning to remote data collection and (3) challenges of conducting data collection during a pandemic.
1) Challenges related to the project
Several contact points with various respondents were required. As part of the data collection exercise it was necessary to contact multiple staff—and for some folks, multiple times—in any given facility to reach those with the requisite knowledge and access.
Sharing documents remotely in low-network settings is hard. Much of the information needed for the survey was on record or would take some movement around a facility to collect, which would make for a very long phone conversation. With a little planning, however, all of the information could be gathered ahead of a scheduled phone call, so we needed a way to share a soft copy of a guiding document with respondents. With travel prohibited and some facilities having poor network coverage, sharing the document was a challenge. In some cases, respondents had to travel to find a printing office; in others, the information was dictated to them over the phone.
2) Challenges related to conducting data collection remotely
Phone surveys require shorter, simpler instruments to ensure quality and reduce fatigue. As discussed above, we had to shorten and even modify the survey instrument to make it appropriate for a phone call. Over the phone, respondents tire more quickly and may be distracted by other tasks, and a poor network connection may interrupt the call. All of these factors make it harder to engage a respondent remotely, so surveys conducted by phone need to be shorter and researchers must prioritize. In addition, questions needed to be simplified: for example, instead of asking for the quantity of each medicine on hand, a yes/no availability question shortens the survey and yields more reliable information.
The phone numbers provided by the government partner were not always correct. For 61% of the sampled facilities, the preceding round of in-person data collection had not been conducted, so interviewers had to rely on phone numbers collected during the previous phase, three years earlier.
Quality control protocols needed to be rethought. A common concern when collecting data remotely is whether the information will be as reliable as when it is collected in person. To mitigate this concern, we asked respondents to gather the required information prior to the interview. However, our capacity to do backchecks, perform quality control protocols, and visually confirm, for example, medicine availability, as we had in the past, was gone; ultimately, we had to rely on the respondent. And since direct observation of the interview was not possible, the strategies to identify and minimize interviewer errors needed to be revisited. To address this, we increased the frequency of re-interviews and of supervisors listening in on calls.
3) Challenges related to conducting data collection during a pandemic
Health workers are generally busy, but during a pandemic…forget about it! Contacting health workers by phone can be challenging at the best of times, as individuals may not be available during the working day — and this is especially the case during a pandemic. The interviewers patiently and expertly adapted to the schedules of the health workers to ensure the project did not disturb their work.
Pivoting to remote enumerator training was entirely new and required investment and innovation in data collectors’ access, scheduling, and activity design. Previously, EDI Global conducted in-person training for data collection staff regardless of whether the data collection itself was remote or in person. A series of logistical arrangements had to be put in place to run a training in which all the participants attended from their homes across Tanzania, the UK, and the US.
Building trust without face-to-face interaction requires patience. When presenting the project remotely, it proved difficult to assure facility in-charges that the project had received the appropriate approval letters from the authorities. Although this was a follow-up survey, over half of the health facilities had not been contacted since 2017. With staff turnover and the lapse between surveys, interviewers met some difficulty in earning respondents’ trust and cooperation.
It seemed like every day of surveying brought new challenges: a hiccup we didn’t expect or another reason someone could not talk to us. Luckily, every day also brought new lessons, which we’re excited to share. In the next post, we’ll talk about lessons learned and suggestions for those implementing remote surveys going forward, starting with “pilot, pilot, pilot!” and keeping lines of communication open.