Study design, data collection, and key measures
We performed a secondary analysis of data collected in Southern Province, Zambia, as part of a study assessing the feasibility and performance of exact-match and various ecological linking methods. A detailed description of the study methods and findings has been published previously (7). Briefly, we conducted the study in five health facility catchment areas in Choma district between January and March 2016. The study collected data on care-seeking for illness (fever, diarrhea, or suspected acute respiratory infection [ARI]) in children under 5 in the preceding two weeks, using a household survey instrument based on the Zambia DHS. In addition to the standard DHS questions on the type of provider from which care was sought for reported child illness, we asked mothers to identify (name or describe) the specific source(s) of care utilized. We also collected data on structural quality (the infrastructure required to manage child illness) for every health care provider in the study area, using questions derived from the Service Availability and Readiness Assessment (SARA). The structural quality indicators were designed to assess a provider or facility's capacity to provide curative services for children, including the presence of drugs and commodities, training, supervision, and provider case management knowledge. We included public, private, informal, and traditional sources of care in the assessment. Geo-locations of all participating households and health care providers were collected using the geopoint function built into Open Data Kit (ODK) Collect, operated on Motorola Moto G (Gen 2) smartphones running Android 5.0.2.
Analysis
Overall approach
We used data generated using the exact-match approach as the reference measure of quality-adjusted effective coverage of management of child illness in the study population. Using the exact-match linking approach, we assigned each child the structural quality score of the specific provider from which care was reportedly sought, which we considered the true source(s) of care. Children were not linked using the exact-match method if their caregiver could not recall the name of the provider or facility from which care was sought, or if the provider could not be located for inclusion in the study; this mostly affected children whose caregivers utilized informal shops.
To simulate ecological linking in the absence of data on specific source of care, each sick child was linked to the closest health provider(s) within the reported category of source of care (Box 1) using three measures of geographic proximity: 1) absolute distance, 2) travel time, and 3) 5 km radius. Each measure of geographic proximity was applied using 1) known household location, 2) undisplaced cluster location, and 3) five sets of displaced cluster locations, each reported separately. Quality-adjusted coverage of management of child illness was calculated using each combination of ecological linking method and measure of sick child location by assigning each child the quality score of the proximal provider(s) to which they were linked.
Box 1. Categories of healthcare providers in the study area
Public
- Government hospital
- Government health center / post
- Government CBA / fieldworker

Private
- Private hospital / clinic
- Pharmacy

Informal
- Shop / market
- Traditional / faith-based practitioner
For both the exact-match and each ecological linking approach, we calculated the quality-adjusted coverage of management of child illness as the average quality score across all sick children in the study. If no care was sought for a sick child, they were assigned a quality score of zero. If care was sought from multiple sources, we averaged the scores of those sources.
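The coverage calculation described above can be sketched as follows. This is an illustrative sketch, not the study's actual analysis code; the function name and input structure are assumptions made for illustration.

```python
def quality_adjusted_coverage(children):
    """Average structural quality score across all sick children.

    `children` is a list with one entry per sick child: each entry is a
    list of the quality scores (0-100) of the source(s) of care used.
    An empty list means no care was sought, which scores zero; multiple
    sources are averaged, per the method described in the text.
    """
    per_child = []
    for sources in children:
        if not sources:
            per_child.append(0.0)  # no care sought -> quality score of zero
        else:
            per_child.append(sum(sources) / len(sources))  # average across sources
    return sum(per_child) / len(per_child)

# Example: one child with no care, one linked to a 60% provider,
# and one who used two sources scoring 80% and 40%
quality_adjusted_coverage([[], [60.0], [80.0, 40.0]])  # -> 40.0
```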
To quantify the bias introduced into each method by imprecise household location, we compared the estimates of quality-adjusted coverage from each combination of ecological linking approach and cluster location against the 1) exact-match quality-adjusted coverage estimates and 2) estimates generated using each ecological approach with the true household location. We also assessed how accurately each approach identified the actual provider(s) utilized by each sick child by comparing the provider(s) linked to each sick child using the ecological approaches with the specific source(s) of care reported by each child’s mother.
Quality score
A full description of the construction of provider structural quality scores and methods for defining geographic proximity is presented in a previous publication (7). Briefly, we defined each provider's structural quality score as the availability of services, commodities, and human resources needed to appropriately manage common child illnesses (Box 2). These indicators were considered the minimum inputs for appropriate care: the basic commodities required to diagnose and treat common child illness, along with the human resources and clinical knowledge to apply them correctly. As such, the score reflects a ceiling on the potential quality of care offered by a provider. We calculated scores as a continuous variable ranging from zero (no capacity to provide care) to 100% (full capacity to provide care).
Box 2. Structural quality score components
Diagnostics
- Malaria diagnostic (RDTs or microscopy)
- Malnutrition diagnostic (MUAC, or scale + height board + growth chart)
- ARI diagnostic (stethoscope or respiratory timer)
- General microscopy (functioning microscope and slides)

Basic Medicines
- ORS
- Zinc
- ACT
- Oral antibiotic

Severe / Complicated Illness Medicines
- IV fluids
- Injectable quinine or artesunate
- Injectable antibiotics

Human Resources
- Training (at least one staff member with IMCI or relevant training)
- Guidelines (IMCI guidelines or relevant guidelines or job aid available)
- Supervision (received supervision visit with case management observation in past 3 months)

Available Services
- Diagnose and treat malaria (by pathology)
- Diagnose and treat diarrhea (by pathology)
- Diagnose and treat ARI (by pathology)
- Diagnose and treat malnutrition (by pathology)
- Facilitated referral capacity

Knowledge
- Average performance on case scenarios
Geographic proximity
We employed three measures of geographic proximity in this analysis. For each method, we developed an automated script in QGIS comparable to the process outlined for application in ArcGIS in a previous paper (7). We conducted all geographic analyses in QGIS 2.18.24 (Open Source Geospatial Foundation Project, Beaverton, OR, USA). Ecological linking was restricted so that children could only be assigned to the type of provider (managing authority and level of care) from which care was reportedly sought, based on responses during the household survey. For example, if a mother reported seeking care for her sick child from a government health center, the child could only be linked to another government health center, not a private facility or a government hospital.
- Absolute distance: Each sick child was linked to the single closest provider based on absolute distance from the child’s location within the reported source of care provider category. This method is the simplest approach for assigning a child to a specific provider.
- Travel Time: Each sick child was linked to the single closest provider by travel time from the child’s location within the reported source of care provider category. Travel time was approximated by grading the relative speed of travel on different types of roads (e.g. paved roads, graded roads, footpaths). This method is designed to model the effect of road access and quality on care-seeking.
- 5 km Radius: Each sick child was linked to all providers within the source of care provider category within a 5 km radius of the child’s location. This method is designed to approximate a 1-hour walking distance from a household to a provider in any direction.
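The absolute-distance and radius measures above can be sketched as follows. This is an illustrative Python sketch, not the QGIS workflow the study used; the `child` and provider dictionaries and the haversine helper are assumptions made for illustration (travel time would additionally require a road network with graded speeds, which is omitted here).

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometers."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def link_nearest(child, providers):
    """Absolute-distance linking: the single closest provider
    within the child's reported source-of-care category."""
    candidates = [p for p in providers if p["category"] == child["category"]]
    return min(candidates, key=lambda p: haversine_km(
        child["lat"], child["lon"], p["lat"], p["lon"]))

def link_radius(child, providers, radius_km=5.0):
    """Radius linking: all providers in the reported category
    within `radius_km` of the child's location."""
    return [p for p in providers
            if p["category"] == child["category"]
            and haversine_km(child["lat"], child["lon"],
                             p["lat"], p["lon"]) <= radius_km]
```

Note how the category filter enforces the restriction described above: a provider outside the reported category is never a linking candidate, however close it is.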
Cluster location and displacement
The central point location for each cluster was generated to capture an area of high population density within each cluster, in line with the DHS central point measurement procedures. A census of all households within each of the study catchment areas was conducted before the study and included the location of each household. In QGIS, we grouped all households into clusters of 150 based on measured latitude and longitude, and we calculated the mean point of each cluster of 150 households as the central point.
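The central-point calculation reduces to averaging coordinates within each cluster. A minimal sketch follows; the chunking helper is an illustrative assumption (the study grouped households by measured latitude and longitude in QGIS, not by census order).

```python
def group_into_clusters(households, size=150):
    """Chunk census households into clusters of `size`.
    Simple consecutive chunking, for illustration only."""
    return [households[i:i + size] for i in range(0, len(households), size)]

def cluster_central_point(household_coords):
    """Mean point (centroid) of a cluster of (lat, lon) household coordinates."""
    lats = [lat for lat, _ in household_coords]
    lons = [lon for _, lon in household_coords]
    return (sum(lats) / len(lats), sum(lons) / len(lons))
```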
Each central point was displaced five times using an R script developed by MEASURE Evaluation for DHS cluster displacement (5). In brief, the code offsets each point using a random angle and random distance, capped at 5 km for rural clusters (1% capped at 10 km) and 2 km for urban clusters. The code further restricts the displacement to ensure points are not displaced outside of their true administrative unit (e.g., district); however, this feature was redundant in our analysis due to the small size of the study area. We ran the displacement code in R 3.4.3 (R Foundation for Statistical Computing, Vienna, Austria) and imported each set of displaced coordinates into QGIS for the linking analyses.
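The displacement logic can be sketched as below, assuming a flat-earth conversion of roughly 111 km per degree. This is a simplified Python illustration of the DHS displacement rules, not the MEASURE Evaluation R script itself, and it omits the administrative-boundary restriction described above.

```python
import math
import random

def displace(lat, lon, urban=False, rng=random):
    """DHS-style random displacement of a cluster central point.

    Random angle; random distance up to 2 km for urban clusters and
    5 km for rural clusters, with 1% of rural points allowed up to 10 km.
    Degrees are approximated as ~111 km per degree of latitude.
    """
    if urban:
        max_km = 2.0
    else:
        max_km = 10.0 if rng.random() < 0.01 else 5.0
    angle = rng.uniform(0.0, 2 * math.pi)
    dist = rng.uniform(0.0, max_km)
    dlat = dist * math.cos(angle) / 111.0
    dlon = dist * math.sin(angle) / (111.0 * math.cos(math.radians(lat)))
    return lat + dlat, lon + dlon
```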
We then substituted each central point and displaced central point for the household location in our measures of geographic proximity. Instead of calculating the geographic proximity of providers from the home of each sick child, we measured proximity from the relevant central point or displaced central point location as depicted in Figure 1.