Distribution of CD4 enumeration sites in Cameroon:
In Cameroon, health facilities in all 10 regions perform CD4 enumeration for HIV-infected patients. EQA of CD4 enumeration for these health facilities is managed by two programs: the Cameroon CDC program in the North-West, South-West and Littoral regions, supported by the Centers for Disease Control and Prevention (not discussed in this paper), and the Cameroon National program in the Far-North, North, Adamaoua, West, Centre, South and East regions, supported by the CIRCB.
Figure 1 shows the distribution of the 61 regional sites participating in the National EQA program coordinated by the CIRCB in collaboration with QASI-LI across Cameroon.
These sites are located in both urban and rural areas, although the majority of participants are centrally located. A variety of CD4 enumeration platforms (see Table 1), both conventional and POC, are registered in this program, including the PIMA from Alere (now Abbott), the FACSCount from BD Biosciences and the CyFlow from Partec.
Table 1
Pass and failure rates for all instruments, broken down by instrument type, over the EQA sessions. Numbers are raw counts, with percentages in parentheses
| Session | All instruments N° | Pass | Fail | FACSCount N° | Pass | Fail | CyFlow N° | Pass | Fail | PIMA N° | Pass | Fail | Others N° | Pass | Fail |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 1 | 14 | 7 (50) | 7 (50) | 8 | 3 (37.5) | 5 (62.5) | 3 | 2 (67) | 1 (33) | 2 | 1 (50) | 1 (50) | 1 | 1 (100) | 0 (0) |
| 2 | 50 | 29 (58) | 21 (42) | 16 | 9 (56) | 7 (44) | 13 | 5 (38) | 8 (62) | 17 | 11 (65) | 6 (35) | 4 | 4 (100) | 0 (0) |
| 3 | 48 | 35 (73) | 13 (27) | 16 | 11 (69) | 5 (31) | 11 | 6 (55) | 5 (45) | 19 | 16 (84) | 3 (16) | 2 | 2 (100) | 0 (0) |
| 4 | 59 | 55 (92) | 4 (7) | 11 | 10 (91) | 1 (9) | 14 | 12 (86) | 2 (14) | 32 | 31 (97) | 1 (3) | 2 | 2 (100) | 0 (0) |
| 5 | 57 | 48 (84) | 9 (16) | 9 | 8 (89) | 1 (11) | 13 | 10 (77) | 3 (23) | 33 | 28 (85) | 5 (15) | 2 | 2 (100) | 0 (0) |
| 6 | 54 | 50 (93) | 4 (7) | 11 | 11 (100) | 0 (0) | 8 | 7 (87) | 1 (13) | 33 | 30 (91) | 3 (9) | 2 | 2 (100) | 0 (0) |
| 7 | 18 | 16 (89) | 2 (11) | 6 | 6 (100) | 0 (0) | 2 | 2 (100) | 0 (0) | 9 | 8 (89) | 1 (11) | 1 | 0 (0) | 1 (100) |
| 8 | 31 | 23 (74) | 8 (26) | 9 | 5 (55.5) | 4 (44.5) | 5 | 4 (80) | 1 (20) | 15 | 13 (87) | 2 (13) | 2 | 1 (50) | 1 (50) |
| 9 | 48 | 42 (87) | 6 (13) | 5 | 5 (100) | 0 (0) | 9 | 7 (78) | 2 (22) | 32 | 29 (90) | 3 (10) | 2 | 2 (100) | 0 (0) |
| 10 | 47 | 36 (76) | 11 (24) | 6 | 4 (67) | 2 (33) | 7 | 2 (28) | 5 (72) | 32 | 29 (90) | 3 (10) | 2 | 1 (50) | 1 (50) |
| 11 | 54 | 49 (90) | 5 (10) | 6 | 6 (100) | 0 (0) | 8 | 5 (62) | 3 (38) | 37 | 36 (97) | 1 (3) | 3 | 2 (66) | 1 (34) |
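The percentages in parentheses in Table 1 are simply each count divided by that platform's session total. A minimal sketch of the arithmetic, using session 4's counts from the table, is:

```python
# Derive the pass/fail percentages shown in Table 1 from raw counts.
# The counts below are session 4's row (N°, Pass, Fail) for each platform.
session4 = {
    "FACSCount": (11, 10, 1),
    "CyFlow":    (14, 12, 2),
    "PIMA":      (32, 31, 1),
    "others":    (2, 2, 0),
}

def rates(n, passed, failed):
    """Return (pass %, fail %) rounded to the nearest integer."""
    return round(100 * passed / n), round(100 * failed / n)

for platform, (n, p, f) in session4.items():
    pass_pct, fail_pct = rates(n, p, f)
    print(f"{platform}: {p} ({pass_pct}) pass, {f} ({fail_pct}) fail of {n}")
```

Running this reproduces the session 4 row of the table, e.g. 10 (91) pass and 1 (9) fail for the 11 FACSCount sites.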
The diversity of machines entails a diversity of manufacturers/suppliers, both for reagent supply and for instrument maintenance, and requires operators to demonstrate competence on multiple platforms. Participating health facilities include those from both the public and private sectors. Importantly, the number of testing sites within a region is not proportional to the number of HIV patients accessing services. Pregnant women provide an excellent example. As seen in Fig. 1, the North region had only four CD4 enumeration instruments to test more than 7000 HIV-positive pregnant women in 2013, while the West region had seven CD4 instruments for fewer than 7000 HIV-positive pregnant women. This clearly illustrates that EQA coverage across the country is not proportional to the amount of testing at each site. A future goal is to increase the number of participants in the other areas (West, South and North).
EQA Program Participation Rates and Performance:
Participation in the EQA program within Cameroon started small (shown in Fig. 2) but increased rapidly, from 15 participating sites in the 2014 pilot session (QASI-LI session 38) to 68 participating sites at present.
This quick growth in participation was due in large part to increased communication and expanded engagement efforts by the country coordinator with the participating labs. Figure 2 also displays the overall performance (number and percentage) of participating sites that Passed, Failed or were Unable to Report results for each QASI-LI session. Although the pass rate among those able to report ranged from 50 to 95% per session, we observed an overall rapid increase in the number of participating sites reporting correct results across all sessions following implementation of the program. The corresponding failure rates decreased with time, particularly in the 4th session, following the signing of instrument maintenance contracts by the Ministry of Public Health. In addition, failures decreased over time in response to corrective/remedial action recommendations, which resulted in increased training and on-site visits from the coordinating centre. These trainings and on-site visits were extremely important, as they allowed for careful review of the conditions and practices in place at each site.
In sessions 7 and 8, a large proportion of participating sites were Unable to Report results, as seen in Fig. 2. Sites with this status had broken instruments, expired reagents or no reagents, or were unable to report for other reasons. The high rates of sites Unable to Report results during these sessions reflect the reality of CD4 testing in resource-limited settings, where funding for reagent procurement is not always stable.
Identifying Sources of Error for Corrective Action:
For participating sites that fail a session because their reported results fall outside the acceptable range, a major part of the EQA program is to assess the reason for the incorrect results in order to implement successful corrective and remedial measures. Errors were identified in the pre-analytical, analytical and post-analytical phases.
Figure 3 shows the distribution of these errors across the 11 sessions among participating sites that reported results outside the acceptable range. The majority of failures occurred in the analytical phase, which accounted for 63.5–100% of failures in any given session.
Errors in this phase commonly occur during sample preparation (e.g. pipetting errors) or during sample acquisition/analysis (e.g. inaccurate gating technique and instrument errors). Failures during the pre-analytical phase varied from 0 to 28.5% per session and were mainly the result of issues with sample integrity during transport and/or preservation. Failures during the post-analytical phase varied from 10 to 37.5% per session and were mainly the result of data-entry errors, either on the paper result submission form or on the electronic website for result collection.
Figure 4 further summarizes the distribution of errors seen across all 11 sessions.
To determine whether failures were higher in labs using a particular instrument or POC device, we calculated pass and fail rates across sessions according to the CD4 enumeration platform used. Table 1 lists the variety and number of instruments (both conventional and POC) used in this program, with the pass and fail rates observed for those instruments over the evaluated sessions. Overall, the number of conventional flow cytometry instruments, including the FACSCount and CyFlow, decreased across sessions, while the number of sites using the PIMA analyzer increased from 2 to 37 over the evaluated timeframe. (As some sites do not always use the same instrument to participate in the EQA program, the numbers and instruments used are not fully consistent across sessions.)
Figure 5 shows the failure rates for each of the different CD4 enumeration instruments used.
The orange line, representing all instruments combined, serves as the baseline and is used as the reference for all comparisons. CyFlow and FACSCount users often had failure rates higher than the overall instrument failure rate. In contrast, failure rates observed for PIMA users were generally lower. Sites reporting results with the PIMA analyzer did have a high failure rate in the first two sessions, but with training and corrective action the failure rate was consistently reduced in all subsequent sessions. In contrast, the failure rates for the flow cytometry instruments are much more erratic, and the CyFlow instrument in particular shows a rise in failure rates over the latter sessions.
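The baseline comparison amounts to checking each platform's per-session failure fraction against the all-instrument failure rate for that session. A minimal sketch of that check, using session 10's (N°, Fail) counts from Table 1, is:

```python
# Flag platforms whose failure rate exceeds the all-instrument baseline,
# using session 10's (N°, Fail) counts from Table 1.
session10 = {"FACSCount": (6, 2), "CyFlow": (7, 5), "PIMA": (32, 3), "others": (2, 1)}

total_n = sum(n for n, _ in session10.values())     # 47 instruments reporting
total_fail = sum(f for _, f in session10.values())  # 11 failures in total
baseline = total_fail / total_n                     # ~0.234, i.e. the 24% overall rate

above_baseline = {p for p, (n, f) in session10.items() if f / n > baseline}
print(sorted(above_baseline))  # ['CyFlow', 'FACSCount', 'others']
```

In this session only the PIMA (3/32, about 9%) falls below the baseline, consistent with the generally lower failure rates observed for PIMA users.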