This study assessed the usual total, heme, non-heme, and bioavailable iron intake among women of reproductive age (WRA) in Kersa, Eastern Ethiopia. The median usual iron consumption was 24.7 mg/d, and 41.8% of WRA were at risk of iron inadequacy. The following factors were associated with a higher risk of iron inadequacy: seasonal and part-time agricultural employment, market food source, and low women's dietary diversity. Older women in the study were more likely to have inadequate dietary iron intake.
The mean total dietary iron intake in this study was lower than that reported in a study of urban residents of Gondar, which found a mean dietary iron intake of 97.81 mg/d [39]. The prevalence of dietary iron inadequacy was also high compared with the national report of 14% and with other studies of iron inadequacy among women of reproductive age [40, 41]. This difference could be due to the disparity in malnutrition across the regions and cities of Ethiopia. The district, like most rural areas, depends solely on rainfall and traditional agricultural means of food production [42]. Moreover, the accuracy of the study's estimate of total dietary iron intake may be limited because biochemical markers of iron levels and stores were not measured, and because a non-quantitative FFQ was used, which can distinguish between high and low consumers of nutrients but is not ideal for measuring absolute intake.
As expected, participants' diets were predominantly plant-based and poor in quality. Most participants consumed starchy staples and all types of vegetables, while the least-consumed food groups were fish, eggs, flesh foods, and fruits. Non-heme iron made up most of the dietary iron intake.
The dietary iron inadequacy found in this study was high compared with the recommended cut-off point for a population [38]. The finding is not surprising, given that Ethiopia has high levels of poverty, food insecurity, and malnutrition [43]. Malnutrition and hunger are also common in this rural district and are a major cause of death in children under five [22]. Many households also depend on handouts from Ethiopia's Productive Safety Net Program (PSNP) due to chronic malnutrition [44].
Some Sub-Saharan African countries have implemented fortification of cereal and other food products with iron to meet their populations' high dietary needs [45]. Few iron-fortified or enriched food products are yet available in Ethiopia, and food insecurity has made the matter even worse [40]. Fortifying wheat and cereals, providing inexpensive nutrient-rich food alternatives, and eliminating hunger have significantly improved the nutrition and health status of WRA. The World Health Organization (WHO) currently recommends iron supplementation for women of reproductive age and children [46], along with fortification of foods and food and nutrition education, as strategies for combating iron deficiency [47]. The application and effectiveness of these strategies in developing countries are affected by economic, socio-cultural, and infrastructural challenges. Therefore, dietary approaches are a top priority in addressing iron deficiency anemia (IDA).
Low iron intake among women of reproductive age affects women's lives and future pregnancy and birth outcomes. Several studies have shown that a woman with low iron intake is at risk for antepartum and postpartum hemorrhage, the leading cause of maternal morbidity and mortality [7]. In addition, women with low iron stores and consumption are more likely to have poor birth outcomes, including preterm birth, stillbirth, and low-birth-weight babies (ref). Babies born to anemic women tend to show poor physical growth, predisposition to infection, and delayed brain development, which affects eventual schooling and social development [48]. Therefore, addressing this problem is important.
Furthermore, older women had a higher risk of dietary iron inadequacy in this study. Older women's dietary iron requirements increase because they have larger families and more children, making them the most at risk of iron deficiency and insufficiency [21]. In contrast, other studies have shown that, owing to traditional social and cultural inequities, younger women are more likely to face dietary restrictions that decrease their iron consumption and increase inadequacy [49]. This difference could arise from other unmeasured confounders and the potential introduction of recall bias in the study.
Women with lower dietary diversity were also found to be at higher risk of dietary iron inadequacy. This could be explained by the fact that poorer individuals are more likely to consume foods that are less diverse and less healthy [50]. Furthermore, women in our sample with higher dietary diversity had a mean total dietary iron intake roughly twice the recommended amount, and intake differed statistically across the subtypes of Fe [38]. Lower dietary diversity tends to decrease consumption of a balanced, healthy diet, leading to nutrient inadequacy and deficiency. This is in line with studies that describe a strong relationship between individual dietary diversity score and micronutrient inadequacy [34, 51, 52]. Research has shown that women meeting the FAO minimum dietary diversity for women (MDD-W) are more likely to meet their RDAs/EARs for iron, vitamin A, and other micronutrients [53]. It is also worth noting that women's dietary diversity varies with the seasonal availability of foods, which was not considered in this study.
In addition, others have shown that poor individuals have less financial freedom within the household, which limits their ability to secure and choose nutrient-rich foods for the family. This predisposes families not only to hunger but also to other poor health outcomes [54].