First-Trimester Iron Screening Can ID Women at Risk for Later Deficiency
WEDNESDAY, Oct. 2, 2024 (HealthDay News) -- Screening pregnant women during the first trimester for ferritin concentrations below 60 μg/L may identify those at risk for iron deficiency later in pregnancy, according to a study published online Sept. 26 in the American Journal of Clinical Nutrition.
Elaine K. McCarthy, Ph.D., from University College Cork in Ireland, and colleagues evaluated longitudinal changes in iron biomarkers across pregnancy and the prevalence of iron deficiency in primiparous women. The analysis included 629 individuals with low-risk, singleton pregnancies whose iron status was measured at three time points (15, 20, and 33 weeks of gestation).
The researchers found that the prevalence of iron deficiency (ferritin <15 μg/L) increased throughout pregnancy, from 4.5 percent at 15 weeks of gestation to 13.7 percent at 20 weeks and 51.2 percent at 33 weeks. With a ferritin threshold of <30 μg/L, rates of deficiency were 20.7, 43.7, and 83.8 percent at the three time points, respectively. A similar prevalence of deficiency was seen using soluble transferrin receptor (sTfR) >4.4 mg/L or ferritin <15 μg/L (7.2, 12.6, and 60.9 percent, respectively). Rates were lower using total body iron <0 mg/kg than using ferritin or sTfR. Ferritin <60 μg/L emerged as the 15-week ferritin threshold that predicted iron deficiency (ferritin <15 μg/L) at 33 weeks (area under the curve, 0.750). Iron-containing supplements (mainly multivitamins) taken prepregnancy or in early pregnancy were associated with a reduced risk for deficiency throughout pregnancy (odds ratio, 0.57).
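For readers curious how such a threshold analysis works, here is a minimal Python sketch on synthetic data. The cohort size, ferritin distribution, and risk model below are invented for illustration only; just the 15 and 60 μg/L cutoffs come from the study, and the printed numbers will not match the published results.

```python
# Minimal sketch (synthetic data): evaluating a first-trimester ferritin
# threshold as a predictor of third-trimester iron deficiency, in the
# spirit of the study's AUC analysis. All values here are illustrative.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)

# Hypothetical cohort: ferritin at 15 weeks (ug/L), right-skewed as in real data
ferritin_15wk = rng.lognormal(mean=3.6, sigma=0.6, size=629)

# Synthetic outcome: deficiency at 33 weeks (ferritin <15 ug/L), made more
# likely when early-pregnancy ferritin is low (assumed logistic risk model)
p_deficient = 1 / (1 + np.exp((ferritin_15wk - 45) / 15))
deficient_33wk = rng.random(629) < p_deficient

# AUC of the continuous 15-week measurement (lower ferritin -> higher risk,
# so the score is the negated ferritin value)
auc = roc_auc_score(deficient_33wk, -ferritin_15wk)

# Sensitivity/specificity of flagging women with ferritin <60 ug/L
flagged = ferritin_15wk < 60
sensitivity = (flagged & deficient_33wk).sum() / deficient_33wk.sum()
specificity = (~flagged & ~deficient_33wk).sum() / (~deficient_33wk).sum()

print(f"AUC: {auc:.3f}")
print(f"<60 ug/L screen -- sensitivity: {sensitivity:.2f}, specificity: {specificity:.2f}")
```

In this framing, the 60 μg/L cutoff trades sensitivity against specificity along the ROC curve, which is how a single first-trimester screening threshold can be chosen from a continuous biomarker.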
"Pregnancy places a remarkable strain on maternal iron status even in a high-resource, generally iron-supplemented population," the authors write.