Reliable Assessment of Iron Deficiency Anemia Using Soluble Transferrin Receptor


A new study finds that using soluble transferrin receptor (sTfR) concentrations to estimate nutritional iron deficiency anemia may be inaccurate, because sTfR levels rise in the presence of inflammation and malaria. The study proposes methods to adjust measured sTfR values when inflammation or malaria is present, so that the adjusted values reflect nutritional iron deficiency alone.

It is essential for public health programs to determine nutritional iron deficiency anemia accurately in order to design and implement intervention strategies. In 2004, the World Health Organization and the United States Centers for Disease Control and Prevention recommended that ferritin levels be used to assess nutritional iron status in populations with low levels of inflammation. Unfortunately, the prevalence of inflammation, a normal bodily response to various chemical, physical, and biological agents, is high in many parts of the world where poor sanitary conditions result in frequent infections. Because ferritin levels increase during inflammation, measured ferritin may not accurately reflect nutritional iron status, especially in regions where infectious diseases are endemic.

Under such conditions, it is recommended that soluble transferrin receptor (sTfR) levels in the blood be measured to determine nutritional iron status. However, several reports indicate that sTfR levels also increase during inflammation, although there is little consensus on the extent of the increase. Malaria, often accompanied by anemia because of the destruction of hemoglobin-carrying red blood cells, may also influence sTfR levels.

A new study published in the American Journal of Clinical Nutrition examines the need to also measure markers of inflammation, such as C-reactive protein (CRP) and α-1-acid glycoprotein (AGP), when determining nutritional iron status. The paper then discusses methods of normalizing or adjusting measured values of sTfR so they more accurately reflect nutritional iron status without inflammation- or malaria-induced effects.

The study used previously gathered data from the Biomarkers Reflecting Inflammation and Nutritional Determinants of Anemia (BRINDA) project, a collaborative international project that aims to develop improved methods for assessing nutritional iron deficiency anemia. The data were collected in international surveys of preschool children (PSC) (aged 6–59 months) and nonpregnant women of reproductive age (WRA) (aged 15–49 years). Only survey data that contained information about sTfR, CRP, AGP, and malaria status were included in this study, resulting in a data set comprising 9281 PSC and 5004 WRA.

For the purposes of this study, an sTfR concentration > 8.3 mg/L in both PSC and WRA was used as the cutoff defining iron-deficient erythropoiesis (red blood cell production in the bone marrow under iron-deficient conditions). CRP levels > 5 mg/L or AGP levels > 1 g/L signified the presence of inflammation.
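For illustration only, the cutoffs above can be expressed as a small screening function. This is a sketch, not the study's code; the function and variable names are invented here, and the thresholds are those stated in the text.

```python
# Cutoffs as stated in the article (illustrative sketch, not the study's code).
STFR_CUTOFF_MG_L = 8.3  # sTfR > 8.3 mg/L -> iron-deficient erythropoiesis
CRP_CUTOFF_MG_L = 5.0   # CRP > 5 mg/L   -> inflammation
AGP_CUTOFF_G_L = 1.0    # AGP > 1 g/L    -> inflammation

def flags(stfr, crp, agp):
    """Return the two screening flags implied by the study's cutoffs."""
    return {
        "iron_deficient_erythropoiesis": stfr > STFR_CUTOFF_MG_L,
        "inflammation": crp > CRP_CUTOFF_MG_L or agp > AGP_CUTOFF_G_L,
    }
```

For example, a subject with sTfR of 9 mg/L but low CRP and AGP would be flagged for iron-deficient erythropoiesis without inflammation, so no adjustment would be needed.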

Analysis of the data showed that sTfR levels increased weakly but roughly linearly with both CRP and AGP in both PSC and WRA; the relationship was stronger for AGP than for CRP. The presence of malarial infection was also weakly associated with higher sTfR levels.

To correct measured sTfR values that were elevated by inflammation or malaria, the researchers adopted three approaches. The first excluded individuals with elevated CRP or AGP levels from the analysis and estimated iron deficiency anemia in the remaining data. In the second, individuals were divided into four groups based on the CRP and AGP cutoffs described above: a reference group with both CRP and AGP below their cutoffs, an incubation group with high CRP but low AGP, an early convalescent group with high CRP and AGP, and a late convalescent group with low CRP but high AGP. Correction factors were then calculated for the incubation, early convalescent, and late convalescent groups by dividing the geometric mean sTfR value of each group by that of the reference group; analogous correction factors were calculated for populations where malaria was endemic. The third approach used linear regression to compute adjusted sTfR levels, exploiting the near-linear positive correlation between AGP and sTfR.
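The correction-factor approach can be sketched in a few lines. This is a rough illustration under the cutoffs stated earlier, not the study's actual analysis code; all function names are invented for this sketch.

```python
import math

CRP_CUTOFF = 5.0  # mg/L
AGP_CUTOFF = 1.0  # g/L

def inflammation_group(crp, agp):
    """Assign one of the four CRP/AGP groups described in the text."""
    if crp > CRP_CUTOFF and agp > AGP_CUTOFF:
        return "early_convalescent"
    if crp > CRP_CUTOFF:
        return "incubation"
    if agp > AGP_CUTOFF:
        return "late_convalescent"
    return "reference"

def geometric_mean(values):
    return math.exp(sum(math.log(v) for v in values) / len(values))

def correction_factors(records):
    """records: iterable of (sTfR, CRP, AGP) tuples.
    Factor = group geometric mean / reference geometric mean, as described
    in the text; dividing a measured sTfR by its group's factor scales it
    toward the reference (non-inflamed) group."""
    by_group = {}
    for stfr, crp, agp in records:
        by_group.setdefault(inflammation_group(crp, agp), []).append(stfr)
    ref = geometric_mean(by_group["reference"])
    return {g: geometric_mean(v) / ref for g, v in by_group.items()}

def adjust_stfr(stfr, crp, agp, factors):
    """Adjust a measured sTfR value using its group's correction factor."""
    return stfr / factors[inflammation_group(crp, agp)]
```

For instance, if the reference group's geometric mean sTfR is 4 mg/L and the incubation group's is 8 mg/L, the incubation factor is 2, and a measured 8 mg/L in an incubation-group subject adjusts to 4 mg/L.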

The results showed that regardless of the adjustment method used, adjusted sTfR levels were lower, and the estimated prevalence of iron-deficient erythropoiesis and iron deficiency anemia was reduced: by 4.4–14.6 percentage points in PSC and 0.3–9.5 percentage points in WRA, depending on the method. The regression method produced a larger reduction than the correction-factor method. A notable limitation of the study is that it used data collected from individuals at a single time point, without long-term follow-up.

This study highlights the importance of taking other health conditions into account when assessing iron deficiency anemia. Nonetheless, as the authors stress, the adjustment methods used here may need to be validated in different populations before being incorporated into public health programs for the assessment of iron status.

Written By: Usha B. Nair, Ph.D.


