1. Nutritional Value of Iron
Iron is an essential nutrient required for critical biological functions such as oxygen transport and cellular respiration. The adult human body contains 3-5 g of iron, with ~70% utilized in the hemoglobin of red blood cells [1]. The daily iron requirement for erythropoiesis is 25-30 mg and is mostly met by iron recycled from senescent red blood cells, which are cleared by tissue macrophages. As there is no specific mechanism for iron excretion from the body, dietary absorption of 1-2 mg of iron per day is essential to compensate for non-specific iron losses. Notably, under physiological conditions only a small fraction (~15-20%) of ingested luminal iron is eventually absorbed. The limited bioavailability of iron can lead to iron deficiency anemia or non-anemic iron deficiency, which are among the most common pathologies worldwide and remain leading contributors to the global burden of disease [2,3,4]. Iron deficiency is associated with fatigue and may also lead to immune, growth and neurocognitive defects (Figure 1). In 2005, anemias affected roughly a quarter of the world's population, with iron deficiency anemia accounting for about half of these cases [5]. Little progress has been made since then: in 2016, iron deficiency anemia was one of the top five causes of years lived with disability, with over 1.2 billion cases reported [6].
Nutritional guidelines developed by the Food and Nutrition Board at the National Academy of Medicine in the United States recommend that infants between 7-12 months of age obtain 11 mg of iron from their diet daily, whereas the corresponding values for adult men, menstruating women and pregnant women are 8, 18 and 27 mg, respectively [7]. These guidelines highlight the elevated iron requirements of pregnant women and infants, in whom iron is critical for growth and development [8].
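As a rough arithmetic check (illustrative only, not a clinical calculation), the ~1-2 mg/day that must be absorbed and the ~15-20% absorption efficiency cited above together imply a dietary intake in the range of the recommended values:

```python
# Back-of-envelope check: how much dietary iron must be ingested
# to absorb 1-2 mg/day at ~15-20% absorption efficiency?

def required_intake_mg(absorbed_mg: float, efficiency: float) -> float:
    """Dietary iron (mg) needed to absorb `absorbed_mg` at a given efficiency."""
    return absorbed_mg / efficiency

# Bounds from the figures quoted in the text
low = required_intake_mg(1.0, 0.20)   # best case: 1 mg needed, 20% absorbed
high = required_intake_mg(2.0, 0.15)  # worst case: 2 mg needed, 15% absorbed

print(f"Implied intake: {low:.1f}-{high:.1f} mg/day")  # ~5.0-13.3 mg/day
```

This range brackets the 8 mg/day recommended for adult men and sits below the 18-27 mg/day recommended for menstruating and pregnant women, whose losses and demands exceed the baseline 1-2 mg/day.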
Iron deficiency and anemia are more prevalent in vulnerable populations such as indigenous peoples, refugees and immigrants from low- and middle-income countries, and disadvantaged subpopulations [4,9,10]. These observations provide the rationale for food iron fortification programs [11], and for the use of oral iron supplements or intravenous iron for therapeutic purposes [12]. Nevertheless, excess body iron may lead to adverse health outcomes [13], mainly due to the redox reactivity of the metal, which can promote oxidative stress and tissue damage [14]. This is vividly illustrated in diseases of iron overload such as hereditary hemochromatosis or iron-loading anemias (including thalassemia), which are associated with type 1 and type 2 diabetes mellitus, arthropathy, osteoporosis, hypogonadism, liver disease (fibrosis, cirrhosis, hepatocellular carcinoma) and cardiomyopathy (Figure 1) [15,16,17,18,19,20,21,22].
The hazardous effects of excess iron are also evident in cases of acute iron poisoning, for instance following accidental ingestion of iron supplements by children [23]. The severity of iron intoxication depends on the amount of iron ingested. Symptoms include nausea, vomiting, diarrhea, abdominal pain, gastrointestinal bleeding, coagulopathy, shock, metabolic acidosis and hepatotoxicity. Thus, balanced iron intake is critical to avoid states of iron deficiency or overload.
2. Iron Homeostasis
Systemic iron homeostasis is largely controlled by hepcidin, a peptide hormone [24]. Circulating hepcidin is primarily produced by hepatocytes in the liver and targets the iron exporter ferroportin in intestinal enterocytes, tissue macrophages and other cells. The binding of hepcidin occludes ferroportin's iron export channel and, additionally, triggers internalization and lysosomal degradation of ferroportin, limiting iron entry into the bloodstream [25,26,27]. Hepcidin is transcriptionally induced in response to iron or inflammatory cues via BMP/SMAD and IL-6/STAT3 signaling, respectively [28,29]. Iron-dependent induction of hepcidin primarily serves to prevent excessive dietary iron absorption and systemic iron overload [24]. Inflammatory induction of hepcidin is considered an innate immune response that causes hypoferremia to deprive extracellular pathogens of iron, which is essential for their proliferation. However, persistent hypoferremia due to chronic inflammatory induction of hepcidin results in functional iron deficiency by restricting iron availability for erythropoiesis, thereby contributing to anemia of inflammation, also known as anemia of chronic disease [30]. Defective iron-dependent regulation of hepcidin due to genetic inactivation of upstream regulators leads to hereditary hemochromatosis, whereas suppression of hepcidin by ineffective erythropoiesis contributes to iron overload in thalassemia and other iron-loading anemias [31].
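In control terms, the iron/hepcidin/ferroportin circuit is a negative feedback loop: rising plasma iron induces hepcidin, hepcidin lowers ferroportin-mediated iron delivery into plasma, and plasma iron falls back towards its set point. A deliberately simplistic toy simulation (all parameters are invented for illustration, not physiological values) shows how such feedback stabilizes plasma iron and how loss of iron-dependent hepcidin induction, as in hereditary hemochromatosis, leads to overload:

```python
def simulate(hepcidin_responsive: bool, steps: int = 200) -> float:
    """Toy discrete-time model of the hepcidin/ferroportin feedback loop.

    Units and rate constants are arbitrary; this only illustrates the
    feedback logic described in the text, not real physiology.
    """
    plasma_iron = 1.0  # arbitrary units
    setpoint = 1.0
    for _ in range(steps):
        # Hepcidin rises with plasma iron (absent in the 'hemochromatosis' case)
        hepcidin = max(0.0, plasma_iron - setpoint) if hepcidin_responsive else 0.0
        # Ferroportin activity, and hence iron influx, falls as hepcidin rises
        ferroportin = 1.0 / (1.0 + 5.0 * hepcidin)
        influx = 0.2 * ferroportin   # dietary absorption + macrophage recycling
        losses = 0.1 * plasma_iron   # non-specific losses
        plasma_iron += influx - losses
    return plasma_iron

normal = simulate(hepcidin_responsive=True)
hh = simulate(hepcidin_responsive=False)  # no iron-dependent hepcidin induction

print(f"steady plasma iron, intact feedback: {normal:.2f}")
print(f"steady plasma iron, no hepcidin induction: {hh:.2f}")
```

With the feedback intact, plasma iron settles just above the set point; without it, influx is unrestrained and the steady state is markedly higher, mirroring the iron overload seen when upstream hepcidin regulators are inactivated.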
3. Iron Biomarkers: Applications and Limitations
The major serum biomarkers typically used to evaluate iron status are ferritin, iron, transferrin saturation, soluble transferrin receptor 1 (sTfR1; also referred to as sTfR) and hepcidin [32]. Ferritin is normally an intracellular iron storage protein, yet it is also found in the circulation for reasons that remain unclear. Serum ferritin typically reflects body iron stores but is also influenced by inflammation, liver disease, obesity and malignancies, and must therefore be paired with other tests for accurate diagnosis [33,34,35,36].
Measurement of sTfR1 and calculation of the sTfR1/log ferritin ratio are currently used to assess iron deficiency states [37,38]. TfR1 is the primary cellular iron gate, mediating import of circulating transferrin-bound iron [39]. It is induced in response to iron deficiency, and its shedding from the plasma membrane gives rise to sTfR1. Thus, plasma sTfR1 concentrations reflect both the density of the receptor at the cell surface and the number of cells expressing it, making sTfR1 a good biomarker of iron demand [40,41]. It may be used to differentiate "true" from "functional" iron deficiency under inflammatory conditions [42].
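The ratio combines a marker of iron demand (sTfR1, numerator) with a marker of iron stores (log-transformed ferritin, denominator). A minimal sketch of the calculation, commonly performed with sTfR in mg/L and ferritin in µg/L (the example values and any decision threshold are hypothetical and assay-dependent, so they must come from the laboratory's own reference data):

```python
import math

def stfr_ferritin_index(stfr_mg_per_l: float, ferritin_ug_per_l: float) -> float:
    """sTfR1/log ferritin index: sTfR divided by log10 of serum ferritin.

    Units and cut-offs are assay-dependent; values here are illustrative only.
    """
    return stfr_mg_per_l / math.log10(ferritin_ug_per_l)

# Hypothetical example: elevated sTfR1 with low-normal ferritin yields a
# high index, pointing towards depleted iron stores despite a "normal"
# ferritin result.
index = stfr_ferritin_index(stfr_mg_per_l=4.5, ferritin_ug_per_l=20.0)
print(f"sTfR1/log ferritin index: {index:.2f}")
```

Because ferritin enters logarithmically, the index is dominated by sTfR1 when stores are low, which is what makes it more robust than ferritin alone under inflammatory conditions.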
Serum hepcidin can be used for assessing iron status, diagnosing iron deficiency states and predicting responses to iron absorption from foods and supplements [43]. However, its main limitation, much like that of serum ferritin, is its induction by inflammation. Despite this, hepcidin may be a useful diagnostic tool in specific conditions such as chronic renal disorders or diseases of iron overload [44]. Yet, hepcidin (and sTfR1) assays have not yet become standardized tests due to harmonization challenges [38,43]. Serum hepcidin measurements can vary up to 10-fold between assays due to the absence of a primary reference material, a reference method and a commutable calibrator, making it difficult to compare data and establish a uniform reference range [45,46].
4. Dietary Iron Intake and the Risk for Disease
High iron stores are often viewed as a potential biohazard [13], suggesting that excessive dietary iron intake may increase the risk for disease. Dietary iron occurs in two forms: heme and inorganic (non-heme) iron [47]. Heme is primarily present in hemoglobin and myoglobin from animal food sources, while inorganic iron is found in food derived from both plants and animals [48]. Heme constitutes ~10-15% of total dietary iron in meat-eating populations, but accounts for over 40% of assimilated iron due to its enhanced absorption [48]. Thus, it is estimated that the efficiency of inorganic iron absorption is ~10%, whereas that of heme is ~25%, i.e., 2.5 times higher [48]. This may be related to the lipophilic nature of the heme molecule, and possibly also to a lack of negative feedback regulation of heme absorption. In any case, the intestinal heme transporter remains elusive, and its identification and characterization are expected to increase our mechanistic understanding. Thus far, it is well established that absorbed dietary heme is catabolized within intestinal epithelial cells, and the liberated iron follows the fate of dietary inorganic iron assimilated via divalent metal transporter 1 (DMT1). DMT1 is expressed on the apical membrane of duodenal enterocytes and is subject to negative regulation by iron [49].
There is evidence from epidemiological studies that high dietary iron intake may predispose to disease (Table 1). Most, if not all, of these studies assessed dietary heme intake based on food questionnaires, with data extrapolated from meat consumption. However, it should be noted that processed red meat contains several potentially confounding substances, such as nitrate/nitrite, heterocyclic amines and polycyclic aromatic hydrocarbons, which may likewise affect health outcomes. Therefore, it is not clear whether the adverse effects of high dietary heme intake can always be entirely attributed to iron. This caveat primarily applies to heme as opposed to inorganic iron, as described in an umbrella review comprising 34 meta-analyses [50]. The intake of heme but not inorganic iron was significantly associated with modestly increased risk for type 2 diabetes mellitus (T2DM), gestational diabetes mellitus, coronary heart disease, cardiovascular disease (CVD), CVD mortality, as well as colorectal, esophageal and breast cancer [50].
5. Iron and the Risk for Metabolic Syndrome
Metabolic syndrome is a pathologic state defined by the combined manifestation of abdominal obesity, hyperglycemia due to insulin resistance, dyslipidemia and/or hypertension. Consequently, patients with metabolic syndrome are at greater risk of developing type 2 diabetes mellitus (T2DM), CVD, non-alcoholic fatty liver disease (NAFLD) and some types of cancer. Interestingly, in a multi-ethnic, population-based cohort of 3,828 participants, metabolic syndrome itself was associated with intake of heme iron [hazard ratio (HR) = 1.25 (95% CI 0.99 to 1.56), an interval that marginally includes 1] and zinc [HR = 1.29 (95% CI 1.03 to 1.61)] from red meat [51]. Several clinical studies have established a strong association between high body iron stores and insulin resistance [52,53,54,55], and the combination of iron overload with metabolic defects has been labeled "dysmetabolic iron overload syndrome" (DIOS) [56]. Importantly, the alterations in iron metabolism appear to be multifactorial and dynamic, resulting from an unhealthy diet combined with environmental and genetic cofactors [56]. Despite evidence that dietary iron depletion can attenuate NAFLD progression in mice [57], the benefits of phlebotomy in patients with DIOS remain controversial, with a meta-analysis of 9 studies finding no significant improvement [58]. Nevertheless, the strong correlation of high dietary heme intake with T2DM and metabolic syndrome is consistent with iron-dependent effects [59,60]. In fact, there are several molecular links between iron and glucose metabolism [61]. For instance, insulin stimulates ferritin synthesis, thereby increasing tissue iron stores; in addition, it promotes redistribution of transferrin receptors to the cell surface, thereby stimulating iron uptake [62].
Insulin was positively associated with serum ferritin levels in the 1988-1994 cross-sectional data from the Third National Health and Nutrition Examination Survey, suggesting a possible link between T2DM and iron [63]. This association has since been corroborated in smaller cohort studies in Norwegian men, subjects with excessive body weight, and adolescent girls [64,65,66]. A meta-analysis found a correlation between high serum ferritin and T2DM in 11 out of 12 studies analyzed, with an odds ratio of 1.43 (95% CI 1.29 to 1.59) [67]. The ratio of sTfR1 to serum ferritin was also inversely related to the risk of T2DM, with an odds ratio of 0.65 (95% CI 0.45 to 0.95) in participants with high serum ferritin [67].
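The risk estimates quoted throughout this and the following sections are ratios (OR/RR/HR) with 95% confidence intervals that are symmetric on the log scale. The underlying standard error and a significance check can therefore be recovered from the published numbers alone; a small sketch using the odds ratio quoted above (assuming the interval was constructed in the usual way, as exp(log(estimate) ± 1.96·SE)):

```python
import math

def summarize_ratio(estimate: float, ci_low: float, ci_high: float):
    """Recover the log-scale standard error and z statistic from a ratio
    estimate (OR/RR/HR) and its 95% confidence interval.

    Assumes the interval was computed as exp(log(estimate) +/- 1.96*SE),
    the usual construction for such estimates.
    """
    se = (math.log(ci_high) - math.log(ci_low)) / (2 * 1.96)
    z = math.log(estimate) / se
    significant = ci_low > 1.0 or ci_high < 1.0  # 95% CI excludes 1
    return se, z, significant

# Odds ratio for T2DM at high serum ferritin quoted in the text: 1.43 (1.29-1.59)
se, z, sig = summarize_ratio(1.43, 1.29, 1.59)
print(f"SE(log OR) = {se:.3f}, z = {z:.1f}, CI excludes 1: {sig}")
```

The same check flags borderline findings at a glance: any interval that includes 1, such as the HR of 1.25 (0.99 to 1.56) for heme iron and metabolic syndrome discussed earlier, fails the 5% significance criterion.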
A 2012 meta-analysis of 5 prospective cohort studies suggested a relative risk of 1.33 (95% CI 1.19 to 1.48) for T2DM in individuals with the highest vs. lowest heme intake [59]. High body iron stores were likewise linked to T2DM, with a relative risk of 1.70 (95% CI 1.27 to 2.27); however, total dietary or supplemental iron intake did not show any significant association [59]. A 2021 systematic review came to the same conclusion [68]. This review included data from 323,788 participants across 11 studies and found that higher heme intake was associated with a 20% increased risk of developing T2DM (relative risk 1.20; 95% CI 1.07 to 1.35) [68].
Similar results were reported in two meta-analyses on iron and gestational diabetes mellitus (GDM). Fu et al. analyzed 1,025 GDM patients against 15,608 controls across 2 studies comparing the lowest and highest consumers of heme iron, and found a relative risk of 1.53 (95% CI 1.17 to 2.00) for the development of GDM [69]. Kataria et al. analyzed the association between heme iron intake and GDM across 4 studies, including the two from the previous meta-analysis, and found an adjusted odds ratio of 1.48 (95% CI 1.29 to 1.69) [70]. Both studies confirmed a strong association between GDM and high body iron stores, but not with intake of total dietary iron or iron supplements [69,70]. Given the limited number of studies examining heme iron intake and GDM compared to T2DM, further investigation is needed to establish a reliable correlation.
Molecular studies in cell and mouse models have linked iron overload to insulin resistance through mechanisms involving autophagy in skeletal muscle and cardiomyocytes [71,72]. Iron loading in cell and mouse models exacerbated palmitate-induced insulin resistance, impaired glucose-stimulated insulin secretion from pancreatic β cells, and increased gluconeogenesis in the liver [73,74,75]. Conversely, iron deficiency has been associated with increased insulin sensitivity in rats [76,77]. In young women, correction of iron deficiency anemia decreased insulin levels [78]. Taken together, these data provide a basis for investigating the molecular mechanisms underlying the increased risk for metabolic syndrome pathologies in individuals with high dietary iron intake and elevated body iron stores.
6. Iron and the Risk for Cardiovascular Disease
The association between iron and CVD was first proposed in 1981, based on observations that the incidence of CVD was elevated in men and post-menopausal women compared to premenopausal women, whose iron stores are lowered by menstruation [79]. A meta-analysis of 21 cohort studies with 292,454 participants revealed a significant association between heme iron intake and coronary heart disease incidence, with a relative risk of 1.57 (95% CI 1.28 to 1.94) [80]. These findings were corroborated in another meta-analysis of 6 different studies including 131,553 participants [81]. Interestingly, total iron intake, serum iron levels and transferrin saturation were inversely correlated with coronary heart disease incidence [80]. A meta-analysis of 13 primary studies with 252,164 participants reported a relative risk of 1.07 (95% CI 1.01 to 1.14) for CVD in individuals with high dietary heme intake [82]. Another meta-analysis of 19 studies with 720,427 participants reported an association between high dietary heme intake and CVD mortality, with a relative risk of 1.19 (95% CI 1.01 to 1.39) [83]. Iron status has also been positively associated with carotid atherosclerosis in the absence of inflammation [84]. Additionally, aortic walls from patients who had suffered abdominal aortic aneurysms displayed iron accumulation and elevated expression of TfR1 compared to healthy controls [85]. On the other hand, iron deficiency is a known comorbidity in patients with heart failure [86,87], and its correction with intravenous iron has been shown to reduce hospitalizations [88]. Taken together, these data highlight epidemiological links between iron status and CVD risk.
Animal studies have provided supportive evidence. Early experiments in rabbits injected with iron dextran and fed a 0.5% cholesterol diet demonstrated greater atherosclerotic lesion development compared to the diet alone [89]. Recent work in mice suggested that non-transferrin bound iron (NTBI), a highly redox-active form of iron that appears in the circulation primarily during conditions of iron overload, aggravates atherosclerosis [90]. Crossing apolipoprotein E knockout (apoE–/–) mice, an established model of atherosclerosis, with mice expressing a hepcidin-resistant ferroportin mutant (Fpnwt/C326S) aggravated atherosclerosis via increased levels of NTBI and oxidative stress [91]. Iron loading of the heart appears to be critical specifically when it occurs in cardiomyocytes, as demonstrated by the reduced survival of mice lacking ferroportin in this cell type [92]. Indeed, despite having elevated cardiac iron content, mouse models of hemochromatosis exhibit only minor cardiac dysfunction and develop cardiomyopathy only in response to chronic dietary or parenteral iron loading [93,94]. The functional importance of cardiomyocyte iron load is also emphasized by the lethal cardiomyopathy documented in mice with iron-deficient cardiomyocytes due to ablation of TfR1 [95] or expression of hepcidin-resistant FpnC326S [96]. Notably, local production of hepcidin is necessary for proper iron homeostasis in the heart [96], and a better understanding of its function and regulation is needed.
7. Iron and Cancer Risk
An observational study in a cohort of 309,443 adults in Taiwan identified an increased incidence of all cancers in individuals with high serum iron, with a hazard ratio of 1.25 (95% CI 1.16 to 1.35), and a hazard ratio for mortality from all cancers of 1.39 (95% CI 1.23 to 1.57) [97]. Similar results were reported in a meta-analysis of 27 studies, in which high serum iron correlated with a higher relative risk for breast cancer (1.22; 95% CI 1.01 to 1.47) [98]. While no association between breast cancer and total dietary iron intake or inorganic iron supplementation was documented, dietary heme intake was associated with a significant relative risk for breast cancer of 1.12 (95% CI 1.04 to 1.22) [97]. Dietary heme intake was also significantly associated with increased relative risk for esophageal cancer (1.21; 95% CI 1.02 to 1.45) [99] and lung cancer (1.16; 95% CI 1.02 to 1.32), the latter after adjustment for smoking history [100].
The meta-analysis on esophageal cancer involved 20 studies with 1,387,482 participants; interestingly, total iron intake was found to be protective against esophageal cancer development (relative risk 0.85; 95% CI 0.79 to 0.92) [99]. A similar trend was observed in the lung cancer study, which involved 416,746 individuals from European countries [100]. Comparable data were also obtained from meta-analyses of studies focusing on colorectal cancer [101,102]. A European prospective cohort study of 450,105 participants followed for 14.2 ± 4.0 years found a stronger positive association with colorectal cancer in the proximal vs. distal colon in men with high dietary heme intake (1.11; 95% CI 1.02 to 1.20) [103]. Yet, this trend was not observed in women, and other sources of iron were not associated with colorectal cancer [103]. Conversely, phlebotomy was associated with a lower incidence of new visceral malignancy in a prospective multicenter randomized clinical trial of 1,277 patients with peripheral arterial disease (0.65; 95% CI 0.43 to 0.97) [104]. Moreover, an analysis of 37,795 blood donors found an association with decreased cancer risk (0.79; 95% CI 0.74 to 0.84) [105].
Heme may have diverse and often opposing functions in carcinogenesis [106]. In essence, above-average levels of heme intake may lead to cellular cytotoxicity, lipid peroxidation, DNA and protein oxidation, and genetic mutations promoting carcinogenesis; however, excessive heme may also protect from carcinogenesis by shifting cell metabolism towards oxidative phosphorylation and eventually inducing cell death through ferroptosis, an iron-dependent form of programmed cell death [106]. It should be noted that highly proliferative cancer cells have increased needs for iron and undergo reprogramming for efficient iron acquisition and retention [107]. Studies in breast, ovarian and prostate cancer have demonstrated reduced expression of ferroportin in tumors, which increases iron retention [108,109,110]. On the other hand, TfR1 is upregulated to increase iron uptake [111,112]. In line with these data, a meta-analysis of 22 studies revealed that serum ferritin levels were significantly higher in cancer patients, with a standardized mean difference of 3.07 (CI 1.96 to 4.17) [113].
Mouse studies have validated clinical and epidemiological data on the role of iron as a driver of carcinogenesis and, furthermore, have provided mechanistic insights. For instance, deletion of the Apc gene, whose inactivation is an early event in colorectal cancer development, rendered mice susceptible to tumorigenesis driven by luminal iron [114]. Other studies demonstrated that dietary heme promotes epithelial hyperplasia in mice through oxidative stress, hyperproliferation and reduced apoptosis of intestinal epithelial cells [115,116]. Thus, it is likely that the toxicity of excess dietary heme is directly relevant to colorectal cancer. Nevertheless, since dietary heme promotes gut dysbiosis [117], which in turn may affect metabolic syndrome and other pathologies [118,119,120], it is possible that several of its adverse effects are indirect.
Mutations predisposing to hereditary hemochromatosis have been associated with increased risk of colorectal and breast cancer, further linking iron to these malignancies [121]. Yet the most common type of cancer associated with hereditary hemochromatosis is hepatocellular carcinoma [122], which is also a common complication of transfusion-dependent thalassemias [123]. Interestingly, hemojuvelin knockout (Hjv–/–) mice, a model of juvenile hereditary hemochromatosis, are predisposed to hepatocarcinogenesis by a mechanism linked to mitochondrial hyperactivity [124].
8. Iron and the Intestinal Microbiome
Iron is also essential for microorganisms, and excessive dietary iron intake may affect the composition of bacterial communities in the gut. Iron deprivation has been shown to cause irreversible community alterations in the human intestinal microbiome, whereas iron supplementation resulted in small, person-specific shifts [125]. Changes in microbial composition may be more relevant to specific populations or to patients with increased sensitivity to such alterations. For example, shifts in the gut microbiota caused by iron intake and supplementation have been proposed to influence the progression of NAFLD in obese individuals [126].
Recent studies revealed adverse effects of iron fortification or supplementation on the intestinal microbiome of children in tropical countries who are vulnerable to infectious diseases [127,128], and of inflammatory bowel disease (IBD) patients [129]. In the latter, oral iron caused a decrease in Faecalibacterium prausnitzii and Ruminococcus bromii, bacteria that exert anti-inflammatory effects through butyrate production [129]. IBD poses an extra challenge, as it is often associated with iron-deficiency anemia, in part due to gastrointestinal bleeding [130], which provides an additional source of heme that may alter microbiome composition. Experiments in mice showed that dietary heme exacerbated dextran sodium sulphate (DSS)-induced colitis and promoted the formation of adenomas [131]. Similar results were obtained in further studies using iron-fortified chow in mouse models of colitis [132,133].
The gut microbiota has also been shown to play a role in colorectal cancer progression in mice by sensitizing the mucus barrier in the presence of heme [134]. Heme may favor the growth of Akkermansia muciniphila, a mucin-degrading bacterium, which may lead to disruption of the mucus barrier [135]. Nevertheless, there is evidence suggesting that iron supplementation may be beneficial for the microbiome, depending on the formulation [136,137]. Additional studies are needed to evaluate the impact of dietary iron, and of its various supplementary forms, on the intestinal microbiome in the general population and in specific patient groups.