
Evolutionary Perspectives On Iron Fortification and Infant Formula

While finishing my graduate dietetics coursework, I've taken a hiatus from nutrigenomics work and delved into some breast milk analysis. The whole field of breast milk research is extremely interesting, especially from an evolutionary perspective. For some great work on breast milk from evolutionary perspectives, check out Elizabeth Quinn and Katie Hinde's work.

Elizabeth Quinn recently published a paper in the American Journal of Human Biology addressing the issue of iron fortification in infant formula (1). Iron is a major area of concern in infant nutrition. Breast milk, the gold standard for infant nutrition, is quite low in iron, and its iron content appears to be relatively unaffected by the mother's intake. For the first 6 months of life, an infant relies on the iron stores it accumulated during gestation - one of the many reasons a mother's adequate iron intake is so important throughout pregnancy. Iron is essential for an infant's normal growth. So why is it so low in breast milk?

[Image credit: blog.scientificamerican.com]
Quinn eloquently argues that the low iron content of breast milk was selected for throughout human evolution. The major selective pressure driving low iron was infection - many pathogens require iron to proliferate. Maintaining a low iron status would not only be protective against infection, but might also limit an infection's duration. Pathogens have been a major selective pressure throughout evolutionary history, and studies showing evidence of pathogen-driven selection are widespread in the biology and anthropology literature (2). It is not surprising that human breast milk, important for its role in the development of the infant immune system (3) and replete with IgA, leukocytes, and lysozymes (4), may also limit pathogen proliferation by restricting iron availability.

Quinn takes issue less with the introduction of iron-rich foods as part of complementary feeding after 6 months, and more with the current iron fortification of American formulas. She notes that the iron in formula is less bioavailable than the iron in human milk, so more must be added to ensure adequate absorption. The iron content of human milk for a typical male infant averages about 0.47 mg/L, with an absorption rate of 29.15%. Formula is fortified with iron at 4, 7, or 12 mg/L, with an absorption rate of about 7%. Despite the lower absorption rate, formula delivers more absorbed iron at every fortification level. This could not only lead to iron overload in infants, but also leave high levels of unabsorbed iron available to the developing gut microbiota. Quinn cites several studies showing that formula-fed infants have higher levels of Escherichia, Enterococci, and Clostridia (not so good) compared with breast-fed infants, who have higher levels of Lactobacillus and Bifidobacterium (good). This evolutionarily novel intestinal iron content is likely altering the development of the gut microbiota - and with the increasing focus on the gut microbiome and its influence on disease susceptibility, iron fortification of infant formula could be an area of concern.
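The arithmetic behind that comparison is worth making explicit. A minimal per-liter sketch, using only the concentrations and absorption rates cited above (no other values are assumed):

```python
# Absorbed iron per liter = concentration (mg/L) x fractional absorption.
# All figures are the ones cited in the text above.
breast_milk_mg_per_l = 0.47
breast_milk_absorption = 0.2915   # ~29.15%

formula_levels_mg_per_l = [4, 7, 12]
formula_absorption = 0.07         # ~7%

bm_absorbed = breast_milk_mg_per_l * breast_milk_absorption
print(f"Breast milk: {bm_absorbed:.3f} mg absorbed per liter")

for level in formula_levels_mg_per_l:
    absorbed = level * formula_absorption
    unabsorbed = level - absorbed
    print(f"Formula {level} mg/L: {absorbed:.2f} mg absorbed, "
          f"{unabsorbed:.2f} mg unabsorbed per liter")
```

Even at the lowest fortification level (4 mg/L), the absorbed dose per liter is roughly double that of breast milk, and the large majority of the added iron passes through unabsorbed to the developing gut.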

This issue of fortification at such high levels is largely unique to the U.S. The ESPGHAN global standards for iron fortification of infant formula were set at 4-8 mg/L; in the U.S., however, formula is generally fortified with 10-12 mg/L. ESPGHAN concluded that current data support fortification with 4-8 mg/L of iron for normal growth and development, and that higher levels may be harmful.

It has long been assumed that the morbidity and mortality discrepancy between breast-fed and formula-fed infants was due to breast milk's protective immunological factors (IgA, leukocytes), but the role of iron fortification should be considered as well. I will be interested to see whether this commentary affects fortification levels in U.S. formulas. Iron is certainly important for normal growth and development, especially for pre-term infants. But for healthy individuals, if there's one message that seems to ring true throughout nutrition research, it's that more is not always better.

This is also an important example of how to use evolutionary theory to ask questions about health-related outcomes. Quinn identifies a trait that has potentially been selected for, and backs up her modest recommendations with substantial evidence. I often point out that we can't rely on evolutionary theory alone to determine health recommendations, but we can certainly use evolutionary inference as a starting point for questioning current ones.

1. http://onlinelibrary.wiley.com/doi/10.1002/ajhb.22476/abstract
2. http://hmg.oxfordjournals.org/content/13/suppl_2/R245.full
3. http://jn.nutrition.org/content/138/9/1782S.full
4. http://www.jaoa.org/content/106/4/203.full
