Many medical organizations advise against routine supplementation with vitamins and minerals (citing “safety concerns,” a “lack of evidence of benefit,” or that they are simply “unnecessary”) and recommend a focus on acquiring nutrients from the diet. For the most part, this is good advice: no combination of supplemental vitamins, minerals, or other nutrients could possibly emulate the diversity of known (and unknown) beneficial compounds found in the diet. However, blanket dismissals of dietary supplements often fail to acknowledge the significant portion of micronutrients in the average diet that may come from the fortification of foodstuffs. Food fortification (the addition of nutrients to foodstuffs for commercial benefit or as part of public health policy) has been credited with the eradication of several diseases of nutrient deficiency in the U.S.

If both fortification and supplementation similarly involve the addition of vitamins or minerals to the diet (often achieved using the exact same chemical “additives”), why is there such a disparity in the perceptions of each?

In the United States, foods were being fortified even before the concept of recommended daily intake had been established. Beginning with the addition of iodine to table salt in the 1920s to stem the prevalence of goiter, the enrichment of common dietary foodstuffs with vitamins or minerals became the preferred means of abating nutrient deficiency epidemics through the 1930s, such as pellagra (vitamin B3 deficiency) in the South and rickets (vitamin D deficiency) in the Northeast. Around this time, one-third of Americans had poor diets, with over 10 percent showing signs of vitamin deficiency. This frequency of nutrient deficiency motivated the creation of the first recommended daily allowances (RDAs) for iron, calcium, and vitamins A, B1, B2, B3, C, and D by the Committee on Food and Nutrition of the National Research Council (the predecessor to the modern Food and Nutrition Board). RDAs served as the guidance for fortification of low-cost “staple” foodstuffs; by the end of the 1950s, breads, flour, pasta, cornmeal, and white rice (enriched with iron and vitamins B1, B2, and B3), as well as milk (enriched with vitamin D and optionally vitamin A), all had formal standards for fortification that were encouraged and regulated by the FDA. Folic acid was added to the list of grain product enrichments in 1996 as a measure for the prevention of neural tube defects. This addition represented a paradigm shift for the FDA, which weighed the benefit of fortification for a small, poorly defined group (women who may become pregnant) against the potential for excessive intake by the rest of the population.

Nowadays, fortified foods are prevalent in the American food supply. Fortification is no longer driven only by government encouragement or mandate; it often provides a competitive advantage for food manufacturers and has expanded to include nearly all essential and several non-essential nutrients (such as phytosterols or fatty acids from fish oil). Ready-to-eat cereals and vitamin C-fortified drinks are the largest contributors of fortified nutrients to the U.S. food supply. Ninety-two percent of breakfast cereals are fortified and contribute up to 30 percent of the daily intake of many vitamins and minerals for adults and children; many cereals are fortified to 100 percent of the daily value for nearly all essential vitamins and minerals.

While food fortification has been viewed by some as a nutritional triumph (and to be fair, it has produced significant decreases in several nutrient deficiency diseases that were common in the early twentieth century), individual supplementation with vitamins or minerals has at best been deemed unnecessary without an underlying deficiency, and at worst been called potentially hazardous. So what are the differences between supplementation and fortification that make one more acceptable than the other?

In reality, there is little difference between these two schemes for improving nutrient intake. Food fortification uses many of the same ingredients as supplements; comparing the ingredient lists on a cereal box and a multivitamin bottle will reveal several of the same chemical compounds. Baking these vitamins or minerals into a loaf of bread doesn’t make them any more effective than if they were compressed into a tablet. While there is a pervasive assumption that “food nutrients” are more effective than “supplemental nutrients,” this assumption has little relevance to the added vitamins and minerals of fortified foods. The argument that supplemental vitamins or minerals may lead to unnecessarily high, potentially detrimental daily nutrient intakes may have some merit (it is probably easier to “overdose” on multivitamins than to eat a dangerous amount of fortified bran flakes), although excessive vitamin intake is also possible through fortified foods (the possible link between long-term excessive folic acid intake and the risk of colon cancer is a recent concern).

Unfortunately, many organizations do not recognize the parity between fortification and supplementation, or simply fail to acknowledge that a substantial portion of nutrients in a “healthy diet” (even diets based on their own recommendations) may actually be supplemental (“fortified”). Consider, for example, the TLC diet, the heart-healthy eating plan laid out in the Third Report of the National Cholesterol Education Program. The TLC plan meets most conventional definitions of “healthy eating”: it encourages a diet low in calories and saturated fat and high in fiber, fruits, and vegetables. Looking at the sample TLC menus, however, one begins to realize that a significant percentage of several essential nutrients comes from fortification, and not necessarily from the foods themselves. In one menu (“Traditional American Cuisine”), for example, almost half of the iron and vitamins B1, B2, and B3, 40 percent of the folate and calcium, and all of the vitamin D may come from fortified foods (including enriched flour products, rice, milk, and calcium-fortified orange juice), based on an analysis of the menu choices using the USDA National Nutrient Database for Standard Reference. Without fortification, these meals could be deficient in all but one of the above nutrients.
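
As a rough sketch of how such an estimate can be made, the short Python example below tallies a single nutrient across a day’s menu and computes the share attributable to fortification. The food names and nutrient values here are placeholders for illustration only, not actual USDA figures; in practice, each menu item would be looked up in the USDA National Nutrient Database for Standard Reference.

    # Hypothetical sketch: estimate what share of one nutrient (e.g., iron, in mg)
    # in a day's menu comes from fortification. Values below are placeholders,
    # not actual USDA figures; real numbers would come from the USDA National
    # Nutrient Database for Standard Reference.

    menu_items = [
        # (food item,              total per serving, amount added by fortification)
        ("enriched white bread",       2.0,            1.4),
        ("spinach salad",              3.0,            0.0),
        ("fortified breakfast cereal", 8.0,            8.0),
    ]

    def fortified_share(items):
        """Fraction of the nutrient total attributable to fortification."""
        total = sum(total for _, total, _ in items)
        added = sum(added for _, _, added in items)
        return added / total if total else 0.0

    # For these placeholder values, roughly 72 percent of the nutrient is "supplemental."
    print(f"{fortified_share(menu_items):.0%} of this nutrient comes from fortification")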

The blanket characterization of dietary supplements as “unnecessary” therefore ignores the prevalence of supplemental vitamins (“food fortification”) in the diet. Whether as an added food ingredient, a capsule, or a tablet, supplemental vitamins can play, and have played, a recognized role in the prevention of disease when their intakes are properly balanced. Take any statements to the contrary with a grain of (iodized) salt.

Kevin M. Connolly, PhD

Kevin M. Connolly, PhD received his bachelor’s degree in anthropology from Brown University, and doctorate in biochemistry and molecular biology from UCLA. Before consulting for the dietary supplement industry, he spent 15 years in basic biochemistry research elucidating such diverse mechanisms as bacterial antibiotic resistance and collagen synthesis. He contributes to several online and print publications, and is a frequent guest on radio health programs throughout the country. When not writing, he teaches undergraduate biochemistry.