A Brief History of Food Fortification in the U.S.

According to the World Health Organization (WHO), food fortification is the practice of deliberately increasing the content of one or more micronutrients (i.e., vitamins and minerals) in a food or condiment to improve its nutritional quality, thereby providing a public health benefit with minimal risk to individual health. But why is fortification necessary? Decades of research have identified fortification as one of the most cost-effective nutrition interventions available, particularly for low- and middle-income countries. Fortifying commonly consumed foods enhances nutrient intake throughout the lifespan for populations at risk of widespread nutritional deficiencies.

Even in wealthier countries, like the United States, fortification has led to positive health benefits for the entire population. In the U.S., micronutrient deficiency diseases like goiter, rickets, beriberi, and pellagra were common health problems as recently as the early 20th century. Thanks to systematic fortification of the U.S. food supply, these diseases have been virtually eliminated. Read on to learn more about the historical origins of food fortification in the U.S., as well as insights into fortification’s contributions to improved public health.

But before we dive in, let’s define two food-related terms that are often used interchangeably but are slightly different:

  • Fortification is the addition of nutrients to a food that were not originally present in that food, or that were present in different amounts. As a term, fortification is often used more broadly (as in the WHO definition above) than enrichment.
  • Enrichment refers to the addition of vitamins and minerals to a food in order to restore those nutrients to levels that were found in the food prior to storage, handling, and processing.

The 1920s: Iodine in Salt

During the 1921 American Medical Association (AMA) convention, two Ohio doctors presented findings from their clinical trial demonstrating that sodium iodide treatments effectively prevented goiter (an enlarged thyroid gland) in Akron schoolgirls. Prior to their study, research from Europe had suggested an association between iodine deficiency and goiter. Without iodine, the body cannot properly make thyroid hormones, which often results in a visible goiter at the neck or, in more serious cases, neurocognitive impairments. Iodine deficiency generally occurs in areas where agricultural soil has been depleted of iodine by flooding, heavy rainfall, or glaciation.

Shortly after the publication of the 1921 study results, Michigan became the first state to institute a public campaign to encourage the consumption of dietary iodine via fortified salt. An extensive educational campaign that involved schoolteachers, industry stakeholders, and the medical and public health communities helped increase consumer awareness and demand for iodized salt. By 1924, iodized salt was commonplace, even though iodization was never made mandatory by the U.S. government. Following the implementation of the iodized salt awareness campaign, epidemiological studies found a significant decline in the incidence of goiter, further confirming the success of the program in Michigan. Most table salt continues to be fortified with iodine today.

The 1930s: Vitamin D in Milk

In the early 20th century, rickets (a childhood condition in which bones become soft and malformed due to impaired bone mineralization) was common among poor children living in industrialized northern U.S. cities. Inadequate nutrition, poor hygiene, and lack of exercise were among the factors believed to play a role in the development of the disease, but the relationship between diet and rickets was not clearly understood until an English physician conducted the first experimental study of rickets in dogs. His observations of specific “anti-rachitic” factors found in cod liver oil, butter, and whole milk eventually led to the identification, purification, and synthesis of vitamin D. Before this discovery, irradiated milk, cod liver oil, and milk from yeast-fed cattle were common treatments for rickets. Once synthetic vitamin D became widely available, these older approaches to treating and preventing rickets were no longer needed.

We now know that our bodies can produce vitamin D when skin cells are exposed to sunlight, making the sun an important source of the vitamin. However, because of concerns about skin cancer from sun overexposure, dietary sources remain an important way to meet recommended vitamin D intakes. Naturally occurring sources of vitamin D are limited mostly to oily fish (e.g., salmon, mackerel, and sardines) and cod liver oil. Consequently, the Food and Drug Administration (FDA) has established a standard of identity (SOI) for milk that includes the optional addition of vitamins A and D (in the case of evaporated milk, vitamin D addition is required rather than optional). An SOI is a legal definition of a food, such as milk or yogurt, that specifies exactly what can and cannot be added to that food, and in what amounts. Today, most of our milk is fortified with vitamin D. Beyond fortified milk, select foods such as cereals and orange juice may also be fortified with vitamin D. To check whether vitamin D is present in a food, you can consult the Nutrition Facts label.

The 1940s: B Vitamins in Grains

By 1938, the American Medical Association’s Council on Foods and Nutrition endorsed the addition of nutrients to foods when there was sufficient scientific justification that doing so would improve public health. At the time, the American diet relied heavily on refined flours, and because processing wheat into refined flour removes essential B vitamins, B-vitamin deficiencies were more common than they are today. One example is niacin deficiency, a condition known as pellagra, which was widespread in southern states and characterized by “the four D’s”: diarrhea, dermatitis, dementia, and death. To a lesser degree, beriberi (a neurological disorder caused by thiamin deficiency) and riboflavin deficiency, which presents as redness, swelling, and cracking of the mouth and tongue, also persisted.

To address these health conditions, bakers began to add high-vitamin yeasts to their breads, followed by synthetic B vitamins as those became increasingly available. By the end of 1942, 75 percent of white bread on the U.S. market was enriched with thiamin, riboflavin, niacin, and iron, helping to alleviate these devastating conditions. That same year, the U.S. Army agreed to purchase only enriched flour for its soldiers in order to improve the health of recruits, creating additional demand for enrichment of the nation’s white bread.

Still, after World War II, the FDA decided not to mandate enrichment when it developed its standard of identity for bread; instead, it established separate standards of identity for enriched and non-enriched products. In the ensuing years, standards of identity for other enriched grain products, such as corn meals, rice, and macaroni and noodle products, have also been established.

The 1990s: Folic Acid in Grains

In 1992, the U.S. Public Health Service recommended that all women of reproductive age get 400 micrograms (mcg) of folic acid (also known as vitamin B9) daily to prevent neural tube defects in developing fetuses. In 1998, the Food and Nutrition Board of the National Academies of Sciences, Engineering, and Medicine (NASEM) echoed this guidance, recommending that all women capable of becoming pregnant take 400 mcg of folic acid daily (from fortified foods, supplements, or a combination of the two), in addition to consuming folate from a varied diet, in order to reduce the risk of having a baby with a neural tube defect.

In 1998, following several studies showing that folic acid supplementation reduced the occurrence of neural tube defects in newborns, and based on recommendations from the U.S. Public Health Service, the Centers for Disease Control and Prevention (CDC), and others, the U.S. government launched a public health intervention requiring manufacturers to fortify cereal grain products already labeled as “enriched” with 140 mcg of folic acid per 100 grams of flour. According to the CDC, researchers using data from several birth defect tracking systems have found that, since the beginning of U.S. folic acid fortification, about 1,300 babies who might otherwise have had a neural tube defect are born each year without one. These data have resoundingly confirmed that folic acid fortification is a successful way to prevent neural tube defects.
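To get a rough sense of what the 140 mcg per 100 grams standard means at the level of a single serving, here is a minimal arithmetic sketch in Python. The serving names and weights are illustrative assumptions, not figures from this article or from FDA labeling rules.

```python
# A rough, illustrative calculation only: what the folic acid fortification
# standard (140 mcg per 100 g of enriched cereal grain product) can contribute
# toward the recommended 400 mcg per day. Serving names and weights below are
# assumptions for illustration, not figures from the article or FDA labeling.

FORTIFICATION_MCG_PER_100G = 140   # folic acid added per 100 g of enriched grain
DAILY_RECOMMENDATION_MCG = 400     # U.S. Public Health Service recommendation

def folic_acid_from_grain(grams: float) -> float:
    """Estimate mcg of folic acid contributed by a given weight of enriched grain."""
    return grams * FORTIFICATION_MCG_PER_100G / 100

# Hypothetical servings (approximate weights)
servings = [
    ("1 cup of enriched flour (about 125 g)", 125),
    ("2 slices of bread made with about 50 g of enriched flour", 50),
]

for name, grams in servings:
    mcg = folic_acid_from_grain(grams)
    share = 100 * mcg / DAILY_RECOMMENDATION_MCG
    print(f"{name}: roughly {mcg:.0f} mcg folic acid ({share:.0f}% of 400 mcg)")
```

Under these assumed serving sizes, a single serving of an enriched grain product supplies only a fraction of the 400 mcg daily recommendation, which is why fortification is framed as one contributor alongside folate from a varied diet and, where recommended, supplements.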

In Conclusion

Food fortification has evolved tremendously over the years and has now expanded to include a variety of vitamins and minerals that are added to foods with the express intention of maintaining and improving public health. For more information on food fortification, check out the WHO, NASEM, and CDC resources below: