

Taking Supplementation Seriously, Part I

There is an ongoing debate over whether dietary supplements deserve a place in a health-promoting strategy. Several medical organizations do not advise routine supplementation for people without underlying deficiencies, citing safety concerns or a lack of clear evidence of benefit, and suggest that an adequate diet should be sufficient to provide proper nutrition. Prophylactic use of supplemental vitamins or minerals, like iron, has sparked controversy. On the other hand, there is a wealth of published, peer-reviewed scientific data presenting strong correlations between adequate nutrient intake and lowered disease risk and incidence, as well as studies in which nutrient interventions demonstrated significant health benefits. Hyperbolic media reports that “resveratrol may make you live longer” or “multivitamins may cause prostate cancer” further complicate the dialog.

How does one make the decision to take a dietary supplement? Perhaps it is best to start off simply. Let’s forget, for the moment, the potential benefits of the myriad of compounds that fall under the category of supplement (CoQ10, carnitine, probiotics, the scores of plant-derived antioxidants, etc.), and consider a basic supplement choice: the multivitamin. Most multivitamins truly fit the definition of dietary supplement: they provide a set of essential nutrients that should exist in the diet, but may be otherwise insufficient. What is essential? One definition is a chemical compound that is required for the proper health of an organism, but cannot be synthesized by the organism and must be obtained through diet. In humans there are at least 40 essential nutrients, including vitamins, trace elements, amino acids, and fatty acids (see box below). Additional minerals and compounds (silicon, lutein, boron) are not officially classified as essential, but have indispensable roles in human metabolism and are also only available through the diet. Add in the conditionally essential nutrients, like PABA or tyrosine (which we can make ourselves, but often not in sufficient quantities), and you can see that an adequate diet has considerable chemical complexity.

Nutrients that are Recognized as Essential

Vitamins: A, B1, B2, B3, B5, B6, B7 (biotin), B9 (folate), B12, Bp (choline), C, D, E, K

Minerals: Calcium, chlorine, chromium, copper, iodine, iron, magnesium, manganese, molybdenum, nickel, phosphorus, potassium, selenium, sodium, zinc

Amino acids: Histidine, isoleucine, leucine, lysine, methionine, phenylalanine, threonine, tryptophan, valine

Fatty acids: Alpha-linolenic acid, linoleic acid

So in this simple case, the question becomes “Is my diet adequate to provide sufficient amounts of all the essential nutrients I need?” The answer, of course, varies by individual, but there are several observations that suggest supplementation of some nutrients may deserve your serious consideration:

1) The food supply may not be as healthy as it used to be

The deteriorating food supply is a popular target for nutritional discussion. Even if we were to ignore the proliferation of growth hormones, pesticides, heavy metals, antibiotics, and GE organisms, and concentrate only on nutrient content, our food supply still leaves a lot to be desired (the increased availability of organic and artisanal products notwithstanding). Whether it has resulted from selecting cultivars for lower resource usage or greater pest resistance, or from elevated atmospheric CO2 levels that increase the carbohydrate-to-micronutrient ratio, the yield of vitamins and minerals in fruits and vegetables has fallen over the last half century. For example, an analysis of 43 different food crops found that from 1950 to 1999, the average content of calcium, phosphorus, iron, total minerals, and vitamins A, B2, B3, and C all dropped significantly, by up to 38 percent for B2 and 15 percent for vitamin C. (Interestingly, one of the only vegetables to increase in nutrient value was the carrot, which more than doubled its vitamin A content thanks to selection for a more orange color.) Related studies have found average reductions of 24 percent, 27 percent, and 46 percent in the levels of magnesium, iron, and calcium, respectively, among vegetables, and statistically significant reductions in calcium, magnesium, copper, and sodium in vegetables and fruits from their levels in the 1930s.
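For clarity, the percent declines cited above are simply the relative change between the two survey years. A minimal sketch of that arithmetic, using illustrative placeholder values rather than the studies’ actual measurements:

```python
# Percent decline in a nutrient's content between two survey years.
# The numbers below are hypothetical placeholders, not published data.
def percent_decline(old: float, new: float) -> float:
    """Relative drop from old to new, as a percentage of the old value."""
    return (old - new) / old * 100

# A drop from 50 to 31 (arbitrary units) is a 38 percent decline.
print(percent_decline(50.0, 31.0))  # 38.0
```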

Even beef, which many would not consider healthy, has not been spared. The common practice of raising beef cattle in feedlots reduces their omega-3 content by over 75 percent, doubles their omega-6 content, and triples their fat content compared to their free-range brethren. Although few would now consider cattle a viable source of essential alpha-linolenic acid, or of CLA, EPA, or DHA, it probably was at one point for many Americans.
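Those two shifts compound: cutting omega-3 by 75 percent while doubling omega-6 multiplies the omega-6:omega-3 ratio eightfold. A quick sketch, where only the multipliers come from the figures above and the starting amounts are hypothetical relative units:

```python
# Effect of feedlot finishing on the omega-6:omega-3 ratio.
# Starting amounts are hypothetical; the multipliers (omega-3 down 75%,
# omega-6 doubled) come from the figures cited in the text.
omega3, omega6 = 1.0, 2.0          # hypothetical free-range amounts (relative units)
feedlot_omega3 = omega3 * 0.25     # reduced by 75 percent
feedlot_omega6 = omega6 * 2.0      # doubled

print(omega6 / omega3)                  # 2.0  (free-range ratio)
print(feedlot_omega6 / feedlot_omega3)  # 16.0 (feedlot ratio: 8x higher)
```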

Is the reduction in nutritive value of the modern food supply the definitive case for supplementation? Likely not: fruits and vegetables are still nutrient-dense food choices. But it does suggest that supplementation may at least serve as nutritional “insurance.”

2) Our desire to improve our health can put us at risk of deficiencies.

It comes as no surprise to most that losing weight requires expending more calories than are consumed, and that this can be approached by restricting caloric intake, increasing caloric expenditure through exercise, or some combination of the two. Both approaches, however, carry the risk of nutrient deficiency. Dieting, which necessitates a significant reduction in calories (and, in many cases, a dramatic alteration in food choices), can also limit the intake of essential nutrients. Even a carefully constructed eating plan can be significantly vitamin deficient. In an analysis of four popular, published diet plans that limited calories to 1100–1700 per day (including the NIH and American Heart Association-recommended DASH diet), the daily suggested menus were found to be, on average, only 43.5 percent sufficient in the recommended daily intakes (RDIs) of 27 essential micronutrients. The diet with the lowest average caloric intake, the South Beach Diet, outlined a menu of food choices that was deficient in 21 micronutrients. Diets that restrict calories certainly have a role in reducing obesity; large-scale studies funded by the National Institute on Aging (the CALERIE studies) suggest that significant caloric restriction can lower the risks of heart disease and diabetes, reduce markers of cancer and inflammation, and possibly slow the aging process. When drastically reducing caloric intake, however, one must be cognizant of possible deficiencies in essential micronutrients (vitamins and minerals). This is a case for supplementation.
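A sufficiency figure like the 43.5 percent above is just the average of (intake ÷ RDI) across the tracked micronutrients, with each nutrient capped at 100 percent. A minimal sketch of that calculation; the nutrient values here are hypothetical examples, not the study’s data:

```python
# Average micronutrient sufficiency of a menu: mean of intake/RDI across
# nutrients, each capped at 100%. All values below are hypothetical.
rdi = {"vitamin C (mg)": 90, "calcium (mg)": 1000, "magnesium (mg)": 420}
menu_intake = {"vitamin C (mg)": 60, "calcium (mg)": 550, "magnesium (mg)": 180}

def sufficiency(intake: dict, rdi: dict) -> float:
    """Mean percent of RDI met across all nutrients, each capped at 100%."""
    pcts = [min(intake[n] / rdi[n], 1.0) * 100 for n in rdi]
    return sum(pcts) / len(pcts)

print(round(sufficiency(menu_intake, rdi), 1))  # 54.8
```

The cap matters: without it, a large excess of one cheap nutrient could mask deficiencies in several others.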

During times of large energy expenditure, as in strenuous exercise, our vitamin and mineral needs can increase. For example, while thiamin deficiency is rare, short-term insufficiencies can increase the buildup of lactic acid during exercise, contributing to fatigue. Cellular concentrations of essential trace minerals can drop during exercise, and subclinical deficiencies (significant enough to affect health, but not enough to cause classical deficiency symptoms) have been observed for B2 and B6 in physically active people. Dieting combined with strenuous exercise, naturally, can further increase the possibility of deficiency for several vitamins. Similarly, data suggest that in older athletes, who have lower energy requirements and consume fewer nutrients, supplementation of several vitamins and minerals (B2, B6, B12, D, E, folate, calcium, and iron) may be warranted.

3) Despite fortification, there are still significant nutrient deficiencies in our population.

It appears that most Americans are starting out life with nutritionally complete diets; infants, toddlers and preschoolers generally meet or exceed dietary reference intakes (DRIs) for almost all essential nutrients where DRIs have been established. As we progress to adulthood, however, significant deficiencies in nutrient intake become apparent. Some deficiencies—phosphorus, iron—are significant for particular demographics, while others—vitamin D, potassium—may affect the majority of the population.

Potassium is the major intracellular ion in the body, required for neural transmission, muscle contraction, and vascular tone. Adequate intake (AI) has been set at 4.7 g/day for adults; most adults have a median dietary intake substantially lower than this (2.8–3.3 g/day in men and 2.2–2.4 g/day in women). Less than 3 percent of the population consumes the AI. (It should be noted that the amount of potassium in over-the-counter supplements is limited to less than 100 mg per serving, making potassium a poor choice for supplementation.)

According to data from the National Health and Nutrition Examination Survey, nearly 75 percent of light-skinned and up to 90 percent of dark-skinned Americans are vitamin D insufficient (based on circulating blood levels of the vitamin), a doubling over the last 10 years. Humans can, in theory, produce sufficient vitamin D for their metabolic needs (full-body UV exposure during summer months can produce 10,000–20,000 IU of D3 in 10–12 minutes in a light-skinned individual). However, in latitudes above 40 degrees (about half of the United States), there is insufficient UVB irradiation to produce vitamin D during winter months, and even summer sun exposure to the face and arms produces only minimal vitamin D. Additionally, the wavelengths of UV light needed to produce D are those associated with elevated skin cancer risk, which strengthens the case for supplementation for this vitamin.

The average intake of choline for older children, men, women, and pregnant women is far below the AI; adults over 70 consume an average of 264 mg/day, half of the recommended 550 mg. Only about 10 percent of Americans have usual choline intakes at or above the AI. Choline is essential for neurotransmitter synthesis (acetylcholine), cell-membrane signaling (phospholipids), lipid transport (lipoproteins), and for forming S-adenosylmethionine (SAMe), a ubiquitous metabolic intermediate and methyl donor.

B12 (cobalamin) insufficiency may be fairly prevalent according to data from the Framingham Offspring Study: about 8 percent of adults were found to have a frank deficiency, and almost 40 percent of adults over a wide range of ages (23–83 years old) had suboptimal levels, which may increase their risk of neurological and cardiovascular problems. Although the body can store several years’ worth of B12, up to 38 percent of adults may exhibit mild deficiency and depleted stores. Vitamin B12 and B6 status is especially dependent on age, as both are absorbed less efficiently with age.

Calcium, magnesium, and phosphorus are essential to skeletal health and required for hundreds of metabolic reactions. Only 30 percent of the U.S. population over two years of age consumes the RDI for calcium, and less than 55 percent consume the recommended amount of magnesium. While most adults get enough phosphorus in their diets, a significant population of older women (10 percent of women over 60 and 15 percent of women over 80) gets less than 70 percent of the RDA. Phosphorus is required for the formation of calcium hydroxyapatite (the crystalline form of calcium in bone); deficiency increases osteoporosis risk.

Iron deficiency is most prevalent in women (9–15 percent of women 12–49 years old) and toddlers (3–14 percent of children 1–5 years old), based on data from the 2003–2006 NHANES. Part of this deficiency may actually be the result of poor absorption rather than low intake: iron from grains or fortified grains (“non-heme iron”) is taken up less efficiently than the “heme iron” found in animal products. Women have, on average, been steadily increasing their grain consumption and decreasing their beef consumption since the 1970s; this switch to a less available form of iron in the diet may explain the observed deficiency in whole-body iron stores despite an adequate average intake.

Kevin M. Connolly, PhD

Kevin M. Connolly, PhD received his bachelor’s degree in anthropology from Brown University, and doctorate in biochemistry and molecular biology from UCLA. Before consulting for the dietary supplement industry, he spent 15 years in basic biochemistry research elucidating such diverse mechanisms as bacterial antibiotic resistance and collagen synthesis. He contributes to several online and print publications, and is a frequent guest on radio health programs throughout the country. When not writing, he teaches undergraduate biochemistry.