Can you trust supplement studies? Here's how to read between the lines
Got science? You should. Science provides nutritional guidelines and can help you discern the best supplements for your personal needs. Unfortunately, study results often get lost in translation. The result: regular supplement-bashing headlines, such as “More Evidence That Omega-3 Supplements Don’t Work” (Forbes, September 2012), “Enough Is Enough: Stop Wasting Your Money on Vitamin and Mineral Supplements” (Annals of Internal Medicine, December 2013), and that old chestnut that unfairly cratered the day’s leading botanical, “Depression Not Helped by St. John’s Wort” (ABC News, 2000).
It’s tricky, of course, to conduct pharmaceutical-style, double-blind, placebo-controlled human clinical trials on slower-acting nutrients, so some scientists advocate a different supplement research paradigm based on biomarkers that can suggest health outcomes. In the meantime, here are three things to know when evaluating the latest supplement study news, and what each one means in practice.
1. Meta-analyses have value, but they shouldn’t be considered definitive.
That’s because bias, whether through inclusion or exclusion, invariably creeps in. Meta-analyses are not new and original studies in which participants are given a bioactive ingredient (such as omega-3s or lycopene) and researchers record its effects; “meta” means that researchers scan the existing published literature on an ingredient, and interpretations often depend on exactly what they’re looking for and which studies they choose to include. Two sets of researchers can, and do, reach different conclusions from the same body of evidence.
2. One study does not a conclusion make.
This is certainly true of a meta-analysis, but it’s also true even of a perfectly constructed and conducted human clinical trial. That’s why the Federal Trade Commission wants to make the “two clinical trial” rule the standard for making any sort of implied health claim for nutrients. Trial results need to be validated by another study, conducted by a different set of researchers. Put another way: trust but verify.
3. Any time a published paper suggests that its conclusions are bulletproof, be wary.
If the paper's author claims no further studies need be conducted, that Truth is contained herein, or that you can disregard the efficacy of said ingredient because of this study alone—big, fat “grain of salt” alert.
In fact, when I read studies, I start by reading the last paragraph to see if the authors make such a bold conclusion. Researchers tend to be pretty close-to-the-vest people, but when they make such end-game assertions, I stop reading right there. Hubris doesn’t impress, and it can lead you astray.
Case study: Omega-3 meta-analysis
Consider an infamous omega-3 meta-analysis published in the Journal of the American Medical Association (JAMA) in 2012. Its methods were fine, as far as meta-analyses go: selected studies had to be longer than one year in duration, and the mean omega-3 dose was 1.51 grams per day (770 mg EPA and 600 mg DHA).
The analysis, however, contained wide variations in doses and specific disease states. For example, one study included in the analysis looked at 120 subjects with leg-muscle pain caused by poor blood flow (claudication). Patients took 270 mg EPA per day to see whether it would reduce all-cause mortality, cardiac death, heart attacks, and strokes. Conclusion: the study found a “small reduction in nonfatal coronary events … that warrants further investigation.” I’m thinking that is pretty impressive for such a relatively small EPA dose.
Another study included in the JAMA meta-analysis, meanwhile, used a much larger dose—2,900 mg EPA and 1,900 mg DHA—in patients with hardened arteries (atherosclerosis) and found a statistically significant 30 percent reduction in triglyceride levels (hello, Lovaza!) but only small changes in its primary end point, the diameter of the hardened arteries.
The researchers did mention the first big study on fish oil: the 2002 Italian GISSI trial, which gave 1,000 mg fish oil supplements for a year to more than 11,000 patients who had suffered a heart attack in the previous three months. The fish oil group had significantly fewer deaths. However, the researchers did not include the GISSI trial in their final analysis, which made it much easier to conclude that omega-3 fatty acids had little, if any, effect on major cardiovascular disease events.
The "conclusion"?
The researchers’ summary in this meta-analysis: “Our findings do not justify the use of omega-3 as a structured intervention in everyday clinical practice or guidelines supporting dietary omega-3 PUFA [polyunsaturated fatty acid] administration.” This would be news to those organizations that have issued guidelines vouching for omega-3s: the World Health Organization, the American Heart Association, and the U.S. National Academy of Sciences.
So is that “small effect” the best we can hope for with omega-3s? I say no. That’s because these studies tended to investigate people who were already suffering from a disease state. In addition, most of these patients were already receiving mainstream medicine’s standard of care: lots of pharmaceuticals. So, if you have seriously ill patients taking a suite of drugs, and then add a supplement to the mix, and the results don’t change significantly, does that mean the supplement does not work, or simply that it’s not as powerful as a drug? “I fear that in the fog of science the first one to get shot is the dietary supplement,” says Loren Israelsen, industry veteran and president of the United Natural Products Alliance.