Why I Don’t Quote a Lot of Nutritional Studies


You may have noticed that I don’t quote or cite lots and lots of nutritional (or other) studies or trials on my website.  Why?  While it looks nice to have lots of references attached to your articles, I have come to believe firmly that there are studies to “prove” absolutely ANY point of view a person wants to hold.  In other words, NO MATTER WHAT YOUR OPINION, you can find a study to back you up.  There are studies that “show” that saturated fat is “bad” for you, and studies to show that it is good for you.  There are studies that claim to “show” red meat is “bad” for you, yet on closer examination, you’ll find those same studies show no such thing.  How can that be?

(And I’m not even going to mention here the bias and lack of integrity and conflicts of interest that go into many studies and trials.  We all know it exists, but it is only PART of the problem.)

To borrow a line from a famous movie, we could probably find a study that can prove that “an elephant can hang over a cliff with only his tail tied to a daisy.  But use your common sense.”  (Just for fun, message me if you know the name of the movie, lol!  It was a good one!)

So, while studies are nice to have, I think that your own N=1 (in other words, your own experiment on yourself) is more compelling.

I was recently watching a video clip (HERE) of Dr. Aseem Malhotra and Dr. Joel Kahn debating the role of dietary fat in heart disease.  Dr. Kahn states that there is conclusive proof, in the form of nutritional studies, that Dr. Ornish’s plant-based, low fat diet completely reverses cardiovascular risk factors.  Dr. Malhotra had an EXCELLENT response: in Dr. Ornish’s studies, he also eliminated sugar, processed and refined carbs, and most starches.  He got people to quit smoking, got them to exercise, and helped them control stress.  If ALL of these factors were changed, how can you pinpoint the avoidance of dietary fat as the SOLE cause of the improved risk factors?  The answer is, YOU CAN’T.

And that is one of the BIG problems with nutritional studies.  I’m going to copy my next few paragraphs from a previous post, where they probably got buried in a much bigger, unrelated article.

Let’s talk about that very assertion that a low fat, plant-based diet is the definitive answer to heart disease.  (And I’m not anti-plant-based diet.  However, I think that if you choose a plant-based diet, you should do so because you prefer it, not because a study “proved” it was “better.”)

Here we go.

Some folks say there are studies that “prove” that meat, or animal products for that matter, are bad for you. They say that there are studies that show that “meat eaters are less healthy.” Well, it’s the meat, right? (Wink wink) And then, by association, if meat is bad for you, then ALL animal products are, therefore, bad for you. Or is it that some of the meat eaters in these particular studies also smoked, drank more alcohol, exercised less, and ate more sugar and refined carbohydrates than the plant eaters in these studies?

The law of good science states that only ONE variable can be tested at a time to prove any causative link. So you would have to take two groups where EVERYTHING is exactly the same (i.e., all participants are non-smokers, all exercise the exact same amount, all avoid alcohol, all avoid sugar and refined carbs, all get the same amount of sleep, all have stress management support, etc.) except that one group adds some meat to their diet and one goes without.

No such study, to my knowledge, has ever been done. In fact, this is what makes nutritional studies nearly impossible: two variables will ALWAYS be involved. If you lower one macronutrient, you increase another. If you don’t replace the missing macronutrient, then the calories won’t be the same (another variable).  That is why nutritional studies are often not very definitive, and why I’d rather see each person experiment for themselves, based on their own health markers, to find what is best for them.

(In my next article, we will discuss some of those important health markers and their optimal levels.)

Here is an excellent article entitled “I asked 8 researchers why the science of nutrition is so messy. Here’s what they said.”  I’m going to copy and paste a portion of the article directly from the original post, as it was so great at explaining this issue that there is no way I could top it.  I am, however, going to add my comments (in bold italics).

(The full, original article is HERE.)


“I asked 8 researchers why the science of nutrition is so messy.
Here’s what they said.”

“There was a time, in the distant past, when studying nutrition was a relatively simple science.  In 1747, a Scottish doctor named James Lind wanted to figure out why so many sailors got scurvy, a disease that leaves sufferers exhausted and anemic, with bloody gums and missing teeth. So Lind took 12 scurvy patients and ran the first modern clinical trial.

The sailors were divided into six groups, each given a different treatment. The men who ate oranges and lemons eventually recovered — a striking result that pointed to vitamin C deficiency as the culprit.  This sort of nutritional puzzle solving was common in the pre-industrial era. Many of the troubling diseases of the day, such as scurvy, pellagra, anemia, and goiter, were due to some sort of deficiency in the diet. Doctors could develop hypotheses and run experiments until they figured out what was missing in people’s foods. Puzzle solved.

Unfortunately, studying nutrition is no longer that simple. By the 20th century, medicine had mostly fixed scurvy and goiter and other diseases of deficiency. In developed countries, these scourges are no longer an issue for most people.

Today, our greatest health problems relate to overeating. People are consuming too many calories and too much low-quality food, bringing on chronic diseases like cancer, obesity, diabetes, and heart disease.  Unlike scurvy, these illnesses are much harder to get a handle on. They don’t appear overnight; they develop over a lifetime. And fixing them isn’t just a question of adding an occasional orange to someone’s diet. It involves looking holistically at diets and other lifestyle behaviors, trying to tease out the risk factors that lead to illness.

Nutrition science has to be a lot more imprecise. It’s filled with contradictory studies that are each rife with flaws and limitations. The messiness of this field is a big reason why nutrition advice can be confusing.  It’s also part of why researchers can’t seem to agree on whether tomatoes cause or protect against cancer, or whether alcohol is good for you or not, and so on, and why journalists so badly muck up reporting on food and health.

To get a sense for how difficult it is to study nutrition, I spoke to eight health researchers over the past several months. Here’s what they told me:

1.  It’s not practical to run randomized trials for most big nutrition questions.

In many areas of medicine, the randomized controlled trial is considered the gold standard for evidence. Researchers will take test subjects and randomly assign them to one of two groups. One group gets a treatment; the other gets a placebo.

The idea is that because people were randomly assigned, the only real difference between the two groups (on average) was the treatment. So if there’s a difference in outcomes, it’s fair to say that the treatment was the cause. (This was how James Lind figured out that citrus fruits seemed to have an effect on scurvy.)
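For readers who like to see the mechanics, the balancing power of random assignment can be sketched in a few lines of Python. Everything here is invented for illustration (a made-up population with a hidden “smoker” trait), but it shows why randomization tends to even out confounders without anyone having to measure them:

```python
import random

def randomize(subjects, seed=0):
    """Randomly split subjects into two equal groups (the core of an RCT)."""
    rng = random.Random(seed)
    shuffled = subjects[:]
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    return shuffled[:half], shuffled[half:]

# Hypothetical population: each subject carries a hidden confounder ("smoker").
rng = random.Random(42)
population = [{"id": i, "smoker": rng.random() < 0.3} for i in range(1000)]

treatment, control = randomize(population, seed=1)

def smoker_rate(group):
    return sum(s["smoker"] for s in group) / len(group)

# Because assignment ignored smoking status entirely, both groups end up
# with roughly the same smoker rate -- the confounder is balanced by design.
print(round(smoker_rate(treatment), 2), round(smoker_rate(control), 2))
```

With 500 people per group, chance alone keeps the two smoker rates within a few percentage points of each other; that is the whole trick of the randomized trial.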

The problem is that it’s just not practical to run these sorts of rigorous trials for most important nutrition questions. It’s too difficult to randomly assign different diets to different groups of people and have them stick with those diets for enough time to find clues about whether certain foods caused certain diseases.

“In an ideal world,” said the British physician and epidemiologist Ben Goldacre, “I would take the next 1,000 children born in Oxford Hospital, randomize them into two different groups, and have half of them eat nothing but fresh fruit and vegetables for the rest of their lives, and half eat nothing but bacon and fried chicken. Then I’d measure who gets the most cancer, heart disease, who dies the soonest, who has the worst wrinkles, who’s the most clever, and so on.”  But, Goldacre adds, “I would have to imprison them all, because there’s no way I would be able to force 500 people to eat fruits and vegetables for life.”

(This statement again shows bias, as if bacon and fried chicken fall into the same category, assuming all fat is bad.  Fried chicken is bad because it is dredged in flour and then fried in highly inflammatory polyunsaturated oils.  That is not even close to fresh meat cooked in its own natural fats.  We see this all the time when they say to “avoid high fat foods” and show a picture of a sugar drenched doughnut!  Hello!  We’re blaming the wrong enemy.)

It’s undeniably a good thing that scientists can’t imprison people and force them to stick to a particular diet. But it means that real-world clinical trials on diet tend to be messy and not so clear-cut.

Take the Women’s Health Initiative, which featured one of the biggest and most expensive nutrition studies ever done. As part of the study, women were randomly assigned to two groups: One was told to eat a regular diet and the other a low-fat diet. They were then supposed to follow the diet for years.

The problem? When researchers collected their data, it was clear that no one did what they were told. The two groups had basically followed similar diets.  “They spent billions of dollars,” says Walter Willett, a Harvard physician and nutrition researcher, “and they never tested their hypothesis.”

Conversely, it is possible to conduct rigorous randomized controlled trials for very short-term questions. Some “feeding studies” keep people in a lab for a period of days or weeks and control everything they eat, for example. But these studies can’t measure the effects of specific diets for decades — they can only tell us about things like short-term changes in cholesterol. Researchers then have to infer what long-term health effects might result. There’s still some educated guesswork involved.

(The Women’s Health Initiative was an EXCELLENT example of study bias.  The study was created to “prove” the benefits of the low fat diet.  When it failed to do so, rather than admit that the low fat diet is an abysmal failure, they blamed the study participants for “doing it wrong.”  In other words, the study wasn’t flawed; the participants were.  If the study had “proved” their point, they would have considered it a triumph for the low fat diet.  But since it showed no benefit, they blamed the study participants.  Typical…)

2.  Instead, nutrition researchers have to rely on observational studies — which are rife with uncertainty

So instead of randomized trials, nutrition researchers have to rely on observational studies. These studies run for years and track very large numbers of people who are already eating a certain way, periodically checking in to see, for example, who develops heart disease or cancer.  This study design can be very valuable — it’s how scientists learned about the dangers of smoking and the benefits of exercise. But because these studies aren’t controlled like experiments, they’re a lot noisier and less precise.

An example: Say you wanted to compare people who eat a lot of red meat with fish eaters over many decades. One hitch here is that these two groups might have other differences as well. (After all, they weren’t randomly assigned.) Maybe fish eaters tend to be higher-income or better-educated or more health-conscious, on average — and that’s what’s leading to the differences in health outcomes. Maybe red meat eaters are more likely to eat lots of fatty foods or smoke.   Researchers can try to control for some of these “confounding factors,” but they can’t catch all of them.

(This is what I talked about above when I said that only one variable can be tested; otherwise, the groups must be identical except for that ONE variable.)
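To make the confounding problem concrete, here is a toy calculation in Python. All of the numbers are invented purely for illustration: the “meat” group looks worse in the crude comparison only because it happens to contain more smokers, while within each smoking stratum the two diets have identical disease rates:

```python
# Hypothetical observational data (all counts invented for illustration).
# counts: (group, smoker?) -> (n_subjects, n_with_disease)
data = {
    ("meat", True):  (400, 80),   # 20% disease among meat-eating smokers
    ("meat", False): (100, 10),   # 10% among meat-eating non-smokers
    ("fish", True):  (100, 20),   # 20% among fish-eating smokers
    ("fish", False): (400, 40),   # 10% among fish-eating non-smokers
}

def crude_rate(group):
    """Naive disease rate, ignoring smoking entirely."""
    n = sum(v[0] for (g, _), v in data.items() if g == group)
    d = sum(v[1] for (g, _), v in data.items() if g == group)
    return d / n

def stratified_rates(group):
    """Disease rate within each smoking stratum."""
    return {smoker: data[(group, smoker)][1] / data[(group, smoker)][0]
            for smoker in (True, False)}

# Naive comparison: meat eaters look worse overall (0.18 vs 0.12)...
print(crude_rate("meat"), crude_rate("fish"))
# ...but within each smoking stratum the rates are identical.
print(stratified_rates("meat"), stratified_rates("fish"))
```

This is exactly the kind of reversal that “controlling for confounders” tries to undo, and it only works for the confounders researchers think to measure.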

3.  Another difficulty: Many nutrition studies rely on (wildly imprecise) food surveys

Many observational studies — and other nutritional research — rely on surveys. After all, the scientists can’t hover over every single person and watch what they eat for decades. So they have subjects report on their diets.

This poses an obvious challenge. Do you remember what you ate for lunch yesterday? Did you sprinkle nuts or dressing on your salad? Did you snack afterward? Exactly how many potato chips did you eat?

Chances are you probably can’t answer these questions with any certainty. And yet, a lot of nutrition research today rests on just that kind of information: people’s self-reporting from memory of what they ate.  When researchers examined these “memory-based dietary assessment methods” for a paper in the Mayo Clinic Proceedings, they found that this data was “fundamentally and fatally flawed.” Over the 39-year history of the National Health and Nutrition Examination Survey — which is a national study based on self-reported food intake — the researchers found that the alleged number of calories consumed by 67 percent of the women in the study was not “physiologically plausible” given their body mass index.

This may be because people lie about what they eat, offering answers that are more socially acceptable. Or it may be a simple failure of memory. Whatever the cause, this leaves researchers in a tricky place, so they’ve developed protocols to account for some of those errors.
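The “physiologically plausible” screen mentioned above can be sketched as a simple sanity check. The sketch below uses the Mifflin-St Jeor equation to estimate resting calorie needs; the 1.0x–2.5x bounds are illustrative assumptions of mine, not the cutoffs the Mayo Clinic Proceedings paper actually used:

```python
def mifflin_st_jeor_bmr(weight_kg, height_cm, age, male):
    """Estimated resting energy need (kcal/day), Mifflin-St Jeor equation."""
    base = 10 * weight_kg + 6.25 * height_cm - 5 * age
    return base + (5 if male else -161)

def plausible_intake(reported_kcal, weight_kg, height_cm, age, male,
                     low=1.0, high=2.5):
    """Very rough screen: reported calories should fall somewhere between
    ~1x resting needs and ~2.5x (heavy activity).  The bounds here are
    illustrative assumptions, not the cutoffs from the actual paper."""
    bmr = mifflin_st_jeor_bmr(weight_kg, height_cm, age, male)
    return low * bmr <= reported_kcal <= high * bmr

# A 75 kg, 165 cm, 45-year-old woman reporting 900 kcal/day gets flagged:
# that is well below what her body needs just at rest.
print(plausible_intake(900, 75, 165, 45, male=False))   # prints False
```

Surveys full of reports that fail even this kind of crude check are what led the researchers to call the data “fundamentally and fatally flawed.”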

Christopher Gardner, a Stanford nutrition researcher, says in some studies he provides food for people. Or he has dietitians go over people’s diets in detail, checking them against their body weight and health outcomes to make sure they seem valid. He builds in margins of error to account for potential problems in recall.  But he concedes that he and others in his field dream of having better tools, like chewing and swallowing monitors or wrist motion detectors that track “plate-to-mouth motion.”

Even better, said Gardner: “I want a camera, a stomach implant, a poop implant, and a thing in the toilet that grabs your pee and poop before you flush it away and electronically sends information off about what was in there.”

(This is actually how we got in this mess to begin with.  The low fat dietary recommendations were decided based on food surveys.  Now you know why we have been in so much trouble!)

4.  More complications: People and food are diverse

As if the problems with observational studies and survey data weren’t enough, researchers are also learning that different bodies have really different responses to the same food. That makes nutrition research even more difficult, introducing another confounding factor.

In a recent study published in the journal Cell, Israeli scientists tracked 800 people over a week, continuously monitoring their blood sugar levels to see how they responded to the same foods. Every person seemed to respond wildly differently, even to identical meals, “suggesting that universal dietary recommendations may have limited utility,” the researchers wrote.  “It’s now clear that the impact of nutrition on health cannot be simply understood by assessing what people eat,” said Rafael Perez-Escamilla, a professor of epidemiology and public health at Yale, “as this is strongly influenced by how the nutrients and other bioactive compounds derived from foods interact with the genes and the extensive gut microbiota that individuals have.”

(This is why you have to experiment on yourself.  No broad general recommendations will be good for all people.)

Making things even more maddeningly complicated, seemingly similar foods can differ wildly in nutrition profile. A local, farm-fresh carrot will probably be less diluted in its nutrients than a mass-produced baby carrot that’s been bagged in the grocery store. A hamburger at a fast-food restaurant will have different fat and salt content compared with one made at home. Even getting people to better report on every little thing they put into their bodies can’t completely address this variation.

There’s also the issue of food replacement: When you choose to eat something, you’re usually eating less of something else. So if a person decides to stick to a diet mostly composed of legumes, for example, that means he’s not eating red meat or poultry. This raises a question when studying his health outcomes: Was it the legumes he ate lots of, or the meat he didn’t eat, that made the difference?

(Once again, more than one variable being tested.)

The last problem is nicely illustrated by studies of dietary fat. When researchers followed people who ate low-fat diets, they realized that health outcomes were really affected by what study participants replaced the fat with. Those who replaced fat with sugary, refined carbohydrates ended up having obesity and other health issues at least as frequently as those eating higher-fat diets.

(Higher fat diets have NEVER been shown to be a problem on their own, only when they are mixed with high carb intake.)

5.  Conflict of interest is a huge problem in nutrition research

There’s one final problem with nutrition research that adds to the confusion. Right now, nutrition science is horribly underfunded by government — leaving lots of space for food companies and industry groups to sponsor research.

This means, quite simply, that food and beverage makers pay for many nutrition studies — with sometimes dubious results. More troubling: The field of nutrition research hasn’t quite caught up to medicine when it comes to building in safeguards to address potential conflicts of interest.  “So much research is sponsored by industry,” wrote nutrition and food policy researcher Marion Nestle in a recent issue of JAMA, “that health professionals and the public may lose confidence in basic dietary advice.”

Industry-funded studies tend to have results that are more favorable to industry. Between March and October last year, Nestle identified 76 industry-funded studies. Of those, 70 reported results that were favorable to the industry sponsor. “In general,” she wrote, “independently funded studies find correlations between sugary drinks and poor health, whereas those supported by the soda industry do not.”

(In other words, you can buy a study to say anything you want.)


While nutrition studies are not COMPLETELY futile, I would NOT rely on them heavily to make decisions about what you personally should eat.  As I mentioned earlier, you need to do what is best for YOU.  What gives you the best outcomes.  What makes you feel your best, helps you control your weight, and achieves optimal health markers.  And lastly, it must be a plan you can live with for life.  Temporary changes to your lifestyle will lead to temporary results.  That is why diets don’t work.  You have to find a way of eating and living that you are able and willing to follow for life, for long-term results.

So, if you don’t see a lot of nutrition studies quoted on this page, it’s NOT because I don’t read them.  I have read hundreds and hundreds of them, and I often share their content in a logical, common-sense way for my readers to benefit from.  There are some researchers out there who are unbiased and whose integrity I have come to trust.  However, I’m just not going to cite every statement I make with a reference, because in the end, only you can decide what is best for you, and you will only be able to determine that through your own self-experiment.

As always, be well.