Sunday, January 17, 2016

Cholesterol effects on the heart

The controversy surrounding the lipid hypothesis, in particular the relationship between elevated total and LDL cholesterol and coronary heart disease, was considered largely resolved and regarded as scientific fact by 1984, when an expert panel convened by the National Institutes of Health (NIH) reviewed the relevant literature and agreed that the relationship was causal.
Since 1984, evidence accumulated from over 100 randomized controlled trials of various medical and dietary lipid-modifying interventions has further established that lowering LDL cholesterol significantly decreases the risk of coronary heart disease and all-cause mortality, independent of changes to HDL cholesterol and triglycerides and of the non-lipid effects of specific drugs.3 4
Controversy, however, has lingered over whether medical and dietary interventions that lower total and LDL cholesterol, and perhaps triglycerides, may increase the risk of certain stroke subtypes, in particular hemorrhagic stroke. The controversy has arisen in part from the interpretation of certain statin trials, prospective cohort studies, and observational studies in populations with unique cardiovascular profiles, in particular the Japanese. This has led some to suggest that physiological levels of LDL cholesterol (less than 70 mg/dl; 1.8 mmol/l), the levels observed in newborn humans, free-ranging mammals, and human populations on low cholesterol diets that do not develop atherosclerosis, may somehow increase the risk of hemorrhagic stroke.
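Since cholesterol values appear in both mg/dl and mmol/l throughout this discussion, here is a minimal unit-conversion sketch in Python. The 38.67 factor follows from cholesterol's molar mass of about 386.7 g/mol; the function names are just illustrative.

# Convert total or LDL cholesterol between mg/dL and mmol/L.
# 1 mmol/L = 38.67 mg/dL (cholesterol molar mass ~386.7 g/mol).

MGDL_PER_MMOLL = 38.67

def mgdl_to_mmoll(mgdl):
    return mgdl / MGDL_PER_MMOLL

def mmoll_to_mgdl(mmoll):
    return mmoll * MGDL_PER_MMOLL

print(round(mgdl_to_mmoll(70), 2))   # 1.81 -- the 'physiological' LDL threshold above
print(round(mmoll_to_mgdl(1.8), 1))  # 69.6 mg/dL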

There are two major categories of stroke: ischemic and hemorrhagic. Ischemic stroke occurs as a result of an obstruction of the blood supply to the brain, while hemorrhagic stroke occurs as a result of a rupture of a weakened blood vessel. In contrast to the observed decline of stroke incidence in Japan, where a number of major risk factors improved significantly but mean serum cholesterol increased, Finland experienced one of the highest rates of stroke mortality in the world as well as one of the largest declines, which was partly explained by a decrease in serum cholesterol.8 Unlike Japan, Finland also experienced the highest rate of coronary heart disease mortality in the world as well as the largest decline, which was predominantly explained by cholesterol-lowering dietary changes. Furthermore, evidence suggests that Japanese Zen monks, who consume significantly less meat and fish than the general Japanese population, experience lower rates of stroke and all-cause mortality, independent of BMI, alcohol intake, and other lifestyle factors.
At the opposite end of the dietary spectrum, higher rates of stroke mortality have been observed among the three main Inuit populations, in Greenland, Canada, and Alaska, compared to their non-Inuit Western counterparts, even though these populations experience similar rates of non-stroke cardiovascular mortality. Evidence of atherosclerosis and other chronic and degenerative diseases has been observed in numerous preserved Inuit mummies dating back to before Western contact, suggesting that the Inuit's high rate of cardiovascular mortality cannot be entirely explained by modern dietary and lifestyle influences. Furthermore, the declining rates of cardiovascular mortality, including stroke, among Inuit undergoing a rapid transition towards a Western diet and lifestyle have raised questions regarding the health properties of the traditional Inuit diet based on marine animals.
Coronary atherosclerosis in a pre-contact Inuit mummy dating back 1,600 years*


Recently, the largest meta-analysis of statin-based randomized controlled trials on the effect of lowering LDL cholesterol on the risk of stroke was published, including 31 trials with >182,000 participants and >6,200 cases of stroke. Statins significantly decreased the risk of total and ischemic stroke and all-cause mortality, without evidence of publication bias, consistent with findings from animal studies. There was, however, a small, statistically non-significant increase in the incidence of hemorrhagic stroke in the statin group, which was related neither to the degree of LDL reduction nor to the achieved LDL level. The researchers provided the following possible explanation for these findings:
In addition to their lipid-lowering properties, statins may have antithrombotic properties by inhibiting platelet aggregation and enhancing fibrinolysis. The antithrombotic effects of statins could account for a theoretically increased risk of bleeding complications.
All of the very large prospective cohort studies that included >300,000 participants have found either no association between total and LDL cholesterol and risk of hemorrhagic stroke, an inverse association confined to participants with hypertension, or a positive association confined to participants with low blood pressure. A prospective study of >787,000 Korean participants with >9,900 cases of stroke found that while serum cholesterol was associated with a higher risk of ischemic stroke, there was suggestive evidence that the inverse association between serum cholesterol and hemorrhagic stroke, confined to hypertensive participants, was not causal but acted as a marker of binge drinking. The researchers explained:
In our study, increased risk of hemorrhagic stroke in people with low concentrations of blood cholesterol (less than 4.14 mmol/l) was restricted to those with high GGT values [a measure of alcohol intake]; this relation was less evident when alcohol consumption was measured by self report. The measures of blood pressure might not have been a true reflection of risk, as transient high blood pressure associated with binge drinking may have an important role in hemorrhagic stroke. At low concentrations of GGT, low serum cholesterol was not associated with a higher risk of hemorrhagic stroke. In effect, low blood cholesterol may act as a marker of the health damaging effects of alcohol, rather than be a cause of hemorrhagic stroke.
There may be limitations to studies that only address whether blood pressure, categorized by hypertension status, modifies the association between serum cholesterol and risk of stroke. As with hypercholesterolemia, the definition of hypertension, a blood pressure of >140/90 mmHg, far exceeds levels that have been clearly documented as optimal. For example, a meta-analysis of 61 prospective studies including >958,000 participants and >11,900 stroke deaths found that lower usual blood pressure was associated with a reduced risk of mortality from stroke and coronary heart disease, without any evidence of a threshold down to at least 115/75 mmHg. These findings are consistent with a meta-analysis of 147 randomized controlled trials of blood pressure lowering medication. This justifies investigating whether optimal blood pressure, compared to high-normal blood pressure, further modifies the association between serum lipids and the risk of stroke subtypes.
A meta-analysis of 61 prospective studies with >892,000 participants and >11,600 cases of stroke deaths found not only that serum cholesterol was inversely associated with total and hemorrhagic stroke mortality in participants with very high baseline systolic blood pressure (>145 mmHg), but that lower serum cholesterol was actually associated with a significantly lower risk of hemorrhagic, ischemic, and total stroke mortality in participants with near optimal or ‘physiological’ baseline systolic blood pressure (less than 125 mmHg) (Fig. 1). As most participants in the age range most susceptible to stroke had either high-normal blood pressure or hypertension, the combined results were biased towards finding an inverse association between serum cholesterol and hemorrhagic stroke mortality.
Figure 1. Systolic blood pressure specific hazard ratios for 1 mmol/L lower usual total cholesterol and risk of stroke mortality
If this association is causal and not obscured by other factors such as binge drinking, it may explain why populations with low cholesterol and high blood pressure, such as the Japanese, have high rates of stroke, in particular hemorrhagic stroke, while populations that maintain physiological levels of both cholesterol and blood pressure throughout life show an observed absence of stroke.

There is limited suggestive evidence that atherosclerotic build-up in the carotid and major cerebral arteries, caused by excess LDL cholesterol, reduces the arterial blood supply to the brain, blood flow that would otherwise rupture weakened cerebral vessels in the presence of high blood pressure. This could explain why elevated cholesterol may lower the risk of cerebral hemorrhage in people with high blood pressure. Indeed, a Japanese study found an inverse association between cholesterol and hemorrhagic stroke in an earlier cohort, when mean blood pressure was high and atherosclerosis was relatively uncommon, but no association in a later cohort of the same population, after mean blood pressure had dropped from hypertensive to high-normal levels.
Several, but not all, observational studies have also found that low triglycerides were associated with a statistically significant or non-significant increased risk of hemorrhagic stroke. There is limited data on whether the association between low triglycerides and hemorrhagic stroke is modified by blood pressure or alcohol intake, but at least one large study found that the association was stronger among participants with high blood pressure.
As there is convincing evidence that elevated blood pressure increases the risk of stroke at any given cholesterol concentration, it would be advisable for everyone to aim for an optimal blood pressure of less than 115/75 mmHg. A number of lifestyle changes, including exercise and weight loss, can lower blood pressure, and so can several dietary changes: reducing salt intake and increasing intake of dietary fiber-rich foods (such as whole grains), flavonoid-rich foods (such as berries, soy, and cocoa solids), and vitamin C and magnesium. These nutrients, derived primarily from whole plant foods, may in turn explain why intervention and observational studies have found that vegetarian diets, in particular vegan diets, have favorable effects on blood pressure.

As statins provide little appreciable protection against cancer, and like all drugs have adverse effects, including but not limited to an increased risk of developing type II diabetes and memory loss or impairment, a significantly greater benefit would be achieved by lowering LDL cholesterol with a whole-foods, plant-based diet combined with regular exercise, which lowers not only the risk of cardiovascular disease but of many other chronic and degenerative diseases. I review the evidence on dietary factors and the risk of stroke.

Friday, January 8, 2016

Biotin: all you need to know


We’ve just updated our page on biotin, an essential vitamin also known as vitamin B7. It was discovered in a yeast culture at the same time as several other B vitamins.
Biotin was initially researched in the context of skin and hair health, and today it is almost exclusively sold as a dietary supplement marketed to improve skin, nail, and hair quality. However, biotin is plentiful in food and rarely needs to be supplemented.
Apart from removing all biotin-containing food from your diet (which makes it nigh impossible to maintain a healthy diet), the only way to cause a biotin deficiency is by eating excessive amounts of raw egg whites. Egg whites contain a protein called avidin, which binds to biotin and eliminates it from the intestines before it can be absorbed; avidin is destroyed when the egg is cooked.
At the moment, biotin does not have much evidence to support its use as an aesthetics supplement. Though it is biologically plausible that increasing biotin intake or correcting a deficiency could improve nail, hair, and skin quality, there is only one study to date to support this claim. That study found that women supplementing 2.5 mg of biotin over six months experienced improved nail health, but the participants were suffering from brittle and splitting nails to begin with. There are no strong studies suggesting that healthy people supplementing biotin would experience any benefit.
There is preliminary evidence to suggest biotin could have a mild anti-diabetic effect. Much more research is needed to confirm this hypothesis.
Biotin is an essential vitamin, but it’s an underwhelming dietary supplement. There is very little evidence to support its use as a health and beauty supplement.

The skinny on the trans fat ban


On June 16, 2015, the United States Food and Drug Administration (FDA) announced their decision to eliminate trans fat from food in the United States by 2018, with a gradual phase-out period beginning immediately.
Take THAT, trans fat advocates! Hold on ... are there any trans fat advocates? While some dislike government regulation of foods and nutrients, there isn't much debate about trans fat's health effects anymore.
This brings up a question … if we all know that trans fat is bad, why is it still a public and personal health issue? Well, it is true that trans fat consumption has dipped considerably, with blood levels dropping by 58% in the 2000s. But every increment of industrially produced trans fat is incrementally harmful, and the National Academy of Sciences has concluded that there is no safe dose of trans fat.
So out of all the nutrients and nutrient-like substances out there, trans fat holds the dubious distinction of being one of the only categorically harmful ones. And you might not always know that you're consuming trans fat, since some soybean and canola oils can have hidden trans fat inside.
Trans fat is an unsaturated fatty acid and a byproduct of partially hydrogenated oils (PHOs). It is found in many processed food products, including margarine, coffee creamer, fast food, frozen pizza, snack foods, and baked goods. Trans fat is also found in some peanut butter. It is frequently used by the food industry because it improves the flavor stability and shelf life of food. Since trans fat has a different melting point depending on how processed it is, it's also a very flexible ingredient. But aside from these benefits, it seems that the primary reason trans fat was added to the food system was the demonization of saturated fat by the USDA in the 1950s. By the 1980s, activist organizations were denouncing food manufacturers for using ‘unhealthy’ saturated fats in their foods and endorsing trans fat as a ‘healthier’ alternative. Considering the benefits to shelf life, flavor stability, and flexibility, manufacturers gladly made the change.
Some types of trans fat are naturally produced by ruminant animals, a group that includes cattle, sheep, goats, buffalo, deer, and other animals with four stomach compartments. The first and largest compartment, called the rumen, is where trans fat is produced. Humans create trans fat through a commercial process called hydrogenation, in which hydrogen gas is bubbled through heated oil (usually vegetable oil), making the oil more saturated; the degree of saturation determines its thickness.
Medical professionals consider trans fat to be one of the most unhealthy compounds found in today's food. Trans fat consumption is associated with increased low-density lipoprotein cholesterol (LDL-C) and inflammation, and decreased high-density lipoprotein cholesterol (HDL-C). These changes can speed up the development of atherosclerosis (the clogging and hardening of arteries) and increase the risk of diabetes, coronary heart disease, and cardiac-related sudden death. However, a recent systematic review strongly suggests that these negative health effects are primarily attributable to the consumption of industrially produced trans fatty acids (IP-TFA), not ruminant-derived trans fatty acids (R-TFA). In fact, most animal models have demonstrated that IP-TFA and R-TFA have different effects on CVD risk factors. For instance, a rat study showed that supplementation with an R-TFA called vaccenic acid had either a neutral or beneficial effect on CVD risk markers such as total cholesterol, LDL-C, and fasting and postprandial triglycerides.
Trans fat can be made commercially, or naturally by certain animals. It is used in the food industry to improve flavor and shelf life, but the FDA has announced it will be phased out of the U.S. food supply because it is damaging to health.

Trans fat and disease risk

This increased risk is significant. A 2006 meta-analysis found that a 2% increase in trans fat intake is associated with a 23% increase in cardiovascular disease risk. Cutting commercial trans fat intake from 2.1% of daily energy intake to 1.1% could potentially prevent 72,000 cardiovascular deaths every year in the U.S.; a drop to 0.1% of daily energy intake could potentially prevent 228,000.
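To make those percentages concrete, here is a back-of-the-envelope sketch converting trans fat as a share of daily energy into grams per day. The 2,000 kcal reference intake is an assumption (individual intakes vary); 9 kcal per gram of fat is the standard Atwater factor.

# Convert trans fat as % of daily energy into grams per day.
DAILY_KCAL = 2000    # assumed reference intake
KCAL_PER_G_FAT = 9   # standard Atwater factor for fat

def trans_fat_grams(percent_energy):
    return percent_energy / 100 * DAILY_KCAL / KCAL_PER_G_FAT

for pct in (2.1, 1.1, 0.1):
    print(f"{pct}% of energy ~ {trans_fat_grams(pct):.1f} g trans fat/day")
# 2.1% ~ 4.7 g, 1.1% ~ 2.4 g, 0.1% ~ 0.2 g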
While the evidence on ruminant-produced trans fat isn’t conclusive regarding potential heart health benefits (especially at the doses commonly ingested), a recent meta-analysis points to no detrimental impact on cardiovascular disease markers.
Even though the FDA has recognized the negative health effects of trans fat and is taking steps to remove it, trans fat is still prevalent in our food. While the American Dietetic Association (ADA) recommends that no more than 1% of your daily calories come from trans fat, unclear nutrition labels can sneak a lot of trans fat onto your plate. If a nutrition label says the product contains “partially hydrogenated” fat or “zero grams of trans fat,” that doesn't mean there is no trans fat in the product. This is because the FDA previously allowed products to be labeled as having zero grams of trans fat as long as a serving had less than 0.5 grams. Multiple servings of “zero grams of trans fat” food can add up to much more ingested trans fat than the ADA recommends.
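Here is a toy illustration of how that labeling loophole compounds, assuming a product sitting just under the 0.5 g per serving disclosure threshold (the 0.49 g figure and the 2,000 kcal diet are hypothetical):

# "Zero grams trans fat" servings stacked against a 1%-of-calories ceiling.
PER_SERVING_G = 0.49             # hypothetical: just under the 0.5 g labeling threshold
DAILY_LIMIT_G = 0.01 * 2000 / 9  # 1% of an assumed 2,000 kcal diet ~ 2.2 g

for servings in (1, 3, 5):
    total = servings * PER_SERVING_G
    print(f"{servings} serving(s): {total:.2f} g "
          f"({total / DAILY_LIMIT_G:.0%} of the ~{DAILY_LIMIT_G:.1f} g daily limit)")
# 5 "trans fat free" servings ~ 2.45 g, already over the recommended ceiling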
Trans fat consumption is a significant contributor to cardiovascular disease. The FDA has long recognized this and finally decided to gradually eliminate it from our food system by 2018. Until then, any industrially produced trans fats still present in our food system should be avoided, though this can be quite difficult due to confusing and misleading nutritional labels.

Fermented foods, neuroticism, and social anxiety


Today’s competitive society is full of stressed people. Extreme and debilitating distress, along with the fear of being judged and criticized by other people, can cause panic and social anxiety, characterized by intense sweating, shaking, muscular tension, confusion, and an elevated heart rate. Social anxiety can make social situations very difficult, and if it occurs often, it can severely interfere with day-to-day activities, to the point where socially anxious people will avoid social interactions altogether.
Social anxiety is also known as social phobia, as defined by the Diagnostic and Statistical Manual of Mental Disorders (DSM-5). With up to 10.7% of people experiencing this condition at some point in their life, it is the third most common lifetime anxiety and mood disorder in the United States.
Social anxiety, a debilitating disorder that makes social situations extremely distressful and difficult to navigate, is the third most common lifetime anxiety and mood disorder in the United States.
Some research suggests that phobias are, at least in part, hereditary. In fact, a recent twin study found that the sibling more likely to develop social phobia was the one who inherited genes predisposing them to neuroticism, a personality trait characterized by the tendency to respond poorly to stressors, often leading to negative emotions such as anger, envy, nervousness, guilt, anxiety, and depression.

Treating social anxiety

Fortunately, there are a variety of potential treatments for this disorder. Traditionally, cognitive behavioral therapy and selective serotonin reuptake inhibitors (SSRIs) are used. Recently, probiotics (defined as “live microorganisms that, when administered in adequate amounts, confer a health benefit to the host”) have shown promise as a supplement to the traditional treatments for social anxiety. Even though the research is still in its infancy, the fact that probiotics have excellent safety profiles and traditional treatments often only provide partial symptom relief makes them enticing treatment targets.
Research suggests that social anxiety may have a hereditary component. A couple of common treatments exist, but they often provide only partial relief of symptoms. Fortunately, probiotics have shown some very early promise as a potentially safe supplement to traditional treatments.
A recent study has been touted in the media as providing evidence for the anti-anxiety efficacy of consuming fermented foods that are likely to contain active probiotic cultures. Tantalizing headlines included “Sauerkraut Could Be The Secret To Curing Social Anxiety”. However, there were several limitations, warranting a much deeper look than most media outlets took. Let’s see what this study can really tell us, if anything.

The study

Researchers provided surveys to 710 university students to determine their levels of social anxiety, neuroticism, and agoraphobia. The survey also asked how often the students exercised and how often they ate fermented food. As hypothesized, students who ate more fermented foods tended to experience less social anxiety. Moreover, the researchers found that social anxiety and neuroticism were positively correlated, and that the more neurotic a person was, the greater the chance that high fermented food intake was associated with lower predicted social anxiety.

However, before jumping to conclusions, there are a few extremely important limitations to consider. This was a cross-sectional study, which only shows correlation, not direct causality. The authors cannot be sure if it was the high fermented food intake that led to low levels of social anxiety or if low levels of social anxiety led to increased consumption of fermented food. It is also possible that an unknown variable caused both the increased consumption of fermented foods and the decreased social anxiety.
It’s also possible that some property of the foods other than their probiotic content affected social anxiety. Furthermore, surveys and questionnaires are subject to self-report bias: the authors can't be sure the participants were truthful, or could even remember exactly what they ate or how much they exercised over the past thirty days. And since the sample was made up of college students, the findings may not be applicable to the general population. Finally, and most importantly, lower levels of predicted social anxiety were also observed in participants who ate more fruit and vegetables, as well as in those who exercised frequently. It's not possible to determine whether exercise or fruit and vegetable consumption are confounding variables. This is especially important in light of a recent randomized controlled trial that found a reduction in symptoms of social anxiety following two months of aerobic exercise.
Based on survey data from 710 university students, a recent study found that consumption of fermented food likely to contain active probiotic cultures was inversely associated with predicted levels of social anxiety. However, due to many limitations and confounding variables, further research is needed before any assertions can be made.
That being said, this study is consistent with other clinical trials that have demonstrated anxiolytic effects of pre- and probiotics in humans. Unfortunately, the exact biological mechanism is still unclear. However, preclinical animal trials provide mounting evidence that certain gut microbiota can exert anxiolytic effects through gut-brain pathways, possibly via the vagus nerve. Supporting these findings, the ability of the gut and brain to communicate bidirectionally through neural, endocrine, and immune pathways, known as the gut-brain axis, has long been recognized, and recent research has made it increasingly clear that interactions with intestinal microbiota are an important part of this communication.
Furthermore, a couple more specific potential mechanisms for how the probiotics confer their anxiolytic effects have been proposed. For instance, given that research has found a positive association between gut inflammation and anxiety-like behaviors, some have hypothesized that probiotics could potentially colonize the gut, displacing species that are harmful to health, and, in turn, may reduce gut inflammation and the associated anxiety-like behaviors. Others have proposed the involvement of the serotonergic system in the neurobiology of anxiety, especially since research surfaced suggesting that certain intestinal microbiota can increase levels of tryptophan in the blood, and therefore potentially facilitate the turnover of serotonin in the brain.
Overall, research on probiotics and anxiety is still in its early stages. According to the authors, this is the first study to provide observational evidence, however limited, specifically linking fermented food intake to social anxiety, and they did not mean to imply causality. Due to the limitations imposed by the study design and the number of possible confounding variables, this study should serve only as preliminary evidence, especially considering how strong a confounding variable exercise is, as several papers have demonstrated its anxiolytic effects. However, if further well-conducted RCTs can establish a causal role, independent of exercise and other possible confounders, probiotics or fermented food consumption could serve as a great low-risk supplement to traditional treatments for social anxiety.
While the study results seem to support probiotic supplementation to help treat social anxiety, they can easily be misinterpreted given the study's several limitations and confounding variables. The only thing we can state for certain is that further well-conducted RCTs are necessary before we draw any conclusions about probiotics and their possible low-risk health benefits.

The science behind munchies: marijuana and your appetite


We all know that marijuana is a popular recreational drug, and that it's also got a variety of medicinal uses, including reducing nausea and boosting appetite. But what, exactly, is marijuana, and how does it affect the appetite and digestive system?
The answer to that first question is pretty simple, so let's start with it. The term ‘marijuana’ refers to several plants in the Cannabis genus, including sativa, indica, and ruderalis.
Doctors typically prescribe marijuana to treat inflammatory, gastrointestinal, and cognitive ailments. Marijuana is also frequently administered to cancer patients, since it helps ease the pain associated with chemotherapy while increasing the patient’s appetite. This is why marijuana is used in an effort to minimize weight loss, which could lead to further health complications.
As you can imagine, this increase in appetite is one of marijuana's most well-known effects; you might know it as “the munchies”. In fact, historical sources confirm that as early as 300 BCE, people knew that cannabis stimulates appetite, and noted that these cravings were for sweet and savory food. Let's dig into why that happens.

How marijuana works

One of the main active ingredients in marijuana, a chemical compound known as tetrahydrocannabinol (THC), is one of the main culprits behind “the munchies”. Once marijuana is consumed (normally by smoking), THC activates a receptor called cannabinoid receptor type 1 (CB1), which helps increase appetite. CB1 also interacts with the receptor for ghrelin, a hormone that increases the sensation of hunger.
CB1 receptors appear in a variety of different areas of the body. In each of these areas, these CB1 receptors act in slightly different ways - and many of those effects help increase the desire to eat. CB1 receptors are found in all of the following areas:
  • The hypothalamus and rhombencephalon, two sections of the brain that help regulate food intake.
  • The basal ganglia, where they may help enhance the pleasure we get from eating.
  • The stomach and the small intestine, which also secrete ghrelin, speeding up digestion.
  • The limbic forebrain, where they may also influence the palatability of food.
Researchers have found that inhaling cannabis is also associated with lower levels of peptide tyrosine tyrosine (PYY), a peptide that contributes to appetite suppression. People who use marijuana recreationally tend to have increased levels of ghrelin and decreased levels of PYY, which may be one reason why their daily caloric intake tends to be greater.
Studies have also shown that a person’s method of THC consumption (oral capsules, smoke inhalation, or suppository) can influence their food choice, as well as their overall food consumption. For example, study participants who took a suppository consumed significantly more calories throughout the day than participants who took an oral capsule.
Recent research on CB1 has revealed that a synthetic form of THC (dronabinol) can activate a subset of neurons called proopiomelanocortin (POMC) neurons. POMC neurons are usually responsible for the feeling of fullness after a meal, but they can release either hormones that suppress hunger or hormones that increase appetite. When CB1 is activated, POMC neurons switch from suppressing hunger to promoting appetite.

Suppressing appetite through the CB1 receptor

Since activating the CB1 receptor contributes to an increase in appetite, blocking it has the opposite effect. Studies on individual cells show that blocking CB1 receptors significantly increases production of adiponectin, a hormone with anti-inflammatory effects and a negative correlation with obesity.
Researchers have also used compounds that can block the CB1 receptor - which are known as endocannabinoid antagonists - to treat obesity associated with eating disorders, which is characterized by compulsive binge eating or cravings for sweets and snacks. Animal studies show that rats given rimonabant, an endocannabinoid antagonist anti-obesity drug, experience weight loss and reduced levels of blood insulin.
Still, a lot more research is needed before we can start recommending these kinds of therapies to human patients. Rimonabant, for example, failed to earn approval from the U.S. Food and Drug Administration (FDA), and it's no longer sold in Europe either, due to side effects including severe depression and suicidal thoughts. Since CB1 receptors are found all throughout the body, it is difficult to pinpoint the cause of these side effects.
Future endocannabinoid antagonists, however, may play a role in treating obesity by blocking CB1 receptors, increasing adiponectin production, and reducing appetite.
Marijuana has been a part of our society longer than any one civilization, and researchers continue to paint a more complete picture of the compound with every passing year. Follow-up studies will not only need to investigate CB1’s effects throughout the body, but also the different ways THC functions when ingested in various ways. More research on marijuana may also lead to breakthroughs in the fight against obesity because of how effective manipulating hunger can be when it comes to controlling our daily caloric consumption.

Really-low-fat vs somewhat-lower-carb - a nuanced analysis



Introduction

With so many low-carb trial results rolling in each year, you might think that it’s case closed: everything there is to know is known. But there are still a few key pieces missing, and one of those pieces has just been released in the form of a six-day feeding study. Why only six days? That can’t tell us anything, right? Read on to see how revealing this study actually was, as well as what it can’t show (and likely wasn’t designed to show).
Low-carbohydrate diets have become even more popular in the past few years, bolstered by the so-called “carbohydrate-insulin hypothesis of obesity”. This hypothesis suggests that carbohydrates are the main culprit of weight gain. Things get complicated here, because there are both practical factors (e.g. going low-carb means limiting your food options, which typically makes snacking on junk food more difficult) and physiological factors at play. For the latter category, advocates claim that you can harness the power of decreased insulin levels (from carb restriction) and lose more fat due to factors such as elevated free fatty acid release from fat cells and increased fat oxidation.
Some low-carb advocates believe that carbohydrates are uniquely fattening, due to the effects of insulin. Despite a plethora of studies on low carb effects, there are still important areas left to research.
Lo and behold, both meta-analyses and long-term trials often show low carb diets to be as good or better than other diets for weight loss. However, participants usually self-report their food intake, and the longer the trial, the more likely life will get in the way. It’s a catch-22 of sorts: you want long trials to make sure the diet is sustainable, but the longer the trial the more likely there will be unwanted variations in diet.
Despite all the low-carb trials of recent years, there was still a lack of highly-controlled studies that solely altered carbs and fat in participants living in (aka stuck in) a metabolic ward. To address this issue, Kevin Hall’s team at the National Institutes of Health designed a short-term study to isolate the different effects of a restricted-fat diet versus restricted-carb diet on body weight, energy expenditure, and fat balance.
Low-carb performs fairly well for weight loss in trials over the course of months. A recent short-term study controlled several extra variables to isolate the effects of carb reduction on fat loss.

The Study

Methods
The participants were 19 obese volunteers (ten males and nine females) with no apparent disease.
All participants were required to reside within a metabolic unit, where they …
  • First received a baseline weight maintenance diet for five days, before being randomized to either …
  • the Restricted Fat group or the Restricted Carbohydrate group, where caloric intake was reduced by 30% from baseline for six days, by either fat or carb reduction.
The carb levels ended up being 352 grams for Restricted Fat versus 140 grams for Restricted Carb, and the fat levels 17 grams versus 108 grams. In other words, (moderately lower carb than typical diets) versus (oh my goodness, I can count my fat gram intake on my fingers and toes!).
This trial wasn’t designed to explore a real-life 100-gram-and-under low carb diet and especially not a ketogenic diet. Rather, it was a mechanistic study designed so that they could reduce energy substantially and equally from fat or carbs, but without changing more than one macronutrient. If they lowered carbs much more in the Restricted Carb group (like under 100 grams), they’d then have to go into negative fat intake for the Restricted Fat group. And negative fat intake is impossible (*except for in quantum parallel universes). One more note: all participants kept dietary protein constant and exercised on a treadmill for an hour a day.
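As a sanity check on the “equal energy cut from a single macronutrient” design, here is a quick sketch plugging the reported gram values into the standard Atwater factors (4 kcal/g for carbohydrate, 9 kcal/g for fat; protein was held constant, so it is omitted):

# Non-protein energy of the two intervention diets, from the reported grams.
KCAL_PER_G = {"carb": 4, "fat": 9}
diets = {
    "Restricted Fat":  {"carb": 352, "fat": 17},
    "Restricted Carb": {"carb": 140, "fat": 108},
}
for name, grams in diets.items():
    kcal = sum(g * KCAL_PER_G[m] for m, g in grams.items())
    print(f"{name}: {kcal} kcal from carbs + fat")
# Restricted Fat:  352*4 + 17*9  = 1561 kcal
# Restricted Carb: 140*4 + 108*9 = 1532 kcal -- nearly identical non-protein energy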
19 study participants spent six days in a metabolic unit, eating either a moderately low-carb diet (140 grams) or a really darn low-fat diet (17 grams). The trial used these specific levels in order to achieve isocaloric diets with large carb/fat reductions, without having to alter more than one macronutrient.
After the completion of this first phase, subjects went home for a two to four week washout period where they resumed their normal eating habits. Participants then returned to the study center to undertake the same protocol except with switched intervention groups. So those formerly in the restricted carbohydrate group would now be in the restricted fat group, and vice-versa.
This crossover design was one of the many ways in which this trial was more stringent than most previous studies, since crossing over eliminates much of the between-person variability that standard randomized trials have. Randomized trials may be considered the gold standard, but this trial was a mix of gold, platinum, and titanium. Very strong, very valuable. They even had participants wear accelerometers on their hips to measure physical activity.
The crossover design was one of many rigorous aspects of the study, along with a relatively large sample size for a metabolic ward study, highly controlled variables, and multiple accurate measurement techniques.
Why not use just DXA for measuring body composition?
The primary method used to assess changes in body fat was to calculate the difference between dietary fat intake and fat oxidation, with oxidation measured in a metabolic chamber by indirect calorimetry, which estimates the heat released by a person from the amount of O2 they consume and CO2 they produce over a specific period of time. While indirect calorimetry isn't perfect, it's accurate and reliable enough to be the standard method for measuring energy expenditure in these types of studies.
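For the curious, here is a rough sketch of the arithmetic involved, using the widely cited abbreviated Weir equation for energy expenditure and Frayn's equation for fat oxidation (urinary-nitrogen correction terms omitted; the gas-exchange values below are made-up resting numbers, not data from this study):

# Indirect calorimetry arithmetic. VO2 and VCO2 are in liters per minute.

def energy_expenditure_kcal_day(vo2, vco2):
    # Abbreviated Weir equation: kcal/min = 3.941*VO2 + 1.106*VCO2
    return (3.941 * vo2 + 1.106 * vco2) * 1440  # minutes per day

def fat_oxidation_g_min(vo2, vco2):
    # Frayn (1983), protein term omitted: g/min = 1.67*(VO2 - VCO2)
    return 1.67 * (vo2 - vco2)

vo2, vco2 = 0.30, 0.24  # hypothetical resting values (respiratory quotient 0.8)
print(f"Energy expenditure ~ {energy_expenditure_kcal_day(vo2, vco2):.0f} kcal/day")  # ~2085
print(f"Fat oxidation      ~ {fat_oxidation_g_min(vo2, vco2) * 1440:.0f} g/day")      # ~144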

Body weight and a dual-energy X-ray absorptiometry (DXA) scanner were also used, but the former can’t calculate body fat, and the latter isn’t sensitive enough to detect the minute difference in body fat loss that usually occurs between eucaloric diet interventions of different macronutrients.
DXA is quite accurate for changes over the long term, but indirect calorimetry is needed for a study of this nature.
Results, limitations, and other considerations
As expected, the researchers found that the Restricted Carb diet resulted in a decrease in daily insulin secretion (by 22%) and a sustained increase in fat oxidation, whereas the Restricted Fat diet resulted in no significant change in either. Despite this, by the end of the six-day period, the Restricted Fat diet had produced greater fat loss than the Restricted Carb diet (463 g vs. 245 g).
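To see how big that gap is in energy terms, here is a quick back-of-the-envelope conversion, assuming roughly 9.4 kcal per gram of stored body fat (a commonly used value for adipose triglyceride):

# Energy equivalent of the six-day fat-loss difference.
FAT_LOSS_RF_G, FAT_LOSS_RC_G = 463, 245  # grams over 6 days, as reported
KCAL_PER_G_BODY_FAT = 9.4                # assumed energy density of stored fat
DAYS = 6

gap_g = FAT_LOSS_RF_G - FAT_LOSS_RC_G
print(f"Gap: {gap_g} g ~ {gap_g * KCAL_PER_G_BODY_FAT / DAYS:.0f} kcal/day")
# ~342 kcal/day, far larger than the measured energy-expenditure
# difference between the groups (discussed next).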
The Restricted Carb dieters also had lower energy expenditure, to the tune of 98 fewer kcal/d compared to only 50 fewer in the Restricted Fat group. This isn’t enough calories to account for the difference in fat loss though. So why exactly did the Restricted Fat group lose that much more fat than the Restricted Carb group? The paper doesn’t get into this much, but gives some hints:
"Model simulations suggest that the differences in fat loss were due to transient differences in carbohydrate balance along with persistent differences in energy and fat balance. The model also implicated small persistent changes in protein balance resulting from the fact that dietary carbohydrates preserve nitrogen balance to a greater degree than fat”
… so their mathematical model points to a few possible minor factors, including a possible small benefit from dietary carbs benefiting protein balance. The use of this complex model to extend the results out further is interesting, as it partly compensates for having a six-day-only study (which is normal in the world of metabolic ward studies), but on the other hand the model isn’t something that is easily understandable by people other than the study authors. Maybe it’s really accurate, maybe it’s not.
“Very low carbohydrate diets were predicted to result in fat losses comparable to low fat diets. Indeed, the model simulations suggest that isocaloric reduced-energy diets over a wide range of carbohydrate and fat content would lead to only small differences in body fat and energy expenditure over extended durations.”
… ah, so if the researchers were able to reduce carbs to a much lower level (which they couldn’t, due to the study design factors described earlier), the diets would have actually led to similar weight loss. That makes those “New Study Shows Low-Carb Failure!” headlines sound a bit silly. If you take a really-darn-low-fat diet like the Restricted Fat diet, and compare it to a very-low-carb diet, you’re comparing two extreme diets and are more likely to get some metabolic advantage. Our bodies are typically accustomed to a somewhat balanced mix of fuel, and extreme macronutrient diets can probably game the system a bit for a modicum of extra fat loss.
More importantly, the authors put the results in really important context: the differences in body fat loss between a wide range of different carb intakes are predicted to be very small (although the Restricted Fat diet was predicted to sustain its slight advantage over the course of months). This study wasn't meant to demonstrate that low(ish) carb is bad or low-fat is good; it was simply testing the hypothesis that carb reduction provides some secret sauce for fat loss under highly controlled conditions.
Surprisingly, the paper doesn’t mention the term “glycogen” even once. The study participants did an hour of incline treadmill a day, and since they had an average BMI around 36, that could mean a decent amount of glycogen burn with prolonged activity at a high body weight. So if liver and muscle glycogen happened to be relatively more depleted in the Restricted Carb group (since they replenished glycogen less by fewer dietary carbs), that might mean less fat loss in the short run, which may not apply as much to the long run when glycogen is in a steady-state.
Although the Restricted Fat group lost a bit more fat, fat loss over time was predicted to be similar over a range of carb intakes, based on a mathematical model of metabolism. The main application of the study may be that despite a reduction in insulin, there was no extra fat loss advantage for the Restricted Carb diet … which more so argues against the “Carbohydrate-Insulin Theory of Obesity” rather than denying the efficacy of low-carb diets.
As always, there are a few more limitations to consider.
Due to the sample population chosen, the results of this study only apply to obese adults who are otherwise healthy. Further limiting the generalizability of the results, the tightly controlled study design does not accurately represent the free-living world, as most of us do not have strict external controls on our food choices.
And to repeat a very important point: this study was not meant to inform long-run dietary choices. In the long run, the choice between restricting fat or restricting carbs to achieve a caloric deficit may come down to one thing: diet adherence.
While preference for certain foods may dictate which diet is easier to adhere to, this isn’t always the case. For instance, it seems that insulin-resistant individuals have an easier time adhering to a low-carbohydrate diet. Nowadays, new dieters often pair low-carb with higher protein, the latter of which can boost weight loss. And since there are plenty of high-sugar but low-fat junk foods (see Mike and Ike, et al.) but not so many high-fat but low-carb junk foods, low carb intakes can sometimes mean an easier time staying away from junk food when compared to low fat diets.
What about the six day trial duration? Does that mean the results are less valid? Well, it depends on the question you want answered. There aren’t any six-month-long metabolic chamber studies because they would both be ludicrously expensive and turn into studies of hospital patients rather than free-living people. So the researchers chose to shed light onto this question:
“Could the metabolic and endocrine adaptations to carbohydrate restriction result in augmented body fat loss compared to an equal calorie reduction of dietary fat?”
Quite clearly, the answer was no in this study. Many in the low-carb blogosphere have argued that six days was too short for fat-adaptation. Maybe, but the paper also said:
“Net fat oxidation increased substantially during the RC (restricted carbohydrate) diet and reached a plateau after several days, whereas the RF (restricted fat) diet appeared to have little effect.”
So restricting fat didn't change metabolism much, while restricting carbs increased fat oxidation, which plateaued at an elevated level after several days. It's possible that other physiological mechanisms (e.g. mitochondria-related factors) may take longer to adapt (especially on very low carb / keto diets), but this wasn't a study of keto diets; it was a study testing whether carb restriction leads to extra fat loss compared with fat restriction.
And while not without their limitations, free-living studies have generally shown that the low-carb groups tend to lose a bit more fat mass by the six-month mark (even when controlling for energy intake), but weight loss at the end of the trial tends to be similar. A big part of that is likely the increased protein that is typically coupled with lower carb. Those are things that media reports won’t mention when covering the current study -- context matters, and the totality of research suggests that media harping on each low-carb trial in isolation is dumb.

Summary

As usual, don’t bother with media headlines -- this study is NOT a blow to low-carb dieting, which can be quite effective due to factors such as typically higher protein and more limited junk food options. Rather, this study shows that a low-carb diet isn’t necessary for fat loss and that lowering carbs and insulin doesn’t provide a magical metabolic advantage.
It bears repeating: if you even try to apply this study to the real world of dieting choices, you will be frowned upon strongly. Even the lead author writes:
"Translation of our results to real-world weight-loss diets for treatment of obesity is limited since the experimental design and model simulations relied on strict control of food intake, which is unrealistic in free-living individuals."
This study was strictly meant to fill in a gap in the knowledge base of diet physiology. Got it?
If you need a broad and simple takeaway from this study, here is one: weight loss does not rely on certain carb levels or manipulation of insulin, it relies on eating less. Don’t be scared that eating carbs will cause insulin to trap fat inside your fat cells.

Scientists just found that red meat causes cancer ... or did they?


In the past couple days, unless you’ve been living under a rock, you’ve probably been seeing headlines of the “Red Meat as Carcinogenic as Smoking!” variety.
What happened? Just a year and a half ago, we covered the (ludicrous) media frenzy on “High Protein Diets as Dangerous as Smoking”. Are reporters taking crazy pills, or is there really something to the headlines this time?
To understand this issue, you have to understand just a bit about the science of red meat metabolites as well as about epidemiology. The following is a quick primer. But before reading, please realize that being labeled as a carcinogen is fairly common: Not only are harmful compounds like alcohol potentially carcinogenic, but so are the health elixirs aloe vera and yerba mate tea. The actual impact depends on the dose, what makes up the rest of your diet, and many other factors.
The connection between red meat and cancer is more complex than most articles suggest. It’s going to take more than a couple minutes to understand the research issues involved.

Know your meat

First off, you better know what exactly red meat is, if you’re planning to enlighten your friends on this issue (don’t actually enlighten them unless they ask ... people hate it when you give unprompted nutrition lectures).
Pork is not white meat, no matter what the television tells you. The meat of mammals like pigs is typically red when raw due to its high myoglobin content, so researchers and the USDA consider pork to be red meat.
Although both fish and chicken fall under the white meat category, they're pretty darn different in nutrient content. So we can already see that the red vs. white dichotomy is a bit too simplistic to rationally motivate health or policy decisions, without even delving into grass-fed meat vs. cheap grain-fed meat, or other assorted issues. But there is something in red meat that could potentially cause cancer: that very same pigment we mentioned may have unique properties, and those properties may depend on how the meat is processed and cooked. We'll get more into that in a moment.
Red meats are not all the same, and neither are white meats. Oh, and pork is not white meat.

What did the paper actually say?

If you’re a big advocate of red meat and know something about nutrition (especially just enough to be dangerous), be careful not to scoff at these findings and presume you know more than the scientists do.
The headlines are based on a short summary paper that refers to a massive analysis of over 800 studies. Earlier this month, experienced researchers met at the International Agency for Research on Cancer (IARC), an arm of the World Health Organization (WHO), and came up with some strong conclusions on red meat.
Specifically, with regard to colorectal cancer, they classified processed red meat as a “Group 1” carcinogen (“carcinogenic to humans”). As for regular red meat, it was classified as a “Group 2A” carcinogen (“probably carcinogenic to humans”).
There are a few important things you should know straight off the bat. First, the findings were mostly referring to cancer of the colon or rectum. While colorectal cancer is very important (it’s the third leading cause of cancer death in the US), you can’t generalize the researchers’ findings to all other cancer types.
Cancer is not one monolithic condition. The researchers were mostly making conclusions about one type: colorectal cancer.
Second, it’s not like we didn’t know this stuff already. Processed red meat has been strongly linked to colorectal cancer for many years (we’ll get into the mechanisms later). Regular ol’ unprocessed red meat is more of a mixed bag in terms of evidence, but still has several mechanisms by which it may increase cancer incidence.
It’s extra funny that the media headlines of the past days have been so extreme, since they’re based on a 1.5-page summary! The full conclusions will be disclosed at a later time in a WHO monograph on processed red meat and cancer. The currently available publication doesn’t even begin to delve into the overall risk of cancer or the magnitude of the risks of different cancers — we’ll have to wait for the monograph to find out (at which point crazy media headlines will once again ensue).
This isn’t really new information. It’s based on studies that have been conducted for the past 10-20 years. So this is more of a new framing of evidence than a presentation of new evidence. Plus, the full paper isn’t even out yet.
Third, just because the WHO is a big (big) deal doesn't mean they can't be wrong. Wrong is a strong word though; let's just say “slightly off”. For example, let's look at their position on sodium intake. Their guidelines call for less than 2,000 mg of sodium a day, despite other organizations like the US Centers for Disease Control and Prevention (CDC) having updated their views based on new research. A large amount of research shows that keeping sodium intake under 2,000 mg/day doesn't just lack evidence for benefit, but might actually be harmful.
That doesn’t mean that eating a ton of salt via processed food is healthy. It just means that obsessive salt reduction is probably not a great idea for most people. Salt is an essential nutrient, after all.
Expert opinion doesn’t equal fact. Major health organizations have disagreed on the health impact of food and nutrients several times before.
Fourth, much of the evidence reviewed was epidemiological: observing people over time to see if disease develops. Some people mistakenly ignore these studies, thinking that the “gold standard” randomized trial is the only acceptable type of evidence. Wrong. You can't really run many randomized trials on cancer, since cancer takes so long to develop and so many potential causative factors are involved. So the only option is combining a ton of epi evidence with animal and in vitro studies, plus a sprinkling of randomized trials that look at intermediate outcomes.
But you have to take the bad with the good with epi evidence — human diets and lifestyles are so variable that synthesizing the results of studies on Japanese people with the results of studies on Americans is going to be … difficult.
Much of the evidence reviewed was observational/epidemiological in nature, rather than from randomized trials. This means that causation is hard to pin down.

(Infographic credit: Cancer Research UK)
Finally, and we can’t say this enough: The dose makes the poison. If you make a habit of eating bacon for breakfast, brats for lunch, and ham for dinner, then your total dose of red meat is high, plus it’s the processed kind that is much more likely to cause cancer. But if you have a grass-fed beef burger once or twice a week, that’s not even close to being a sure-fire recipe for cancer.
Red meat is not inherently unhealthy. As with most everything, the type and dose make the poison.

First suspect: NOCs

We’ll now give a quick rundown of physiological mechanisms. When it comes to understanding why red meat might or might not impact cancer, understanding those mechanisms matters more than the ability to regurgitate the content of various epidemiological studies.
We’ve established that the key cancer here is colorectal cancer. So most of these mechanisms have to do with intestinal damage.
If you’re in the habit of teaching your toddler nutritional science, then we’ve got some good news for you: Two of the villains in the red meat and cancer story are colors! Specifically, red and black. And one possible hero is also a color: green. Since Halloween is around the corner, just think of it as pirates versus jolly green giants.
Red meat is red because of myoglobin, a heme pigment similar to the hemoglobin in blood. When we eat red meat, part of this heme pigment can be processed in the gut into N-nitroso compounds (NOCs), which can damage the gut lining. When cells in the gut lining are damaged, the gut repairs itself through cell replication, and DNA damage can accumulate over time.
But unprocessed red meat (like steak or ground beef) doesn't have as direct an impact on gut damage as processed red meat (like bacon or hot dogs). The curing chemicals in these products, such as nitrites, can lead to NOCs forming at a faster rate than from the meat itself.
Luckily, most people don't eat an all-red-meat diet. And it turns out that if you eat green veggies with your meat, evidence (mostly from animal studies) suggests that the colon cancer risk can be substantially reduced! The reason is that the green chlorophyll molecule is basically a heme molecule, but with a magnesium atom instead of an iron atom in the middle. Having some of that in your gut to compete with the meat pigment might reduce, or potentially prevent, the conversion of heme into dangerous chemicals.
One mechanism for red meat increasing the risk of colorectal cancer is gut damage from chemicals such as NOCs. This comes about directly from the red pigment in these meats. But red meat as part of a plant-rich diet might be less dangerous.

Second suspect: Heat compounds

The other villain is that delicious char that forms on grilled foods. It turns out that this char, and high-heat cooking in general, creates chemicals that can damage the gut, such as heterocyclic amines (HCAs). Red meat produces higher levels of these chemicals than white meat. As a side note, this is an excellent example of literal caveman/paleo eating not being inherently healthy: cooking meat over an open fire on a regular basis would subject you to high levels of harmful chemicals.
Yet again, you have to consider the entirety of a meal. Certain compounds in cruciferous veggies (like broccoli or Brussels sprouts) may substantially reduce the impact of HCAs in cooked meat. And, thank goodness, even marinades with certain spices can reduce HCAs! Caribbean spices seem to perform the best, which is a clear indication that you should vacation there, or perhaps even in the actual Spice Islands.
Those black char lines might be a flavoring benefit but a health disaster — depending on how frequently you eat them, that is, plus whether you eat certain veggies with them, and what you marinate your meat with, if you marinate it at all.

Third suspect: Iron

Red meat is rich in iron, and iron is the Goldilocks of minerals. Many people are low in iron, making anemia a public health concern. But on the opposite end, iron is very easily oxidized (think rust). The iron in meat can easily build up in intestinal cells, since it isn't as tightly bound to other compounds as it is in many plants. When this iron oxidizes, it eventually causes cell damage, which makes its link to colorectal cancer quite easy to understand.
There’s a reason why iron supplements aren’t recommended to the general population: In the body, even moderately high levels of iron can be dangerous — especially to colon cells, according to the recent IARC paper.

Fourth suspect: TMAO

You may have heard the term “TMAO” batted around in relation to red meat. What exactly is it? Disambiguation: It’s not that acronym people type when they find something funny on the Internet (that’s LMAO). No, TMAO stands for Trimethylamine N-oxide, a controversial compound that some research has linked to colon cancer. Different meats have different amino acid profiles, and red meat happens to be high in L-carnitine. L-carnitine gets metabolized by some of the bacteria in your gut, and eventually turns into TMAO.
While many studies link TMAO to disease, there’s more to this than meets the eye. If you’re a gut microbiome junkie, you probably know that two individuals can have extremely different gut bacterial profiles. In fact, vegans and vegetarians produce less TMAO from a given dose of L-carnitine than do meat eaters. It turns out that certain types of bacteria can increase your TMAO production, while other types can decrease it. It’s possible that a gut full of friendly flora could make TMAO much less of an issue.
TMAO got a lot of press around 2012-2013 for being “the reason” why red meat is unhealthy. But that’s overly simplistic — your gut bacteria are the factories that produce TMAO, so gut health and bacterial profile is key in determining the impact of red meat.

Fifth suspect: Neu5Gc

We’ve already covered Neu5Gc in great depth in our monthly research digest, so we’ll make this quick. All mammals other than humans (plus, strangely, ferrets, along with a very few monkey species) have a type of sugar in their bodies called Neu5Gc, whereas we have Neu5Ac. Since the molecules are so similar, Neu5Gc can get incorporated into our cells; but since they’re still different, Neu5Gc can then become prey to the immune system, which results in inflammation. Previous evidence showed that human tumors can contain high levels of Neu5Gc. And then in 2015, the first strong animal study of this compound showed that very high levels can cause cancer in mice.
If you eat red meat (or drink milk), you’re likely to have antibodies to Neu5Gc. Vegans don’t have these antibodies, suggesting that consumption of animal products is the reason why these antibodies exist in humans. Neu5Gc has a plausible mechanism for increasing cancer risk. The doses required are unknown, though.

Recommendations

Examine.com is not a diet guru; we just collect and interpret evidence. And it's pretty clear from the evidence that eating red meat every day has a decent chance of increasing cancer risk, specifically colorectal cancer risk. Consuming high amounts of processed red meat in particular is really playing with fire. And actually, playing with fire (in the form of grilling meat very often) is also playing with fire. So mix up your cooking methods, and try some gentle cooking techniques if you haven't already.
But all that being said, the evidence is mostly observational or mechanistic in nature. Due to the practical impossibility of running multi-decade controlled trials, the increased risk from eating different amounts of red meat is not really known. In this case, as in many others, moderation may be key.