Friday, October 28, 2011

The Skinny on Fat Loss

     J.B.S. Haldane, one of the most eccentric and brilliant biologists of the twentieth century, described four stages in the acceptance of a new theory:
1. This is worthless nonsense.
2. This is an interesting, but perverse, point of view.
3. This is true, but quite unimportant.
4. I always said so.

     The public remains in stage one or two with regard to some essential truths in health and fitness.

     Let's start with one of the most unfortunate misunderstandings in health and fitness history: "FAT BURNING ZONES". The most common reason people exercise is to lose fat. And in the vast majority of cases, they fail. But it's not their fault. The public was told that longer-duration, low-intensity exercise, like a jog on a treadmill or a steady session on a stationary bike, keeps the heart rate in the range that is optimal for burning fat.

     WRONG!

     For a definitive review of how aerobic exercise fails as a weight-loss strategy, see Thorogood et al., American Journal of Medicine, 2011 Aug;124(8):747-55.

     It is true that in this "fat burning" heart rate zone, proportionally more fat than carbohydrate is used. But far less total fat is burnt in these sessions than in more demanding routines, such as interval training. We should have known something didn't sound right.
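
     To see why percentages mislead, here is a back-of-the-envelope comparison in Python. The calorie totals and fat percentages below are purely illustrative assumptions, not figures from any study; they simply show how a smaller share of a bigger number can match a bigger share of a smaller one.

# Illustrative arithmetic only -- the numbers are assumptions, not measurements.
easy_total_kcal, easy_fat_share = 300, 0.60   # e.g., an easy "fat burning zone" jog
hard_total_kcal, hard_fat_share = 450, 0.40   # e.g., a shorter, harder interval session

easy_fat_kcal = easy_total_kcal * easy_fat_share   # 180 kcal from fat
hard_fat_kcal = hard_total_kcal * hard_fat_share   # 180 kcal from fat

print(easy_fat_kcal, hard_fat_kcal)   # 180.0 180.0 -- a dead heat during the workout,
                                      # before counting any post-exercise "afterburn"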

     Here's a quick quiz:

     Who has a higher body fat percentage, the marathoner or the sprinter?

     If you said this has to be a trick question and went with the counterintuitive response, you were correct! Why is the sprinter, whose training runs cover less distance, take less time, and burn fewer calories, the leaner of the two? Because he does resistance/weight training. That's right. It's exactly the opposite of what we were led to believe. Metabolic or interval resistance training, which the pros have been doing since the 1950s, is the most time-efficient way to burn fat. It gives the biggest bang for the buck, and you see results faster than with any other intervention.

     The reason these more intense forms of exercise burn more fat is that they induce a metabolic disturbance that requires lots of energy to recover from. This is a key point. It is not how many calories or how much fat you burn during the exercise; it is what happens after. All the pathways that are stimulated to address the "insult" of excessive demand during the exercise are sometimes called the "afterburn". This includes a host of reactions such as EPOC (excess post-exercise oxygen consumption) and increases in resting metabolic rate. Let's face it: if you work out an hour a day (which is heroic), that leaves 23 non-exercise hours per day. If your routine only changed your physiology during that hour, it could not have much impact on anything.
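
     As a rough sketch of why those other 23 hours matter, suppose (hypothetically; these numbers are assumptions for illustration, not data) that a hard session nudges resting metabolic rate up by a modest 5% for the rest of the day:

# Hypothetical figures for illustration only.
resting_kcal_per_hour = 70      # assumed baseline resting burn
afterburn_bump = 0.05           # assumed 5% rise in resting metabolic rate
non_exercise_hours = 23

extra_kcal = resting_kcal_per_hour * afterburn_bump * non_exercise_hours
print(extra_kcal)               # about 80 extra kcal every day, outside the gym

# A small shift applied to 23 hours, day after day, is what the
# "afterburn" argument is about: the workout matters, but so does
# what it does to the rest of your day.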

      I know this sounds like an advertisement, but it gets even better.

     The "fat burning zone" concept is actually a symptom of a much larger misconception: the dichotomy between endurance/cardio training and strength/resistance training. Most of the world, from the doctor to the fitness professional to the health-conscious layman, believes that these are mutually exclusive domains. This has also proven to be false. And that's good news.

     Traditionally, exercise has been classified as either strength or endurance. Strength training consists of short-duration, intense muscular work that results in hypertrophy, while endurance training is characterized by prolonged, low- to moderate-intensity work that increases oxidative or aerobic capacity. The scientific community believed that these two forms of exercise triggered different pathways that could not be engaged simultaneously. However, recent research has found considerable overlap between these two pathways.

     For example, high-intensity interval training (which is really just resistance training with supersets or circuits to elevate heart rate, without allowing sufficient recovery between sets) produces metabolic and performance adaptations similar to those of endurance training. No one thought it was possible to improve aerobic performance this way. In fact, HIIT appears to be better than endurance-type training for muscle buffering capacity (getting better at handling and clearing lactic acid).

     I'll leave you with a study that beautifully illustrates what we've been speaking about today: Bryner et al., "Effects of resistance vs. aerobic training combined with an 800 calorie liquid diet on lean body mass and resting metabolic rate," Journal of the American College of Nutrition, 1999 Apr;18(2):115-21.

     (I'm sure you noticed the 800 calorie diet! The authors were interested in the effect of different types of exercise on subjects on a very-low-calorie diet (VLCD). One of the problems with VLCDs is their tendency to cause muscle loss and lower resting metabolic rate, two things that make it even more difficult to lose weight.)

     So, there were two groups: one did aerobic exercise, the other resistance training. The aerobic group exercised for four hours per week, and the resistance group did 2-4 sets of 8-15 repetitions for 10 exercises, three times per week.

     Both groups lost weight. But the resistance-training subjects did not lose muscle, lost much more fat, and experienced an increase in their resting metabolic rate compared to the aerobic group. (The aerobic group's metabolic rate decreased.) The most stunning result, however, was that VO2max increased equally in the two groups!

     So if you want to improve aerobic performance, get stronger, and lose fat, intensify and shorten your workout.

     For once, less is more.
Monday, October 24, 2011

Probiotics: Pro or Con?

     There are way more bacterial cells living in our gut than the total number of our own cells in our entire body. We are, so to speak, colonized. These gut microbes turn out to be incredibly important. Anyone who has been on antibiotics, which kill many of these bacteria, can attest to the stomach misery caused by upsetting the balance of these little lodgers. Growing evidence suggests that too many of the wrong bugs can cause obesity.

     We are born with a pristine intestine, literally sterile. However, it is immediately invaded by the bacteria in mother's milk and environmental bacteria introduced by bottle. The average adult harbors between 1,000 and 1,500 bacterial species, 160 of which constitute the core group or what's called the core microbiota.

     Researchers have noticed that an altered gut microbiota is associated with diseases that have become prevalent in the 21st century. For instance, a reduced diversity of these bugs is seen in inflammatory bowel disease, metabolic syndrome (prediabetes), and obesity. Specifically, the number of Firmicutes was increased and the number of Bacteroidetes reduced in obese people compared with lean folks. Interestingly, weight loss by dieting eliminated those differences. These two types of bacteria represent over 90% of all bacterial cells found in the human intestine.

     So how do these critters make us fat?

     Diet, not surprisingly, has a profound effect on what grows in our gut. Switching from a lean diet to a high-fat Western diet dramatically alters the microbiota in a negative way. These changes are incredibly fast, starting within the first 24 hours of introducing the new foods.

     Once the "bad" bacteria overpopulate, it is easier to absorb calories from the gut. The bugs not only provide an increased capacity to break down nutrients, they also make the gut wall more permeable. This allows more nutrient absorption (mostly glucose, i.e., sugar). They also exert their influence beyond the gut, promoting fat storage throughout the body by a variety of mechanisms, including altering the levels of hormones responsible for orchestrating appetite, satiety, and fat metabolism.


     What can we do?

     We are still learning how best to harness probiotics. Different strains of lactobacilli (gasseri is one) have been shown to decrease fat and the risk for type 2 diabetes by increasing insulin sensitivity. Inulin-type fructans (found in fruits and vegetables) reduced weight, appetite, and blood sugar levels, and increased insulin sensitivity.

     But the bottom line remains: if you eat meat, make it lean. The fresher and less altered the food, the better, in part because it will have a positive effect on the gut microbiota. Lots of local vegetables, and fresh dairy with live cultures, are best for the same reason.

     Probiotics are no con job. Just do your homework before collecting bottles of preparations in your medicine cabinet. We are just beginning to understand these microbes.

Thursday, October 13, 2011

If It Ain't Broken, Don't Fix It: A Bad Week For Vitamins

     There is an old Sufi story in which Mulla Nasrudin is in his yard, in front of his house, throwing corn. A man passing by is puzzled and stops. "Mulla Nasrudin," he asks, "why are you throwing corn all over your yard?" "It keeps the tigers away," he replies. "But there are no tigers around here." "Well, it works then, doesn't it?" the mulla replies.

     This parable mirrors much of our health behavior. We are advised by "experts" (often people selling us something) who cite epidemiological studies suggesting that if you take some product, you'll prevent some malady. The problem with most of these studies is that they cannot demonstrate a cause-and-effect relation between an intervention (like a vitamin) and an outcome (let's say not getting cancer).

     Approximately one-half of adult Americans used dietary supplements in the year 2000, with sales of $20 billion that year. This represents a shift from using vitamins and minerals to prevent deficiency states, to using them in the absence of malnourishment, to promote wellness and prevent disease. Unfortunately, we have no good data that indicates this makes sense. In fact, the results of randomized clinical trials suggest vitamin and mineral use can be harmful.

     A large, well-designed, and well-conducted study published this week in the Archives of Internal Medicine reported the results of the Iowa Women's Health Study. The investigators assessed the relation between vitamin and mineral supplementation and mortality in 38,772 women with a mean age of 61.6 years. They found that the use of multivitamins, B6, folic acid, magnesium, zinc, iron, and copper was associated with an increased risk of mortality. The association was strongest for supplemental iron, and it was dose-dependent: the higher the dose, the more deaths observed.

     A second study published this week in the Journal of the American Medical Association demonstrated that vitamin E supplementation not only does not protect against prostate cancer, it may increase risk. Men receiving a common dose of vitamin E (400 IU) had a 17% increased risk of prostate cancer compared to men who received placebo.

     In my recent blog, That Which Does Not Kill Us Makes Us Stronger: Why You Should Throw Out Your Vitamins, I discussed the negative effect of antioxidant vitamins before exercise. The JAMA paper cited above contributes to a growing body of evidence indicating that vitamin E, vitamin A, and beta-carotene can be harmful.

     These negative reports are particularly concerning given that those who take supplements tend to have healthier lifestyles (they are more often non-smokers, eat lower-fat diets, and exercise) than non-users. So even in the context of good preventive health practices, vitamins can impair health.

     If one thinks about evolution, and how we've eaten throughout our existence, it is not shocking to read these reports. We have never taken in such quantities of these nutrients in our history. The vitamins and minerals were always packaged in foods that dictated how they were assimilated into our system. More is not better. It seems to be worse.

     We must wonder how much of our life is spent throwing corn, and what our own personal tigers are.

Saturday, October 8, 2011

Sweet Nothings? Sugar Substitutes and Weight Gain

     The only sources of sweetness for 99.9% of human existence have been glucose and fructose. Not surprisingly, we developed a physiology in which feeding behavior is largely controlled by the ebb and flow of blood levels of these sugars and their metabolites, which reflect our energy status. In other words, a part of the brain watches our gas tank and sends messages accordingly, directing us toward or away from the kitchen. The obesity epidemic strongly suggests that we have lost this signal.

      The sources of sweet started to change after World War II. The combination of a sugar shortage and a changing esthetic that favored a thin figure encouraged women to try a sugar substitute. Saccharin (Sweet'N Low), the oldest nonnutritive sweetener, was discovered in 1879 at Johns Hopkins during experimentation with coal tar derivatives. Saccharin had been used to replace sugar in soda marketed to diabetics until after the war, when soda bottle labels were changed from "for use only in people who must limit sugar intake" to "for use in people who desire to limit sugar intake." Saccharin, which is 300 times sweeter than sucrose (table sugar, a disaccharide composed of 50% glucose and 50% fructose), was followed by cyclamate, discovered in 1937. Concern over cyclamate's capacity to cause cancer took it off the market in 1969. Similar concerns resulted in the FDA's plan to pull saccharin in 1977, but consumer protest reversed the decision. A warning label accompanied all saccharin products until 2000, when subsequent studies demonstrated that it is not carcinogenic. These investigations essentially silenced concerns over the safety of artificial sweeteners. Cyclamate continues to be available in 50 countries, including Canada.

     The next generation of sugar substitutes gave us aspartame (NutraSweet, Equal; 200 times sweeter than sucrose), sucralose (Splenda; 600 times sweeter than sucrose), and Neotame, the sweetest, weighing in at 7,000 times the sweetness of sucrose. These sweeteners have been well received. Between 1999 and 2004, 6,000 food products containing these agents went to market. According to foodfacts.com, an ingredient search engine, there are now no fewer than 3,648 foods containing these chemicals in the U.S. A sizable majority of Americans consume artificial sweeteners, usually believing that they are making the healthy choice. In fact, diet soda drinkers' diets contain more whole grains and low-fat dairy, and less processed meat and refined sugar, than those of the general population. The idea that diet soda is a health food has accelerated with the recent "low-carb" diet fad.


     Unfortunately, what was supposed to provide the perfect solution to caloric overload and weight gain by eliminating the need for sugar failed miserably. In fact, many large epidemiological studies have demonstrated a positive correlation between artificial sweeteners and weight gain. How could this happen? Ironically, exactly what seemed to make nonnutritive sweeteners ideal, the capacity to provide unlimited sweetness with zero caloric load, opened the door to overeating on a scale our species has never witnessed.

     Human taste provides sensations of sweet, sour, salty, bitter, savory and possibly fat and metallic. While the identification and tracking of food relied upon the visual and olfactory systems, animals developed the capacity for taste in order to recognize potential nutrients and poisons. A keen sense of taste was enormously adaptive because it provided a guide to what was full of energy/calories (sweet), a source of electrolytes (salty), rich in protein (savory) and a potential toxin (bitter).

     Because life ceases without an energy source, our capacity to discern small differences in sweetness and our preference for whatever is sweeter are innate, not learned. We come into the world fully loaded with a genius for choosing the sweeter option, the product of about two and a half million years of evolution. Newborns will invariably prefer a sweetened nipple. Numerous experiments have documented infants' pleasure response to sweetened water, including a slowed heartbeat, a relaxed face, a hedonic brain pattern, and endorphin release. Infants also learn to associate thicker fluids with greater sweetness because the viscosity and caloric density of human breast milk vary together.

     Experiments in a variety of animals, including humans, have repeatedly demonstrated that artificial sweeteners increase hunger and total energy intake, while sugar seems to trigger a mechanism that keeps energy consumption fairly constant. Functional MRI studies, which image brain activity while a subject ingests something to see which areas are active, indicate that the food reward system responds differently to sugar versus artificial sweetener. This reward system is not only what drives appetite but also, when turned off, what allows us to push away from the table before loosening our belts.

     When man tampered with nature and uncoupled the sensory signal (sweetness) from caloric load, a pairing that we adjusted to over more than 100,000 generations, our capacity to know when we have had enough was eradicated. Failure to activate the full food reward response fuels increased consumption.

     There is another unanticipated side effect of these sugar impostors. In 2005, Americans ate 24 pounds of sugar substitutes per person, double the 1980 rate. Surprisingly, sugar consumption also increased, by 25%, between 1980 and 2005. Our sweet receptors evolved in environments with so little sugar that they seem to have no shut-off point. Through exposure to compounds that are hundreds to thousands of times sweeter than sugar, our taste for sweetness is being up-regulated. This has translated into consuming more sugar while using sugar substitutes.

     Once again what seemed like a no-brainer proved to be a disaster because of a disregard for our evolutionary history. It is not unreasonable to suggest that sugar substitutes have significantly contributed to the obesity and type 2 diabetes epidemics. Completely change something as basic as the fuel we've survived on since the beginning? What could possibly go wrong?

 

Saturday, October 1, 2011

That Which Does Not Kill Us Makes Us Stronger: Why You Should Throw Out Your Vitamins

     In this era of mass consumption of supplements and foods hawked as medicinal, we are bombarded by "healthspeak", a language that few understand. Antioxidant, free radical, and oxidative stress are prime examples of mystery expressions. Every field develops its own terminology in an attempt to create precise and agreed upon meanings to facilitate the communication of complicated ideas. Such technical jargon may be used by professionals in its field of origin or in the culture at large as a means to gloss over what is poorly understood, language masquerading as comprehension. It can provide a false sense of mastery, an attempt at reassuring ourselves that we know what's going on. But the most basic questions expose our ignorance. Ask anyone what an antioxidant is, if you want to have some fun.

     In order to tell you why you should throw out your vitamins, we need to go over some of the language and science in that arena. I promise to make it as painless as possible.

     As you may have guessed, oxidation has to do with oxygen. It so happens that oxygen is both necessary for life and extremely toxic. The tragic cases of blindness in premature infants in the 1940s, caused by high oxygen levels in the newly invented incubators, gave us a taste of oxygen's destructive potential. The discovery in 1969 of superoxide dismutase (SOD), an agent that protects against oxygen damage and is found in almost all aerobic cells, marked the beginning of a vibrant field dedicated to the study of oxygen's effect on cell signaling, disease, and ageing.

     You might wonder how we came to experience oxygen as both vital and deadly. The answer is simple. There was no oxygen in the earth's atmosphere when the earliest life forms developed. About 2.45 billion years ago, blue-green algae evolved from the primordial ooze with the capacity to use sunlight, water, and carbon dioxide to produce carbohydrates and oxygen, a process known as photosynthesis.
It then took 1 billion years (the "boring billion") for the oxygen levels to get high enough to enable the evolution of animals. There was a significant advantage in utilizing oxygen metabolically to generate energy, but it came with a price.

     Oxygen's structure is unstable (its outer shell lacks a full set of electrons), which makes it want to react with almost anything in its vicinity. In doing so it destabilizes its neighbor. This can cause all sorts of damage to proteins and to DNA/RNA, and is considered a major cause of ageing and disease. Every compound, including oxygen, that can accept electrons is an oxidant, or oxidizing agent. These oxidants are often referred to as "radicals". On the other hand, any agent that can donate electrons is an antioxidant, or reducing agent.

     So with the advent of an oxygen-rich atmosphere, organisms had to develop defenses against these new noxious oxidizing agents. Two of the most important antioxidants that our bodies manufacture are superoxide dismutase (SOD) and glutathione peroxidase, names you'll see bandied about in many health foods/products. However, despite nature's defense systems, some oxidative damage is always occurring.
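
     For the curious, here is roughly what that built-in defense looks like on paper. These are standard textbook reactions, not something spelled out in this post, and they involve the two enzymes just named:

        2 superoxide radicals + 2 H+   --superoxide dismutase-->    O2 + H2O2 (hydrogen peroxide)
        H2O2 + 2 glutathione (GSH)     --glutathione peroxidase-->  2 H2O + oxidized glutathione (GSSG)

     In the second step glutathione donates the electrons and a reactive species is converted into plain water, which is exactly the "reducing agent" role described above.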

     Enter the antioxidant vitamins, center stage. Theoretically, it makes perfect sense: if oxidative damage is a common cause of disease and ageing, antioxidant vitamins, like C and E, should help. Unfortunately, these vitamins have been a disappointment. They work beautifully in the laboratory, in test-tube experiments, but not in animal studies. This has had no impact on sales of antioxidant vitamins, the most popular nutraceuticals.

     But it gets worse. In 2009 Ristow et al. published a stunning report in the Proceedings of the National Academy of Sciences entitled "Antioxidants prevent health-promoting effects of physical exercise in humans," which turned everything on its head. Exercise, the most effective defense against obesity and type 2 diabetes (the acquired type associated with excess weight), exerts its therapeutic effects by increasing insulin sensitivity. In fact, exercise is more effective than medication in preventing type 2 diabetes in high-risk individuals.

     For years it had been believed that exercise (contracting muscle fibers) caused oxidative damage. Ristow's lab demonstrated that exercise-induced oxidation actually plays an essential role in promoting insulin sensitivity. These changes are eliminated by daily consumption of the antioxidant vitamins C (500 mg twice a day) and E (400 IU/day). That is to say, C and E appear to block one of the most important beneficial effects of exercise on metabolism.

     This suggests that oxidative stress, something we thought was bad, is necessary to promote the production of our innate defense mechanisms. Interestingly, the use of antioxidants in type 2 diabetes is associated with increased hypertension and with overall mortality in the general population. How do we make sense of this?

     The repeated exposure to sub-lethal doses of stress results in greater stress resistance. This adaptive phenomenon is called hormesis. Such exposure has been shown to improve immune responses, decrease tumor formation and significantly slow ageing.

     It is not outlandish to wonder whether antioxidant vitamins are actually contributing to the diabetes epidemic. The fact that a diet rich in vegetables (a source of many antioxidants) decreases the risk of type 2 diabetes may be true despite the vegetables' antioxidant content.

     If all the vitamins were thrown into the sea, it would be all the better for mankind and all the worse for the fishes.