ReportWire

Tag: calories

  • Processed Foods and Obesity  | NutritionFacts.org

    The rise in the U.S. calorie supply responsible for the obesity epidemic wasn’t just about more food, but a different kind of food.

    The rise in the number of calories provided by the food supply since the 1970s “is more than sufficient to explain the US epidemic of obesity.” Similar spikes in calorie surplus were noted in developed countries around the world, in parallel with, and presumed to be primarily responsible for, the expanding waistlines of their populations. After taking exports into account, by the year 2000, the United States was producing 3,900 calories for every man, woman, and child—nearly twice as much as many people need.

    It wasn’t always this way. The number of calories in the food supply actually declined over the first half of the twentieth century and only started its upward climb to unprecedented heights in the 1970s. The drop in the first half of the century was attributed to the reduction in hard manual labor. The population had decreased energy needs, so they ate decreased energy diets. They didn’t need all the extra calories. But then the “energy balance flipping point” occurred, when the “move less, stay lean phase” that existed throughout most of the century turned into the “eat more, gain weight phase” that plagues us to this day. So, what changed?

    As I discuss in my video The Role of Processed Foods in the Obesity Epidemic, what happened in the 1970s was a revolution in the food industry. In the 1960s, most food was prepared and cooked in the home. The typical “married female, not working” spent hours a day cooking and cleaning up after meals. (The “married male, non-working spouse” averaged nine minutes, as you can see below and at 1:34 in my video.) But then a mixed-blessing transformation took place. Technological advances in food preservation and packaging enabled manufacturers to mass prepare and distribute food for ready consumption. The metamorphosis has been compared to what happened a century before with the mass production and supply of manufactured goods during the Industrial Revolution. But this time, they were just mass-producing food. Using new preservatives, artificial flavors, and techniques, such as deep freezing and vacuum packaging, food corporations could take advantage of economies of scale to mass produce “very durable, palatable, and ready-to-consume” edibles that offer “an enormous commercial advantage over fresh and perishable whole or minimally processed foods.” 

    Think ye of the Twinkie. With enough time and effort, “ambitious cooks” could create a cream-filled cake, but now Twinkies are available around every corner for less than a dollar. If every time someone wanted a Twinkie, they had to bake it themselves, they’d probably eat a lot fewer Twinkies. The packaged food sector is now a multitrillion-dollar industry.

    Consider the humble potato. We’ve long been a nation of potato eaters, but we usually baked or boiled them. Anyone who’s made fries from scratch knows what a pain it is, with all the peeling, cutting, and splattering of oil. But with sophisticated machinations of mechanization, production became centralized and fries could be shipped at -40°F to any fast-food deep-fat fryer or frozen food section in the country to become “America’s favorite vegetable.” Nearly all the increase in potato consumption in recent decades has been in the form of french fries and potato chips. 

    Cigarette production offers a compelling parallel. Up until automated rolling machines were invented, cigarettes had to be rolled by hand. It took 50 workers to produce the same number of cigarettes a machine could make in a minute. The price plunged and production leapt into the billions. Cigarette smoking went from being “relatively uncommon” to being almost everywhere. In the 20th century, the average per capita cigarette consumption rose from 54 cigarettes a year to 4,345 cigarettes “just before the first landmark Surgeon General’s Report” in 1964. The average American went from smoking about one cigarette a week to half a pack a day.

    Tobacco itself was just as addictive before and after mass marketing. What changed was cheap, easy access. French fries have always been tasty, but they went from being rare, even in restaurants, to being accessible around each and every corner (likely next to the gas station where you can get your Twinkies and cigarettes).

    The first Twinkie dates back to 1930, though, and Ore-Ida started selling frozen french fries in the 1950s. There has to be more to the story than just technological innovation, and we’ll explore that next.

    This explosion of processed junk was aided and abetted by Big Government at the behest of Big Food, which I explore in my video The Role of Taxpayer Subsidies in the Obesity Epidemic.

    This is the fifth video in an 11-part series. Here are the first four: The Role of Diet vs. Exercise in the Obesity Epidemic, The Role of Genes in the Obesity Epidemic, The Thrifty Gene Theory: Survival of the Fattest, and Friday Favorites: Cut the Calorie-Rich-And-Processed Foods.

    Videos still to come are listed in the related videos below.

    Michael Greger M.D. FACLM

  • Cutting the Calorie-Rich-And-Processed Foods  | NutritionFacts.org

    We have an uncanny ability to pick out the subtle distinctions in calorie density of foods, but only within the natural range.

    The traditional medical view on obesity, as summed up nearly a century ago: “All obese persons are, alike in one fundamental respect,—they literally overeat.” While this may be true in a technical sense, it is in reference to overeating calories, not food. Our primitive urge to overindulge is selective. People don’t tend to lust for lettuce. We have a natural inborn preference for sweet, starchy, or fatty foods because that’s where the calories are concentrated.

    Think about hunting and gathering efficiency. We used to have to work hard for our food. Prehistorically, it didn’t make sense to spend all day collecting types of food that on average don’t provide at least a day’s worth of calories. You would have been better off staying back at the cave. So, we evolved to crave foods with the biggest caloric bang for their buck.

    If you were able to steadily forage a pound of food an hour and it had 250 calories per pound, it might take you ten hours just to break even on your calories for the day. But if you were gathering something with 500 calories a pound, you could be done in five hours and spend the next five working on your cave paintings. So, the greater the energy density—that is, the more calories per pound—the more efficient the foraging. We developed an acute ability to discriminate foods based on calorie density and to instinctively desire the densest.
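The foraging arithmetic above can be sketched as a quick back-of-the-envelope calculation. The 250- and 500-calorie densities and the one-pound-per-hour gathering rate are the article's own illustrative figures; the 2,500-calorie daily budget is an assumption chosen so that the "ten hours just to break even" example works out:

```python
# Back-of-the-envelope foraging model: hours of gathering needed
# to meet a daily calorie budget at a given food energy density.

def foraging_hours(calories_per_pound, daily_need=2500, pounds_per_hour=1.0):
    """Hours of steady gathering required to collect `daily_need` calories."""
    return daily_need / (calories_per_pound * pounds_per_hour)

print(foraging_hours(250))  # sparse food, ~250 cal/lb: 10 hours just to break even
print(foraging_hours(500))  # denser food, ~500 cal/lb: done in 5 hours
```

Doubling the energy density halves the foraging time, which is the selection pressure the article argues shaped our preference for calorie-dense foods.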

    If you study the fruit and vegetable preferences of four-year-old children, what they like correlates with calorie density. As you can see in the graph below and at 1:52 in my video Friday Favorites: Cut the Calorie-Rich-And-Processed Foods, they prefer bananas over berries and carrots over cucumbers. Isn’t that just a preference for sweetness? No, they also prefer potatoes over peaches and green beans over melon, just like monkeys prefer avocados over bananas. We appear to have an inborn drive to maximize calories per mouthful. 

    All the foods the researchers tested in the study with four-year-old kids naturally had less than 500 calories per pound. (Bananas topped the chart at about 400.) Something funny happens when you start going above that: We lose our ability to differentiate. Over the natural range of calorie densities, we have an uncanny aptitude to pick out the subtle distinctions. However, once you start heading towards bacon, cheese, and chocolate territory, which can reach thousands of calories per pound, our perceptions become relatively numb to the differences. It’s no wonder, since these foods were unknown to our prehistoric brains. It’s like the dodo bird failing to evolve a fear response because it had no natural predators—and we all know how that turned out—or sea turtle hatchlings crawling in the wrong direction towards artificial light rather than the moon. It is aberrant behavior explained by an “evolutionary mismatch.”

    The food industry exploits our innate biological vulnerabilities by stripping crops down into almost pure calories—straight sugar, oil (which is pretty much pure fat), and white flour (which is mostly refined starch). It also removes the fiber, because that effectively has zero calories. Run brown rice through a mill to make white rice, and you lose about two-thirds of the fiber. Turn whole-wheat flour into white flour, and lose 75 percent. Or you can run crops through animals (to make meat, dairy, and eggs) and remove 100 percent of the fiber. What you’re left with is CRAP—an acronym used by one of my favorite dieticians, Jeff Novick, for Calorie-Rich And Processed food.

    Calories are condensed in the same way plants are turned into addictive drugs like opiates and cocaine: “distillation, crystallization, concentration, and extraction.” They even appear to activate the same reward pathways in the brain. Put people with “food addiction” in an MRI scanner and show them a picture of a chocolate milkshake, and the areas that light up in their brains (as you can see below and at 4:15 in my video) are the same as when cocaine addicts are shown a video of crack smoking. (See those images below and at 4:18 in my video.) 

    “Food addiction” is a misnomer. People don’t exhibit out-of-control eating behaviors in response to food in general. We don’t tend to compulsively crave carrots. Milkshakes are packed with sugar and fat, two of the signals to our brain of calorie density. When people are asked to rate different foods in terms of cravings and loss of control, most incriminated was a load of CRAP—highly processed foods like donuts, along with cheese and meat. Those least related to problematic eating behaviors? Fruits and vegetables. Calorie density may be the reason people don’t get up in the middle of the night and binge on broccoli.

    Animals don’t tend to get fat when they are eating the foods they were designed to eat. There is a confirmed report of free-living primates becoming obese, but that was a troop of baboons who stumbled across the garbage dump at a tourist lodge. The garbage-feeding animals weighed 50 percent more than their wild-feeding counterparts. Sadly, we can suffer the same mismatched fate and become obese by eating garbage, too. For millions of years, before we learned how to hunt, our biology evolved largely on “leaves, roots, fruits, and nuts.” Maybe it would help if we went back to our roots and cut out the CRAP. 

    A key insight I want to emphasize here is the concept of animal products as the ultimate processed food. Basically, all nutrition grows from the ground: seeds, sunlight, and soil. That’s where all our vitamins come from, all our minerals, all the protein, all the essential amino acids. The only reason there are essential amino acids in a steak is because the cow ate them all from plants. Those amino acids are essential—no animals can make them, including us. We have to eat plants to get them. But we can cut out the middlemoo and get nutrition directly from the Earth, and, in doing so, get all the phytonutrients and fiber that are lost when plants are processed through animals. Even ultraprocessed junk foods may have a tiny bit of fiber remaining, but all is lost when plants are ultra-ultraprocessed through animals.

    Having said that, there was also a big jump in what one would traditionally think of as processed foods, and that’s the video we turn to next: The Role of Processed Foods in the Obesity Epidemic.

    We’re making our way through a series on the cause of the obesity epidemic. So far, we’ve looked at exercise (The Role of Diet vs. Exercise in the Obesity Epidemic) and genes (The Role of Genes in the Obesity Epidemic and The Thrifty Gene Theory: Survival of the Fattest), but, really, it’s the food.

    If you’re familiar with my work, you know that I recommend eating a variety of whole plant foods, as close as possible to the way nature intended. I capture this in my Daily Dozen, which you can download for free here or get the free app (iTunes and Android). On the app, you’ll see that there’s also an option for those looking to lose weight: my 21 Tweaks. But before you go checking them off, be sure to read about the science behind the checklist in my book How Not to Diet. Get it for free at your local public library. If you choose to buy a copy, note that all proceeds from all of my books go to charity. 

    Michael Greger M.D. FACLM

  • Are We Polar Bears in a Jungle?  | NutritionFacts.org

    Rather than being some kind of disorder or a failure of willpower, weight gain is largely a normal response by normal people to an abnormal situation.

    It’s been said that “Nothing in biology makes sense except in the light of evolution.” The known genetic contribution to obesity may be small, but, in a certain sense, you could argue that it’s all in our genes. The excess consumption of available calories may be hardwired into our DNA. We were born to eat.

    Throughout human history and beyond, we existed in survival mode—in unpredictable scarcity. We’ve been programmed with a powerful drive to eat as much as we can while we can and just store the rest for later. Food availability could never be taken for granted, so those who ate more at the moment and were best able to store more fat for the future might better survive subsequent shortages to pass along their genes. So, generation after generation, millennium after millennium, those with lesser appetites may have died out, while those who gorged may have selectively lived long enough to pass along their genetic predisposition to eat and store more calories. That may be how we evolved into such voracious calorie-conserving machines. Now that we’re no longer living in such lean times, though, we’re no longer so lean ourselves.

    What I just described is the “thrifty gene” concept proposed in 1962. As I discuss in my video The Thrifty Gene Theory: Survival of the Fattest, it suggests that obesity is the result of a “‘mismatch’ between the environment in which humans evolved and our modern environment”—like being a polar bear in a jungle. All that fur and fat may have given polar bears an edge in the Arctic but would be decidedly disadvantageous in the Congo. Similarly, a propensity to pack on the pounds may have been a plus in prehistoric times but can turn into a liability when our scarcity-sculpted biology is plopped down into the land of plenty. So, it’s not gluttony or sloth. Obesity may simply be “a normal response to an abnormal environment.”

    Much of our physiology is finely tuned to stay within a narrow range of upper and lower limits. If we get too hot, we sweat; if we get too cold, we shiver. Our body has mechanisms to keep us in balance. In contrast, our bodies have had little reason to develop an upper limit to the accumulation of body fat. In the beginning, there may have been evolutionary pressures to keep lithe and nimble in the face of predation, but thanks to things like weapons and fire, we haven’t had to outrun as many saber-toothed tigers for about two million years or so. This may have left our genes with the one-sided selection pressures to binge on every morsel in sight and stockpile as many calories as possible in our bodies.

    What was once adaptive is now a problem—or at least so says the thrifty gene hypothesis that originated more than half a century ago. It “provides a simple and elegant explanation for the modern obesity epidemic and was quickly embraced by scientists and lay people alike.” Although the researcher, James Neel, later distanced himself from the original proposal, the basic premise, despite remaining mostly theoretical, is still “largely accepted” by the scientific community, and the implications are profound.

    In 2013, the American Medical Association voted to classify obesity as a disease (going against the advice of its own Council on Science and Public Health). Not that it necessarily matters what we call it, but disease implies dysfunction. Bariatric drugs and surgery are not correcting an anomaly in human physiology. Our bodies are just doing what they were designed to do in the face of excess calories. Rather than being some sort of disorder, weight gain is largely “a normal response by normal people to an abnormal environment.” As you can see below and at 4:12 in my video, more than 70 percent of Americans are now overweight. It’s normal. 

    “A body gaining weight when excess calories are available for consumption is behaving normally. Efforts to curtail such weight gain with drugs [or surgery] are not efforts to correct an anomaly in human physiology, but rather to deconstruct and reconstruct its normal operations at the core.”

    If weight gain is largely a normal response by normal people to an abnormal situation, what exactly is that abnormal situation? Calorie-Rich-And-Processed Foods. (I’ll let you work out the acronym.) That’s the topic we’ll turn to next.

    This is the third in an 11-video series on the history of the obesity epidemic. If you missed the first two, see The Role of Diet vs. Exercise in the Obesity Epidemic and The Role of Genes in the Obesity Epidemic.

    There are eight more coming up. See the related posts below.

    Michael Greger M.D. FACLM

  • What Is the Role of Our Genes in the Obesity Epidemic?  | NutritionFacts.org

    The “fat gene” accounts for less than 1 percent of the differences in size between people.

    To date, about a hundred genetic markers have been linked to obesity, but when you put them all together, overall, they account for less than 3 percent of the difference in body mass index (BMI) between people. You may have heard about the “fat gene,” called FTO (short for FaT mass and Obesity-associated). It’s the gene most strongly linked to obesity, but it explains less than 1 percent of the difference in BMI between people, a mere 0.34 percent.

    As I discuss in my video The Role of Genes in the Obesity Epidemic, FTO codes for a brain protein that appears to affect our appetite. Are you one of the billion people who carry the FTO susceptibility genes? It doesn’t matter because it only appears to result in a difference in intake of a few hundred extra calories a year. The energy imbalance that led to the obesity epidemic is on the order of hundreds of calories a day, and that’s the gene known so far to have the most effect. The chance of accurately predicting obesity risk based on FTO status is “only slightly better than tossing a coin.” In other words, no, those genes don’t make you look fat.
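To see the scale mismatch in that argument, here is a rough sketch. The article gives no exact figures, so both numbers below are assumed round values: "a few hundred extra calories a year" is taken as 300 per year, and "hundreds of calories a day" as 300 per day:

```python
# Compare the FTO gene's estimated effect on intake with the population-level
# energy surplus behind the obesity epidemic (illustrative round numbers only).

FTO_EXTRA_CAL_PER_YEAR = 300    # assumed: "a few hundred extra calories a year"
EPIDEMIC_SURPLUS_PER_DAY = 300  # assumed: "hundreds of calories a day"

epidemic_surplus_per_year = EPIDEMIC_SURPLUS_PER_DAY * 365
ratio = epidemic_surplus_per_year / FTO_EXTRA_CAL_PER_YEAR
print(ratio)  # the epidemic's surplus is ~365 times the gene's estimated effect
```

Under these assumed values, the strongest known obesity gene would account for well under 1 percent of the yearly surplus driving the epidemic, which is the article's point about the power of the fork over the power of the genes.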

    When it comes to obesity, the power of our genes is nothing compared to the power of our fork. Even the small influence the FTO gene does have appears to be weaker among those who are physically active and may be abolished completely in those eating healthier diets. FTO only appears to affect those eating diets higher in saturated fat, which is predominantly found in meat, dairy, and junk food. Those eating more healthfully appear to be at no greater risk of weight gain, even if they inherited the “fat gene” from both of their parents.

    Physiologically, FTO gene status does not appear to affect our ability to lose weight. Psychologically, knowing we’re at increased genetic risk for obesity may motivate some people to eat and live more healthfully, but it may cause others to fatalistically throw their hands up in the air and resign themselves to thinking that it just runs in their family, as you can see in the graph below and at 2:11 in my video. Obesity does tend to run in families, but so do lousy diets. 

    Comparing the weight of biological versus adopted children can help tease out the contributions of lifestyles versus genetics. Children growing up with two overweight biological parents were found to be 27 percent more likely to be overweight themselves, whereas adopted children placed in a home with two overweight parents were 21 percent more likely to be overweight. So, genetics do play a role, but this suggests that it’s more the children’s environment than their DNA.

    One of the most dramatic examples of the power of diet over DNA comes from the Pima Indians of Arizona. As you can see in the graph below and at 3:05 in my video, they not only have among the highest rates of obesity, but they also have the highest rates of diabetes in the world. This has been ascribed to their relatively fuel-efficient genetic makeup. Their propensity to store calories may have served them well in times of scarcity when they were living off of corn, beans, and squash, but when the area became “settled,” their source of water, the Gila River, was diverted upstream. Those who survived the ensuing famine had to abandon their traditional diet to live off of government food programs, and chronic disease rates skyrocketed. Same genes, but different diet, different result.

    In fact, a natural experiment was set up. The Pima living over the border in Mexico come from the same genetic pool but were able to maintain more of their traditional lifestyle, sticking with their main staples of beans, wheat flour tortillas, and potatoes. Same genes, but seven times less obesity and about four times less diabetes. You can see those graphs below and at 3:58 and 4:02 in my video. Genes may load the gun, but diet pulls the trigger.

    Of course, it’s not our genes! Our genes didn’t suddenly change 40 years ago. At the same time, though, in a certain sense, it could be thought of as all in our genes. That’s the topic of my next video, The Thrifty Gene Theory: Survival of the Fattest.

    This is the second in an 11-video series on the obesity epidemic. If you missed the first one, check out The Role of Diet vs. Exercise in the Obesity Epidemic.

    Michael Greger M.D. FACLM

  • The Roles Diet and Exercise Play in the Obesity Epidemic  | NutritionFacts.org

    The common explanations for the cause of the obesity epidemic put forward by the food industry and policymakers, such as inactivity or a lack of willpower, are not only wrong, but actively harmful fallacies.

    Obesity isn’t new, but the obesity epidemic is. We went from a few corpulent kings and queens, like Henry VIII or Louis VI (known as Louis le Gros, or “Louis the Fat”), to a pandemic of obesity, now considered to be “arguably the gravest and most poorly controlled public health threat of our time.” As you can see below and at 0:34 in my video The Role of Diet vs. Exercise in the Obesity Epidemic, about 37 percent of American men are obese and 41 percent of American women, with no end in sight. Earlier reports had suggested that the rise in obesity was at least slowing down, but even that doesn’t appear to be the case. Similarly, we had thought we were turning the corner on childhood obesity “[a]fter 35 years of unremittingly bad news,” but the bad news continues. Childhood and adolescent obesity rates have continued to rise, now into the fourth decade. 

    Over the last century, obesity appears to have jumped ten-fold, from about 1 in 30 to now 1 in 3, but it wasn’t a steady rise. As you can see in the graph below and at 1:15 in my video, something seems to have happened around the late 1970s—and not just in the United States, but around the globe. The obesity pandemic took off at about the same time in the 1970s and 1980s in most high-income countries. The fact that the rapid rise “seemed to begin almost concurrently” across the industrialized world suggests a common cause. What might that trigger have been? 

    Any potential driver would have to be global and “coincide with the upswing of the epidemic.” So, the change would have had to have started about 40 years ago and would have had to have been able to spread rapidly around the globe. Let’s see how all the various theories stack up. For example, as you can see below and at 1:55 in my video, some have blamed changes in our built environment and shifts in city planning that have made our communities less conducive to walking, biking, and grocery shopping. That doesn’t meet our criteria for a credible cause, though, because there was no universal, simultaneous change in our neighborhoods within that time frame.

    When researchers surveyed hundreds of policymakers, most blamed the obesity epidemic on a “lack of personal motivation.” Do you see how little sense that makes? In the United States, for example, obesity shot up across the entire population in the late 1970s, as you can see at 2:26 in my video. I concur with the researchers who “believe it is implausible that each age, sex, and ethnic group, with massive differences in life experience and attitudes, had a simultaneous decline in willpower related to healthy nutrition or exercise.” More plausible than a global change in our characters would be some global change in our lives.

    The food industry blames inactivity. “If all consumers exercised,” said the CEO of PepsiCo, “obesity wouldn’t exist.” Coca-Cola went a step further, spending $1.5 million to create the Global Energy Balance Network to downplay the role of diet. Leaked emails show the company planned on using the front to “serve as a ‘weapon’ to ‘change the conversation’ about obesity in its ‘war’ with public health.”

    This tactic is so common among food and beverage companies that it even has a name: “leanwashing.” You’ve heard of greenwashing, where companies deceptively pretend to be environmentally friendly. Leanwashing is the term used to describe companies that try to position themselves as helping to solve the obesity crisis when they’re instead directly contributing to it. For example, the largest food company in the world, Nestlé, has “rebranded itself as the ‘world’s leading nutrition, health and wellness company.’” Yes, that Nestlé, makers of Nesquik, Cookie Crisp, and historically more than a hundred different brands of candy, including Butterfinger, Kit Kat, Goobers, Gobstoppers, Runts, and Nerds. Another one of its slogans is “Good Food, Good Life.” Its Raisinets may have some fruit, but Nestlé seems to me more Willy Wonka than wellness.

    The constant corporate drumbeat of overemphasis on physical inactivity appears to be working. In response to the Harris poll question, “Which of these do you think are the major reasons why obesity has increased?,” a “huge majority of 83% chose lack of exercise, while only 34% chose excessive calorie consumption.” “Confusion about the effect of exercise on the energy balance” has been identified as one of the most common misconceptions about obesity. The scientific community has “come to a fairly decisive conclusion” that the factors governing calorie intake more powerfully affect overall calorie balance. It’s our fast food more than our slow motion. 

    “There is considerable debate in the literature today about whether physical activity has any role whatsoever in the epidemic of obesity that has swept the globe since the 1980s.” The increase in caloric intake per person is more than enough to explain the obesity epidemic in the United States and also explain it globally. If anything, the level of physical activity over the last few decades has gone up slightly in both Europe and North America. Ironically, this may be a result of the extra energy it takes to move around our heavier bodies, making it a consequence of the obesity problem rather than the cause.

    “Formal exercise plays a very small role in the total daily physical activity energy expenditure.” Think how much more physical work people used to do in the workplace, on the farm, or even in the home. It’s not just the shift in collar color from blue to white. Increasing automation, computerization, mechanization, motorization, and urbanization have all contributed to increasingly more sedentary lifestyles over the last century—and that’s the problem with the theory. The occupational shifts and advent of labor-saving devices “have been gradual and largely predated the dramatic increase in weight gain across the developed world in the past few decades.” Washing machines, vacuum cleaners, and the Model T were all invented before 1910. Indeed, when put to the test using state-of-the-art methods to measure energy in and energy out, it was caloric intake, not physical activity, that predicted weight gain over time. 

    The common misconception that obesity is mostly due to lack of exercise may not just be a benign fallacy. Personal theories of causation appear to impact people’s weight. Those who blame insufficient exercise are significantly more likely to be overweight than those who implicate a poor diet. Put those who believe lack of exercise causes obesity in a room with chocolate, and they can covertly be observed consuming more candy. Those holding that view may be different in other ways, though. You can’t prove cause and effect until you put it to the test. And, indeed, as you can see in the graph below, and at 7:22 in my video, people randomized to read an article implicating inactivity went on to eat significantly more sweets than those who read an article implicating diet. A similar study found that those presented with research blaming genetics subsequently ate significantly more cookies. The title of that paper? “An Unintended Way in Which the Fat Gene Might Make You Fat.”

    When I sat down to write How Not to Diet, I knew this “what triggered the obesity epidemic” was going to be a big question I had to face. Was it inactivity (just kids sitting around playing video games or scrolling on their phones)? Was it genetic? Was it epigenetic (something turning on our fat genes)? Or was it just the food? Were we eating more fat all of a sudden? More carbs? More processed foods? Or were we just eating more period, because of bigger serving sizes or more snacking? Inquiring minds wanted to know. 

    This is the first in an 11-video series to answer this question, which I originally released in a two-hour webinar in 2020. Check out the webinar digital download here. Or, check them out in the related posts below.  

    Michael Greger M.D. FACLM

  • Children’s Cereals: Candy for Breakfast?  | NutritionFacts.org

    Plastering front-of-package nutrient claims on cereal boxes is an attempt to distract us from the incongruity of feeding our children multicolored marshmallows for breakfast.

    The American Medical Association started warning people about excess sugar consumption more than 75 years ago, based in part on our understanding that “sugar supplies nothing in nutrition but calories, and the vitamins provided by other foods are sapped by sugar to liberate these calories.” So, added sugars aren’t just empty calories, but negative nutrition. “Thus, the more added sugars one consumes, the more nutritionally depleted one may become.”

    Given the “totality of publicly available scientific evidence,” the Food and Drug Administration (FDA) decided to make processed food manufacturers declare “added sugars” on their nutrition labels. The National Yogurt Association was livid and said it “continues to oppose the ‘added sugars’ declaration,” since it needed “‘added sugars’ to increase palatability” of its products. The junk food association questioned the science, whereas the ice cream folks seemed to imply that consumers are too stupid to “understand or know how to use the added sugar declaration,” so it’s better just to leave it off. The world’s biggest cereal company, Kellogg’s, took a similar tack, opposing it so as not “to confuse consumers.” Should the FDA proceed with such labeling against Kellogg’s objections, the cereal giant pressed that “an added sugars declaration…should be communicated as a footnote.” It claimed that its “goal is to provide consumers with useful information so they can make informed choices.” This is from a company that describes its Froot Loops as “packed with delicious fruity taste, fruity aroma, and bright colors.” Keep in mind that Froot Loops has more sugar than a Krispy Kreme doughnut, as you can see in the graph below and at 1:46 in my video Friday Favorites: Kids’ Breakfast Cereals as Nutritional Façade.

    Froot Loops is more than 40 percent sugar by weight! You can see the cereal box’s Nutrition Facts label below and at 1:50 in my video.

    The tobacco industry used similar terms, such as “light,” “low,” and “mild” to make its products appear healthier—before it was barred from doing so. “Now sugar interests are fighting similar battles over whether their terminology, including ‘healthy,’ ‘natural,’ ‘naturally sweetened,’ and even ‘lightly sweetened,’ is deceptive to consumers.”

    But if you look at the side of a cereal box, as shown below and at 2:13 in my video, you can see all those vitamins and minerals that have been added. That was one of the ways the cereal companies responded to calls for banning sugary cereals. General Mills defended the likes of Franken Berry, Trix, and Lucky Charms for being fortified with essential vitamins. 

    Sir Grapefellow, I learned, was a “grape-flavored oat cereal” complete with “sweet grape star bits”—that is, marshmallows. Don’t worry. It was “vitamin charged!” You can see that cereal box below and at 2:31 in my video.

    Sugary breakfast cereals, said Dr. Jean Mayer from Harvard, “are not a complete food even if fortified with eight or 10 vitamins.” Senator McGovern replied, “I think your point is well taken that these products may be mislabeled or more correctly called candy vitamins than cereals.” 

    Plastering nutrient claims on cereal boxes can create “a ‘nutritional façade’ around a product, acting to distract attention away” from unsavory qualities, such as excess sugar content. Researchers found that the “majority of parents misinterpreted the meaning of claims commonly used on children’s cereals,” raising significant public health concerns. Ironically, cereal boxes bearing low-calorie claims were found to have more calories on average than those without such a claim. The cereal doth protest too much. 

    Even candy bar companies are getting in on the action, bragging about protein content because of some peanuts. Take the Baby Ruth, a candy bar with 50 grams of sugar. Froot Loops could be considered breakfast candy: The same serving would have 40 grams of sugar, as you can see below and at 3:45 in my video.

    Given that “research suggests that consumers believe front-of-package claims, perceive them to be government-endorsed, and use them to ignore the Nutrition Facts Panel,” there’s been a call from nutrition professionals to consider “an outright ban on all front-of-package claims.” The industry’s short-lived “Smart Choices” label, as you can see below and at 4:13 in my video, was met with disbelief when it was found adorning qualifying cereals like Froot Loops and Cookie Crisp. The processed food industry spent more than a billion dollars lobbying against the adoption of more informative labeling (a traffic-light approach), “opposing most aggressively the use of a red light suggesting that any food was too high in anything.” 

    I was invited to testify as an expert witness in a case against sugary cereal companies. (I donated my fee, of course.) Check out the related posts below for a video series and blogs that are a result of some of the research I did. 

    You may also be interested in videos and blogs on the food industry; see related posts below.

    Michael Greger M.D. FACLM

  • How Much Added Sugar Is Okay?  | NutritionFacts.org

    Public health authorities continue to lower the upper tolerable limit of daily added sugar intake.

    Dating back to the original “Dietary Goals for the United States” in 1977, also known as the so-called McGovern Report, leading nutrition scientists didn’t only call for a reduction in meat and other sources of saturated fat and cholesterol, such as dairy and eggs, but also sugar. The goal was to reduce America’s sugar intake to no more than 10 percent of our daily diet.

    “The conclusions would hang sugar,” reported the president of the Sugar Association. “The McGovern Report has to be neutralized.” The National Cattlemen’s Association was on its side and, just like Big Sugar, appealed to the Senate Select Committee to withdraw the report.

    “The Sugar Industry Empire Strikes Back”—and it appeared to work. When the official U.S. Dietary Guidelines were released in 1980 and again in 1985, it was without a specific limit, like 10 percent. It “said, simply, and in just four words, ‘Avoid too much sugar.’” (Whatever that means.) “In 1990, it went to five words, ‘Use sugars only in moderation,’ and in 1995 to six: ‘Choose a diet moderate in sugars.’” In 2000, it at least went back to limiting intake—specifically, “‘Choose beverages and foods to limit your intake of sugars’ (ten words), but even that was too strong. Under pressure from sugar lobbyists, the government agencies substituted the word ‘moderate’ for ‘limit’ so it read ‘Choose beverages and foods to moderate your intake of sugars.’” Then, the 2005 guidelines committee dropped the s-word completely, encouraging Americans to “Choose carbohydrates wisely…” Again, what does that mean? If only there were a dietary guidelines committee that could guide us….

    The Sugar Association expressed optimism about that 2005 Committee. In its Sugar E-News, it wrote that Sugar Association Incorporated (SAI) “is committed to the protection and promotion of sucrose [table sugar] consumption. Any disparagement of sugar will be met with forceful, strategic public comments”—and it wasn’t kidding. “In 2003, [the World Health Organization] WHO released a joint report with the Food and Agriculture Organization entitled Diet, nutrition and the prevention of chronic diseases which, for the first time [since the McGovern Report], called for a reduction in sugar intake to under 10% of total dietary energy [caloric] consumption.” The Sugar Association responded by threatening to pressure Congress to withdraw all U.S. funding from the WHO—polio vaccinations and AIDS medications be damned! You can see it yourself in black and white at 2:22 in my video Friday Favorites: The Recommended Daily Added Sugar Intake. Don’t mess with the candy man. The threat was described as “tantamount to blackmail and worse than any pressure exerted by the tobacco lobby.”

    Fifteen years later, and 40 years after the McGovern Report first proposed it, the 2015 to 2020 Dietary Guidelines for Americans laid out the 10 percent limit as a key recommendation: “Consume less than 10 percent of calories per day from added sugars.” This limit is currently exceeded by every age bracket in the United States starting at age one, as you can see in the graph below and at 2:58 in my video, with adolescents averaging 87 grams of sugar a day. That means the average teen is effectively eating 29 sugar packets a day.
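
    Since the text converts grams into packets, here is the arithmetic as a quick sketch. The 3-grams-per-packet figure is my assumption (typical U.S. sugar packets hold roughly that much), not a number from the article.

```python
# Back-of-envelope check of the sugar-packet figure cited above.
# Assumption (not from the article): a typical U.S. sugar packet
# holds about 3 grams of sugar.
grams_per_packet = 3
teen_daily_sugar_g = 87  # average adolescent intake, per the text

packets_per_day = teen_daily_sugar_g / grams_per_packet
print(round(packets_per_day))  # 29
```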

    The Sugar Association describes the 10 percent limit as “extremely low.” Well, I mean, it is only up to about a dozen spoonfuls a day. Of course, there is no dietary requirement for added sugar at all, and every single calorie we get from added sugar is a wasted opportunity to get calories from sources that provide nutrition. To the American Heart Association’s credit, it went further, trying to push added sugar intake down to about 6 percent of calories, a level at which a single can of soda could send you over the limit. That’s an added sugar limit exceeded by 90 percent of Americans.
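
    For where “about a dozen spoonfuls a day” comes from, here is a hedged sketch of the math. The 2,000-calorie reference diet, 4 calories per gram of sugar, and roughly 4.2 grams of sugar per teaspoon are standard assumptions on my part, not figures stated in the article.

```python
# Working out the "about a dozen spoonfuls a day" figure under
# standard assumptions: a 2,000-calorie reference diet, 4 calories
# per gram of sugar, and ~4.2 grams of sugar per teaspoon.
daily_calories = 2000
limit_calories = 0.10 * daily_calories  # 10 percent cap -> 200 calories
limit_grams = limit_calories / 4        # -> 50 grams of added sugar
teaspoons = limit_grams / 4.2           # -> ~11.9 teaspoons

print(round(teaspoons))  # 12
```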

    In 2017, the American Heart Association (AHA) released its guidelines for children, recommending they get no more than about six teaspoons per day. In that case, a single serving of nearly a hundred cereals on the U.S. market would exceed the entire recommended daily limit. The AHA recommends no added sugars at all for children under the age of two, a recommendation that’s violated in up to 80 percent of toddlers, as you can see below and at 4:20 in my video.

    Worldwide, “at least 65 countries have implemented dietary guidelines or public health policies to curb sugar consumption to encourage maintenance of healthy body weight.” In the United Kingdom, the Scientific Advisory Committee on Nutrition made new recommendations to reduce added sugars down to 5 percent, which is also the direction the World Health Organization is headed. The WHO always seems to be ahead of the curve. Why? Because its policy-making process is at least partially protected “against industry influence.” Unlike governments, which may have competing interests in commerce and trade, “WHO is exclusively concerned with health.”

    I spoke at a hearing of the 2020 Dietary Guidelines Committee. Watch the highlights and my speech here: Highlights from the 2020 Dietary Guidelines Hearing.

    The sugar industry keeps pretty busy, as you’ll see from my recent videos, Friday Favorites: Are Fortified Kids’ Breakfast Cereals Healthy or Just Candy? and Flashback Friday: Sugar Industry Attempts to Manipulate the Science.

    Check the related posts below for my other popular videos and blogs on sugar.

    Michael Greger M.D. FACLM

  • Seasonal Weight Gain in the Fall  | NutritionFacts.org

    SAD doesn’t just stand for the standard American diet.

    There’s a condition known as seasonal affective disorder, characterized by increased appetite and cravings, as well as greater sleepiness and lethargy, that begins in autumn when light exposure starts to dwindle. This now appears to represent the far end of a normal spectrum of human behavior. We appear to eat more as the days get shorter. There is a “marked seasonal rhythm” to calorie intake, with greater meal size, eating rate, hunger, and overall calorie intake in the fall. 

    In preparation for winter, some animals hibernate, doubling their fat stores with autumnal abundance to deal with the subsequent scarcity of winter. Genes have been identified in humans that are similar to hibernation genes, which may help explain why we exhibit some of the same behaviors, and the autumn effect isn’t subtle. As you can see in the graph below and at 1:06 in my video Friday Favorites: Why People Gain Weight in the Fall, researchers calculated a 222-calorie difference between how many calories we consume in the fall versus the spring. This isn’t just because it’s colder, either, since we eat more in the fall than in the winter. It appears we’re just genetically programmed to prep for the deprivation of winter that no longer comes. 

    It’s remarkable that, in this day and age of modern lighting and heating, our bodies would still pick up enough environmental cues of the changing seasons to have such a major influence on our eating patterns. Unsurprisingly, bright light therapy is used to treat seasonal affective disorder, nearly tripling the likelihood of remission, compared to placebo. Though it’s never been tested directly, it can’t hurt to take the dog out for some extra morning and daytime walks in the fall to try to fend off some of the coming holiday season weight gain.

    People blame the holidays for overeating, but it may be that “rather than the holidays causing heightened intake, the seasonal heightening of intake in the fall may have caused the scheduling of holidays at that time.”

    Regardless, as you can see below and at 2:15 in my video, other “specific recommendations for the prevention of obesity and metabolic syndrome by improving the circadian system health,” based on varying degrees of evidence, include: sleeping during the night and being active during the day; sleeping enough—at least seven or eight hours a night; early to bed, early to rise; and short naps are fine. (Contrary to popular belief, daytime napping does not appear to adversely impact sleep at night.) Also recommended: avoiding bright light exposure at night; sleeping in total darkness when possible; making breakfast or lunch your biggest meal of the day; not eating or exercising right before bed; and completely avoiding eating at night. 

    This was the last video in my chronobiology series. If you missed any of the others, check out the related posts below.

    Michael Greger M.D. FACLM

  • Irregular Meals, Night Shifts, and Metabolic Harms  | NutritionFacts.org

    What can shift workers do to moderate the adverse effects of circadian rhythm disruption?

    Shift workers may have higher rates of death from cardiovascular disease, including heart disease and stroke, as well as higher rates of death from diabetes, dementia, and cancer. Graveyard shift, indeed! But, is it just because they’re eating out of vending machines or not getting enough sleep? Highly controlled studies have recently attempted to tease out these other factors by putting people on the same diets with the same sleep—but at the wrong time of day. Redistributing eating to the nighttime resulted in elevated cholesterol and increases in blood pressure and inflammation. No wonder shift workers are at higher risk. Shifting meals to the night in a simulated night-shift protocol effectively turned about one-third of the subjects prediabetic in just ten days. Our bodies just weren’t designed to handle food at night, as I discuss in my video The Metabolic Harms of Night Shifts and Irregular Meals.

    Just as avoiding bright light at night can prevent circadian misalignment, so can avoiding night eating. We may have no control over the lighting at our workplace, but we can try to minimize overnight food intake, which has been shown to help limit the negative metabolic consequences of shift work. When we finally do get home in the morning, though, we may disproportionately crave unhealthy foods. In one experiment, 81 percent of participants in a night-shift scenario chose high-fat foods, such as croissants, out of a breakfast buffet, compared to just 43 percent of the same subjects during a control period on a normal schedule.

    Shift work may also leave people too fatigued to exercise. But, even at the same physical activity levels, chronodisruption can affect energy expenditure. Researchers found that we burn 12 to 16 percent fewer calories while sleeping during the daytime compared to nighttime. Just a single improperly-timed snack can affect how much fat we burn every day. Study subjects eating a specified snack at 10:00 am burned about 6 more grams of fat from their body than on the days they ate the same snack at 11:00 pm. That’s only about a pat and a half of butter’s worth of fat, but it was the identical snack, just given at a different time. The late snack group also suffered about a 9 percent bump in their LDL cholesterol within just two weeks.
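
    As a rough sanity check on the butter comparison, here is the arithmetic; the figure of about 4 grams of fat per pat is my assumption, since the article doesn’t give a per-pat amount.

```python
# Rough check of the "pat and a half of butter" comparison.
# Assumption (mine, not the article's): one pat of butter
# carries roughly 4 grams of fat.
extra_fat_burned_g = 6  # morning vs. late-night snack, per the text
fat_per_pat_g = 4

print(extra_fat_burned_g / fat_per_pat_g)  # 1.5 pats' worth of fat
```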

    Even just sleeping in on the weekends may mess up our metabolism. “Social jetlag is a measure of the discrepancy in sleep timing between our work days and free days.” From a circadian rhythm standpoint, if we go to bed late and sleep in on the weekends, it’s as if we flew a few time zones west on Friday evening, then flew back Monday morning. Travel-induced jet lag goes away in a few days, but what might the consequences be of constantly shifting our sleep schedule every week over our entire working career? Interventional studies have yet to put it to the test, but population studies suggest that those who have at least an hour of social jet lag a week (which may describe more than two-thirds of people) have twice the odds of being overweight. 

    If sleep regularity is important, what about meal regularity? “The importance of eating regularly was highlighted early by Hippocrates (460–377 BC) and later by Florence Nightingale,” but it wasn’t put to the test until the 21st century. A few population studies had suggested that those eating meals irregularly were at a metabolic disadvantage, but the first interventional studies weren’t published until 2004. Subjects were randomized to eat their regular diets divided into six regular eating occasions a day or three to nine daily occasions in an irregular manner. Researchers found that an irregular eating pattern can cause a drop in insulin sensitivity and a rise in cholesterol levels, as well as reduce the calorie burn immediately after meals in both lean and obese individuals. The study participants ended up eating more, though, on the irregular schedule, so it’s difficult to disentangle the circadian effects. The fact that overweight individuals may overeat on an irregular pattern may be telling in and of itself, but it would be nice to see such a study repeated using identical diets to see if irregularity itself has metabolic effects.

    Just such a study was published in 2016: During two periods, people were randomized to eat identical foods in a regular or irregular meal pattern. As you can see in the graph below and at 4:47 in my video, during the irregular period, people had impaired glucose tolerance, meaning higher blood sugar responses to the same food.

    They also had lower diet-induced thermogenesis, meaning the burning of fewer calories to process each meal, as seen in the graph below and at 4:55 in my video.

    The difference in thermogenesis only came out to be about ten calories per meal, though, and there was no difference in weight changes over the two-week periods. However, diet-induced thermogenesis can act as “a satiety signal.” The extra work put into processing a meal can help slake one’s appetite. And, indeed, “lower hunger and higher fullness ratings” during the regular meal period could potentially translate into better weight control over the long term. 

    The series on chronobiology is winding down with just two videos left in this series: Shedding Light on Shedding Weight and Friday Favorites: Why People Gain Weight in the Fall.

    If you missed any of the other videos, see the related posts below. 
     

    Michael Greger M.D. FACLM

  • Syncing Your Brain and Body Clocks  | NutritionFacts.org

    Exposure to bright light synchronizes the central circadian clock in our brain, whereas proper meal timing helps sync the timing of different clock genes throughout the rest of our body. 
     
    One of the most important breakthroughs in recent years has been the discovery of “peripheral clocks.” We’ve known for decades about the central clock—the so-called suprachiasmatic nucleus. It sits in the middle of our brain right above the place where our optic nerves cross, allowing it to respond to day and night. Now we also know there are semi-autonomous clocks in nearly every organ of our body. Our heart runs on a clock, our lungs run on one, and so do our kidneys, for instance. In fact, up to 80 percent of the genes in our liver are expressed in a circadian rhythm.

    Our entire digestive tract is, too. The rate at which our stomach empties, the secretion of digestive enzymes, and the expression of transporters in our intestinal lining for absorbing sugar and fat all cycle around the clock. So, too, does the ability of our body fat to sop up extra calories. The way we know these cycles are driven by local clocks, rather than being controlled by our brain, is that you can take surgical biopsies of fat, put them in a petri dish, and watch them continue to rhythm away.

    All of this clock talk is not just biological curiosity. Our health may depend on keeping all of them in sync. “Imagine a child playing on a swing.” Picture yourself pushing, but you become distracted by what’s going on around you in the playground and stop paying attention to the timing of the push. So, you forget to push or you push too early or too late. What happens? Out of sync, the swinging becomes erratic, slows, or even stops. That is what happens when we travel across multiple time zones or have to work the night shift.

    The “pusher” in this case is the light cues falling onto our eyes. Our circadian rhythm is meant to get a “push” from bright light every morning at dawn, but if the sun rises at a different time or we’re exposed to bright light in the middle of the night, this can push our cycle out of sync and leave us feeling out of sorts. That’s an example of a mismatch between the external environment and our central clock. Problems can also arise from a misalignment between the central clock in our brain and all the other organ clocks throughout our body. An extreme illustration of this is a remarkable set of experiments suggesting that even our poop can get jet lag.

    As you can see below and at 2:31 in my video How to Sync Your Central Circadian Clock to Your Peripheral Clocks, our microbiome seems to have its own circadian rhythm.

    Even though the bacteria are down where the sun doesn’t shine, there’s a daily oscillation in both bacterial abundance and activity in our colon, as you can see in the graph below and at 2:43 in my video. Interesting, but who cares? We all should. 

    Check this out: If you put people on a plane and fly them halfway around the world, then feed their poop to mice, those mice grow fatter than mice fed preflight feces. The researchers suggest the fattening flora was a consequence of “circadian misalignment.” Indeed, several lines of evidence now implicate “chronodisruption”—the state in which our central and peripheral clocks fall out of sync—as playing a role in conditions ranging from premature aging and cancer to mood disorders and obesity.

    Exposure to bright light is the synchronizing swing pusher for our central clock. What drives our internal organ clocks that aren’t exposed to daylight? Food intake. That’s why the timing of our meals may be so important. Researchers removed all external timing cues by keeping study participants under constant dim light and found that you could effectively decouple central rhythms from peripheral ones just by shifting meal times. They took blood draws every hour and biopsies of the subjects’ fat every six hours to demonstrate the resulting metabolic disarray.

    Just as morning light can help sync the central clock in our brain, morning meals can help sync our peripheral clocks throughout the rest of our body. Skipping breakfast disrupts the normal expression and rhythm of these clock genes themselves, which coincides with adverse metabolic effects. Thankfully, they can be reversed. Take a group of habitual breakfast-skippers and have them eat three meals at 8:00 am, 1:00 pm, and 6:00 pm, and their cholesterol and triglycerides improve, compared to taking meals five hours later at 1:00 pm, 6:00 pm, and 11:00 pm. There is a circadian rhythm to cholesterol synthesis in the body, too, which is also “strongly influenced by food intake.” This is evidenced by the 95 percent drop in cholesterol production in response to a single day of fasting. That’s why a shift in meal timing of just a few hours can result in a 20-point drop in LDL cholesterol, thanks to eating earlier meals, as you can see below and at 5:00 in my video.

    If light exposure and meal timing help keep everything synced, what happens when our circumstances prevent us from sticking to a normal daytime cycle? We’ll find out in The Metabolic Harms of Night Shifts and Irregular Meals. If you’re just coming into the series, be sure to check out the related posts below.  

    Michael Greger M.D. FACLM

  • Circadian Rhythms and Our Blood Sugar Levels  | NutritionFacts.org

    The same meal eaten at the wrong time of day can double blood sugar. 

    We’ve known for more than half a century that our glucose tolerance—the ability of our body to keep our blood sugars under control—declines as the day goes on. As you can see in the graph below and at 0:25 in my video How Circadian Rhythms Affect Blood Sugar Levels, if you hook yourself up to an IV and drip sugar water into your vein at a steady pace throughout the day, your blood sugars will start to go up at about 8:00 pm, even though you haven’t eaten anything and the infusion rate didn’t change.

    The same amount of sugar is going into your system every minute, but your ability to handle it deteriorates in the evening before bouncing right back in the morning. A meal eaten at 8:00 pm can cause twice the blood sugar response as an identical meal eaten at 8:00 am, as shown in the graph below and at 0:51 in my video. It’s as if you ate twice as much. Your body just isn’t expecting you to be eating when it’s dark outside. Our species may have only discovered how to use fire about a quarter million years ago. We just weren’t built for 24-hour diners. 

    One of the tests for diabetes is called the glucose tolerance test, which sees how fast our body can clear sugar from our bloodstream. You swig down a cup of water with about four and a half tablespoons of regular corn syrup mixed in, then have your blood sugar measured two hours later. By that point, your blood sugar should be under 140 mg/dL. Between 140 and 199 is considered to be a sign of prediabetes, and 200 and up is a sign of full-blown diabetes, as you can see in the graph below and at 1:37 in my video.
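
    Those two-hour cutoffs amount to a simple threshold rule, which can be sketched as follows; the function name is mine, not a standard clinical API.

```python
def classify_two_hour_glucose(mg_dl: float) -> str:
    """Interpret a two-hour glucose tolerance test result
    using the cutoffs given in the text (mg/dL)."""
    if mg_dl < 140:
        return "normal"
    if mg_dl < 200:
        return "prediabetes"
    return "diabetes"

print(classify_two_hour_glucose(120))  # normal
print(classify_two_hour_glucose(163))  # prediabetes
print(classify_two_hour_glucose(205))  # diabetes
```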

    The circadian rhythm of glucose tolerance is so powerful that a person can test normal in the morning but prediabetic later in the day. Prediabetics who average 163 mg/dL at 7:00 am may test out as frank diabetics at over 200 mg/dL at 7:00 pm, as you can see in the graph below and at 1:53 in my video.

    Choosing lower glycemic foods may help promote weight loss, but timing is critical. Due to this circadian pattern in glucose tolerance, a low-glycemic food at night can cause a higher blood sugar spike than a high-glycemic food eaten in the morning, as you can see below and at 2:05 in my video.

    We’re so metabolically crippled at night that researchers found that eating a bowl of All Bran cereal at 8:00 pm caused as high a blood sugar spike as eating Rice Krispies at 8:00 am, as you can see in the graph below and at 2:23 in my video.

    High glycemic foods at night would seem to represent the worst of both worlds. So, if you’re going to eat refined grains and sugary junk, it might be less detrimental in the morning, as you can see in the graph below and at 2:32 in my video.  

    The drop in glucose tolerance over the day could therefore help explain the weight-loss benefits of frontloading calories towards the beginning of the day. Even just taking lunch earlier versus later may make a difference, as you can see in the graph below and at 2:48 in my video.

    People randomized to eat a large lunch at 4:30 pm suffered a 46 percent greater blood sugar response compared to an identical meal eaten just a few hours earlier at 1:00 pm. A meal at 7:00 am can cause 37 percent lower blood sugars than an identical meal at 1:00 pm, as you can see below, and at 3:04 in my video.

    Now, there doesn’t seem to be any difference between a meal at 8:00 pm and the same meal at midnight; they both seem to be too late, as you can see below, and at 3:15 in my video.

    But, eating that late, at midnight or even 11:00 pm, can so disrupt your circadian rhythm that it can mess up your metabolism the next morning, resulting in significantly higher blood sugars after breakfast, compared to eating the same dinner at 6:00 pm the evening before, as shown in the graph below and at 3:32 in my video.

    So, these revelations of chronobiology bring the breakfast debate full circle. Skipping breakfast not only generally fails to cause weight loss, but it worsens overall daily blood sugar control in both diabetic individuals and people who are not diabetic, as you can see in the graph below and at 3:44 in my video.

    Below and at 3:53, you can see a graph showing how the breakfast skippers have higher blood sugars even while they’re sleeping 20 hours later. This may help explain why those who skip breakfast appear to be at higher risk of developing type 2 diabetes in the first place. 

    Breakfast skippers also tend to have higher rates of heart disease, as well as higher rates of atherosclerosis in general. Is this just because “skipping breakfast tends to cluster with other unhealthy choices, including smoking” and sicklier eating habits overall? The link between skipping breakfast and heart disease—even premature death in general—seems to survive attempts to control for these confounding factors, but you don’t really know until you put it to the test.

    Does skipping breakfast lead to higher cholesterol, for example? Yes, researchers found a significant rise in LDL (bad) cholesterol in study participants randomized to skip breakfast; they were about 10 points higher within just two weeks, as you can see below and at 4:45 in my video.

    The Israeli study with the caloric distribution of 700 calories for breakfast, 500 for lunch, and 200 for dinner that I’ve discussed previously found that the triglycerides of the king-prince-pauper group (those eating more at breakfast versus dinner) got significantly better—a 60-point drop—while those of the pauper-prince-king group got significantly worse (a 26-point rise). So, consuming more calories in the morning relative to the evening may actually have a triple benefit: more weight loss, better blood sugar control, and lower heart disease risk, as you can see below and at 5:18 in my video.

    If you’re going to skip any meal, whether you’re practicing intermittent fasting or time-restricted feeding (where you try to fit all of your food intake into a certain time window each day), it may be safer and more effective to skip dinner rather than breakfast.

    I’m back with the next installment of the chronobiology series! I previously explored eating breakfast for weight loss (Is Breakfast the Most Important Meal for Weight Loss? and Is Skipping Breakfast Better for Weight Loss?), introduced chronobiology (How Circadian Rhythms Can Control Your Health and Weight), and looked at the science on eating more in the mornings than the evenings (Eat More Calories in the Morning to Lose Weight, Breakfast Like a King, Lunch Like a Prince, Dinner Like a Pauper, and Eat More Calories in the Morning Than the Evening).

    Next, you’ll see How to Sync Your Central Circadian Clock to Your Peripheral Clocks.

    The series will wrap up in the next couple of weeks. See videos and blogs in related posts below.

    Note: The Israeli 700/500/200 study that I mentioned is detailed in the Breakfast Like a King, Lunch Like a Prince, Dinner Like a Pauper video if you want to know more. Also, check the corresponding blog in related posts. 

    Michael Greger M.D. FACLM

  • Morning Calories vs. Evening Calories  | NutritionFacts.org

    Why are calories eaten in the morning less fattening than calories eaten in the evening? 

    One reason calories consumed in the morning are less fattening than those eaten in the evening is that more calories are burned off in the morning due to diet-induced thermogenesis. That’s the amount of energy the body takes to digest and process a meal, given off in part as waste heat. If people are given the same meal in the morning, afternoon, or night, their body uses up about 25 percent more calories to process it in the afternoon than at night and about 50 percent more calories to digest it in the morning, as you can see below and at 0:36 in my video Eat More Calories in the Morning Than the Evening. That leaves fewer net calories in the morning to be stored as fat.

    Let’s put some actual numbers to it. A group of Italian researchers randomized 20 people to eat the same standardized meal at either 8:00 am or 8:00 pm and had them return a week later to do the opposite. So, each person had a chance to eat the same meal for breakfast and dinner. After every meal, the study participants were placed in a “calorimeter” contraption to precisely measure how many calories they were burning over the next three hours. As you can see below and at 1:18 in my video, the researchers calculated that the meal given in the morning took about 300 calories to digest, whereas the same meal given at night only used up about 200 calories to process. The meal was about 1,200 calories, but, when eaten in the morning, it ended up only providing about 900 calories compared to more like 1,000 calories at night. Same meal, same food, same amount of food, but effectively 100 fewer calories when consumed in the morning rather than at night. So, a calorie is not just a calorie. It depends on when we eat it. 
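
    The net-calorie arithmetic from that calorimeter study can be laid out explicitly, using the approximate figures cited above:

```python
# Net calories from the Italian calorimeter study described above:
# the same ~1,200-calorie meal costs ~300 calories to process in the
# morning but only ~200 at night (approximate figures from the text).
meal_calories = 1200
dit_morning = 300   # diet-induced thermogenesis, 8:00 am meal
dit_evening = 200   # diet-induced thermogenesis, 8:00 pm meal

net_morning = meal_calories - dit_morning  # -> 900
net_evening = meal_calories - dit_evening  # -> 1000
print(net_evening - net_morning)  # the ~100-calorie difference
```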

    But why do we burn more calories when eating a morning meal? Is it behavioral or biological? If you started working the graveyard shift, sleeping during the day and working all night, which meal would net you fewer calories? Would it be the “breakfast” you had at night before you went to work or the “dinner” you had in the morning before you went to bed? In other words, is it something about eating before you go to sleep that causes your body to hold onto more calories, or is it built into our circadian rhythm, where we store more calories at night regardless of what we’re doing? You don’t know until you put it to the test.

    Harvard researchers randomized people to identical meals at 8:00 am versus 8:00 pm while under simulated night shifts or day shifts. Regardless of activity level or sleeping cycle, the number of calories that were burned processing the morning meals was 50 percent higher than in the evening, as you can see in the graph below and at 2:45 in my video. So, the difference is explained by chronobiology: It’s just part of our circadian rhythm to burn more meal calories in the morning. But, why? What exactly is going on? 

    How does it make sense for our body to waste calories in the morning when we have the whole day ahead of us? 

    Our body isn’t so much wasting calories as investing them. When we eat in the morning, our body bulks up our muscles with glycogen, which is the primary energy reserve our body uses to fuel our muscles, but this takes energy. In the evening, our body expects to be sleeping for much of the next 12 hours, so rather than storing blood sugar as extra glycogen in our muscles, it preferentially uses it as an energy source, which may end up meaning we burn less of our backup fuel (body fat). In the morning, however, our body expects to be running around all day, so instead of just burning off breakfast, our body continues to dip into its fat stores while we use breakfast calories to stuff our muscles full of the energy reserves we need to move around over the day. That’s where the “inefficiency” may come from. The reason it costs more calories to process a morning meal is that, instead of just burning glucose (blood sugar) directly, our body uses up energy to string glucose molecules together into chains of glycogen in our muscles, which are then just going to be broken back down into glucose later in the day. That extra assembly/disassembly step takes energy—energy that our body takes out of the meal, leaving us with fewer calories.

    So, in the morning, our muscles are especially sensitive to insulin, rapidly pulling blood sugar out of our bloodstream to build up glycogen reserves. At night, though, our muscles become relatively insulin-resistant and resist the signal to take in extra blood sugar. So, does that mean you get a higher blood sugar and insulin spike in the evening compared to eating the same meal in the morning? Yes. As you can see in the graph below and at 5:02 in my video, in that 100-calorie-difference study, for example, blood sugars rose twice as high after the 8:00 pm meal compared to the same meal eaten in the morning.

    So, shifting the bulk of our caloric intake towards the morning would appear to have a dual benefit: more weight loss and better blood sugar control, as shown in the graph below and at 5:12 in my video.

    If you thought dual benefits sounded good, stay tuned for triple benefits as I dive deeper into circadian rhythms. See related posts below.

    My last few videos (see below) focus on why science points to loading your calories towards the beginning of the day.

    Michael Greger M.D. FACLM

  • Fighting Cancer and the Common Cold with Garlic  | NutritionFacts.org

    Raw garlic is compared to roasted, stir-fried, simmered, and jarred garlic.

    Garlic lowers blood pressure, regulates cholesterol, and stimulates immunity. I’ve talked before about its effect on heart disease risk factors, but what about immunity? Eating garlic appears to offer the best of both worlds, dampening the overreactive face of the immune system by suppressing inflammation while boosting protective immunity—for example, the activity of our natural killer cells, which our body uses to purge cells that have been stricken by viruses or cancer. “In World War II garlic was called ‘Russian Penicillin’ because, after running out of antibiotics, the Soviet government turned to these ancient treatments for its soldiers,” but does it work? You don’t know until you put it to the test.

    How about preventing the common cold? As I discuss in my video Benefits of Garlic for Fighting Cancer and the Common Cold, it is perhaps “the world’s most widespread viral infection, with most people suffering approximately two to five colds per year.” In the first study “to use a double-blind, placebo-controlled design to investigate prevention of viral disease with a garlic supplement,” those randomized to the garlic suffered 60 percent fewer colds and were affected 70 percent fewer days. So, those on garlic not only had fewer colds, but they also recovered faster, suffering only one and a half days instead of five. Accelerated relief, reduced symptom severity, and faster recovery to full fitness. Interesting, but that study was done about two decades ago. What about all of the other randomized controlled trials? There aren’t any. There’s only that one trial to date. Still, the best available balance of evidence suggests that, indeed, “garlic may prevent occurrences of the common cold.”

    What about cancer? Is garlic “a stake through the heart of cancer?” As you can see below and at 2:05 in my video, various garlic supplements have been tested on cells in a petri dish or lab animals, but there weren’t any human studies to see if garlic could affect gene expression until now. 

    Researchers found that if you eat one big clove’s worth of crushed raw garlic, you get an alteration of the expression of your genes related to anti-cancer immunity within hours. You can see a big boost in the production of cancer-suppressing proteins like oncostatin when you drip garlic directly on cells in a petri dish, as shown in the graph below and at 2:25 in my video.   

    What’s more, you can also see boosted gene expression directly in your bloodstream within three hours of eating it, as seen below and at 2:34 in my video. Does this then translate into lower cancer risk?

    As you can see in the graph below and at 2:44 in my video, after putting together ten population studies, researchers found that those reporting higher consumption of garlic had only half the risk of stomach cancer.

    How do you define “high” garlic consumption? Each study was different, from a few times a month to every day. Regardless, those who ate more garlic appeared to have lower cancer rates than those who ate less, suggesting a protective effect. Stomach cancer is a leading cause of cancer-related death around the world, and garlic “is relatively cheap; the product is freely available and easy to incorporate into a daily diet in a palatable manner”—and safely, too, so why not? And, perhaps, the more, the better. 

    The only way to prove garlic can prevent cancer is to put it to the test. Thousands of individuals were randomized to receive seven years of a garlic supplement or a placebo. Those getting garlic did tend to develop less cancer and were less likely to die from it, as you can see below and at 3:35 in my video, but the findings were not statistically significant, meaning that could have just happened by chance.

    Why didn’t we see a more definitive result, given that garlic eaters appear to have much lower cancer rates? Well, the researchers didn’t give them garlic; they gave them garlic extract and oil pills. It’s possible that some of the purported active components weren’t preserved in supplement form. Indeed, one study of garlic supplements, for example, found that it might take up to 27 capsules to obtain the same amount of garlic goodness found in just half a clove of crushed raw garlic.  

    What happens if you cook garlic? If you compare raw chopped garlic to garlic simmered for 15 minutes, boiled for 6 minutes, or stir-fried for just 1 minute, you can get a three-fold drop in one of the purported active ingredients called allicin when you boil it, even more of a loss if you simmer it too long, and seemingly total elimination by even a single minute of stir-frying, as seen below and at 4:21 in my video. What about roasted garlic? Surprisingly, even though roasting is hotter than boiling, that cooking method preserved about twice as much. Raw garlic has the most, but it may be easier for some folks to eat two to three cloves of cooked garlic than even half a clove of raw. 

    What about pickled garlic or those jars of minced garlic packed in water or oil? Fancy, fermented black garlic? Though jarred garlic may be more convenient, it has comparatively less garlicky goodness, pickled garlic even less, and black garlic falls far behind, as you can see in the graph below and at 5:12 in my video.

    Can you eat too much? The garlic meta-analysis suggests there are no real safety concerns with side effects or overdosing, though that’s with internal use. You should not stick crushed garlic on your skin. It can cause irritation and, if left on long enough, can actually burn you. Wrap your knees with a garlic paste bandage or stick some on your back overnight, and you can end up burned, as seen below, and at 5:42 and 5:44 in my video.  

    Definitely don’t rub garlic on babies, even if you see an online article saying that topical garlic is good for respiratory disorders and your little one is congested. Below and at 5:57 in my video, you can see the blisters she got. The poor pumpkin! “It is crucial…to explain to patients that ‘natural’ does not equal ‘safe’…” 

    Don’t put it on your toes, don’t use it as a face mask, and don’t use it to try to get out of military service either.  

    If you just eat it like you’re supposed to, there shouldn’t be a problem. Some people can get an upset stomach if they eat too much, though, and you can’t really say there aren’t any side effects, given the “body odor and bad breath.”

    The other video I mentioned is Friday Favorites: Benefits of Garlic Powder for Heart Disease. What else can garlic do? See related posts below.

    And, for more on specific foods for fighting colds and cancer, check out related posts below. 

    Michael Greger M.D. FACLM

  • Lose Weight by Eating More in the Morning  | NutritionFacts.org

    A calorie is not a calorie. It isn’t only what you eat, but also when you eat.

    Mice are nocturnal creatures. They eat during the night and sleep during the day. However, if you only feed mice during the day, they gain more weight than if they are fed a similar number of calories at night. Same food and about the same amount of food, but different weight outcomes, as you can see in the graph below and at 0:18 in my video Eat More Calories in the Morning to Lose Weight, suggesting that eating at the “wrong” time may lead to disproportionate weight gain. In humans, the wrong time would presumably mean eating at night.

    Recommendations for weight management often include advice to limit nighttime food consumption, but this was largely anecdotal until it was first studied experimentally in 2013. Researchers instructed a group of young men not to eat after 7:00 pm for two weeks. Compared to a control period during which they continued their regular habits, they ended up about two pounds lighter after the night-eating restriction. This is not surprising, given that dietary records show the study participants inadvertently ate fewer calories during that time. To see if timing has metabolic effects beyond just foreclosing eating opportunities, you’d have to force people to eat the same amount of the same food, but at different times of the day. The U.S. Army stepped forward to carry out just such an investigation.

    In their first set of experiments, Army researchers had people eat a single meal a day either as breakfast or dinner. The results clearly showed the breakfast group lost more weight, as you can see in the graph below and at 1:35 in my video. When study participants ate only once a day at dinner, their weight didn’t change much, but when they ate once a day at breakfast, they lost about two pounds a week. 

    Similar to the night-eating restriction study, this is to be expected, given that people tend to be hungrier in the evening. Think about it. If you went nine hours without eating during the day, you’d be famished, but people go nine hours without eating overnight all the time and don’t wake up ravenous. There is a natural circadian rhythm to hunger that peaks around 8:00 pm and drops to its lowest level around 8:00 am, as you can see in the graph below and at 2:09 in my video. That may be why breakfast is typically the smallest meal of the day. 

    The circadian rhythm of our appetite isn’t just behavioral, but biological, too. It’s not just that we’re hungrier in the evening because we’ve been running around all day. If you stayed up all night and slept all day, you’d still be hungriest when you woke up that evening. To untangle the factors, scientists used what’s called a “forced desynchrony” protocol. Study participants stayed in a room without windows in constant, unchanging, dim light and slept in staggered 20-hour cycles to totally scramble them up. This went on for more than a week, so the subjects ended up eating and sleeping at different times throughout all phases of the day. Then, the researchers could see if cyclical phenomena are truly based on internal clocks or just a consequence of what you happen to be doing at the time.  

    For instance, there is a daily swing in our core body temperature, blood pressure, hormone production, digestion, immune activity, and almost everything else, but let’s use temperature as an example. As you can see in the graph below and at 3:21 in my video, our body temperature usually bottoms out around 4:00 am, dropping from 98.6°F (37°C) down to more like 97.6°F (36.4°C). Is this just because our body cools down as we sleep? No. By keeping people awake and busy for 24 hours straight, it can be shown experimentally that it happens at about the same time no matter what. It’s part of our circadian rhythm, just like our appetite. It makes sense, then, if you are only eating one meal per day and want to lose weight, you’d want to eat in the morning when your hunger hormones are at their lowest level. 

    Sounds reasonable, but it starts to get weird.

    The Army scientists repeated the experiment, but this time, they had the participants eat exactly 2,000 calories either as breakfast or as dinner, taking appetite out of the picture. The subjects weren’t allowed to exercise either. Same number of calories, so the same change in weight, right? No. As you can see in the graph below and at 4:18 in my video, the breakfast-only group still lost about two pounds a week compared to the dinner-only group. Two pounds of weight loss eating the same number of calories. That’s why this concept of chronobiology, meal timing—when to eat—is so important. 

    Isn’t that wild? That was a pretty extreme study, though. What about just shifting a greater percentage of calories to earlier in the day? That’s the subject of my next video: Breakfast Like a King, Lunch Like a Prince, Dinner Like a Pauper. First, let’s take a break from chronobiology to look at the Benefits of Garlic for Fighting Cancer and the Common Cold. Then, we’ll check out the other videos in the related posts below.

    If you missed the first three videos in this extended series, also check out related posts below. 

    Michael Greger M.D. FACLM

  • Milk Hormones and Female Infertility  | NutritionFacts.org

    Dairy consumption is associated with years of advanced ovarian aging, thought to be due to the steroid hormones or endocrine-disrupting chemicals in cow’s milk.
     
    When it comes to the amount of steroid hormones we are exposed to in the food supply, dairy “milk products supply about 60–80% of ingested female sex steroids.” I’ve talked about the effects of these estrogens and progesterone in men and prepubescent children, and how milk intake can spike estrogen levels within hours of consumption. You can see graphs illustrating these points from 0:25 in my video The Effects of Hormones in Milk on Infertility in Women. In terms of effects on women, I’ve discussed the increased endometrial cancer risk in postmenopausal women. What about reproductive-age women? Might dairy hormones affect reproduction? 
     
    We’ve known that “dairy food intake has been associated with infertility; however, little is known with regard to associations with reproductive hormones or anovulation.” How might dairy do it? By affecting how the uterus prepares, or by affecting the ovary itself? Researchers found that women who ate yogurt or cream had about twice the risk of sporadic anovulation, meaning failure of ovulation, so some months there was no egg to fertilize at all. Now, we know most yogurt is packed with sugar these days. Even plain Greek yogurt can have more sugar than a double chocolate glazed cake donut, but the researchers controlled for that and the results remained after adjusting for the sugar content, “which suggests that the risk of anovulation was independent of the sugar content included in many flavored yogurt products.” We don’t know if this was just a fluke or exactly what the mechanism might be, but if women skip ovulation here and there throughout their lives, might they end up with a larger ovarian reserve of eggs? 
     
    Women are starting to have their first baby later in life. As you can see in the graph below and at 2:02 in my video, there’s been a rise in women having babies when they’re in their late 30s and 40s.

    We used to think that women’s ovarian reserve of eggs stayed relatively stable until a rapid decline at about age 37, but now we know it appears to be more of a gradual loss of eggs over time. The graph below and at 2:22 in my video charts a steady loss starting at peak fertility in one’s 20s.

    This measures “antral follicle count,” which is an ultrasound test where you can count the number of “next batter up” eggs in the ovaries, as you can see below and at 2:31 in my video. It is probably the best reflection of true reproductive age. It’s a measure of ovarian reserve—how many eggs a woman has left.

    What does this have to do with diet? Researchers at Harvard looked at the association of various protein intakes with ovarian antral follicle counts among women having trouble getting pregnant. “Even though diminished ovarian reserve is one of the major causes of female infertility, the process leading to reproductive senescence [deterioration with age] currently is poorly understood. In light of emerging population trends towards delayed pregnancy, the identification of reversible factors (including diet) that affect the individual rates of reproductive decline might be of significant clinical value.”

    The researchers performed ultrasounds on all the women, studied their diets, and concluded that higher intake of dairy protein was associated with lower antral follicle counts—in other words, accelerated ovarian aging. The graph below and at 3:39 in my video shows what counts look like in nonsmokers: Significantly lower ovarian reserve (12.7 antral follicle counts) at the highest dairy intake, which would be like three ounces of cheese a day, compared to the lowest dairy intake (16.9 antral follicle counts).

    What do these numbers mean in terms of biological age? Is 16.9 down to 12.7 really that much of a difference? As you can see below and at 3:58 in my video, when you look at women with really robust ovaries, a follicle count of 16.9 is what you might see in a 36- or 37-year-old, whereas 12.7, which is what you can see in women eating the most dairy, is what you might see in a really fertile 50-year-old. So, we’re talking years’ worth of ovarian aging between the highest and lowest dairy consumers.

    While it wasn’t possible for the researchers to “identify the underlying mechanism linking higher dairy protein intake to lower AFC,” antral follicle count, they had educated guesses. (1) It could be the steroid hormones and growth factors or (2) “the contamination of milk products by pesticides and endocrine disrupting chemicals that may negatively impact” the development of these ovarian follicles and egg competence.

    “Regarding the former [the hormones], studies suggest that commercial milk (derived from both pregnant and non-pregnant animals) contains large amounts of estrogens, progesterone, and other placental hormones that are eventually released into the human food chain, with dairy intake accounting for 60–80% of the estrogens consumed. Dairy estrogens overcome [survive] processing, appear in raw whole cow’s and commercial milk products, are found in substantially higher concentrations with increasing amounts of milk fat, with no apparent difference between organic and conventional dairy products…” Hormones are just naturally in cows’ bodies, so they aren’t just in the ones injected with growth hormones. And, once these bovine hormones are inside the human body, they get converted to estrone and estradiol, the main active human estrogens. Following absorption, bovine steroids may then affect reproductive outcomes.

    The researchers asserted that further studies are needed and that “it is imperative that these findings are reproduced in prospective studies designed to clarify the biology underlying the observed associations. The latter might be crucial given that consumption of another species’ milk by humans is an evolutionary novel dietary behavior that has the potential to alter reproductive parameters and may have long-term adverse health effects.”

    The video I mentioned about the effects of these estrogens and progesterone in men and prepubescent children is The Effects of Hormones in Dairy Milk on Cancer.

    I talk about the effect of dairy estrogen on male fertility in Dairy Estrogen and Male Fertility.

    How else might diet affect fertility? See related posts below. 

    Michael Greger M.D. FACLM

  • Are Branched-Chain Amino Acids Good for Us?  | NutritionFacts.org

    I discuss why we may not want to exceed the recommended intake of protein.

    Diabetes isn’t just about the amount of body fat, but also the distribution of body fat. At 0:26 in my video Are BCAA (Branched-Chain Amino Acids) Healthy?, you can view cross-sections of thighs from two different patients using MRI. In the images, the fat shows up as white and the thigh muscle is black. At first glance, you might think the bottom cross-section has more fat since it’s ringed with more white. That is the subcutaneous fat, the fat under the skin. But, if you look at the top cross-section, you’ll see how the middle of the thigh muscle is more marbled with fat, like those really fatty Japanese beef steaks. That is the fat infiltrating into the muscle. In the graph below and at 0:48 in my video, the two cross sections are colored so you can see the different types of fat: the fat infiltrating the muscle in red, the fat between the muscles in green, and subcutaneous fat outside of the muscles and under the skin in yellow. If you add up all three types of fat, both of those thighs actually have the same amount of fat—just distributed differently.

    This seems to be the critical factor in terms of determining insulin resistance, the cause of type 2 diabetes. Researchers found that the subcutaneous adipose tissue, the fat right under the skin, was not associated with insulin resistance. Going back to the two cross sections, as seen below and at 1:20 in my video, it is healthier to have the bottom thigh with the thicker ring of subcutaneous fat but less fat infiltrating muscle than the top thigh with more fat present in the muscle.

    Is it possible a more plant-based diet also promotes a more healthful distribution of fat?

    We now know the effect of a vegetarian diet versus a conventional diabetic diet on thigh fat distribution in patients with type 2 diabetes. Researchers took 74 people with diabetes and randomly assigned them to follow either a vegetarian diet or a conventional diabetic diet. Both diets were calorie-restricted by the same number of calories. The vegetarian diet was also egg-free, and dairy was limited to a maximum of one serving of low-fat yogurt a day. What did the researchers find? The reduction in the more benign subcutaneous fat was comparable; it was about the same in both groups. However, the more dangerous fat—the fat lodged inside the muscle itself—“was reduced only in response to a vegetarian diet.” So, even getting the same number of calories, there can be a healthier weight loss on a more plant-based diet.

    Those eating strictly plant-based also had lower levels of fat stuck inside the individual muscle fibers themselves, which may help explain why vegans in particular are often found to have the lowest odds of diabetes. It is not just because vegans are generally slimmer either. Even if you match subjects pound for pound, there is significantly less fat inside the muscle cells of vegans compared to omnivores. This is a good thing, since storing fat in muscle cells “may be one of the primary causes of insulin resistance,” which is what’s behind both prediabetes and type 2 diabetes. On the other hand, if you put someone on a high-fat diet, the fat in their muscle cells shoots up by 54 percent in just a single week.

    What about a high-protein diet? It may undermine one of the principal benefits of weight loss by eliminating the weight-loss-induced improvement in insulin sensitivity. Researchers put obese individuals on a calorie-restricted diet of less than 1,400 calories a day until they lost 10 percent of their body weight. Half of the participants were getting a regular protein intake (73 grams a day), and the other half were on a higher-protein diet (about 105 daily grams). Normally, if you lose 10 percent of your body weight, your insulin resistance improves. That’s why it is so critical for obese individuals with type 2 diabetes to lose weight. However, the beneficial effect of a 10 percent weight loss was eliminated by the high protein intake. Those extra 32 grams of protein a day abolished the weight-loss benefit. “The failure to improve…insulin sensitivity in the WL-HP [weight-loss high-protein] group is clinically important because it reflects a failure to improve a major pathophysiological [cause-and-effect] mechanism involved in the development of T2D,” type 2 diabetes. In summary, the researchers concluded that they demonstrated that “the protein content of a weight loss diet can have profound effects on metabolic function.”

    Is this true of any protein? As you can see below and at 4:19 in my video, if you split it between animal protein versus plant protein, following people over time, intake of animal protein is associated with an increased risk of diabetes in most studies.

    Intake of plant protein, however, appears to have either a neutral or even protective association with diabetes, as shown below and at 4:25 in my video.

    Those were just observational studies, though. People who eat a lot of animal protein might have many unhealthy behaviors. However, you see the same thing in randomized, controlled, interventional trials, where you can improve blood sugar control just by replacing sources of animal protein with plant protein.

    We think it may be the branched-chain amino acids concentrated in animal protein. Higher levels in the bloodstream are associated with obesity and the development of insulin resistance. As you can see below and at 5:00 in my video, we may be able to drop our levels by sticking to plant proteins, but you don’t know if that has metabolic effects until you put it to the test. 

    Ruining the suspense, researchers titled their study: “Decreased Consumption of Branched-Chain Amino Acids Improves Metabolic Health.” They demonstrated that “a moderate reduction in total dietary protein or selected amino acids can rapidly improve metabolic health,” and this included improving blood sugar control, while also decreasing body mass index (BMI) and body fat. As you can see at 5:27 in my video, the protein-restricted group was eating hundreds more calories per day, significantly more calories than the control group, so they should have gained weight. But, no. They lost weight! After about a month and a half, they were eating more calories but lost more weight—about five more pounds than participants in the control group who were eating fewer calories, as you can see at 5:38 in my video. What’s more, this “protein restriction” had people eat the recommended amount of protein per day, about 56 daily grams. They should have been called the normal protein group or the recommended protein group instead, and the group eating more typically American protein levels and suffering because of it should have been called the excess protein group. Just sticking to the recommended protein intake doubled the levels of a pro-longevity hormone called FGF21, too, but we’ll save that for another discussion.

    To better understand the negative impact of omnivores getting too much protein relative to vegetarians, see my video Flashback Friday: Do Vegetarians Get Enough Protein?.

    I have several additional videos and blogs that may help explain some of the benefits of plant-based proteins. Check out the related posts below.

    Of course, the best way to treat type 2 diabetes is to get rid of it by treating the underlying cause, as described in my video How Not to Die from Diabetes.

    Michael Greger M.D. FACLM

  • The Pros of Early Time-Restricted Eating  | NutritionFacts.org

    Calories eaten in the morning count less than calories eaten in the evening, and they’re healthier, too.
     
    Time-restricted feeding, where you limit the same amount of eating to a narrow window, has benefits, but restricting that window to the evening also has adverse effects because you’re eating so much, so late, as you can see below and at 0:12 in my video The Benefits of Early Time-Restricted Eating.

    The best of both worlds was demonstrated in 2018 when researchers put time-restricted feeding into a narrow window earlier in the day. As you can see below and at 0:28 in my video, individuals who were randomized to eat the same food, but only during an 8:00 a.m. to 3:00 p.m. eating window, experienced a drop in blood pressure, oxidative stress, and insulin resistance, even when all of the study subjects were maintained at the same weight. Same food, same weight, but with different results. The drops in blood pressure were extraordinary, from 123/82 down to 112/72 in five weeks, and that is comparable to the effectiveness of potent blood-pressure drugs.


    The longest study to date on time-restricted feeding only lasted for 16 weeks. It was a pilot study without a control group that involved only eight people, but the results are still worth noting. Overweight individuals, who, like most of us, had been eating for more than 14 hours a day, were instructed to stick to a consistent 10- to 12-hour feeding window of their own choosing, as you can see below and at 1:17 in my video. On average, they successfully reduced their daily eating duration by about four and a half hours and lost seven pounds over the 16 weeks.

    They also reported feeling more energetic and sleeping better, as seen in the graph below and at 1:32 in my video. This may help explain why “all participants voluntarily expressed an interest in continuing unsupervised with the 10-11 hr time-restricted eating regimen after the conclusion of the 16-week supervised intervention.” You don’t often see that after weight-loss studies. 

    Even more remarkably, eight months later and even one year post-study, they had retained their improved energy and sleep (see the graph below and at 1:55 in my video), as well as their weight loss (see the graph below and at 1:58 in my video)—all from one of the simplest of interventions: sticking to a consistent 10- to 12-hour feeding window of their own choosing.

    How did it work? Even though the study "participants were not overtly asked to change nutrition quality or quantity," they appeared to unintentionally eat hundreds of fewer calories a day. With self-selected eating windows, you wouldn't necessarily expect circadian benefits, but because participants had been asked to keep the window consistent throughout the week, "metabolic jet lag could be minimized." The thinking is that because people tend to start their days later on weekends, they disrupt their own circadian rhythm. Indeed, it is as if they had flown a few time zones west on Friday evening, then flown back east on Monday morning, as you can see in the graph below and at 2:40 in my video. So, some of the metabolic advantages may have been due to maintaining a more regular eating schedule.


    Early or mid-day time-restricted feeding may have other benefits as well. Prolonged nightly fasting with reduced evening food intake has been associated with lower levels of inflammation and has also been linked to better blood sugar control, both of which might be expected to lower the risk of diseases, such as breast cancer. So, data were collected on thousands of breast cancer survivors to see if nightly fasting duration made a difference. Those who couldn’t go more than 13 hours every night without eating had a 36 percent higher risk of cancer recurrence. These findings have led to the suggestion that efforts to “avoid eating after 8 pm and fast for 13 h or more overnight may be a beneficial consideration for those patients looking to decrease cancer risk and recurrence,” though we would need a randomized controlled trial to know for sure. 
     
    Early time-restricted feeding may even play a role in the health of perhaps the longest-living population in the world, the Seventh-day Adventist Blue Zone in California. As you can see in the graph below and at 3:55 in my video, slim, vegetarian, nut-eating, exercising, non-smoking Adventists live about a decade longer than the general population. 

    Their greater life expectancy has been ascribed to these healthy lifestyle behaviors, but there's one lesser-known component that may also be playing a role. Historically, eating two large meals a day, breakfast and lunch, with a prolonged overnight fast was part of Adventist teachings. Today, only about one in ten Adventists surveyed eat just two meals a day. However, most of them, more than 60 percent, reported that breakfast or lunch was their largest meal of the day, as you can see below and at 4:26 in my video. Though this has yet to be studied with respect to longevity, frontloading one's calories earlier in the day with a prolonged nightly fast has been associated with significant weight loss over time. This led the researchers to conclude: "Eating breakfast and lunch 5–6 h apart and making the overnight fast last 18–19 h may be a useful practical strategy" for weight control. The weight may be worth the wait.


    For more on fasting, click here.
     
    My big takeaway from all of the intermittent fasting research I looked at is this: Whenever possible, eat earlier in the day. At the very least, avoid late-night eating whenever you can. Eating breakfast like a king and lunch like a prince, with or without an early dinner fit for a pauper, would probably be best.
     
    For more on fasting, fasting for disease reversal, and fasting and cancer, check the related videos below.  

    Michael Greger M.D. FACLM

  • What the Science Says About Time-Restricted Eating  | NutritionFacts.org

    Are there benefits to giving yourself a bigger daily break from eating? 
     
    The reason many blood tests are taken after an overnight fast is that meals can tip our system out of balance, bumping up certain biomarkers for disease, such as blood sugars, insulin, cholesterol, and triglycerides. Yet, as you can see in the graph below and at 0:20 in my video Time-Restricted Eating Put to the Test, fewer than one in ten Americans may even make it 12 hours without eating. As evolutionarily unnatural as getting three meals a day is, most of us are eating even more than that. One study used a smartphone app to record more than 25,000 eating events and found that people tended to eat about every three hours over an average span of about 15 hours a day. Might it be beneficial to give our bodies a bigger break? 

    Time-restricted feeding is “defined as fasting for periods of at least 12 hours but less than 24 hours,” and this involves trying to confine caloric intake to a set window of time, typically ranging from 3 to 4 hours, 7 to 9 hours, or 10 to 12 hours a day, which results in a daily fast lasting 12 to 21 hours. When mice are restricted to a daily feeding window, they gain less weight even when fed the same amount as mice “with ad-lib access.” Rodents have such high metabolisms, though, that a single day of fasting can starve away as much as 15 percent of their lean body mass. This makes it difficult to extrapolate from mouse models. You don’t know what happens in humans until you put it to the test. 
     
    The drop-out rates in time-restricted feeding trials certainly appear lower than those of most prolonged forms of intermittent fasting, suggesting it's more easily tolerated, but does it work? Researchers found that when people stopped eating between 7:00 p.m. and 6:00 a.m. for two weeks, they lost about a pound each week compared with no time restriction. Note that "there were no additional instructions or recommendations on the amount or type of food consumed," and no gadgets, calorie counting, or record-keeping either. The study participants were simply told to limit their food intake to the hours between 6:00 a.m. and 7:00 p.m., an intervention that's easy to understand and put into practice.
     
    The next logical step? Put it to the test for months instead of just weeks. Obese men and women were asked to restrict their eating to the eight-hour window between 10:00 a.m. and 6:00 p.m. Twelve weeks later, they had lost nearly seven pounds, as you can see in the graph below and at 2:18 in my video. This deceptively simple intervention may be operating from several different angles. People not only tend to eat more food later in the day but also to eat higher-fat foods later in the day. By eliminating eating in the late-evening hours, one removes prime-time snacking on the couch, a high-risk time for overeating. And, indeed, during the no-eating-after-7:00-p.m. study, the subjects were inadvertently eating about 250 fewer calories a day. Then, there are also the chronobiological benefits of avoiding late-night eating.

    I did a whole series of videos about the role our circadian rhythms have in the obesity epidemic, how the timing of meals can be critical, and how we can match meal timing to our body clocks. Just to give you a taste: Did you know that calories eaten at dinner are significantly more fattening than the same number of calories eaten at breakfast? See the table below and at 3:08 in my video.

    Calories consumed in the morning cause less weight gain than the same calories eaten in the evening. A diet with a bigger breakfast causes more weight loss than the same exact diet with a bigger dinner, as you can see in the graph below and at 3:21 in my video, and nighttime snacks are more fattening than the same snacks if eaten in the daytime. Thanks to our circadian rhythms, metabolic slowing, hunger, carbohydrate intolerance, triglycerides, and a propensity for weight gain are all things that go bump in the night.  


    What about the fasting component of time-restricted feeding? There's already the double benefit of getting fewer calories and avoiding night-time eating, but does the fact that you're fasting for 11 or 16 hours a day play any role, considering the average person may only make it about 9 hours a day without eating? How would you design an experiment to test that? What if you randomized people into two groups, had both groups eat the same number of calories a day and eat late into the evening, but had one group fast even longer, for 20 hours? That's exactly what researchers at the USDA and the National Institute on Aging did.
     
    Men and women were randomized to eat three meals a day or fit all of those same calories into a four-hour window between 5:00 p.m. and 9:00 p.m., then fast the rest of the day. If the weight-loss benefits from the other two time-restricted feeding studies were due to the passive calorie restriction or avoidance of late-night eating, then, presumably, both of these groups should end up the same because they’re both eating the same amount and they’re both eating late. That’s not what happened, though. As you can see below and at 4:49 in my video, after eight weeks, the time-restricted feeding group ended up with less body fat, nearly five pounds less. They got about the same number of calories, but they lost more weight. 

    As seen below and at 5:00 in my video, a similar study with an eight-hour eating window resulted in three more pounds of fat loss. So, there does seem to be something to giving your body daily breaks from eating around the clock.


    Because that four-hour eating window in the study was at night, though, the participants suffered the chronobiological consequences—significant elevations in blood pressure and cholesterol levels—despite the weight loss, as you can see below and at 5:13 in my video. The best of both worlds was demonstrated in 2018 with early time-restricted feeding, eating within a narrow window earlier in the day, which I covered in my video The Benefits of Early Time-Restricted Eating.


    Isn’t that mind-blowing about the circadian rhythm business? Calories in the morning count less and are healthier than calories in the evening. So, if you’re going to skip a meal to widen your daily fasting window, skip dinner instead of breakfast. 

    If you missed any of the other videos in this fasting series, check out the related videos below. 

    Michael Greger M.D. FACLM

  • A Look at the 5:2 Diet and the Fasting-Mimicking Diet  | NutritionFacts.org

    What are the effects of eating only five days a week or following a fasting-mimicking diet five days a month? 
     
    Instead of eating every other day, what if you ate five days a week and fasted the other two? As I discuss in my video The 5:2 Diet and the Fasting-Mimicking Diet Put to the Test, the available data are similar to those for alternate-day fasting: About a dozen pounds of weight loss was reported in both overweight men and overweight women over six months, with no difference found between participants on the 5:2 intermittent fasting regimen and those on a continuous 500-calorie-a-day restriction. The largest trial to date found an 18-pound weight loss within six months in the 5:2 group, which isn't significantly different from the 20 pounds lost in the continuous-calorie-restriction group. Weight maintenance over the subsequent six months was also found to be no different.
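One way to see why the two approaches deliver such similar weight loss is to tally the implied weekly calorie deficits. A minimal back-of-the-envelope sketch, assuming a hypothetical 2,000-calorie maintenance intake and reading the continuous arm as a 500-calorie daily deficit (both figures are my assumptions for illustration, not numbers from the trials):

```python
# Rough weekly calorie deficits for 5:2 vs. continuous restriction.
# MAINTENANCE is an assumed figure, not one reported in the studies.
MAINTENANCE = 2000  # assumed kcal/day to maintain weight
FAST_DAY = 500      # kcal eaten on each of the two 5:2 "fasting" days
DAILY_CUT = 500     # kcal cut per day on the continuous-restriction arm

weekly_deficit_52 = 2 * (MAINTENANCE - FAST_DAY)  # two fast days per week
weekly_deficit_continuous = 7 * DAILY_CUT         # a cut every single day

print(weekly_deficit_52)          # 3000 kcal/week
print(weekly_deficit_continuous)  # 3500 kcal/week
```

On those assumptions, the two regimens land within a few hundred calories a week of each other, which is consistent with the nearly identical weight loss reported.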
     
    Though feelings of hunger may be more pronounced on the 5:2 pattern than with an equivalent level of daily calorie cutting, it does not seem to lead to overeating on non-fasting days. One might expect that going two days without food would negatively impact mood, but no such adverse impact was noted for those fully fasting on zero calories or for those sticking to just two packets of oatmeal on each of the "fasting" days. (The oatmeal provides about 500 calories a day.) Like alternate-day fasting, the 5:2 pattern appeared to have inconsistent effects on cognition and on preserving lean mass, and it also failed to live up to the "popular notion" that intermittent fasting would be "easier" to adhere to than daily calorie restriction.
     
    Compared to those in the continuous-restriction control group, fewer subjects in the 5:2 group expressed interest in continuing their diet after the study was over. This was attributed to quality-of-life issues, with 5:2 participants citing headaches, lack of energy, and difficulty fitting the fasting days into their weekly routine. However, as you can see below and at 1:53 in my video, no 5:2 diet study has yet shown elevated LDL cholesterol compared with continuous calorie restriction at six months, nor at one year, which offers a potential advantage over alternate-day regimens.

    Instead of 5:2, what about 25:5, spending five consecutive days a month on a "fasting-mimicking diet" (FMD)? Longevity researcher Valter Longo designed a five-day meal plan to try to simulate the metabolic effects of fasting by being low in protein, sugars, and calories, with zero animal protein and zero animal fat. By making the diet plant-based, he hoped to lower the level of the cancer-promoting growth hormone IGF-1. He indeed accomplished this goal, along with a drop in markers of inflammation, after three cycles of his five-days-a-month program, as you can see below and at 2:33 in my video.

    One hundred men and women were randomized to consume his fasting-mimicking diet for five consecutive days per month or maintain their regular diet the whole time. As you can see in the graph below and at 2:47 in my video, after three months, the FMD group was down about six pounds compared to the control group, with significant drops in body fat and waist circumference, accompanied by a drop in blood pressure. 

    Those who were the worst off accrued the most dramatic benefits, as seen in the graph below and at 3:04 in my video. What’s even wilder is that three further months after completion, some of the benefits appeared to persist, suggesting the effects “may last for several months.” It’s unclear, though, if those randomized to the FMD group used it as an opportunity to make positive lifestyle changes that helped maintain some of the weight loss. 


    Dr. Longo created a company to market his meal plan commercially, but, to his credit, says “he does not receive a salary or a consulting fee from the company…and will donate 100% of his shares to charity.” The whole diet appears to be mostly dehydrated soup mixes, herbal teas like hibiscus and chamomile, kale chips, nut-based energy bars, an algae-based DHA supplement, and a multivitamin dusted with vegetable powder. Why spend 50 dollars a day on a few processed snacks when you could instead eat a few hundred calories a day of real vegetables? 
     
    How interesting was that? All-you-can-eat above-ground vegetables for five days would have the same low amount of protein, sugars, and calories with zero animal protein or animal fat. But we’ll probably never know if it works as well, better, or worse because it’s hard to imagine such a study ever getting done without the financial incentive. 

    To learn more about IGF-1, see my video Flashback Friday: Animal Protein Compared to Cigarette Smoking.
     
    In this series on fasting, I've covered several topics, including the basics of calories and weight loss, water-only fasting, and the types of alternate-day fasting. See them all in the related videos below.
     
    I close out the series with videos on time-restricted eating: Time-Restricted Eating Put to the Test and The Benefits of Early Time-Restricted Eating.
     
    If you want all of the videos in one place, I’ve done three webinars on fasting—Intermittent Fasting, Fasting for Disease Reversal, and Fasting and Cancer—and they’re all available for download now. 

    Michael Greger M.D. FACLM

  • Is Our Life Expectancy Extended by Intermittent Fasting?  | NutritionFacts.org

    Alternate-day modified fasting is put to the test for lifespan extension. 

    Is it true that alternate-day calorie restriction prolongs life? Doctors have anecdotally attributed improvements in a variety of disease states to alternate-day fasting, including asthma; seasonal allergies; autoimmune diseases, such as rheumatoid arthritis and osteoarthritis; infectious diseases, such as toenail fungus, periodontal disease, and viral upper respiratory tract infections; neurological conditions, such as Tourette’s syndrome and Meniere’s disease; atrial fibrillation; and menopause-related hot flashes. The actual effect on chronic disease, however, remains unclear, as I discuss in my video Does Intermittent Fasting Increase Human Life Expectancy?
     
    Alternate-day fasting has been put to the test for asthma in overweight adults, and researchers found that asthma-related symptoms and control significantly improved, as did the patients' quality of life and objective measurements of lung function and inflammation. As you can see in the graphs below and at 0:56 in my video, there were significant improvements in peak airflow, mood, and energy. Their weight also improved, with about a 19-pound drop in eight weeks, so it's hard to tease out the effects specific to fasting beyond the benefits we might expect from weight loss by any means.

    For the most remarkable study on alternate-day fasting, you have to go back more than half a century. Though the 2017 cholesterol findings were the most concerning data I could find on alternate-day fasting, the most enticing was published in Spain in 1956. The title of the study translates as "The Hunger Diet on Alternate Days in the Nutrition of the Aged." Inspired by the data being published on life extension with caloric restriction in lab rats, researchers split 120 residents of a nursing home in Madrid into two groups. Sixty residents continued to eat their regular diet, and the other half were put on an alternate-day modified fast. On the odd days of the month, they ate a regular 2,300-calorie diet; on the even days, they were given only a pound of fresh fruit and a liter of milk, estimated to add up to about 900 calories. This continued for three years. So, what happened?
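Before getting to the results, it's worth noting how moderate the regimen was when averaged out. A quick sketch using only the calorie figures reported above:

```python
# Average daily intake on the Madrid alternate-day modified fast,
# computed from the calorie figures reported in the 1956 study.
regular_day = 2300  # kcal eaten on odd days of the month
fast_day = 900      # kcal on even days (a pound of fruit plus a liter of milk)

average = (regular_day + fast_day) / 2
restriction = 1 - average / regular_day

print(average)                   # 1600.0 kcal/day on average
print(round(restriction * 100))  # 30, i.e. roughly a 30 percent restriction
```

So the fasting-group residents averaged roughly 1,600 calories a day, about a 30 percent reduction relative to their regular 2,300-calorie diet.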
     
    As you can see below and at 2:16 in my video, throughout the study, 13 participants died in the control group, compared to only 6 in the intermittent fasting group, but those numbers were too small to be statistically significant. 

    What was highly significant, though, was the number of days spent hospitalized: Residents in the control group spent a total of 219 days in the infirmary, whereas the alternate-day fasting group chalked up only 123 days, as you can see below and at 2:38 in my video.


    This is held up as solid evidence that alternate-day fasting may improve one's healthspan and potentially even one's lifespan, but a few caveats must be considered. It's not clear how the residents were allocated to their respective groups. If, instead of being randomized, healthier individuals were inadvertently placed in the intermittent fasting group, that could have skewed the results in their favor. In addition, it appears the director of the study was also in charge of medical decisions at the nursing home. In that role, he could unconsciously have been biased toward hospitalizing more folks in the control group. Given the progress that has been made in regulating human experimentation, it's hard to imagine such a trial being run today, so we may never know if such impressive findings can be replicated.

    Well, that was interesting! I had never even heard of that study until I started digging into the topic.  

    Check out my fasting series and popular videos on the subject here.  

    For more on longevity, see related videos below.



    Michael Greger M.D. FACLM
