ReportWire

Tag: processed foods

  • What Is the Role of Our Genes in the Obesity Epidemic?  | NutritionFacts.org


    The “fat gene” accounts for less than 1 percent of the differences in size between people.

To date, about a hundred genetic markers have been linked to obesity, but when you put them all together, they account for less than 3 percent of the difference in body mass index (BMI) between people. You may have heard about the “fat gene,” called FTO (short for FaT mass and Obesity-associated). It’s the gene most strongly linked to obesity, but it explains less than 1 percent of the difference in BMI between people, a mere 0.34 percent.

As I discuss in my video The Role of Genes in the Obesity Epidemic, FTO codes for a brain protein that appears to affect our appetite. Are you one of the billion people who carry the FTO susceptibility genes? It doesn’t matter much, because the gene only appears to result in a difference in intake of a few hundred extra calories a year. The energy imbalance that led to the obesity epidemic is on the order of hundreds of calories a day, and that’s the gene known so far to have the most effect. The chance of accurately predicting obesity risk based on FTO status is “only slightly better than tossing a coin.” In other words, no, those genes don’t make you look fat.

    When it comes to obesity, the power of our genes is nothing compared to the power of our fork. Even the small influence the FTO gene does have appears to be weaker among those who are physically active and may be abolished completely in those eating healthier diets. FTO only appears to affect those eating diets higher in saturated fat, which is predominantly found in meat, dairy, and junk food. Those eating more healthfully appear to be at no greater risk of weight gain, even if they inherited the “fat gene” from both of their parents.

    Physiologically, FTO gene status does not appear to affect our ability to lose weight. Psychologically, knowing we’re at increased genetic risk for obesity may motivate some people to eat and live more healthfully, but it may cause others to fatalistically throw their hands up in the air and resign themselves to thinking that it just runs in their family, as you can see in the graph below and at 2:11 in my video. Obesity does tend to run in families, but so do lousy diets. 

    Comparing the weight of biological versus adopted children can help tease out the contributions of lifestyles versus genetics. Children growing up with two overweight biological parents were found to be 27 percent more likely to be overweight themselves, whereas adopted children placed in a home with two overweight parents were 21 percent more likely to be overweight. So, genetics do play a role, but this suggests that it’s more the children’s environment than their DNA.

    One of the most dramatic examples of the power of diet over DNA comes from the Pima Indians of Arizona. As you can see in the graph below and at 3:05 in my video, they not only have among the highest rates of obesity, but they also have the highest rates of diabetes in the world. This has been ascribed to their relatively fuel-efficient genetic makeup. Their propensity to store calories may have served them well in times of scarcity when they were living off of corn, beans, and squash, but when the area became “settled,” their source of water, the Gila River, was diverted upstream. Those who survived the ensuing famine had to abandon their traditional diet to live off of government food programs and chronic disease rates skyrocketed. Same genes, but different diet, different result. 

    In fact, a natural experiment was set up. The Pima living over the border in Mexico come from the same genetic pool but were able to maintain more of their traditional lifestyle, sticking with their main staples of beans, wheat flour tortillas, and potatoes. Same genes, but seven times less obesity and about four times less diabetes. You can see those graphs below and at 3:58 and 4:02 in my video. Genes may load the gun, but diet pulls the trigger.

    Of course, it’s not our genes! Our genes didn’t suddenly change 40 years ago. At the same time, though, in a certain sense, it could be thought of as all in our genes. That’s the topic of my next video The Thrifty Gene Theory: Survival of the Fattest.

This is the second in an 11-video series on the obesity epidemic. If you missed the first one, check out The Role of Diet vs. Exercise in the Obesity Epidemic.


    Michael Greger M.D. FACLM


  • The Roles Diet and Exercise Play in the Obesity Epidemic  | NutritionFacts.org


The common explanations for the cause of the obesity epidemic put forward by the food industry and policymakers, such as inactivity or a lack of willpower, are not only wrong but actively harmful fallacies.

Obesity isn’t new, but the obesity epidemic is. We went from a few corpulent kings and queens, like Henry VIII or Louis VI (known as Louis le Gros, or “Louis the Fat”), to a pandemic of obesity, now considered to be “arguably the gravest and most poorly controlled public health threat of our time.” As you can see below and at 0:34 in my video The Role of Diet vs. Exercise in the Obesity Epidemic, about 37 percent of American men and 41 percent of American women are obese, with no end in sight. Earlier reports had suggested that the rise in obesity was at least slowing down, but even that doesn’t appear to be the case. Similarly, we had thought we were turning the corner on childhood obesity “[a]fter 35 years of unremittingly bad news,” but the bad news continues. Childhood and adolescent obesity rates have continued to rise, now into the fourth decade.

    Over the last century, obesity appears to have jumped ten-fold, from about 1 in 30 to now 1 in 3, but it wasn’t a steady rise. As you can see in the graph below and at 1:15 in my video, something seems to have happened around the late 1970s—and not just in the United States, but around the globe. The obesity pandemic took off at about the same time in the 1970s and 1980s in most high-income countries. The fact that the rapid rise “seemed to begin almost concurrently” across the industrialized world suggests a common cause. What might that trigger have been? 

    Any potential driver would have to be global and “coincide with the upswing of the epidemic.” So, the change would have had to have started about 40 years ago and would have had to have been able to spread rapidly around the globe. Let’s see how all the various theories stack up. For example, as you can see below and at 1:55 in my video, some have blamed changes in our built environment and shifts in city planning that have made our communities less conducive to walking, biking, and grocery shopping. That doesn’t meet our criteria for a credible cause, though, because there was no universal, simultaneous change in our neighborhoods within that time frame.

When researchers surveyed hundreds of policymakers, most blamed the obesity epidemic on a “lack of personal motivation.” Do you see how little sense that makes? In the United States, for example, obesity shot up across the entire population in the late 1970s, as you can see at 2:26 in my video. I concur with the researchers who “believe it is implausible that each age, sex, and ethnic group, with massive differences in life experience and attitudes, had a simultaneous decline in willpower related to healthy nutrition or exercise.” More plausible than a global change in our characters would be some global change in our lives.

The food industry blames inactivity. “If all consumers exercised,” said the CEO of PepsiCo, “obesity wouldn’t exist.” Coca-Cola went a step further, spending $1.5 million to create the Global Energy Balance Network to downplay the role of diet. Leaked emails show the company planned on using the front to “serve as a ‘weapon’ to ‘change the conversation’ about obesity in its ‘war’ with public health.”

This tactic is so common among food and beverage companies that it even has a name: “leanwashing.” You’ve heard of greenwashing, where companies deceptively pretend to be environmentally friendly. Leanwashing is the term used to describe companies that try to position themselves as helping to solve the obesity crisis when they’re instead directly contributing to it. For example, the largest food company in the world, Nestlé, has “rebranded itself as the ‘world’s leading nutrition, health and wellness company.’” Yes, that Nestlé, makers of Nesquik, Cookie Crisp, and historically more than a hundred different brands of candy, including Butterfinger, Kit Kat, Goobers, Gobstoppers, Runts, and Nerds. Another one of its slogans is “Good Food, Good Life.” Its Raisinets may have some fruit, but Nestlé seems to me more Willy Wonka than wellness.

    The constant corporate drumbeat of overemphasis on physical inactivity appears to be working. In response to the Harris poll question, “Which of these do you think are the major reasons why obesity has increased?,” a “huge majority of 83% chose lack of exercise, while only 34% chose excessive calorie consumption.” “Confusion about the effect of exercise on the energy balance” has been identified as one of the most common misconceptions about obesity. The scientific community has “come to a fairly decisive conclusion” that the factors governing calorie intake more powerfully affect overall calorie balance. It’s our fast food more than our slow motion. 

    “There is considerable debate in the literature today about whether physical activity has any role whatsoever in the epidemic of obesity that has swept the globe since the 1980s.” The increase in caloric intake per person is more than enough to explain the obesity epidemic in the United States and also explain it globally. If anything, the level of physical activity over the last few decades has gone up slightly in both Europe and North America. Ironically, this may be a result of the extra energy it takes to move around our heavier bodies, making it a consequence of the obesity problem rather than the cause.

    “Formal exercise plays a very small role in the total daily physical activity energy expenditure.” Think how much more physical work people used to do in the workplace, on the farm, or even in the home. It’s not just the shift in collar color from blue to white. Increasing automation, computerization, mechanization, motorization, and urbanization have all contributed to increasingly more sedentary lifestyles over the last century—and that’s the problem with the theory. The occupational shifts and advent of labor-saving devices “have been gradual and largely predated the dramatic increase in weight gain across the developed world in the past few decades.” Washing machines, vacuum cleaners, and the Model T were all invented before 1910. Indeed, when put to the test using state-of-the-art methods to measure energy in and energy out, it was caloric intake, not physical activity, that predicted weight gain over time. 

The common misconception that obesity is mostly due to lack of exercise may not just be a benign fallacy. Personal theories of causation appear to impact people’s weight. Those who blame insufficient exercise are significantly more likely to be overweight than those who implicate a poor diet. Put those who believe lack of exercise causes obesity in a room with chocolate, and they can covertly be observed consuming more candy. Those holding that view may be different in other ways, though. You can’t prove cause and effect until you put it to the test. And, indeed, as you can see in the graph below, and at 7:22 in my video, people randomized to read an article implicating inactivity went on to eat significantly more sweets than those reading about research implicating diet. A similar study found that those presented with research blaming genetics subsequently ate significantly more cookies. The title of that paper? “An Unintended Way in Which the Fat Gene Might Make You Fat.”

    When I sat down to write How Not to Diet, I knew this “what triggered the obesity epidemic” was going to be a big question I had to face. Was it inactivity (just kids sitting around playing video games or scrolling on their phones)? Was it genetic? Was it epigenetic (something turning on our fat genes)? Or was it just the food? Were we eating more fat all of a sudden? More carbs? More processed foods? Or were we just eating more period, because of bigger serving sizes or more snacking? Inquiring minds wanted to know. 

    This is the first in an 11-video series to answer this question, which I originally released in a two-hour webinar in 2020. Check out the webinar digital download here. Or, check them out in the related posts below.  


    Michael Greger M.D. FACLM


  • Headache and Migraine Relief from Foods  | NutritionFacts.org


    Plant-based diets are put to the test for treating migraine headaches.

    Headaches are one of the top five reasons people end up in emergency rooms and one of the leading reasons people see their doctors in general. One way to try to prevent them is to identify their triggers and avoid them. Common triggers for migraines include stress, smoking, hunger, sleep issues, certain foods (like chocolate, cheese, and alcohol), your menstrual cycle, or certain weather patterns (like high humidity).

In terms of dietary treatments, the so-called Father of Modern Medicine, William Osler, suggested trying a “strict vegetable diet.” After all, the nerve inflammation associated with migraines “may be reduced by a vegan diet as many plant foods are high in anti-inflammatory compounds and antioxidants, and likewise, meat products have been reported to have inflammatory properties.” It wasn’t put to the test, though, for another 117 years.

As I discuss in my video Friday Favorites: Foods That Help Headache and Migraine Relief, among study participants given a placebo supplement, half said they got better, while the other half said they didn’t. But, when put on a strictly plant-based diet, they did much better, experiencing a significant drop in the severity of their pain, as you can see in the graph below and at 1:08 in my video.

    Now, “it is possible that the pain-reducing effects of the vegan diet may be, at least in part, due to weight reduction.” The study participants lost about nine more pounds when they were on the plant-based diet for a month, as shown below, and at 1:22. 

Even just lowering the fat content of the diet may help. Those placed on a month of consuming less than 30 daily grams of fat (for instance, less than two tablespoons of oil all day) experienced “statistically significant decreases in headache frequency, intensity, duration, and medication intake”—a six-fold decrease in frequency and intensity, as you can see below and at 1:44 in my video. They went from three migraine attacks every two weeks down to just one a month. And, by “low fat,” the researchers didn’t mean SnackWell’s; they meant more fruits, vegetables, and beans. Before the food industry co-opted and corrupted the term, eating “low fat” meant eating an apple, for example, not Kellogg’s Apple Jacks.

Now, they were on a low-fat diet—about 10 percent fat for someone eating 2,500 calories a day. What about a diet with less than 20 percent fat compared to a more normal diet that’s still relatively lower in fat than average? As you can see below and at 2:22 in my video, the researchers saw the same significant drops in headache frequency and severity, including a five-fold drop in attacks of severe pain. Since the intervention involved at least a halving of saturated fat intake, which is mostly found in meat, dairy, and junk food, the researchers concluded that reduced consumption of saturated fat may help control migraine attacks—but the benefit may not come only from what sufferers are eating less of. There are compounds present in green vegetables that might bind to a migraine-triggering peptide known as calcitonin gene-related peptide, or CGRP.

Drug companies have been trying to come up with something that binds to CGRP, but the drugs have failed to be effective. They’re also toxic, which is a problem we don’t have with cabbage, as you can see below and at 3:01 in my video.

Green vegetables also have magnesium. Found throughout the food supply but most concentrated in green leafy vegetables, beans, nuts, seeds, and whole grains, magnesium is the central atom of chlorophyll, as shown below and at 3:15. So, you can see how much magnesium foods have in the produce aisle by the intensity of their green color. Although magnesium supplements do not appear to decrease migraine severity, they may reduce the number of attacks you get in the first place. You can ask your doctor about starting 600 mg of magnesium dicitrate every day, but note that magnesium supplements can cause adverse effects, such as diarrhea, so I recommend getting it the way nature intended—in the form of real food, not supplements.

    Any foods that may be particularly helpful? You may recall that I’ve talked about ground ginger. What about caffeine? Indeed, combining caffeine with over-the-counter painkillers, like Tylenol, aspirin, or ibuprofen, may boost their efficacy, at doses of about 130 mg for tension-type headaches and 100 mg for migraines. That’s about what you might expect to get in three cups of tea, as you can see below, and at 4:00 in my video. (I believe it is just a coincidence that the principal investigator of this study was named Lipton.) 

    Please note that you can overdo it. If you take kids and teens with headaches who were drinking 1.5 liters of cola a day and cut the soda, you can cure 90 percent of them. However, this may be a cola effect rather than a caffeine effect. 

    And, finally, one plant food that may not be the best idea is the Carolina Reaper, the hottest chili pepper in the world. It’s so mind-numbingly hot it can clamp off the arteries in your brain, as seen below and at 4:41 in my video, and you can end up with a “thunderclap headache,” like the 34-year-old man who ate the world’s hottest pepper and ended up in the emergency room. Why am I not surprised it was a man? 

    I’ve previously covered ginger and topical lavender for migraines. Saffron may help relieve PMS symptoms, including headaches. A more exotic way a plant-based diet can prevent headaches is by helping to keep tapeworms out of your brain.

Though hot peppers can indeed trigger headaches, they may also be used to treat them. Check out my video on relieving cluster headaches with hot sauce.


    Michael Greger M.D. FACLM


  • Children’s Cereals: Candy for Breakfast?  | NutritionFacts.org


    Plastering front-of-package nutrient claims on cereal boxes is an attempt to distract us from the incongruity of feeding our children multicolored marshmallows for breakfast.

    The American Medical Association started warning people about excess sugar consumption more than 75 years ago, based in part on our understanding that “sugar supplies nothing in nutrition but calories, and the vitamins provided by other foods are sapped by sugar to liberate these calories.” So, added sugars aren’t just empty calories, but negative nutrition. “Thus, the more added sugars one consumes, the more nutritionally depleted one may become.”

Given the “totality of publicly available scientific evidence,” the Food and Drug Administration (FDA) decided to make processed food manufacturers declare “added sugars” on their nutrition labels. The National Yogurt Association was livid and said it “continues to oppose the ‘added sugars’ declaration,” since it needed “‘added sugars’ to increase palatability” of its products. The junk food association questioned the science, whereas the ice cream folks seemed to imply that consumers are too stupid to “understand or know how to use the added sugar declaration,” so it’s better just to leave it off. The world’s biggest cereal company, Kellogg’s, took a similar tack, opposing it so as not “to confuse consumers.” Should the FDA proceed with such labeling against Kellogg’s objections, the cereal giant pressed that “an added sugars declaration…should be communicated as a footnote.” It claimed that its “goal is to provide consumers with useful information so they can make informed choices.” This is from a company that describes its Froot Loops as “packed with delicious fruity taste, fruity aroma, and bright colors.” Keep in mind that Froot Loops has more sugar than a Krispy Kreme doughnut, as you can see in the graph below and at 1:46 in my video Friday Favorites: Kids’ Breakfast Cereals as Nutritional Façade.

Froot Loops is more than 40 percent sugar by weight! You can see the cereal box’s Nutrition Facts label below and at 1:50 in my video.

    The tobacco industry used similar terms, such as “light,” “low,” and “mild” to make its products appear healthier—before it was barred from doing so. “Now sugar interests are fighting similar battles over whether their terminology, including ‘healthy,’ ‘natural,’ ‘naturally sweetened,’ and even ‘lightly sweetened,’ is deceptive to consumers.”

    But if you look at the side of a cereal box, as shown below and at 2:13 in my video, you can see all those vitamins and minerals that have been added. That was one of the ways the cereal companies responded to calls for banning sugary cereals. General Mills defended the likes of Franken Berry, Trix, and Lucky Charms for being fortified with essential vitamins. 

Sir Grapefellow, I learned, was a “grape-flavored oat cereal” complete with “sweet grape star bits”—that is, marshmallows. Don’t worry. It was “vitamin charged!” You can see that cereal box below and at 2:31 in my video.

    Sugary breakfast cereals, said Dr. Jean Mayer from Harvard, “are not a complete food even if fortified with eight or 10 vitamins.” Senator McGovern replied, “I think your point is well taken that these products may be mislabeled or more correctly called candy vitamins than cereals.” 

    Plastering nutrient claims on cereal boxes can create “a ‘nutritional façade’ around a product, acting to distract attention away” from unsavory qualities, such as excess sugar content. Researchers found that the “majority of parents misinterpreted the meaning of claims commonly used on children’s cereals,” raising significant public health concerns. Ironically, cereal boxes bearing low-calorie claims were found to have more calories on average than those without such a claim. The cereal doth protest too much. 

Even candy bar companies are getting in on the action, bragging about protein content because of some peanuts. Take the Baby Ruth, a candy bar that has 50 grams of sugar. Froot Loops could be considered breakfast candy, as the same serving would have 40 grams of sugar, as you can see below and at 3:45 in my video.

    Given that “research suggests that consumers believe front-of-package claims, perceive them to be government-endorsed, and use them to ignore the Nutrition Facts Panel,” there’s been a call from nutrition professionals to consider “an outright ban on all front-of-package claims.” The industry’s short-lived “Smart Choices” label, as you can see below and at 4:13 in my video, was met with disbelief when it was found adorning qualifying cereals like Froot Loops and Cookie Crisp. The processed food industry spent more than a billion dollars lobbying against the adoption of more informative labeling (a traffic-light approach), “opposing most aggressively the use of a red light suggesting that any food was too high in anything.” 

    I was invited to testify as an expert witness in a case against sugary cereal companies. (I donated my fee, of course.) Check out the related posts below for a video series and blogs that are a result of some of the research I did. 

    You may also be interested in videos and blogs on the food industry; see related posts below.


    Michael Greger M.D. FACLM


  • Is All Vegan Food Healthy?  | NutritionFacts.org


    How do healthier plant-based diets compare to unhealthy plant foods and animal foods when it comes to diabetes risk? 

    In my video on flexitarians, I discuss how the benefits of eating a plant-based diet are not all-or-nothing. “Simple advice to increase the consumption of plant-derived foods with compensatory [parallel] reductions in the consumption of foods from animal sources confers a survival advantage”— a live-longer advantage. The researchers call it a “pro-vegetarian” eating pattern, one that’s moving in the direction of vegetarianism, “a more gradual and gentle approach.” 

    If you’re dealing with a serious disease, though, like diabetes, completely “avoiding some problem foods is easier than attempting to moderate their intake. Clinicians would never tell an alcoholic to try to simply cut down on alcohol. Avoiding alcohol entirely is more effective and, in fact, easier for a problem drinker…Paradoxically, asking patients to make a large change may be more effective than making a slow transition. Diet studies show that recommending more significant changes increases the chances that patients can accomplish [them]. It may help to replace the common advice, ‘all things in moderation’ with ‘big changes beget big results.’ Success breeds success. After a few days or weeks of major dietary changes, patients are likely to see improvements in weight and blood glucose [sugar] levels—improvements that reinforce the dietary changes that elicited them. Furthermore, they may enjoy other health benefits of a plant-based diet” that may give them further motivation. 

    As you can see below and at 1:43 in my video Friday Favorites: Is Vegan Food Always Healthy?, those who choose to eat plant-based for their health say it’s mostly for “general wellness or general disease prevention” or to improve their energy levels or immune function, for example. 

    They felt it gives them a sense of control over their health, helps them feel better emotionally, improves their overall health, makes them feel better, and more, as shown below and at 1:48. Most felt it was very important for maintaining their health and well-being. 

    For the minority who used it for a specific health problem, mostly high cholesterol or weight loss, followed by high blood pressure and diabetes, most reported they felt it helped a great deal, as you can see below and at 2:14. 

Some choose plant-based diets for other reasons, such as animal welfare or global warming, and it looks like “ethical vegans” are more likely to eat sugary and fatty foods, like vegan donuts, compared to those eating plant-based because of religious or health concerns, as you can see below and at 2:26 in my video.

    The veganest vegan could make an egg- and dairy-free cake, covered with frosting, marshmallow fluff, and chocolate syrup, topped with Oreos, and served with a side of Doritos. Or, they may want fruit for dessert, but in the form of Pop-Tarts and Krispy Kreme pies. Vegan, yes. Healthy, no. 

    “Plant-based diets have been recommended to reduce the risk of type 2 diabetes (T2D). However, not all plant foods are necessarily beneficial.” In the pro-vegetarian scoring system I mentioned above, you get points for eating potato chips and French fries because they are technically plant-based, as you can see below and at 3:07 in my video, but Harvard researchers wanted to examine the association of not only an overall plant-based diet, but healthy and unhealthy versions. So, they created the same kind of pro-vegetarian scoring system, but it was weighted towards any sort of plant-based foods and against animal foods; then, they created a healthful plant-based diet index, where at least some whole plant foods took precedence and Coca-Cola and other sweetened beverages were no longer considered plants. Lastly, they created an unhealthful plant-based diet index by assigning positive scores to processed plant-based junk and negative scores for healthier plant foods and animal foods. 
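For readers who like to see the arithmetic, here is a toy sketch of that layered scoring idea. The food groups, equal weights, and function name are my own simplifications for illustration only; the researchers’ actual indices score each of 18 food groups by quintile of intake.

```python
# Illustrative sketch of three diet indices built from the same intake data,
# differing only in which food groups count for or against the score.
# (Food groups and equal weighting are simplified assumptions, not the
# researchers' actual method, which uses quintile-based scoring.)

PLANT_HEALTHY = {"whole grains", "fruits", "vegetables", "nuts", "legumes"}
PLANT_UNHEALTHY = {"refined grains", "sweets", "fries", "sugary drinks"}
ANIMAL = {"meat", "dairy", "fish", "eggs"}

def diet_index(servings, positive, negative):
    """Sum daily servings: groups in `positive` add, groups in `negative` subtract."""
    score = 0.0
    for group, amount in servings.items():
        if group in positive:
            score += amount
        elif group in negative:
            score -= amount
    return score

day = {"vegetables": 3, "fries": 2, "meat": 1, "whole grains": 2}

# Overall index: any plant food counts positively, animal foods negatively.
pdi = diet_index(day, PLANT_HEALTHY | PLANT_UNHEALTHY, ANIMAL)   # 3+2+2-1 = 6
# Healthful index: only whole plant foods count positively.
hpdi = diet_index(day, PLANT_HEALTHY, PLANT_UNHEALTHY | ANIMAL)  # 3+2-2-1 = 2
# Unhealthful index: junky plant foods count positively.
updi = diet_index(day, PLANT_UNHEALTHY, PLANT_HEALTHY | ANIMAL)  # 2-3-2-1 = -4
```

Note how the same day of eating scores well on the overall index but poorly on the unhealthful one, which is exactly why the three indices can pull apart the effects of healthy versus junky plant foods.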

    Their findings? As you can see below and at 3:51 in my video, a more plant-based diet, in general, was good for reducing diabetes risk, but eating especially healthy plant-based foods did better, nearly cutting risk in half, while those eating more unhealthy plant foods did worse, as shown in the graph below and at 4:03.

    Now, is that because they were also eating more animal foods? People often eat burgers with their fries, so the researchers separated the effects of healthy plant foods, less healthy plant foods, and animal foods on diabetes risk. And, they found that healthy plant foods were protectively associated, animal foods were detrimentally associated, and less healthy plant foods were more neutral when it came to diabetes risk. Below and at 4:32 in my video, you can see the graph that shows higher diabetes risk with more and more animal foods, no protection whatsoever with junky plant foods, and lower and lower diabetes risk associated with more and more healthy whole plant foods in the diet. So, they concluded that, yes, “plant-based diets…are associated with substantially lower risk of developing T2D.” However, it may not be enough to just lower the intake of animal foods; consumption of less healthy plant foods may need to decrease, too. 

As a physician, labels like vegetarian and vegan just tell me what you don’t eat, but there is a lot of unhealthy vegetarian fare, like French fries, potato chips, and soda pop. That’s why I prefer the term whole food, plant-based nutrition. That tells me what you do eat—a diet centered around the healthiest foods out there.

    The video I mentioned is Do Flexitarians Live Longer?

    You may also be interested in some of my past popular videos and blogs on plant-based diets. Check related posts below. 


    Michael Greger M.D. FACLM


  • Irregular Meals, Night Shifts, and Metabolic Harms  | NutritionFacts.org


    What can shift workers do to moderate the adverse effects of circadian rhythm disruption?

Shift workers may have higher rates of death from cardiovascular disease, including heart disease and stroke, as well as from diabetes, dementia, and cancer. Graveyard shift, indeed! But, is it just because they’re eating out of vending machines or not getting enough sleep? Highly controlled studies have recently attempted to tease out these other factors by putting people on the same diets with the same sleep—but at the wrong time of day. Redistributing eating to the nighttime resulted in elevated cholesterol and increases in blood pressure and inflammation. No wonder shift workers are at higher risk. Shifting meals to the night in a simulated night-shift protocol effectively turned about one-third of the subjects prediabetic in just ten days. Our bodies just weren’t designed to handle food at night, as I discuss in my video The Metabolic Harms of Night Shifts and Irregular Meals.

    Just as avoiding bright light at night can prevent circadian misalignment, so can avoiding night eating. We may have no control over the lighting at our workplace, but we can try to minimize overnight food intake, which has been shown to help limit the negative metabolic consequences of shift work. When we finally do get home in the morning, though, we may disproportionately crave unhealthy foods. In one experiment, 81 percent of participants in a night-shift scenario chose high-fat foods, such as croissants, out of a breakfast buffet, compared to just 43 percent of the same subjects during a control period on a normal schedule.

Shift work may also leave people too fatigued to exercise. But, even at the same physical activity levels, chronodisruption can affect energy expenditure. Researchers found that we burn 12 to 16 percent fewer calories while sleeping during the daytime compared to nighttime. Just a single improperly timed snack can affect how much fat we burn every day. Study subjects eating a specified snack at 10:00 am burned about 6 more grams of fat from their body than on the days they ate the same snack at 11:00 pm. That’s only about a pat and a half of butter’s worth of fat, but it was the identical snack, just given at a different time. The late snack group also suffered about a 9 percent bump in their LDL cholesterol within just two weeks.

Even just sleeping in on the weekends may mess up our metabolism. “Social jetlag is a measure of the discrepancy in sleep timing between our work days and free days.” From a circadian rhythm standpoint, if we go to bed late and sleep in on the weekends, it’s as if we flew a few time zones west on Friday evening, then flew back Monday morning. Travel-induced jet lag goes away in a few days, but what might the consequences be of constantly shifting our sleep schedule every week over our entire working career? Interventional studies have yet to put it to the test, but population studies suggest that those who have at least an hour of social jet lag a week (which may describe more than two-thirds of people) have twice the odds of being overweight. 

If sleep regularity is important, what about meal regularity? “The importance of eating regularly was highlighted early by Hippocrates (460–377 BC) and later by Florence Nightingale,” but it wasn’t put to the test until the 21st century. A few population studies had suggested that those eating meals irregularly were at a metabolic disadvantage, but the first interventional studies weren’t published until 2004. Subjects were randomized to eat their regular diets divided into six regular eating occasions a day or three to nine daily occasions in an irregular manner. Researchers found that an irregular eating pattern can cause a drop in insulin sensitivity and a rise in cholesterol levels, as well as reduce the calorie burn immediately after meals in both lean and obese individuals. The study participants ended up eating more, though, during the irregular meal pattern, so it’s difficult to disentangle the circadian effects. The fact that overweight individuals may overeat on an irregular pattern may be telling in and of itself, but it would be nice to see such a study repeated using identical diets to see if irregularity itself has metabolic effects.

    Just such a study was published in 2016: During two periods, people were randomized to eat identical foods in a regular or irregular meal pattern. As you can see in the graph below and at 4:47 in my video, during the irregular period, people had impaired glucose tolerance, meaning higher blood sugar responses to the same food.

    They also had lower diet-induced thermogenesis, meaning the burning of fewer calories to process each meal, as seen in the graph below and at 4:55 in my video.

    The difference in thermogenesis only came out to be about ten calories per meal, though, and there was no difference in weight changes over the two-week periods. However, diet-induced thermogenesis can act as “a satiety signal.” The extra work put into processing a meal can help slake one’s appetite. And, indeed, “lower hunger and higher fullness ratings” during the regular meal period could potentially translate into better weight control over the long term. 

The chronobiology series is winding down, with just two videos left: Shedding Light on Shedding Weight and Friday Favorites: Why People Gain Weight in the Fall.

If you missed any of the other videos, see the related posts below.

    [ad_2]

    Michael Greger M.D. FACLM

    Source link

  • Morning Calories vs. Evening Calories  | NutritionFacts.org

    Morning Calories vs. Evening Calories  | NutritionFacts.org

    [ad_1]

    Why are calories eaten in the morning less fattening than calories eaten in the evening? 

    One reason calories consumed in the morning are less fattening than those eaten in the evening is that more calories are burned off in the morning due to diet-induced thermogenesis. That’s the amount of energy the body takes to digest and process a meal, given off in part as waste heat. If people are given the same meal in the morning, afternoon, or night, their body uses up about 25 percent more calories to process it in the afternoon than at night and about 50 percent more calories to digest it in the morning, as you can see below and at 0:36 in my video Eat More Calories in the Morning Than the Evening. That leaves fewer net calories in the morning to be stored as fat.

    Let’s put some actual numbers to it. A group of Italian researchers randomized 20 people to eat the same standardized meal at either 8:00 am or 8:00 pm and had them return a week later to do the opposite. So, each person had a chance to eat the same meal for breakfast and dinner. After every meal, the study participants were placed in a “calorimeter” contraption to precisely measure how many calories they were burning over the next three hours. As you can see below and at 1:18 in my video, the researchers calculated that the meal given in the morning took about 300 calories to digest, whereas the same meal given at night only used up about 200 calories to process. The meal was about 1,200 calories, but, when eaten in the morning, it ended up only providing about 900 calories compared to more like 1,000 calories at night. Same meal, same food, same amount of food, but effectively 100 fewer calories when consumed in the morning rather than at night. So, a calorie is not just a calorie. It depends on when we eat it. 
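The calorie accounting in that study can be sketched as a quick back-of-the-envelope check. The 1,200-calorie meal and the roughly 300- versus 200-calorie digestion costs are the approximate figures reported above; the `net_calories` helper itself is just illustrative arithmetic, not anything from the study.

```python
# Back-of-the-envelope check of the Italian calorimeter study's figures.
# All numbers are the article's approximations; the helper is illustrative.

def net_calories(meal_calories, thermic_cost):
    """Calories effectively retained after diet-induced thermogenesis."""
    return meal_calories - thermic_cost

MEAL = 1200          # same standardized meal, morning and evening
MORNING_COST = 300   # ~calories burned digesting the 8:00 am meal
EVENING_COST = 200   # ~calories burned digesting the 8:00 pm meal

morning_net = net_calories(MEAL, MORNING_COST)  # ~900 calories retained
evening_net = net_calories(MEAL, EVENING_COST)  # ~1,000 calories retained

print(evening_net - morning_net)  # prints 100: the morning advantage
```

Same meal either way; only the thermic cost, and therefore the net, changes with the clock.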

    But why do we burn more calories when eating a morning meal? Is it behavioral or biological? If you started working the graveyard shift, sleeping during the day and working all night, which meal would net you fewer calories? Would it be the “breakfast” you had at night before you went to work or the “dinner” you had in the morning before you went to bed? In other words, is it something about eating before you go to sleep that causes your body to hold onto more calories, or is it built into our circadian rhythm, where we store more calories at night regardless of what we’re doing? You don’t know until you put it to the test.

    Harvard researchers randomized people to identical meals at 8:00 am versus 8:00 pm while under simulated night shifts or day shifts. Regardless of activity level or sleeping cycle, the number of calories that were burned processing the morning meals was 50 percent higher than in the evening, as you can see in the graph below and at 2:45 in my video. So, the difference is explained by chronobiology: It’s just part of our circadian rhythm to burn more meal calories in the morning. But, why? What exactly is going on? 

    How does it make sense for our body to waste calories in the morning when we have the whole day ahead of us? 

Our body isn’t so much wasting calories as investing them. When we eat in the morning, our body bulks up our muscles with glycogen, which is the primary energy reserve our body uses to fuel our muscles, but this takes energy. In the evening, our body expects to be sleeping for much of the next 12 hours, so rather than storing blood sugar as extra glycogen in our muscles, it preferentially uses it as an energy source, which may end up meaning we burn less of our backup fuel (body fat). In the morning, however, our body expects to be running around all day, so instead of just burning off breakfast, our body continues to dip into its fat stores while we use breakfast calories to stuff our muscles full of the energy reserves we need to move around over the day. That’s where the “inefficiency” may come from. The reason it costs more calories to process a morning meal is that, instead of just burning glucose (blood sugar) directly, our body uses up energy to string glucose molecules together into chains of glycogen in our muscles, which are then just going to be broken back down into glucose later in the day. That extra assembly/disassembly step takes energy—energy that our body takes from the meal, leaving us with fewer calories.

    So, in the morning, our muscles are especially sensitive to insulin, rapidly pulling blood sugar out of our bloodstream to build up glycogen reserves. At night, though, our muscles become relatively insulin-resistant and resist the signal to take in extra blood sugar. So, does that mean you get a higher blood sugar and insulin spike in the evening compared to eating the same meal in the morning? Yes. As you can see in the graph below and at 5:02 in my video, in that 100-calorie-difference study, for example, blood sugars rose twice as high after the 8:00 pm meal compared to the same meal eaten in the morning.

So, shifting the bulk of our caloric intake towards the morning would appear to have a dual benefit—more weight loss and better blood sugar control, as shown in the graph below and at 5:12 in my video.

If you thought dual benefits sounded good, stay tuned for triple benefits! I dive deeper into circadian rhythms in the related posts below.

    My last few videos (see below) focus on why science points to loading your calories towards the beginning of the day.

    [ad_2]

    Michael Greger M.D. FACLM

    Source link

  • Lose Weight by Eating More in the Morning  | NutritionFacts.org

    Lose Weight by Eating More in the Morning  | NutritionFacts.org

    [ad_1]

    A calorie is not a calorie. It isn’t only what you eat, but when you eat.

Mice are nocturnal creatures. They eat during the night and sleep during the day. However, if you only feed mice during the day, they gain more weight than if they are fed a similar number of calories at night. Same food and about the same amount of food, but different weight outcomes, as you can see in the graph below and at 0:18 in my video Eat More Calories in the Morning to Lose Weight, suggesting that eating at the “wrong” time may lead to disproportionate weight gain. In humans, the wrong time would presumably mean eating at night. 

    Recommendations for weight management often include advice to limit nighttime food consumption, but this was largely anecdotal until it was first studied experimentally in 2013. Researchers instructed a group of young men not to eat after 7:00 pm for two weeks. Compared to a control period during which they continued their regular habits, they ended up about two pounds lighter after the night-eating restriction. This is not surprising, given that dietary records show the study participants inadvertently ate fewer calories during that time. To see if timing has metabolic effects beyond just foreclosing eating opportunities, you’d have to force people to eat the same amount of the same food, but at different times of the day. The U.S. Army stepped forward to carry out just such an investigation.

    In their first set of experiments, Army researchers had people eat a single meal a day either as breakfast or dinner. The results clearly showed the breakfast group lost more weight, as you can see in the graph below and at 1:35 in my video. When study participants ate only once a day at dinner, their weight didn’t change much, but when they ate once a day at breakfast, they lost about two pounds a week. 

    Similar to the night-eating restriction study, this is to be expected, given that people tend to be hungrier in the evening. Think about it. If you went nine hours without eating during the day, you’d be famished, but people go nine hours without eating overnight all the time and don’t wake up ravenous. There is a natural circadian rhythm to hunger that peaks around 8:00 pm and drops to its lowest level around 8:00 am, as you can see in the graph below and at 2:09 in my video. That may be why breakfast is typically the smallest meal of the day. 

    The circadian rhythm of our appetite isn’t just behavioral, but biological, too. It’s not just that we’re hungrier in the evening because we’ve been running around all day. If you stayed up all night and slept all day, you’d still be hungriest when you woke up that evening. To untangle the factors, scientists used what’s called a “forced desynchrony” protocol. Study participants stayed in a room without windows in constant, unchanging, dim light and slept in staggered 20-hour cycles to totally scramble them up. This went on for more than a week, so the subjects ended up eating and sleeping at different times throughout all phases of the day. Then, the researchers could see if cyclical phenomena are truly based on internal clocks or just a consequence of what you happen to be doing at the time.  

For instance, there is a daily swing in our core body temperature, blood pressure, hormone production, digestion, immune activity, and almost everything else, but let’s use temperature as an example. As you can see in the graph below and at 3:21 in my video, our body temperature usually bottoms out around 4:00 am, dropping from 98.6°F (37°C) down to more like 97.6°F (36.4°C). Is this just because our body cools down as we sleep? No. By keeping people awake and busy for 24 hours straight, it can be shown experimentally that it happens at about the same time no matter what. It’s part of our circadian rhythm, just like our appetite. It makes sense, then, that if you are eating only one meal per day and want to lose weight, you’d want to eat it in the morning when your hunger hormones are at their lowest level. 

    Sounds reasonable, but it starts to get weird.

    The Army scientists repeated the experiment, but this time, they had the participants eat exactly 2,000 calories either as breakfast or as dinner, taking appetite out of the picture. The subjects weren’t allowed to exercise either. Same number of calories, so the same change in weight, right? No. As you can see in the graph below and at 4:18 in my video, the breakfast-only group still lost about two pounds a week compared to the dinner-only group. Two pounds of weight loss eating the same number of calories. That’s why this concept of chronobiology, meal timing—when to eat—is so important. 

    Isn’t that wild? Two pounds of weight loss a week eating the same number of calories! That was a pretty extreme study, though. What about just shifting a greater percentage of calories to earlier in the day? That’s the subject of my next video: Breakfast Like a King, Lunch Like a Prince, Dinner Like a Pauper. First, let’s take a break from chronobiology to look at the Benefits of Garlic for Fighting Cancer and the Common Cold. Then, we’ll resume checking other videos in the related posts below.

    If you missed the first three videos in this extended series, also check out related posts below. 

    [ad_2]

    Michael Greger M.D. FACLM

    Source link

  • Are Branched-Chain Amino Acids Good for Us?  | NutritionFacts.org

    Are Branched-Chain Amino Acids Good for Us?  | NutritionFacts.org

    [ad_1]

    I discuss why we may not want to exceed the recommended intake of protein.

    Diabetes isn’t just about the amount of body fat, but also the distribution of body fat. At 0:26 in my video Are BCAA (Branched-Chain Amino Acids) Healthy?, you can view cross-sections of thighs from two different patients using MRI. In the images, the fat shows up as white and the thigh muscle is black. At first glance, you might think the bottom cross-section has more fat since it’s ringed with more white. That is the subcutaneous fat, the fat under the skin. But, if you look at the top cross-section, you’ll see how the middle of the thigh muscle is more marbled with fat, like those really fatty Japanese beef steaks. That is the fat infiltrating into the muscle. In the graph below and at 0:48 in my video, the two cross sections are colored so you can see the different types of fat: the fat infiltrating the muscle in red, the fat between the muscles in green, and subcutaneous fat outside of the muscles and under the skin in yellow. If you add up all three types of fat, both of those thighs actually have the same amount of fat—just distributed differently.

    This seems to be the critical factor in terms of determining insulin resistance, the cause of type 2 diabetes. Researchers found that the subcutaneous adipose tissue, the fat right under the skin, was not associated with insulin resistance. Going back to the two cross sections, as seen below and at 1:20 in my video, it is healthier to have the bottom thigh with the thicker ring of subcutaneous fat but less fat infiltrating muscle than the top thigh with more fat present in the muscle.

Is it possible a more plant-based diet also promotes a more healthful distribution of fat?

    We now know the effect of a vegetarian diet versus a conventional diabetic diet on thigh fat distribution in patients with type 2 diabetes. Researchers took 74 people with diabetes and randomly assigned them to follow either a vegetarian diet or a conventional diabetic diet. Both diets were calorie-restricted by the same number of calories. The vegetarian diet was also egg-free, and dairy was limited to a maximum of one serving of low-fat yogurt a day. What did the researchers find? The reduction in the more benign subcutaneous fat was comparable; it was about the same in both groups. However, the more dangerous fat—the fat lodged inside the muscle itself—“was reduced only in response to a vegetarian diet.” So, even getting the same number of calories, there can be a healthier weight loss on a more plant-based diet.

    Those eating strictly plant-based also had lower levels of fat stuck inside the individual muscle fibers themselves, which may help explain why vegans in particular are often found to have the lowest odds of diabetes. It is not just because vegans are generally slimmer either. Even if you match subjects pound for pound, there is significantly less fat inside the muscle cells of vegans compared to omnivores. This is a good thing, since storing fat in muscle cells “may be one of the primary causes of insulin resistance,” which is what’s behind both prediabetes and type 2 diabetes. On the other hand, if you put someone on a high-fat diet, the fat in their muscle cells shoots up by 54 percent in just a single week.

What about a high-protein diet? It may undermine one of the principal benefits of weight loss by eliminating the weight-loss-induced improvement in insulin sensitivity. Researchers put obese individuals on a calorie-restricted diet of less than 1,400 calories a day until they lost 10 percent of their body weight. Half of the participants were getting more of a regular protein intake (73 grams a day), and the other half were on a higher-protein diet (about 105 daily grams). Normally, if you lose 10 percent of your body weight, your insulin resistance improves. That’s why it is so critical for obese individuals with type 2 diabetes to lose weight. However, the beneficial effect of a 10 percent weight loss was eliminated by the high protein intake. Those extra 32 grams of protein a day abolished the weight-loss benefit. “The failure to improve…insulin sensitivity in the WL-HP [weight-loss high-protein] group is clinically important because it reflects a failure to improve a major pathophysiological [cause-and-effect] mechanism involved in the development of T2D,” type 2 diabetes. In summary, the researchers concluded that they demonstrated “the protein content of a weight loss diet can have profound effects on metabolic function.” 

    Is this true of any protein? As you can see below and at 4:19 in my video, if you split it between animal protein versus plant protein, following people over time, intake of animal protein is associated with an increased risk of diabetes in most studies.

Intake of plant protein, however, appears to have either a neutral or even protective association with diabetes, as shown below and at 4:25 in my video.

    Those were just observational studies, though. People who eat a lot of animal protein might have many unhealthy behaviors. However, you see the same thing in randomized, controlled, interventional trials, where you can improve blood sugar control just by replacing sources of animal protein with plant protein.

    We think it may be the branched-chain amino acids concentrated in animal protein. Higher levels in the bloodstream are associated with obesity and the development of insulin resistance. As you can see below and at 5:00 in my video, we may be able to drop our levels by sticking to plant proteins, but you don’t know if that has metabolic effects until you put it to the test. 

Ruining the suspense, researchers titled their study: “Decreased Consumption of Branched-Chain Amino Acids Improves Metabolic Health.” They demonstrated that “a moderate reduction in total dietary protein or selected amino acids can rapidly improve metabolic health,” and this included improving blood sugar control, while also decreasing body mass index (BMI) and body fat. As you can see at 5:27 in my video, the protein-restricted group was eating hundreds more calories per day, significantly more calories than the control group, so they should have gained weight. But, no. They lost weight! After about a month and a half, they were eating more calories but lost more weight—about five more pounds than participants in the control group who were eating fewer calories, as you can see at 5:38 in my video. What’s more, this “protein restriction” had people eat the recommended amount of protein per day, about 56 daily grams. They should have been called the normal protein group or the recommended protein group instead, and the group eating more typical American protein levels and suffering because of it should have been called the excess protein group. Just sticking to the recommended protein intake doubled the levels of a pro-longevity hormone called FGF21, too, but we’ll save that for another discussion.

    To better understand the negative impact of omnivores getting too much protein relative to vegetarians, see my video Flashback Friday: Do Vegetarians Get Enough Protein?.

I have several additional videos and blogs that may help explain some of the benefits of plant-based proteins. Check the related posts below.

Of course, the best way to treat type 2 diabetes is to get rid of it by treating the underlying cause, as described in my video How Not to Die from Diabetes.

    [ad_2]

    Michael Greger M.D. FACLM

    Source link

  • Pre-Cut Vegetables and Endotoxins  | NutritionFacts.org

    Pre-Cut Vegetables and Endotoxins  | NutritionFacts.org

    [ad_1]

    Endotoxins can build up on pre-cut vegetables and undermine some of their benefits.

    You may remember when I introduced the endotoxin theory literature in my video The Exogenous Endotoxin Theory, which sought to explain how a single Sausage and Egg McMuffin meal could cripple artery function within hours of consumption. Maybe it’s because such a meal causes inflammation within hours of consumption by inducing low-grade endotoxemia, endotoxins in the bloodstream, as I previously discussed in my video Dead Meat Bacteria Endotoxemia. Endotoxins are structural components of gram-negative bacteria like E. coli, as you can see below and at 0:35 in my video Are Pre-Cut Vegetables Just as Healthy?. Certain foods, like ground meat, have high bacterial loads, so the thought was that the endotoxins in the food were triggering the inflammation.

Critics of the theory argued that because we already have so many bacteria living in our colon, so many endotoxins just sitting in our large intestine, a few more endotoxins in our food wouldn’t matter much in terms of causing systemic inflammation. After all, we have about two pounds of pure bacteria down there where the sun don’t shine, so there could be about a whole ounce of endotoxin. The lethal dose of intravenously injected endotoxin can be just a few millionths of a gram, so we could have a million lethal doses down there. However, the apparent paradox is explained by compartmentalization. It’s location, location, location.
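The orders of magnitude here are easy to verify. The figures below simply convert the approximations above into grams; the one-ounce endotoxin estimate is the rough guess stated in the text, and the lethal dose is taken as ~3 micrograms for “a few millionths of a gram.”

```python
# Order-of-magnitude check of the "million lethal doses" claim.
# All inputs are the article's own approximations converted to grams;
# 3 µg is an assumed reading of "a few millionths of a gram."

ENDOTOXIN_G = 28.35         # "about a whole ounce" of endotoxin in the colon
LETHAL_IV_DOSE_G = 3e-6     # lethal intravenous dose, taken as ~3 µg

lethal_doses = ENDOTOXIN_G / LETHAL_IV_DOSE_G
print(lethal_doses > 1_000_000)  # prints True: comfortably over a million doses
```

Even with a more conservative lethal dose of several micrograms, the ratio stays in the millions, which is why compartmentalization, not quantity, resolves the paradox.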

    Poop is harmless when it’s in your colon, but it shouldn’t be injected into your bloodstream or eaten for that matter, particularly with fat, as that can promote the absorption of endotoxins in the small intestine. That goes for well-cooked poop, too.

As you can see in the graph below and at 1:44 in my video, you can boil endotoxins for two hours straight with no detriment in their ability to induce inflammation. You could easily kill off any bacteria if you boiled your poop soup long enough, but you can’t kill off the endotoxins they make, just like you can’t cook the crap out of the meat. The consumption of meat contaminated with feces doesn’t just cause food poisoning. Fecal matter can spill out onto the animal’s skin during the evisceration process when the digestive tract is ruptured. 

Even when slaughterhouse workers trim off “visible fecal contamination,” the trimming itself can, ironically, sometimes lead to an increase in certain fecal bacteria, thought to be caused by “cross-contamination resulting from the handling to remove fecal contamination” from one carcass to the next. Then, even when properly stored in the fridge, endotoxins start accumulating along with the bacterial growth, as you can see in the graph below and at 2:30 in my video.

    What about other foods? The highest levels of endotoxins were found in meat and dairy, and the lowest levels in fresh fruits and vegetables. That was testing whole fruits and vegetables, though. “Most spoilage organisms cannot penetrate the plant’s surface barrier and spoil the inner tissues.” That’s why fruits and veggies can sit out in the fields all day in the sun. But, once you cut them open, bacteria can gain access to the inner tissues, and, within a matter of days, your veggies can start to spoil. So, what does that mean for all those convenient pre-cut veggies these days?

    While endotoxins were not detectable in the majority of unprocessed vegetables, once you damage the protective outer layers of vegetables, you diminish their resistance to microbial growth. So, while freshly cut carrots and onions start with undetectable levels, day after day after they’ve been chopped, you start to get the growth of bacteria and, along with them, endotoxin buildup—even if they’ve been kept chilled in the fridge. Not as much as meat, but not insignificant either, as you can see in the graph below and at 3:27 in my video. Enough to make a difference, though? You don’t know until you put it to the test.

What would happen if you switched people from foods expected to have a lower endotoxin load to foods containing more endotoxins? For instance, going from intact meat, such as a steak, and whole fruits and vegetables, to ground beef, pre-cut veggies, and more ready-made meals, as shown below and at 3:39 in my video. After just one week on the lower-endotoxin diet, people’s white blood cell count, which is an indicator of total body inflammation, dropped by 12 percent, then bumped back up by 14 percent after just four days on the higher-endotoxin diet. They also lost a pound and a half on the lower-endotoxin diet and slimmed their waists a bit. 

They weren’t eating otherwise identical diets, though. It looks like they were eating more meat and cheese on the higher-endotoxin diet and perhaps getting more food additives in the ready-made meals. So, how do we know endotoxins had anything to do with it? That’s where the onion study comes in. Another study was designed based on two meals that differed in their content of bacterial products but were otherwise nutritionally identical. So, researchers compared freshly chopped onion to pre-chopped onion that had been refrigerated for a few days. The pre-chopped onion wasn’t spoiled; it was still before the “best before” date. So, would it make any difference?

    Within three hours of consumption, the fresh onion meal caused significant reductions in several markers of inflammation. That’s what fruits and vegetables do—they reduce inflammation—but these effects were not observed after eating the pre-chopped onions. For example, three hours after eating freshly chopped onions, researchers saw a significant drop in inflammatory status, but there was no significant change three hours after eating the same amount of pre-chopped onions, as you can see in the graph below and at 5:06 in my video. So, it’s not like the pre-chopped onions caused more inflammation, like in the meat, eggs, and dairy studies, but it did appear that some of the onion’s anti-inflammatory effects were extinguished. “In conclusion, the modern trend towards eating minimally processed vegetables”—pre-cut vegetables—“rather than whole [intact] foods is likely to be associated with increased oral endotoxin exposure.” It’s still better to eat pre-cut veggies than no veggies, but cutting your own might be the healthiest.

For some other practical veggie videos and blogs, check out the related posts below. 

    [ad_2]

    Michael Greger M.D. FACLM

    Source link

  • Are Fortified Children’s Breakfast Cereals Just Candy?  | NutritionFacts.org

    Are Fortified Children’s Breakfast Cereals Just Candy?  | NutritionFacts.org

    [ad_1]

    The industry responds to the charge that breakfast cereals are too sugary.

    In 1941, the American Medical Association’s Council on Foods and Nutrition was presented with a new product, Vi-Chocolin, a vitamin-fortified chocolate bar, “offered ostensibly as a specialty product of high nutritive value and of some use in medicine, but in reality intended for promotion to the public as a general purpose confection, a vitaminized candy.” Surely, something like that couldn’t happen today, right? Unfortunately, that’s the sugary cereal industry’s business model.

As I discuss in my video Are Fortified Kids’ Breakfast Cereals Healthy or Just Candy?, nutrients are added to breakfast cereals as a marketing gimmick to “create an aura of healthfulness…If those nutrients were added to soft drinks or candy, would we encourage kids to consume them more often?” Would we feed our kids Coke and Snickers for breakfast? We might as well spray cotton candy with vitamins, too. As one medical journal editorial read, “Adding vitamins and minerals to sugary cereals…is worse than useless. The subtle message accompanying such products is that it is safe to eat more.”

General Mills’ “Grow up strong with Big G kids’ cereals” ad campaign featured products like Lucky Charms, Trix, and Cocoa Puffs. That’s like the dairy industry promoting ice cream as a way to get your calcium. Kids who eat presweetened breakfast cereals may get more than 20 percent of their daily calories from added sugar, as you can see below and at 1:28 in my video.

Most sugar in the American diet comes from beverages like soda, but breakfast cereals represent the third largest food source of added sugars in the diets of children and adolescents, wedged between candy and ice cream. On a per-serving basis, there is more added sugar in a cereal like Frosted Flakes than there is in frosted chocolate cake, a brownie, or even a frosted donut, as you can see below and at 1:48 in my video.

    Kellogg’s and General Mills argue that breakfast cereals only contribute a “relatively small amount” of sugar to the diets of children, less than soda, for example. “This is a perfect example of the social psychology phenomenon of ‘diffusion of responsibility.’ This behavior is analogous to each restaurant in the country arguing that it should not be required to ban smoking because it alone contributes only a tiny fraction to Americans’ exposure to secondhand smoke.” In fact, “each source of added sugar…should be reduced.”

    The industry argues that most of their cereals have less than 10 grams of sugar per serving, but when Consumer Reports measured how much cereal youngsters actually poured for themselves, they poured about 50 percent more than the suggested serving size for most of the tested cereals. The average portion of Frosted Flakes contained 18 grams of sugar, which is 4½ teaspoons or 6 sugar packets’ worth. It’s been estimated that a “child eating one serving per day of a children’s cereal containing the average amount of sugar would consume nearly 1,000 teaspoons of sugar in a year.”
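
The arithmetic behind those figures is easy to sanity-check. Here is a minimal sketch, assuming the typical conversions of roughly 4 grams of sugar per teaspoon and 3 grams per sugar packet, plus an average sugar content of about 11 grams per labeled children's-cereal serving; none of these conversion values come from the article itself:

```python
# Back-of-the-envelope check of the sugar figures quoted above.
# Assumed conversions (not stated in the article): ~4 g sugar per teaspoon,
# ~3 g per sugar packet, ~11 g sugar per average children's-cereal serving.
GRAMS_PER_TEASPOON = 4.0
GRAMS_PER_PACKET = 3.0

poured_portion_g = 18  # average self-poured Frosted Flakes portion
print(poured_portion_g / GRAMS_PER_TEASPOON)  # teaspoons per portion -> 4.5
print(poured_portion_g / GRAMS_PER_PACKET)    # sugar packets -> 6.0

avg_serving_g = 11  # assumed average sugar per labeled serving
print(avg_serving_g / GRAMS_PER_TEASPOON * 365)  # roughly 1,000 teaspoons a year
```

Under those assumptions, one labeled serving a day does indeed work out to on the order of a thousand teaspoons of sugar annually.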

    General Mills offers the “Mary Poppins defense,” arguing that those spoonfuls of sugar can “help the medicine go down” and explaining that “if sugar is removed from bran cereal, it would have the consistency of sawdust.” As you can see below and at 3:17 in my video, a General Mills representative wrote that the company is presented “with an untenable choice between making our healthful foods unpalatable or refraining from advertising them.” If it can’t add sugar to its cereals, they would be unpalatable? If one has to add sugar to a product to make it edible, that should tell us something. That’s a characteristic of so-called ultra-processed foods, where you have to pack them full of things like sugar, salt, and flavorings “to give flavor to foods that have had their [natural] intrinsic flavors processed out of them and to mask any unpleasant flavors in the final product.”

    The president of the Cereal Institute argued that without sugary cereals, kids might not eat breakfast at all. (This is similar to dairy industry arguments that removing chocolate milk from school cafeterias may lead to students “no longer purchasing school lunch.”) He also stressed we must consider the alternatives. As Kellogg’s director of nutrition once put it: “I would suggest that Fruit [sic] Loops as a snack are much better than potato chips or a sweet roll.” You know there’s a problem when the only way to make your product look good is to compare it to Pringles and Cinnabon.

    Want a healthier option? Check out my video Which Is a Better Breakfast: Cereal or Oatmeal?.

    For more on the effects of sugar on the body and if you like these more politically charged videos see the related posts below.

    Finally, for some additional videos on cereal, see Kids’ Breakfast Cereals as Nutritional Façade and Ochratoxin in Breakfast Cereals.

    Michael Greger M.D. FACLM

  • Restricting Calories for Longevity?  | NutritionFacts.org



    Though a bane for dieters, a slower metabolism may actually be a good thing.

    We’ve known for more than a century that calorie restriction can increase the lifespan of animals, and metabolic slowdown may be the mechanism. That could be why the tortoise lives ten times longer than the hare. Rabbits can live for 10 to 20 years, whereas “Harriet,” a tortoise “allegedly collected from the Galapagos Islands by Charles Darwin, was estimated to be about 176 years old when she died in 2006.” Slow and steady may win the race. 
     
    As I discuss in my video The Benefits of Calorie Restriction for Longevity, one of the ways our body lowers our resting metabolic rate is by creating cleaner-burning, more efficient mitochondria, the power plants that fuel our cells. It’s like our body passes its own fuel-efficiency standards. These new mitochondria create the same energy with less oxygen and produce less free radical “exhaust.” After all, when our body is afraid famine is afoot, it tries to conserve as much energy as it can. 
     
    Indeed, the largest caloric restriction trial to date found metabolic slowing and a reduction in free radical-induced oxidative stress, both of which may slow the rate of aging. The flame that burns twice as bright burns half as long. But, whether this results in greater human longevity is an unanswered question. Caloric restriction is often said “to extend lifespan in every species studied,” but that isn’t even true of all strains within a single species. Two authors of one article, for instance, don’t even share the same view: One doesn’t think calorie restriction will improve human longevity at all, while the other suggests that a 20 percent calorie restriction starting at age 25 and sustained for 52 years could add five years onto your life. Either way, the reduced oxidative stress would be expected to improve our healthspan. 
     
    Members of the Calorie Restriction Society, self-styled CRONies (for Calorie Restriction with Optimal Nutrition), appear to be in excellent health, but they’re a rather unique, self-selected group of individuals. You don’t really know until you put it to the test. Enter the CALERIE study, the Comprehensive Assessment of Long-Term Effects of Reducing Intake of Energy, the first clinical trial to test the effects of caloric restriction.
     
    Hundreds of non-obese men and women were randomized to two years of 25 percent calorie restriction. They only ended up achieving half that, yet they still lost about 18 pounds and three inches off their waists, wiping out more than half of their visceral abdominal fat, as you can see in the graph below and at 2:47 in my video.

    That translated into significant improvements in cholesterol levels, triglycerides, insulin sensitivity, and blood pressure, which you can see in the graph below and at 2:52 in my video. Eighty percent of those who were overweight when they started were normal-weight by the end of the trial, “compared with a 27% increase in those who became overweight in the control group.” 

    In the famous Minnesota Starvation Study that used conscientious objectors as guinea pigs during World War II, the study subjects suffered both physically and psychologically, experiencing depression, irritability, and loss of libido, among other symptoms. The participants started out lean, though, and had their calorie intake cut in half. The CALERIE study ended up being only about a quarter as restrictive, about 12 percent below baseline calorie intake, and enrolled normal-weight individuals, which in the United States these days means overweight on average. As such, the CALERIE trial subjects experienced nothing but positive quality-of-life benefits, with significant improvements in mood, general health, sex drive, and sleep. They only ended up eating about 300 fewer calories a day than they had eaten at baseline. So, they got all of these benefits—the physiological benefits and the psychological benefits—just from cutting about a small bag of chips’ worth of calories from their daily diets.
     
    What happened at the end of the trial, though? As researchers saw in the Minnesota Starvation Study and in calorie deprivation experiments done on Army Rangers, as soon as the subjects were released from restriction, they tended to rapidly regain the weight and sometimes even more, as you can see below and at 4:18 in my video.

    The leaner they started out, the more their bodies seemed to drive them to overeat to pack back on the extra body fat, as seen in the graph below and at 4:27 in my video. In contrast, after the completion of the CALERIE study, even though their metabolism was slowed, the participants retained about 50 percent of the weight loss two years later. They must have acquired new eating attitudes and behaviors that allowed them to keep their weight down. After extended calorie restriction, for example, cravings for sugary, fatty, and junky foods may actually go down.

    This is part of my series on calorie restriction, intermittent fasting, and time-restricted eating. See related videos below.

    Michael Greger M.D. FACLM

  • Sugar and Gaining Weight  | NutritionFacts.org



    The sugar industry responds to evidence implicating sweeteners in the obesity epidemic. 
     
    In terms of excess body fat, the “well-documented obesity epidemic may merely be the tip of the overfat iceberg.” It’s been estimated that 91 percent of adults—nine out of ten of us—and 69 percent of children in the United States are overfat, a condition defined as having “excess body fat sufficient to impair health.” This can occur even in individuals who are “normal-weight and non-obese, often due to excess abdominal fat.” The way to tell if you’re overfat is if your waist circumference is more than half your height. What’s causing this epidemic? As I discuss in my video Does Sugar Lead to Weight Gain?, one primary cause may be all the added sugars we’re eating.
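
The waist-to-height rule of thumb above is simple enough to express directly. A minimal sketch; the function name and the sample measurements are illustrative, not from the article:

```python
def is_overfat(waist: float, height: float) -> bool:
    """Waist-to-height screen described above: flag as 'overfat'
    when waist circumference exceeds half of height (same units)."""
    return waist / height > 0.5

# Illustrative measurements, in centimeters:
print(is_overfat(95, 175))  # ratio ~0.54 -> True
print(is_overfat(80, 175))  # ratio ~0.46 -> False
```

Because only the ratio matters, any consistent unit (inches, centimeters) works.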
     
    A century ago, sugar was heralded as one of the cheapest forms of calories in the diet. Just ten cents’ worth of sugar could furnish thousands of calories. Dr. Fredrick Stare, “Harvard’s sugar-pushing nutritionist,” bristled at the term “empty calories,” writing that the calories in sugar were “not empty but full of energy”—in other words, full of calories, which we are now getting too much of. The excess bodyweight of the U.S. population corresponds to about a daily 350- to 500-calorie excess on average. So, “to revert the obesity epidemic,” that’s how many calories we have to reduce, but which calories should we cut? As you can see below and at 1:33 in my video, the majority of Americans who fail to meet the Dietary Guidelines’ sugar limit get about that many calories in added sugars every day: Twenty-five teaspoons’ worth of added sugars is about 400 calories. 
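
The 400-calorie figure follows from standard conversions. A quick check, assuming about 4 grams of sugar per teaspoon and 4 calories per gram of carbohydrate; these conversion values are assumptions, not stated in the article:

```python
# Checking the claim above: 25 teaspoons of added sugar is about 400 calories.
# Assumed conversions: ~4 g sugar per teaspoon, 4 kcal per gram of sugar.
GRAMS_PER_TEASPOON = 4
KCAL_PER_GRAM = 4

teaspoons_per_day = 25
calories_per_day = teaspoons_per_day * GRAMS_PER_TEASPOON * KCAL_PER_GRAM
print(calories_per_day)  # -> 400
```

That lands squarely within the 350- to 500-calorie daily excess cited above.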

    There are die-hard sugar defenders. James Rippe, for example, was reportedly paid $40,000 a month by the high fructose corn syrup industry—and that was on top of the $10 million it paid for his research. Even Dr. Rippe considers it “undisputable that sugars…contribute to obesity. It is also undisputable that sugar reduction…should be part of any weight loss program.” And, of all sources of calories to limit, since sugar is just empty calories and contains no essential nutrients, “reducing sugar consumption is obviously the place to start.” And, again, this is what the researchers funded by the likes of Dr. Pepper and Coca-Cola are saying. The primary author of “Dietary Sugar and Body Weight: Have We Reached a Crisis in the Epidemic of Obesity and Diabetes?…,” Richard Kahn, is infamous for his defense of the American Beverage Association—the soda industry—and he was the chief science officer at the American Diabetes Association when it signed a million-dollar sponsorship deal with the world’s largest candy company. “Maybe the American Diabetes Association should rename itself the American Junk Food Association,” said the director of a consumer advocacy group. What do you expect from an organization that was started with drug industry funding? 
     
    The bottom line is that “randomised controlled trials show that increasing sugars intake increases energy [calorie] intake” and “increasing sugar intake leads to body weight gain in adults, and…sugar reduction leads to body weight loss in children.” For example, when researchers randomized individuals to either increase or decrease their intake of table sugar, the added sugar group gained about three and a half pounds over ten weeks, whereas the reduced sugar group lost about two and a half pounds. A systematic review and meta-analysis of all such ad libitum diet studies—real-life studies where sugar levels were changed but people could otherwise eat whatever they wanted—found that reduced intake of dietary sugars resulted in a decrease in body weight, whereas “increased sugars intake was associated with a comparable weight increase.” The researchers found that, “considering the rapid weight gain that occurs after an increased intake of sugars, it seems reasonable to conclude that advice relating to sugars intake is a relevant component of a strategy to reduce the high risk of overweight and obesity in most countries.” That is, it’s reasonable to advise people to cut down on their sugar consumption. 
     
    Findings from observational studies have been “more ambiguous,” though, with an association found between obesity and intake of sweetened beverages, but failing to show consistent correlations with consumption of sugary foods. Most such studies rely on self-reported data, however, and “it is likely that this has introduced bias, especially as underreporting of diet has been found to be more prevalent among obese people and it is sugar-rich foods that are most commonly underreported.” However, one can measure trace sucrose levels in the urine, which gives an objective measure of actual sugar intake and also excludes contributions from other sweeteners such as high fructose corn syrup. When researchers did this, they discovered that, indeed, sugar intake is not only associated with greater odds of obesity and greater waist circumference on a snapshot-in-time cross-sectional basis, but that was also seen in a prospective cohort study over time. “Using urinary sucrose as the measure of sucrose intake,” researchers found that “participants in the highest v. the lowest quintile [fifth] for sucrose intake had 54% greater risk of being overweight or obese.” 
     
    “Denying evidence that sugars are harmful to health has always been at the heart of the sugar industry’s defense.” But when the evidence is undeniable, like the link between sugar and cavities, it switches from denial to deflection, like trying to pull attention away from restricting intake to coming up with some kind of “vaccine against tooth decay.” We seem to have reached a similar point with obesity, with the likes of the Sugar Bureau switching from denial to deflection by commissioning research suggesting that obese individuals would not benefit from losing weight, a stance contradicted by hundreds of studies across four continents involving more than ten million participants.
     
    For more on Big Sugar’s influence, check out Sugar Industry Attempts to Manipulate the Science.
     
    You may also be interested in some of my other popular videos on sugar. See related videos below.

    Michael Greger M.D. FACLM

  • Study Finds Highly Processed Foods Are as Addictive as Heroin, Cocaine | High Times



    A new study shows that highly processed foods can be as addictive as heroin, cocaine, and nicotine, leading some health experts to call for warning labels on popularly consumed snacks such as cookies and chips. The new research, which analyzed the findings of nearly 300 previous nutritional studies, was published recently in the peer-reviewed British Medical Journal.

    The study was headed by University of Michigan professor Ashley Gearhardt, who previously created the Yale Food Addiction Scale (YFAS) by applying the same criteria that experts use to diagnose substance addiction, including uncontrollable and excessive consumption, cravings and continued intake despite potential negative health effects.

    Although addiction to certain foods is not included in common diagnostic frameworks to assess mental health such as the Diagnostic and Statistical Manual of Mental Disorders (DSM-5), research on this topic has grown rapidly in the past 20 years. Much of this research uses the YFAS, which was developed to measure food addiction by assessing DSM-5 criteria for substance use disorder in the context of food intake. 

    Study Finds 14% of Adults Are Addicted to UPFs

    To complete the new study, researchers reviewed 281 previous studies conducted in 36 countries, which found that 14% of adults are addicted to UPFs (ultra-processed foods). The team of researchers was alarmed by the findings because of the amount of UPFs (foods such as cookies, ice cream, sausage, and sugary soft drinks and breakfast cereals) found in modern diets.

    “The combination of refined carbohydrates and fats often found in UPFs seems to have a supra-additive effect on brain reward systems, above either macronutrient alone, which may increase the addictive potential of these foods,” Gearhardt and the authors of the study wrote in their new findings.

    As UPFs have become more common, previous studies have shown them to be associated with serious medical conditions including cancer, early death, cognitive decline and mental health issues.

    “Many UPFs for many people are addictive,” author Chris van Tulleken told The Guardian about the new study. “And when people experience food addiction, it is almost always to UPF products.”

    Exactly why UPFs cause food addictions is not yet understood. Some experts believe that rather than one particular substance being the root cause of food addictions, the combination of ingredients found together in UPFs may be the cause.

    While they are “not likely addictive on their own,” food additives could be “reinforcers” of the caloric effects, the researchers wrote.

    Food Addiction Similar to Drugs and Alcohol

    Natural, unprocessed foods normally have more carbohydrates or more fat, but not both. However, UPFs often have disproportionately higher levels of both fats and carbohydrates. Eating UPFs triggers a spike in dopamine that is followed by a steep decline in the neurotransmitter. The result is a cycle of craving, satisfaction and crash similar to drugs and alcohol, although not everyone is susceptible.

    “Addictive products are not addictive for everyone,” said van Tulleken. “Almost 90% of people can try alcohol and not develop a problematic relationship; many can try cigarettes, or even cocaine.”

    Past research has also found that sugary or fatty foods make healthier alternatives less appealing, a change that could have negative consequences on health, such as over-indulging and weight gain. However, avoiding UPFs has become difficult for many people because processed foods are so ubiquitous in the modern diet. As a result, the addictive properties of UPFs have led some health-conscious researchers to recommend that many foods should come with a warning similar to those for cigarettes and other tobacco products. 

    “Trying to quit UPFs now is like trying to quit smoking in the 1960s,” said van Tulleken.

    Luckily, most of the substances are safe when used in moderation, leading online medical resource Healthline to recommend that processed foods make up no more than 10% to 20% of the calories in a person’s diet. To help reach that goal, van Tulleken suggests choosing foods thoughtfully.

    “Ask yourself: is this really food? You can quickly move from addiction to disgust,” he said.

    A.J. Herrington