ReportWire

Tag: junk food

  • Do Taxpayer Subsidies Play a Role in the Obesity Epidemic?  | NutritionFacts.org

    Why are U.S. taxpayers giving billions of dollars to support the likes of the sugar and meat industries?

    The rise in calorie surplus sufficient to explain the obesity epidemic was less a change in food quantity than in food quality. Access to cheap, high-calorie, low-quality convenience foods exploded, and the federal government very much played a role in making this happen. U.S. taxpayers give billions of dollars in subsidies to prop up the likes of the sugar industry, the corn industry and its high-fructose corn syrup, and the production of soybeans, about half of which is processed into vegetable oil and the other half is used as cheap feed to help make dollar-menu meat. You can see a table of subsidy recipients below and at 0:49 in my video The Role of Taxpayer Subsidies in the Obesity Epidemic. Why do taxpayers give nearly a quarter of a billion dollars a year to the sorghum industry? When was the last time you sat down to some sorghum? It’s almost all fed to cattle and other livestock. “We have created a food price structure that favors relatively animal source foods, sweets, and fats”—animal products, sugars, and oils.

    The Farm Bill started out as an emergency measure during the Great Depression of the 1930s to protect small farmers but was weaponized by Big Ag into a cash cow with pork barrel politics—including said producers of beef and pork. From 1970 to 1994, global beef prices dropped by more than 60 percent. And, if it weren’t for taxpayers “sweetening the pot” with billions of dollars a year, high-fructose corn syrup would cost the soda industry about 12 percent more. Then we hand Big Soda billions more through the Supplemental Nutrition Assistance Program (SNAP), formerly known as the Food Stamps Program, to give sugary drinks to low-income individuals. Why is chicken so cheap? After one Farm Bill, corn and soy were subsidized below the cost of production for cheap animal fodder. We effectively handed the poultry and pork industries about $10 billion each. That’s not chicken feed—or rather, it is! 

    This is changing what we eat. 

    As you can see below and at 2:03 in my video, thanks in part to subsidies, dairy, meats, sweets, eggs, oils, and soda were all getting relatively cheaper compared to the overall consumer food price index as the obesity epidemic took off, whereas the relative cost of fresh fruits and vegetables doubled. This may help explain why, during about the same period, the percentage of Americans getting five servings of fruits and vegetables a day dropped from 42 percent to 26 percent. Why not just subsidize produce instead? Because that’s not where the money is. 

    “To understand what is shaping our foodscape today, it is important to understand the significance of differential profit.” Whole foods or minimally processed foods, such as canned beans or tomato paste, are what the food business refers to as “commodities.” They have such slim profit margins that “some are typically sold at or below cost, as ‘loss leaders,’ to attract customers to the store” in the hopes that they’ll also buy the “value-added” products. Some of the most profitable products for producers and vendors alike are the ultra-processed, fatty, sugary, and salty concoctions of artificially flavored, artificially colored, and artificially cheap ingredients—thanks to taxpayer subsidies. 

    Different foods reap different returns. Measured in “profit per square foot of selling space” in the supermarket, confectionaries like candy bars consistently rank among the most lucrative. The markups are the only healthy thing about them. Fried snacks like potato chips and corn chips are also highly profitable. PepsiCo’s subsidiary Frito-Lay brags that while its products represented only about 1 percent of total supermarket sales, they may account for more than 10 percent of operating profits for supermarkets and 40 percent of profit growth. 

    It’s no surprise, then, that the entire system is geared towards garbage. The rise in the calorie supply wasn’t just more food but a different kind of food. There’s a dumb dichotomy about the drivers of the obesity epidemic: Is it the sugar or the fat? They’re both highly subsidized, and they both took off. As you can see below and at 4:29 and 4:35 in my video, along with a significant rise in refined grain products that is difficult to quantify, the rise in obesity was accompanied by about a 20 percent increase in per capita pounds of added sugars and a 38 percent increase in added fats. 

     

    More than half of all calories consumed by most adults in the United States were found to originate from these subsidized foods, and they appear to be worse off for it. Those eating the most had significantly higher levels of chronic disease risk factors, including elevated cholesterol, inflammation, and body weight. 

    If it really were a government of, by, and for the people, we’d be subsidizing healthy foods, if anything, to make fruits and vegetables cheap or even free. Instead, our tax dollars are shoveled to the likes of the sugar industry or to livestock feed to make cheap, fast-food meat. 

    Speaking of sorghum, I had never had it before and it’s delicious! In fact, I wish I had discovered it before How Not to Diet was published. I now add sorghum and finger millet to my BROL bowl, which used to just include purple barley groats, rye groats, oat groats, and black lentils, so the acronym has become an unpronounceable BROLMS. Anyway, sorghum is a great rice substitute for those who saw my rice and arsenic video series and were as convinced as I am that we need to diversify our grains.

    We now turn to marketing. After all of the taxpayer-subsidized glut of calories in the market, the food industry had to find a way to get it into people’s mouths. So, next: The Role of Marketing in the Obesity Epidemic.

    We’re about halfway through this series on the obesity epidemic. If you missed any so far, check out the related videos below.

    Michael Greger M.D. FACLM

  • Processed Foods and Obesity  | NutritionFacts.org

    The rise in the U.S. calorie supply responsible for the obesity epidemic wasn’t just about more food, but a different kind of food.

    The rise in the number of calories provided by the food supply since the 1970s “is more than sufficient to explain the US epidemic of obesity.” Similar spikes in calorie surplus were noted in developed countries around the world in parallel with, and presumed to be primarily responsible for, the expanding waistlines of their populations. After taking exports into account, by the year 2000, the United States was producing 3,900 calories for every man, woman, and child—nearly twice as much as many people need. 

    It wasn’t always this way. The number of calories in the food supply actually declined over the first half of the twentieth century and only started its upward climb to unprecedented heights in the 1970s. The drop in the first half of the century was attributed to the reduction in hard manual labor. The population had decreased energy needs, so they ate decreased energy diets. They didn’t need all the extra calories. But then the “energy balance flipping point” occurred, when the “move less, stay lean phase” that existed throughout most of the century turned into the “eat more, gain weight phase” that plagues us to this day. So, what changed?

    As I discuss in my video The Role of Processed Foods in the Obesity Epidemic, what happened in the 1970s was a revolution in the food industry. In the 1960s, most food was prepared and cooked in the home. The typical “married female, not working” spent hours a day cooking and cleaning up after meals. (The “married male, non-working spouse” averaged nine minutes, as you can see below and at 1:34 in my video.) But then a mixed-blessing transformation took place. Technological advances in food preservation and packaging enabled manufacturers to mass prepare and distribute food for ready consumption. The metamorphosis has been compared to what happened a century before with the mass production and supply of manufactured goods during the Industrial Revolution. But this time, they were just mass-producing food. Using new preservatives, artificial flavors, and techniques, such as deep freezing and vacuum packaging, food corporations could take advantage of economies of scale to mass produce “very durable, palatable, and ready-to-consume” edibles that offer “an enormous commercial advantage over fresh and perishable whole or minimally processed foods.” 

    Think ye of the Twinkie. With enough time and effort, “ambitious cooks” could create a cream-filled cake, but now such cakes are available around every corner for less than a dollar. If every time someone wanted a Twinkie, they had to bake it themselves, they’d probably eat a lot fewer Twinkies. The packaged food sector is now a multitrillion-dollar industry.

    Consider the humble potato. We’ve long been a nation of potato eaters, but we usually baked or boiled them. Anyone who’s made fries from scratch knows what a pain it is, with all the peeling, cutting, and splattering of oil. But with sophisticated machinations of mechanization, production became centralized and fries could be shipped at -40°F to any fast-food deep-fat fryer or frozen food section in the country to become “America’s favorite vegetable.” Nearly all the increase in potato consumption in recent decades has been in the form of french fries and potato chips. 

    Cigarette production offers a compelling parallel. Up until automated rolling machines were invented, cigarettes had to be rolled by hand. It took 50 workers to produce the same number of cigarettes a machine could make in a minute. The price plunged and production leapt into the billions. Cigarette smoking went from being “relatively uncommon” to being almost everywhere. In the 20th century, the average per capita cigarette consumption rose from 54 cigarettes a year to 4,345 cigarettes “just before the first landmark Surgeon General’s Report” in 1964. The average American went from smoking about one cigarette a week to half a pack a day.

    Tobacco itself was just as addictive before and after mass marketing. What changed was cheap, easy access. French fries have always been tasty, but they went from being rare, even in restaurants, to being accessible around each and every corner (likely next to the gas station where you can get your Twinkies and cigarettes).

    The first Twinkie dates back to 1930, though, and Ore-Ida started selling frozen french fries in the 1950s. There has to be more to the story than just technological innovation, and we’ll explore that next.

    This explosion of processed junk was aided and abetted by Big Government at the behest of Big Food, which I explore in my video The Role of Taxpayer Subsidies in the Obesity Epidemic.

    This is the fifth video in an 11-part series. Here are the first four: 

    Videos still to come are listed in the related videos below.

    Michael Greger M.D. FACLM

  • Cutting the Calorie-Rich-And-Processed Foods  | NutritionFacts.org

    We have an uncanny ability to pick out the subtle distinctions in calorie density of foods, but only within the natural range.

    The traditional medical view on obesity, as summed up nearly a century ago: “All obese persons are, alike in one fundamental respect,—they literally overeat.” While this may be true in a technical sense, it is in reference to overeating calories, not food. Our primitive urge to overindulge is selective. People don’t tend to lust for lettuce. We have a natural inborn preference for sweet, starchy, or fatty foods because that’s where the calories are concentrated.

    Think about hunting and gathering efficiency. We used to have to work hard for our food. Prehistorically, it didn’t make sense to spend all day collecting types of food that on average don’t provide at least a day’s worth of calories. You would have been better off staying back at the cave. So, we evolved to crave foods with the biggest caloric bang for their buck.

    If you were able to steadily forage a pound of food an hour and it had 250 calories per pound, it might take you ten hours just to break even on your calories for the day. But if you were gathering something with 500 calories a pound, you could be done in five hours and spend the next five working on your cave paintings. So, the greater the energy density—that is, the more calories per pound—the more efficient the foraging. We developed an acute ability to discriminate foods based on calorie density and to instinctively desire the densest.
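    The foraging arithmetic above can be sketched as a quick calculation. A minimal sketch, assuming the paragraph's illustrative figures: a 2,500-calorie daily budget and a steady gathering rate of one pound per hour (the function name and parameters are hypothetical, used only to mirror the example):

```python
# Hours of foraging needed to meet a day's energy budget,
# given the calorie density of the food being gathered.
def foraging_hours(calories_per_pound, daily_need=2500, pounds_per_hour=1.0):
    """Return hours required to gather `daily_need` calories."""
    return daily_need / (calories_per_pound * pounds_per_hour)

print(foraging_hours(250))  # low-density food: 10.0 hours
print(foraging_hours(500))  # denser food: 5.0 hours
```

Doubling the calorie density halves the foraging time, which is the efficiency gradient the paragraph argues our preferences evolved to track.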

    If you study the fruit and vegetable preferences of four-year-old children, what they like correlates with calorie density. As you can see in the graph below and at 1:52 in my video Friday Favorites: Cut the Calorie-Rich-And-Processed Foods, they prefer bananas over berries and carrots over cucumbers. Isn’t that just a preference for sweetness? No, they also prefer potatoes over peaches and green beans over melon, just like monkeys prefer avocados over bananas. We appear to have an inborn drive to maximize calories per mouthful. 

    All the foods the researchers tested in the study with four-year-old kids naturally had less than 500 calories per pound. (Bananas topped the chart at about 400.) Something funny happens when you start going above that: We lose our ability to differentiate. Over the natural range of calorie densities, we have an uncanny aptitude to pick out the subtle distinctions. However, once you start heading towards bacon, cheese, and chocolate territory, which can reach thousands of calories per pound, our perceptions become relatively numb to the differences. It’s no wonder since these foods were unknown to our prehistoric brains. It’s like the dodo bird failing to evolve a fear response because it had no natural predators—and we all know how that turned out—or sea turtle hatchlings crawling in the wrong direction towards artificial light rather than the moon. It is aberrant behavior explained by an “evolutionary mismatch.”

    The food industry exploits our innate biological vulnerabilities by stripping crops down into almost pure calories—straight sugar, oil (which is pretty much pure fat), and white flour (which is mostly refined starch). It also removes the fiber, because that effectively has zero calories. Run brown rice through a mill to make white rice, and you lose about two-thirds of the fiber. Turn whole-wheat flour into white flour, and lose 75 percent. Or you can run crops through animals (to make meat, dairy, and eggs) and remove 100 percent of the fiber. What you’re left with is CRAP—an acronym used by one of my favorite dieticians, Jeff Novick, for Calorie-Rich And Processed food.

    Calories are condensed in the same way plants are turned into addictive drugs like opiates and cocaine: “distillation, crystallization, concentration, and extraction.” They even appear to activate the same reward pathways in the brain. Put people with “food addiction” in an MRI scanner and show them a picture of a chocolate milkshake, and the areas that light up in their brains (as you can see below and at 4:15 in my video) are the same as when cocaine addicts are shown a video of crack smoking. (See those images below and at 4:18 in my video.) 

    “Food addiction” is a misnomer. People don’t suffer out-of-control eating behaviors in response to food in general. We don’t tend to compulsively crave carrots. Milkshakes are packed with sugar and fat, two of the signals to our brain of calorie density. When people are asked to rate different foods in terms of cravings and loss of control, most incriminated was a load of CRAP—highly processed foods like donuts, along with cheese and meat. Those least related to problematic eating behaviors? Fruits and vegetables. Calorie density may be the reason people don’t get up in the middle of the night and binge on broccoli.

    Animals don’t tend to get fat when they are eating the foods they were designed to eat. There is a confirmed report of free-living primates becoming obese, but that was a troop of baboons who stumbled across the garbage dump at a tourist lodge. The garbage-feeding animals weighed 50 percent more than their wild-feeding counterparts. Sadly, we can suffer the same mismatched fate and become obese by eating garbage, too. For millions of years, before we learned how to hunt, our biology evolved largely on “leaves, roots, fruits, and nuts.” Maybe it would help if we went back to our roots and cut out the CRAP. 

    A key insight I want to emphasize here is the concept of animal products as the ultimate processed food. Basically, all nutrition grows from the ground: seeds, sunlight, and soil. That’s where all our vitamins come from, all our minerals, all the protein, all the essential amino acids. The only reason there are essential amino acids in a steak is because the cow ate them all from plants. Those amino acids are essential—no animals can make them, including us. We have to eat plants to get them. But we can cut out the middlemoo and get nutrition directly from the Earth, and, in doing so, get all the phytonutrients and fiber that are lost when plants are processed through animals. Even ultraprocessed junk foods may have a tiny bit of fiber remaining, but all is lost when plants are ultra-ultraprocessed through animals.

    Having said that, there was also a big jump in what one would traditionally think of as processed foods, and that’s the video we turn to next: The Role of Processed Foods in the Obesity Epidemic.

    We’re making our way through a series on the cause of the obesity epidemic. So far, we’ve looked at exercise (The Role of Diet vs. Exercise in the Obesity Epidemic) and genes (The Role of Genes in the Obesity Epidemic and The Thrifty Gene Theory: Survival of the Fattest), but, really, it’s the food.

    If you’re familiar with my work, you know that I recommend eating a variety of whole plant foods, as close as possible to the way nature intended. I capture this in my Daily Dozen, which you can download for free here or get the free app (iTunes and Android). On the app, you’ll see that there’s also an option for those looking to lose weight: my 21 Tweaks. But before you go checking them off, be sure to read about the science behind the checklist in my book How Not to Diet. Get it for free at your local public library. If you choose to buy a copy, note that all proceeds from all of my books go to charity. 

    Michael Greger M.D. FACLM

  • What Is the Role of Our Genes in the Obesity Epidemic?  | NutritionFacts.org

    The “fat gene” accounts for less than 1 percent of the differences in size between people.

    To date, about a hundred genetic markers have been linked to obesity, but when you put them all together, overall, they account for less than 3 percent of the difference in body mass index (BMI) between people. You may have heard about the “fat gene,” called FTO (short for FaT mass and Obesity-associated). It’s the gene most strongly linked to obesity, but it explains less than 1 percent of the difference in BMI between people, a mere 0.34 percent. 

    As I discuss in my video The Role of Genes in the Obesity Epidemic, FTO codes for a brain protein that appears to affect our appetite. Are you one of the billion people who carry the FTO susceptibility genes? It doesn’t matter because it only appears to result in a difference in intake of a few hundred extra calories a year. The energy imbalance that led to the obesity epidemic is on the order of hundreds of calories a day, and that’s the gene known so far to have the most effect. The chances of accurately predicting obesity risk based on FTO status are “only slightly better than tossing a coin.” In other words, no, those genes don’t make you look fat.

    When it comes to obesity, the power of our genes is nothing compared to the power of our fork. Even the small influence the FTO gene does have appears to be weaker among those who are physically active and may be abolished completely in those eating healthier diets. FTO only appears to affect those eating diets higher in saturated fat, which is predominantly found in meat, dairy, and junk food. Those eating more healthfully appear to be at no greater risk of weight gain, even if they inherited the “fat gene” from both of their parents.

    Physiologically, FTO gene status does not appear to affect our ability to lose weight. Psychologically, knowing we’re at increased genetic risk for obesity may motivate some people to eat and live more healthfully, but it may cause others to fatalistically throw their hands up in the air and resign themselves to thinking that it just runs in their family, as you can see in the graph below and at 2:11 in my video. Obesity does tend to run in families, but so do lousy diets. 

    Comparing the weight of biological versus adopted children can help tease out the contributions of lifestyles versus genetics. Children growing up with two overweight biological parents were found to be 27 percent more likely to be overweight themselves, whereas adopted children placed in a home with two overweight parents were 21 percent more likely to be overweight. So, genetics do play a role, but this suggests that it’s more the children’s environment than their DNA.

    One of the most dramatic examples of the power of diet over DNA comes from the Pima Indians of Arizona. As you can see in the graph below and at 3:05 in my video, they not only have among the highest rates of obesity, but they also have the highest rates of diabetes in the world. This has been ascribed to their relatively fuel-efficient genetic makeup. Their propensity to store calories may have served them well in times of scarcity when they were living off of corn, beans, and squash, but when the area became “settled,” their source of water, the Gila River, was diverted upstream. Those who survived the ensuing famine had to abandon their traditional diet to live off of government food programs, and chronic disease rates skyrocketed. Same genes, but different diet, different result. 

    In fact, a natural experiment was set up. The Pima living over the border in Mexico come from the same genetic pool but were able to maintain more of their traditional lifestyle, sticking with their main staples of beans, wheat flour tortillas, and potatoes. Same genes, but seven times less obesity and about four times less diabetes. You can see those graphs below and at 3:58 and 4:02 in my video. Genes may load the gun, but diet pulls the trigger.

    Of course, it’s not our genes! Our genes didn’t suddenly change 40 years ago. At the same time, though, in a certain sense, it could be thought of as all in our genes. That’s the topic of my next video The Thrifty Gene Theory: Survival of the Fattest.

    This is the second in an 11-video series on the obesity epidemic. If you missed the first one, check out The Role of Diet vs. Exercise in the Obesity Epidemic.

    Michael Greger M.D. FACLM

  • The Roles Diet and Exercise Play in the Obesity Epidemic  | NutritionFacts.org

    The common explanations for the cause of the obesity epidemic put forward by the food industry and policymakers, such as inactivity or a lack of willpower, are not only wrong, but actively harmful fallacies.

    Obesity isn’t new, but the obesity epidemic is. We went from a few corpulent kings and queens, like Henry VIII or Louis VI (known as Louis le Gros, or “Louis the Fat”), to a pandemic of obesity, now considered to be “arguably the gravest and most poorly controlled public health threat of our time.” As you can see below and at 0:34 in my video The Role of Diet vs. Exercise in the Obesity Epidemic, about 37 percent of American men are obese and 41 percent of American women, with no end in sight. Earlier reports had suggested that the rise in obesity was at least slowing down, but even that doesn’t appear to be the case. Similarly, we had thought we were turning the corner on childhood obesity “[a]fter 35 years of unremittingly bad news,” but the bad news continues. Childhood and adolescent obesity rates have continued to rise, now into the fourth decade. 

    Over the last century, obesity appears to have jumped ten-fold, from about 1 in 30 to now 1 in 3, but it wasn’t a steady rise. As you can see in the graph below and at 1:15 in my video, something seems to have happened around the late 1970s—and not just in the United States, but around the globe. The obesity pandemic took off at about the same time in the 1970s and 1980s in most high-income countries. The fact that the rapid rise “seemed to begin almost concurrently” across the industrialized world suggests a common cause. What might that trigger have been? 

    Any potential driver would have to be global and “coincide with the upswing of the epidemic.” So, the change would have had to have started about 40 years ago and would have had to have been able to spread rapidly around the globe. Let’s see how all the various theories stack up. For example, as you can see below and at 1:55 in my video, some have blamed changes in our built environment and shifts in city planning that have made our communities less conducive to walking, biking, and grocery shopping. That doesn’t meet our criteria for a credible cause, though, because there was no universal, simultaneous change in our neighborhoods within that time frame.

    When researchers surveyed hundreds of policymakers, most blamed the obesity epidemic on a “lack of personal motivation.” Do you see how little sense that makes? In the United States, for example, obesity shot up across the entire population in the late 1970s, as you can see at 2:26 in my video. I concur with the researchers who “believe it is implausible that each age, sex, and ethnic group, with massive differences in life experience and attitudes, had a simultaneous decline in willpower related to healthy nutrition or exercise.” More plausible than a global change in our characters would be some global change in our lives. 

    The food industry blames inactivity. “If all consumers exercised,” said the CEO of PepsiCo, “obesity wouldn’t exist.” Coca-Cola went a step further, spending $1.5 million to create the Global Energy Balance Network to downplay the role of diet. Leaked emails show the company planned on using the front to “serve as a ‘weapon’ to ‘change the conversation’ about obesity in its ‘war’ with public health.”

    This tactic is so common among food and beverage companies that it even has a name: “leanwashing.” You’ve heard of greenwashing, where companies deceptively pretend to be environmentally friendly. Leanwashing is the term used to describe companies that try to position themselves as helping to solve the obesity crisis when they’re instead directly contributing to it. For example, the largest food company in the world, Nestlé, has “rebranded itself as the ‘world’s leading nutrition, health and wellness company.’” Yes, that Nestlé, makers of Nesquik, Cookie Crisp, and historically more than a hundred different brands of candy, including Butterfinger, Kit Kat, Goobers, Gobstoppers, Runts, and Nerds. Another one of its slogans is “Good Food, Good Life.” Its Raisinets may have some fruit, but Nestlé seems to me more Willy Wonka than wellness. 

    The constant corporate drumbeat of overemphasis on physical inactivity appears to be working. In response to the Harris poll question, “Which of these do you think are the major reasons why obesity has increased?,” a “huge majority of 83% chose lack of exercise, while only 34% chose excessive calorie consumption.” “Confusion about the effect of exercise on the energy balance” has been identified as one of the most common misconceptions about obesity. The scientific community has “come to a fairly decisive conclusion” that the factors governing calorie intake more powerfully affect overall calorie balance. It’s our fast food more than our slow motion. 

    “There is considerable debate in the literature today about whether physical activity has any role whatsoever in the epidemic of obesity that has swept the globe since the 1980s.” The increase in caloric intake per person is more than enough to explain the obesity epidemic in the United States, and also to explain it globally. If anything, the level of physical activity over the last few decades has gone up slightly in both Europe and North America. Ironically, this may be a result of the extra energy it takes to move around our heavier bodies, making it a consequence of the obesity problem rather than the cause.

    “Formal exercise plays a very small role in the total daily physical activity energy expenditure.” Think how much more physical work people used to do in the workplace, on the farm, or even in the home. It’s not just the shift in collar color from blue to white. Increasing automation, computerization, mechanization, motorization, and urbanization have all contributed to increasingly more sedentary lifestyles over the last century—and that’s the problem with the theory. The occupational shifts and advent of labor-saving devices “have been gradual and largely predated the dramatic increase in weight gain across the developed world in the past few decades.” Washing machines, vacuum cleaners, and the Model T were all invented before 1910. Indeed, when put to the test using state-of-the-art methods to measure energy in and energy out, it was caloric intake, not physical activity, that predicted weight gain over time. 

    The common misconception that obesity is mostly due to lack of exercise may not just be a benign fallacy. Personal theories of causation appear to impact people’s weight. Those who blame insufficient exercise are significantly more likely to be overweight than those who implicate a poor diet. Put those who believe lack of exercise causes obesity in a room with chocolate, and they can covertly be observed consuming more candy. Those holding that view may be different in other ways, though. You can’t prove cause and effect until you put it to the test. And, indeed, as you can see in the graph below, and at 7:22 in my video, people randomized to read an article implicating inactivity went on to eat significantly more sweets than those reading about research that implicated diet. A similar study found that those presented with research blaming genetics subsequently ate significantly more cookies. The title of that paper? “An Unintended Way in Which the Fat Gene Might Make You Fat.” 

    When I sat down to write How Not to Diet, I knew this “what triggered the obesity epidemic” was going to be a big question I had to face. Was it inactivity (just kids sitting around playing video games or scrolling on their phones)? Was it genetic? Was it epigenetic (something turning on our fat genes)? Or was it just the food? Were we eating more fat all of a sudden? More carbs? More processed foods? Or were we just eating more period, because of bigger serving sizes or more snacking? Inquiring minds wanted to know. 

    This is the first in an 11-video series to answer this question, which I originally released in a two-hour webinar in 2020. Check out the webinar digital download here. Or, check them out in the related posts below.  


    Michael Greger M.D. FACLM


  • Headache and Migraine Relief from Foods  | NutritionFacts.org


    Plant-based diets are put to the test for treating migraine headaches.

    Headaches are one of the top five reasons people end up in emergency rooms and one of the leading reasons people see their doctors in general. One way to try to prevent them is to identify their triggers and avoid them. Common triggers for migraines include stress, smoking, hunger, sleep issues, certain foods (like chocolate, cheese, and alcohol), your menstrual cycle, or certain weather patterns (like high humidity).

In terms of dietary treatments, the so-called Father of Modern Medicine, William Osler, suggested trying a “strict vegetable diet.” After all, the nerve inflammation associated with migraines “may be reduced by a vegan diet as many plant foods are high in anti-inflammatory compounds and antioxidants, and likewise, meat products have been reported to have inflammatory properties.” It wasn’t put to the test, though, for another 117 years.

As I discuss in my video Friday Favorites: Foods That Help Headache and Migraine Relief, among study participants given a placebo supplement, half said they got better, while the other half said they didn’t. But, when put on a strictly plant-based diet, they did much better, experiencing a significant drop in the severity of their pain, as you can see in the graph below and at 1:08 in my video.

    Now, “it is possible that the pain-reducing effects of the vegan diet may be, at least in part, due to weight reduction.” The study participants lost about nine more pounds when they were on the plant-based diet for a month, as shown below, and at 1:22. 

Even just lowering the fat content of the diet may help. Those who spent a month consuming less than 30 grams of fat a day (for instance, less than two tablespoons of oil all day) experienced “statistically significant decreases in headache frequency, intensity, duration, and medication intake”—a six-fold decrease in frequency and intensity, as you can see below and at 1:44 in my video. They went from three migraine attacks every two weeks down to just one a month. And, by “low fat,” the researchers didn’t mean SnackWell’s; they meant more fruits, vegetables, and beans. Before the food industry co-opted and corrupted the term, eating “low fat” meant eating an apple, for example, not Kellogg’s Apple Jacks.

Now, they were on a low-fat diet—about 10 percent fat for someone eating 2,500 calories a day. What about a diet with just less than 20 percent fat compared to a more normal diet that’s still relatively lower in fat than average? As you can see below and at 2:22 in my video, the researchers saw the same significant drops in headache frequency and severity, including a five-fold drop in attacks of severe pain. Since the intervention involved at least a halving of the intake of saturated fat, which is mostly found in meat, dairy, and junk food, the researchers concluded that reduced consumption of saturated fat may help control migraine attacks—but the benefit isn’t necessarily from what they’re getting less of. There are compounds present in green leafy vegetables that might bind to a migraine-triggering peptide known as calcitonin gene-related peptide, or CGRP.

Drug companies have been trying to come up with something that binds to CGRP, but the drugs have failed to be effective. They’re also toxic, which is a problem we don’t have with cabbage, as you can see below and at 3:01 in my video.

Green vegetables also have magnesium. Found throughout the food supply but most concentrated in green leafy vegetables, beans, nuts, seeds, and whole grains, magnesium is the central atom of chlorophyll, as shown below and at 3:15. So, you can see how much magnesium foods have in the produce aisle by the intensity of their green color. Although magnesium supplements do not appear to decrease migraine severity, they may reduce the number of attacks you get in the first place. You can ask your doctor about starting 600 mg of magnesium dicitrate every day, but note that magnesium supplements can cause adverse effects, such as diarrhea, so I recommend getting it the way nature intended—in the form of real food, not supplements.

    Any foods that may be particularly helpful? You may recall that I’ve talked about ground ginger. What about caffeine? Indeed, combining caffeine with over-the-counter painkillers, like Tylenol, aspirin, or ibuprofen, may boost their efficacy, at doses of about 130 mg for tension-type headaches and 100 mg for migraines. That’s about what you might expect to get in three cups of tea, as you can see below, and at 4:00 in my video. (I believe it is just a coincidence that the principal investigator of this study was named Lipton.) 

    Please note that you can overdo it. If you take kids and teens with headaches who were drinking 1.5 liters of cola a day and cut the soda, you can cure 90 percent of them. However, this may be a cola effect rather than a caffeine effect. 

    And, finally, one plant food that may not be the best idea is the Carolina Reaper, the hottest chili pepper in the world. It’s so mind-numbingly hot it can clamp off the arteries in your brain, as seen below and at 4:41 in my video, and you can end up with a “thunderclap headache,” like the 34-year-old man who ate the world’s hottest pepper and ended up in the emergency room. Why am I not surprised it was a man? 

    I’ve previously covered ginger and topical lavender for migraines. Saffron may help relieve PMS symptoms, including headaches. A more exotic way a plant-based diet can prevent headaches is by helping to keep tapeworms out of your brain.

Though hot peppers can indeed trigger headaches, they may also be used to treat them. Check out my video on relieving cluster headaches with hot sauce.


  • Children’s Cereals: Candy for Breakfast?  | NutritionFacts.org


    Plastering front-of-package nutrient claims on cereal boxes is an attempt to distract us from the incongruity of feeding our children multicolored marshmallows for breakfast.

    The American Medical Association started warning people about excess sugar consumption more than 75 years ago, based in part on our understanding that “sugar supplies nothing in nutrition but calories, and the vitamins provided by other foods are sapped by sugar to liberate these calories.” So, added sugars aren’t just empty calories, but negative nutrition. “Thus, the more added sugars one consumes, the more nutritionally depleted one may become.”

Given the “totality of publicly available scientific evidence,” the Food and Drug Administration (FDA) decided to make processed food manufacturers declare “added sugars” on their nutrition labels. The National Yogurt Association was livid and said it “continues to oppose the ‘added sugars’ declaration,” since it needed “‘added sugars’ to increase palatability” of its products. The junk food association questioned the science, whereas the ice cream folks seemed to imply that consumers are too stupid to “understand or know how to use the added sugar declaration,” so it’s better just to leave it off. The world’s biggest cereal company, Kellogg’s, took a similar tack, opposing it so as not “to confuse consumers.” Should the FDA proceed with such labeling against Kellogg’s objections, the cereal giant argued that “an added sugars declaration…should be communicated as a footnote.” It claimed that its “goal is to provide consumers with useful information so they can make informed choices.” This is from a company that describes its Froot Loops as “packed with delicious fruity taste, fruity aroma, and bright colors.” Keep in mind that Froot Loops has more sugar than a Krispy Kreme doughnut, as you can see in the graph below and at 1:46 in my video Friday Favorites: Kids’ Breakfast Cereals as Nutritional Façade.

Froot Loops is more than 40 percent sugar by weight! You can see the cereal box’s Nutrition Facts label below and at 1:50 in my video.

    The tobacco industry used similar terms, such as “light,” “low,” and “mild” to make its products appear healthier—before it was barred from doing so. “Now sugar interests are fighting similar battles over whether their terminology, including ‘healthy,’ ‘natural,’ ‘naturally sweetened,’ and even ‘lightly sweetened,’ is deceptive to consumers.”

    But if you look at the side of a cereal box, as shown below and at 2:13 in my video, you can see all those vitamins and minerals that have been added. That was one of the ways the cereal companies responded to calls for banning sugary cereals. General Mills defended the likes of Franken Berry, Trix, and Lucky Charms for being fortified with essential vitamins. 

Sir Grapefellow, I learned, was a “grape-flavored oat cereal” complete with “sweet grape star bits”—that is, marshmallows. Don’t worry. It was “vitamin charged!” You can see that cereal box below and at 2:31 in my video.

    Sugary breakfast cereals, said Dr. Jean Mayer from Harvard, “are not a complete food even if fortified with eight or 10 vitamins.” Senator McGovern replied, “I think your point is well taken that these products may be mislabeled or more correctly called candy vitamins than cereals.” 

    Plastering nutrient claims on cereal boxes can create “a ‘nutritional façade’ around a product, acting to distract attention away” from unsavory qualities, such as excess sugar content. Researchers found that the “majority of parents misinterpreted the meaning of claims commonly used on children’s cereals,” raising significant public health concerns. Ironically, cereal boxes bearing low-calorie claims were found to have more calories on average than those without such a claim. The cereal doth protest too much. 

Even candy bar companies are getting in on the action, bragging about protein content because of some peanuts. Take the Baby Ruth, a candy bar that has 50 grams of sugar. Froot Loops could be considered breakfast candy; the same serving would have 40 grams of sugar, as you can see below and at 3:45 in my video.

    Given that “research suggests that consumers believe front-of-package claims, perceive them to be government-endorsed, and use them to ignore the Nutrition Facts Panel,” there’s been a call from nutrition professionals to consider “an outright ban on all front-of-package claims.” The industry’s short-lived “Smart Choices” label, as you can see below and at 4:13 in my video, was met with disbelief when it was found adorning qualifying cereals like Froot Loops and Cookie Crisp. The processed food industry spent more than a billion dollars lobbying against the adoption of more informative labeling (a traffic-light approach), “opposing most aggressively the use of a red light suggesting that any food was too high in anything.” 

    I was invited to testify as an expert witness in a case against sugary cereal companies. (I donated my fee, of course.) Check out the related posts below for a video series and blogs that are a result of some of the research I did. 

    You may also be interested in videos and blogs on the food industry; see related posts below.


  • Is All Vegan Food Healthy?  | NutritionFacts.org


    How do healthier plant-based diets compare to unhealthy plant foods and animal foods when it comes to diabetes risk? 

    In my video on flexitarians, I discuss how the benefits of eating a plant-based diet are not all-or-nothing. “Simple advice to increase the consumption of plant-derived foods with compensatory [parallel] reductions in the consumption of foods from animal sources confers a survival advantage”— a live-longer advantage. The researchers call it a “pro-vegetarian” eating pattern, one that’s moving in the direction of vegetarianism, “a more gradual and gentle approach.” 

    If you’re dealing with a serious disease, though, like diabetes, completely “avoiding some problem foods is easier than attempting to moderate their intake. Clinicians would never tell an alcoholic to try to simply cut down on alcohol. Avoiding alcohol entirely is more effective and, in fact, easier for a problem drinker…Paradoxically, asking patients to make a large change may be more effective than making a slow transition. Diet studies show that recommending more significant changes increases the chances that patients can accomplish [them]. It may help to replace the common advice, ‘all things in moderation’ with ‘big changes beget big results.’ Success breeds success. After a few days or weeks of major dietary changes, patients are likely to see improvements in weight and blood glucose [sugar] levels—improvements that reinforce the dietary changes that elicited them. Furthermore, they may enjoy other health benefits of a plant-based diet” that may give them further motivation. 

    As you can see below and at 1:43 in my video Friday Favorites: Is Vegan Food Always Healthy?, those who choose to eat plant-based for their health say it’s mostly for “general wellness or general disease prevention” or to improve their energy levels or immune function, for example. 

    They felt it gives them a sense of control over their health, helps them feel better emotionally, improves their overall health, makes them feel better, and more, as shown below and at 1:48. Most felt it was very important for maintaining their health and well-being. 

    For the minority who used it for a specific health problem, mostly high cholesterol or weight loss, followed by high blood pressure and diabetes, most reported they felt it helped a great deal, as you can see below and at 2:14. 

Some choose plant-based diets for other reasons, such as animal welfare or global warming, and it looks like “ethical vegans” are more likely to eat sugary and fatty foods, like vegan donuts, compared to those eating plant-based because of religious or health concerns, as you can see below and at 2:26 in my video.

    The veganest vegan could make an egg- and dairy-free cake, covered with frosting, marshmallow fluff, and chocolate syrup, topped with Oreos, and served with a side of Doritos. Or, they may want fruit for dessert, but in the form of Pop-Tarts and Krispy Kreme pies. Vegan, yes. Healthy, no. 

    “Plant-based diets have been recommended to reduce the risk of type 2 diabetes (T2D). However, not all plant foods are necessarily beneficial.” In the pro-vegetarian scoring system I mentioned above, you get points for eating potato chips and French fries because they are technically plant-based, as you can see below and at 3:07 in my video, but Harvard researchers wanted to examine the association of not only an overall plant-based diet, but healthy and unhealthy versions. So, they created the same kind of pro-vegetarian scoring system, but it was weighted towards any sort of plant-based foods and against animal foods; then, they created a healthful plant-based diet index, where at least some whole plant foods took precedence and Coca-Cola and other sweetened beverages were no longer considered plants. Lastly, they created an unhealthful plant-based diet index by assigning positive scores to processed plant-based junk and negative scores for healthier plant foods and animal foods. 

    Their findings? As you can see below and at 3:51 in my video, a more plant-based diet, in general, was good for reducing diabetes risk, but eating especially healthy plant-based foods did better, nearly cutting risk in half, while those eating more unhealthy plant foods did worse, as shown in the graph below and at 4:03.

    Now, is that because they were also eating more animal foods? People often eat burgers with their fries, so the researchers separated the effects of healthy plant foods, less healthy plant foods, and animal foods on diabetes risk. And, they found that healthy plant foods were protectively associated, animal foods were detrimentally associated, and less healthy plant foods were more neutral when it came to diabetes risk. Below and at 4:32 in my video, you can see the graph that shows higher diabetes risk with more and more animal foods, no protection whatsoever with junky plant foods, and lower and lower diabetes risk associated with more and more healthy whole plant foods in the diet. So, they concluded that, yes, “plant-based diets…are associated with substantially lower risk of developing T2D.” However, it may not be enough to just lower the intake of animal foods; consumption of less healthy plant foods may need to decrease, too. 

As a physician, I find that labels like vegetarian and vegan just tell me what you don’t eat, but there is a lot of unhealthy vegetarian fare, like French fries, potato chips, and soda pop. That’s why I prefer the term whole food, plant-based nutrition. That tells me what you do eat—a diet centered around the healthiest foods out there.

    The video I mentioned is Do Flexitarians Live Longer?

    You may also be interested in some of my past popular videos and blogs on plant-based diets. Check related posts below. 


  • Irregular Meals, Night Shifts, and Metabolic Harms  | NutritionFacts.org


    What can shift workers do to moderate the adverse effects of circadian rhythm disruption?

Shift workers may have higher rates of death from heart disease, stroke, diabetes, and dementia, as well as higher rates of death from cancer. Graveyard shift, indeed! But, is it just because they’re eating out of vending machines or not getting enough sleep? Highly controlled studies have recently attempted to tease out these other factors by putting people on the same diets with the same sleep—but at the wrong time of day. Redistributing eating to the nighttime resulted in elevated cholesterol and increases in blood pressure and inflammation. No wonder shift workers are at higher risk. Shifting meals to the night in a simulated night-shift protocol effectively turned about one-third of the subjects prediabetic in just ten days. Our bodies just weren’t designed to handle food at night, as I discuss in my video The Metabolic Harms of Night Shifts and Irregular Meals.

    Just as avoiding bright light at night can prevent circadian misalignment, so can avoiding night eating. We may have no control over the lighting at our workplace, but we can try to minimize overnight food intake, which has been shown to help limit the negative metabolic consequences of shift work. When we finally do get home in the morning, though, we may disproportionately crave unhealthy foods. In one experiment, 81 percent of participants in a night-shift scenario chose high-fat foods, such as croissants, out of a breakfast buffet, compared to just 43 percent of the same subjects during a control period on a normal schedule.

Shift work may also leave people too fatigued to exercise. But, even at the same physical activity levels, chronodisruption can affect energy expenditure. Researchers found that we burn 12 to 16 percent fewer calories while sleeping during the daytime compared to nighttime. Just a single improperly timed snack can affect how much fat we burn every day. Study subjects eating a specified snack at 10:00 am burned about 6 more grams of fat from their body than on the days they ate the same snack at 11:00 pm. That’s only about a pat and a half of butter’s worth of fat, but it was the identical snack, just given at a different time. The late snack group also suffered about a 9 percent bump in their LDL cholesterol within just two weeks.
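That “pat and a half of butter” comparison can be checked with back-of-the-envelope arithmetic. Here is a minimal sketch, assuming the standard figure of about 9 calories per gram of fat and roughly 4 grams of fat in a pat of butter (both assumptions for illustration, not figures from the study):

```python
# Back-of-the-envelope check of the snack-timing figures above.
# Assumed conversion factors (not from the study itself):
KCAL_PER_G_FAT = 9   # ~9 kcal per gram of fat
FAT_G_PER_PAT = 4    # ~4 g of fat in one pat of butter

extra_fat_burned_g = 6  # reported difference: 10:00 am vs. 11:00 pm snack

kcal_equivalent = extra_fat_burned_g * KCAL_PER_G_FAT  # ~54 kcal per day
pats_of_butter = extra_fat_burned_g / FAT_G_PER_PAT    # ~1.5 pats

print(kcal_equivalent, pats_of_butter)
```

Small as ~54 calories a day sounds, the point of the study was that the difference came from timing alone, not from the snack itself.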

Even just sleeping in on the weekends may mess up our metabolism. “Social jetlag is a measure of the discrepancy in sleep timing between our work days and free days.” From a circadian rhythm standpoint, if we go to bed late and sleep in on the weekends, it’s as if we flew a few time zones west on Friday evening, then flew back Monday morning. Travel-induced jet lag goes away in a few days, but what might the consequences be of constantly shifting our sleep schedule every week over our entire working career? Interventional studies have yet to put it to the test, but population studies suggest that those who have at least an hour of social jet lag a week (which may describe more than two-thirds of people) have twice the odds of being overweight.

    If sleep regularity is important, what about meal regularity? “The importance of eating regularly was highlighted early by Hippocrates (460–377 BC) and later by Florence Nightingale,” but it wasn’t put to the test until the 21st century. A few population studies had suggested that those eating meals irregularly were at a metabolic disadvantage, but the first interventional studies weren’t published until 2004. Subjects were randomized to eat their regular diets divided into six regular eating occasions a day or three to nine daily occasions in an irregular manner. Researchers found that an irregular eating pattern can cause a drop in insulin sensitivity and a rise in cholesterol levels, as well as reduce the calorie burn immediately after meals in both lean and obese individuals. The study participants ended up eating more, though, on the irregular meals, so it’s difficult to disentangle the circadian effects. The fact that overweight individuals may overeat on an irregular pattern may be telling in and of itself, but it would be nice to see such a study repeated using identical diets to see if irregularity itself has metabolic effects.

    Just such a study was published in 2016: During two periods, people were randomized to eat identical foods in a regular or irregular meal pattern. As you can see in the graph below and at 4:47 in my video, during the irregular period, people had impaired glucose tolerance, meaning higher blood sugar responses to the same food.

    They also had lower diet-induced thermogenesis, meaning the burning of fewer calories to process each meal, as seen in the graph below and at 4:55 in my video.

    The difference in thermogenesis only came out to be about ten calories per meal, though, and there was no difference in weight changes over the two-week periods. However, diet-induced thermogenesis can act as “a satiety signal.” The extra work put into processing a meal can help slake one’s appetite. And, indeed, “lower hunger and higher fullness ratings” during the regular meal period could potentially translate into better weight control over the long term. 

The series on chronobiology is winding down, with just two videos left: Shedding Light on Shedding Weight and Friday Favorites: Why People Gain Weight in the Fall.

    If you missed any of the other videos, see the related posts below. 
     


  • Morning Calories vs. Evening Calories  | NutritionFacts.org


    Why are calories eaten in the morning less fattening than calories eaten in the evening? 

    One reason calories consumed in the morning are less fattening than those eaten in the evening is that more calories are burned off in the morning due to diet-induced thermogenesis. That’s the amount of energy the body takes to digest and process a meal, given off in part as waste heat. If people are given the same meal in the morning, afternoon, or night, their body uses up about 25 percent more calories to process it in the afternoon than at night and about 50 percent more calories to digest it in the morning, as you can see below and at 0:36 in my video Eat More Calories in the Morning Than the Evening. That leaves fewer net calories in the morning to be stored as fat.

    Let’s put some actual numbers to it. A group of Italian researchers randomized 20 people to eat the same standardized meal at either 8:00 am or 8:00 pm and had them return a week later to do the opposite. So, each person had a chance to eat the same meal for breakfast and dinner. After every meal, the study participants were placed in a “calorimeter” contraption to precisely measure how many calories they were burning over the next three hours. As you can see below and at 1:18 in my video, the researchers calculated that the meal given in the morning took about 300 calories to digest, whereas the same meal given at night only used up about 200 calories to process. The meal was about 1,200 calories, but, when eaten in the morning, it ended up only providing about 900 calories compared to more like 1,000 calories at night. Same meal, same food, same amount of food, but effectively 100 fewer calories when consumed in the morning rather than at night. So, a calorie is not just a calorie. It depends on when we eat it. 
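The study’s bottom line reduces to simple subtraction. A minimal sketch of that net-calorie arithmetic, using the round numbers reported above:

```python
def net_calories(meal_kcal, thermogenesis_kcal):
    """Calories effectively retained after the body's cost of processing the meal."""
    return meal_kcal - thermogenesis_kcal

# Round numbers from the study described above: a ~1,200 kcal meal costing
# ~300 kcal to digest in the morning vs. ~200 kcal at night.
morning = net_calories(1200, 300)  # ~900 kcal effectively retained
evening = net_calories(1200, 200)  # ~1,000 kcal effectively retained

print(morning, evening, evening - morning)
```

The difference of about 100 calories per identical meal is what makes the timing itself matter, independent of what or how much is eaten.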

    But why do we burn more calories when eating a morning meal? Is it behavioral or biological? If you started working the graveyard shift, sleeping during the day and working all night, which meal would net you fewer calories? Would it be the “breakfast” you had at night before you went to work or the “dinner” you had in the morning before you went to bed? In other words, is it something about eating before you go to sleep that causes your body to hold onto more calories, or is it built into our circadian rhythm, where we store more calories at night regardless of what we’re doing? You don’t know until you put it to the test.

    Harvard researchers randomized people to identical meals at 8:00 am versus 8:00 pm while under simulated night shifts or day shifts. Regardless of activity level or sleeping cycle, the number of calories that were burned processing the morning meals was 50 percent higher than in the evening, as you can see in the graph below and at 2:45 in my video. So, the difference is explained by chronobiology: It’s just part of our circadian rhythm to burn more meal calories in the morning. But, why? What exactly is going on? 

    How does it make sense for our body to waste calories in the morning when we have the whole day ahead of us? 

    Our body isn’t so much wasting calories as investing them. When we eat in the morning, our body bulks up our muscles with glycogen, which is the primary energy reserve our body uses to fuel our muscles, but this takes energy. In the evening, our body expects to be sleeping for much of the next 12 hours, so rather than storing blood sugar as extra glycogen in our muscles, it preferentially uses it as an energy source, which may end up meaning we burn less of our backup fuel (body fat). In the morning, however, our body expects to be running around all day, so instead of just burning off breakfast, our body continues to dip into its fat stores while we use breakfast calories to stuff our muscles full of the energy reserves we need to move around over the day. That’s where the “inefficiency” may come from. The reason it costs more calories to process a morning meal is that, instead of just burning glucose (blood sugar) directly, our body uses up energy to string glucose molecules together into chains of glycogen in our muscles, which are then just going to be broken back down into glucose later in the day. That extra assembly/disassembly step takes energy—energy that our body takes out from the meal, leaving us with fewer calories.

    So, in the morning, our muscles are especially sensitive to insulin, rapidly pulling blood sugar out of our bloodstream to build up glycogen reserves. At night, though, our muscles become relatively insulin-resistant and resist the signal to take in extra blood sugar. So, does that mean you get a higher blood sugar and insulin spike in the evening compared to eating the same meal in the morning? Yes. As you can see in the graph below and at 5:02 in my video, in that 100-calorie-difference study, for example, blood sugars rose twice as high after the 8:00 pm meal compared to the same meal eaten in the morning.

So, shifting the bulk of our caloric intake towards the morning would appear to have a dual benefit—more weight loss and better blood sugar control, as shown in the graph below and at 5:12 in my video.

    If you thought dual benefits sounded good, stay tuned for triple benefits! I dive deeper into circadian rhythms. See related posts below.

    My last few videos (see below) focus on why science points to loading your calories towards the beginning of the day.


  • Lose Weight by Eating More in the Morning  | NutritionFacts.org


    A calorie is not a calorie. It isn’t only what you eat, but when you eat.

Mice are nocturnal creatures. They eat during the night and sleep during the day. However, if you only feed mice during the day, they gain more weight than if they were fed a similar number of calories at night. Same food and about the same amount of food, but different weight outcomes, as you can see in the graph below and at 0:18 in my video Eat More Calories in the Morning to Lose Weight, suggesting that eating at the “wrong” time may lead to disproportionate weight gain. In humans, the wrong time would presumably mean eating at night.

    Recommendations for weight management often include advice to limit nighttime food consumption, but this was largely anecdotal until it was first studied experimentally in 2013. Researchers instructed a group of young men not to eat after 7:00 pm for two weeks. Compared to a control period during which they continued their regular habits, they ended up about two pounds lighter after the night-eating restriction. This is not surprising, given that dietary records show the study participants inadvertently ate fewer calories during that time. To see if timing has metabolic effects beyond just foreclosing eating opportunities, you’d have to force people to eat the same amount of the same food, but at different times of the day. The U.S. Army stepped forward to carry out just such an investigation.

    In their first set of experiments, Army researchers had people eat a single meal a day either as breakfast or dinner. The results clearly showed the breakfast group lost more weight, as you can see in the graph below and at 1:35 in my video. When study participants ate only once a day at dinner, their weight didn’t change much, but when they ate once a day at breakfast, they lost about two pounds a week. 

    Similar to the night-eating restriction study, this is to be expected, given that people tend to be hungrier in the evening. Think about it. If you went nine hours without eating during the day, you’d be famished, but people go nine hours without eating overnight all the time and don’t wake up ravenous. There is a natural circadian rhythm to hunger that peaks around 8:00 pm and drops to its lowest level around 8:00 am, as you can see in the graph below and at 2:09 in my video. That may be why breakfast is typically the smallest meal of the day. 

    The circadian rhythm of our appetite isn’t just behavioral, but biological, too. It’s not just that we’re hungrier in the evening because we’ve been running around all day. If you stayed up all night and slept all day, you’d still be hungriest when you woke up that evening. To untangle the factors, scientists used what’s called a “forced desynchrony” protocol. Study participants stayed in a room without windows in constant, unchanging, dim light and slept in staggered 20-hour cycles to totally scramble them up. This went on for more than a week, so the subjects ended up eating and sleeping at different times throughout all phases of the day. Then, the researchers could see if cyclical phenomena are truly based on internal clocks or just a consequence of what you happen to be doing at the time.  

    For instance, there is a daily swing in our core body temperature, blood pressure, hormone production, digestion, immune activity, and almost everything else, but let’s use temperature as an example. As you can see in the graph below and at 3:21 in my video, our body temperature usually bottoms out around 4:00 am, dropping from 98.6°F (37°C) down to more like 97.6°F (36.4°C). Is this just because our body cools down as we sleep? No. By keeping people awake and busy for 24 hours straight, it can be shown experimentally that it happens at about the same time no matter what. It’s part of our circadian rhythm, just like our appetite. It makes sense, then, if you are only eating one meal per day and want to lose weight, you’d want to eat in the morning when your hunger hormones are at their lowest level. 

    Sounds reasonable, but it starts to get weird.

    The Army scientists repeated the experiment, but this time, they had the participants eat exactly 2,000 calories either as breakfast or as dinner, taking appetite out of the picture. The subjects weren’t allowed to exercise either. Same number of calories, so the same change in weight, right? No. As you can see in the graph below and at 4:18 in my video, the breakfast-only group still lost about two pounds a week compared to the dinner-only group. Two pounds of weight loss eating the same number of calories. That’s why this concept of chronobiology, meal timing—when to eat—is so important. 

Isn’t that wild? Two pounds of weight loss a week eating the same number of calories! That was a pretty extreme study, though. What about just shifting a greater percentage of calories to earlier in the day? That’s the subject of my next video: Breakfast Like a King, Lunch Like a Prince, Dinner Like a Pauper. First, though, let’s take a break from chronobiology to look at the Benefits of Garlic for Fighting Cancer and the Common Cold. Then, we’ll pick the series back up with the other videos in the related posts below.

    If you missed the first three videos in this extended series, also check out related posts below. 

    [ad_2]

    Michael Greger M.D. FACLM

    Source link

  • Are Branched-Chain Amino Acids Good for Us?  | NutritionFacts.org

    Are Branched-Chain Amino Acids Good for Us?  | NutritionFacts.org

    [ad_1]

    I discuss why we may not want to exceed the recommended intake of protein.

    Diabetes isn’t just about the amount of body fat, but also the distribution of body fat. At 0:26 in my video Are BCAA (Branched-Chain Amino Acids) Healthy?, you can view cross-sections of thighs from two different patients using MRI. In the images, the fat shows up as white and the thigh muscle is black. At first glance, you might think the bottom cross-section has more fat since it’s ringed with more white. That is the subcutaneous fat, the fat under the skin. But, if you look at the top cross-section, you’ll see how the middle of the thigh muscle is more marbled with fat, like those really fatty Japanese beef steaks. That is the fat infiltrating into the muscle. In the graph below and at 0:48 in my video, the two cross sections are colored so you can see the different types of fat: the fat infiltrating the muscle in red, the fat between the muscles in green, and subcutaneous fat outside of the muscles and under the skin in yellow. If you add up all three types of fat, both of those thighs actually have the same amount of fat—just distributed differently.

    This seems to be the critical factor in terms of determining insulin resistance, the cause of type 2 diabetes. Researchers found that the subcutaneous adipose tissue, the fat right under the skin, was not associated with insulin resistance. Going back to the two cross sections, as seen below and at 1:20 in my video, it is healthier to have the bottom thigh with the thicker ring of subcutaneous fat but less fat infiltrating muscle than the top thigh with more fat present in the muscle.

Is it possible a more plant-based diet also promotes a more healthful distribution of fat?

    We now know the effect of a vegetarian diet versus a conventional diabetic diet on thigh fat distribution in patients with type 2 diabetes. Researchers took 74 people with diabetes and randomly assigned them to follow either a vegetarian diet or a conventional diabetic diet. Both diets were calorie-restricted by the same number of calories. The vegetarian diet was also egg-free, and dairy was limited to a maximum of one serving of low-fat yogurt a day. What did the researchers find? The reduction in the more benign subcutaneous fat was comparable; it was about the same in both groups. However, the more dangerous fat—the fat lodged inside the muscle itself—“was reduced only in response to a vegetarian diet.” So, even getting the same number of calories, there can be a healthier weight loss on a more plant-based diet.

    Those eating strictly plant-based also had lower levels of fat stuck inside the individual muscle fibers themselves, which may help explain why vegans in particular are often found to have the lowest odds of diabetes. It is not just because vegans are generally slimmer either. Even if you match subjects pound for pound, there is significantly less fat inside the muscle cells of vegans compared to omnivores. This is a good thing, since storing fat in muscle cells “may be one of the primary causes of insulin resistance,” which is what’s behind both prediabetes and type 2 diabetes. On the other hand, if you put someone on a high-fat diet, the fat in their muscle cells shoots up by 54 percent in just a single week.

What about a high-protein diet? It may undermine one of the principal benefits of weight loss by eliminating the weight-loss-induced improvement in insulin sensitivity. Researchers put obese individuals on a calorie-restricted diet of less than 1,400 calories a day until they lost 10 percent of their body weight. Half of the participants got a regular protein intake (73 grams a day), and the other half were on a higher-protein diet (about 105 grams a day). Normally, if you lose 10 percent of your body weight, your insulin sensitivity improves. That’s why it is so critical for obese individuals with type 2 diabetes to lose weight. However, the beneficial effect of a 10 percent weight loss was eliminated by the high protein intake. Those extra 32 grams of protein a day abolished the weight-loss benefit. “The failure to improve…insulin sensitivity in the WL-HP [weight-loss high-protein] group is clinically important because it reflects a failure to improve a major pathophysiological [cause-and-effect] mechanism involved in the development of T2D,” type 2 diabetes. In summary, the researchers concluded that they had demonstrated “the protein content of a weight loss diet can have profound effects on metabolic function.” 

Is this true of any protein? As you can see below and at 4:19 in my video, if you split protein intake into animal versus plant sources and follow people over time, intake of animal protein is associated with an increased risk of diabetes in most studies.

Intake of plant protein, however, appears to have either a neutral or even protective association with diabetes, as shown below and at 4:25 in my video.

    Those were just observational studies, though. People who eat a lot of animal protein might have many unhealthy behaviors. However, you see the same thing in randomized, controlled, interventional trials, where you can improve blood sugar control just by replacing sources of animal protein with plant protein.

    We think it may be the branched-chain amino acids concentrated in animal protein. Higher levels in the bloodstream are associated with obesity and the development of insulin resistance. As you can see below and at 5:00 in my video, we may be able to drop our levels by sticking to plant proteins, but you don’t know if that has metabolic effects until you put it to the test. 

Ruining the suspense, researchers titled their study “Decreased Consumption of Branched-Chain Amino Acids Improves Metabolic Health.” They demonstrated that “a moderate reduction in total dietary protein or selected amino acids can rapidly improve metabolic health,” and this included improving blood sugar control, while also decreasing body mass index (BMI) and body fat. As you can see at 5:27 in my video, the protein-restricted group was eating hundreds more calories per day than the control group, so they should have gained weight. But, no. They lost weight! After about a month and a half, they were eating more calories but lost more weight—about five more pounds than participants in the control group who were eating fewer calories, as you can see at 5:38 in my video. What’s more, this “protein restriction” had people eat the recommended amount of protein, about 56 grams a day. They should have been called the normal-protein or recommended-protein group instead, and the group eating more typical American protein levels and suffering because of it should have been called the excess-protein group. Just sticking to the recommended protein intake also doubled the levels of a pro-longevity hormone called FGF21, but we’ll save that for another discussion.

    To better understand the negative impact of omnivores getting too much protein relative to vegetarians, see my video Flashback Friday: Do Vegetarians Get Enough Protein?.

I have several additional videos and blogs that may help explain some of the benefits of plant-based proteins. Check the related posts below.

Of course, the best way to treat type 2 diabetes is to get rid of it by treating the underlying cause, as described in my video How Not to Die from Diabetes.

    [ad_2]

    Michael Greger M.D. FACLM

    Source link

  • Pre-Cut Vegetables and Endotoxins  | NutritionFacts.org

    Pre-Cut Vegetables and Endotoxins  | NutritionFacts.org

    [ad_1]

    Endotoxins can build up on pre-cut vegetables and undermine some of their benefits.

    You may remember when I introduced the endotoxin theory literature in my video The Exogenous Endotoxin Theory, which sought to explain how a single Sausage and Egg McMuffin meal could cripple artery function within hours of consumption. Maybe it’s because such a meal causes inflammation within hours of consumption by inducing low-grade endotoxemia, endotoxins in the bloodstream, as I previously discussed in my video Dead Meat Bacteria Endotoxemia. Endotoxins are structural components of gram-negative bacteria like E. coli, as you can see below and at 0:35 in my video Are Pre-Cut Vegetables Just as Healthy?. Certain foods, like ground meat, have high bacterial loads, so the thought was that the endotoxins in the food were triggering the inflammation.

    Critics of the theory argued that because we already have so many bacteria living in our colon, so many endotoxins just sitting down in our large intestine, a few more endotoxins in our food wouldn’t matter much in terms of causing systemic inflammation. After all, we have about two pounds of pure bacteria down there where the sun don’t shine, so there could be about a whole ounce of endotoxin. The lethal dose of intravenously injected endotoxin can be just a few millionths of a gram, so we could have a million lethal doses down there. However, the apparent paradox is explained by compartmentalization. It’s location, location, location.
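A quick back-of-the-envelope check of that claim, assuming roughly 28.35 grams per ounce and taking “a few millionths of a gram” as about 3 micrograms per lethal dose (both figures here are illustrative assumptions, not from the article):

```python
# Rough sanity check: how many IV-lethal doses of endotoxin could fit
# in about one ounce? Assumed figures: 1 oz ≈ 28.35 g, and a lethal
# dose of ~3 millionths of a gram (3 µg).
OUNCE_IN_GRAMS = 28.35
LETHAL_DOSE_GRAMS = 3e-6

doses = OUNCE_IN_GRAMS / LETHAL_DOSE_GRAMS
print(f"{doses:,.0f}")  # roughly 9.5 million, comfortably over a million
```

Even with generous rounding in either direction, the order of magnitude supports the article’s “million lethal doses” figure.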

    Poop is harmless when it’s in your colon, but it shouldn’t be injected into your bloodstream or eaten for that matter, particularly with fat, as that can promote the absorption of endotoxins in the small intestine. That goes for well-cooked poop, too.

As you can see in the graph below and at 1:44 in my video, you can boil endotoxins for two hours straight with no detriment to their ability to induce inflammation. You could easily kill off any bacteria if you boiled your poop soup long enough, but you can’t kill off the endotoxins they make, just like you can’t cook the crap out of the meat. And the consumption of meat contaminated with feces doesn’t just cause food poisoning. Feces can spill out onto the animal’s skin during the evisceration process, when the digestive tract is ruptured. 

Even when slaughterhouse workers trim off “visible fecal contamination,” the trimming itself can, ironically, sometimes lead to an increase in certain fecal bacteria, thought to be caused by “cross-contamination resulting from the handling to remove fecal contamination” from one carcass to the next. Then, even when properly stored in the fridge, endotoxins start accumulating along with the bacterial growth, as you can see in the graph below and at 2:30 in my video.

    What about other foods? The highest levels of endotoxins were found in meat and dairy, and the lowest levels in fresh fruits and vegetables. That was testing whole fruits and vegetables, though. “Most spoilage organisms cannot penetrate the plant’s surface barrier and spoil the inner tissues.” That’s why fruits and veggies can sit out in the fields all day in the sun. But, once you cut them open, bacteria can gain access to the inner tissues, and, within a matter of days, your veggies can start to spoil. So, what does that mean for all those convenient pre-cut veggies these days?

While endotoxins were not detectable in the majority of unprocessed vegetables, once you damage the protective outer layers of vegetables, you diminish their resistance to microbial growth. So, while freshly cut carrots and onions start with undetectable levels, in the days after they’ve been chopped, you start to get the growth of bacteria and, along with them, endotoxin buildup—even if they’ve been kept chilled in the fridge. Not as much as meat, but not insignificant either, as you can see in the graph below and at 3:27 in my video. Enough to make a difference, though? You don’t know until you put it to the test.

What would happen if you switched people between foods expected to have a lower endotoxin load and foods containing more endotoxins? For instance, going from intact meat, such as a steak, and whole fruits and vegetables to ground beef, pre-cut veggies, and more ready-made meals, as shown below and at 3:39 in my video. After just one week on the lower-endotoxin diet, people’s white blood cell count, which is an indicator of total body inflammation, dropped by 12 percent, then bumped back up by 14 percent after just four days on the higher-endotoxin diet. They also lost a pound and a half on the lower-endotoxin diet and slimmed their waists a bit. 

They weren’t eating otherwise identical diets, though. It looks like they were eating more meat and cheese on the higher-endotoxin diet and perhaps getting more food additives in the ready-made meals. So, how do we know endotoxins had anything to do with it? That’s where the onion study comes in. Another study was designed around two meals that differed in their content of bacterial products but were otherwise nutritionally identical. Researchers compared freshly chopped onion to pre-chopped onion that had been refrigerated for a few days. The pre-chopped onion wasn’t spoiled; it was still before its “best before” date. So, would it make any difference?

    Within three hours of consumption, the fresh onion meal caused significant reductions in several markers of inflammation. That’s what fruits and vegetables do—they reduce inflammation—but these effects were not observed after eating the pre-chopped onions. For example, three hours after eating freshly chopped onions, researchers saw a significant drop in inflammatory status, but there was no significant change three hours after eating the same amount of pre-chopped onions, as you can see in the graph below and at 5:06 in my video. So, it’s not like the pre-chopped onions caused more inflammation, like in the meat, eggs, and dairy studies, but it did appear that some of the onion’s anti-inflammatory effects were extinguished. “In conclusion, the modern trend towards eating minimally processed vegetables”—pre-cut vegetables—“rather than whole [intact] foods is likely to be associated with increased oral endotoxin exposure.” It’s still better to eat pre-cut veggies than no veggies, but cutting your own might be the healthiest.

For some other practical veggie videos and blogs, check out the related posts below. 

    [ad_2]

    Michael Greger M.D. FACLM

    Source link

  • Are Fortified Children’s Breakfast Cereals Just Candy?  | NutritionFacts.org

    Are Fortified Children’s Breakfast Cereals Just Candy?  | NutritionFacts.org

    [ad_1]

    The industry responds to the charge that breakfast cereals are too sugary.

    In 1941, the American Medical Association’s Council on Foods and Nutrition was presented with a new product, Vi-Chocolin, a vitamin-fortified chocolate bar, “offered ostensibly as a specialty product of high nutritive value and of some use in medicine, but in reality intended for promotion to the public as a general purpose confection, a vitaminized candy.” Surely, something like that couldn’t happen today, right? Unfortunately, that’s the sugary cereal industry’s business model.

As I discuss in my video Are Fortified Kids’ Breakfast Cereals Healthy or Just Candy?, nutrients are added to breakfast cereals “as a marketing gimmick to create an aura of healthfulness…If those nutrients were added to soft drinks or candy, would we encourage kids to consume them more often?” Would we feed our kids Coke and Snickers for breakfast? We might as well spray cotton candy with vitamins, too. As one medical journal editorial read, “Adding vitamins and minerals to sugary cereals…is worse than useless. The subtle message accompanying such products is that it is safe to eat more.”

General Mills’ “Grow up strong with Big G kids’ cereals” ad campaign featured products like Lucky Charms, Trix, and Cocoa Puffs. That’s like the dairy industry promoting ice cream as a way to get your calcium. Kids who eat presweetened breakfast cereals may get more than 20 percent of their daily calories from added sugar, as you can see below and at 1:28 in my video.

Most sugar in the American diet comes from beverages like soda, but breakfast cereals represent the third largest food source of added sugars in the diets of children and adolescents, wedged between candy and ice cream. On a per-serving basis, there is more added sugar in a cereal like Frosted Flakes than there is in frosted chocolate cake, a brownie, or even a frosted donut, as you can see below and at 1:48 in my video.

    Kellogg’s and General Mills argue that breakfast cereals only contribute a “relatively small amount” of sugar to the diets of children, less than soda, for example. “This is a perfect example of the social psychology phenomenon of ‘diffusion of responsibility.’ This behavior is analogous to each restaurant in the country arguing that it should not be required to ban smoking because it alone contributes only a tiny fraction to Americans’ exposure to secondhand smoke.” In fact, “each source of added sugar…should be reduced.”

    The industry argues that most of their cereals have less than 10 grams of sugar per serving, but when Consumer Reports measured how much cereal youngsters actually poured for themselves, they were found to serve themselves about 50 percent more than the suggested serving size for most of the tested cereals. The average portion of Frosted Flakes they poured for themselves contained 18 grams of sugar, which is 4½ teaspoons or 6 sugar packets’ worth. It’s been estimated that a “child eating one serving per day of a children’s cereal containing the average amount of sugar would consume nearly 1,000 teaspoons of sugar in a year.”
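The serving arithmetic above is easy to verify, assuming the usual round figures of about 4 grams of sugar per teaspoon and about 3 grams per sugar packet (and taking roughly 11 grams as the average sugar in a kids’ cereal serving, an assumed figure consistent with the yearly total quoted):

```python
GRAMS_PER_TEASPOON = 4   # common round figure for granulated sugar
GRAMS_PER_PACKET = 3     # typical single-serve sugar packet

poured_sugar_g = 18                               # self-served Frosted Flakes portion
teaspoons = poured_sugar_g / GRAMS_PER_TEASPOON   # 4.5 teaspoons
packets = poured_sugar_g / GRAMS_PER_PACKET       # 6 packets

# One daily serving of an average kids' cereal (~11 g sugar) over a year:
yearly_teaspoons = 11 / GRAMS_PER_TEASPOON * 365  # ≈ 1,004 teaspoons
```

With those round numbers, the 18-gram pour works out to the quoted 4½ teaspoons or 6 packets, and a daily serving lands at nearly 1,000 teaspoons a year.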

General Mills offers the “Mary Poppins defense,” arguing that those spoonfuls of sugar can “help the medicine go down” and explaining that “if sugar is removed from bran cereal, it would have the consistency of sawdust.” As you can see below and at 3:17 in my video, a General Mills representative wrote that the company is presented “with an untenable choice between making our healthful foods unpalatable or refraining from advertising them.” If it can’t add sugar to its cereals, they would be unpalatable? If one has to add sugar to a product to make it edible, that should tell us something. That’s a characteristic of so-called ultra-processed foods, which have to be packed full of things like sugar, salt, and flavorings “to give flavor to foods that have had their [natural] intrinsic flavors processed out of them and to mask any unpleasant flavors in the final product.” 

    The president of the Cereal Institute argued that without sugary cereals, kids might not eat breakfast at all. (This is similar to dairy industry arguments that removing chocolate milk from school cafeterias may lead to students “no longer purchasing school lunch.”) He also stressed we must consider the alternatives. As Kellogg’s director of nutrition once put it: “I would suggest that Fruit [sic] Loops as a snack are much better than potato chips or a sweet roll.” You know there’s a problem when the only way to make your product look good is to compare it to Pringles and Cinnabon.

    Want a healthier option? Check out my video Which Is a Better Breakfast: Cereal or Oatmeal?.

For more on the effects of sugar on the body, and if you like these more politically charged videos, see the related posts below.

    Finally, for some additional videos on cereal, see Kids’ Breakfast Cereals as Nutritional Façade and Ochratoxin in Breakfast Cereals.

    [ad_2]

    Michael Greger M.D. FACLM

    Source link

  • Like Marijuana, Sleep Deprivation Can Trigger The Munchies

    Like Marijuana, Sleep Deprivation Can Trigger The Munchies

    [ad_1]

Up to 75 million adults in Canada and the US suffer from sleep issues, with almost 40 percent unexpectedly dozing off at some point during the day at least once a month. Sleep deficiency can disrupt our mental processes, making work, school, driving, and social functioning more challenging. And you could wind up gaining weight: studies show that, like marijuana, sleep deprivation can trigger the munchies.

Previous research has linked not getting enough sleep with nighttime snacking and junk food cravings. But a study published in the journal eLife examined the neural pathways connecting munchie symptoms and sleep deprivation. When individuals received only four hours of sleep instead of the recommended eight, levels of certain compounds in the body’s endocannabinoid system increased, driving cravings for high-calorie foods.

    RELATED: Lower Doses Of Marijuana Might Improve Your Sex Drive

The endocannabinoid system, or ECS for short, regulates various biological processes like sleep, appetite, internal temperature, and more. All mammals have an ECS, not just humans. These functions are adjusted through endocannabinoids, which your body creates naturally and which are similar to the phytocannabinoids in marijuana. When you smoke marijuana, it activates these receptors, making your body crave high-calorie foods. According to this study, sleep deprivation works in much the same way.

To begin the study, researchers asked 25 subjects to get seven to nine hours of sleep each night for a week. The following week, researchers randomly assigned half of the participants to sleep four hours on certain nights and kept the other half on standard sleep schedules. In a cruel temptation, researchers then had everyone eat from a buffet, where their food choices were monitored, including what foods they ate and how much.


    Sleep deprivation didn’t necessarily increase the amount of food participants ate, the study found. But it did affect the kinds of foods they chose, often opting for fattier and higher-calorie foods.

    “Importantly, effects of sleep deprivation on dietary behavior persisted into the next day (after a night of unrestricted recovery sleep), with a higher percentage of calories consumed,” researchers wrote.

    RELATED: Cannabis Flavonoid Could Provide Breakthrough Treatment Against Pancreatic Cancer

Scientists also conducted fMRI scans regularly throughout the study to track the brain’s olfactory system. Participants were exposed to a variety of odors, and their reactions were tracked. Those who were sleep deprived had far stronger reactions to food odors than to all other odors.

    “Taken together, our findings show that sleep-dependent changes in food choices are associated with changes in an olfactory pathway that is related to the ECS,” researchers wrote. “This pathway is likely not restricted to sleep-dependent changes in food intake but may also account for dietary decisions more generally. In this regard, our current findings may help to guide the identification of novel targets for treatments of obesity.”

    [ad_2]

    Brendan Bures

    Source link

  • Restricting Calories for Longevity?  | NutritionFacts.org

    Restricting Calories for Longevity?  | NutritionFacts.org

    [ad_1]

    Though a bane for dieters, a slower metabolism may actually be a good thing.

    We’ve known for more than a century that calorie restriction can increase the lifespan of animals, and metabolic slowdown may be the mechanism. That could be why the tortoise lives ten times longer than the hare. Rabbits can live for 10 to 20 years, whereas “Harriet,” a tortoise “allegedly collected from the Galapagos Islands by Charles Darwin, was estimated to be about 176 years old when she died in 2006.” Slow and steady may win the race. 
     
    As I discuss in my video The Benefits of Calorie Restriction for Longevity, one of the ways our body lowers our resting metabolic rate is by creating cleaner-burning, more efficient mitochondria, the power plants that fuel our cells. It’s like our body passes its own fuel-efficiency standards. These new mitochondria create the same energy with less oxygen and produce less free radical “exhaust.” After all, when our body is afraid famine is afoot, it tries to conserve as much energy as it can. 
     
    Indeed, the largest caloric restriction trial to date found metabolic slowing and a reduction in free radical-induced oxidative stress, both of which may slow the rate of aging. The flame that burns twice as bright burns half as long. But, whether this results in greater human longevity is an unanswered question. Caloric restriction is often said “to extend lifespan in every species studied,” but that isn’t even true of all strains within a single species. Two authors of one article, for instance, don’t even share the same view: One doesn’t think calorie restriction will improve human longevity at all, while the other suggests that a 20 percent calorie restriction starting at age 25 and sustained for 52 years could add five years onto your life. Either way, the reduced oxidative stress would be expected to improve our healthspan. 
     
    Members of the Calorie Restriction Society, self-styled CRONies (for Calorie-Restricted Optimal Nutrition), appear to be in excellent health, but they’re a rather unique, self-selected group of individuals. You don’t really know until you put it to the test. Enter the CALERIE study, the Comprehensive Assessment of Long-Term Effects of Reducing Intake of Energy, the first clinical trial to test the effects of caloric restriction. 
     
Hundreds of non-obese men and women were randomized to two years of 25 percent calorie restriction. They only ended up achieving half that, yet they still lost about 18 pounds and three inches off their waists, wiping out more than half of their visceral abdominal fat, as you can see in the graph below and at 2:47 in my video.

    That translated into significant improvements in cholesterol levels, triglycerides, insulin sensitivity, and blood pressure, which you can see in the graph below and at 2:52 in my video. Eighty percent of those who were overweight when they started were normal-weight by the end of the trial, “compared with a 27% increase in those who became overweight in the control group.” 

    In the famous Minnesota Starvation Study that used conscientious objectors as guinea pigs during World War II, the study subjects suffered both physically and psychologically, experiencing depression, irritability, and loss of libido, among other symptoms. The participants started out lean, though, and had their calorie intake cut in half. The CALERIE study ended up being four times less restrictive, only about 12 percent below baseline calorie intake, and enrolled normal-weight individuals, which in the United States these days means overweight on average. As such, the CALERIE trial subjects experienced nothing but positive quality-of-life benefits, with significant improvements in mood, general health, sex drive, and sleep. They only ended up eating about 300 fewer calories a day than they had eaten at baseline. So, they got all of these benefits—the physiological benefits and the psychological benefits—just from cutting about a small bag of chips’ worth of calories from their daily diets. 
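As a rough illustration of why the slowed metabolism mentioned here matters, the textbook static rule of about 3,500 calories per pound of fat would predict far more weight loss from a 300-calorie daily deficit over two years than the roughly 18 pounds actually observed (the calculation below is a sketch using those round numbers, not the trial’s own model):

```python
DEFICIT_KCAL_PER_DAY = 300  # approximate deficit the CALERIE subjects achieved
DAYS = 2 * 365              # two-year trial
KCAL_PER_POUND = 3500       # classic static rule; ignores adaptation

predicted_loss_lb = DEFICIT_KCAL_PER_DAY * DAYS / KCAL_PER_POUND
print(f"{predicted_loss_lb:.0f} lb predicted vs ~18 lb observed")
# The gap reflects metabolic adaptation: resting energy expenditure
# falls as intake is restricted, so the static rule overpredicts.
```

The static rule predicts roughly triple the observed loss, which is consistent with the metabolic slowing the trial measured.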
     
What happened at the end of the trial, though? As researchers saw in the Minnesota Starvation Study and in calorie deprivation experiments done on Army Rangers, as soon as the subjects were released from restriction, they tended to rapidly regain the weight and sometimes even more, as you can see below and at 4:18 in my video.

The leaner they started out, the more their bodies seemed to drive them to overeat to pack back on the extra body fat, as seen in the graph below and at 4:27 in my video. In contrast, after the completion of the CALERIE study, even though their metabolism was slowed, the participants retained about 50 percent of the weight loss two years later. They must have acquired new eating attitudes and behaviors that allowed them to keep their weight down. After extended calorie restriction, for example, cravings for sugary, fatty, and junky foods may actually go down. 

This is part of my series on calorie restriction, intermittent fasting, and time-restricted eating. See related videos below.

    [ad_2]

    Michael Greger M.D. FACLM

    Source link

  • Sugar and Gaining Weight  | NutritionFacts.org

    Sugar and Gaining Weight  | NutritionFacts.org

    [ad_1]

    The sugar industry responds to evidence implicating sweeteners in the obesity epidemic. 
     
In terms of excess body fat, the “well-documented obesity epidemic may merely be the tip of the overfat iceberg.” It’s been estimated that 91 percent of adults—nine out of ten of us—and 69 percent of children in the United States are overfat, a condition defined as having “excess body fat sufficient to impair health.” This can occur even in individuals who are “normal-weight and non-obese, often due to excess abdominal fat.” The way to tell if you’re overfat is if your waist circumference is more than half your height. What’s causing this epidemic? As I discuss in my video Does Sugar Lead to Weight Gain?, one primary cause may be all the added sugars we’re eating.
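The waist-circumference rule of thumb above is simple enough to sketch directly; both measurements just need to be in the same units (the function name is my own, not from the article):

```python
def is_overfat(waist, height):
    """Article's rule of thumb: overfat if waist circumference
    exceeds half of height (same units for both)."""
    return waist > height / 2

# For example, someone 70 inches tall:
is_overfat(37, 70)   # True: 37 > 35
is_overfat(33, 70)   # False: 33 < 35
```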
     
A century ago, sugar was heralded as one of the cheapest forms of calories in the diet. Just ten cents’ worth of sugar could furnish thousands of calories. Dr. Fredrick Stare, “Harvard’s sugar-pushing nutritionist,” bristled at the term “empty calories,” writing that the calories in sugar were “not empty but full of energy”—in other words, full of calories, which we are now getting too much of. The excess body weight of the U.S. population corresponds to about a daily 350- to 500-calorie excess on average. So, “to revert the obesity epidemic,” that’s how many calories we have to reduce, but which calories should we cut? As you can see below and at 1:33 in my video, the majority of Americans who fail to meet the Dietary Guidelines’ sugar limit get about that many calories in added sugars every day: Twenty-five teaspoons’ worth of added sugars is about 400 calories. 
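That teaspoons-to-calories figure checks out with standard round numbers, assuming about 4 grams of sugar per teaspoon and 4 calories per gram of carbohydrate:

```python
GRAMS_PER_TEASPOON = 4   # granulated sugar, round figure
KCAL_PER_GRAM = 4        # calories per gram of carbohydrate

teaspoons_of_added_sugar = 25
calories = teaspoons_of_added_sugar * GRAMS_PER_TEASPOON * KCAL_PER_GRAM
print(calories)  # 400, squarely in the 350- to 500-calorie daily excess
```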

    There are die-hard sugar defenders. James Rippe, for example, was reportedly paid $40,000 a month by the high fructose corn syrup industry—and that was on top of the $10 million it paid for his research. Even Dr. Rippe considers it “undisputable that sugars…contribute to obesity. It is also undisputable that sugar reduction…should be part of any weight loss program.” And, of all sources of calories to limit, since sugar is just empty calories and contains no essential nutrients, “reducing sugar consumption is obviously the place to start.” And, again, this is what the researchers funded by the likes of Dr. Pepper and Coca-Cola are saying. The primary author of “Dietary Sugar and Body Weight: Have We Reached a Crisis in the Epidemic of Obesity and Diabetes?…,” Richard Kahn, is infamous for his defense of the American Beverage Association—the soda industry—and he was the chief science officer at the American Diabetes Association when it signed a million-dollar sponsorship deal with the world’s largest candy company. “Maybe the American Diabetes Association should rename itself the American Junk Food Association,” said the director of a consumer advocacy group. What do you expect from an organization that was started with drug industry funding? 
     
    The bottom line is that “randomised controlled trials show that increasing sugars intake increases energy [calorie] intake” and “increasing sugar intake leads to body weight gain in adults, and…sugar reduction leads to body weight loss in children.” For example, when researchers randomized individuals to either increase or decrease their intake of table sugar, the added sugar group gained about three and a half pounds over ten weeks, whereas the reduced sugar group lost about two and a half pounds. A systematic review and meta-analysis of all such ad libitum diet studies—real-life studies where sugar levels were changed but people could otherwise eat whatever they wanted—found that reduced intake of dietary sugars resulted in a decrease in body weight, whereas “increased sugars intake was associated with a comparable weight increase.” The researchers found that, “considering the rapid weight gain that occurs after an increased intake of sugars, it seems reasonable to conclude that advice relating to sugars intake is a relevant component of a strategy to reduce the high risk of overweight and obesity in most countries.” That is, it’s reasonable to advise people to cut down on their sugar consumption. 
     
    Findings from observational studies have been “more ambiguous,” though: an association has been found between obesity and intake of sweetened beverages, but consistent correlations with consumption of sugary foods have not been shown. Most such studies rely on self-reported data, however, and “it is likely that this has introduced bias, especially as underreporting of diet has been found to be more prevalent among obese people and it is sugar-rich foods that are most commonly underreported.” Sugar intake can, however, be measured objectively via trace sucrose levels in the urine, which also excludes contributions from other sweeteners such as high fructose corn syrup. When researchers did this, they discovered that sugar intake is not only associated with greater odds of obesity and greater waist circumference on a snapshot-in-time cross-sectional basis, but that the association also held in a prospective cohort study over time. “Using urinary sucrose as the measure of sucrose intake,” researchers found that “participants in the highest v. the lowest quintile [fifth] for sucrose intake had 54% greater risk of being overweight or obese.” 
     
    “Denying evidence that sugars are harmful to health has always been at the heart of the sugar industry’s defense.” But when the evidence is undeniable, like the link between sugar and cavities, it switches from denial to deflection, like trying to pull attention away from restricting intake to coming up with some kind of “vaccine against tooth decay.” We seem to have reached a similar point with obesity, with the likes of the Sugar Bureau switching from denial to deflection by commissioning research suggesting that obese individuals would not benefit from losing weight, a stance contradicted by hundreds of studies across four continents involving more than ten million participants. 
     
    For more on Big Sugar’s influence, check out Sugar Industry Attempts to Manipulate the Science. 
     
    You may also be interested in some of my other popular videos on sugar. See related videos below.


    Michael Greger M.D. FACLM


  • Eating Fast Is Bad for You—Right?



    For as long as I have been feeding myself—which, for the record, is several decades now—I have been feeding myself fast. I bite big, in rapid succession; my chews are hasty and few. In the time it takes others to get through a third of their meal, mine is already gone. You could reasonably call my approach to eating pneumatic, reminiscent of a suction-feeding fish or a Roomba run amok.

    Where my vacuuming mouth goes, advice to constrain it follows. Internet writers have declared slowness akin to slimness; self-described “foodies” lament that there’s “nothing worse” than watching a guest inhale a painstakingly prepared meal. There are even children’s songs that warn against the perils of eating too fast. My family and friends—most of whom have long since learned to avoid “splitting” entrees with me—often comment on my speed. “Slow down,” one of my aunts fretted at a recent meal. “Don’t you know that eating fast is bad for you?”

    I do, or at least I have heard. Over the decades, a multitude of studies have found that people who eat faster are more likely to consume more calories and carry more weight; they’re also more likely to have high blood pressure and diabetes. “The data are very robust,” says Kathleen Melanson of the University of Rhode Island; the evidence holds up when researchers look across geographies, genders, and age. The findings have even prompted researchers to conduct eating-speed interventions, and design devices—vibrating forks and wearable tech—that they hope will slow diners down.

    But the widespread mantra of go slower probably isn’t as definitive or universal as it at first seems. Fast eaters like me aren’t necessarily doomed to metabolic misfortune; many of us can probably safely and happily keep hoovering our meals. Most studies examining eating speed rely on population-level observations taken at single points in time, rather than extended clinical trials that track people assigned to eat fast or slow; they can speak to associations between pace and certain aspects of health, but not to cause and effect. And not all of them actually agree on whether protracted eating boosts satisfaction or leads people to eat less. Even among experts, “there is no consensus about the benefits of eating slow,” says Tany E. Garcidueñas-Fimbres, a nutrition researcher at Universitat Rovira i Virgili, in Spain, who has studied eating rates.

    The idea that eating too fast could raise certain health risks absolutely does make sense. The key, experts told me, is the potential mismatch between the rate at which we consume nutrients and the rate at which we perceive and process them. Our brain doesn’t register fullness until it’s received a series of cues from the digestive tract: chewing in the mouth, swallowing down the throat; distension in the stomach, transit into the small intestine. Flood the gastrointestinal tract with a ton of food at once, and those signals might struggle to keep pace—making it easier to wolf down more food than the gut is asking for. Fast eating may also inundate the blood with sugar, risking insulin resistance—a common precursor to diabetes, says Michio Shimabukuro, a metabolism researcher at Fukushima Medical University, in Japan.

    The big asterisk here is that a lot of these ideas are still theoretical, says Janine Higgins, a pediatric endocrinologist at the University of Colorado Anschutz Medical Campus, who’s studied eating pace. Research that merely demonstrates an association between fast eating and higher food intake cannot prove which observation led to the other, if there’s a causal link at all. Some other factor—stress, an underlying medical condition, even diet composition—could be driving both. “The good science is just completely lacking,” says Susan Roberts, a nutrition researcher at Tufts University.

    Scientists don’t even have universal definitions of what “slow” or “fast” eating is, or how to measure it. Studies over the years have used total meal time, chew speed, and other metrics—but all have their drawbacks. Articles sometimes point to a cutoff of 20 minutes per meal, claiming that’s how long the body takes to feel full. But Matthew Hayes, a nutritional neuroscientist at the University of Pennsylvania, criticized that as an oversimplification: Satisfaction signals start trickling into the brain almost immediately when we eat, and fullness thresholds vary among people and circumstances. Studies that ask volunteers to rate their own speeds have issues too: People often compare themselves with friends and family, who won’t represent the population at large. Eating rate can also fluctuate over a lifetime or even a day, depending on hunger, stress, time constraints, the pace of present company, even the tempo of background music.

    In an evolutionary sense, all of us humans eat absurdly fast. We eat “orders of magnitude quicker” than our primate relatives, just over one hour a day compared with their almost 12, says Adam van Casteren, a feeding ecologist at the University of Manchester, in England. That’s thanks largely to how we treat our food: Fire, tools such as knives, and, more recently, chemical processing have softened nature’s raw ingredients, liberating us from “the prison of mastication,” as van Casteren puts it. Modern Western diets have taken that pattern to an extreme. They’re chock-full of ultra-processed foods, so soft and sugar- and fat-laden that they can be gulped down with nary a chew—which could be one of the factors that drive faster eating and chronic metabolic ills.

    In plenty of circumstances, slowing down will come with perks, not least because it could curb the risk of choking or excess gas. It could also temper blood-sugar spikes in people with diets heavy in processed foods—which whiz through the digestive tract, Roberts told me, though the healthier move would probably be eating fewer of those foods to begin with. And some studies focused on people with high BMI, including Melanson’s, have shown that eating slower can aid weight loss. But, she cautioned, those results won’t necessarily apply to everyone.

    The main impact of leisurely eating may not even be about chewing rates or bite size per se, but about helping people eat more mindfully. “A lot of us are distracted when we eat,” says Fatima Cody Stanford, an obesity-medicine physician at Massachusetts General Hospital. “And so we are missing our hunger and satiety cues.” In countries such as the United States, people also have to wrestle with the immense pressure “to be done with lunch really fast,” Herman Pontzer, an anthropologist at Duke University, told me. Couple that with the fast foods we tend to reach for, and maybe it’s no shock that people don’t feel satisfied as they scarf down their meals.

    The point here isn’t to demonize slow eating; in the grand scheme of things, it seems a pretty healthful thing to do. At the same time, that doesn’t mean that “eat slow” should be a blanket command. For people already eating a lot of high-fiber foods—which the body naturally processes ploddingly—Roberts doesn’t think sluggish chewing has much to add. The extolling of slow eating is, at best, “a half truth,” Hayes told me, that’s become easy to exploit.

    I do feel self-conscious when I’m the first person at the table to finish by a mile, and I don’t enjoy the stares and the comments about my “big appetite.” Certain super-slow eaters might get teased for making others wait, but they’re generally not getting chastised for ruining their health. When I asked experts if it was harmful to eat too slowly, several of them told me they’d never even considered it—and that the answer was probably no.

    Still, for the most part, I’m happy to be the Usain Bolt of chewing. My hot foods stay hot, and my cold foods stay cold. I’ve intermittently tried slow eating over the years, deploying some of the usual tricks: smaller utensils, tinier bites, crunchier foods. I even, once, tried to count my chews. The biggest difference I felt, though, wasn’t fullness or more satisfaction; I just kind of hated the way that my mushy food lingered in my mouth.

    Maybe if I’d stuck with slow eating, I would have lost some gassiness, choking risk, or weight—but also, I think, some joy. There’s something to speed-eating that can be plain old fun, akin to the rush of zooming down an empty highway in a red sports car. If I have just an hour-ish (or, knowing me, less) of eating each day, I’d prefer to relish every brisk, indecorous bite.


    Katherine J. Wu


  • Moms Eating Ultra-Processed Food Raises Kids’ Obesity Risk



    Oct. 7, 2022: Moms who consume ultra-processed food during pregnancy may contribute to their children being obese or overweight in childhood and adolescence, a new study suggests. 

    Among the 19,958 mother-child pairs studied, 12.4% of the children developed overweight or obesity, and the children of the mothers who ate the most ultra-processed foods (12.1 servings/day) had a 26% higher risk of overweight or obesity, compared with the children of those with the lowest consumption (3.4 servings/day), report Andrew T. Chan, MD, a professor of medicine at Harvard Medical School, and colleagues. 

    The results were published online in the journal BMJ. 

    The study points to the potential benefit of limiting ultra-processed food during the reproductive years to decrease the risk of childhood obesity, the study authors note. Ultra-processed foods, such as packaged baked goods and snacks, fizzy drinks, and sugary cereals, are staples of modern Western diets and have been linked to weight gain in adults.  

    But the relationship between parents eating highly processed meals and their children’s weight is unclear across generations, the researchers note. 

    “Overall awareness of the importance of diet in one’s personal health, as well as in the health of their families, is something that we hope will be a source of change, and certainly does start with promoting and educating people about the importance of diet during those critical periods,” Chan said in an interview.

    He said it is important not to blame mothers for their kids’ health, as there are other things at play beyond just education. “It requires a concerted effort to ensure that we break down the social and economic barriers to access to healthy foods so that it becomes actually feasible for many women to be able to have access to a diet that will promote health for both themselves and their kids.”

    Does Eating Ultra-Processed Food During Pregnancy Make Kids Obese?

    In this study, investigators looked at whether eating ultra-processed food throughout pregnancy and while raising kids increased the likelihood of children and teens being overweight or obese.

    The study team evaluated 14,553 mothers and their 19,958 children using data collected from two large studies. Males comprised 45% of the children in the cohort. The children ranged from 7 to 17 years of age.

    Childhood overweight and obesity have also been linked to maternal consumption of ultra-processed foods during the child-rearing years. 

    “We know that lifestyle during pregnancy is important for not only the health of the baby, but also the health of the mother. So, it does represent an opportunity for people to think critically about what they can do to really optimize their health, and it becomes a period of time where people are maybe thinking a little bit more about their health and are more open to new dietary counseling and also more motivated to effect change,” Chan says.

    It’s important for women to consider their diet, Chan says. Women need to take into account “what kinds of foods they are eating and, if possible, try to avoid ultra-processed foods that have very refined ingredients and a lot of additives and preservatives, because they tend to really have a higher content of those dietary factors that we think lead to overweight and obesity,” he says.

    Physical activity is also important during the reproductive years and pregnancy, and people should aim to sustain physical activity during pregnancy and beyond, Chan notes. 

    The findings may be limited, as they were based on self-reported questionnaires and some mother-children pairs stopped taking part in the study during follow-up. Most of the mothers were from similar personal and family educational backgrounds, had comparable social and economic backgrounds, and were primarily white, which limits how this study can apply to other groups of people, the researchers noted. 

    “Staying healthy isn’t something that you should really start doing in middle age or late adulthood, it is really something that should be promoted at a young age, and certainly during young adulthood, because of the influence that it has on your long-term health, but also the potential influence it might have on your family’s,” Chan says.

