ReportWire

Tag: Regular people

  • California’s exodus isn’t just billionaires — it’s regular people renting U-Hauls, too

    It isn’t just billionaires leaving California.

    Anecdotal data suggest there is also an exodus of regular people who load their belongings into rental trucks and lug them to another state.

    U-Haul’s survey of the more than 2.5 million one-way trips made in its vehicles in the U.S. last year showed that the gap between the number of people leaving and the number arriving was wider in California than in any other state.

    While the Golden State also attracts a large number of newcomers, it has had the biggest net outflow for six years in a row.

    Generally, the defectors don’t go far. The top five destinations for the diaspora using U-Haul’s trucks, trailers and boxes last year were Arizona, Nevada, Oregon, Washington and Texas.

    California experienced a net outflow of U-Haul users: arrivals accounted for 49.4% of its one-way trips and departures for 50.6%. Massachusetts, New York, New Jersey and Illinois also rank among the bottom five on the index.

    U-Haul didn’t speculate on the reasons California continues to top the ranking.

    “We continue to find that life circumstances — marriage, children, a death in the family, college, jobs and other events — dictate the need for most moves,” John Taylor, U-Haul International president, said in a press statement.

    While California’s exodus was greater than that of any other state, the silver lining was that the state lost fewer residents to out-of-state migration in 2025 than in 2024.

    U-Haul said that blue-to-red state migration, a hotly debated phenomenon that became more pronounced after the 2020 pandemic, continues to be a discernible trend.

    Though U-Haul did not specify the reasons for the exodus, California demographers tracking the trend point to the cost of living and housing affordability as the top reasons for leaving.

    “Over the last dozen years or so, on a net basis, the flow out of the state because of housing [affordability] far exceeds other reasons people cite [including] jobs or family,” said Hans Johnson, senior fellow at the Public Policy Institute of California.

    “This net out migration from California is a more than two-decade-long trend. And again, we’re a big state, so the net out numbers are big,” he said.

    U-Haul data showed that there was a pretty even split between arrivals and departures. While the company declined to share absolute numbers, it said that 50.6% of its one-way customers in California were leaving, while 49.4% were arriving.

    U-Haul’s network of 24,000 rental locations across the U.S. provides a near-real-time view of domestic migration dynamics, while official data on population movements often lags.

    California’s population grew by a marginal 0.05% in the year ending July 2025, reaching 39.5 million people, according to the California Department of Finance.

    After two consecutive years of population decline following the 2020 pandemic, California recorded its third year of population growth in 2025. While international migration has rebounded, the number of California residents moving out increased to 216,000, consistent with levels in 2018 and 2019.

    Eric McGhee, senior fellow at the Public Policy Institute of California, who researches the challenges facing California, said there’s growing evidence of political leanings shaping the state’s migration patterns, with those moving out of state more likely to be Republican and those moving in likely to be Democratic.

    “Partisanship probably is not the most significant of these considerations, but it may be just the last straw that broke the camel’s back, on top of the other things that are more traditional drivers of migration … cost of living and family and friends and jobs,” McGhee said.

    Living in California costs 12.6% more than the national average, according to the U.S. Bureau of Economic Analysis. One of the biggest pain points in the state is housing, which is 57.8% more expensive than what the average American pays.

    The U-Haul study across all 50 states found that seven of the top 10 growth states people moved to have Republican governors. Nine of the states with the biggest net outflows had Democratic governors.

    Texas, Florida and North Carolina were the top three growth states for U-Haul customers, with Dallas, Houston and Austin bagging the top spots for growth in metro regions.

    Notable exceptions were San Diego and San Francisco, the only California cities among the top 25 metros with a net inflow of one-way U-Haul customers.

    Nilesh Christopher

  • BMI Won’t Die

    If anything defines America’s current obesity-drug boom, it’s this: Many more people want these injections than can actually get them. The roadblocks include exorbitant costs that can stretch beyond $1,000 a month, limited insurance coverage, and constant supply shortages. But before all of those issues come into play, anyone attempting to get a prescription will inevitably confront the same obstacle: their body mass index, or BMI.

    So much depends on the simple calculation of dividing one’s weight by the square of one’s height. According to the FDA, people qualify for prescriptions of Wegovy and Zepbound—the obesity-drug versions of the diabetes medications Ozempic and Mounjaro—only if their BMI is 30 or higher, or 27 or higher with a weight-related health issue such as hypertension. Many who do get on the medication use BMI to track their progress. That BMI is the single biggest factor determining who gets prescribed these drugs, and who doesn’t, is the result of how deeply entrenched this metric has become in how both doctors and regular people approach health: Low BMI is good and high BMI is bad, or so most of us have come to think.
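    The calculation and the prescribing thresholds described above can be sketched in a few lines. This is an illustrative sketch only (metric units assumed; the function names are made up, not from any real clinical or FDA API):

```python
def bmi(weight_kg: float, height_m: float) -> float:
    """Body mass index: weight divided by the square of height."""
    return weight_kg / height_m ** 2


def qualifies_for_obesity_drug(bmi_value: float,
                               has_weight_related_condition: bool = False) -> bool:
    """Eligibility rule as the article describes it:
    BMI of 30 or higher, or 27 or higher with a weight-related
    health issue such as hypertension."""
    if bmi_value >= 30:
        return True
    return bmi_value >= 27 and has_weight_related_condition


# Example: 85 kg at 1.75 m gives a BMI of about 27.8 -- below the
# standalone cutoff, but eligible alongside a condition like hypertension.
print(round(bmi(85, 1.75), 1))
print(qualifies_for_obesity_drug(bmi(85, 1.75)))
print(qualifies_for_obesity_drug(bmi(85, 1.75), has_weight_related_condition=True))
```

    The same two numbers, weight and height, drive both the eligibility decision and the progress tracking the article mentions, which is exactly the over-reliance its critics object to.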

    This roughly 200-year-old metric has never been more relevant—or maligned—than it is in the obesity-drug era. BMI has become like the decrepit car you keep driving because it still sort of works and is too much of a hassle to replace. Its numerous shortcomings have been called out for many years now: For starters, it accounts for only height and weight, not other, more pertinent measures such as body-fat percentage. In June, the American Medical Association formally recognized that BMI should not be used alone as a health measure. Last year, some doctors called for BMI to be retired altogether, echoing previous assertions.

    The thing is, BMI can be an insightful health metric, but only when used judiciously with other factors. The problem is that it often hasn’t been. Just as obesity drugs are taking off, however, professional views are changing. People are so accustomed to seeing BMI as the “be-all, end-all” of health indicators, Kate Bauer, a nutritional-sciences professor at the University of Michigan, told me. “But that’s increasingly not the way it’s being used in clinical practice.” A shift in the medical field is a good start, but the bigger challenge will be getting everyone else to catch up.

    BMI got its start in the 1830s, when a Belgian astronomer named Adolphe Quetelet attempted to determine the properties of the “average” man. Using data on primarily white people, he observed that weight tended to vary as the square of height—a calculation that came to be known as Quetelet’s index.

    Over the next 150 years, what began as a descriptive tool transformed into a prescriptive one. Quetelet’s index (and other metrics like it) informed height-weight tables used by life-insurance companies to estimate risk. These sorts of tables formed “recommendations for the general population going from ‘average’ to ‘ideal’ weights,” the epidemiologist Katherine Flegal wrote in her history of BMI; eventually, nonideal weights were classified as “overweight” and “obese.” In 1972, the American physiologist Ancel Keys proposed using Quetelet’s index—which he renamed BMI—to roughly measure obesity. We’ve been stuck with BMI ever since. The metric became embedded not only in research and doctor’s visits but also in the very definitions of obesity. According to the World Health Organization, a BMI of at least 25 and less than 30 is considered overweight; 30 or above is obese.

    But using BMI to categorize a person’s health was controversial from the start. Even Keys called it “scientifically indefensible” to use BMI to judge someone as overweight. BMI doesn’t account for where fat is distributed on the body; fat that builds up around organs and tissues, called visceral fat, is linked to serious medical issues, while fat under the skin—the kind you can pinch—is usually less of a problem. Muscularity is also overlooked: LeBron James, for example, would be considered overweight. Both fat distribution and muscularity can vary widely across sex, age, and ethnicity. People with high BMIs can be perfectly healthy, and “there are people with normal BMIs that are actually sick because they have too much body fat,” Angela Fitch, an assistant professor at Harvard Medical School and the president of the Obesity Medicine Association, told me.

    For all its flaws, BMI is actually useful at the population level, Fitch said, and doctors can measure it quickly and cheaply. But BMI becomes troubling when it is all that doctors see. In some cases, the moment when a patient’s BMI is calculated by their doctor may shape the rest of the appointment and relationship going forward. “The default is to hyper-focus on the weight number, and I just don’t think that that’s helpful,” Tracy Richmond, a pediatrics professor at Harvard Medical School, told me. Anti-obesity bias is well documented among physicians—even some obesity specialists—and can lead them to dismiss the legitimate medical needs of people with a high BMI. In one tragic example, a patient died from cancer that went undiagnosed because her doctors attributed her health issues to her high BMI.

    But after many decades, the medical community has begun to use BMI in a different way. “More and more clinicians are realizing that there are people who can be quite healthy with a high BMI,” Kate Bauer said. The shift has been gradual, though it was given a boost by the AMA policy update earlier this year: “Hopefully that will help clinicians make a change to supplement BMI with other measures,” Aayush Visaria, an internal-medicine resident at Rutgers Robert Wood Johnson Medical School who researches BMI’s shortcomings, told me.

    Physicians I spoke with acknowledged BMI’s flaws but didn’t seem too concerned about its continued use in medicine—even as obesity drugs make this metric even more consequential. BMI isn’t a problem, they said, as long as physicians consider other factors when diagnosing obesity or prescribing drugs to treat it. If you go to a doctor with the intention of getting on an obesity drug, you should be subject to a comprehensive evaluation including metrics such as blood sugar, cholesterol levels, and body composition that go “way beyond BMI,” Katherine Saunders, a clinical-medicine professor at Weill Cornell Medicine, said. Because Wegovy and other drugs come with side effects, she told me, doctors must be absolutely sure that a patient actually needs them.

    But BMI isn’t like most other health metrics. Because of its simplicity, it has seeped out of doctor’s offices and into the mainstream, where this more nuanced view still isn’t common. Whether we realize it or not, BMI is central to our basic idea of health, affecting nearly every aspect of daily life. Insurance companies are notorious for charging higher rates to people with high BMI and lowering premiums for people who commit to long-term weight loss. Fertility treatments and orthopedic and gender-affirming surgery can be withheld from patients until they hit BMI targets. Workplace wellness programs based on BMI are designed to help employees manage their weight. BMI has even been used to prevent prospective parents from adopting a child.

    The rise of obesity drugs may make these uses of BMI even harder to shake. Determining drug eligibility by high BMI supports the notion that a number is synonymous with illness. Certainly many people using obesity drugs take a holistic view of their health, as doctors are learning to do. But focusing on BMI is still common. Some members of the r/Ozempic subreddit, for example, share their BMI to show their progress on the drug. Again, high BMI can be used to predict who has obesity, but it isn’t itself an obesity diagnosis. The problem with BMI’s continued dominance is that it makes it even harder to move away from simply associating a number on a scale with overall health, with all the downstream consequences that come along with a weight-obsessed culture. As obesity drugs are becoming mainstream, “there needs to be public education explaining that BMI by itself may not be a good indicator of health,” Visaria said.

    In another 200 years, surely BMI will finally be supplanted by something else. If not much sooner: A large effort to establish hard biological criteria for obesity is under way; the goal is to eliminate BMI-based definitions once and for all. Caroline Apovian, a professor at Harvard Medical School, gives it “at least 10 years” before a comparably cheap or convenient replacement arises—though any changes would take longer to filter into public consciousness. Until that happens, we’re stuck with BMI, and the mess it has wrought.

    Yasmin Tayag

  • No One Has to Pretend Water Is Exciting

    Over the past few decades, what Americans want out of their beverages has swung wildly between two extremes. In the 1990s, sweet drinks were all the rage. Soda sales were on what seemed like a limitless upward trajectory. Quaker bought the then-ascendant Snapple brand for $1.7 billion in cash, a sum that made me actually snort when I read it in the harsh light of 2023. Gimmicky drinks such as Surge, Orbitz, and SoBe “elixirs” crowded grocery-store shelves. As a middle schooler in the late ’90s, my consumption patterns were practically a case study in the era’s marketing magic. I’m not sure a single drop of plain water ever touched my lips outside of soccer practice.

    Toward the end of that decade, the first evidence of the coming reversal was already visible. Skepticism (most of it warranted, though some of it not) toward sugar and artificial sweeteners steadily grew. The soda giants, reading the room, began marketing their own bottled-water brands to compete with the more fashionable likes of Evian and Perrier. Dasani and Aquafina came right on time: As soda sales faltered, bottled-water sales took off. In 2016, the Beverage Marketing Corporation estimated that, for the first time, Americans consumed more bottled water—almost 40 gallons per capita on average—than carbonated beverages. Tap water, too, has found a new home in an ever-increasing number of reusable water bottles. New Stanley cup, anyone?

    Americans, in short, got sold on hydration. As my colleague Katherine J. Wu recently wrote, how much water any particular person needs to drink to maintain a healthy baseline is still the subject of significant disagreement among experts. But in the absence of clear guidance—and with plenty of encouragement from the health-and-wellness industry—many people seem to have simply decided that more is more, and they shoot for as much as a gallon a day.

    That isn’t to say that everyone likes drinking all this water. As the nation has disciplined itself to refill its glasses, Americans have been forced to confront the inconvenient reality that drinking plain water day in and day out can be kind of a chore. To choke it all down, they’ve returned to powders and concentrated syrups designed to make water more palatable, more healthful, or both. Sweet drinks are back again, albeit in a different form. Enter the water enhancer.

    Today, products meant to gussy up your water are everywhere at grocery and convenience stores. They come in little brightly colored squeeze bottles or single-serving packets. The sales pitch is pretty simple: Throw one in your purse or laptop bag, and instead of buying a packaged beverage, you squirt a couple of drops of syrup or mix a tablespoon or two of powder into regular water. Voilà. Now water is better. This is not exactly a new type of product: Powdered mixes from Crystal Light and Gatorade were around in the 1980s. But unlike the water enhancers of yore, today’s mixes are mostly portioned for single servings instead of big batches.

    According to Phil Lempert, a grocery-industry expert and the founder of the website Supermarket Guru, water enhancers split into roughly two categories: low-calorie flavorings, such as Kraft Heinz’s highly concentrated MiO drops, and hydrating sports (or hangover) drinks, such as powdered electrolyte packets from Liquid I.V. Both MiO and Liquid I.V. debuted in the early 2010s. Within a few years, competitors including LMNT, Cure, and Buoy entered the market, along with the new entrants from old brands, Crystal Light and Gatorade among them. Most of these brands boast about their products’ low sugar content; even some of the enhancers flavored to taste like Skittles, Starburst, or other candy rely on artificial or alternative sweeteners and have few calories. Other ingredients have been incorporated into new products: Companies such as Cure now make caffeinated concentrates. Liquid I.V. has a powder that includes melatonin for sleep. Lots of other products now contain additional vitamins, minerals, or electrolytes. Many water enhancers have become, in essence, drinkable supplements.

    Water enhancers’ rise can easily be charted in sales numbers. Darren Seifer, the food-and-beverage-industry analyst at the consumer-data firm Circana, told me that although the products are still a small part of the overall beverage market, they’ve seen consistent growth. In 2022, sales volume of sports-drinks mixes—the category in which the firm places most water enhancers—was up 15 percent over the previous year. According to Seifer, the growth has been much larger for some brands. A spokesperson for Liquid I.V., which was bought by Unilever in 2020, told me that the brand’s sales have nearly doubled in each of the past four years.

    Like so many cultural phenomena, water enhancers also have become the subject of a viral trend. WaterTok, a subset of TikTok where users mix and match different powders and syrups into recipes inside giant insulated water bottles, flooded the internet earlier this year with tips on how to make tap water taste like, among other things, birthday cake. (Like most TikTok trends, it’s a little extreme, and it doesn’t seem to be especially indicative of how regular people end up using the products. TikTok Franken-water sounds sort of terrifying, and some health experts have expressed concern over its potential misuse as a weight-loss aid.)

    The whole concept of water enhancement can be pretty easy to mock: Why, exactly, can some people not find it within themselves to drink regular water? Why do they need it to taste like Skittles? Why do some people think a random wellness company might actually be able to improve on water, of all things? Once you’ve got the water in your glass, just stop there! Drink that! And yes, drinking Jolly Rancher aspartame water does strike me as more ludicrous than just having a Diet Coke. But if you let go of your immediate revulsion at the occasional licensed candy branding and consider water enhancers as a concept on their merits, you’ll find that even the worst of the bunch isn’t functionally much different from a sugar-free sports drink or low-calorie lemonade. In most cases, they’re arguably better if your goal is to stay hydrated, have a little treat, and have some say in how much sugar or sweetener you consume in the process.

    There’s little reason to believe that the people who use water enhancers are doing so at the expense of the plain water that they’d be drinking otherwise. Americans’ consumption of plain water remains, by all indications, robust. It’s mostly sales of soda and juice that are generally sluggish, which at least hints that, for a lot of the people who like those types of drinks, the trade-off that’s actually being made is between water enhancers and some kind of heavily sweetened beverage. In a lot of cases, that trade-off seems positive, on balance, especially because the enhancers allow people to control how much sweetness actually goes into a drink. This does not guarantee that people consume lower concentrations of flavorings, but it at least allows them to do so if they want.

    To fully understand why people are suddenly so enthusiastic about water enhancers, you also have to look outside of the beverage market and to the kinds of vessels that are so often used to consume them: reusable water bottles and high-capacity insulated cups. According to Circana’s data, the Hydro Flasks and Yetis and Stanleys of the world are still selling like hotcakes, and they present a significant shift in the physical reality of how a lot of Americans get their daily fluids—and, potentially, how much of those fluids they intend to be drinking. If you’ve already got 30 or 40 ounces of water on your desk at work, buying a Gatorade or coconut water or other premixed beverage to lug around with it makes less sense than it otherwise would, and having a couple of packets of sweetened electrolyte powder in your laptop bag is comparatively easy.

    At the core of all of this is a fundamental anxiety. Americans want to do what they can for their health, but for so many people, the most meaningful changes—easier, more affordable access to nutritious foods; taking time for exercise; less stress—are difficult to achieve or outside of their control. Swapping out sugary drinks for plausibly healthier options might not be life-altering, but at least it feels like something. “It’s a low-hanging fruit, in terms of healthy behaviors,” Caleb Bryant, a food-and-beverage analyst at the consumer-data firm Mintel, told me. The same anxiety exists for people who buy bottled water regularly, which Circana’s Seifer points out is still a huge group whose numbers have not yet shown any decline. If you’re selling water enhancers, you don’t need to convert bottled-water drinkers away from a product they already like, as you would with a bottled drink—you just have to convince them that they might occasionally like adding something to it.

    The enhancers have their limits. The freedom they confer can easily mislead consumers about how much better self-mixed drinks actually are: The experts I spoke with all agreed that at least some people seem to assume that no matter how much or what kind of water enhancer they use, their beverage will end up inherently healthier than something prepackaged, just because they get to see the water first before they add anything to it. In that way, the brands behind water enhancers are still very much profiting off of the confusing hydration hype that’s been separating people from their money in dubiously healthful ways for years.

    On balance, though, water enhancers do seem to offer something desirable to people who want their water to be a little bit more palatable and the companies who want to sell to them. They are, on some level, a rare win-win: Water enhancers’ smaller, lighter proportions have significant upsides for the companies marketing them, according to Supermarket Guru’s Lempert. The beverage business as a whole is already a more profitable, less cost-intensive category in which to operate than many other sectors of the grocery industry, he told me, which likely helps account for all the upstarts flocking to the water-enhancer category—they’re inexpensive to produce and don’t spoil quickly. When you take away the necessity of buying plastic bottles and packing, shipping, and stocking heavy liquids, the beverage math gets even better. Consumers find some advantages in those differences too: They create less plastic waste (as long as you’re not always buying bottled water to use with them), take up less room in the pantry, and are sometimes less expensive per serving than a bottled alternative.

    Ultimately, the biggest driver behind water enhancers’ popularity is probably just the nature of water itself. It’s great, but drinking a ton of it every day can become drudgery. These additive products play to a tendency to tinker with water in pursuit of health, stimulation, or pleasure that humans have had for thousands of years. Teas, coffee, beer, wine, and sweetened, fruity drinks such as aguas frescas were all developed because, on some level, water—humble and utilitarian as it is—just wasn’t satisfying all of the needs and desires that our forebears had. Now that lots of people believe they need to be downing liters of water every day for their health, they’ve rediscovered an age-old problem. Yes, water is great. But maybe it could be better, or at least more fun?

    You do need to drink water; any downsides of erring on the side of overhydration don’t really kick in until the volume gets extreme. But forgoing a little fun or flavor in pursuit of perfect physical health is something that humans have never been particularly good at doing. One medieval religious text even cited drinking nothing but plain water as a just punishment for swearing against God. With that in mind, it might have been foolish to expect that in the 21st century, with so many alternatives available, copious amounts of plain water would be the widespread drink of choice for long.

    Amanda Mull

  • Being Alive Is Bad for Your Health

    In 2016, I gave up Diet Coke. This was no small adjustment. I was born and raised in suburban Atlanta, home to the Coca-Cola Company’s global headquarters, and I had never lived in a home without Diet Coke stocked in the refrigerator at all times. Every morning in high school, I’d slam one with breakfast, and then I’d make sure to shove some quarters (a simpler time) in my back pocket to use in the school’s vending machines. When I moved into my freshman college dorm, the first thing I did was stock my mini fridge with cans. A few years later, my then-boyfriend swathed two 12-packs in wrapping paper and put them under his Christmas tree. It was a joke, but it wasn’t.

    You’d think quitting would have been agonizing. To my surprise, it was easy. For years, I’d heard anecdotes about people who forsook diet drinks and felt their health improve seemingly overnight—better sleep, better skin, better energy. I’d also heard whispers about the larger suspected dangers of fake sweeteners. Yet I’d loved my DCs too much to be swayed. Then I tried my first can of unsweetened seltzer at a friend’s apartment. After years of turning my nose up at the thought of LaCroix, I realized that much of what I enjoyed about Diet Coke was its frigidity and fizz. That was enough. I switched to seltzer on the spot, prepared to join the smug converted and receive whatever health benefits were sure to accrue to me for my good behavior.

    Except they never came. Seven years later, I feel no better than I ever did drinking four or five cans of the stuff a day. I still stick to seltzer anyway—because, you know, who knows?—and I’ve mostly forgotten that Diet Coke exists. But the diet sodas had not, as it turns out, been preventing me from getting great sleep or calming my rosacea or feeling, I don’t know, zesty. Besides the caffeine, they appeared to make no difference in how good or bad I felt at all.

    Yesterday, Reuters reported that the WHO’s International Agency for Research on Cancer will soon declare aspartame, the sweetener used in Diet Coke and many other no-calorie sodas, as “possibly carcinogenic to humans.” I probably should have felt vindicated. I may not feel better now, but many years down the road (knock on wood), I’ll be better off. I’d bet on the right horse! Instead, I felt nothing so much as irritation. Over the past few decades, a growing number of foods and behaviors have become the regular subject of vague, ever-changing health warnings—fake sweeteners, real sugar, wine, butter, milk (dairy and non), carbohydrates, coffee, fat, chocolate, eggs, meat, veganism, vegetarianism, weightlifting, drinking a lot of water, and scores of others. The more warnings there are, the less actionable any particular one of them feels. What, exactly, is anyone supposed to do with any of this information, except feel bad about the things they enjoy?

    It’s worth reviewing what is actually known or suspected about diet sodas and health. The lion’s share of research on this topic happens in what are known as observational studies—scientists track consumption and record health outcomes, looking for commonalities and trends linking behavior and effects. These studies can’t tell you if the behavior caused the outcome, but they can establish an association that’s worth investigating further. Regular, sustained diet-soda consumption has been linked to weight gain, Type 2 diabetes, and increased risk of stroke, among other things—understandably troublesome correlations for people worried about their health. But there’s a huge complicating factor in understanding what that means: For decades, advertisements recommended that people who were already worried about—or already had—some of those same health concerns substitute diet drinks for those with real sugar, and many such people still make those substitutions in order to adhere to low-carb diets or even out their blood sugar. As a result, little evidence suggests that diet soda is solely responsible for any of those issues—health is a highly complicated, multifactorial phenomenon in almost every aspect—but many experts still recommend limiting your consumption of diet soda as a reasonable precaution.

    A representative for the IARC would neither confirm nor deny the nature of the WHO’s pending announcement on aspartame, which will be released on July 14. For the sake of argument, let’s assume that Reuters’s reporting is correct: In two weeks, the organization will update the sweetener’s designation to indicate that it’s “possibly carcinogenic.” To regular people, those words—especially in the context of a health organization’s public bulletins—would seem to imply significant suspicion of real danger. The evidence may not yet all be in place, but surely there’s enough reason to believe that the threat is real, that there’s cause to spook the general public.

    Except, as my colleague Ed Yong wrote in 2015, when the IARC made a similar announcement about the carcinogenic potential of meat, that’s not what the classification means at all. The IARC chops risk up into four categories: carcinogenic (Group 1), probably carcinogenic (Group 2A), possibly carcinogenic (Group 2B), and unclassified (Group 3). Those categories do one very specific thing: They describe how definitive the agency believes the evidence is for any level of increased risk, even a very tiny one. The category in which aspartame may soon find itself, 2B, makes no grand claims about carcinogenicity. “In practice, 2B becomes a giant dumping ground for all the risk factors that IARC has considered, and could neither confirm nor fully discount as carcinogens. Which is to say: most things,” Yong wrote. “It’s a bloated category, essentially one big epidemiological shruggie.”

    The categories are not at all intended to communicate the degree of the risk involved—just how sure or unsure the organization is that there’s a risk associated with a thing or substance at all. And association can mean a lot of things. Hypothetically, regular consumption of food that may quadruple your risk of a highly deadly cancer would fall in the same category as something that may increase your risk of a cancer with a 95 percent survival rate by just a few percentage points, as long as the IARC felt similarly confident in the evidence for both of those effects.
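    The distinction the two paragraphs above draw is easy to lose, so here is a toy sketch of it: the group labels are the real IARC ones named in the article, but the example exposures and their risk figures are entirely hypothetical, invented only to show that the group encodes confidence in the evidence, not the size of the risk:

```python
# IARC group labels as described in the article: they grade how definitive
# the evidence for *any* increased risk is, not how large that risk is.
IARC_GROUPS = {
    "carcinogenic": "Group 1",
    "probably carcinogenic": "Group 2A",
    "possibly carcinogenic": "Group 2B",
    "unclassified": "Group 3",
}


def iarc_group(evidence_strength: str) -> str:
    """Map evidence strength (not risk magnitude) to an IARC group label."""
    return IARC_GROUPS[evidence_strength]


# Two hypothetical exposures with very different risk magnitudes land in the
# same group, because the evidence for each is equally tentative.
quadruples_deadly_cancer_risk = iarc_group("possibly carcinogenic")
barely_nudges_survivable_cancer_risk = iarc_group("possibly carcinogenic")
print(quadruples_deadly_cancer_risk == barely_nudges_survivable_cancer_risk)
```

    Reading "Group 2B" as "how dangerous" rather than "how sure" is precisely the misinterpretation the article is warning about.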

    These designations about carcinogenicity are just one example of how health information can reach the general public in ways that are functionally useless, even if well intentioned. Earlier this year, the WHO advised against all use of artificial sweeteners. At first, that might sound dire. But the actual substance of the warning was about the limited evidence that those sweeteners aid in weight loss, not any new evidence about their unique ability to harm your health in some way. (The warning did nod to the links between long-term use of artificial sweeteners and increased risks of cardiovascular disease, Type 2 diabetes, and premature death, but as the WHO noted at the time, these are understood as murky correlations, not part of an alarming breakthrough discovery.)

    The same release quotes the WHO’s director for nutrition and food safety advising that, for long-term weight control, people need to find ways beyond artificial sweeteners to reduce their consumption of real sugar—in essence, it’s not a health alert about any particular chemical, but about dessert as a concept. How much of any sweetener would you need to cut out of your diet in order to limit any risks it may pose? The release, on its own, doesn’t specify. Consider a birthday crudités platter instead of a cake, just to be sure. (Is that celery non-GMO? Organic? Just checking.)

    The media, surely, deserve their fair share of blame for how quickly and how far these oversimplified ideas spread. Many people are very worried about the food they eat—perhaps because they have received so many conflicting indicators over the years about how that food affects their bodies—and flock to news that something has been deemed beneficial or dangerous. At best, the research that many such stories cite is rarely definitive, and at worst, it’s so poorly designed or otherwise flawed that it’s flatly incapable of producing useful information.

    Taken in aggregate, this morass of poor communication and confusing information has the very real potential to exhaust people’s ability to identify and respond to actual risk, or to confuse them into nihilism. The solution-free finger-wagging, so often about the exact things that many people experience as the little joys in everyday life, doesn’t help. When everything is an ambiguously urgent health risk, it very quickly begins to feel like nothing is. I still drink a few Diet Cokes a year, and I maintain that there’s no better beverage to pair with pizza. We’re all going to die someday.


    Amanda Mull


  • A Slice of ‘Bacon’ Made Me Believe in Fake Meat


    Last month, at a dining table in a sunny New York City hotel suite, I found myself caught completely off guard by a strip of fake bacon. I was there to taste a new kind of plant-based meat, which, like most Americans, I’ve tried before but never truly craved in the way that I’ve craved real meat. But even before I tried the bacon, or even saw it, I could tell it was different. The aroma of salt, smoke, and sizzling fat rising from the nearby kitchen seemed unmistakably real. The crispy bacon strips looked the part too—tiger-striped with golden fat and presented on a miniature BLT. Then crunch gave way to satisfying chew, followed by a burst of hickory and the incomparable juiciness of animal fat.

    I knew it wasn’t real bacon, but for a moment, it fooled me. The bacon was indeed made from plants, just like the burger patties you can buy from companies such as Impossible Foods and Beyond Meat. But it had been mixed with real pork fat. Well, kind of. What marbled the meat had come not from a butchered pig but from a living hog whose fat cells had been sampled and grown in a vat.

    This lab-grown fat, or “cultivated fat,” was made by Mission Barns, a San Francisco start-up, with one purpose: to win people over to plant-based meat. And a lot of people need to be won over, it seems. The plant-based-meat industry, which a few years ago seemed destined for mainstream success, is now struggling. Once the novelty of seeing plant protein “bleed” wore off, the high price, middling nutrition, and just-okay flavor of plant-based meat have become harder for consumers to overlook, food analysts told me. In 2021 and 2022, many of the fast-food chains that had once given plant-based meat a national platform—Burger King, Dunkin’, McDonald’s—lost interest in selling it. In the past four months, the two most visible plant-based-meat companies, Beyond Meat and Impossible Foods, have each announced layoffs.

    Meanwhile, the future of meat alternatives—lab-grown meat that is molecularly identical to the real deal—is at least several years away, lodged between science fiction and reality. But we can’t wait until then to eat less meat; it’s one of the best things that regular people can do for the climate, and it also helps address concerns about animal suffering and health. Lab-grown fat might be the bridge. It is created using the same approach as lab-grown meat, but it’s far simpler to make and can be mixed into existing plant-based foods, Elysabeth Alfano, the CEO of the investment firm VegTech Invest, told me. As such, it’s likely to become commercially available far sooner—maybe even within the next few years. Maybe all it will take to save fake meat is a little animal fat.


    Animal fat is culinary magic. It creates the juiciness of a burger, and leaves a buttery coat on the tongue. Its absence is the reason that chicken breasts taste so bland. Fat, the chef Samin Nosrat wrote in Salt, Fat, Acid, Heat, is “a source of both rich flavor and of a particular desired texture.” The fake meat on the market now is definitely lacking in the flavor and texture departments. Most products approximate meatiness using a concoction of plant oils, flavorings, binders, and salt, which is certainly meatier than the bean burgers that came before it, but is far from perfect: The food blog Serious Eats, for instance, has pointed out off-putting flavor notes, at least prior to cooking, including coconut and cat food. On a molecular level, plant fat is ill-equipped to mimic its animal counterpart. Coconut oil, common in plant-based meat, is solid at room temperature but melts under relatively low heat, so it spills out into the pan while cooking. As a result, the mouthfeel of plant-based meat tends to be more greasy than sumptuous.

    Replacing those plant oils with cultivated animal fat, which keeps its structure when heated, would maintain the flavor and juiciness people expect of real meat, Audrey Gyr, the start-up innovation specialist at the Good Food Institute, a nonprofit that advocates for plant-based substitutes, told me. In a sense, the technique of using animal fat to flavor plants is hardly new. Chicken schmaltz has long lent rich nuttiness to potato latkes; rendered guanciale is what gives a classic amatriciana its succulence. Plant-based bacon enhanced with pork fat follows from the same culinary tradition, but it’s very high-tech. Fat cells sampled from a live animal are grown in huge bioreactors and fed with plant-derived sugars, proteins, and other growth components. In time, they multiply to form a mass of fat cells: a soft, pale solid with robust flavor, the same white substance you might see encircling a pork chop or marbling a steak.

    Out of the bioreactor, the fat “looks a little bit like margarine,” Ed Steele, a co-founder of the London-based cultivated-fat company Hoxton Farms, told me. It is a complicated process, but far easier than engineering cultivated meat, which involves many cell types that must be coaxed into rigid muscle fibers. Fat involves one type of cell and is most useful as a formless blob. Just as in the human body, all it takes is time, space, and a steady drip of sugars, oils, and other fats, Eitan Fischer, CEO of Mission Barns, told me. The bacon I’d tried at the tasting had been constructed by layering cultivated fat with plant-based protein, curing and smoking the loaf, then slicing it into bacon-like strips. Mixing just 10 percent cultivated fat with plant-based protein by mass, Steele said, can make a product taste and feel like the real thing.

    Already, cultivated-fat products are within sight. Mission Barns plans to incorporate its cultivated fat into its own plant-based products; Hoxton Farms hopes to sell its fat directly to existing plant-based-meat manufacturers. Other companies, such as the Belgian start-up Peace of Meat, the Berlin-based Cultimate Foods, and Singapore’s fish-focused ImpacFat, are also working on their own versions of cultivated fat. In theory, the fat can be mixed into virtually any type of plant-based meat—nuggets, sausages, pâté. In the U.S., a path to market is already being cleared. Last November, cultivated chicken from the California start-up Upside Foods received FDA clearance; now it’s waiting on additional clearance from the Department of Agriculture. Pending its own regulatory approvals, Mission Barns says it is ready to launch its products, including a convincingly porky plant-based meatball I also tried at the tasting, in a few supermarkets and restaurants. (Due to the pending approval, I had to sign a liability waiver before digging in.)


    I left the tasting with animal fat on my lips and a new conviction in my mind: At the right price, I’d buy this bacon over the regular stuff. Because cultivated fat can be made without harming animals—the fat cells in the bacon I tasted came from a happily free-ranging pig named Dawn, a PR rep for Mission Barns told me—it may appeal to flexitarians like myself who just want to eat less meat.

    Although there’s no guarantee it would taste as good at home as it did when prepared by Mission Barns’s private chef, with its realistic texture and flavor, cultivated fat could solve the main issue plaguing plant-based meat: It just doesn’t taste that good. Cultivated fat is “the next step in making environmentally friendly foods more palatable to the average consumer,” Jennifer Bartashus, a packaged-food analyst at Bloomberg Intelligence, told me.

    But cultivated fat still faces some of the same problems that have turned America off plant-based meat. The current products for sale are not particularly healthy, and cultivated fat would not change that fact. Building consumer trust and familiarity may also be an issue. Some people are leery of plant-based products because they’re confused about what they’re made of. The more complex notion of cultivated fat may be just as unappetizing, if not more so. “We still don’t know exactly how consumers are going to feel about cultivated fat,” Gyr said. Certainly, finding a catchy name for these products would help, but I have struggled to find a term less clunky than “plant-based meat flavored with cultivated animal fat” to describe what I ate. Unless cultivated-fat companies really nail their marketing, they could go the way of “blended meat”—mixtures of plant-based protein and real meat introduced by three meat companies in 2019, which was “a bit of a marketing failure,” Gyr said.

    Above all, though, there is the question of price relative to traditional meat. Plant-based meat’s higher cost has partly been blamed for the industry’s slump, and products containing cultivated fat, in all likelihood, will not be cheaper in the near future. Neither founder I spoke with shared specific numbers; Fischer, of Mission Barns, said only that the company’s small production scale makes it “fairly expensive” compared with traditional meat products, while Steele said his hope is that companies using Hoxton Farms’ cultivated fat in their plant-based-meat recipes won’t have to spend more than they do now.

    Despite these obstacles, cultivated fat is promising for the flagging plant-based-meat industry because it is absolutely delicious. Cultivated fat could “lead to a new round of innovation that will pull consumers back in,” Bartashus said. After all, plant-based and real meat could reach cost parity around 2026, at which point even more companies might want to get in on meat alternatives. Cultivated fat might warm us up to the future of fully cultivated meat. With enough time, lab-grown chicken breasts could become as boring as regular chicken breasts.

    Enthusiasm about cultivated fat, and fake meat in general, has a distinctly techno-optimist flavor, as if persuading all meat eaters to embrace plants gussied up in bacon grease will be easy. “Eventually our goal is to outcompete current conventional meat prices, whether it’s meatballs or bacon,” Fischer said. But even as the problems with eating meat have only become clearer, meat consumption in the U.S. has continued to rise. Globally, meat consumption in countries such as India and China is expected to skyrocket in the coming years. At the very least, cultivated fat provides consumers with another option at a time when eating a steak for one meal and then opting for plant-based meat the next can count as a win.

    Since the tasting, I’ve often thought about why eating the bacon left me feeling so perplexed. When I gnawed on the crispy golden edge of one of the strips, I knew I was eating real bacon fat, but my brain still wrestled with the idea that it had not come directly from a piece of pork. I’ve only ever known a world where animal fat comes from slaughtered animals. That is changing. If cultivated fat can tide the plant-based-meat industry over until lab-grown meat becomes a reality, these new products will have done their part. In the meantime, we may come to find that they’re already good enough.


    Yasmin Tayag


  • Foreign Candy Puts American Candy to Shame



    At Sunrise Mart, a small Japanese grocery with a branch in Brooklyn’s Sunset Park, you can’t miss the mountain of KitKats. The shop sells all kinds of fresh foods and imported snacks, but as soon as you step inside, you’re toe-to-toe with an enormous heap of colorful bags of the chocolate bars, rising up from the floor in the store’s most prominent real estate. The bags offer flavors such as lychee, chocolate orange, and cheesecake. At $10 each, they’re a little expensive. That doesn’t seem to matter. When I visited the store this spring in search of soup ingredients, multiple shoppers buzzed around me on an otherwise slow weekday afternoon, snapping up bag after bag.

    I’d never had anything but a standard American KitKat before, but I’d heard so many people rave about the Japanese versions that stumbling on the opportunity to try them myself seemed like money I couldn’t afford not to spend. I stuffed two bags of the pistachio and matcha flavors in my tote and headed for the subway, feeling like I’d just unearthed some kind of treasure. When I got home, I pulled out both, plus a few other packages of impulse-purchased Asian candy that I’d scooped up (you know, while I was there), and staged my own little taste test on my kitchen counter. Their flavors and textures differed from the candy I’d been eating for my entire life. They were all great. Matcha won.

    Without realizing it, I’d repeated a ritual that’s become pretty common, both online and in real life. YouTube and TikTok videos of Americans taste-testing candies from Europe, Asia, and Latin America rack up millions of views. At Economy Candy, a Manhattan confectioner that stocks a huge variety of sweets, new customers come in every day, brandishing their phones, fiending to try candies from far-flung locales that they heard about on the internet or that their roommate tried on vacation. Skye Greenfield Cohen, who runs the store with her husband, told me that as recently as five years ago, Economy Candy had only a few racks of imports. “That meant halvah from the Middle East, Turkish delight, those kinds of grandma-like candy that were more nostalgic for a homeland, rather than fun,” she said. Now imports from around the world make up about a third of the store’s inventory.

    On one level, it’s not difficult to understand why any type of candy, foreign or domestic, becomes popular. Candy is engineered to entice and delight, and it’s mostly pretty cheap. But American shoppers don’t exactly lack domestic candy options; any average grocery store’s checkout line is bursting with Snickers, Twizzlers, M&Ms, and Skittles. The hunger for foreign treats can’t be entirely explained by the vagaries of social-media virality, either. According to one estimate, since 2009, the annual value of America’s non-chocolate candy imports has grown by hundreds of millions of dollars; in 2019, it crossed the $2 billion threshold for the first time. Some logistical and cultural factors help explain the United States’ imported-candy boom. But first and foremost, Americans seem to love foreign sweets because they’re having the same revelation I had in my kitchen with my green KitKats: The international stuff puts most domestic candy to shame.


    In the early 2010s, executives at the American division of the Japanese confectioner Morinaga & Company noticed something strange happening in Utah. The company’s Hi-Chew brand of fruit-flavored candies, which was then difficult to find in much of the United States, was selling extraordinarily well in Salt Lake City. The success was welcome—Morinaga wanted to expand its market in the U.S.—but it didn’t immediately make sense. At the time, the majority of the company’s American sales came from West Coast cities with large Asian populations, where the candies were stocked by grocers who catered to people who already knew and liked them. Salt Lake City, which is almost three-quarters white, was anomalously enamored of the intensely chewy little fruit nuggets.

    The company eventually figured out what was going on: According to Teruhiro Kawabe, Morinaga America’s president, missionaries from the Church of Jesus Christ of Latter-day Saints were coming home from stints in Japan, where Hi-Chew has been omnipresent for decades, and buying up as much of the candy as they could find. “They got to know the candy in the Japanese grocery stores, and they got addicted,” Kawabe told me. Soon their friends and families were, too. The Salt Lake City scenario wasn’t exactly replicable, but Kawabe said that it served as proof of concept: Americans would love the candy, if the company could get it in front of them.

    Getting a particular product in front of shoppers, though, is much easier said than done, especially when it comes to things that are largely unknown or thought to have a niche audience. Candy purchases tend to be spur-of-the-moment decisions made at checkout counters, and that real estate is limited and has long been spoken for by conglomerates such as Hershey and Mars, which make much of the candy that Americans have been eating for their entire lives. To take a shot at mainstream American success, Hi-Chew’s makers did the usual stuff that consumer-products businesses do: They hired retail consultants, switched distributors, that kind of thing. But they also set their sights on a very important group: Major League Baseball players, the only people who routinely spend time chewing snacks in extreme close-up on TV. Morinaga supplied Japanese players in the league with Hi-Chew, Kawabe told me, focusing first on teams in markets where major retailers were headquartered. The gambit worked; ESPN reported on just how obsessed the 2015 Yankees squad was with the little fruit candies. Walgreens and CVS picked up the brand after it became popular with the Chicago Cubs and Boston Red Sox. Regular people tried the newly plentiful and suddenly trendy candy, and then insisted that their brother or spouse or co-workers try it. Hi-Chew’s U.S. sales grew from $8 million in 2012 to more than $100 million in 2021, according to Kawabe.

    This success story might feel a little bit too convenient, but baseball players’ mid-2010s Hi-Chew mania was well documented—and, apparently, ongoing. Moreover, explosive American growth in the past decade has been common among foreign candy brands. Sales of gummy candies from the German confectioner Haribo more than doubled from 2011 to 2017. Ferrero, the Italian parent company of Kinder chocolates, says that the line’s U.S. sales are growing by double digits annually. The European chocolate brands Milka and Cadbury are now owned by the American Oreo-maker Mondelez—an advantage over other confectioners when navigating import and retail red tape.

    None of these companies pulled off the same tactic with baseball players, but their rise seems to have followed similar patterns. Greenfield Cohen, from Economy Candy, said sales growth largely happens by word of mouth. This is helped along by the increasing popularity of international travel and the internet’s ability to serve niche products to a potentially large pool of previously untapped buyers. European candy in particular benefits from these dynamics—millions of American tourists visit the continent every year, and destination-specific candies are a common gift for returning travelers to bring home to loved ones. (That’s how I first tried Hi-Chew way back in 2002, although my high-school best friend had gone on a family trip to the exotic land of Tampa, not Japan.) Now the barrier between trying one piece of interesting candy—or even just hearing someone rave about it—and keeping a stockpile in your pantry or desk drawer is as low as it’s ever been.


    Of course, candy also needs to taste good for people to like it. All the word of mouth in the world won’t permanently increase sales of a bad product. Once people try candy from other parts of the world, they return to it because it is, in some very real ways, better than its domestic competitors.

    Have you ever had a matcha KitKat? Its physical form is identical to that of a regular KitKat, except instead of chocolate, it’s blanketed in bright green. Where many Americans would expect the familiar, slightly bland flavor of milk chocolate, there’s an earthy, creamy sweetness—perfect for people who, like me, get a little queasy after a few pieces of sickly sweet Halloween candy. With Hi-Chews, each wrapped in tiny squares of plain-white waxed paper, the flavors are important—and far more varied than in popular American fruit candies—but the primary feature is the texture. Chewing one feels like you’ve encountered a Starburst that fights back. It’s delicious.

    The reasons for foreign candy’s superiority are varied—and more surprising than you might expect. In some cases, yes, a candy is better because it is fundamentally different, on a chemical level, from what’s available in America. Europe’s strict regulations on chocolate quality mean that it offers something that’s not really comparable to a Hershey bar (and Europeans are generally enthusiastic to tell you how much American chocolate sucks). The European Union also bans certain food additives that the FDA allows, which can yield slightly different results in all kinds of finished products, including candy.

    These cases seem to be the exception, not the rule, however. Ali Bouzari, a culinary scientist and co-founder of the product-development firm Pilot R&D, doesn’t buy the idea that inherently superior quality is the reason that so many people are charmed by imported sweets. “The basic tools of commercial candy manufacture are pretty universal, and the ingredients that people work with are fully globalized,” Bouzari told me. German brands, Japanese brands, and American brands likely all source their grape flavorings, for example, from the same vendors. What’s different—and what makes foreign candies so enticing—instead mostly seems to be in the implementation. Imported candies tend to embrace flavors and textures that American candies don’t. “I will always first go for the melon stuff” when shopping in an East Asian grocery store, Bouzari said. “This is candy inspired by a culture that thinks about melons more than we do.” Every part of the world has some kind of confection that it does particularly well: Scandinavians produce more flavors and textures of licorice than most Americans could dream of. Mexican candy frequently includes savory or spicy flavors. Candies from a number of East and South Asian countries tend to feature a far wider array of fruit flavors than are available in the West.

    The flavorings and ingredients that go into these candies are likely available to American manufacturers from the vendors they’re already using, according to Bouzari. Foreign producers develop products primarily for their domestic markets, so they make different choices and end up with results that can feel idiosyncratic—sometimes thrillingly so—to the American palate. As food culture has globalized, those palates have become more adventurous, especially in larger metropolitan areas, where more types of food have become more widely available in restaurants and grocery stores than ever before. Meanwhile, Bouzari told me, major U.S. manufacturers haven’t really kept up. They depend on appealing to as broad a swath of the country’s atypically diverse population as possible—not just across racial and ethnic lines, but across the country’s many local and regional food cultures. The results are candies that tend to be highly sweet and pretty bland, forgoing flavors and textures that brands believe might alienate white Americans in particular.

    All that being said, American tastes have a way of bending the world to their will. Once a foreign confectioner achieves a certain level of American success, it usually ends up adjusting its products for the American market, even if only a little. Kawabe, Morinaga America’s president, told me that some of the Hi-Chew flavors sold in mainstream U.S. retailers vary slightly from what’s available in Japan. When Americans buy grape candy, for example, their flavor expectations are just different from when the Japanese buy the same thing. Candy companies that want huge U.S. sales growth, for better or worse, need to meet people where they are.

    The most salient difference between foreign and domestic candy might not be chemical or methodological, but rather philosophical. New American products could theoretically embrace the lessons of imported candy and snatch up some of its growing domestic market share. But in Bouzari’s experience, much of the candy being developed domestically, such as low-carb candy from brands like Smart Sweets and Highkey, isn’t trying to delight consumers, but to placate their health fears by engineering it into diet food. “Candy is meant to be edible, ephemeral entertainment,” he said. “If you try to turn it into food, you get caught in a weird no-man’s-land where it’s neither the complete entertainment that it should be, and it’s not as nourishing as it should be.”

    For Americans who want something fun and novel and sweet, overseas might just be the most logical place to look right now. “In most other places I’ve been in the world, there is a more well-adjusted relationship to hedonism in food than we have here,” Bouzari said. “Other people spend less time trying to figure out how to eat gummy bears with no sugar.”


    Amanda Mull
