ReportWire


  • Go Ahead, Try to Explain Milk

    If an alien life form landed on Earth tomorrow and called up some of the planet’s foremost experts on lactation, it would have a heck of a time figuring out what, exactly, humans and other mammals are feeding their kids.

    The trouble is, no one can really describe what milk is—least of all the people who think most often about it. They can describe, mostly, who makes it: mammals (though arguably also some other animals that feed their young secretions from their throat or their skin). They can describe, mostly, where it comes from: mammary glands via, usually, nipples (though please note the existence of monotremes, which ooze milk into abdominal grooves). They can even describe, mostly, what milk does: nourish, protect, and exchange chemical signals with infants to support development and growth.

    But few of these answers get at what milk, materially, compositionally, is actually like. Bridget Young, an infant-nutrition researcher at the University of Rochester, told me milk was an “ecological system”; Alan S. Ryan, a clinical-research consultant, called it a “nutritional instrument.” Bruce German, a food scientist at UC Davis, told me milk was “the result of the evolutionary selective pressure on a unique feeding strategy,” adding, by way of clarification, that it was “a biological process.” A few researchers defaulted to using milk to explain something else. “It’s the defining feature of mammals,” says Melanie Martin, an anthropologist at the University of Washington. None of these characterizations were bad. But had I been that alien, I would have no idea what these people were talking about.

    What these experts were trying to avoid was categorizing milk as a “food”—the way that most people on Earth might, especially in industrialized countries where dairy products command entire supermarket aisles. “Overwhelmingly, when we think about milk, when we talk about milk, we think of nutrition,” says Katie Hinde, an evolutionary biologist at Arizona State University. That’s not the wrong way to think about it. But it’s also not entirely right.

    The milk that mammals make is undoubtedly full of the carbs, fat, protein, vitamins, and minerals newborn mammals need to survive. And, across species, much of it does resemble the creamy, tart-tangy, lactose-rich whitish liquid that billions of people regularly buy. But to consider only milk’s nutrient constituents—to imply that it has a single recipe—is to do it “a disservice,” German told me. Mammalian milk is a manifestation of hundreds of millions of years of evolutionary tinkering that have turned it into a diet, and a developmental stimulus, and a conduit for maternal-infant communication, and a passive vaccine. It builds organs, fine-tunes metabolism, and calibrates immunity; it paints some of an infant’s first portraits of its mother, and telegraphs chemical signals to the microbes that live inside the gut. Milk can sustain echidnas that hatch from eggs, and wildebeest that can gallop within hours of birth; it can support newborn honey possums that weigh just three milligrams at birth, and blue-whale calves clocking in at up to 20 tons. Among some primates, it influences infants’ playfulness, and may shape their sleep habits and bias them toward certain foods. Some of its ingredients are found nowhere else in nature; others are indigestible, still others are alive.

    Milk is also dynamic in a way that no other fluid is. It remodels in the hours, days, weeks, and months after birth; it changes from the beginning of a single stint of feeding to the end. In humans, scientists have identified “morning” milk that’s high in cortisol, and “night” milk that’s heavy in melatonin; certain primates have “boy milk and girl milk,” German told me, which support subtly different developmental needs. Tammar wallabies, which can nurse two joeys of different ages at once, even produce milks tailored to each offspring’s developmental stage; Kevin Nicholas, a biologist at Monash University, has found that when the joeys swap teats, the younger sibling’s growth accelerates. And when mothers and their offspring change, milk changes in lockstep. It reflects the mother’s stress level and physical health, taking on new flavors as her diet shifts; its fat content fluctuates, depending on how far apart bouts of nursing are spaced. Scientists are just beginning to understand how made-to-order milk might be: Some evidence suggests that maternal tissues may register, via the breast, when infants catch infections—and modify milk in real time to furnish babies with the exact immune cells or molecules they need.

    “It’s a triad: mother, milk, and infant,” says Moran Yassour, a computational biologist at Hebrew University of Jerusalem. “Each one of them is playing a role, and the milk is active.” That dynamism makes milk both a miracle, and an enduring mystery—as unique and unreplicable as any individual parent or child, and just as difficult to define.


    In its earliest forms, milk probably didn’t have much nutritional value at all. Scientists think the substance’s origins date back about 300 million years, before the rise of mammals, in a lineage of creatures that hatched their young from very delicate eggs. The structures that would later develop into mammary glands started out similar to the ones we use to sweat; the substance that would become proper milk pooled on the surface of skin and was slathered onto shells. The earliest milks probably had few calories and almost none of milk’s hallmark lactose. But they were deeply hydrating, and teeming with immunity.

    As our ancestors jettisoned egg laying for live birth, they began to extrude milk not just as a defensive shield for their offspring, but as a source of calories, vitamins, and minerals. The more that milk offered to infants, the more that it demanded of those that produced it: Mothers “dissolve themselves to make it,” German told me, liquefying their own fat stores to keep their babies fed, “which is impressive and scary at the same time.” In its many modern manifestations, milk is, in every mammal that produces it, a one-stop shop for newborn needs—“the only real time in life where we have hydration, nutrients, and bioactive factors that are all a single source,” says Liz Johnson, an infant-nutrition researcher at Cornell.

    Each time mammals have splintered into new lineages, taking on new traits, so too has their milk. While most primates and other species that can afford to spend months doting on their young produce dilute, sugary milks that can be given on demand, other mammals have evolved milk that encourages more independence and is calorific enough to nourish in short, ultra-efficient bursts. Hooded seals, which have to wean their pups within four days of birth, churn out goopy milk that’s nearly sugar-free, but clocks in at about 60 percent fat—helping their offspring nearly double in weight by the time they swim away. Marsupial milk, meanwhile, is ultra-sweet, with double or triple the sugar content of what cows produce, and cottontail rabbits pump out a particularly protein-rich brew. (One thing milk can’t do? Be high in both sugar and fat, says Mike Power, a biological anthropologist at the Smithsonian Conservation Biology Institute, where he maintains a large repository of mammalian milk: “Nature has never been able to produce ice cream.”) Each species’ milk even has its own microbiome—a community of helpful bacteria that goes on to seed the newborn infant’s gut. Mammal milks are now so specialized to their species that they can’t substitute for one another, even between species that otherwise live similar lives.

    Human milk—like other primate milk—is on the watery, sugary side. But its concentrations of immunity-promoting ingredients have no comparator. It bustles with defensive cells; it shuttles a stream of antibodies from mother to young, at levels that in some cases outstrip those of other great apes’ milk by a factor of at least 10. Its third-most-common solid ingredient is a group of carbohydrates known as human milk oligosaccharides, or HMOs, which aren’t digestible by our own cells but feed beneficial bacteria in the colon while keeping pathogens out. Roughly 200 types of oligosaccharides have been found in human milk—an inventory with more diversity, complexity, and nuance than that of any other mammalian species described to date, says Concepcion Remoroza, a chemist who’s cataloging the HMOs of different mammalian milks at the National Institute of Standards and Technology.

    The sheer defensive firepower in our species’ milk is probably a glimpse into the challenges in our past, as humans crowded together to plant, fertilize, and harvest mass quantities of food, and invited domesticated creatures into our jam-packed homes. “We were basically concentrating our pathogens and our parasites,” Power told me, in ways that put infants at risk. Perhaps the millennia modified our milk in response, making those unsanitary conditions possible to survive.


    Mammals would not exist without their milk. And yet, “we don’t actually know that much about milk,” down to the list of its core ingredients in our own species, says E. A. Quinn, an anthropologist at Washington University in St. Louis. Even for the breast-milk components that scientists can confidently identify, Quinn told me, “we don’t really have a good handle on what normal human values are.” Many studies examining the contents of breast milk have focused on Western countries, where the population skews wealthier, well nourished, and white. But so much varies from person to person, from moment to moment, that it’s tough to get a read on what’s universally good; likely, no such standard exists, at least not one that can apply across so many situations, demographics, and phases of lactation, much less to each infant’s of-the-moment needs.

    Milk’s enduring enigmas don’t just pose an academic puzzle. They also present a frustrating target—simultaneously hazy and mobile—for the infant formulas that billions of people rely on as a supplement or substitute. Originally conceived of and still regulated as a food, formula fulfills only part of milk’s tripartite raison d’être. Thanks to the strict standards on carb, fat, protein, vitamin, and mineral content set by the FDA and other government agencies, modern formulas—most of which are based on skim cow’s milk—do “the nourish part really well,” helping babies meet all their growth milestones, Bridget Young, the University of Rochester infant-nutrition researcher, told me. “The protect and communicate part is where we start to fall short.” Differences in health outcomes for breastfed and formula-fed infants, though they’ve shrunk, do still exist: Milk-raised babies have, on average, fewer digestive troubles and infections; later in life, they might be less likely to develop certain metabolic issues.

    To close a few of those gaps, some formula companies have set their sights on some of milk’s more mysterious ingredients. For nearly a decade, Abbott, one of the largest manufacturers of formula in the United States, has been introducing a small number of HMOs into its products; elsewhere, scientists are tinkering with the healthful punch via live bacterial cultures, à la yogurt. A few are even trying a more animal-centric route. The company ByHeart uses whole cow’s milk as its base, instead of the more-standard skim. And Nicholas, the Monash University biologist, is taking inspiration from wallaby milk—complex, nutritious, and stimulating enough to grow organs of multiple species almost from scratch—which he thinks could guide the development of formulas for premature human infants not yet ready to subsist solely on mature milk.

    All of these approaches, though, have their limits. Of the 200 or so HMOs known to be in human milk, companies have managed to painstakingly synthesize and include just a handful in their products; the rest are more complex, and even less well understood. Getting the full roster into formula will “never happen,” Sharon Donovan, a nutritional scientist at the University of Illinois at Urbana-Champaign, told me. Other protein- and fat-based components of milk, specially packaged by mammary glands, are, in theory, more straightforward to mix in. But those ingredients might not always behave as expected when worked onto a template of cow’s milk, which just “cannot be compared” to the intricacies of human milk, Remoroza told me. (In terms of carbs, fats, and protein, zebra milk is, technically, a better match for us.)

    A company called Biomilq is trying a radical way to circumvent cows altogether: It’s in the early stages of growing donated human-mammary-gland cells in bioreactors, in hopes of producing a more recognizable analogue for breast milk, ready-made with our own species-specific mix of lactose, fats, and proteins, and maybe even a few HMOs, Leila Strickland, one of Biomilq’s co-founders, told me. But even Strickland is careful to say that her company’s product will never be breast milk. Too many of breast milk’s immunological, hormonal, and microbial components come from elsewhere in the mother’s body; they represent her experience in the world as an entire person, not a stand-alone gland. And like every other milk alternative, Biomilq’s product won’t be able to adjust itself in real time to suit a baby’s individual needs. If true milk represents a live discourse between mother and infant, the best Biomilq can manage will be a sophisticated, pretaped monologue.

    For all the ground that formula has gained, “no human recipe can replicate what has evolved” over hundreds of millions of years, Martin, of the University of Washington, told me. That may be especially true as long as formula continues to be officially regarded as a food—requiring it to be, above all else, safe, and every batch the same. Uniformity and relative sterility are part and parcel of mass production, yet almost antithetical to the variation and malleability of milk, Cornell’s Johnson told me. And in regulatory terms, foods aren’t designed to treat or cure, which can create headaches for companies that try to introduce microbes and molecules that carry even a twinge of additional health risk. Float the notion of a very biologically active addition like a growth factor or a metabolic hormone, and that can quickly “start to scare people a bit,” Donovan, of the University of Illinois at Urbana-Champaign, told me.

    As companies have vied to make their formulas more milk-esque and complex, some experts have discussed treating them more like drugs, a designation reserved for products with proven health impact. But that classification, too, seems a poor fit. “We’re not developing a cure for infancy,” Strickland, of Biomilq, told me. Formula’s main calling is, for now, still to “promote optimal growth and development,” Ryan, the research consultant, told me. Formula may not even need to aspire to meet milk’s bar. For babies that are born full-term, who remain up-to-date on their vaccinations and have access to consistent medical care, who are rich in socioeconomic support, who are held and doted on and loved—infants whose caregivers offer them immunity, resources, and guidance in many other ways—the effect of swapping formula for milk “is teeny,” Katie Hinde, of Arizona State University, told me. Other differences noted in the past between formula- and breastfed infants have also potentially been exaggerated or misleading; so many demographic differences exist between people who are able to breastfeed their kids and those who formula-feed that tracing any single shred of a person’s adult medical history back to their experiences in infancy is tough.

    The biggest hurdles in infant feeding nowadays, after all, are more about access than tech. Many people—some of them already at higher risk of poorer health outcomes later in life—end up halting breastfeeding earlier than they intend or want to, because it’s financially, socially, or institutionally unsustainable. Those disparities are especially apparent in places such as the U.S., where health care is privatized and paid parental leave and affordable lactation consultants are scarce, and where breastfeeding rates splinter unequally along the lines of race, education, and socioeconomic status. “Where milk matters the most, breastfeeding tends to be supported the least,” Hinde told me. If milk is a singular triumph of evolution, a catalyst for and a product of how all mammals came to be, it shouldn’t be relegated to a societal luxury.


    Katherine J. Wu


  • American Food Will Never Look Natural Again

    In 1856, an amateur chemist named William Henry Perkin mixed a batch of chemicals that he hoped, in vain, would yield the malaria drug quinine. When Perkin’s failed experiment turned purple, a hue so vivid that it could stain silks without fading, he realized he’d stumbled upon a different marvel of modernity: a commercially viable synthetic dye, the first of a new generation of chemicals that would revolutionize the way humans colored their clothes and, soon after, their food.

    The edible versions of the chemicals, in particular, were a revelation, offering food manufacturers “cheap and convenient” alternatives to pigments squeezed painstakingly from natural sources such as plants, says Ai Hisano, a historian and the author of Visualizing Taste: How Business Changed the Look of What You Eat. Dyes could keep peas verdant after canning and sausages pink after cooking; they could turn too-green oranges more orange and light up corner-shop candy displays. By the Second World War, synthetic dyes had become, as one grocer put it, “one of the greatest forces in the world” in the sale of foods. And the more foods the chemicals were introduced to, the more the chemicals came to define how those foods should look: the yellow of butter, the crimson of strawberry Jell-O.

    But after hitting a mid-20th-century peak, the roster of synthetic dyes used in Western foods began to shrink. In recent years, European countries have appended warning labels onto the products that contain them; the United States has whittled down its once-long list of approved artificial food dyes to just nine. The FDA is now reviewing a petition to delist Red No. 3, which colors candy corn, conversation hearts, and certain chewing gums and cake icings; California and New York are mulling legislation that could ban the additive, along with several others, by 2025.

    The concern is that the dyes add not just colors but a substantial health risk. Several of the compounds have been linked to patterns of hyperactivity and restlessness in kids. Red No. 3 has also been known since the 1980s to cause cancer in rats. The precise explanation for the harm is unclear; research into the issue has been spotty, and “there is no comprehensive set of data that says, ‘This is the mechanism,’” according to Elad Tako, a food scientist at Cornell University. Several respected researchers have even dismissed the evidence as overhyped. More than a century into the dyes’ tenure, “there is not even consensus on the fact that they are dangerous,” or what happens when our bodies snarf them down, says Monica Giusti, a food scientist at Ohio State University.

    Even so, the argument against artificial food dyes seems as though it should be simple: They have no known nutritional benefits and potentially carry several health risks. “We’re talking about something that’s cosmetic versus something that is hurting kids,” says Lisa Lefferts, an environmental-health consultant who has petitioned the FDA to ban Red No. 3. And yet, the dyes endure—precisely because they offer our foods and our eyes shades that nature never could.


    When synthetic food dyes were newer, their shortcomings were hard to miss. One of the colorants’ main ingredients was derived from the by-products of the process that turned coal into fuel—and in the absence of careful scrutiny, some early batches of the dyes ended up contaminated with arsenic, mercury, and lead. Companies also used the dyes to conceal defects or spoilage that then sickened many people. By the 1930s, Congress required, among other safety measures, that government scientists vet the chemicals’ safety and restricted companies to sourcing exclusively from an approved list.

    But dangerous chemicals seemed to keep slipping through. In the 1950s, after a batch of Halloween candy sickened several children, FDA scientists found that the culprit was the synthetic dye that had turned the treats orange—a dye so toxic that it caused organ damage and even premature death in lab animals. The agency hastily banned it and, by the late 1970s, axed nearly a dozen other synthetic dyes linked to cancers and organ damage in animals. Today, Americans regularly see just seven artificial dyes in their foods; two others are used very sparingly.

    Still, roughly 19 million pounds of the seven prevalent synthetic dyes were certified by the FDA to flood the U.S. food supply in fiscal year 2022—and no one agrees on which colorants pose the biggest threat. In the European Union and the United Kingdom, foods containing any of six synthetic food dyes—including the three most common ones in the U.S.: Red No. 40, Yellow No. 5, and Yellow No. 6—must warn customers that the colorants “may have an adverse effect on activity and attention in children.” The FDA, however, has yet to adopt any such posture—even though it’s long since delisted Red No. 2, which is still allowed in Europe. Even Red No. 3—which has been linked to both cancer in animals and behavioral issues in kids, and may be one of the most concerning additives remaining in the American food supply, according to Peter Lurie, the president and executive director of the Center for Science in the Public Interest—carries a mixed rap. The FDA banned it from cosmetics and externally applied drugs decades ago but still allows it in food; countries in Europe have restricted its use but don’t mind adding it to certain canned cherries to maintain their hue.

    On the whole, the International Association of Color Manufacturers, which represents the color-additives industry, told me that the claims around food dyes and health risks aren’t sound, pointing out that many of the studies on synthetic colors have yielded conflicting results. The FDA, too, maintains that color additives “are very safe when used properly.” The links, to be fair, are tough to study: With behavior-focused outcomes in kids, for instance, “you’re looking at more subtle kinds of changes that you find on a population basis,” and some children seem more sensitive than others, further muddying the stats, says Linda Birnbaum, the former director of the National Institute of Environmental Health Sciences and the National Toxicology Program. And some laboratory studies on the chemicals have delivered them into rodents in high doses or via tubes down their throats, making the data’s relevance to us a bit shakier. But although some argue that there’s not enough evidence to conclude that the dyes definitely pose a peril, others rightly note that there’s also insufficient data to conclude that they don’t. For all of the pounds of the chemicals we’ve gulped down, “there are still more questions than answers about artificial colorants,” says Diego Luna-Vital, a food scientist at the Monterrey Institute of Technology and Higher Education, in Mexico.

    Lefferts, the environmental-health consultant, is one of several researchers who’d rather err on the side of caution and expunge the entire current roster of artificial food dyes. The potential losses seem negligible, she told me, and the possible benefits immense. Scientists may not even yet know the extent of the dyes’ issues: Just last year, a group led by Waliul Khan of McMaster University published evidence that Red No. 40 may raise the risk of colitis in mice. But without an outright push from the FDA, manufacturers have little incentive to change their practices. And there’s not exactly a clear-cut path toward developing new synthetic colorants with a less dubious safety profile: Without identifying why current dyes might be dangerous, scientists can’t purposefully avoid the root problem in future ones, says Thomas Galligan, CSPI’s principal scientist for food additives and supplements.


    In the background of the fight over artificial dyes, the colorants’ natural counterparts are making a slow and steady comeback. In the EU and the U.K., consumers can find Starburst and M&M’s tinted mostly with plant extracts. And in the U.S., Kraft has re-created the artificial-orange hue of its mac and cheese with a blend of annatto, turmeric, and paprika. Recent surveys have shown that a growing contingent of the global population is eager to eat cleaner ingredients—not, as Jim Murphy, the former president of General Mills, once put it, “colors with numbers in their foods.”

    But in late 2017, Murphy would go on to eat his words, after his company’s all-natural version of Trix debuted, then rapidly tanked. Trix traditionalists were horrified at the revamped recipe’s muted melange of purple-y reds and orangey yellows, devoid of the greens and blues that General Mills had struggled to replicate naturally; they called it “disgusting” and “basically a salad now.” Just two years after pledging to purge its products of artificial additives, General Mills reinstated “classic Trix”—complete with its synthetics-laden ingredient list. A similar story played out with Necco, which removed the artificial dyes from its wafers only to quickly return them; Mars, too, publicly promised to remove synthetics from its American products, then let its self-imposed deadline pass without making good.

    Natural dyes, it turns out, are still a chore to work with, for the same reasons they were once so easily replaced. They’re expensive to extract and process; their colors are inconsistent, and tend to fade quite fast, especially in the presence of light and heat, Luna-Vital told me. Humans are also limited by what nature makes available, and by the fickleness of those compounds: They often “change on us,” Giusti told me, when researchers mix them into recipes. Sometimes the colors even impart unwanted flavors or funk.

    Several companies, including Sensient and Kalsec, told me that they are now trying to introduce modifications or tweaks that enhance natural pigments’ stability and vibrancy to help them compete. But the more tinkering happens, the more these new dyes could start to resemble the ones that researchers want them to oust. Nowadays, even natural colorants “are artificially created, on some level,” Hisano, the historian, told me. And although the FDA’s regulatory standards assume that plant-, animal-, and mineral-derived dyes will be a safer alternative to synthetics, going as far as to exempt them from certain tests, relying on the simple reassurance that a source is natural is, admittedly, “not the strongest scientific argument,” Michael Jacobson, the former executive director of CSPI, told me. Nature-made, after all, has never been synonymous with safe: It wasn’t so long ago that bakers were bleaching their breads with chalk and dairy manufacturers were tingeing their milks yellow with lead chromate. (“The FDA’s regulations require evidence that a color additive is safe at its intended level of use before it may be added to foods,” a spokesperson told me.)

    There is, technically, another option—abstaining from adding colors to foods at all. But that would fundamentally transform how we experience our meals. Added dyes and pigments—both artificial and natural—are mainstays not just of sports drinks and packaged sweets but also of salad dressing, yogurt, pickles, peanut butter, and dried and smoked meats; they’re what makes farmed-salmon flesh pink. Vision is key to taste: “There’s probably no other sensory cue that gives us as much information about what we’re about to eat,” says Charles Spence, an experimental psychologist at the University of Oxford. In what might be an echo of the preferences that helped our ancestors find ripe fruits, Spence told me, our modern brain still tends to link pinks and reds to sugar, and yellows and greens to all things tart. Colors can play tricks too: When researchers artificially darken the tint of drinks or yogurt, study subjects insist that it tastes sweeter; when consumers see a rainbow of flavors in their snacks, the sheer appeal of variety may persuade some of them to eat more.

    Some of artificial dyes’ biggest dangers, then, may not even be entirely inherent to the chemicals themselves. Foods that need a color boost tend to be the ones that experts already want us to avoid: candies, sodas, and packaged, processed snacks, especially those marketed to children, points out Lindsay Moyer, a CSPI nutritionist. Colors so exaggerated, so surprising, so unnatural inevitably tempt kids “to reach out of the grocery cart,” Moyer told me. Dyes, once cooked up by us to mimic and juxtapose with the natural world, have long since altered us—manipulating our base instincts, warping our appetites—and transformed into a luxury that the world now seems entirely unable to quit.


    When you buy a book using a link on this page, we receive a commission. Thank you for supporting The Atlantic.


    Katherine J. Wu
