ReportWire

  • Milk Has Lost All Meaning

    You overhear a lot of strange things in coffee shops, but an order for an “almond-based dairy-alternative cappuccino” is not one of them. Ditto a “soy-beverage macchiato” or an “oat-drink latte.” Vocalizing such a request elicited a confidence-hollowing glare from my barista when I recently attempted this stunt in a New York City café. To most people, plant-based milk is plant-based milk.

    But though the American public has embraced this naming convention, the dairy industry has not. For more than a decade, companies have sought to convince the FDA that plant-based products shouldn’t be able to use the M-word. An early skirmish played out in 2008 over the name “soy milk,” which, the FDA acknowledged at the time, wasn’t exactly milk; a decade later, then-FDA Commissioner Scott Gottlieb pointed out that nut milk shouldn’t be called “milk” because “an almond doesn’t lactate.” To be safe, some fake-milk products have stuck to vaguer labels such as “drink,” “beverage,” and “dairy alternative.”

    But a few weeks ago, the FDA signaled an end to the debate by proposing long-awaited naming recommendations: Plant-based milk, the agency said, could be called “milk” if its plant origin was clearly identified (for example, “pistachio milk”). In addition, labels could clearly state how the product differs nutritionally from regular milk. A package labeled “rice milk” would be acceptable, but it should note when the product has less calcium or vitamin D than milk.

    Rather than prompt a détente, these recommendations are sucking milk into an existential crisis. Differentiating plant-based milk and milk requires defining what milk actually is, but doing so is at odds with the acknowledgement that plant-based milk is milk. It is impossible to compare plant-based and cow’s milk if there isn’t a standard nutrient content for cow’s milk, which comes in a range of formulations. This awkward moment is the culmination of a decades-long shift in the way the FDA—and consumers—have come to think about and define food in general. At this point, it’s unclear what milk is anymore.

    Technically, milk has an official definition, together with more than 250 other foods, including ketchup and peanut butter. In 1973, the FDA came up with this: “The lacteal secretion, practically free from colostrum, obtained by the complete milking of one or more healthy cows.” (Yum.) The recent guidance doesn’t override this definition but doesn’t uphold it either, so milk’s status remains vague. The agency doesn’t seem to mind; consumers understand that plant-based milk isn’t dairy milk, a spokesperson told me. But the FDA has long allowed for loose interpretations of this standard, which is why the lacteal secretions of sheep and goats can be called “milk.” As time goes on, what can be called “milk” seems to matter less and less.

    At one point, names mattered. In the late 1800s, people began to worry that their food was no longer “normal and natural and pure,” Xaq Frohlich, a food historian at Auburn University who is writing a book on the history of the FDA’s food standards, told me. As food production scaled up in the late 19th century, so did attempts to cut corners with cheap products parading as the real thing, such as margarine made with beef tallow. In 1939, the FDA began establishing so-called standards of identity based on traditional ideas of food.

    But the agency’s food definitions were malleable even before oat milk. The agency hasn’t been very strict about standards of identity, because consumers haven’t been, either. Around the 1960s, as people became aware of the ills of animal fat and cholesterol—and purchased the low-fat and diet foods that proliferated in response—the agency moved away from defining the identity of food toward a policy of “informative labeling” that provided nutritional information directly on the package so consumers knew exactly what they were eating. It became accepted that food was something that could be “tinkered with,” Frohlich said, and what mattered more than whether something was natural was whether it was healthy. In the midst of this change, milk was assigned its official identity, which came with caveats for added vitamins. Loosely interpreted, “milk” soon came to encompass that of other ruminants, as well as chocolate, strawberry, skim, lactose-free, and calcium-fortified stuff.

    In this context, the FDA’s recent expansion of this standard to accommodate plant-based milk is to be expected; Frohlich doesn’t think the plant-based or dairy industries “are particularly surprised by this proposal.” Very little will change if the new guidance becomes policy. (The decision has to go through a public-comment period before the FDA issues the final word.) If anything, there may be more plant-based products labeled “milk” at the supermarket, and perhaps the new labels will stave off any potential confusion. Pointing out nutritional differences between plant-based and dairy milk on packaging, the FDA spokesperson said, is meant to address the “potential public-health concern” that people will mistakenly expect these products to be nutritional substitutes for each other. But the nutritional value of dairy milk varies depending on the type, and in some cases, the nutrients are added in. Milk is just confusing, and perhaps that’s okay. For most consumers, milk will continue to be milk—a white-ish fluid, sourced from a variety of plants and animals, and ever-evolving.

    Milk aside, for most modern consumers, what to call a food matters less than other factors, such as what it consists of, where it comes from, how it’s made, and its impact on the planet. “Public understandings of food have really changed since the early 21st century,” Charlotte Biltekoff, a professor of food science and technology at UC Davis, told me. In some cases, people don’t define food by what it is so much as what it does. Many plant-based milks, Biltekoff said, don’t look or taste much like dairy milk but are accepted as milk because they’re used in the same way: splashed in coffee, poured into cereal, or as an ingredient in baked goods. In short, trying to define food with a standard identity can’t capture “the full scope of how most people interact with food and health right now,” she said. A name—or, indeed, a label pointing out nutritional differences between dairy and plant-based milk—can encompass only a fraction of what people want to know about milk, all of which is beyond what the FDA can regulate, Biltekoff added. No wonder its name doesn’t seem to matter much anymore.

    That’s not to say that all food names will eventually become diffuse to the point of meaninglessness. It’s hard to imagine peanut referring to anything but the legume, but then again, a debate over what counted as “peanut butter” lasted for a decade in the ’60s and ’70s. Naming clashes, in all likelihood, will occur over staple foods that already attract a lot of scrutiny and are produced by powerful industries, such as eggs or meat. For example, Americans use the term meat flexibly: In addition to animal flesh, it can also refer to products made from plants, fungi, or even mammal cells grown in a lab. Just as the dairy and plant-based industries fueled the naming debate over milk, there will undoubtedly be pushback from those holding on to and breaking meat conventions: “You will see the meat industry make similar arguments” about what constitutes a hamburger or what lab-grown chicken can be named, Frohlich said.

    So long as technology keeps pushing the boundaries of what food can be, food names will continue to shift, and the results won’t always be neat. Someone can value natural foods plucked from farmers’ markets and served to them at farm-to-table restaurants but at the same time champion technological advances that make different versions of our foods possible. Such a person might exclusively eat free-range organic bacon but demand highly processed oat milk for their cortado. These inner conflicts are inevitable as we undergo what Biltekoff calls “a kind of evolution in our understanding of what good food is.” Milk, for now, remains fluid—simultaneously many things and nothing at all.

    Yasmin Tayag

  • Expiration Dates Are Meaningless

    For refrigerators across America, the passing of Thanksgiving promises a major purge. The good stuff is the first to go: the mashed potatoes, the buttery remains of stuffing, breakfast-worthy cold pie. But what’s that in the distance, huddled gloomily behind the leftovers? There lie the marginalized relics of pre-Thanksgiving grocery runs. Heavy cream, a few days past its sell-by date. A desolate bag of spinach whose label says it went bad on Sunday. Bread so hard you wonder if it’s from last Thanksgiving.

    The alimentarily unthinking, myself included, tend to move right past expiration dates. Last week, I considered the contents of a petite container in the bowels of my fridge that had transcended its best-by date by six weeks. Did I dare eat a peach yogurt? I sure did, and it was great. In most households, old items don’t stand a chance. It makes sense for people to be wary of expired food, which can occasionally be vile and incite a frenzied dash to the toilet, but food scientists have been telling us for years—if not decades—that expiration dates are mostly useless when it comes to food safety. Indeed, an enormous portion of what we deem trash is perfectly fine to eat: The food-waste nonprofit ReFED estimated that 305 million pounds of food would be needlessly discarded this Thanksgiving.

    Expiration dates, it seems, are hard to quit. But if there were ever a moment to wean ourselves off the habit of throwing out “expired” but perfectly fine items because of excessive caution, it is now. Food waste has long been a huge climate issue—rotting food’s annual emissions in the U.S. approximate those of 42 coal-fired power plants—and with inflation’s brutal toll on grocery bills, it’s also a problem for your wallet. People throw away roughly $1,300 a year in wasted food, Zach Conrad, an assistant professor of food systems at William and Mary, told me. In this economy? The only things we should be tossing are expiration dates themselves.

    Expiration dates, part of a sprawling family of labels that includes the easily confused siblings “best before,” “sell by,” and “best if used by,” have long muddled our conception of what is edible. They do so by insinuating that food has a definitive point of no return, past which it is dead, kaput, expired—and you might be, too, if you dare eat it. If only food were as simple as that.

    The problem is that most expiration dates convey only information about an item’s quality. With the exception of infant formula, where they really do refer to expiration, dates generally represent a manufacturer’s best estimate of how long food is optimally fresh and tasty, though what this actually means varies widely, not least because there is no federal oversight of labeling. Milk in Idaho, for example, can be “sold by” grocery stores more than 10 days later than in neighboring Montana, though the interim makes no difference in terms of quality. Some states, such as New York and Tennessee, don’t require labels at all.

    Date labels have been this haphazard since they arose in the 1970s. At the time, most Americans had begun to rely on grocery stores to get their food—and on manufacturers to know about its freshness. Now “the large majority of consumers think that these [labels] are related to safety,” Emily Broad Leib, a Harvard Law School professor and the founding director of its Food Law and Policy Clinic, told me. A study she co-authored in 2019 found that 84 percent of Americans at least occasionally throw out food close to the date listed on the package. But quality and safety are two very different things. Plenty of products can be edible, if not tasty, long past their expiration date. Safety, to food experts, refers to an item’s ability to cause the kind of food poisoning that sends people to the hospital. It’s “no joke,” Roni Neff, a food-waste expert at Johns Hopkins University, told me.

    Consider milk, which is among the most-wasted foods in the world. Milk that has already soured or curdled can—get this—still be perfectly safe to consume. (In fact, it makes for fluffy pancakes and biscuits and … skin-softening face masks.) “If you take a sip of that milk, you’re not going to end up with a foodborne illness,” Broad Leib said, adding that milk is one of the safest foods on the market because pasteurization kills all of the germs. Her rule of thumb for other refrigerated items is that anything destined for the stove or oven is safe past its expiration date, so long as it doesn’t smell or look odd. In industry speak, cooking is a “kill step”—one that destroys harmful interlopers—if done correctly. And then there is the pantry, an Eden of forever-stable food. Generally, dry goods never become unsafe, even if their flavor dulls. “You’re not taking your life into your hands if you’re eating a stale cracker or cereal,” said Broad Leib.

    Of course it would just be easier if labels were geared toward safety, but for the majority of food, the factors are too complex to sum up in a single date. Food is considered unsafe if it carries pathogens such as listeria, E. coli, or salmonella that can cause foodborne illness. These sneak into food through contamination, like when E. coli–tainted water is used to grow romaine lettuce. Proper storage, which means temperatures colder than 40 degrees Fahrenheit or hotter than 140 degrees Fahrenheit, inhibits their growth (except for listeria, which is particularly scary because it can thrive during refrigeration). It would be extremely difficult for a label to reflect all of this information, especially given that unsafe storage and contamination tend to occur after purchase, in hot car trunks and on unsanitized countertops. But as long as food doesn’t carry these germs to begin with, pathogens won’t suddenly appear the moment the clock strikes midnight on the expiration date. “They’re not spontaneous. Your crackers aren’t, like, contracting salmonella from the shelf,” said Broad Leib.

    There is, however, one category of food that should be labeled. Sometimes referred to as “foods pregnant women should avoid,” it includes certain ready-to-eat products such as deli meats, raw fish, sprouted vegetables, and unpasteurized milk and cheese, Brian Roe, a professor at Ohio State University’s Food Innovation Center, told me. These require extra caution because they can carry listeria, which is invisible to the senses, and are usually served cold—that is, they don’t go through a kill step before serving. Experts I spoke with agreed that high-risk foods should be identified as such, because there’s no way to tell if they’ve become unsafe. As things stand, the date label is the only information available, and it is “not helping people protect themselves from that handful of foods,” said Broad Leib. To address this shortcoming, efforts are under way in the Senate and the House to replace all date labels with two phrases: “best if used by” to denote quality and “use by” for safety.

    But it’s one thing to know expiration dates are bogus and another to live accordingly. In America, dates have become a tradition we can’t escape, Neff said, adding that the stickler of each household usually gets to set the rules. And even for more adventurous eaters, date labels serve a purpose: They’re a tool for calibrating judgment, or merely for providing the comfort of a reference point. “There’s something about seeing a number there that we think tells us something that gives us a sense of security,” Neff said. Manufacturers, meanwhile, maintain date labels because they don’t want to risk consumers buying products past their prime, even if they are safe and still (mostly) tasty.

    Although there’s no perfect way to know whether food is safe or not, there are better ways than expiration dates to tell. The adage “When in doubt, throw it out” doesn’t cut it anymore, said Neff; if you’re not sure, just look it up. Good tools are available online: She recommends FoodKeeper, an app developed by the U.S. Department of Agriculture, which lets users look up roughly how long food lasts. The Waste-Free Kitchen Handbook, by the food-waste pioneer Dana Gunders, gives detailed practical advice, such as scraping a half-inch below blue-green mold on hard cheese to safely recover the rest. Leftovers require slightly more caution, noted Broad Leib, because reheating, transferring between containers, and frequent touching with utensils (which, admit it, have been in your mouth) introduce more risk of contamination; her recommendation is to eat them within three to five days, and reheat them well—to a pathogen-killing internal temperature of 165 degrees Fahrenheit. And if doing so proves tedious, consider Roe’s take on the old saying: “When in doubt, cover it with panko, fry it up, and give it to your kids.”
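    The leftover guidance above reduces to two numbers, so it can be sketched in a few lines. This helper is hypothetical, not part of any real library; it simply encodes the three-to-five-day window and the 165-degree reheating target quoted in the paragraph:

```python
# Hedged sketch of the leftover rule as quoted above: eat leftovers
# within three to five days, and reheat them to an internal
# temperature of 165 degrees Fahrenheit, a pathogen-killing "kill step."

MAX_LEFTOVER_DAYS = 5     # outer edge of the three-to-five-day window
REHEAT_TARGET_F = 165     # recommended internal reheating temperature

def leftovers_ok(age_days: int, reheat_temp_f: float) -> bool:
    """True if leftovers fall within the recommended window and were
    reheated to the recommended internal temperature."""
    return age_days <= MAX_LEFTOVER_DAYS and reheat_temp_f >= REHEAT_TARGET_F
```

Three-day-old stuffing brought to a proper simmer passes; week-old stuffing, or leftovers only warmed to lukewarm, does not.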

    Yet for most foods, one tactic reigns supreme: the smell test. Your senses can give you most of the information you need. “If something smells off, you know,” said Broad Leib. Humans evolved disgust because it taught us to avoid the stench of pathogen-tainted food. But because most people are out of practice, they struggle to tell good from bad or don’t trust their senses. To be fair, it can be hard to discern whether weird smells are coming from the milk or the carton. To restore the food knowledge that has been lost since Americans shifted away from agriculture, all of the experts I spoke with supported the revival of home-economics classes—albeit with different branding and less sexism. Teaching students how to handle perishable food means teaching them what perished looks and smells like. Adults can learn this at home, of course, by opening that milk carton and daring to sniff deeply. It may be the first sniff of the rest of your life.

    It’s unlikely that we’ll ever return en masse to the pre-1970s idyll of purchasing food directly from farmers or growing it ourselves. Americans are “several generations removed now from agriculture and food production, so we don’t know our food as well as they once did,” Jackie Suggitt, the director of capital, innovation, and engagement at ReFED, told me. A smell rebellion, if you will, can’t restore our severed relationship with food, but hey, it’s a start. The lonely items lingering in one’s post-Thanksgiving fridge may be one inhale away from renewed relevance. If I deigned to sniff that “expired” heavy cream, I might be delighted to encounter a future garnish for pumpkin pie. And what is wilted spinach anyway but a can of artichokes away from dip?

    Yasmin Tayag
