ReportWire

Tag: high heat

  • Just How Sweaty Can Humans Get?

    This summer, I, like so many other Americans, have forgotten what it means to be dry. The heat has grown so punishing, and the humidity so intense, that every movement sends my body into revolt. When I stand, I sweat. When I sit, I sweat. When I slice into a particularly dense head of cabbage, I sweat.

    The way things are going, infinite moistness may be something many of us will have to get used to. This past July was the world’s hottest month in recorded history; off the coast of Florida, ocean temperatures hit triple digits, while in Arizona, the asphalt caused third-degree burns. As human-driven climate change continues to remodel the globe, heat waves are hitting harder, lasting longer, and arriving more frequently. The consequences of this crisis will, on a macroscopic scale, upend where and how humans can survive. They will also, in an everyday sense, make our lives very, very sweaty.

    For most Americans, that’s probably unwelcome news. Our culture doesn’t exactly love sweat. Heavy perspirers are shunned on subways; BO is a hallmark of pubescent shame. History is splattered with examples of people trying to cloak sweat in perfumes, wash it away by bathing, or soak it up with wads of cotton or rubber crammed into their shirts, dresses, and hats. People without medical reason to do so have opted to paralyze their sweat-triggering nerves with Botox. Even Bruce Lee had the sweat glands in his armpits surgically removed, reportedly to avoid on-screen stains, several months before his death, in 1973.

    But our scorn of sweat is entirely undeserved. Perspiration is vital to life. It cools our bodies and hydrates our skin; it manages our microbiome and emits chemical cues. Sweat is also a fundamental part of what makes people people. Without it, we wouldn’t be able to run long distances in high heat; we wouldn’t be able to power our big brains and bodies; we wouldn’t have colonized so much of the Earth. We may even have sweat to thank (or blame) for our skin’s nakedness, says Yana Kamberov, a sweat researcher at the University of Pennsylvania. Her team’s recent data, not yet published, suggest that as human skin evolved to produce more and more sweat glands, fur-making hair follicles disappeared to make room. Sweat is one of the “key milestones” in human evolution, argues Andrew Best, a biological anthropologist at the Massachusetts College of Liberal Arts—on par with big brains, walking upright, and the expression of culture through language and art.

    Humans aren’t the only animals that sweat. Many mammals—among them, dogs, cats, and rats—perspire through the footpads on their paws; chimpanzees, macaques, and other primates are covered in sweat glands. Even horses and camels slick their skin in the heat. But only our bodies are studded with this many millions of teeny, tubular sweat glands—about 10 times the number found on other primates’ skin—that funnel water from our blood to pores that can squeeze out upwards of three, four, even five liters of sweat an hour when we need them to.

    Our dampness isn’t cost free. Sweat is siphoned from the liquid components of blood—lose too much, and the risks of heat stroke and death shoot way up. Our lack of fur also makes us more vulnerable to bites and burns. That humans sweat anyway, then, Best told me, is a testament to perspiration’s cooling punch—it’s so much more efficient than merely panting or hiding from the heat. “If your objective is to be able to sustain a high metabolic rate in warm conditions, sweating is absolutely the best,” he said.

    And yet, in modern times, many of us just can’t seem to accept the realities of sweat. Americans are, for whatever reason, particularly preoccupied with quashing perspiration; in many other countries, “body odor is just normal,” says Angela Lamb, a dermatologist at Mount Sinai’s Icahn School of Medicine. But the bemoaning of BO has cultural roots that long predate the United States. “I’ve read discussions well back into antiquity about people whose armpits stink,” says Cari Casteel, a historian at the University at Buffalo. By the start of the 20th century, Americans had been primed by the recent popularization of germ theory to fear dirtiness—the perfect moment for marketers to “put the fear in women, and then men, that sweat was going to kibosh your plans for romance or a job,” says Sarah Everts, the author of The Joy of Sweat. These days, deodorants command an $8 billion market in the United States.

    Our aversion to sweat doesn’t make much evolutionary sense. Unlike other excretions that elicit near-universal disgust, sweat doesn’t routinely transmit disease or pose other harm. But it does evoke physical labor and emotional stress—neither of which polite society is typically keen to see. And for some, maybe it signifies “losing control of your body in a particular way,” says Tina Lasisi, a biological anthropologist at the University of Michigan. Unlike urine or tears, sweat is the product of a body function that we can’t train ourselves to suppress or delay.

    We also hate sweat because we think it smells bad. But it doesn’t, really. Nearly all of the sweat glands on human bodies are of the so-called eccrine variety, and produce slightly salty water with virtually no scent. A few spots, such as the armpits and groin, are freckled with apocrine glands that produce a waxy, fatty substance laced with pheromones—but even that has no inherent odor. The bacteria on our skin eat it, and their waste generates a stench, leaving sweat as the scapegoat. Our species’ approach to perspiration may even make us “less stinky than we could be,” Best told me. The expansion of eccrine glands across the body might not have only made our skin barer; it’s also thought to have evicted a whole legion of BO-producing apocrine glands.

    As global temperatures climb, for many people—especially in parts of the world that lack access to air-conditioning—sweat will be an inevitability. “I suspect everyone is going to be quite drippy,” Kamberov told me. Exactly how slick each of us will be, though, is anyone’s guess. Experts have evidence that men sweat more than women, and that perspiration potential declines with age. But by and large, they can’t say with certainty why some people are inherently sweatier than others, and how much of it is inborn. Decades ago, a Japanese researcher hypothesized that perspiration potential might be calibrated in the first two or three years of life: Kids born into tropical climates, his analyses suggested, might activate more of their sweat glands than children in temperate regions. But Best’s recent attempts to replicate those findings have so far come up empty.

    Perspiration does seem to be malleable within a lifetime. A couple of weeks into a new, intense exercise regimen, for instance, people will start to sweat more and earlier. Over longer periods of time, the body can also learn to tolerate high temperatures, and sweat less copiously but more efficiently. We sense these changes subtly as the seasons shift, says Laure Rittié, a physiologist at GlaxoSmithKline, who has studied sweat. It’s part of the reason a 75-degree day might feel toastier—and perhaps sweatier—in the spring than in the fall.

    But we can’t simply sweat our way out of our climatic bind. There’s a ceiling to the temperatures we can tolerate; the body can leach only so much liquid out at once. Sweat’s cooling power also tends to falter in humid conditions, when liquid can’t evaporate as easily off of skin. Nor can researchers predict whether future generations might evolve to perspire much more than we do now. We no longer live under the intense conditions that pressured our ancestors to sprout more sweat glands—changes that also took place over many millions of years. It’s even possible that we’re fast approaching the maximal moistness a primate body can produce. “We don’t have a great idea about the outer limits of that plasticity,” Jason Kamilar, a biological anthropologist at the University of Massachusetts at Amherst, told me.

    For now, people who are already on the sweatier side may find themselves better equipped to deal with a warming world, Rittié told me. At long last: Blessed are the moist, for they shall inherit the Earth.

    Katherine J. Wu

  • So Are Nonstick Pans Safe or What?

    I grew up in a nonstick-pan home. No matter what was on the menu, my dad would reach for the Teflon-coated pan first: nonstick for stir-fried vegetables, for reheating takeout, for the sunny-side-up eggs, garlic fried rice, and crisped Spam slices that constituted breakfast. Nowadays, I’m a much fussier cook: A stainless-steel pan is my kitchen workhorse. Still, when I’m looking to make something delicate, such as a golden pancake or a classic omelet, I can’t help but turn back to that time-tested fave.

    And what a dream it is to use. Nonstick surfaces are so frictionless that fragile crepes and scallops practically lift themselves off the pan; cleaning up sticky foods, such as oozing grilled-cheese sandwiches, becomes no more strenuous than rinsing a plate. No wonder 70 percent of skillets sold in the U.S. are nonstick. Who can afford to mangle a dainty snapper fillet or spend time scrubbing away crisped rice?

    All of this convenience, however, comes with a cost: the unsettling feeling that cooking with a nonstick pan is somehow bad for you. My dad had a rule that we could only use a soft, silicone-edged spatula with the pan, born of his hazy intuition that any scratches on the coating would cause it to leach into our food and make us sick. Many home cooks have lived with these fears since at least the early 2000s, when we first began to hear about problems with Teflon, the substance that makes pans nonstick. Teflon is produced from chemicals belonging to an enormous family known as perfluoroalkyl and polyfluoroalkyl substances, or PFAS, and research has linked exposure to them to many health conditions, including certain cancers, reproductive issues, and high cholesterol. And that is about all we know: In kitchens over the past two decades, the same questions around safety have lingered unanswered amid the aromas of sizzling foods and, perhaps, invisible clouds of Teflon fumes.

    It is objectively ridiculous that the safety of one of the most common household items in America remains such a mystery. But the reality is that it is nearly impossible to measure the risks of PFAS from nonstick cookware—and more important, it’s probably pointless to try. That’s because PFAS have for many decades imparted a valuable stain- and water-resistance to many types of surfaces, including carpets, car seats, and raincoats.

    At this point, the chemicals are also ubiquitous in the environment, particularly in the water supply. Last June, the Environmental Protection Agency established new safety guidelines for the level of certain PFAS in drinking water; a study published around the same time showed that millions of deaths are correlated with PFAS exposure. By the Environmental Working Group’s latest count, PFAS have contaminated more than 2,850 sites in 50 states and two territories—an “alarming” level of pervasiveness, researchers wrote in a National Academies of Sciences, Engineering, and Medicine report last year. But something about nonstick pans has generated the biggest freak-out. This is not surprising, given their exposure to food and open flames. After all, people do not heat up and consume raincoats (as far as I know).

    Since research into their health effects began, certain types of PFAS have been flagged as more dangerous than others. Two of them, PFOA and PFOS, were voluntarily phased out by manufacturers for several reasons, including the fact that they were deemed dangerous to the immune system; now many nonstick pans specify that their coatings are PFOA free. (If you’re confused by all the acronyms, you aren’t the only one.) But other types of PFAS are still used in these coatings, and their risks to humans aren’t clear. Teflon’s maker claims that any flakes of nonstick coating you might ingest are inert, but public studies backing up that claim are difficult to find.

    In the absence of relevant data, everyone seems to have a different take on nonstick pans. The FDA, for example, allows PFAS to be used in nonstick cookware, but the EPA says that exposure to them can lead to adverse health effects, and last year proposed labeling certain members of the group as “hazardous substances.” According to the CDC, the health effects of low exposure to these chemicals are “uncertain.” Food experts are similarly undecided on nonstick pans: A writer for the culinary site Serious Eats said he “wouldn’t assume they’re totally safe,” whereas a Wirecutter review said they “seem to be safe”—if used correctly.

    That’s about the firmest answer you’re going to get regarding the safety of nonstick cookware. “In no study has it been shown that people who use nonstick pans have higher levels” of PFAS, says Jane Hoppin, a North Carolina State University epidemiologist and a member of a National Academies of Sciences, Engineering, and Medicine committee to study PFAS. But she also told me that, with regard to the broader research on PFAS-related health risks, “I haven’t seen anybody say it’s safe to use.”

    Certainly, more research could be done on PFAS, given the lack of relevant studies. There is no research, for example, showing that people who use nonstick pans are more likely to get sick. The one study on exposure from nonstick pans mentioned in the report that Hoppin and others published last year found inconclusive results after measuring gaseous PFAS released from heated nonstick pans, though the researchers tested only a few pans. Another study in which scientists used nonstick pans to cook beef and pork—and an assortment of more glamorous meats including chicken nuggets—and then measured the PFAS levels likewise failed to reach a conclusion, because too few meat samples were used.

    More scientists could probably be convinced to pursue rigorous research in this field if PFAS exposure came only from nonstick pans. Even then, investigating the risks would be tough, perhaps impossible: Designing a rigorous study to test the risks of PFAS exposure would likely involve forcing unwitting test subjects to breathe in PFAS fumes or eat from flaking pans. But given that we are exposed to PFAS in so many other ways—drinking water being chief among them—what would be the point? “They’re in dental floss, and they’re in your Gore-Tex jacket, and they’re in your shoes,” Hoppin said. “The relative contribution of any one of those things is minor.”

    As long as PFAS keep proliferating in the environment, we might never fully know exactly what nonstick pans are doing to us. The best we can do for now is decide what level of risk we’re willing to accept in exchange for a slippery pan, based on the information available. And that information is frustratingly vague: Most nonstick products come with a disclosure of the types of PFAS they contain and the types they do not. Sometimes they also include instructions to avoid high heat, especially above 500 degrees Fahrenheit. Hoppin recommends throwing nonstick pans away once they start flaking; in general, it seems worth it to use the pans only when essential. There is likewise a dearth of guidance on breathing in the fumes from an overheated pan, though breathing in PFAS fumes in industrial settings has been known to cause flulike symptoms. If you’re concerned, Hoppin said, you could use any of the growing number of nonstick alternatives, including ceramic and carbon-steel cookware. (Her preference is well-seasoned cast iron.)

    Still, perhaps it’s time to accept that exposure to PFAS is inevitable, much like exposure to microplastics and other carcinogens. At this point, so many harmful substances are all around us that there doesn’t seem to be any point in trying to limit them in individual products, though such efforts are underway for raincoats and period underwear. “What we really need to do is remove these chemicals from production,” Hoppin said. The hope is that doing so would broadly reduce our exposure to PFAS, and there’s evidence that it would work: After PFOS was phased out in the early 2000s, its levels in human blood declined significantly. But until PFAS are more tightly regulated, we’ll continue our endless slide through nonstick limbo, with our grasp of the cookware’s safety remaining slippery at best.

    I’ve tried to cut down on my nonstick-pan use for sheer peace of mind. Many professional chefs reject nonstick pans as unnecessary if you know the proper technique; French chefs, after all, were flipping omelets long before the first Teflon pan was invented—by a French engineer—in 1954. Fancying myself a purist, I recently attempted to cook an omelet using All-Clad stainless steel, following a set of demanding instructions involving ungodly amounts of butter and a moderate amount of heat. Unlike my resolve to avoid nonstick pans, the eggs stuck.

    Yasmin Tayag