ReportWire

  • Just How Sweaty Can Humans Get?

    This summer, I, like so many other Americans, have forgotten what it means to be dry. The heat has grown so punishing, and the humidity so intense, that every movement sends my body into revolt. When I stand, I sweat. When I sit, I sweat. When I slice into a particularly dense head of cabbage, I sweat.

    The way things are going, infinite moistness may be something many of us will have to get used to. This past July was the world’s hottest month in recorded history; off the coast of Florida, ocean temperatures hit triple digits, while in Arizona, the asphalt caused third-degree burns. As human-driven climate change continues to remodel the globe, heat waves are hitting harder, lasting longer, and arriving more frequently. This crisis will, on a macroscopic scale, upend where and how humans can survive. It will also, in an everyday sense, make our lives very, very sweaty.

    For most Americans, that’s probably unwelcome news. Our culture doesn’t exactly love sweat. Heavy perspirers are shunned on subways; BO is a hallmark of pubescent shame. History is splattered with examples of people trying to cloak sweat in perfumes, wash it away by bathing, or soak it up with wads of cotton or rubber crammed into their shirts, dresses, and hats. People without medical reason to do so have opted to paralyze their sweat-triggering nerves with Botox. Even Bruce Lee had the sweat glands in his armpits surgically removed, reportedly to avoid on-screen stains, several months before his death, in 1973.

    But our scorn of sweat is entirely undeserved. Perspiration is vital to life. It cools our bodies and hydrates our skin; it manages our microbiome and emits chemical cues. Sweat is also a fundamental part of what makes people people. Without it, we wouldn’t be able to run long distances in high heat; we wouldn’t be able to power our big brains and bodies; we wouldn’t have colonized so much of the Earth. We may even have sweat to thank (or blame) for our skin’s nakedness, says Yana Kamberov, a sweat researcher at the University of Pennsylvania. Her team’s recent data, not yet published, suggest that as human skin evolved to produce more and more sweat glands, fur-making hair follicles disappeared to make room. Sweat is one of the “key milestones” in human evolution, argues Andrew Best, a biological anthropologist at the Massachusetts College of Liberal Arts—on par with big brains, walking upright, and the expression of culture through language and art.

    Humans aren’t the only animals that sweat. Many mammals—among them, dogs, cats, and rats—perspire through the footpads on their paws; chimpanzees, macaques, and other primates are covered in sweat glands. Even horses and camels slick their skin in the heat. But only our bodies are studded with this many millions of teeny, tubular sweat glands—about 10 times the number found on other primates’ skin—that funnel water from our blood to pores that can squeeze out upwards of three, four, even five liters of sweat an hour when we need them to.

    Our dampness isn’t cost free. Sweat is siphoned from the liquid components of blood—lose too much, and the risks of heat stroke and death shoot way up. Our lack of fur also makes us more vulnerable to bites and burns. That humans sweat anyway, then, Best told me, is a testament to perspiration’s cooling punch—it’s so much more efficient than merely panting or hiding from the heat. “If your objective is to be able to sustain a high metabolic rate in warm conditions, sweating is absolutely the best,” he said.

    And yet, in modern times, many of us just can’t seem to accept the realities of sweat. Americans are, for whatever reason, particularly preoccupied with quashing perspiration; in many other countries, “body odor is just normal,” says Angela Lamb, a dermatologist at Mount Sinai’s Icahn School of Medicine. But the bemoaning of BO has cultural roots that long predate the United States. “I’ve read discussions well back into antiquity where there are discussions about people whose armpits stink,” says Cari Casteel, a historian at the University at Buffalo. By the start of the 20th century, Americans had been primed by the recent popularization of germ theory to fear dirtiness—the perfect moment for marketers to “put the fear in women, and then men, that sweat was going to kibosh your plans for romance or a job,” says Sarah Everts, the author of The Joy of Sweat. These days, deodorants command an $8 billion market in the United States.

    Our aversion to sweat doesn’t make much evolutionary sense. Unlike other excretions that elicit near-universal disgust, sweat doesn’t routinely transmit disease or pose other harm. But it does evoke physical labor and emotional stress—neither of which polite society is typically keen to see. And for some, maybe it signifies “losing control of your body in a particular way,” says Tina Lasisi, a biological anthropologist at the University of Michigan. Unlike urine or tears, sweat is the product of a body function that we can’t train ourselves to suppress or delay.

    We also hate sweat because we think it smells bad. But it doesn’t, really. Nearly all of the sweat glands on human bodies are of the so-called eccrine variety, and produce slightly salty water with virtually no scent. A few spots, such as the armpits and groin, are freckled with apocrine glands that produce a waxy, fatty substance laced with pheromones—but even that has no inherent odor. The bacteria on our skin eat it, and their waste generates a stench, leaving sweat as the scapegoat. Our species’ approach to perspiration may even make us “less stinky than we could be,” Best told me. The expansion of eccrine glands across the body might not only have made our skin barer; it’s also thought to have evicted a whole legion of BO-producing apocrine glands.

    As global temperatures climb, for many people—especially in parts of the world that lack access to air-conditioning—sweat will be an inevitability. “I suspect everyone is going to be quite drippy,” Kamberov told me. Exactly how slick each of us will be, though, is anyone’s guess. Experts have evidence that men sweat more than women, and that perspiration potential declines with age. But by and large, they can’t say with certainty why some people are inherently sweatier than others, and how much of it is inborn. Decades ago, a Japanese researcher hypothesized that perspiration potential might be calibrated in the first two or three years of life: Kids born into tropical climates, his analyses suggested, might activate more of their sweat glands than children in temperate regions. But Best’s recent attempts to replicate those findings have so far come up empty.

    Perspiration does seem to be malleable within a lifetime. A couple of weeks into a new, intense exercise regimen, for instance, people will start to sweat more and earlier. Over longer periods of time, the body can also learn to tolerate high temperatures, and sweat less copiously but more efficiently. We sense these changes subtly as the seasons shift, says Laure Rittié, a physiologist at GlaxoSmithKline, who has studied sweat. It’s part of the reason a 75-degree day might feel toastier—and perhaps sweatier—in the spring than in the fall.

    But we can’t simply sweat our way out of our climatic bind. There’s a ceiling to the temperatures we can tolerate; the body can leach only so much liquid out at once. Sweat’s cooling power also tends to falter in humid conditions, when liquid can’t evaporate as easily off of skin. Nor can researchers predict whether future generations might evolve to perspire much more than we do now. We no longer live under the intense conditions that pressured our ancestors to sprout more sweat glands—changes that also took place over many millions of years. It’s even possible that we’re fast approaching the maximal moistness a primate body can produce. “We don’t have a great idea about the outer limits of that plasticity,” Jason Kamilar, a biological anthropologist at the University of Massachusetts at Amherst, told me.

    For now, people who are already on the sweatier side may find themselves better equipped to deal with a warming world, Rittié told me. At long last: Blessed are the moist, for they shall inherit the Earth.

    Katherine J. Wu


  • A Major Breed of Flu Has Gone Missing

    In March 2020, Yamagata’s trail went cold.

    The pathogen, one of the four main groups of flu viruses targeted by seasonal vaccines, had spent the first part of the year flitting across the Northern Hemisphere, as it typically did. As the seasons turned, scientists were preparing, as they typically did, for the virus to make its annual trek across the equator and seed new outbreaks in the globe’s southern half.

    That migration never came to pass. As the new coronavirus spread, pandemic-mitigation measures started to squash flu-transmission rates to record lows. The drop-off was so sharp that several flu lineages may have gone extinct, among them Yamagata, which hasn’t been definitively detected in more than three years despite virologists’ best efforts to root it out.

    Yamagata’s disappearance could still be temporary. “Right now, we’re all just kind of holding our breath,” says Adam Lauring, a virologist at the University of Michigan Medical School. The virus might be biding its time in an isolated population, escaping the notice of tests. But the search has stretched on so fruitlessly that some experts are ready to declare it officially done. “It’s been missing for this long,” says Vijaykrishna Dhanasekaran, a virologist at the University of Hong Kong. “At this point, I would really think it’s gone.”

    If Yamagata remains AWOL indefinitely, its absence would have at least one relatively straightforward consequence: Researchers might no longer need to account for the lineage in annual vaccines. But its vanishing act could have a more head-spinning implication. Flu viruses, which have been plaguing human populations for centuries, are some of the most well-known and well-studied threats to our health. They have prompted the creation of annual shots, potent antivirals, and internationally funded surveillance programs. And yet, scientists still have some basic questions about why they behave as they do—especially about Yamagata and its closest kin.


    Yamagata, in many ways, has long been an underdog among underdogs. The lineage is one of two in a group called influenza B viruses, and it is slower to evolve and transmit than its close cousin Victoria, which makes it sometimes seem the less troublesome of the two. As a pair, the B’s are also commonly regarded as the wimpier versions of flu.

    To be fair, the competition is stiff. Flu B’s are constantly being compared with influenza A viruses—the group that contains every flu subtype that has caused a pandemic in our recent past, including the extraordinarily deadly outbreak of 1918. Seasonal flu epidemics, too, tend to be heavily dominated by flu A’s, especially H3N2 and H1N1, two notably tough-to-target strains that feature prominently in each year’s vaccine. Even H5N1, the flavor of avian influenza that’s been devastating North America’s wildlife, is a member of the pathogen’s A team.

    B viruses, meanwhile, don’t have a particularly daunting résumé. “To our knowledge, there has never been a B pandemic,” says John Paget, an infectious-disease epidemiologist at the Netherlands Institute for Health Services Research. Only once every seven seasons or so does a B virus dominate. And although A and B viruses sometimes tag-team the winter, causing twin outbreaks spaced out by a few weeks, these seasons often open with a major flu A banger and then close out with a more muted B coda.

    The reasons underlying these differences are still pretty murky, though scientists do have some hints. Whereas flu A viruses are known as especially speedy shape-shifters, constantly spawning genetic offshoots that vie to outcompete one another, flu B’s evolve at oddly plodding rates. Their sluggish approach makes it easier for our immune system to recognize the viruses when they reappear, resulting in longer-lasting protection, more effective vaccines, and fewer reinfections than are typical with the A’s. Those molecular differences also seem to drive differences in how and when the viruses spread. The A’s tend to trouble people repeatedly from birth to death, and are great at globe-trotting. But B’s, perhaps because immunity against them is easier to come by, more often concentrate among kids, many of whom have never encountered the viruses before—and who are usually more resilient to respiratory viruses and travel less than adults, keeping outbreaks mostly regional. That might also help explain why B epidemics so frequently lag behind A’s: Slower pathogen evolution facing off with more durable host immunity adds up to less rapid B spread, while their A colleagues rush ahead. Our bodies also seem to mount rather fiery defenses against A viruses, steeling us against other infections in the weeks that follow and deepening the disadvantage against any B’s trailing behind. All of that means flu B has a hard time catching humans off guard.

    The virus’s host preferences, too, make flu A viruses more dangerous. Those lineages are great at hopscotching among a whole menagerie of species—most infamously, pigs and wild, water-loving birds—sometimes undergoing rapid bursts of evolution as they go. But flu B’s seem to almost exclusively infect humans, igniting only the rare and fast-resolving outbreak in a limited number of other species—a few seals here, a handful of pigs there. Spillovers from wild creatures into humans are the roots of global outbreaks. And so, with its zoonotic bent, “influenza A will always be the main focus” of concern, says Carolien van de Sandt, a virologist at the Peter Doherty Institute for Infection and Immunity, in Melbourne. Even among some scientists, Yamagata and Victoria register as little more than literal B-list blips.

    Plenty of other experts, though, think flu B’s relative obscurity is misguided—perhaps even a bit dangerous. Flu B’s account for roughly a quarter of annual flu cases, many of which lead to hospitalization and death; they seem hardier than their A cousins against certain antiviral drugs. And scientists simply know a lot less about flu B’s: how, precisely, they interact with the immune system; what factors influence their sluggish evolutionary rate; the nuances of their person-to-person spread; their oddball animal-host range. And that lack of intel on what has for decades been a formidable infectious foe creates a risk all on its own.


    Flu lineages have dipped into relative obscurity before only to come roaring back. After the end of the H2N2 pandemic of the late 1950s, H1N1 appeared to flame out—only to reemerge nearly two decades later to greet a population full of young people whose immune systems hadn’t glimpsed it before. And as recently as the 1990s, the B lineage Victoria underwent a years-long ebb in most parts of the world, before ricocheting back to prominence in the early 2000s.

    As far as researchers can tell, Victoria is alive and well; during the globe’s most recent winter seasons, the lineage appears to have ignited late-arriving outbreaks in several countries, including in South Africa, Malaysia, and various parts of Europe. But based on the viral sequences that researchers have isolated from people sick with flu, Yamagata is still nowhere to be found, says Saverio Caini, a virologist at the cancer research center ISPRO, in Italy.

    The lineage was already teetering on a precipice before the pandemic began, van de Sandt told me. Yamagata and Victoria, which splintered apart in the early 1980s, are still closely related enough that they often compete for the same hosts. And just prior to 2020, Victoria, the more diverse and fleet-footed of the two B lineages, had been reliably edging out its cousin, pushing Yamagata’s prevalence down, down, down. That trend, coupled with several years of use of a well-matched Yamagata strain in the seasonal flu vaccine, meant that Yamagata “had already decreased in incidence and circulation,” van de Sandt said. With the odds so steeply stacked, the addition of pandemic mitigations may have been the final factor that snuffed the lineage out.

    Recently, a few countries—including China, Pakistan, and Belize—have tentatively reported possible Yamagata infections. But there’s been no conclusive genetic proof, several experts told me. Several parts of the world, including the United States, regularly use flu vaccines containing active flu viruses that can trip the same viral tests that the wild, disease-causing pathogens do. “So the reports could be contaminations,” van de Sandt said. Scientists would need to scour the virus’s genetic sequences to distinguish infection from injection; those data, however, haven’t emerged.

    Should the Yamagata dry spell continue, researchers may want to start considering snipping the lineage out of vaccines altogether, perhaps as early as the middle or end of this year. Doing so would punt the world back to the early 2010s, when flu shots were trivalent—designed to protect people against two A viruses, H3N2 and H1N1, plus either Victoria or Yamagata, depending on which lineage researchers forecast would surge more. (They were often wrong.) Or the space once used for Yamagata could be filled with another flavor of H3N2, the fastest mutator of the bunch.

    But purging Yamagata from the vaccine would be a gamble. If Yamagata is not gone for good, van de Sandt worries that booting it from the vaccine would leave the world vulnerable to a massive and deadly outbreak. Even Dhanasekaran, who is among the researchers who are fairly confident that we’ve seen the last of Yamagata, told me he doesn’t want to rule out the possibility that the virus is cloistering in an immunocompromised person with a chronic infection, and it’s unclear if it could reemerge from such a hiding place. The only thing scientists can do for now is be patient, says Jayna Raghwani, a computational biologist at the University of Oxford. “If we don’t see it in successive seasons for another two to three years, that will be more convincing,” she told me.

    If Yamagata’s death knell has actually rung, though, it will have reverberating effects. There’s no telling, for instance, how other flu lineages might be affected by their colleague’s supposed retirement. Perhaps Victoria, which can swap genetic material with Yamagata, will evolve more slowly without its partner. At the same time, Victoria may have an easier time infecting people now that it no longer needs to compete as often for hosts.

    If Yamagata has gone to pasture, “there won’t be a ceremony declaring the world Yamagata free,” Lauring told me. And it’s easy, he points out, to forget things we don’t see. But even if Yamagata seems gone for now, the effects of its demise will be significant enough that it can’t be forgotten—not just yet.

    Katherine J. Wu
