ReportWire

  • Just How Sweaty Can Humans Get?


    This summer, I, like so many other Americans, have forgotten what it means to be dry. The heat has grown so punishing, and the humidity so intense, that every movement sends my body into revolt. When I stand, I sweat. When I sit, I sweat. When I slice into a particularly dense head of cabbage, I sweat.

    The way things are going, infinite moistness may be something many of us will have to get used to. This past July was the world’s hottest month in recorded history; off the coast of Florida, ocean temperatures hit triple digits, while in Arizona, the asphalt caused third-degree burns. As human-driven climate change continues to remodel the globe, heat waves are hitting harder, longer, and more frequently. The consequences of this crisis will, on a macroscopic scale, upend where and how humans can survive. It will also, in an everyday sense, make our lives very, very sweaty.

    For most Americans, that’s probably unwelcome news. Our culture doesn’t exactly love sweat. Heavy perspirers are shunned on subways; BO is a hallmark of pubescent shame. History is splattered with examples of people trying to cloak sweat in perfumes, wash it away by bathing, or soak it up with wads of cotton or rubber crammed into their shirts, dresses, and hats. People without medical reason to do so have opted to paralyze their sweat-triggering nerves with Botox. Even Bruce Lee had the sweat glands in his armpits surgically removed, reportedly to avoid on-screen stains, several months before his death, in 1973.

    But our scorn of sweat is entirely undeserved. Perspiration is vital to life. It cools our bodies and hydrates our skin; it manages our microbiome and emits chemical cues. Sweat is also a fundamental part of what makes people people. Without it, we wouldn’t be able to run long distances in high heat; we wouldn’t be able to power our big brains and bodies; we wouldn’t have colonized so much of the Earth. We may even have sweat to thank (or blame) for our skin’s nakedness, says Yana Kamberov, a sweat researcher at the University of Pennsylvania. Her team’s recent data, not yet published, suggest that as human skin evolved to produce more and more sweat glands, fur-making hair follicles disappeared to make room. Sweat is one of the “key milestones” in human evolution, argues Andrew Best, a biological anthropologist at the Massachusetts College of Liberal Arts—on par with big brains, walking upright, and the expression of culture through language and art.

    Humans aren’t the only animals that sweat. Many mammals—among them, dogs, cats, and rats—perspire through the footpads on their paws; chimpanzees, macaques, and other primates are covered in sweat glands. Even horses and camels slick their skin in the heat. But only our bodies are studded with so many millions of teeny, tubular sweat glands—about 10 times the number found on other primates’ skin—that funnel water from our blood to pores that can squeeze out upwards of three, four, even five liters of sweat an hour when we need them to.

    Our dampness isn’t cost-free. Sweat is siphoned from the liquid components of blood—lose too much, and the risks of heat stroke and death shoot way up. Our lack of fur also makes us more vulnerable to bites and burns. That humans sweat anyway, then, Best told me, is a testament to perspiration’s cooling punch—it’s so much more efficient than merely panting or hiding from the heat. “If your objective is to be able to sustain a high metabolic rate in warm conditions, sweating is absolutely the best,” he said.

    And yet, in modern times, many of us just can’t seem to accept the realities of sweat. Americans are, for whatever reason, particularly preoccupied with quashing perspiration; in many other countries, “body odor is just normal,” says Angela Lamb, a dermatologist at Mount Sinai’s Icahn School of Medicine. But the bemoaning of BO has cultural roots that long predate the United States. “I’ve read discussions well back into antiquity where there are discussions about people whose armpits stink,” says Cari Casteel, a historian at the University at Buffalo. By the start of the 20th century, Americans had been primed by the recent popularization of germ theory to fear dirtiness—the perfect moment for marketers to “put the fear in women, and then men, that sweat was going to kibosh your plans for romance or a job,” says Sarah Everts, the author of The Joy of Sweat. These days, deodorants command an $8 billion market in the United States.

    Our aversion to sweat doesn’t make much evolutionary sense. Unlike other excretions that elicit near-universal disgust, sweat doesn’t routinely transmit disease or pose other harm. But it does evoke physical labor and emotional stress—neither of which polite society is typically keen to see. And for some, maybe it signifies “losing control of your body in a particular way,” says Tina Lasisi, a biological anthropologist at the University of Michigan. Unlike urine or tears, sweat is the product of a body function that we can’t train ourselves to suppress or delay.

    We also hate sweat because we think it smells bad. But it doesn’t, really. Nearly all of the sweat glands on human bodies are of the so-called eccrine variety and produce slightly salty water with virtually no scent. A few spots, such as the armpits and groin, are freckled with apocrine glands that produce a waxy, fatty substance laced with pheromones—but even that has no inherent odor. The bacteria on our skin eat it, and their waste generates a stench, leaving sweat as the scapegoat. Our species’ approach to perspiration may even make us “less stinky than we could be,” Best told me. The expansion of eccrine glands across the body might not only have made our skin barer; it’s also thought to have evicted a whole legion of BO-producing apocrine glands.

    As global temperatures climb, for many people—especially in parts of the world that lack access to air-conditioning—sweat will be an inevitability. “I suspect everyone is going to be quite drippy,” Kamberov told me. Exactly how slick each of us will be, though, is anyone’s guess. Experts have evidence that men sweat more than women, and that perspiration potential declines with age. But by and large, they can’t say with certainty why some people are inherently sweatier than others, or how much of it is inborn. Decades ago, a Japanese researcher hypothesized that perspiration potential might be calibrated in the first two or three years of life: Kids born into tropical climates, his analyses suggested, might activate more of their sweat glands than children in temperate regions. But Best’s recent attempts to replicate those findings have so far come up empty.

    Perspiration does seem to be malleable within a lifetime. A couple of weeks into a new, intense exercise regimen, for instance, people will start to sweat more and earlier. Over longer periods of time, the body can also learn to tolerate high temperatures, and sweat less copiously but more efficiently. We sense these changes subtly as the seasons shift, says Laure Rittié, a physiologist at GlaxoSmithKline, who has studied sweat. It’s part of the reason a 75-degree day might feel toastier—and perhaps sweatier—in the spring than in the fall.

    But we can’t simply sweat our way out of our climatic bind. There’s a ceiling to the temperatures we can tolerate; the body can leach only so much liquid out at once. Sweat’s cooling power also tends to falter in humid conditions, when liquid can’t evaporate as easily off of skin. Nor can researchers predict whether future generations might evolve to perspire much more than we do now. We no longer live under the intense conditions that pressured our ancestors to sprout more sweat glands—changes that also took place over many millions of years. It’s even possible that we’re fast approaching the maximal moistness a primate body can produce. “We don’t have a great idea about the outer limits of that plasticity,” Jason Kamilar, a biological anthropologist at the University of Massachusetts at Amherst, told me.

    For now, people who are already on the sweatier side may find themselves better equipped to deal with a warming world, Rittié told me. At long last: Blessed are the moist, for they shall inherit the Earth.


    Katherine J. Wu


  • Is COVID Immunity Hung Up on Old Variants?


    In the two-plus years that COVID vaccines have been available in America, the basic recipe has changed just once. The virus, meanwhile, has belched out five variants concerning enough to earn their own Greek-letter names, followed by a menagerie of weirdly monikered Omicron subvariants, each seeming to spread faster than the last. Vaccines, which take months to reformulate, just can’t keep up with a virus that seems to reinvent itself by the week.

    But SARS-CoV-2’s evolutionary sprint might not be the only reason that immunity can get bogged down in the past. The body seems to fixate on the first version of the virus that it encountered, either through injection or infection—a preoccupation with the past that researchers call “original antigenic sin,” and that may leave us with defenses that are poorly tailored to circulating variants. In recent months, some experts have begun to worry that this “sin” might now be undermining updated vaccines. At an extreme, the thinking goes, people may not get much protection from a COVID shot that is a perfect match for the viral variant du jour.

    Recent data hint at this possibility. Past brushes with the virus or the original vaccine seem to mold, or even muffle, people’s reactions to bivalent shots—“I have no doubt about that,” Jenna Guthmiller, an immunologist at the University of Colorado School of Medicine, told me. The immune system just doesn’t make Omicron-focused antibodies in the quantity or quality it probably would have had it seen the updated jabs first. But there’s also an upside to this stubbornness that we could not live without, says Katelyn Gostic, an immunologist and infectious-disease modeler who has studied the phenomenon with flu. Original antigenic sin is the reason repeat infections, on average, get milder over time, and the oomph that enables vaccines to work as well as they do. “It’s a fundamental part,” Gostic told me, “of being able to create immunological memory.”

    This is not just basic biology. The body’s powerful first impressions of this coronavirus can and should influence how, when, and how often we revaccinate against it, and with what. Better understanding of the degree to which these impressions linger could also help scientists figure out why people are (or are not) fighting off the latest variants—and how their defenses will fare against the virus as it continues to change.


    The worst thing about “original antigenic sin” is its name. The blame for that technically lies with Thomas Francis Jr., the immunologist who coined the phrase more than six decades ago after noticing that the initial flu infections people weathered in childhood could bias how they fared against subsequent strains. “Basically, the flu you get first in life is the one you respond to most avidly for the long term,” says Gabriel Victora, an immunologist at Rockefeller University. That can become somewhat of an issue when a very different-looking strain comes knocking.

    In scenarios like these, original antigenic sin may sound like the molecular equivalent of a lovesick teen pining over an ex, or a student who never graduates out of immunological grade school. But from the immune system’s point of view, never forgetting your first is logically sound. New encounters with a pathogen catch the body off guard—and tend to be the most severe. A deep-rooted defensive reaction, then, is practical: It ups the chances that the next time the same invader shows up, it will be swiftly identified and dispatched. “Having good memory and being able to boost it very quickly is sometimes a very good thing,” Victora told me. It’s the body’s way of ensuring that it won’t get fooled twice.

    These old grudges come with clear advantages even when microbes morph into new forms, as flu viruses and coronaviruses often do. Pathogens don’t remake themselves all at once, so immune cells that home in on familiar snippets of a virus can still in many cases snuff out enough invaders to prevent an infection’s worst effects. That’s why even flu shots that aren’t perfectly matched to the season’s most prominent strains are usually still quite good at keeping people out of hospitals and morgues. “There’s a lot of leniency in how much the virus can change before we really lose protection,” Guthmiller told me. The wiggle room should be even bigger, she said, with SARS-CoV-2, whose subvariants tend to be far more similar to one another than, say, different flu strains are.

    With all the positives that immune memory can offer, many immunologists tend to roll their eyes at the negative and bizarrely moralizing implications of the phrase original antigenic sin. “I really, really hate that term,” says Deepta Bhattacharya, an immunologist at the University of Arizona. Instead, Bhattacharya and others prefer to use more neutral words such as imprinting, evocative of a duckling latching onto the first maternal figure it spots. “This is not some strange immunological phenomenon,” says Rafi Ahmed, an immunologist at Emory University. It’s more a textbook example of what an adaptable, high-functioning immune system does, and one that can have positive or negative effects, depending on context. Recent flu outbreaks have showcased a little bit of each: During the 2009 H1N1 pandemic, many elderly people, normally more susceptible to flu viruses, fared better than expected against the late-aughts strain, because they’d banked exposures to a similar-looking H1N1—a derivative of the culprit behind the 1918 pandemic—in their youth. But in some seasons that followed, H1N1 disproportionately sickened middle-aged adults whose early-life flu indoctrinations may have tilted them away from a protective response.

    The backward-gazing immune systems of those adults may have done more than preferentially amplify defensive responses to a less relevant viral strain. They might have also actively suppressed the formation of a response to the new one. Part of that is sheer kinetics: Veteran immune cells, trained up on past variants and strains, tend to be quicker on the draw than fresh recruits, says Scott Hensley, an immunologist at the Perelman School of Medicine at the University of Pennsylvania. And the greater the number of experienced soldiers, the more likely they are to crowd out rookie fighters—depriving them of battlefield experience they might otherwise accrue. Should the newer viral strain eventually return for a repeat infection, those less experienced immune cells may not be adequately prepared—leaving people more vulnerable, perhaps, than they might otherwise have been.

    Some researchers think that form of imprinting might now be playing out with the bivalent COVID vaccines. Several studies have found that the BA.5-focused shots are, at best, moderately more effective at producing an Omicron-targeted antibody response than the original-recipe jab—not the knockout results that some might have hoped for. Recent work in mice from Victora’s lab backs up that idea: B cells, the manufacturers of antibodies, do seem to have trouble moving past the impressions of SARS-CoV-2’s spike protein that they got from first exposure. But the findings don’t really trouble Victora, who gladly received his own bivalent COVID shot. (He’ll take the next update, too, whenever it’s ready.) A blunted response to a new vaccine, he told me, is not a nonexistent one—and the more foreign a second shot recipe is compared with the first, the more novice fighters should be expected to participate in the fight. “You’re still adding new responses,” he said, that will rev back up when they become relevant. The coronavirus is a fast evolver. But the immune system also adapts. Which means that people who receive the bivalent shot can still expect to be better protected against Omicron variants than those who don’t.

    Historical flu data support this idea. Many of the middle-aged adults slammed by recent H1N1 infections may not have mounted perfect attacks on the unfamiliar virus, but as immune cells continued to tussle with the pathogen, the body “pretty quickly filled in the gaps,” Gostic told me. Although it’s tempting to view imprinting as a form of destiny, “that’s just not how the immune system works,” Guthmiller told me. Preferences can be overwritten; biases can be undone.


    Original antigenic sin might not be a crisis, but its existence does suggest ways to optimize our vaccination strategies with past biases in mind. Sometimes, those preferences might need to be avoided; in other instances, they should be actively embraced.

    For that to happen, though, immunologists would need to fill in some holes in their knowledge of imprinting: how often it occurs, the rules by which it operates, what can entrench or alleviate it. Even among flu viruses, where the pattern has been best studied, plenty of murkiness remains. It’s not clear whether imprinting is stronger, for instance, when the first exposure comes via infection or vaccination. Scientists can’t yet say whether children, with their fiery yet impressionable immune systems, might be more or less prone to getting stuck on their very first flu strain. Researchers don’t even know for certain whether repetition of a first exposure—say, through multiple doses of the same vaccine, or reinfections with the same variant—will more deeply embed a particular imprint.

    It does seem intuitive that multiple doses of a vaccine could exacerbate an early bias, Ahmed told me. But if that’s the case, then the same principle might also work the other way: Maybe multiple exposures to a new version of the virus could help break an old habit, and nudge the immune system to move on. Recent evidence has hinted that people previously infected with an early Omicron subvariant responded more enthusiastically to a bivalent BA.1-focused vaccine—available in the United Kingdom—than those who’d never encountered the lineage before. Hensley, at the University of Pennsylvania, is now trying to figure out if the same is true for Americans who got the BA.5-based bivalent shot after getting sick with one of the many Omicron subvariants.

    Ahmed thinks that giving people two updated shots—a safer approach, he points out, than adding an infection to the mix—could untether the body from old imprints too. A few years ago, he and his colleagues showed that a second dose of a particular flu vaccine could help shift the ratio of people’s immune responses. A second dose of the fall’s bivalent vaccine might not be practical or palatable for most people, especially now that BA.5 is on its way out. But if next autumn’s recipe overlaps with BA.5 in ways that it doesn’t with the original variant—as it likely will to at least some degree, given the Omicron lineage’s continuing reign—a later, slightly different shot could still be a boon.

    Keeping vaccine doses relatively spaced out—on an annual basis, say, à la flu shots—will likely help too, Bhattacharya said. His recent studies, not yet published, hint that the body might “forget” old variants, as it were, if it’s simply given more time: As antibodies raised against prior infections and injections fall away, vaccine ingredients could linger in the body rather than be destroyed by prior immunity on sight. That slightly extended stay might offer the junior members of the immune system—lesser in number, and slower on the uptake—more of an opportunity to cook up an Omicron-specific response.

    In an ideal world, researchers might someday know enough about imprinting to account for its finickiness whenever they select and roll out new shots. Flu shots, for instance, could be personalized to account for which strains babies were first exposed to, based on birth year; combinations of COVID vaccine doses and infections could dictate the timing and composition of a next jab. But the world is not yet living that reality, Gostic told me. And after three years of an ever-changing coronavirus and a fluctuating approach to public health, it’s clear that there won’t be a single vaccine recipe that’s ideal for everyone at once.

    Even Thomas Francis Jr. did not consider original antigenic sin to be a total negative, Hensley told me. According to Francis, the true issue with the “sin” was that humans were missing out on the chance to imprint on multiple strains at once in childhood, when the immune system is still a blank slate—something that modern researchers could soon accomplish with the development of universal vaccines. Our reliance on first impressions can be a drawback. But the same phenomenon can be an opportunity to acquaint the body with diversity early on—to give it a richer narrative, and memories of many threats to come.


    Katherine J. Wu
