ReportWire

Tag: immune cells

  • Human stem cells age more rapidly in space, study finds

    While scientists are still working to understand the effects an extended trip to space can have on the human body, research in recent years has suggested that astronauts may experience some pretty dramatic changes at both the physiological and psychological levels. In the latest study, led by a team at the University of California San Diego, researchers found signs of accelerated aging in human stem cells that spent roughly a month in space. 

    The research focused on hematopoietic stem and progenitor cells (HSPCs), which are crucial in the formation of blood and immune cells. Stem cells were sent to the International Space Station for stays of 32 to 45 days in specially developed nanobioreactors that allowed the team to monitor them; another set remained on Earth at the Kennedy Space Center as a control. The cells that went to the ISS showed a host of changes, including reduced self-renewal abilities, greater susceptibility to DNA damage, and inflammation in the mitochondria. The damage didn’t appear to be permanent, however: the team notes that the changes were at least partially reversed when the cells were removed from the space environment. 

    “Space is the ultimate stress test for the human body,” Catriona Jamieson, director of the UC San Diego Sanford Stem Cell Institute, said in a statement. “These findings are critically important because they show that the stressors of space — like microgravity and cosmic galactic radiation — can accelerate the molecular aging of blood stem cells. Understanding these changes not only informs how we protect astronauts during long-duration missions but also helps us model human aging and diseases like cancer here on Earth.” 

  • What If There’s a Secret Benefit to Getting Asian Glow?

    At every party, no matter the occasion, my drink of choice is soda water with lime. I have never, not once, been drunk—or even finished a full serving of alcohol. The single time I came close to doing so (thanks to half a serving of mulled wine), my heart rate soared, the room spun, and my face turned stop-sign red … all before I collapsed in front of a college professor at an academic event.

    The blame for my alcohol aversion falls fully on my genetics: Like an estimated 500 million other people, most of them of East Asian descent, I carry a genetic mutation called ALDH2*2 that causes me to produce broken versions of an enzyme called aldehyde dehydrogenase 2, preventing my body from properly breaking down the toxic components of alcohol. And so, whenever I drink, all sorts of poisons known as aldehydes build up in my body—a predicament that my face announces to everyone around me.

    By one line of evolutionary logic, I and the other sufferers of so-called alcohol flush (also known as Asian glow) shouldn’t exist. Alcohol isn’t the only source of aldehydes in the body. Our own cells also naturally produce the compounds, and they can wreak all sorts of havoc on our DNA and proteins if they aren’t promptly cleared. So even at baseline, flushers are toting around extra toxins, leaving them at higher risk for a host of health issues, including esophageal cancer and heart disease. And yet, somehow, our cohort of people, with its intense genetic baggage, has grown to half a billion people in potentially as little as 2,000 years.

    The reason might hew to a different line of evolutionary logic—one driven not by the dangers of aldehydes to us but by the dangers of aldehydes to some of our smallest enemies, according to Heran Darwin, a microbiologist at New York University. As Darwin and her colleagues reported at a conference last week, people with the ALDH2*2 mutation might be especially good at fighting off certain pathogens—among them the bug that causes tuberculosis, or TB, one of the greatest infectious killers in recent history.

    The research, currently under review for publication at the journal Science, hasn’t yet been fully vetted by other scientists. And truly nailing TB, or any other pathogen, as the evolutionary catalyst for the rise of ALDH2*2 will likely be tough. But if infectious disease can even partly explain the staggering size of the flushing cohort—as several experts told me is likely the case—the mystery of one of the most common mutations in the human population will be one step closer to being solved.

    Scientists have long been aware of aldehydes’ nasty effects on DNA and proteins; the compounds are carcinogens that literally “damage the fabric of life,” says Ketan J. Patel, a molecular biologist at the University of Oxford who studies the ALDH2*2 mutation and is reviewing the new research for publication in Science. For years, though, many researchers dismissed the chemicals as the annoying refuse of the body’s daily chores. Our bodies produce them as part of run-of-the-mill metabolism; the compounds also build up during infection or inflammation, as byproducts of some of the noxious chemicals we churn out. But then aldehydes are generally swept away by our molecular cleanup systems like so much microscopic trash.

    Darwin and her colleagues are now convinced that the chemicals deserve more credit. Dosed into laboratory cultures, aldehydes can kill TB within days. In previous research, Darwin’s team also found that aldehydes—including ones produced by the bacteria themselves—can make TB ultra-sensitive to nitric oxide, a defensive compound that humans produce during infections, as well as to copper, a metal that destroys many microbes on contact. (For what it’s worth, the aldehydes found in our bodies after we consume alcohol don’t seem to much bother TB, Darwin told me. Drinking has actually been linked to worse outcomes with the disease.)

    The team is still tabulating the many ways in which aldehydes are exerting their antimicrobial effects. But Darwin suspects that the bugs that are vulnerable to the chemicals are dying “a death by a thousand cuts,” she told me at the conference. Which makes aldehydes more than worthless waste. Maybe our ancestors’ bodies wised up to the molecules’ universally destructive powers—and began to purposefully deploy them in their defensive arsenal. “It’s the immune system capitalizing on the toxicity,” says Joshua Woodward, a microbiologist at the University of Washington who has been studying the antibacterial effects of aldehydes.

    Specific cells show hints that they’ve caught on to aldehydes’ potency. Sarah Stanley, a microbiologist and an immunologist at UC Berkeley, who has been co-leading the research with Darwin, has found that when immune cells receive certain chemical signals signifying infection, they’ll ramp up some of the metabolic pathways that produce aldehydes. Those same signals, the researchers recently found, can also prompt immune cells to tamp down their levels of aldehyde dehydrogenase 2—the very aldehyde-detoxifying enzyme that the mutant gene in people like me fails to make.

    If holstering that enzyme is a way for cells to up their supply of toxins and brace for inevitable attack, that could be good news for ALDH2*2 carriers, who already struggle to make enough of it. When, in an extreme imitation of human flushers, the researchers purged the ALDH2 gene from a strain of mice, then infected them with TB, they found that the rodents accumulated fewer bacteria in their lungs.

    The buildup of aldehydes in the mutant mice wasn’t enough to, say, render them totally immune to TB. But even a small defensive bump can make for a massive advantage when combating such a deadly disease, Russell Vance, an immunologist at UC Berkeley who’s been collaborating with Darwin and Stanley on the project, told me. Darwin is now curious as to whether TB’s distaste for aldehyde could be leveraged during infections, she told me—by, for instance, supplementing antibiotic regimens with a side of Antabuse, a medication that blocks aldehyde dehydrogenase, mimicking the effects of ALDH2*2.

    Tying those results to the existence of ALDH2*2 in half a billion people is a larger leap, several experts told me. There are clues of a relationship: Darwin and Stanley’s team found, for instance, that in a cohort from Vietnam and Singapore, people carrying the mutation were less likely to have active cases of TB—echoing patterns documented by at least one other study from Korea. But Daniela Brites, an evolutionary geneticist at the Swiss Tropical and Public Health Institute, told me that the connection still feels a little shaky. Other studies that have searched for genetic predispositions to TB, or resistance to it, she pointed out, haven’t hit on ALDH2*2—a sign that any link might be weak.

    The team’s general idea could still pan out. “They are definitely on the right track,” Patel told me. Throughout most of human history, infectious diseases have been among the most dramatic influences over who lives and who dies—a pressure so immense that it’s left obvious scars on the human genome. A mutation that can cause sickle cell anemia has become very common in parts of the African continent because it helps guard people against malaria.

    The story with ALDH2*2 is probably similar, Patel said. He’s confident that some infectious agent—perhaps several of them—has played a major role in keeping the mutation around. TB, with its devastating track record, could be among the candidates, but it wouldn’t have to be. A few years ago, work from Woodward’s lab showed that aldehydes can also do a number on the bacterial pathogens Staphylococcus aureus and Francisella novicida. (Darwin and Stanley’s team have now shown that mice lacking ALDH2 also fare better against the closely related Francisella tularensis.) Che-Hong Chen, a geneticist at Stanford who’s been studying ALDH2*2 for years, suspects that the culprit might not be a bacterium at all. He favors the idea that it’s, once again, malaria, acting on a different part of our genome, in a different region of the world.

    Other tiny perks of ALDH2*2 may have helped the mutation proliferate. As Chen points out, it’s a pretty big disincentive to drink—and people who abstain (which, of course, isn’t all of us) do spare themselves a lot of potential liver problems. Which is another way in which the consequences of my genetic anomaly might not be so bad, even if at first flush it seems more trouble than it’s worth.

    Katherine J. Wu

  • Doctors Suddenly Got Way Better at Treating Eczema

    Up until a few years ago, Heather Sullivan’s 14-year-old son, Sawyer, had struggled with eczema his entire life. When he was just a baby, most of his body would be covered in intensely itchy rashes that bled and oozed when he couldn’t help but scratch. His family tried steroid creams, wet wraps, bleach baths, and all of the lotions. They tore up their carpet and replaced their sheetrock in hopes of eliminating triggers. At 15 months, he went on cyclosporine, a powerful immunosuppressant usually given to organ-transplant patients. It cleared him up, but the drug comes with potentially dangerous side effects over time. Doctors, Sullivan recalls, were “just appalled that my child would be on this amount of medicine at this age”—but his eczema came roaring back as soon as he went off it.

    When a new eczema drug called Dupixent finally became available to Sawyer a few years ago, his turnaround was fast and dramatic. Within a week, his itchiness and redness started calming down. He felt and looked better. The condition that had dominated their lives began to fade into the background.

    Doctors who treat severe eczema now speak of pre- and post-Dupixent eras: “It changed the landscape of having eczema forever,” says Brett King, a dermatologist at Yale. Today, a half dozen novel treatments are available for the skin condition, all of which work by quieting the same biological pathway in eczema; dozens more are in clinical trials. Unlike older drugs, these new ones are precisely targeted and in many cases startlingly effective.

    Eczema, also known as atopic dermatitis, is characterized by red, itchy, and inflamed skin. It’s a very common condition, estimated to affect 10 percent of Americans. Of those, a large minority suffer from moderate to severe eczema that seeps into everyday life. “Just imagine scratching endlessly,” King says. “You wake up from sleep scratching. Your sheets are bloody in the morning.” The most basic eczema advice is to moisturize, and moisturize often, to protect the barrier of the skin. But scientists now know that eczema’s cause is not in the skin alone. Many patients also have “an over-reactive or overzealous immune system,” says Dawn Davis, a dermatologist at the Mayo Clinic. Their immune cells release chemicals that irritate nerves, causing itch, and even degrade the skin itself.

    Topical steroids, such as over-the-counter hydrocortisone cream, can tamp down the immune reaction that flares in eczema. When these fail, doctors have long resorted to more powerful oral steroids, such as prednisone, or other oral immunosuppressants, such as the aforementioned cyclosporine. The drugs can calm eczema, but because they suppress the overall immune system, they also do much more. Prednisone, for example, makes you more prone to infections as well as bone fractures, high blood pressure, and glaucoma when taken in the long term. Of course, for many people, eczema is a chronic condition that requires long-term treatment. “Prednisone is kind of like carpet bombing,” says Peter Lio, a dermatologist at Northwestern University. It blasts eczema away, but at a cost.

    In contrast, the newer drugs, Lio says, are more like shotguns that target specific parts of the immune system—with less collateral damage. They fall into two broad classes. Monoclonal antibodies, such as Dupixent, intercept the immune-signaling molecules that trigger itch and skin inflammation. And then JAK inhibitors, which include pills such as Rinvoq and the topical cream Opzelura, scramble the signal after cells have received it. The development of these drugs came after years of research zeroed in on some of the key immune molecules dysregulated in eczema. But serendipity played a role too: The first such drugs were originally developed for other conditions, such as rheumatoid arthritis—only to be repurposed when researchers realized that they targeted the very pathways involved in eczema. The breakthroughs in eczema treatment, in fact, are part of a broader revolution in treating inflammatory disorders; both classes of new drugs are now used to tune the immune system in a whole host of different conditions.

    The monoclonal antibodies and oral JAK inhibitors may have their own serious side effects, such as blood clots, which, Lio says, give some doctors unfamiliar with the new drugs—especially the latter type—pause. But the traditional drugs are not great either. “I’m frustrated that a lot of clinicians are very cavalier about prednisone and cyclosporine … They’re like, ‘Oh, they’re our old friends,’” he told me. “Then they get nervous about JAK inhibitors.” In his mind, the new drugs are simply the better option in terms of safety and efficacy.

    Jonathan Silverberg, a dermatologist at George Washington University who specializes in eczema, says he now rarely uses the old oral steroids and immunosuppressants. When he does revert to them, it’s not for medical reasons: He ends up prescribing older (that is, generic and therefore cheaper) drugs for uninsured patients who can’t afford the new ones, or for patients who have insurance but are nevertheless denied coverage. “Insurance says, ‘Can it be fixed with a $10 medicine? Or does it really need the $1,000 tube?’” King told me. Getting patients these newer drugs can mean a lot of time fighting with insurance.

    For now, these drugs have most dramatically improved the lives of patients with moderate to severe eczema—at least those patients who can access them. But doctors told me that topical JAK inhibitors, which are safer than the oral version, could one day be first-line treatments for mild eczema as well. “In a perfect world, I would love it if I never had to prescribe a topical steroid again,” Silverberg said, citing the side effects that come with long-term use. Topical steroids can thin the skin, causing stretch marks, fragility, and poor healing. But at the moment, steroids are also cheap and easily available. They’re not going anywhere as long as the new treatments still come with hefty price tags.

    Sarah Zhang

  • Ticks’ Secret Weapon

    In the three-plus decades I’ve been alive, I have never been bitten by a tick. Actually, that may be a lie, and I have no way of knowing for sure. Because even though ticks have harpoonlike mouthparts, even though certain species can latch on for up to two weeks, even though some guzzle enough blood to swell 100 times in weight, their bites are disturbingly discreet. “As a kid, I would have hundreds of ticks on me,” at least several of which would bite, says Adela Oliva Chavez, a tick researcher at Texas A&M University. And yet she would never notice until her aunt would pick them off her skin.

    The secret behind tick stealth is tick saliva—a strange, slippery, and multifaceted fluid with no biological peer. It keeps the pests’ bites bizarrely itch- and pain-free, and allows them to feed unimpeded by their hosts’ immunity. As climate change remodels the world, spit is also what’s helping ticks enter new habitats and hosts—bringing with them the many deadly viruses, bacteria, and parasites they so often import.

    For all their dependency on blood, ticks almost never eat. In their sometimes-multiyear life span, they may feed only once in each stage: larva, nymph, and adult. Which means, as my colleague Sarah Zhang once wrote, each meal must count for an awful lot. Unlike mosquitoes and other bloodsucking bugs that can get away with a dine and dash, ticks must linger on flesh for days or even weeks—an extended feast that requires them to essentially graft onto the host’s body like a temporary limb.

    For the entirety of that process, saliva is key. When a tick first bites, its spit lines the wound with a gluelike substance that cements its mouth in place. Once secure, the tick deploys a fleet of spit-borne compounds that dilate its host’s vessels, while simultaneously battling the bodily compounds that would normally prompt the injury to clot, heal, or tingle with pain or itch. Under most circumstances, such an onslaught of foreign molecules would instantly marshal the body’s immune cavalry. But ticks have workarounds for that too. Their saliva is an anti-inflammatory and an analgesic; it can disable the alarms that cells send to one another, preventing them from coordinating an attack. Spit can even reprogram immune cells so that they never complete their development or receive the cues they need to gather at the scene.

    All of these strategies can also ease the way for bacteria, viruses, and parasites that the tick swallows from one host, then deposits into the next. With tick saliva breaching the skin barrier and keeping the immune system in check, all the pathogens have to do is come along for the ride. “Tick saliva is like a luxury vehicle that delivers them to the site of infection and rolls out the red carpet,” says Seemay Chou, the CEO of the biotech start-up Arcadia Science. Studies have shown that multiple pathogens get an infectious boost when chauffeured by spit, spilling more efficiently into the skin of newly bitten hosts. Borrelia burgdorferi, the bacterium that causes Lyme disease, will even slather parts of tick saliva onto itself like a cloak, essentially rendering itself invisible to bodily defense. Ticks’ infectious cargo may even egg each other on: Saravanan Thangamani, at Upstate Medical University, in New York, has found evidence that ticks simultaneously carrying Borrelia and Powassan virus may end up injecting more of the latter into fresh wounds.

    Already, ticks spread more pathogens to humans and their livestock than any other insect or arachnid. And the risks ticks pose may only be growing, as warming temperatures and human meddling with wildlife allow them to expand their geographic range and infiltrate new hosts. In North America, lone-star ticks and black-legged ticks have been orchestrating a concerted march north into Canada. At the same time, the percentage of ticks carrying infections is also increasing, Thangamani told me, and for decades now, case counts of Lyme disease and tick-borne encephalitis in several parts of the world have been on a steady rise. As cold seasons shrink, the periods of the year when ticks bite—usually, the warmest months—are expanding too. “Many, many places are getting filled up with ticks,” says Jean Tsao, an entomologist at Michigan State University. “And they’re going to get more.”

    It helps that many ticks aren’t picky about what they carry or whom they bite. Some species, as they push into new places, have picked up new pathogens in the past few years—Bourbon virus, heartland virus—that pose additional threats to us. Many tick species are also relatively indiscriminate about their hosts: Within its lifetime, a single deer tick may “feed very happily on reptiles, avians, and mammals,” says Pat Nuttall, a virologist and tick researcher at the University of Oxford. Their spit is intricate enough that it can be tailored to counteract the defenses of each species in turn. Transfer a tick from a rabbit to a human or a dog, Oliva Chavez told me, and it will take notice—and adjust its saliva, quite literally to taste.

    Vaccines to combat Lyme and other tick-borne diseases have long been in development. But many researchers think the more efficient tactic is going after the tick itself—a strategy that could, at best, “stop the transmission of several pathogens at once,” says Girish Neelakanta, a tick biologist at the University of Tennessee at Knoxville. Anti-tick immunity is possible: Studies have documented guinea pigs, cattle, rabbits, goats, and dogs developing sustained defenses against the arachnids after they’ve been bitten over and over again—even reactions that can help the animals detect a bite immediately, and sweep the pest away.

    But spit is a slippery target for bodily defenses to hit. The substance doesn’t just shut down immune responses. It also reformulates itself constantly so that it can keep evading the host’s defenses—as often as every few hours, faster than most of the immune system can keep track. By the time the body has prepped an assault on one salivary ingredient, the tick has almost certainly swapped it out for the next. “It’s a game that the tick is playing, a catch-me-if-you-can kind of thing,” says Sukanya Narasimhan, a tick researcher at Yale. To outcompete the tick’s tricks, Narasimhan thinks it will be key to develop a vaccine that triggers the body to respond to tick bites fast, “as soon as a tick attaches,” she said, ideally by targeting the saliva’s first ingredients.

    As ticks continue their takeover, it’s hard not to develop at least some grudging respect for their pluck. Some scientists even think that studying, or perhaps mimicking, their saliva could lead to other breakthroughs. Copycatting the spit’s immunosuppressive tendencies could be useful for the treatment of asthma, or for drugs that assist in organ transplants; imitating its anticoagulant properties could help keep life-threatening clots at bay. Some tick-saliva ingredients have even prompted investigations into their potential as cancer therapy. Ticks, after all, have been studying mammalian bodies for millions of years, all in hopes of subterfuge; under their tutelage, Chou, the Arcadia Science CEO, hopes to learn more about the molecular pathways that drive the urge to itch.

    Ticks aren’t invincible, though, and some of the same global changes now easing their entry into new habitats could eventually hinder their progress. Already, they are fleeing parts of the planet that have grown too hot, too humid, too flooded, too razed with wildfires for them or their preferred hosts to survive, including certain inhospitable pockets of the American South. A tick decline could be good for us. But it would also be a symptom of a planetary scourge that has grown worse. Ticks, undoubtedly, “will continue to adapt,” Thangamani told me. And yet they, too, have their limits—further, but not that much further, beyond our own.

    Katherine J. Wu

  • A Gaping Hole in Cancer-Therapy Trials

    This article was originally published by Undark Magazine.

    In October 2021, 84-year-old Jim Yeldell was diagnosed with Stage 3 lung cancer. The first drug he tried disrupted his balance and coordination, so his doctor halved the dose to minimize these side effects, Yeldell recalls. In addition, his physician recommended a course of treatment that included chemotherapy, radiation, and a drug targeting a specific genetic mutation. This combination can be extremely effective—at least in younger people—but it can also be “incredibly toxic” in older, frail people, says Elizabeth Kvale, a palliative-care specialist at Baylor College of Medicine, and also Yeldell’s daughter-in-law.

    Older patients are often underrepresented in clinical trials of new cancer treatments, including the one offered to Yeldell. As a result, he only learned of the potential for toxicity because his daughter-in-law had witnessed the treatment’s severe side effects in the older adults at her clinic.

    This dearth of age-specific data has profound implications for clinical care, because older adults are more likely than younger people to be diagnosed with cancer. In the U.S., approximately 42 percent of people with cancer are over the age of 70—a number that could grow in the years to come—yet they comprise less than a quarter of the people in clinical trials to test new cancer treatments. Many of those who do participate are the healthiest of the aged, who may not have common age-related conditions like diabetes or poor kidney or heart function, says Mina Sedrak, a medical oncologist and the deputy director of the Center for Cancer and Aging at City of Hope National Medical Center.

    For decades, clinical trials have tended to exclude older participants for reasons that include concerns about preexisting conditions, interactions with other medications, and participants’ ability to travel to trial locations. As a result, clinicians cannot be certain that approved cancer drugs will work for the people most likely to have cancer as well as they did in clinical trials. This dearth of data means that older cancer patients must decide whether to pursue a treatment that might yield fewer benefits—and cause more side effects—than it did for younger people in the clinical trial.

    This evidence gap extends across the spectrum of cancer treatments—from chemotherapy and radiation to immune-checkpoint inhibitors—with sometimes-dire results. Many forms of chemotherapy, for example, have proved to be more toxic in older adults, a discovery that came only after the drugs were approved for use in this population. “This is a huge problem,” Sedrak says. In an effort to minimize side effects, doctors will often tweak the dose or duration of medications given to older adults, but they are doing so without any real guidance.

    Despite recommendations from funders and regulators, as well as extensive media coverage, not much has changed in the past three decades. “We’re in this space where everyone agrees this is a problem, but there’s very little guidance on how to do better for older adults,” Kvale says. “The consequences in the real world are stark.”


    Post-approval studies of cancer drugs have helped shed light on the disconnect between how these drugs are used in clinical trials and how they are used in clinics around the country.

    For example, when Cary Gross, a physician and cancer researcher at Yale, set out to study the use of a new kind of cancer drug known as an immune-checkpoint inhibitor, he knew that most clinicians were well aware that clinical trials overlooked older patients. Gross’s research team suspected that some doctors might be wary of offering older adults the treatments, which work by preventing immune cells from switching off, thus allowing them to kill cancer cells. “Maybe they’re going to be more careful,” he says, and offer the intervention to younger patients first.

    But in a 2018 analysis of more than 3,000 patients, Gross and his colleagues found that within four months of approval by the FDA, most patients eligible to receive a class of immune-checkpoint inhibitors were being prescribed the drugs. And the patients receiving this treatment in clinics were significantly older than those in the clinical trials. “Oncologists were very ready to give these drugs to the older patients, even though they’re not as well represented,” Gross says.

    In another analysis, published this year, Gross and his colleagues examined how these drugs helped people diagnosed with certain types of lung cancer. The team found that the drugs extended the life of patients under the age of 55 by a median of four and a half months, but only by a month in those over the age of 75.

    The evidence doesn’t suggest that checkpoint inhibitors aren’t helpful for many patients, Gross says. But it’s important to identify which particular populations are helped the most by these drugs. “I thought that we would see a greater survival benefit than we did,” he says. “It really calls into question how we’re doing research, and we really have to double down on doing more research that includes older patients.”

    People over the age of 65 often don’t fare well with other types of cancer treatments either. About half of older patients with advanced cancer experience severe and even potentially life-threatening side effects with chemotherapy, which can lead oncologists to lower medication doses, as in Yeldell’s case.

    There’s a strong connection between the lack of evidence from clinical trials and worse outcomes in the clinic, according to Kvale. “There’s a lot of enthusiasm for these medicines that don’t seem so toxic up front,” she says, “but understanding where they do or don’t work well is key—not just because of the efficacy, but because those drugs are almost toxically expensive sometimes.”

    Since the earliest reports of this data gap, regulators and researchers have tried to fix the problem. Changes to clinical-trial design have, in principle, made it easier for older adults to sign up: fewer and fewer studies, for instance, have an upper age limit for participants. Last year, the FDA issued guidance recommending that industry-funded trials include older adults and relax other eligibility criteria to allow for participants with natural age-related declines. Still, the problem persists.

    When Sedrak and his colleagues set out to understand why the needle had moved so little over the past few decades, their analysis found a number of explanations, beginning with eligibility criteria that may inadvertently disqualify older adults. Physicians may also be concerned about their older patients’ ability to tolerate unknown side effects of new drugs. Patients and caregivers share these concerns. The logistics of participation can also prove problematic.

    “But of all these, the main driving force, the upstream force, is that trials are not designed with older adults in mind,” Sedrak says. Clinical trials tend to focus on survival, and although older adults do care about this, many of them have other motivations—and concerns—when considering treatment.


    Clinical trials are generally geared toward measuring improvements in health: They may track the size of tumors or months of life gained. These issues aren’t always top of mind for older adults, according to Sedrak. He says he’s more likely to hear questions about how side effects may influence the patient’s cognitive function, ability to live independently, and more. “We don’t design trials that capture the end points that older adults want to know,” he says.

    As a group, older adults do experience more side effects, sometimes so severe that the cure rivals the disease. In the absence of evidence from clinical trials, clinicians and patients have tried to find other ways to predict how a patient’s age might influence their response to treatment. In Yeldell’s case, discussions with Kvale and his care team led him to choose a less intensive course of treatment that has kept his cancer stable since October 2022. He continues to live in his own home and exercises with a trainer three times a week.

    For others trying to weigh their choices, researchers are developing tools that can create a more complete picture by accounting for a person’s physiological age. In a 2021 clinical trial, Supriya Mohile, a geriatric oncologist at the University of Rochester, and her colleagues tested whether one such tool, known as a geriatric assessment, could reduce the side effects and toxicity of cancer treatments. The tool estimates a person’s biological age based on various physiological tests.

    The team recruited more than 700 people with an average age of 77 who were about to embark on a new cancer-treatment regimen with a high risk of toxicity. Roughly half of the participants received guided treatment-management recommendations based on a geriatric assessment, which their oncologists factored into their treatment decisions. Only half of this group of patients experienced serious side effects from chemotherapy, compared with 71 percent of those who didn’t receive specialized treatment recommendations.

    This type of assessment can help avoid both undertreatment of people who might benefit from chemotherapy and overtreatment of those at risk of serious side effects, Mohile says. It doesn’t compensate for the lack of data on older adults. But in the absence of that evidence, tools such as geriatric assessment can help clinicians, patients, and families make better-informed choices. “We’re kind of going backwards around the problem,” Mohile says. Although geriatric oncologists recognize the need for better ways to make decisions, she says, “I think the geriatric assessment needs to be implemented until we have better clinical-trial data.”

    Since 2018, the American Society of Clinical Oncology has recommended the use of geriatric assessment to guide cancer care for older patients. But clinicians have been slow to follow through in their practice, in part because the assessment doesn’t necessarily show any cancer-specific benefits, such as tumors shrinking and people living longer. Instead, the tool’s main purpose is to improve quality of life. “We need more prospective therapeutic trials in older adults, but we also need all of these other mechanisms to be funded,” Mohile says, “so we actually know what to do for older adults who are in the real world.”


    Jyoti Madhusoodanan


  • The COVID Question That Will Take Decades to Answer



    To be a newborn in the year 2023—and, almost certainly, every year that follows—means emerging into a world where the coronavirus is ubiquitous. Babies might not meet the virus in the first week or month of life, but soon enough, SARS-CoV-2 will find them. “For anyone born into this world, it’s not going to take a lot of time for them to become infected,” maybe a year, maybe two, says Katia Koelle, a virologist and infectious-disease modeler at Emory University. Beyond a shadow of a doubt, this virus will be one of the very first serious pathogens that today’s infants—and all future infants—meet.

    Three years into the coronavirus pandemic, these babies are on the leading edge of a generational turnover that will define the rest of our relationship with SARS-CoV-2. They and their slightly older peers are slated to be the first humans who may still be alive when COVID-19 truly hits a new turning point: when almost everyone on Earth has acquired a degree of immunity to the virus as a very young child.

    That future crossroads might not sound all that different from where the world is currently. With vaccines now common in most countries and the virus so transmissible, a significant majority of people have some degree of immunity. And in recent months, the world has begun to witness the consequences of that shift. The flux of COVID cases and hospitalizations in most countries seems to be stabilizing into a seasonal-ish sine wave; disease has gotten, on average, less severe, and long COVID seems to be somewhat less likely among those who have recently gotten shots. Even the virus’s evolution seems to be plodding, making minor tweaks to its genetic code, rather than major changes that require another Greek-letter name.

    But today’s status quo may be more of a layover than a final destination in our journey toward COVID’s final form. Against SARS-CoV-2, most little kids have fared reasonably well. And as more babies have been born into a SARS-CoV-2-ridden world, the average age of first exposure to this coronavirus has been steadily dropping—a trend that could continue to massage COVID-19 into a milder disease. Eventually, the expectation is that the illness will reach a stable nadir, at which point it may truly be “another common cold,” says Rustom Antia, an infectious-disease modeler at Emory.

    The full outcome of this living experiment, though, won’t be clear for decades—well after the billions of people who encountered the coronavirus for the first time in adulthood are long gone. The experiences that today’s youngest children have with the virus are only just beginning to shape what it will mean to have COVID throughout a lifetime, when we all coexist with it from birth to death as a matter of course.


    At the beginning of SARS-CoV-2’s global tear, the coronavirus was eager to infect all of us, and we had no immunity to rebuff its attempts. But vulnerability wasn’t just about immune defenses: Age, too, has turned out to be key to resilience. Much of the horror of the disease could be traced to having not only a large population that lacked protection against the virus—but a large adult population that lacked protection against the virus. Had the entire world been made up of grade-schoolers when the pandemic arrived, “I don’t think it would have been nearly as severe,” says Juliet Pulliam, an infectious-disease modeler at Stellenbosch University, in South Africa.

    Across several viral diseases—polio, chicken pox, mumps, SARS, measles, and more—getting sick as an adult is notably more dangerous than as a kid, a trend that’s typically exacerbated when people have no vaccinations against, or infections with, those pathogens in their rearview. The manageable infections that strike toddlers and grade-schoolers may turn serious when they first manifest at older ages, landing people in the hospital with pneumonia, brain swelling, even blindness, and eventually killing some. When scientists plot mortality data by age, many curves bend into “a pretty striking J shape,” says Dylan Morris, an infectious-disease modeler at UCLA.

    The reason for that age differential isn’t always clear. Some of kids’ resilience probably comes from having a young, spry body, far less likely to be burdened with chronic medical conditions that raise severe disease risk. But the quick-wittedness of the young immune system is also likely playing a role. Several studies have found that children are much better at marshaling hordes of interferon—an immune molecule that armors cells against viruses—and may harbor larger, more efficient cavalries of infected-cell-annihilating T cells. That performance peaks sometime around grade school or middle school, says Janet Chou, a pediatrician at Boston Children’s Hospital. After that, our molecular defenses begin a rapid tumble, growing progressively creakier, clumsier, more sluggish, and likelier to launch misguided attacks against the tissues that house them. By the time we’re deep into adulthood, our immune systems are no longer sprightly, or terribly well calibrated. When we get sick, our bodies end up rife with inflammation. And our immune cells, weary and depleted, are far less able to fight off the pathogens they once so easily trounced.

    Whatever the explanations, children are far less likely to experience serious symptoms, or to end up in the hospital or the ICU after being infected with SARS-CoV-2. Long COVID, too, seems to be less prevalent in younger cohorts, says Alexandra Yonts, a pediatrician at Children’s National Hospital. And although some children still develop MIS-C, a rare and dangerous inflammatory condition that can appear weeks after they catch the virus, the condition “seems to have dissipated” as the pandemic has worn on, says Betsy Herold, the chief of pediatric infectious disease at the Children’s Hospital at Montefiore, in the Bronx.

    Should those patterns hold, and as the age of first exposure continues to fall, COVID is likely to become less intense. The relative mildness of childhood encounters with the virus could mean that almost everyone’s first infection—which tends, on average, to be more severe than the ones that immediately follow—could rank low in intensity, setting a sort of ceiling for subsequent bouts. That might make concentrating first encounters “in the younger age group actually a good thing,” says Ruian Ke, an infectious-disease modeler at Los Alamos National Laboratory.

    COVID will likely remain capable of killing, hospitalizing, and chronically debilitating a subset of adults and kids alike. But the hope, experts told me, is that the proportion of individuals who face the worst outcomes will continue to drop. That may be what happened in the aftermath of the 1918 flu pandemic, Antia, of Emory, told me: That strain of the virus stuck around, but never caused the same devastation again. Some researchers suspect that something similar may have even played out with another human coronavirus, OC43: It’s possible that, after sparking a devastating pandemic in the 19th century, the virus never again managed to wreak more havoc than a common cold in a population that had almost universally encountered it early in life.


    Such a fate for COVID, though, isn’t a guarantee. The virus’s propensity to linger in the body’s nooks and crannies, sometimes causing symptoms that last many months or years, could make it an outlier among its coronaviral kin, says Melody Zeng, an immunologist at Cornell University. And even if the disease is likely to get better than what it is now, that is not a very high bar to clear.

    Some small subset of the population will always be naive to the virus—and it’s not exactly a comfort that in the future, that cohort will almost exclusively be composed of our kids. Pediatric immune systems are robust, UCLA’s Morris told me. But “robust is not the same as infallible.” Since the start of the pandemic, more than 2,000 Americans under the age of 18 have died from COVID—a small fraction of total deaths, but enough to make the disease a leading cause of death for children in the U.S. MIS-C and long COVID may not be common, but their consequences are no less devastating for the children who experience them. Some risks are especially concentrated among our youngest kids, under the age of 5, whose immune defenses are still revving up, making them more vulnerable than their slightly older peers. There’s especially little to safeguard newborns just under six months, who aren’t yet eligible for most vaccines—including COVID shots—and who are rapidly losing the antibody-based protection passed down from their mothers while they were in the womb.

    A younger average age of first infection will also probably increase the total number of exposures people have to SARS-CoV-2 in a typical lifetime—each instance carrying some risk of severe or chronic disease. Ke worries about the cumulative toll that this repetition could exact: Studies have shown that each subsequent tussle with the virus has the potential to further erode the functioning or structural integrity of organs throughout the body, raising the chances of chronic damage. There’s no telling how many encounters might push an individual past a healthy tipping point.

    Racking up exposures also won’t always bode well for the later chapters of these children’s lives. Decades from now, nearly everyone will have banked plenty of encounters with SARS-CoV-2 by the time they reach advanced age, Chou, from Boston Children’s Hospital, told me. But the virus will also continue to change its appearance, and occasionally escape the immunity that some people built up as kids. Even absent those evasions, as their immune systems wither, many older people may not be able to leverage past experiences with the disease to much benefit. The American experience with influenza is telling. Despite a lifetime of infections and available vaccines, tens of thousands of people typically die annually of the disease in the United States alone, says Ofer Levy, the director of the Precision Vaccines Program at Boston Children’s Hospital. So even with the expected COVID softening, “I don’t think we’re going to reach a point where it’s, Oh well, tra-la-la,” Levy told me. And the protection that immunity offers can have caveats: Decades of research with influenza suggest that immune systems can get a bit hung up on the first versions of a virus that they see, biasing them against mounting strong attacks against other strains; SARS-CoV-2 now seems to be following that pattern. Depending on the coronavirus variants that kids encounter first, their responses and vulnerability to future bouts of illness may vary, says Scott Hensley, an immunologist at the University of Pennsylvania.

    Early vaccinations—ideally ones that target multiple versions of SARS-CoV-2—could make a big difference in reducing just about every bad outcome the virus threatens. Severe disease, long COVID, and transmission to other children and vulnerable adults all would likely be “reduced, prevented, and avoided,” Chou told me. But that’s only if very young kids are taking those shots, which, right now, isn’t at all the case. Nor are they necessarily getting protection passed down during gestation or early life from their mothers, because many adults are not up to date on COVID shots.

    Some of these issues could, in theory, end up moot. A hundred or so years from now, COVID could simply be another common cold, indistinguishable in practice from any other. But Morris points out that this reality, too, wouldn’t fully spare us. “When we bother to look at the burden of the other human coronaviruses, the ones that have been with us for ages? In the elderly, it’s real,” he told me. One study found that a nursing-home outbreak of OC43—the purported former pandemic coronavirus—carried an 8 percent fatality rate; another, caused by NL63, killed three out of the 20 people who caught it in a long-term-care facility in 2017. These and other “mild” respiratory viruses also continue to pose a threat to people of any age who are immunocompromised.

    SARS-CoV-2 doesn’t need to follow in those footsteps. It’s the only human coronavirus against which we have vaccines—which makes the true best-case scenario one in which it ends up even milder than a common cold, because we proactively protect against it. Disease would not need to be as inevitable; the vaccine, rather than the virus, could be the first bit of intel on the disease that kids receive. Tomorrow’s children probably won’t live in a COVID-free world. But they could at least be spared many of the burdens we’re carrying now.


    Katherine J. Wu


  • Someday, You Might Be Able to Eat Your Way Out of a Cold



    When it comes to treating disease with food, the quackery stretches back far. Through the centuries, raw garlic has been touted as a home treatment for everything from chlamydia to the common cold; Renaissance remedies for the plague included figs soaked in hyssop oil. During the 1918 flu pandemic, Americans wolfed down onions or chugged “fluid beef” gravy to keep the deadly virus at bay.

    Even in modern times, the internet abounds with dubious culinary cure-alls: apple-cider vinegar for gonorrhea; orange juice for malaria; mint, milk, and pineapple for tuberculosis. It all has a way of making real science sound like garbage. Research on nutrition and immunity “has been ruined a bit by all the writing out there on Eat this to cure cancer,” Lydia Lynch, an immunologist and a cancer biologist at Harvard, told me.

    In recent years, though, plenty of legit studies have confirmed that our diets really can affect our ability to fight off invaders—down to the fine-scale functioning of individual immune cells. Those studies belong to a new subfield of immunology sometimes referred to as immunometabolism. Researchers are still a long way off from being able to confidently recommend specific foods or dietary supplements for colds, flus, STIs, and other infectious illnesses. But someday, knowledge of how nutrients fuel the fight against disease could influence the way that infections are treated in hospitals, in clinics, and maybe at home—not just with antimicrobials and steroids but with dietary supplements, metabolic drugs, or whole foods.

    Although major breakthroughs in immunometabolism are just now arriving, the concepts that underlie them have been around for at least as long as the quackery. People have known for millennia that in the hours after we fall ill, our appetite dwindles; our body feels heavy and sluggish; we lose our thirst drive. In the 1980s, the veterinarian Benjamin Hart argued that those changes were a package deal—just some of many sickness behaviors, as he called them, that are evolutionarily hardwired into all sorts of creatures. The goal, Hart told me recently, is to “help the animal stay in one place and conserve energy”—especially as the body devotes a large proportion of its limited resources to igniting microbe-fighting fevers.

    The notion of illness-induced anorexia (not to be confused with the eating disorder anorexia nervosa) might seem, at first, like “a bit of a paradox,” says Zuri Sullivan, an immunologist at Harvard. Fighting pathogenic microbes is energetically costly—which makes eating less a very counterintuitive choice. But researchers have long posited that cutting down on calories could serve a strategic purpose: to deprive certain pathogens of essential nutrients. (Because viruses do not eat to acquire energy, this notion is limited to cell-based organisms such as bacteria, fungi, and parasites.) A team led by Miguel Soares, an immunologist at the Instituto Gulbenkian de Ciência, in Portugal, recently showed that this exact scenario might be playing out with malaria. As the parasites burst out of the red blood cells where they replicate, the resulting spray of heme (an oxygen-transporting molecule) prompts the liver to stop making glucose. The halt seems to deprive the parasites of nutrition, weakening them and tempering the infection’s worst effects.

    Cutting down on sugar can be a dangerous race to the bottom: Animals that forgo food while they’re sick are trying to starve out an invader before they themselves run out of energy. Let the glucose boycott stretch on too long, and the dieter might develop dangerously low blood sugar—a common complication of severe malaria—which can turn deadly if untreated. At the same time, though, a paucity of glucose might have beneficial effects on individual tissues and cells during certain immune fights. For example, low-carbohydrate, high-fat ketogenic diets seem to enhance the protective powers of certain types of immune cells in mice, making it tougher for particular pathogens to infiltrate airway tissue.

    Those findings are still far from potential human applications. But Andrew Wang, an immunologist and a rheumatologist at Yale, hopes that this sort of research could someday yield better clinical treatments for sepsis, an often fatal condition in which an infection spreads throughout the body, infiltrating the blood. “It’s still not understood exactly what you’re supposed to feed folks with sepsis,” Wang told me. He and his former mentor at Yale, Ruslan Medzhitov, are now running a clinical trial to see whether shifting the balance of carbohydrates and lipids in their diet speeds recovery for people ill with sepsis. If the team is able to suss out clear patterns, doctors might eventually be able to flip the body’s metabolic switches with carefully timed doses of drugs, giving immune cells a bigger edge against their enemies.

    But the rules of these food-illness interactions, to the extent that anyone understands them, are devilishly complex. Sepsis can be caused by a whole slew of different pathogens. And context really, really matters. In 2016, Wang, Medzhitov, and their colleagues discovered that feeding mice glucose during infections created starkly different effects depending on the nature of the pathogen driving disease. When the mice were pumped full of glucose while infected with the bacterium Listeria, all of them died—whereas about half of the rodents that were allowed to give in to their infection-induced anorexia lived. Meanwhile, the same sugary menu increased survival rates for mice with the flu.

    In this case, the difference doesn’t seem to boil down to what the microbe was eating. Instead, the mice’s diet changed the nature of the immune response they were able to marshal—and how much collateral damage that response was able to inflict on the body, as James Hamblin wrote for The Atlantic at the time. The type of inflammation that mice ignited against Listeria, the team found, could imperil fragile brain cells when the rodents were well fed. But when the mice went off sugar, their starved livers started producing an alternate fuel source called ketone bodies—the same compounds people make when on a ketogenic diet—that helped steel their neurons. Even as the mice fought off their bacterial infections, their brain stayed resilient to the inflammatory burn. The opposite played out when the researchers subbed in influenza, a virus that sparks a different type of inflammation: Glucose pushed brain cells into better shielding themselves against the immune system’s fiery response.

    There’s not yet one unifying principle to explain these differences. But they are a reminder of an underappreciated aspect of immunity. Surviving disease, after all, isn’t just about purging a pathogen from the body; our tissues also have to guard themselves from shrapnel as immune cells and microbes wage all-out war. It’s now becoming clear, Soares told me, that “metabolic reprogramming is a big component of that protection.” The tactics that thwart a bacterium like Listeria might not also shield us from a virus, a parasite, or a fungus; they may not be ideal during peacetime. Which means our bodies must constantly toggle between metabolic states.

    In the same way that the types of infections likely matter, so do the specific types of nutrients: animal fats, plant fats, starches, simple sugars, proteins. Like glucose, fats can be boons in some contexts but detrimental in others, as Lynch has found. In people with obesity or other metabolic conditions, immune cells appear to reconfigure themselves to rely more heavily on fats as they perform their day-to-day functions. They can also be more sluggish when they attack. That’s the case for a class of cells called natural killers: “They still recognize cancer or a virally infected cell and go to it as something that needs to be killed,” Lynch told me. “But they lack the energy to actually kill it.” Timing, too, almost certainly has an effect. The immune defenses that help someone expunge a virus in the first few days of an infection might not be the ones that are ideal later on in the course of disease.

    Even starving out bacterial enemies isn’t a surefire strategy. A few years ago, Janelle Ayres, an immunologist at the Salk Institute for Biological Studies, and her colleagues found that when they infected mice with Salmonella and didn’t allow the rodents to eat, the hungry microbes in their guts began to spread outside of the intestines, likely in search of food. The migration ended up killing tons of their tiny mammal hosts. Mice that ate normally, meanwhile, fared far better—though the Salmonella inside of them also had an easier time transmitting to new hosts. The microbes, too, were responding to the metabolic milieu, and trying to adapt. “It would be great if it was as simple as ‘If you have a bacterial infection, reduce glucose,’” Ayres said. “But I think we just don’t know.”

    All of this leaves immunometabolism in a somewhat chaotic state. “We don’t have simple recommendations” on how to eat your way to better immunity, Medzhitov told me. And any that eventually emerge will likely have to be tempered by caveats: Factors such as age, sex, infection and vaccination history, underlying medical conditions, and more can all alter people’s immunometabolic needs. After Medzhitov’s 2016 study on glucose and viral infections was published, he recalls being dismayed by a piece from a foreign outlet circulating online claiming that “a scientist from the USA says that during flu, you should eat candy,” he told me with a sigh. “That was bad.”

    But considering how chaotic, individualistic, and messy nutrition is for humans, it shouldn’t be a surprise that the dietary principles governing our individual cells can get pretty complicated too. For now, Medzhitov said, we may be able to follow our instincts. Our bodies, after all, have been navigating this mess for millennia, and have probably picked up some sense of what they need along the way. It may not be a coincidence that during viral infections, “something sweet like honey and tea can really feel good,” Medzhitov said. There may even be some immunological value in downing the sick-day classic, chicken soup: It’s chock-full of fluid and salts, helpful things to ingest when the body’s electrolyte balance has been thrown out of whack by disease.

    The science around sickness cravings is far from settled. Still, Sullivan, who trained with Medzhitov, jokes that she now feels better about indulging in Talenti mango sorbet when she’s feeling under the weather with something viral, thanks to her colleagues’ 2016 findings. Maybe the sugar helps her body battle the virus without harming itself; then again, maybe not. For now, she figures it can’t hurt to dig in.


    Katherine J. Wu


  • Is COVID Immunity Hung Up on Old Variants?



    In the two-plus years that COVID vaccines have been available in America, the basic recipe has changed just once. The virus, meanwhile, has belched out five variants concerning enough to earn their own Greek-letter names, followed by a menagerie of weirdly monikered Omicron subvariants, each seeming to spread faster than the last. Vaccines, which take months to reformulate, just can’t keep up with a virus that seems to reinvent itself by the week.

    But SARS-CoV-2’s evolutionary sprint might not be the only reason that immunity can get bogged down in the past. The body seems to fixate on the first version of the virus that it encountered, either through injection or infection—a preoccupation with the past that researchers call “original antigenic sin,” and that may leave us with defenses that are poorly tailored to circulating variants. In recent months, some experts have begun to worry that this “sin” might now be undermining updated vaccines. At an extreme, the thinking goes, people may not get much protection from a COVID shot that is a perfect match for the viral variant du jour.

    Recent data hint at this possibility. Past brushes with the virus or the original vaccine seem to mold, or even muffle, people’s reactions to bivalent shots—“I have no doubt about that,” Jenna Guthmiller, an immunologist at the University of Colorado School of Medicine, told me. The immune system just doesn’t make Omicron-focused antibodies in the quantity or quality it probably would have had it seen the updated jabs first. But there’s also an upside to this stubbornness that we could not live without, says Katelyn Gostic, an immunologist and infectious-disease modeler who has studied the phenomenon with flu. Original antigenic sin is the reason repeat infections, on average, get milder over time, and the oomph that enables vaccines to work as well as they do. “It’s a fundamental part,” Gostic told me, “of being able to create immunological memory.”

    This is not just basic biology. The body’s powerful first impressions of this coronavirus can and should influence how, when, and how often we revaccinate against it, and with what. Better understanding of the degree to which these impressions linger could also help scientists figure out why people are (or are not) fighting off the latest variants—and how their defenses will fare against the virus as it continues to change.


    The worst thing about “original antigenic sin” is its name. The blame for that technically lies with Thomas Francis Jr., the immunologist who coined the phrase more than six decades ago after noticing that the initial flu infections people weathered in childhood could bias how they fared against subsequent strains. “Basically, the flu you get first in life is the one you respond to most avidly for the long term,” says Gabriel Victora, an immunologist at Rockefeller University. That can become somewhat of an issue when a very different-looking strain comes knocking.

    In scenarios like these, original antigenic sin may sound like the molecular equivalent of a lovesick teen pining over an ex, or a student who never graduates out of immunological grade school. But from the immune system’s point of view, never forgetting your first is logically sound. New encounters with a pathogen catch the body off guard—and tend to be the most severe. A deep-rooted defensive reaction, then, is practical: It ups the chances that the next time the same invader shows up, it will be swiftly identified and dispatched. “Having good memory and being able to boost it very quickly is sometimes a very good thing,” Victora told me. It’s the body’s way of ensuring that it won’t get fooled twice.

    These old grudges come with clear advantages even when microbes morph into new forms, as flu viruses and coronaviruses often do. Pathogens don’t remake themselves all at once, so immune cells that home in on familiar snippets of a virus can still in many cases snuff out enough invaders to prevent an infection’s worst effects. That’s why even flu shots that aren’t perfectly matched to the season’s most prominent strains are usually still quite good at keeping people out of hospitals and morgues. “There’s a lot of leniency in how much the virus can change before we really lose protection,” Guthmiller told me. The wiggle room should be even bigger, she said, with SARS-CoV-2, whose subvariants tend to be far more similar to one another than, say, different flu strains are.

    With all the positives that immune memory can offer, many immunologists tend to roll their eyes at the negative and bizarrely moralizing implications of the phrase original antigenic sin. “I really, really hate that term,” says Deepta Bhattacharya, an immunologist at the University of Arizona. Instead, Bhattacharya and others prefer to use more neutral words such as imprinting, evocative of a duckling latching onto the first maternal figure it spots. “This is not some strange immunological phenomenon,” says Rafi Ahmed, an immunologist at Emory University. It’s more a textbook example of what an adaptable, high-functioning immune system does, and one that can have positive or negative effects, depending on context. Recent flu outbreaks have showcased a little bit of each: During the 2009 H1N1 pandemic, many elderly people, normally more susceptible to flu viruses, fared better than expected against the late-aughts strain, because they’d banked exposures to a similar-looking H1N1—a derivative of the culprit behind the 1918 pandemic—in their youth. But in some seasons that followed, H1N1 disproportionately sickened middle-aged adults whose early-life flu indoctrinations may have tilted them away from a protective response.

    The backward-gazing immune systems of those adults may have done more than preferentially amplify defensive responses to a less relevant viral strain. They might have also actively suppressed the formation of a response to the new one. Part of that is sheer kinetics: Veteran immune cells, trained up on past variants and strains, tend to be quicker on the draw than fresh recruits, says Scott Hensley, an immunologist at the Perelman School of Medicine at the University of Pennsylvania. And the greater the number of experienced soldiers, the more likely they are to crowd out rookie fighters—depriving them of battlefield experience they might otherwise accrue. Should the newer viral strain eventually return for a repeat infection, those less experienced immune cells may not be adequately prepared—leaving people more vulnerable, perhaps, than they might otherwise have been.

    Some researchers think that form of imprinting might now be playing out with the bivalent COVID vaccines. Several studies have found that the BA.5-focused shots are, at best, moderately more effective at producing an Omicron-targeted antibody response than the original-recipe jab—not the knockout results that some might have hoped for. Recent work in mice from Victora’s lab backs up that idea: B cells, the manufacturers of antibodies, do seem to have trouble moving past the impressions of SARS-CoV-2’s spike protein that they got from their first exposure. But the findings don’t really trouble Victora, who gladly received his own bivalent COVID shot. (He’ll take the next update, too, whenever it’s ready.) A blunted response to a new vaccine, he told me, is not a nonexistent one—and the more foreign a second shot recipe is compared with the first, the more novice fighters should be expected to participate in the fight. “You’re still adding new responses,” he said, that will rev back up when they become relevant. The coronavirus is a fast evolver. But the immune system also adapts. Which means that people who receive the bivalent shot can still expect to be better protected against Omicron variants than those who don’t.

    Historical flu data support this idea. Many of the middle-aged adults slammed by recent H1N1 infections may not have mounted perfect attacks on the unfamiliar virus, but as immune cells continued to tussle with the pathogen, the body “pretty quickly filled in the gaps,” Gostic told me. Although it’s tempting to view imprinting as a form of destiny, “that’s just not how the immune system works,” Guthmiller told me. Preferences can be overwritten; biases can be undone.


    Original antigenic sin might not be a crisis, but its existence does suggest ways to optimize our vaccination strategies with past biases in mind. Sometimes, those preferences might need to be avoided; in other instances, they should be actively embraced.

    For that to happen, though, immunologists would need to fill in some holes in their knowledge of imprinting: how often it occurs, the rules by which it operates, what can entrench or alleviate it. Even among flu viruses, where the pattern has been best-studied, plenty of murkiness remains. It’s not clear whether imprinting is stronger, for instance, when the first exposure comes via infection or vaccination. Scientists can’t yet say whether children, with their fiery yet impressionable immune systems, might be more or less prone to getting stuck on their very first flu strain. Researchers don’t even know for certain whether repetition of a first exposure—say, through multiple doses of the same vaccine, or reinfections with the same variant—will more deeply embed a particular imprint.

    It does seem intuitive that multiple doses of a vaccine could exacerbate an early bias, Ahmed told me. But if that’s the case, then the same principle might also work the other way: Maybe multiple exposures to a new version of the virus could help break an old habit, and nudge the immune system to move on. Recent evidence has hinted that people previously infected with an early Omicron subvariant responded more enthusiastically to a bivalent BA.1-focused vaccine—available in the United Kingdom—than those who’d never encountered the lineage before. Hensley, at the University of Pennsylvania, is now trying to figure out if the same is true for Americans who got the BA.5-based bivalent shot after getting sick with one of the many Omicron subvariants.

    Ahmed thinks that giving people two updated shots—a safer approach, he points out, than adding an infection to the mix—could untether the body from old imprints too. A few years ago, he and his colleagues showed that a second dose of a particular flu vaccine could help shift the ratio of people’s immune responses. A second dose of the fall’s bivalent vaccine might not be practical or palatable for most people, especially now that BA.5 is on its way out. But if next autumn’s recipe overlaps with BA.5 in ways that it doesn’t with the original variant—as it likely will to at least some degree, given the Omicron lineage’s continuing reign—a later, slightly different shot could still be a boon.

    Keeping vaccine doses relatively spaced out—on an annual basis, say, à la flu shots—will likely help too, Bhattacharya said. His recent studies, not yet published, hint that the body might “forget” old variants, as it were, if it’s simply given more time: As antibodies raised against prior infections and injections fall away, vaccine ingredients could linger in the body rather than be destroyed by prior immunity on sight. That slightly extended stay might offer the junior members of the immune system—lesser in number, and slower on the uptake—more of an opportunity to cook up an Omicron-specific response.

    In an ideal world, researchers might someday know enough about imprinting to account for its finickiness whenever they select and roll out new shots. Flu shots, for instance, could be personalized to account for which strains babies were first exposed to, based on birth year; combinations of COVID vaccine doses and infections could dictate the timing and composition of a next jab. But the world is not yet living that reality, Gostic told me. And after three years of an ever-changing coronavirus and a fluctuating approach to public health, it’s clear that there won’t be a single vaccine recipe that’s ideal for everyone at once.

    Even Thomas Francis Jr. did not consider original antigenic sin to be a total negative, Hensley told me. According to Francis, the true issue with the “sin” was that humans were missing out on the chance to imprint on multiple strains at once in childhood, when the immune system is still a blank slate—something that modern researchers could soon accomplish with the development of universal vaccines. Our reliance on first impressions can be a drawback. But the same phenomenon can be an opportunity to acquaint the body with diversity early on—to give it a richer narrative, and memories of many threats to come.


    Katherine J. Wu


  • Will Nasal COVID Vaccines Save Us?



    Since the early days of the coronavirus pandemic, a niche subset of experimental vaccines has offered the world a tantalizing promise: a sustained slowdown in the spread of disease. Formulated to spritz protection into the body via the nose or the mouth—the same portals of entry most accessible to the virus itself—mucosal vaccines could head SARS-CoV-2 off at the pass, stamping out infection to a degree that their injectable counterparts might never hope to achieve.

    Now, nearly three years into the pandemic, mucosal vaccines are popping up all over the map. In September, India authorized one delivered as drops into the nostrils; around the same time, mainland China green-lit an inhalable immunization, and later on, a nasal-spray vaccine, now both being rolled out amid a massive case wave. Two more mucosal recipes have been quietly bopping around in Russia and Iran for many months. Some of the world’s largest and most populous countries now have access to the technology—and yet it isn’t clear how well that’s working out. “Nothing has been published; no data has been made available,” says Mike Diamond, an immunologist at Washington University in St. Louis, whose own approach to mucosal vaccines has been licensed for use in India via a company called Bharat. If mucosal vaccines are delivering on their promise, we don’t know it yet; we don’t know if they will ever deliver.

    The allure of a mucosal vaccine is all about geography. Injectable shots are great at coaxing out immune defenses in the blood, where they’re able to cut down on the risk of severe disease and death. But they aren’t as good at marshaling a protective response in the upper airway, leaving an opening for the virus to still infect and transmit. When viral invaders throng the nose, blood-borne defenses have to scamper to the site of infection at a bit of a delay—it’s like stationing guards next to a bank’s central vault, only to have them rush to the entrance every time a robber trips an external alarm. Mucosal vaccines, meanwhile, would presumably be working at the door.

    That same logic drives the effectiveness of the powerful oral polio vaccine, which bolsters defenses in its target virus’s preferred environment—the gut. Just one mucosal vaccine exists to combat a pathogen that enters through the nose: a nasal spray made up of weakened flu viruses, a version of which is branded as FluMist. The up-the-nose spritz is reasonably protective in kids, in some cases even outperforming its injected counterparts (though not always). But FluMist is much less potent for adults: The immunity they accumulate from a lifetime of influenza infections can wipe out the vaccine before it has time to lay down new protection. When it comes to cooking up a mucosal vaccine for a respiratory virus, “we don’t have a great template to follow,” says Deepta Bhattacharya, an immunologist at the University of Arizona.

    To circumvent the FluMist problem, some researchers have instead concocted viral-vector-based vaccines—the same group of immunizations to which the Johnson & Johnson and AstraZeneca COVID shots belong. China’s two mucosal vaccines fall into this category; so does India’s nose-drop concoction, as well as a nasal version of Russia’s Sputnik V shot. Other researchers are cooking up vaccines that contain ready-made molecules of the coronavirus’s spike protein, more akin to the shot from Novavax. Among them are Iran’s mucosal COVID vaccine and a newer, still-in-development candidate from the immunologist Akiko Iwasaki and her colleagues at Yale. The Yale group is also testing an mRNA-based nasal recipe. And the company Vaxart has been tinkering with a COVID-vaccine pill that could be swallowed to provoke immune cells in the gut, which would then deploy fighters throughout the body’s mucosal surfaces, up through the nose.

    Early data in animals have spurred some optimism. Trial versions of Diamond’s vaccine guarded mice, hamsters, and monkeys from the virus, in some cases seeming to stave off infection entirely; a miniaturized version of Vaxart’s oral vaccine was able to keep infected hamsters from spreading the coronavirus through the air. Iwasaki is pursuing an approach that deploys mucosal vaccines exclusively as boosters to injected shots, in the hopes that the initial jab can lay down bodywide immunity, a subset of which can then be tugged into a specialized compartment in the nose. Her nasal-protein recipe seems to trim transmission rates among rodents that have first received an in-the-muscle shot.

    But attempts to re-create these successes in people have yielded mixed results. After an intranasal version of the AstraZeneca vaccine roused great defenses in animals, a team at Oxford moved the immunization into a small human trial—and last month, published results showing that it hardly triggered any immune response, even as a booster to an in-the-arm shot. Adam Ritchie, one of the Oxford immunologists behind the study, told me the results don’t necessarily spell disaster for other mucosal attempts, and that with more finagling, AstraZeneca’s vaccine might someday do better up the nose. Still, the results “definitely put a damper on the excitement around intranasal vaccines,” says Stephanie Langel, an immunologist at Case Western Reserve University, who’s partnering with Vaxart to develop a COVID-vaccine pill.

    The mucosal COVID vaccines in India and China, at least, have reportedly shown a bit more promise in small, early human trials. Bharat’s info sheet on its nasal-drop vaccine—the Indian riff on Diamond’s recipe—says it bested another locally made vaccine, Covaxin, at tickling out antibodies, while provoking fewer side effects. China’s inhaled vaccine, too, seems to do reasonably well on the human-antibody front. But antibodies aren’t the same as true effectiveness: Vaccine makers and local health ministries, experts told me, have yet to release large-scale, real-world data showing that the vaccines substantially cut down on transmission or infection. And although some studies have hinted that nasal protection can stick around in animals for many, many months, there’s no guarantee the same will be true in humans, in whom mucosal antibodies, in particular, “are kind of known to wane pretty quickly,” Langel told me.

    SARS-CoV-2 infections have offered sobering lessons of their own. The nasal immune response to the virus itself is neither impenetrable nor particularly long-lived, says David Martinez, a viral immunologist at the University of North Carolina at Chapel Hill. Even people who have been both vaccinated and infected can still get infected again, he told me, and it would be difficult for a nasal vaccine to do much better. “I don’t think mucosal vaccines are going to be the deus ex machina that some people think they’re going to be.”

    Mucosal vaccines don’t need to provide a perfect blockade against infection to prove valuable. Packaged into sprays, drops, or pills, immunizations tailor-made for the mouth or the nose might make COVID vaccines easier to ship, store, and distribute en masse. “They often don’t require specialized training,” says Gregory Poland, a vaccinologist at the Mayo Clinic—a major advantage for rural or low-resource areas. The immunizing experience could also be easier for kids or anyone else who’d rather not endure a needle. Should something like Vaxart’s encapsulated vaccine work out, Langel told me, COVID vaccines could even one day be shipped via mail, in a form safe and easy enough to swallow with a glass of water at home. Some formulations may also come with far fewer side effects than, say, the mRNA-based shots, which “really kick my ass,” Bhattacharya told me. Even if mucosal vaccines weren’t a transmission-blocking knockout, “if it meant I didn’t have to get the mRNA vaccine, I would consider it.”

    But the longer that countries such as the U.S. have gone without mucosal COVID vaccines, the harder it’s gotten to get one across the finish line. Transmission, in particular, is tough to study, and Langel pointed out that any new immunizations will likely have to prove that they can outperform our current crop of injected shots to secure funding, possibly even FDA approval. “It’s an uphill battle,” she told me.

    Top White House advisers remain resolute that transmission-reducing tech has to be part of the next generation of COVID vaccines. Ideally, those advancements would be paired with ingredients that enhance the life span of immune responses and combat a wider swath of variants; skimp on any of them, and the U.S. might remain in repeat-vaccination purgatory for a while yet. “We need to do better on all three fronts,” Anthony Fauci, the outgoing director of the National Institute of Allergy and Infectious Diseases, told me. But packaging all that together will require another major financial investment. “We need Warp Speed 2.0,” says Shankar Musunuri, the CEO of Ocugen, the American company that has licensed Diamond’s recipe. “And so far, there is no action.” When I asked Fauci about this, he didn’t seem optimistic that this would change. “I think that they’ve reached the point where they feel, ‘We’ve given enough money to it,’” he told me. In the absence of dedicated government funds, some scientists, Iwasaki among them, have decided to spin off companies of their own. But without more public urgency and cash flow, “it could be years to decades to market,” Iwasaki told me. “And that’s if everything goes well.”

    Then there’s the issue of uptake. Musunuri told me that he’s confident that the introduction of mucosal COVID vaccines in the U.S.—however long it takes to happen—will “attract all populations, including kids … people like new things.” But Rupali Limaye, a behavioral scientist at Johns Hopkins University, worries that for some, novelty will drive the exact opposite effect. The “newness” of COVID vaccines, she told me, is exactly what has prompted many to adopt an attitude of “wait and see” or even “that’s not for me.” An even newer one that jets ingredients up into the head might be met with additional reproach.

    Vaccine fatigue has also set in for much of the public. In the United States, hospitalizations are once again rising, and yet less than 15 percent of people eligible for bivalent shots have gotten them. That sort of uptake is at odds with the dream of a mucosal vaccine that can drive down transmission. “It would have to be a lot of people getting vaccinated in order to have that public-health population impact,” says Ben Cowling, an epidemiologist at the University of Hong Kong. And there’s no guarantee that even a widely administered mucosal vaccine would be the population’s final dose. The pace at which we’re doling out shots is driven in part by “the virus changing so quickly,” says Ali Ellebedy, an immunologist at Washington University in St. Louis. Even a sustained encampment of antibodies in the nose could end up being a poor match for the next variant that comes along, necessitating yet another update.

    The experts I spoke with worried that some members of the scientific community—even some members of the public—have begun to pin all their hopes about stopping the spread of SARS-CoV-2 on mucosal vaccines. It’s a recipe for disappointment. “People love the idea of a magic pill,” Langel told me. “But it’s just not reality.” The virus is here to stay; the goal continues to be to make that reality more survivable. “We’re trying to reduce infection and transmission, not eliminate it; that would be almost impossible,” Iwasaki told me. That’s true for any vaccine, no matter how, or where, the body first encounters it.


    Katherine J. Wu


  • I Was Allergic to Cats. Until Suddenly, I Wasn’t.



    Of all the nicknames I have for my cat Calvin—Fluffernutter, Chonk-a-Donk, Fuzzy Lumpkin, Jerky McJerkface—Bumpus Maximus may be the most apt. Every night, when I crawl into bed, Calvin hops onto my pillow, purrs, and bonks his head affectionately against mine. It’s adorable, and a little bit gross. Tiny tufts of fur jet into my nose; flecks of spittle smear onto my cheeks.

    Just shy of a decade ago, cuddling a cat this aggressively would have left me in dire straits. From early childhood through my early 20s, I nursed a serious allergy that made it impossible for me to safely interact with most felines, much less adopt them. Just a few minutes of exposure was enough to make my eyes water and clog my nasal passages with snot. Within an hour, my throat would swell and my chest would erupt in crimson hives.

    Then, sometime in the early 2010s, my misery came to an abrupt and baffling end. With no apparent interventions, my cat allergy disappeared. Stray whiffs of dander, sufficient to send my body into conniptions mere months before, couldn’t even compel my nose to twitch. My body just up and decided that the former bane of its existence was suddenly totally chill.

    What I went through is, technically speaking, “completely weird,” says Kimberly Blumenthal, an allergist and immunologist at Massachusetts General Hospital. Some allergies do naturally fade with time, but short of allergy shots, which don’t always work, “we think of cat allergy as a permanent diagnosis,” Blumenthal told me. One solution that’s often proposed? “Get rid of your cat.”

    My case is an anomaly, but its oddness is not. Although experts have a broad sense of how allergies play out in the body, far less is known about what causes them to come and go—an enigma that’s becoming more worrying as rates of allergy continue to climb. Nailing down how, when, and why these chronic conditions vanish could help researchers engineer those circumstances more often for allergy sufferers—in ways that are actually under our control, and not just by chance.


    All allergies, at their core, are molecular screwups: an immune system mistakenly flagging a harmless substance as dangerous and attacking it. In the classic version, an allergen, be it a fleck of almond or grass or dog, evokes the ire of certain immune cells, prompting them to churn out an antibody called IgE. IgE drags the allergen like a hostage over to other defensive cells and molecules to rile them up too. A blaze of inflammation-promoting signals, including histamine, ends up getting released, sparking bouts of itching, redness, and swelling. Blood vessels dilate; mucus floods out in gobs. At their most extreme, these reactions get so gnarly that they can kill.

    Just about every step of this chain reaction is essential to produce a bona fide allergy—which means that intervening at any of several points can shut the cascade down. People whose bodies make less IgE over time can become less sensitive to allergens. The same seems to be true for those who start producing more of another antibody, called IgG4, that can counteract IgE. Some people also dispatch a molecule known as IL-10 that can tell immune cells to cool their heels even in the midst of IgE’s perpetual scream.

    All this and more can eventually persuade a body to lose its phobia of an allergen, a phenomenon known as tolerance. But because there is not a single way in which allergy manifests, it stands to reason that there won’t be a single way in which it disappears. “We don’t fully understand how these things go away,” says Zachary Rubin, a pediatrician at Oak Brook Allergists, in Illinois.

    Tolerance does display a few trends. Sometimes, it unfurls naturally as people get older, especially as they approach their 60s (though allergies can appear in old age as well). Other diagnoses can go poof amid the physiological and hormonal changes brought on by toddlerhood, adolescence, and the teen years. As many as 60 to 80 percent of milk, wheat, and egg allergies can peace out by puberty—a pattern that might also be related to the instability of the allergens involved. Certain snippets of milk and egg proteins, for instance, can unravel in the presence of heat or stomach acid, making the molecules “less allergenic,” and giving the body ample opportunity to reappraise them as benign, says Anna Nowak-Węgrzyn, a pediatric allergist and immunologist at NYU Langone Health. About 80 to 90 percent of penicillin allergies, too, disappear within 10 years of when they’re first detected, more if you count the ones that are improperly diagnosed, as Blumenthal has found.

    Other allergies are more likely to be lifers without dedicated intervention—among them, issues with peanuts, tree nuts, shellfish, pollen, and pets. Part of the reason may be that some of these allergens are super tough to neutralize or purge. The main cat allergen, a protein called “Fel d 1” that’s found in feline saliva, urine, and gland secretions, can linger for six months after a cat vacates the premises. It can get airborne, and glom on to surfaces; it’s been found in schools and churches and buses and hospitals, “even in space,” Blumenthal told me.

    For hangers-on like these, allergists can try to nudge the body toward tolerance through shots or mouth drops that introduce bits of an allergen over months or years, basically the immunological version of exposure therapy. In some cases, it works: Dosing people with Fel d 1 can at least improve a cat allergy, but it’s hardly a sure hit. Researchers haven’t even fully sussed out how allergy shots induce tolerance—just that “they work well for a lot of patients,” Rubin told me. The world of allergy research as a whole is something of a Wild West: Some people are truly, genuinely, hypersensitive to water touching their skin; others have gotten allergies because of organ transplants, apparently inheriting their donor’s sensitivity as amped-up immune cells hitched a ride.

    Part of the trouble is that allergy can involve just about every nook and cranny of the immune system; to study its wax and wane, scientists have to repeatedly look at people’s blood, gut, or airway to figure out what sorts of cells and molecules are lurking about, all while tracking their symptoms and exposures, which doesn’t come easy or cheap. And fully disentangling the nuances of bygone allergies isn’t just about better understanding people who are the rule. It’s about delving into the exceptions to it too.


    How frustratingly little we know about allergies is compounded by the fact that the world is becoming a more allergic place. A lot of the why remains murky, but researchers think that part of the problem can be traced to the perils of modern living: the wider use of antibiotics; the shifts in eating patterns; the squeaky-cleanness of so many contemporary childhoods, focused heavily on time indoors. About 50 million people in the U.S. alone experience allergies each year—some of them little more than a nuisance, others potentially deadly when triggered without immediate treatment. Allergies can diminish quality of life. They can limit the areas where people can safely rent an apartment, or the places where they can safely dine. They can hamper access to lifesaving treatments, leaving doctors scrambling to find alternative therapies that don’t harm more than they help.

    But if allergies can rise this steeply with the times, maybe they can resolve rapidly too. New antibody-based treatments could help silence the body’s alarm sensors and quell IgE’s rampage. Some researchers are even looking into how fecal transplants that port the gut microbiome of tolerant people into allergy sufferers might help certain food sensitivities subside. Anne Liu, an allergist and immunologist at Stanford, is also hopeful that “the incidence of new food allergies will decline over the next 10 years,” as more advances come through. After years of advising parents against introducing their kids to sometimes-allergenic substances such as milk and peanuts too young, experts are now encouraging early exposures, in the hopes of teaching tolerance. And the more researchers learn about how allergies naturally abate, the better they might be able to safely replicate fade-outs.

    One instructive example could come from cases quite opposite to mine: longtime pet owners who develop allergies to their animals after spending some time away from them. That’s what happened to Stefanie Mezigian, of Michigan. After spending her entire childhood with her cat, Thumper, Mezigian was dismayed to find herself sneezing and sniffling when she visited home the summer after her freshman year of college. Years later, Mezigian seems to have built a partial tolerance up again; she now has another cat, Jack, and plans to keep felines in her life for good—both for companionship and to wrangle her immune system’s woes. “If I go without cats, that seems to be when I develop problems,” she told me.

    It’s a reasonable thought to have, Liu told me. People in Mezigian’s situation probably have the reactive IgE bopping around their body their entire life. But maybe during a fur-free stretch, the immune system, trying to be “parsimonious,” stops making molecules that rein in the allergy, she said. The immune system is nothing if not malleable, and a bit diva-esque: Set one thing off kilter, and an entire network of molecules and cells can revamp its approach to the world.

    I may never know why my cat allergy ghosted me. Maybe I got infected by a virus that gently rewired my immune system; maybe my hormone levels went into flux. Maybe it was the stress, or joy, of graduating college and starting grad school; maybe my diet or microbiome changed in just the right way, at just the right time. Perhaps it’s pointless to guess. Allergy, like the rest of the immune system, is a hot, complicated mess—a common fixture of modern living that many of us take for granted, but that remains, in so many cases, a mystery. All I can do is hope my cat allergy stays gone, though there’s no telling if it will. “I have no idea,” Nowak-Węgrzyn told me. “I’m just happy for you. Go enjoy your cats.”


    Katherine J. Wu


  • When’s the Perfect Time to Get a Flu Shot?



    For about 60 years, health authorities in the United States have been championing a routine for at least some sector of the public: a yearly flu shot. That recommendation now applies to every American over the age of six months, and for many of us, flu vaccines have become a fixture of fall.

    The logic of that timeline seems solid enough. A shot in the autumn preps the body for each winter’s circulating viral strains. But years into researching flu immunity, experts have yet to reach a consensus on the optimal time to receive the vaccine—or even the number of injections that should be doled out.

    Each year, a new flu shot recipe debuts in the U.S. sometime around July or August, and according to the CDC, the best time for most people to show up for an injection is about now: preferably no sooner than September, ideally no later than the end of October. Many health-care systems require their employees to get the shot in this time frame as well. But those who opt to follow the CDC’s current guidelines, as I recently did, then mention that fact in a forum frequented by a bunch of experts, as I also recently did, might rapidly hear that they’ve made a terrible, terrible choice.

    “There’s no way I would do what you did,” one virologist texted me. “It’s poor advice to get the flu vaccine now.” Florian Krammer, a virologist at Mount Sinai’s Icahn School of Medicine, echoed that sentiment in a tweet: “I think it is too early to get a flu shot.” When I prodded other experts to share their scheduling preferences, I found that some are September shooters, but others won’t juice up till December or later. One vaccinologist I spoke with goes totally avant-garde, and nabs multiple doses a year.

    There is definitely such a thing as getting a flu shot too early, as Helen Branswell has reported for Stat. After people get their vaccine, levels of antibodies rocket up, buoying protection against both infection and disease. But after only weeks, the number of those molecules begins to steadily tick downward, raising people’s risk of developing a symptomatic case of flu by about 6 to 18 percent, various studies have found. On average, people can expect that a good portion of their anti-flu antibodies “are meaningfully gone by about three or so months” after a shot, says Lauren Rodda, an immunologist at the University of Washington.

    That decline is why some researchers, Krammer among them, think that September and even October shots could be premature, especially if flu activity peaks well after winter begins. In about three-quarters of the flu seasons from 1982 to 2020, the virus didn’t hit its apex until January or later. Krammer, for one, told me that he usually waits until at least late November to dose up. Stanley Plotkin, a 90-year-old vaccinologist and vaccine consultant, has a different solution. People in his age group—over 65—don’t respond as well to vaccines in general, and seem to lose protection more rapidly. So for the past several years, Plotkin has doubled up on flu shots, getting one sometime before Halloween and another in January, to ensure he’s chock-full of antibodies throughout the entire risky, wintry stretch. “The higher the titers,” or antibody levels, Plotkin told me, “the better the efficacy, so I’m trying to take advantage of that.” (He made clear to me that he wasn’t “making recommendations for the rest of the world”—just “playing the odds” given his age.)

    Data on doubling up is quite sparse. But Ben Cowling, an epidemiologist and flu researcher at Hong Kong University, has been running a years-long study to figure out whether offering two vaccines a year, separated by roughly six months, could keep vulnerable people safe for longer. His target population is Hong Kongers, who often experience multiple annual flu peaks, one seeded by the Northern Hemisphere’s winter wave and another by the Southern Hemisphere’s. So far, “getting that second dose seems to give you additional protection,” Cowling told me, “and it seems like there’s no harm of getting vaccinated twice a year,” apart from the financial and logistical cost of a double rollout.

    In the U.S., though, flu season is usually synonymous with winter. And the closer together two shots are given, the more blunted the effects of the second injection might be: People who are already bustling with antibodies may obliterate a second shot’s contents before the vaccine has a chance to teach immune cells anything new. That might be why several studies that have looked at double-dosing flu shots within weeks of each other “showed no benefit” in older people and certain immunocompromised groups, Poland told me. (One exception? Organ-transplant recipients. Kids getting their very first flu shot are also supposed to get two of them, four weeks apart.)

    Even at the three-ish-month mark past vaccination, the body’s anti-flu defenses don’t reset to zero, Rodda told me. Shots shore up B cells and T cells, which can survive for many months or years in various anatomical nooks and crannies. Those arsenals are especially hefty in people who have banked a lifetime of exposures to flu viruses and vaccines, and they can guard people against severe disease, hospitalization, and death, even after an antibody surge has faded. A recent study found that vaccine protection against flu hospitalizations ebbed by less than 10 percent per month after people got their shot, though the rates among adults older than 65 were a smidge higher. Other studies barely registered any changes in post-vaccine safeguards against symptomatic flu cases of a range of severities, at least within the first few months. “I do think the best protection is within three months of vaccination,” Cowling told me. “But there’s still a good amount by six.”

    For some young, healthy adults, a decent number of flu antibodies may actually stick around for more than a year. “You can test my blood right now,” Rodda told me. “I haven’t gotten vaccinated just yet this year, and I have detectable titers.” Ali Ellebedy, an immunologist at Washington University in St. Louis, told me he has found that some people who have regularly received flu vaccines have almost no antibody bump when they get a fresh shot: Their blood is already hopping with the molecules. Preexisting immunity also seems to be a big reason that nasal-spray-based flu vaccines don’t work terribly well in adults, whose airways have hosted far more flu viruses than children’s.

    Getting a second flu shot in a single season is pretty unlikely to hurt. But Ellebedy compares it to taking out a second insurance policy on a car that’s rarely driven: likely of quite marginal benefit for most people. Plus, because it’s not a sanctioned flu-vaccine regimen, pharmacists might be reluctant to acquiesce, Poland pointed out. Double-dosing probably wouldn’t stand much of a chance as an official CDC recommendation, either. “We do a bad enough job,” Poland said, getting Americans to take even one dose a year.

    That’s why the push to vaccinate in late summer and early fall is so essential for the single shot we currently have, says Huong McLean, a vaccine researcher at the Marshfield Clinic Research Institute in Wisconsin. “People get busy, and health systems are making sure that most people can get protected before the season starts,” she told me. Ellebedy, who’s usually a September vaccinator, told me he “doesn’t see the point of delaying vaccination for fear of having a lower antibody level in February.” Flu seasons are unpredictable, with some starting as early as October, and the viruses aren’t usually keen on giving their hosts a heads-up. That makes dillydallying a risk: Put the shot off till November or December, and “you might get infected in between,” Ellebedy said—or simply forget to make an appointment at all, especially as the holidays draw near.

    In the future, improvements to flu-shot tech could help cleave off some of the ambiguity. Higher doses of vaccine, which are given to older people, could rile up the immune system to a greater degree; the same could be true for more provocative vaccines, made with ingredients called adjuvants that trip more of the body’s defensive sensors. Injections such as those seem to “maintain higher antibody titers year-round,” says Sophie Valkenburg, an immunologist at Hong Kong University and the University of Melbourne—a trend that Ellebedy attributes to the body investing more resources in training its fighters against what it perceives to be a larger threat. Such a switch would likely come with a cost, though, McLean said: Higher doses and adjuvants “also mean more adverse events, more reactions to the vaccine.”

    For now, the only obvious choice, Rodda told me, is to “definitely get vaccinated this year.” After the past two flu seasons, one essentially absent and one super light, Americans are more likely than not running an immunity deficit. Flu-vaccination rates have also ticked downward since the coronavirus pandemic began, which means there may be an argument for erring on the early side this season, if only to ensure that people reinforce their defenses against severe disease, Rodda said. Plus, Australia’s recent flu season, often a bellwether for ours, arrived ahead of schedule.

    Even so, people who vaccinate too early could end up sicker in late winter—in the same way that people who vaccinate too late could end up sicker now. Plotkin told me that staying apprised of the epidemiology helps: “If I heard influenza outbreaks were starting to occur now, I would go and get my first dose.” But timing remains a gamble, subject to the virus’s whims. Flu is ornery and unpredictable, and often unwilling to be forecasted at all.


    Katherine J. Wu


  • Should Your Flu and COVID Shots Go in Different Arms?



    At a press briefing earlier this month, Ashish Jha, the White House’s COVID czar, laid out some pretty lofty expectations for America’s immunity this fall. “Millions” of Americans, he said, would be flocking to pharmacies for the newest version of the COVID vaccine in September and October, at the same appointment where they’d get their yearly flu shot. “It’s actually a good idea,” he told the press. “I really believe this is why God gave us two arms.”

    That’s how I got immunized last week at my local CVS: COVID shot on the left, flu shot on the right. I spent the next day or so nursing not one but two achy upper arms. Reaching high shelves was hard; putting on deodorant was worse. And it did make me wonder what would have happened if I’d ignored Jha’s teleological advice and gotten both jabs in the same arm. Maybe my annoyance would have been lessened. Or perhaps the same-side shots would have made the soreness in my left arm way worse. When I posed this puzzle to immunologists, vaccinologists, and pharmacists, I got back a lot of hems and haws. For the millions of Americans who will be getting two-shot appointments by fall’s end, they told me, the choice really does come down to personal preference in the absence of clear data: You’ve just gotta pick a side. Or, you know, two.

    On the one hand (sorry), there are the vaccine double-downers. Sallie Permar, a pediatrician at Weill Cornell Medicine, and Stephanie Langel, an immunologist at Duke University, both said they’d probably get both shots in the same shoulder; so would Rishi Goel, an immunologist at the University of Pennsylvania. “Personally, I’d rather have one arm that’s slightly uncomfortable than both,” Goel told me.

    On the other hand, we’ve got Team Divide-and-Conquer. Several experts said they’d follow the White House protocol of splitting shots left and right. Ali Ellebedy, an immunologist at Washington University in St. Louis, told me he’d prefer to have two slightly sore arms to one totally dead one. Jacinda Abdul-Mutakabbir, a pharmacist at Loma Linda University, says she generally recommends that her patients get the vaccines on separate sides “for comfort.” Last year, she opted to get the flu shot and a COVID booster within a few inches of each other, and “I wanted to chop my arm off,” she told me. “Never again.”

    The deciding logic here should be pretty intuitive, Permar told me. Two shots on one side might be expected to double how sore that arm will get, though the experience of each vaccine recipient will depend on a bevy of factors, including the ingredients in the shots and that person’s infection and vaccination history, as well as their immune-system health. Also, for people like my husband—who’s prone to very heavy vaccine side effects—the choice may not matter at all. He was so knocked out by the fever and chills that came with his COVID-flu-shot combo, he couldn’t have cared less which arms got the shots.

    I dug around for studies examining the consequences of the one-versus-two-arm choice and found only one: a Canadian trial from 2003, which vaccinated a few hundred sixth-graders at two dozen middle schools against group C meningitis and hepatitis B at the same time. Roughly half the kids got both shots in the same arm; the others received one on each side. (Some kids in the latter group requested that their shots be administered by a pair of nurses who could plunge both syringes at the same time.) Among students in the same-arm group, 18 percent ended up with tenderness at the injection site that they rated “moderate or severe.” But those kids fared better than the ones in the two-arm group, 28 percent of whom experienced moderate or severe tenderness in at least one arm, and 8 percent of whom had it in both arms at the same time.

    But those results apply only to that group of kids in that setting, with those two specific vaccines; there’s no telling whether the same trends would be seen with flu shots and COVID shots when given to children or adults. Michela Locci, an immunologist at the University of Pennsylvania, told me she suspects that combining flu and COVID inoculations in the same arm could actually drive extra side effects: “The overall inflammation might be higher,” she said.

    Many pediatricians, who often have to administer four or five shots to a baby at once, are habitual splitters. “If there’s more than one vaccine syringe to give to a baby, generally, two legs are used,” Permar told me. (Kids usually upgrade to arm shots sometime in toddlerhood—it’s all about finding a muscle that’s big enough for the needle to hit its mark.) Doctors also have a nerdy reason to split shots between arms or legs. “If there’s a local reaction to the vaccine,” Permar said, “you can identify which vaccine it was if you separate them by space.” (For the record, I had a more painful reaction in my left arm, where I got the COVID shot. Others I’ve spoken with have reported the same disparity.)

    The CDC advocates for separating vaccination shots by at least one inch of space. Per the agency, if a COVID shot is being given at the same time as a vaccine “that might be more likely to cause a local injection site reaction,” the shots should be dosed into “different limbs, if possible.” Two types of flu shots cleared for use in people 65 years and older—the high-dose vaccine and the adjuvanted one—fall into that category. But the different-limb advice doesn’t seem to apply to other flu shots, including those cleared for use in younger adults and kids.

    However someone ends up taking simultaneous flu and COVID shots, the placement is unlikely to affect how much protection the vaccines provide. There could be an argument for letting “each side focus on its own thing,” says Gabriel Victora, an immunologist at Rockefeller University. “But it probably doesn’t make a whole lot of difference.” Children routinely get combo vaccines, such as DTaP and MMR, each of which combines multiple disease-fighting ingredients in a single syringe. The triple-threat formulas work just as well as injecting their individual parts. The immune system is used to multitasking: It spends all day being bombarded by microbes, so there’s good reason to believe that with vaccines, too, our body will see simultaneous shots “as independent events,” Goel told me.

    Which arm gets picked for which shot, though, will affect where the jab’s contents end up. After a vaccine is injected, its immunity-inducing ingredients meander to the nearest lymph node, such as the ones in the armpits. There, hordes of immune cells fight over the vaccine’s bits, and the fittest and fiercest among them are selected to leave the lymph node and fight. Here, again, doubling up on one arm shouldn’t be an issue, Goel said: The immune-cell boot camps in these lymph nodes have “a good amount of real estate.”

    It might even be a good idea to stick the same limb—and thereby, the same lymph node—every time you get another dose of a particular vaccine. After immune cells in a lymph node spot a particular bit of pathogen, some of them march off into battle, but others may hang around like reserve troops, mulling over what they’ve learned. A couple of recent studies, one of them in mice, hint that repeated delivery of the same ingredients to those veteran learners could give the body a slight edge—though the extent of that advantage “might be marginal,” Victora told me. Still, Langel, of Duke, told me jokingly that because she usually gets all of her vaccines in her “non-writing” arm, the lymph node beneath it could now be especially superpowered—a “nice bonus” for her defenses on the whole.

    That said, no one should stress too much about getting a shot in the “wrong” arm. “It’s not like you’re immune on the left side and not on the right side,” Goel told me. Immune cells travel throughout the body; there is no midline DMZ. Permar even points out that getting the newly formulated COVID vaccine, which includes new ingredients tailored to fight Omicron subvariants, on the opposite side from the previous rounds could help its ingredients reach a fresher slate of cells. “I think you could convince yourself either way,” she told me. Which, honestly, leaves me totally at peace with my choice. Apart from arm achiness, I had no other side effects—and in a way, I preferred the symmetry of the one-on-each-side injections.

    With all that said, it’s worth briefly acknowledging a third option: splitting the flu and COVID vaccines into separate visits. I was, before my most recent COVID shot, some 10 months out from my previous dose. But it felt awfully early for my flu shot, which might be better timed for peak protection if taken later in the season. Still, the allure of getting it all over with was too tantalizing, especially because I happen to have a lot of travel up ahead. In the grand scheme of things, the bigger, more important choice was opting into the shots at all.


    Katherine J. Wu
