ReportWire

Tag: recent studies

  • This Fall’s COVID Vaccines Are for Everyone


    Paul Offit is not an anti-vaxxer. His résumé alone would tell you that: A pediatrician at Children’s Hospital of Philadelphia, he is the co-inventor of a rotavirus vaccine for infants that has been credited with saving “hundreds of lives every day”; he is the author of roughly a dozen books on immunization that repeatedly debunk anti-vaccine claims. And from the earliest days of COVID-19 vaccines, he’s stressed the importance of getting the shots. At least, up to a certain point.

    Like most of his public-health colleagues, Offit strongly advocates annual COVID shots for those at highest risk. But regularly reimmunizing young and healthy Americans is a waste of resources, he told me, and invites unnecessary exposure to the shots’ rare but nontrivial side effects. If they’ve already received two or three doses of a COVID vaccine, as is the case for most, they can stop—and should be told as much.

    His view cuts directly against the CDC’s new COVID-vaccine guidelines, announced Tuesday following an advisory committee’s 13–1 vote: Every American six months or older should get at least one dose of this autumn’s updated shot. For his less-than-full-throated support for annual vaccination, Offit has become a lightning rod. Peers in medicine and public health have called his opinions “preposterous.” He’s also been made into an unlikely star in anti-vaccine circles. Public figures with prominently shot-skeptical stances have approvingly parroted his quotes. Right-leaning news outlets that have featured vaccine misinformation have called him up for quotes and sound bites—a sign, he told me, that as a public-health expert “you screwed up somehow.”

    Offit stands by his opinion, the core of which is certainly scientifically sound: Some sectors of the population are at much higher risk for COVID than the rest of us. But the crux of the controversy around his view is not about facts alone. At this point in the pandemic, in a country where seasonal vaccine uptake is worryingly low and direly inequitable, where health care is privatized and piecemeal, where anti-vaccine activists will pull at any single loose thread, many experts now argue that policies riddled with ifs, ands, or buts—factually sound though they may be—are not the path toward maximizing uptake. “The nuanced, totally correct way can also be the garbled-message way,” Anthony Fauci, the former director of the National Institute of Allergy and Infectious Diseases, told me.

    For the past two years, the United States’ biggest COVID-vaccine problem hasn’t been that too many young and healthy people are clamoring for shots and crowding out more vulnerable groups. It’s been that no one, really—including those who most need additional doses—is opting for additional injections at all. America’s vaccination pipeline is already so riddled with obstacles that plenty of public-health experts have become deeply hesitant to add more. They’re opting instead for a simple, proactive message—one that is broadly inclusive—in the hope that a concerted push for all will nudge at least some fraction of the public to actually get a shot this year.

    On several key vaccination points, experts do largely agree. The people who bear a disproportionate share of COVID’s risk should receive a disproportionate share of immunization outreach, says Saad Omer, the dean of UT Southwestern’s O’Donnell School of Public Health.

    Choosing which groups to prioritize, however, is tricky. Offit told me he sees four groups as being at highest risk: people who are pregnant, immunocompromised, over the age of 70, or dealing with multiple chronic health conditions. Céline Gounder, an infectious-disease specialist and epidemiologist at NYC Health + Hospitals/Bellevue, who mostly aligns with Offit’s stance, would add other groups based on exposure risk: people living in shelters, jails, or other group settings, for instance, and potentially people who work in health care. (Both Gounder and Offit also emphasize that unvaccinated people, especially infants, should get their shots this year, period.) But there are other vulnerable groups to consider. Risk of severe COVID still stratifies by factors such as socioeconomic status and race, concentrating among groups who are already disproportionately disconnected from health care.

    That’s a potentially lengthy list—and messy messaging has hampered pandemic responses before. As Gretchen Chapman, a vaccine-behavior expert at Carnegie Mellon University, told me last month, a key part of improving uptake is “making it easy, making it convenient, making it the automatic thing.” Fauci agrees. Offit, had he been at the CDC’s helm, would have strongly recommended the vaccine for only his four high-risk groups, and merely allowed everyone else to get it if they wanted to—drawing a stark line between those who should and those who may. Fauci, meanwhile, approves of the CDC’s decision. If it were entirely up to him, “I would recommend it for everyone” for the sheer sake of clarity, he told me.

    The benefit-risk ratio for the young and healthy, Fauci told me, is lower than it is for older or sicker people, but “it’s not zero.” Anyone can end up developing a severe case of COVID. That means that shoring up immunity, especially with a shot that targets a recent coronavirus variant, will still bolster protection against the worst outcomes. Secondarily, the doses will lower the likelihood of infection and transmission for at least several weeks. Amid the current rise in cases, that protection could soften short-term symptoms and reduce people’s chances of developing long COVID; it could minimize absences from workplaces and classrooms; it could curb spread within highly immunized communities. For Fauci, those perks are all enough to tip the scales.

    Offit did tell me that he’s frustrated at the way his views have frequently been framed. Some people, for instance, are inaccurately portraying him as actively dissuading people from signing up for shots. “I’m not opposed to offering the vaccine for anyone who wants it,” he told me. In the case of the young and healthy, “I just don’t think they need another dose.” He often uses himself as an example: At 72 years old, Offit didn’t get the bivalent shot last fall, because he says he’s in good health; he also won’t be getting this year’s XBB.1-targeting brew. Three original-recipe shots, plus a bout of COVID, are protection enough for him. He gave similar advice to his two adult children, he told me, and he’d say the same to a healthy thrice-dosed teen: More vaccine is “low risk, low reward.”

The vax-for-all guideline isn’t incompatible, exactly, with a more targeted approach. Even with a universal recommendation in place, government resources could be funneled toward promoting higher uptake among essential-to-protect groups. But in a country where people, especially adults, are already disinclined to vaccinate, other experts argue that the slight difference between these two tactics could compound into a chasm in public-health outcomes. A strong recommendation for all, followed by targeted implementation, they argue, is more likely to result in higher vaccination rates all around, including in more vulnerable populations. Narrow recommendations, meanwhile, could inadvertently exclude people who really need the shot, while inviting scrutiny over a vaccine’s downsides—cratering uptake in high- and low-risk groups alike. Among Americans, avoiding a strong recommendation for certain populations could be functionally synonymous with explicitly discouraging those people from getting a shot at all.

    Offit pointed out to me that several other countries, including the United Kingdom, have issued recommendations that target COVID vaccines to high-risk groups, as he’d hoped the U.S. would. “What I’ve said is really nothing that other countries haven’t said,” Offit told me. But the situation in the U.S. is arguably different. Our health care is privatized and far more difficult to access and navigate. People who are unable to, or decide not to, access a shot have a weaker, more porous safety net—especially if they lack insurance. (Plus, in the U.K., cost was reportedly a major policy impetus.) A broad recommendation cuts against these forces, especially because it makes it harder for insurance companies to deny coverage.

    A weaker call for COVID shots would also make that recommendation incongruous with the CDC’s message on flu shots—another universal call for all Americans six months and older to dose up each year. Offit actually does endorse annual shots for the flu: Immunity to flu viruses erodes faster, he argues, and flu vaccines are “safer” than COVID ones.

    It’s true that COVID and the flu aren’t identical—not least because SARS-CoV-2 continues to kill and chronically sicken more people each year. But other experts noted that the cadence of vaccination isn’t just about immunity. Recent studies suggest that, at least for now, the coronavirus is shape-shifting far faster than seasonal flu viruses are—a point in favor of immunizing more regularly, says Vijay Dhanasekaran, a viral-evolution researcher at the University of Hong Kong. The coronavirus is also, for now, simply around for more of the year, which makes infections more likely and frequent—and regular vaccination perhaps more prudent. Besides, scientifically and logistically, “flu is the closest template we have,” Ali Ellebedy, an immunologist at Washington University in St. Louis, told me. Syncing the two shots’ schedules could have its own rewards: The regularity and predictability of flu vaccination, which is typically higher among the elderly, could buoy uptake of COVID shots—especially if manufacturers are able to bundle the immunizations into the same syringe.

The flu template may be especially important this fall. With the newly updated shots arriving late in the season, and COVID deaths still at a relative low, experts are predicting that uptake may be worse than it was last year, when less than 20 percent of people opted in to the bivalent dose. A recommendation from the CDC “is just the beginning” of reversing that trend, Omer, of UT Southwestern, told me. Getting the shots also needs to be straightforward and routine. That could mean actively promoting them in health-care settings, making it easier for providers to check if their patients are up to date, guaranteeing availability for the uninsured, and conducting outreach to the broader community—especially to vulnerable groups.

    Offit hasn’t changed his mind on who most needs these new COVID vaccines. But he is rethinking how he talks about it: “I will stop putting myself in a position where I’m going to be misinterpreted,” he told me. After the past week, he more clearly sees the merits of focusing on who should be signing up rather than who doesn’t need another dose. Better to emphasize the importance of the shot for the people he worries most about and recommend it to them, without reservation, to whatever extent we can.


    Katherine J. Wu


  • Where End-of-Life Care Falls Short


    This article originally appeared in Undark Magazine.

    When Kevin E. Taylor became a pastor 22 years ago, he didn’t expect how often he’d have to help families make gut-wrenching decisions for a loved one who was very ill or about to die. The families in his predominantly Black church in New Jersey generally didn’t have any written instructions, or conversations to recall, to help them know if their relative wanted—or didn’t want—certain types of medical treatment.

    So Taylor started encouraging church members to ask their elders questions, such as whether they would want to be kept on life support if they became sick and were unable to make decisions for themselves.

    “Each time you have the conversation, you destigmatize it,” says Taylor, now the senior pastor at Unity Fellowship Church NewArk, a Christian church with about 120 regular members.

    Taylor is part of an initiative led by Compassion & Choices, a nonprofit advocacy group that encourages more Black Americans to consider and document their medical wishes for the end of their life.

    End-of-life planning—also known as advance care planning, or ACP—usually requires a person to fill out legal documents that indicate the care they would want if they were to become unable to speak for themselves because of injury or illness. There are options to specify whether they would want life-sustaining care, even if it were unlikely to cure or improve their condition, or comfort care to manage pain, even if it hastened death. Medical groups have supported ACP, and proposed public-awareness campaigns aim to promote the practice.

    Yet research has found that many Americans—particularly Black Americans—have not bought into the promise of ACP. Advocates say that such plans are especially important for Black Americans, who are more likely to experience racial discrimination and lower-quality care throughout the health-care system. Advance care planning, they say, could help patients understand their options and document their wishes, as well as reduce anxiety for family members.

    However, the practice has also come under scrutiny in recent years: Some research suggests that it might not actually help patients get the kind of care they want at the end of life. It’s unclear whether those results are due to research methods or to a failure of ACP itself; comparing the care that individuals said they want in the future with the care they actually received while dying is exceedingly difficult. And many studies that show the shortcomings of ACP look predominantly at white patients.

    Still, researchers maintain that encouraging discussions about end-of-life care is important, while also acknowledging that ACP needs either improvement or an overhaul. “We should be looking for, okay, what else can we do other than advance care planning?” says Karen Bullock, a social-work professor at Boston College, who researches decision-making and acceptance around ACP in Black communities. “Or can we do something different with advance care planning?”

    Advance care planning was first proposed in the U.S. in 1967, when a lawyer for the now-defunct Euthanasia Society of America advocated for the idea of a living will—a document that would allow a person to indicate whether to withhold or withdraw life-sustaining treatment if they were no longer capable of making health-care decisions. By 1986, most states had adopted living-will laws that established standardized documents for patients, as well as protections for physicians who complied with patients’ wishes.

    Over the past four decades, ACP has expanded to include a range of legal documents, called advance directives, for detailing one’s wishes for end-of-life care. In addition to do-not-resuscitate, or DNR, orders, patients can list treatments they would want and under which scenarios, as well as appoint a surrogate to make health-care decisions for them. Health-care facilities that receive Medicare or Medicaid reimbursement are required to ask whether patients have advance directives, and to provide them with relevant information. And in most states, doctors can record a patient’s end-of-life wishes in a form called a Provider Order for Life-Sustaining Treatment. These documents encourage patients to talk with their physician about their wishes, which are then added to the patient chart, unlike advance directives, which usually consist of the patient filling out forms themselves without discussing them directly with their doctor.

As for who actually makes those plans, research has shown a racial disparity: A 2016 study of more than 2,000 adults, all of whom were over the age of 50, showed that 44 percent of white participants had completed an advance directive, compared with 24 percent of Black participants. Many people simply aren’t aware of ACP or don’t fully understand it. And for Black individuals, that knowledge may be especially hard to come by—one study found that clinicians tend to avoid discussions with Black and other nonwhite patients about the care they want at the end of life, because they feel uncomfortable broaching these conversations or are unsure of whether patients want to have them.

    Other research has found that Black Americans may be more hesitant to fill out documents in part because of a mistrust in the health-care system, rooted in a long history of racist treatment. “It’s a direct, in my opinion, outcome from segregated health-care systems,” Bullock says. “When we forced integration, integration didn’t mean equitable care.”

    Religion can also be a major barrier to ACP. A large proportion of Black Americans are religious, and some say they are hesitant to engage in ACP because of the belief that God, rather than clinicians, should decide their fate. That’s one reason programs such as Compassion & Choices have looked to churches to make ACP more accessible. Several studies support the effectiveness of sharing health messages, including about smoking cessation and heart health, in church communities. “Black people tend to trust their faith leaders, and so if the church is saying this is a good thing to do, then we will be willing to try it,” Bullock says.

    But in 2021, an article by palliative-care doctors laid bare the growing evidence that ACP may be failing to get patients the end-of-life care they want, also known as goal-concordant care. The paper summarized the findings of numerous studies investigating the effectiveness of the practice, and concluded that “despite the intrinsic logic of ACP, the evidence suggests it does not have the desired effect.”

For example, although some studies identified benefits such as increased likelihood of a patient dying in the place they desired or avoiding unwanted resuscitation, others found the opposite. One study found that seriously ill patients who prioritized comfort care in their advance directive spent nearly as many days in the hospital as did patients who prioritized life-extending care. The authors of the 2021 summary paper suggested several reasons that goal-concordant care might not occur: Patients may request treatments that are not available; clinicians may not have access to the documentation; surrogates may override patients’ requests.

    A pair of older studies suggested that these issues might be especially pronounced for Black patients; they found that Black patients with cancer who had signed DNR orders were more likely to be resuscitated, for example. These studies have been held up as evidence that Black Americans receive less goal-concordant care. But Holly Prigerson, a researcher at Cornell University who oversaw the studies, notes that her team investigated the care of Black participants who were resuscitated against their wishes, and in those cases, clinicians did not have access to their records because the patients had been transferred from another hospital.

    One issue facing research on advance care planning is that so many studies focus on white patients, giving little insight into whether ACP helps Black patients. For example, in two recent studies on the subject, more than 90 percent of patients were white.

    Many experts, including Prigerson, agree that it’s important to devise new approaches to assess goal-concordant care, which generally relies on what patients indicated in advance directives or what they told family members months or years before dying. But patients change their mind, and relatives may not understand or accept their wishes.

    “It’s a very problematic thing to assess,” Prigerson says. “It’s not impossible, but there are so many issues with it.”

    As for whether ACP can manage to improve end-of-life care specifically in areas where Black patients receive worse care, such as pain management, experts such as Bullock note that studies have not really explored that question. But addressing other racial disparities—including correcting physicians’ false beliefs about Black patients being less sensitive to pain, improving how physicians communicate with Black patients, and strengthening social supports for patients who want to enroll in hospice—is likely more crucial than expanding ACP.

    ACP “may be part of the solution, but it is not going to be sufficient,” says Robert M. Arnold, a University of Pittsburgh professor of palliative care and medical ethics, and one of the authors of the 2021 article that questioned the benefits of ACP.

    Many of the shortcomings of ACP, including the low engagement rate and the unclear benefits, have prompted researchers and clinicians to think about how to overhaul the practice.

Efforts to make ACP more accessible have included easy-to-read versions free of legalese, as well as short, simple videos. A 2023 study found that one program that incorporated these elements, called PREPARE for Your Care, helped both white and Black adults with chronic medical conditions get goal-concordant care. The study stood out because it asked patients who were still able to communicate if they were getting the medical care they wanted, rather than waiting until after they died to evaluate goal-concordant care.

    “That, to me, is incredibly important,” says Rebecca Sudore, a geriatrician and researcher at UC San Francisco, who was the senior author of the study and helped develop PREPARE for Your Care. Sudore and her colleagues have proposed “real-time assessment from patients and their caregivers” to more accurately measure goal-concordant care.

    In the past few years, clinicians have become more aware that ACP should involve ongoing conversations and shared decision-making among patients, clinicians, and surrogates, rather than just legal documents, says Ramona Rhodes, a geriatrician affiliated with the University of Arkansas for Medical Sciences.

    Rhodes and her colleagues are leading a study to address whether certain types of ACP can promote engagement and improve care for Black patients. A group of older patients—half are Black, and half are white—with serious illnesses at clinics across the South are receiving materials either for Respecting Choices, an ACP guide that focuses on conversations with patients and families, or Five Wishes, a short patient questionnaire and the most widely used advance directive in the United States. The team hypothesizes that Respecting Choices will lead to greater participation among Black patients and possibly more goal-concordant care, if it prepares patients and families to talk with clinicians about their wishes, Rhodes says.

    Taylor, the pastor, notes that when he talks with church members about planning for end-of-life care, they often see the importance of it for the first time. And it usually persuades them to take action. “Sometimes it’s awkward,” he says. “But it’s now awkward and informed.”


    Carina Storrs


  • MSG Is Finally Getting Its Revenge


    Updated at 1:45 p.m. ET on May 17, 2023

    In March, the World Health Organization issued a dire warning that was also completely obvious: Nearly everyone on the planet consumes too much salt. And not just a sprinkle too much; on average, people consume more than double what is advisable every single day, raising the risk of common diseases such as heart attack and stroke. If governments intervene in such profligate salt intake, the WHO urged, they could save the lives of 7 million people by 2030.

Such warnings about salt are so ubiquitous that they are easy to tune out. In the United States, salt intake has been a public-health issue for more than half a century; since then, the initiatives launched to combat it have been deemed by health officials “too numerous to describe,” but little has changed in terms of policy or appetite. The main reason salt has remained a problem is that it’s a major part of all processed food—and, well, it makes everything delicious. Persuading Americans to reduce their consumption would require a convincing dupe—something that would cut down on unhealthy sodium without making food any less tasty.

    No perfect dupe exists. But the next best thing could be … MSG. Seriously. Last month, the FDA proposed reducing sodium in certain foods using salt substitutes. One candidate that has research behind it is monosodium glutamate, the white crystalline powder that has long been maligned in the West as an unhealthy food additive. A common seasoning in some Asian cuisines, MSG was linked in the late 1960s to ailments—headaches, numbness, dizziness, heart palpitations—that became known as Chinese Restaurant Syndrome. The health concerns around MSG have since been debunked, and the FDA considers it safe to eat. But it still has a bad rap: Many products are still proudly advertised as MSG free. Now the chemical may soon get its revenge. Given the chance to replace salt in some of our food, it could eventually come to represent something wholesome—perhaps even something close to healthy.

    The concerns with MSG originated in 1968, when a Chinese American physician, writing in The New England Journal of Medicine, described feeling generally ill after eating Chinese food, which he suggested could be because of MSG. Other researchers quickly produced studies that seemed to substantiate this claim, and MSG became a public-health villain. In the ’70s, the Chicago Tribune ran the headline “Chinese Food Make You Crazy? MSG Is No. 1 Suspect.” All the attention “renewed medical legitimacy [for] a number of long-held assumptions about the strangely ‘exotic’, ‘bizarre’ and ‘excessive’ practices associated with Chinese culture,” the historian Ian Mosby wrote in 2009. That’s not to say that all symptoms associated with MSG are bunk; people can be sensitive to MSG—like any food—and may experience broad symptoms such as headaches after eating it, Amanda Li, a dietary nutritionist at the University of Washington, told me. But “research has shown no clear evidence linking MSG consumption to any serious potential adverse reactions,” she said.

    On the whole, MSG does seem better than salt itself, considering that excessive salt consumption poses so many chronic health risks. A relatively small amount of MSG could be used to rescue flavor in reduced-salt products without endangering health. This is possible partly because of MSG’s molecular makeup. It satisfies the need for salt to a certain extent because it contains sodium (it’s right there in the name, after all)—but just a third of the amount, by weight, that salt does. The rest of the molecule is made of the amino acid L-glutamate, which registers as the savory, “brothy” flavor known as umami.

MSG isn’t a one-to-one replacement for salt, but that’s what makes it such a promising alternative. It is a general flavor enhancer, meaning that it can amplify the perception of salt and other flavors that are already in a dish, as well as add an umami element, Soo-Yeun Lee, a sensory scientist and the director of Washington State University’s School of Food Science, told me. One secret to this effect is that unlike salt, which imparts a blast of flavor and then quickly dissipates, MSG stays on the tongue long after food is swallowed, producing a lasting savory sensation, Lee said. It may amplify saltiness by increasing salivation, letting sodium molecules wash over the tongue more freely, Aubrey Dunteman, a food scientist at the University of Illinois at Urbana-Champaign, told me.

All of this gives MSG the potential to play into a salt-reduction strategy. A 2019 study in the journal Nutrients found that substituting MSG (or other similar but more obscure chemicals) for some of the salt in certain foods could have major impacts: Adults who eat cured meats could cut 40 percent of their sodium intake from those foods; cheese eaters, 45 percent. Another study from researchers in Japan found that incorporating MSG and other umami substances into common Japanese condiments, such as soy sauce, seasoning salt, and miso paste, could cut salt intake by up to 22.3 percent. Doing the same in curry-chicken and chili-chicken soups, Malaysian scientists found, could reduce the recipes’ salt content by 32.5 percent.

    Take those findings with a grain of, uh, MSG. Recent studies have uniformly found that MSG is a safe, promising salt replacement, but many, including both the Nutrients study and the Japanese one, were funded at least in part by Ajinomoto Co.—the company that introduced the first commercial form of the substance—or the International Glutamate Technical Committee, a trade group. Lee and Dunteman have also received funding from Ajinomoto for some of their MSG work, including a study showing that the substance could improve the flavor of reduced-sodium bread. Lee said she aimed to show that MSG substitution for salt is “feasible, so if any food companies want to take that up and try it on their own systems,” they have a basis for doing so. Her goal, she added, “is not to sell bread with MSG.” (The paper, along with the two others mentioned that received industry funding, were independently peer-reviewed.)

    Clearly, more independent research is needed, but food companies have plenty of incentive to help find a better alternative to salt. More than 70 percent of Americans’ salt consumption comes from processed and manufactured food, and if the FDA decides to crack down on salt intake, its policies will largely target the food industry, Lee said. Already, some manufacturers of canned soup and fish are experimenting with salt substitutes.

    Deploying MSG in a sweeping sodium-reduction campaign would not be straightforward. MSG is more expensive than salt, Dunteman noted. More crucially, in many foods, salt provides more than flavor; it can also act as a preservative and regulate texture by, say, adding juiciness to lean meat or stabilizing leavened dough. In their study on bread, Lee and Dunteman found that removing too much salt reduced chewiness and firmness, even when MSG made up for taste. Among common processed foods, bread is a prime target for future MSG research, because it is the biggest contributor to U.S. sodium intake—not only because of its salt content but also because of the sheer amount of it that Americans consume. When MSG is used instead of salt to enhance flavor, “foods can taste just as delicious but without affecting hypertension,” Katherine Burt, a professor of health promotion and nutrition sciences at Lehman College, whose writing on MSG was not industry funded, told me. It’s “a great way to make foods exciting and healthy.”

    MSG can also be used to deliberately reduce salt intake at home. Adding a new ingredient to a home pantry can be daunting, but consider that MSG is already in most kitchens, occurring naturally in umami-rich items such as Parmesan cheese and mushrooms and added to processed foods such as Campbell’s Soup and Doritos. These days, it’s easy enough to find it online or in stores, sold in shakers or packets, much like salt. Li recommends that the MSG-curious start seasoning their food with a 50–50 mixture of MSG and table salt. When eating processed foods, choose low-sodium versions of products (not “reduced sodium” goods, which may not actually have low levels of salt). They’ll likely taste terrible, so add MSG in increments until they taste good, Lee said.
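To put rough numbers on that 50-50 suggestion, here is a back-of-the-envelope sketch using standard molecular weights; the exact percentages are illustrative and not from the article:

```python
# Sodium fraction by weight, computed from standard molecular weights.
NA_G_MOL = 22.99      # sodium
NACL_G_MOL = 58.44    # table salt (NaCl)
MSG_G_MOL = 169.11    # monosodium glutamate (C5H8NNaO4)

salt_na = NA_G_MOL / NACL_G_MOL   # ~0.39: salt is ~39% sodium by weight
msg_na = NA_G_MOL / MSG_G_MOL     # ~0.14: roughly a third of salt's fraction

# A 50-50 blend of MSG and table salt, as suggested for home seasoning:
blend_na = 0.5 * salt_na + 0.5 * msg_na
reduction = 1 - blend_na / salt_na  # ~33% less sodium per gram of seasoning
print(f"MSG/salt sodium ratio: {msg_na / salt_na:.2f}; "
      f"50-50 blend cuts sodium by {reduction:.0%}")
```

In other words, swapping half of the salt in a shaker for MSG trims roughly a third of the sodium per pinch, consistent with the "just a third of the amount, by weight" figure above.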

We still have much to learn about MSG as a salt substitute, but the biggest challenge to it taking off is cultural, not scientific. To a certain degree, tastes are changing: Celebrity chefs such as David Chang champion it, and one highly acclaimed New York restaurant now serves an MSG martini. But the perception that MSG is unhealthy persists, despite evidence to the contrary. Words such as “sneaky,” “disguised,” and “nasty” are still used to describe it, and grocery stores such as Whole Foods and Trader Joe’s make a point of mentioning that their foods have no MSG. As long as those misconceptions endure, they will continue to hamper the potential for a better salt substitute. America’s aversion toward MSG may be intended to promote better health, but at this point, it might just be doing precisely the opposite.


    This story originally stated that the New England Journal of Medicine letter about MSG was a hoax. That claim was long believed but has since been disproved.


    Yasmin Tayag


  • Is COVID Immunity Hung Up on Old Variants?



    In the two-plus years that COVID vaccines have been available in America, the basic recipe has changed just once. The virus, meanwhile, has belched out five variants concerning enough to earn their own Greek-letter names, followed by a menagerie of weirdly monikered Omicron subvariants, each seeming to spread faster than the last. Vaccines, which take months to reformulate, just can’t keep up with a virus that seems to reinvent itself by the week.

    But SARS-CoV-2’s evolutionary sprint might not be the only reason that immunity can get bogged down in the past. The body seems to fixate on the first version of the virus that it encountered, either through injection or infection—a preoccupation with the past that researchers call “original antigenic sin,” and that may leave us with defenses that are poorly tailored to circulating variants. In recent months, some experts have begun to worry that this “sin” might now be undermining updated vaccines. At an extreme, the thinking goes, people may not get much protection from a COVID shot that is a perfect match for the viral variant du jour.

    Recent data hint at this possibility. Past brushes with the virus or the original vaccine seem to mold, or even muffle, people’s reactions to bivalent shots—“I have no doubt about that,” Jenna Guthmiller, an immunologist at the University of Colorado School of Medicine, told me. The immune system just doesn’t make Omicron-focused antibodies in the quantity or quality it probably would have had it seen the updated jabs first. But there’s also an upside to this stubbornness that we could not live without, says Katelyn Gostic, an immunologist and infectious-disease modeler who has studied the phenomenon with flu. Original antigenic sin is the reason repeat infections, on average, get milder over time, and the oomph that enables vaccines to work as well as they do. “It’s a fundamental part,” Gostic told me, “of being able to create immunological memory.”

    This is not just basic biology. The body’s powerful first impressions of this coronavirus can and should influence how, when, and how often we revaccinate against it, and with what. Better understanding of the degree to which these impressions linger could also help scientists figure out why people are (or are not) fighting off the latest variants—and how their defenses will fare against the virus as it continues to change.


    The worst thing about “original antigenic sin” is its name. The blame for that technically lies with Thomas Francis Jr., the immunologist who coined the phrase more than six decades ago after noticing that the initial flu infections people weathered in childhood could bias how they fared against subsequent strains. “Basically, the flu you get first in life is the one you respond to most avidly for the long term,” says Gabriel Victora, an immunologist at Rockefeller University. That can become something of an issue when a very different-looking strain comes knocking.

    In scenarios like these, original antigenic sin may sound like the molecular equivalent of a lovesick teen pining over an ex, or a student who never graduates out of immunological grade school. But from the immune system’s point of view, never forgetting your first is logically sound. New encounters with a pathogen catch the body off guard—and tend to be the most severe. A deep-rooted defensive reaction, then, is practical: It ups the chances that the next time the same invader shows up, it will be swiftly identified and dispatched. “Having good memory and being able to boost it very quickly is sometimes a very good thing,” Victora told me. It’s the body’s way of ensuring that it won’t get fooled twice.

    These old grudges come with clear advantages even when microbes morph into new forms, as flu viruses and coronaviruses often do. Pathogens don’t remake themselves all at once, so immune cells that home in on familiar snippets of a virus can still in many cases snuff out enough invaders to prevent an infection’s worst effects. That’s why even flu shots that aren’t perfectly matched to the season’s most prominent strains are usually still quite good at keeping people out of hospitals and morgues. “There’s a lot of leniency in how much the virus can change before we really lose protection,” Guthmiller told me. The wiggle room should be even bigger, she said, with SARS-CoV-2, whose subvariants tend to be far more similar to one another than, say, different flu strains are.

    With all the positives that immune memory can offer, many immunologists tend to roll their eyes at the negative and bizarrely moralizing implications of the phrase original antigenic sin. “I really, really hate that term,” says Deepta Bhattacharya, an immunologist at the University of Arizona. Instead, Bhattacharya and others prefer to use more neutral words such as imprinting, evocative of a duckling latching onto the first maternal figure it spots. “This is not some strange immunological phenomenon,” says Rafi Ahmed, an immunologist at Emory University. It’s more a textbook example of what an adaptable, high-functioning immune system does, and one that can have positive or negative effects, depending on context. Recent flu outbreaks have showcased a little bit of each: During the 2009 H1N1 pandemic, many elderly people, normally more susceptible to flu viruses, fared better than expected against the late-aughts strain, because they’d banked exposures to a similar-looking H1N1—a derivative of the culprit behind the 1918 pandemic—in their youth. But in some seasons that followed, H1N1 disproportionately sickened middle-aged adults whose early-life flu indoctrinations may have tilted them away from a protective response.

    The backward-gazing immune systems of those adults may have done more than preferentially amplify defensive responses to a less relevant viral strain. They might have also actively suppressed the formation of a response to the new one. Part of that is sheer kinetics: Veteran immune cells, trained up on past variants and strains, tend to be quicker on the draw than fresh recruits, says Scott Hensley, an immunologist at the Perelman School of Medicine at the University of Pennsylvania. And the greater the number of experienced soldiers, the more likely they are to crowd out rookie fighters—depriving them of battlefield experience they might otherwise accrue. Should the newer viral strain eventually return for a repeat infection, those less experienced immune cells may not be adequately prepared—leaving people more vulnerable, perhaps, than they might otherwise have been.

    Some researchers think that form of imprinting might now be playing out with the bivalent COVID vaccines. Several studies have found that the BA.5-focused shots are, at best, moderately more effective at producing an Omicron-targeted antibody response than the original-recipe jab—not the knockout results that some might have hoped for. Recent work in mice from Victora’s lab backs up that idea: B cells, the manufacturers of antibodies, do seem to have trouble moving past the impressions of SARS-CoV-2’s spike protein that they got from first exposure. But the findings don’t really trouble Victora, who gladly received his own bivalent COVID shot. (He’ll take the next update, too, whenever it’s ready.) A blunted response to a new vaccine, he told me, is not a nonexistent one—and the more foreign a second shot recipe is compared with the first, the more novice fighters should be expected to participate in the fight. “You’re still adding new responses,” he said, that will rev back up when they become relevant. The coronavirus is a fast evolver. But the immune system also adapts. Which means that people who receive the bivalent shot can still expect to be better protected against Omicron variants than those who don’t.

    Historical flu data support this idea. Many of the middle-aged adults slammed by recent H1N1 infections may not have mounted perfect attacks on the unfamiliar virus, but as immune cells continued to tussle with the pathogen, the body “pretty quickly filled in the gaps,” Gostic told me. Although it’s tempting to view imprinting as a form of destiny, “that’s just not how the immune system works,” Guthmiller told me. Preferences can be overwritten; biases can be undone.


    Original antigenic sin might not be a crisis, but its existence does suggest ways to optimize our vaccination strategies with past biases in mind. Sometimes, those preferences might need to be avoided; in other instances, they should be actively embraced.

    For that to happen, though, immunologists would need to fill in some holes in their knowledge of imprinting: how often it occurs, the rules by which it operates, what can entrench or alleviate it. Even among flu viruses, where the pattern has been best studied, plenty of murkiness remains. It’s not clear whether imprinting is stronger, for instance, when the first exposure comes via infection or vaccination. Scientists can’t yet say whether children, with their fiery yet impressionable immune systems, might be more or less prone to getting stuck on their very first flu strain. Researchers don’t even know for certain whether repetition of a first exposure—say, through multiple doses of the same vaccine, or reinfections with the same variant—will more deeply embed a particular imprint.

    It does seem intuitive that multiple doses of a vaccine could exacerbate an early bias, Ahmed told me. But if that’s the case, then the same principle might also work the other way: Maybe multiple exposures to a new version of the virus could help break an old habit, and nudge the immune system to move on. Recent evidence has hinted that people previously infected with an early Omicron subvariant responded more enthusiastically to a bivalent BA.1-focused vaccine—available in the United Kingdom—than those who’d never encountered the lineage before. Hensley, at the University of Pennsylvania, is now trying to figure out if the same is true for Americans who got the BA.5-based bivalent shot after getting sick with one of the many Omicron subvariants.

    Ahmed thinks that giving people two updated shots—a safer approach, he points out, than adding an infection to the mix—could untether the body from old imprints too. A few years ago, he and his colleagues showed that a second dose of a particular flu vaccine could help shift the ratio of people’s immune responses. A second dose of the fall’s bivalent vaccine might not be practical or palatable for most people, especially now that BA.5 is on its way out. But if next autumn’s recipe overlaps with BA.5 in ways that it doesn’t with the original variant—as it likely will to at least some degree, given the Omicron lineage’s continuing reign—a later, slightly different shot could still be a boon.

    Keeping vaccine doses relatively spaced out—on an annual basis, say, à la flu shots—will likely help too, Bhattacharya said. His recent studies, not yet published, hint that the body might “forget” old variants, as it were, if it’s simply given more time: As antibodies raised against prior infections and injections fall away, vaccine ingredients could linger in the body rather than be destroyed by prior immunity on sight. That slightly extended stay might offer the junior members of the immune system—lesser in number, and slower on the uptake—more of an opportunity to cook up an Omicron-specific response.

    In an ideal world, researchers might someday know enough about imprinting to account for its finickiness whenever they select and roll out new shots. Flu shots, for instance, could be personalized to account for which strains babies were first exposed to, based on birth year; combinations of COVID vaccine doses and infections could dictate the timing and composition of a next jab. But the world is not yet living that reality, Gostic told me. And after three years of an ever-changing coronavirus and a fluctuating approach to public health, it’s clear that there won’t be a single vaccine recipe that’s ideal for everyone at once.

    Even Thomas Francis Jr. did not consider original antigenic sin to be a total negative, Hensley told me. According to Francis, the true issue with the “sin” was that humans were missing out on the chance to imprint on multiple strains at once in childhood, when the immune system is still a blank slate—something that modern researchers could soon accomplish with the development of universal vaccines. Our reliance on first impressions can be a drawback. But the same phenomenon can be an opportunity to acquaint the body with diversity early on—to give it a richer narrative, and memories of many threats to come.


    Katherine J. Wu
