ReportWire

Tag: people

  • Do Overdoses Look Different Now?

    Most likely, the person’s skin color will change. An ashy tone might creep in, or they could turn a shade of blue. If too much fluid pools in their mouth or lungs and mixes with air, foam will appear at their lips. There might be a sound, too—that of light snoring. These are some of the main symptoms of an overdose. Although the drug causing the reaction might be different, the symptoms look the same. “An overdose is an overdose,” Soma Snakeoil, a co-founder of the Sidewalk Project, a harm-reduction organization, told me.

But although overdose symptoms have not shifted, the ability to treat them has, most notably because of the availability of naloxone, the medication that can quickly reverse an overdose and that was approved in late March to be sold over the counter as Narcan. This move happened at least in part because, in the past few decades, the entire context of an overdose in the United States has changed. The U.S. has entered its fourth wave of the opioid crisis, and the death toll is different now: Overdoses have been steadily increasing for many years, but this wave, also known as the “era of overdoses,” has seen the highest number of fatal overdoses yet. “I think what makes this current crisis so unique is the volume” of overdoses, John Pamplin II, an epidemiologist at Columbia’s school of public health, told me. And that is happening because the drugs have changed too. “It’s not necessarily that more people are using drugs,” Emilie Bruzelius, an epidemiology researcher at Columbia’s school of public health, told me. “The opioids that people are using now are incredibly strong, and they’re more likely to cause an overdose.”

    The result is that any person using drugs has a higher chance of overdosing than ever before. “There’s no population segment that is insulated,” Bruzelius said. “It’s really affecting everybody now.”

    The origins of the opioid crisis can be traced back to 1999. As doctors prescribed opioids more and more—OxyContin prescriptions for non-cancer-related pain alone increased from about 670,000 in 1997 to 6.2 million in 2002—related deaths rose swiftly. In that same period, the number of deaths increased almost 30 percent, to nearly 9,000. This first wave largely affected white people: By 2010, the opioid mortality rate was more than two times higher for white people than Black people.

That year, a second wave began, in which overdose deaths involving heroin grew most dramatically. By 2015, heroin overdose deaths surpassed the number of deaths attributable to opioid pills. This time, the total opioid mortality rate grew for both Black and white populations; death rates increased by an average of at least 30 percent a year beginning in 2010, and accelerated even faster after 2013. In this same period, illicitly manufactured fentanyl—a synthetic opioid approved for pain relief—was being slipped into heroin, counterfeit pills, cocaine, and other drugs. Many of the people taking these drugs did not realize that they were taking fentanyl at all, leading to a third wave of overdoses. Mortality skyrocketed. In 2017, synthetic opioids were responsible for more than 28,000 deaths, while opioid-pill and heroin overdose deaths had leveled off at about 15,000. The demographics of the crisis continued to shift too: In 2020, death rates rose fastest among Black and Indigenous Americans, surpassing the death rate of white Americans, Pamplin told me.

    The new, fourth wave is characterized by more mixing of different drugs. “People are overdosing from cocaine and fentanyl or methamphetamines and fentanyl or methamphetamines and fentanyl and heroin,” Bruzelius told me. Recently, xylazine—a non-opiate sedative also known as “tranq”—has infiltrated the fentanyl supply, resulting in what the DEA has deemed the deadliest threat yet.

    This is the context in which the FDA approved Narcan to be sold over the counter. Narcan packages naloxone as a nasal spray, and the FDA argued that its approval could “help improve access to naloxone, increase the number of locations where it’s available, and help reduce overdose deaths throughout the country.” By binding to opioid receptors, naloxone blocks the effects of opiates in the system. This reverses the impact of an overdose, restoring normal breathing.

    But drug policies in America tend to swing, pendulum-like, from one extreme to the other, David Courtwright, a historian at the University of North Florida, told me: A response focused on care for drug users might give way to a more punitive policy. Already, some critics of Narcan’s availability have pushed to restrict its use on the grounds that an effective overdose treatment could encourage drug use—even though there’s “just no kind of scientific or empirical backing” for those arguments, Bruzelius said. Here, the simplest logic holds: If overdoses are affecting every community in America, better to have an accessible treatment everywhere.

    Zoya Qureshi

  • No One in Movies Knows How to Swallow a Pill

    There are two ways of taking pills—two and only two.

    You pinch the pill between your thumb and index finger, pick it up, and place it on your tongue. You take a drink of water. This method is the tweezers.

    Or else: You place the pill in your palm and launch it toward your mouth, as if your teeth were battlements and your arm a siege machine. Don’t bother with the water. This method is the catapult.

    In real-world situations, many people—let’s say most—make a habit of the tweezers. In the movies, the opposite is true. An on-screen pill bottle works like Chekhov’s gun: Eventually, its contents will be fired at an actor’s mouth, or smashed between his lips, or hurled into his gullet.

    Think of Austin Butler as the lead in Elvis, alone in his hotel room: He slaps those quaaludes in, liquid-free, sideburns tilted toward the ceiling. It’s a textbook movie swallow, the Stanislavski Fling. Butler got an Oscar nomination; so did Ellen Burstyn, popping diet pills in Requiem for a Dream. On Succession, Jeremy Strong and Kieran Culkin, each a two-time Emmy nominee, gobble meds on-screen. Going catapult is everywhere in cinema; it’s a gesture that befits the biggest stars. Angelina Jolie shoots her pills in Girl, Interrupted. So does Brittany Murphy. Jake Gyllenhaal catapults a pill in Donnie Darko. Albert Brooks in Modern Romance. In Goodfellas, Ray Liotta does it twice.

I love the movies! But it’s time we had a public-health announcement: The catapult is not, in fact, how a person should be taking pills. The act of swallowing a medication is so pervasive—and so intimate—that one easily forgets it is a skill that must be learned. In the U.S., roughly three-fifths of all adults are on prescription drugs; perhaps one-sixth will falter when they try to gulp them down. Twenty years ago, Bonnie Kaplan, a research psychologist at the University of Calgary, devised a new technique for helping people overcome this problem. Her method, as laid out in a mesmerizing video, suggests that you turn your head to make a pill go in. (No one has ever done this in a movie and no one ever will.) The turning motion helps open your upper esophageal sphincter, Kaplan says, though she does admit that more familiar postures have their own advantages. Some people like to raise their chins: “They say it is easier for the pill to slide down their throat, as if their tongue is a ski jump and it is a straight shot down the hill.” Others tip their heads the other way, chin-to-chest, “because they say it is more relaxing in the neck.”

    But on the all-important matter of the hand, Kaplan’s messaging is very clear: You pick up the pill between your fingers; then you place it on your tongue. Which is to say, you do the tweezers. Other training methods are consistent with this rule. One approach for teaching children, published in 1984, describes “correctly placing” a pill on the back of the tongue—which clearly cannot be accomplished via a whole-hand toss; another, from 2006, says to “place the pill on your tongue towards the back of your mouth.”

    That’s how people ought to take their pills. But how do people really do it, in real life? At the start of her research, Kaplan told me, she wasn’t telling takers what to do; she spent time observing how they liked to swallow medications on their own. The cinematic catapult was simply nonexistent in the wild, she said. “I never saw anyone just throw it back.” Never? Anyone? I asked Kaplan to describe the way she swallows pills herself, and she paused before she answered, as if she’d never really thought this through. “My husband and I both turn our heads to the right,” she said at last. First she’ll place the pill on the back of her tongue, and then she’ll twist and swallow. “But you know what?” she said. “I do often clap my hand to my mouth with my last pill or two.”

    “It’s very individual,” Cindy Corbett, a nursing-science professor at the University of South Carolina, told me. She’s on a team that uses smartwatch accelerometers to track patients’ adherence to their medication regimen. Their system knows when someone moves a hand up to their face, she told me, but it won’t distinguish how a pill is being held, or whether it is placed or flung into the mouth. (Indeed, the study’s four-step “protocol-guided medication-taking activity” includes this ambivalent instruction: “Place/toss pill to mouth.”) When I asked Corbett what she’s seen herself in this regard, as a clinician, she drew a blank. “I’ve never thought about it that much.”

    Maybe this is it: If you even have to think about the way you swallow pills, then you’re almost certainly someone who has trouble taking pills; and if you’re someone who has trouble taking pills, then you really should be taking pills in tweezer mode. In the off-screen world, to catapult is a privilege reserved for those with floppy throats. It’s the difference between the gags and the gag-nots. That inequality is only reinforced by the movieland fantasy of universal tossing, which sets up (as only Hollywood knows how) an impossible and unhealthy standard for behavior. Look, Elvis gobbles benzos; why can’t I? “People’s preconceived notions of how they’re supposed to swallow pills does lead to mental barriers,” says Marissa Harkness, a co-creator of the Pill Skills training kit, a case of sugar-based placebos made in different shapes and sizes.

When actors catapult on camera, they get the benefit of looking more dramatic: bigger gestures, more to see. But something more important is going on in movie swallows, a deeper meaning to the movement—an implied relationship of power. Taking pills by catapult suggests that you’re a victim, that your body and your mind are under siege. A hand that’s driven by compulsion fires drugs into the face. A teenage boy is pelted by his Prozac. But some stories need to have this flipped, so the pill can be a tool instead of an affliction. In Taxi Driver, Robert De Niro tweezers bennies. He’s a man on a mission. And the most famous pill-taking scene in movie history, from The Matrix, has Keanu Reeves pinch a pill between his thumb and index finger in dramatic close-up, and deposit it into his mouth. Then he drinks a glass of water. (Is that a movie first?) A character who tweezers is going on a journey, the film director John Magary told me. He’s curious. He’s in control. (From Magary’s films to date: two catapults, zero tweezers.)

    Perhaps the movies have this figured out. There are two ways of taking pills—two and only two. The tweezers or the catapult; self-knowledge or oblivion. In the end, the choice is yours.

    Daniel Engber

  • Long-Haulers Are Trying to Define Themselves

    Imagine you need to send a letter. The mailbox is only two blocks away, but the task feels insurmountable. Air hunger seizes you whenever you walk, you’re plagued by dizziness and headaches, and anyway, you keep blanking on your zip code for the return address. So you sit in the kitchen, disheartened by the letter you can’t send, the deadlines you’ve missed, the commitments you’ve canceled. Months have passed since you got COVID. Weren’t you supposed to feel better by now?

    Long COVID is a diverse and confusing condition, a new disease with an unclear prognosis, often-fluctuating symptoms, and a definition people still can’t agree on. And in many cases, it is disabling. In a recent survey, 1.6 percent of American adults said post-COVID symptoms limit their daily activities “a lot.” That degree of upheaval aligns with the Americans With Disabilities Act’s definition of disability: “a physical or mental impairment that substantially limits one or more major life activities.”

    But for many people experiencing long COVID who were able-bodied before, describing themselves as “disabled” is proving to be a complicated decision. This country is not kind to disabled people: American culture and institutions tend to operate on the belief that a person’s worth derives from their productivity and physical or cognitive abilities. That ableism was particularly stark in the early months of the pandemic, when some states explicitly de-prioritized certain groups of disabled people for ventilators. Despite the passage of the ADA in 1990, disabled people still confront barriers accessing things such as jobs and health care, and even a meal with friends at a restaurant. Most of our cultural narratives cast disability as either a tribulation to overcome or a tragedy.

    Consequently, incorporating disability into your identity can require a lot of reflection. Lizzie Jones, who finished her doctoral research in disability studies last year and now works for an educational consultancy, suffered a 30-foot fall that shattered half of her body a week before her college graduation. She told me that her accident prompted “radical identity shifts” as she transitioned from trying to get the life she’d imagined back on track to envisioning a new one.

    These are the sorts of mindset changes that Ibrahim Rashid struggled with after contracting COVID in November 2020, when he was a graduate student. He dealt with debilitating symptoms for months, but even after applying for disability accommodations to finish his degree, he “was so scared of that word,” he told me. Rashid was afraid of people treating him differently and of losing his internship offer. Most terrifying, calling himself disabled felt like an admission that his long COVID wasn’t going to suddenly resolve.

Aaron Teasdale, an outdoors and travel writer and a mountaineer, has also been wrestling with identity questions since he got COVID in January 2022. For months, he spent most of his time in a remote-controlled bed, gazing out the window at the Montana forests he once skied. Although his fatigue is now slowly improving, he had to take Ritalin to speak with me. He was still figuring out what being disabled meant to him, whether it simply described his current condition or reflected some new, deeper part of himself—a reckoning made more difficult by the unknowability of his prognosis. “Maybe I just need more time before I say I’m a disabled person,” he said. “When you have your greatest passions completely taken away from you, it does leave you questioning, Well, who am I?”

Long COVID can wax and wane, leaving people scrambling to adapt. It doesn’t mesh with the stereotype of disability as static, visible, and binary—the wheelchair user cast in opposition to the pedestrian. Nor does the stereotype account for the fact that long COVID is often imperceptible in casual interactions, which forces long-haulers to contend with disclosure and the possibility of passing as able-bodied. One such long-hauler is Julia Moore Vogel, a program director at Scripps Research, who initially hesitated at the idea of getting a disabled-parking permit. “My first thought was, I’m not disabled, because I can walk,” she told me. But if she did walk, she’d be drained for days. Taking her daughter to the zoo or the beach was out of the question.

    Once she got over her apprehension, identifying as disabled ended up feeling empowering. Getting that permit was “one of the best things I’ve done for myself,” Vogel told me. She could drive her kid to the playground, park nearby, and then sit and watch her play. After plenty of therapy and conversations with other disabled people, Rashid, too, came to embrace disability as part of his identity, so much so that he now speaks and writes about chronic illness.

    Usually, the community around a disease—including advocacy among those it disables—arises after scientists name it. Long COVID upended that order, because the term first spread through hashtags and support groups in 2020. Instead of doctors informing patients of whether their symptoms fit a certain illness, patients were telling doctors what symptoms their illness entailed. And there were a lot of symptoms: everything from life-altering neurocognitive problems and dizziness to a mild, persistent cough.

    As long-COVID networks blossomed online, members began seeking support from wider disability-rights communities, and contributing fresh energy and resources to those groups. People who’d fought similar battles for decades sometimes bristled at the greater political capital afforded to long-haulers, whose advocacy didn’t universally extend to other disabled people; for the most part, though, long-haulers were welcomed.

    Tapping into conversations among disabled people “has shown me that I’m simply not alone,” Eris Eady, a writer and an artist who works for Planned Parenthood, told me. Eady, who is queer and Black, found that long COVID interplayed with struggles they already faced on account of their identity. So they sought advice from disabled Black women about interdependence, mutual aid, and accessibility, as well as about being dismissed by doctors, an experience more prevalent among women and people of color.

Disabled communities have years of experience supporting people through identity changes. The writer and disability-justice organizer Leah Lakshmi Piepzna-Samarasinha told me that when she was newly disabled, she was dogged by heavy questions: Am I going to be able to make a living? Am I datable? Her isolation and fear dissipated only when she met other young disabled people, who taught her how to be creative in “hacking the world.”

    For long-haulers navigating these transitions for the first time, the process can be rocky. Rachel Robles, a contributor to The Long COVID Survival Guide, told me she spent her early months with long COVID “waking up every day and thinking, Okay, is this the day it’s left my body?” Conceiving of herself as disabled didn’t take away her long COVID. She didn’t stop seeing doctors and trying treatments. But thinking about accessibility did inspire her to return to gymnastics, which she’d quit decades earlier because of a heart condition. If she couldn’t lift her hands over her head sometimes, and if a dive roll would never be in her future, then so be it: Gymnastics could be about enjoying what her body could do, not yearning for what it couldn’t. Before she identified as disabled, returning to gymnastics “was something I would have never, ever imagined,” Robles said. And she never would have done it had she remained focused only on when she might recover.

    Hoping for improvement is a natural response to illness, especially one with a trajectory as uncertain as long COVID’s. But focusing exclusively on relinquished past identities or unrealized future ones can dampen our curiosity about the present. A better way to think about it is “What are the things you can do with the body that you have, and what are the things you might not know you can do yet?” Piepzna-Samarasinha said. “Who am I right now?”

    Lindsay Ryan

  • We’ve Had a Cheaper, More Potent Ozempic Alternative for Decades

    The Ozempic craze shows no signs of slowing. Demand for the drug, popularly used for weight loss, is so monumental that it is already changing the diet industry and spurring a “marketing bonanza” among the dozens of telehealth start-ups that now prescribe it. A highly public ad campaign from one start-up, Ro, banks on the drug’s simple premise: “A weekly shot to lose weight.”

    Never before has a weight-loss treatment been hyped this way and been able to deliver on its promise. Ozempic itself is technically a diabetes drug, but its active ingredient, semaglutide, has been approved by the FDA for weight loss under the brand name Wegovy, and can reduce a person’s body weight by up to 20 percent through a weekly injection. An even more powerful drug, known as tirzepatide, or Mounjaro, may soon be approved for weight loss, and a host of new medications are coming down the pipeline. All signs suggest that America is on the verge of a weight-loss revolution.

    But for people with obesity, semaglutide isn’t even the most effective weight-loss treatment around—not even close. Bariatric surgery, which has existed for many decades, is still significantly more potent. This class of procedures, which, broadly speaking, reconfigure the digestive system so people feel less hungry and more full, is considered to be the “gold standard” for treating obesity, Holly Lofton, an obesity-medicine physician at NYU, told me. Most people experience weight loss of 50 percent and, with one procedure, up to 80 percent, according to the Cleveland Clinic.

    Despite the impressive abilities of the new crop of weight-loss drugs—and bold assertions that such drugs could someday replace surgery outright—several doctors told me that surgery will likely continue to be the top-line treatment for obesity, even as the medications improve. People may seek out treatment with the new drugs because they’re so popular, but “long term, there will be an increase in surgery,” Shauna Levy, a professor specializing in bariatric surgery at Tulane University School of Medicine, told me. The new drugs, however potent, may be less a revolutionary fix for obesity and more a powerful tool for treating it—one of many that already exist.


Unlike semaglutide, bariatric surgery, first introduced in the 1950s, took several decades to become accepted by the medical community. Initial attempts made people so sick that, at times, the surgery had to be reversed. The term bariatric surgery refers to several different procedures that reshape the gastrointestinal tract so that it absorbs fewer nutrients, holds less food, or both. These days, the most commonly performed surgery is called a Roux-en-Y, which shrinks the stomach to the size of a walnut—so people need less food to feel satisfied—and then reconnects it to the small intestine in a Y shape, rather than linearly. This gastric bypass lets food circumvent most of the stomach, leaving fewer opportunities for the body to harvest nutrients. In another common procedure, surgeons sculpt the stomach into a banana-size “sleeve” and toss the rest; a third involves rerouting the intestines in a way that minimizes the area where calories can be absorbed.

    But bariatric surgery does more than shrink gastrointestinal real estate. It exerts a less visible but equally powerful effect on the many different hormones that control hunger. Some procedures remove the part of the gut that produces the “hunger hormone,” ghrelin, while the rerouting of food through a Roux-en-Y ramps up the release of “incretin” hormones that create the feeling of fullness after eating.

In a sense, the new weight-loss drugs are trying to re-create the effects of bariatric surgery: The success of these drugs is due to their ability to mimic the incretin hormones and get people to feel satisfied with less food. Semaglutide masquerades as the hormone GLP-1, whereas Mounjaro poses as both GLP-1 and GIP. But these are just two hormones; bariatric surgery “touches on multiple different hormones and different pathways” and, as such, is “more comprehensive,” Levy said. In one study, Mounjaro, considered the most powerful of the current crop of medications, led to 20 percent or more weight loss in 57 percent of people who took the highest dose—an impressive feat, but still a far cry from what is possible with surgery. Similarly, Ozempic and Mounjaro, both technically diabetes drugs, have powerful effects on blood-sugar levels over time, but many surgery patients “leave the hospital already in remission from their diabetes,” Levy said.

    In addition to sheer potency, surgery is also much more affordable than these weight-loss drugs. Unlike the drugs, bariatric surgery is covered by Medicare if the patient meets certain criteria, including having a BMI equal to or greater than 35 and at least one comorbidity related to obesity. Many private insurers cover it too, albeit to varying degrees. Out of pocket, surgery costs $15,000 to $25,000—not cheap, but still cheaper than shelling out more than $1,000 a month indefinitely. “The patient must understand that they have to continue taking medication forever,” Lofton said. People who stop taking semaglutide generally regain the weight they lost. Lofton told me about one patient who had to forgo rent just to pay for the drugs: Factoring in insurance, “you can pay for three months of medicine and then have surgery at the same price.”

    Neither treatment, of course, is without its potential downsides. Semaglutide can cause temporary but nasty side effects such as nausea, vomiting, and diarrhea—and though it is considered safe for treating obesity, long-term data on this usage span just two years. Because many surgeries are done laparoscopically—using only tiny incisions—mortality is vanishingly low, and many patients go home after two or three days; full recovery usually takes four to six weeks. In the long term, complications such as hernias, gallstones, and low blood sugar can develop.

But there’s a reason bariatric surgery has not led to a weight-loss revolution of the kind that now gets associated with semaglutide. Despite its dramatic effects, and obesity’s prevalence across America, only 1 percent of people eligible for surgery actually get it. People hesitate for many reasons, medical and otherwise, but the most pervasive issue is a lack of awareness that surgery is even a safe or realistic option for weight loss. Bariatric surgery is plagued by stigma even within the medical community: In the 1990s, it was dismissed as a “barbaric” way to address an issue that, many believed, could be treated with diet and exercise. “There are a lot of primary-care doctors who are not talking enough about surgery” because they were trained with that old mindset, Levy said. It doesn’t help that bariatric surgery hasn’t exactly been a media sensation, with few high-profile patient advocates beyond Al Roker and Mariah Carey. In contrast, stories of celebrities on weight-loss drugs abound. Unlike surgery, semaglutide has the potential to be taken recreationally.


    The advantages that surgery has over weight-loss drugs may change as the drugs become more potent and eventually cheaper. But for now, semaglutide won’t dramatically shift the way obesity is treated, doctors told me—in fact, these new drugs may act as a conduit to surgery itself. Levy predicts that their sheer popularity will trigger a brief dip in the bariatric-surgery rate, but as price remains an issue, and people with obesity are unable to reach their weight-loss goals on the drugs alone, “they may start opening their mind to surgery.”

    Certainly, in some patients, weight-loss drugs alone could lead to lasting weight loss. And they can benefit those who are overweight but don’t qualify for surgery. But more widely, these drugs will likely be used in tandem with bariatric surgery to produce more dramatic, longer-lasting results, experts told me. “I don’t see this as an either/or,” Fatima Cody Stanford, an obesity-medicine physician at Massachusetts General Hospital and Harvard Medical School, told me. “I see it as surgery plus medicine.”

Drugs can help fill in any gaps that surgery leaves behind. Weight can rebound after a procedure, because the body has a way of rebalancing itself; hormones that were tamped down by bariatric surgery, Stanford said, can “start to reemerge with a vengeance.” About a fifth of people, and perhaps even more, regain a significant amount of weight—15 percent or more—two to five years after surgery. All of the doctors I spoke with said that medication could be a powerful tool to prevent post-surgery weight rebounds—though to keep that weight off, the medication would still have to be taken in perpetuity. Stanford estimated that more than 90 percent of her patients are on weight-loss drugs after surgery—and not necessarily semaglutide; older medications often suffice. Drugs could also be used to help people prepare for surgery, Lofton said. Some doctors encourage patients to lose weight beforehand to decrease the risk of complications such as blood clots, heart attack, and infection.

    Despite the hype, weight-loss drugs were never a perfect treatment for obesity. Neither is bariatric surgery, for that matter. “It is not a cure,” Lofton told me. A cure, she explained, would ensure that hunger doesn’t return and that fat cells don’t get bigger, a hallmark of obesity: “We have nothing that does that”—not even more potent next-gen drugs will provide a permanent fix. But the effect of combining surgery and medication could come close, she said.

    That no cure for obesity exists is evidence of its complexity. All of the experts I spoke with pointed out that obesity has long been misunderstood as a failure of personal will, as laziness or gluttony. That misunderstanding has led to inadequate care: Many people who regain weight after a bariatric procedure are made to feel by their doctors like they “wasted the surgery,” even if human biology is to blame, Stanford said. Ozempic and other weight-loss medications frame obesity as a condition that can be treated with drugs—in other words, a disease. Patients on those medications may realize, “Hey, maybe it’s not just me being lazy this whole time—maybe there is science to it and an actual disease here,” said Levy. Collectively understanding obesity as an illness that exists alongside heart disease and cancer—diseases routinely treated with medication and surgery—instead of as a matter of personal inadequacy will have far more profound impacts on people with obesity than any drug alone.

    Yasmin Tayag

  • Long COVID Is Being Erased—Again

    Updated at 6:29 p.m. ET on April 21, 2023

Charlie McCone has been struggling with the symptoms of long COVID since he was first infected, in March 2020. Most of the time, he is stuck on his couch or in his bed, unable to stand for more than 10 minutes without fatigue, shortness of breath, and other symptoms flaring up. But when I spoke with him on the phone, he seemed cogent and lively. “I can appear completely fine for two hours a day,” he said. No one sees him in the other 22. He can leave the house to go to medical appointments, but normally struggles to walk around the block. He can work at his computer for an hour a day. “It’s hell, but I have no choice,” he said. Like many long-haulers, McCone is duct-taping himself together to live a life—and few see the tape.

    McCone knows 12 people in his pre-pandemic circles who now also have long COVID, most of whom confided in him only because “I’ve posted about this for three years, multiple times a week, on Instagram, and they’ve seen me as a resource,” he said. Some are unwilling to go public, because they fear the stigma and disbelief that have dogged long COVID. “People see very little benefit in talking about this condition publicly,” he told me. “They’ll try to hide it for as long as possible.”

    I’ve heard similar sentiments from many of the dozens of long-haulers I’ve talked with, and the hundreds more I’ve heard from, since first reporting on long COVID in June 2020. Almost every aspect of long COVID serves to mask its reality from public view. Its bewilderingly diverse symptoms are hard to see and measure. At its worst, it can leave people bed- or housebound, disconnected from the world. And although milder cases allow patients to appear normal on some days, they extract their price later, in private. For these reasons, many people don’t realize just how sick millions of Americans are—and the invisibility created by long COVID’s symptoms is being quickly compounded by our attitude toward them.

    Most Americans simply aren’t thinking about COVID with the same acuity they once did; the White House long ago zeroed in on hospitalizations and deaths as the measures to worry most about. And what was once outright denial of long COVID’s existence has morphed into something subtler: a creeping conviction, seeded by academics and journalists and now common on social media, that long COVID is less common and severe than it has been portrayed—a tragedy for a small group of very sick people, but not a cause for societal concern. This line of thinking points to the absence of disability claims, the inconsistency of biochemical signatures, and the relatively small proportion of severe cases as evidence that long COVID has been overblown. “There’s a shift from ‘Is it real?’ to ‘It is real, but …,’” Lekshmi Santhosh, the medical director of a long-COVID clinic at UC San Francisco, told me.

    Yet long COVID is a substantial and ongoing crisis—one that affects millions of people. However inconvenient that fact might be to the current “mission accomplished” rhetoric, the accumulated evidence, alongside the experience of long haulers, makes it clear that the coronavirus is still exacting a heavy societal toll.


    As it stands, 11 percent of adults who’ve had COVID are currently experiencing symptoms that have lasted for at least three months, according to data collected by the Census Bureau and the CDC through the national Household Pulse Survey. That equates to more than 15 million long-haulers, or 6 percent of the U.S. adult population. And yet, “I run into people daily who say, ‘I don’t know anyone with long COVID,’” says Priya Duggal, an epidemiologist and a co-lead of the Johns Hopkins COVID Long Study. The implication is that the large survey numbers cannot be correct; given how many people have had COVID, we’d surely know if one in 10 of our contacts was persistently unwell.

    But many factors make that unlikely. Information about COVID’s acute symptoms was plastered across our public spaces, but there was never an equivalent emphasis that even mild infections can lead to lasting and mercurial symptoms; as such, some people who have long COVID don’t even know what they have. This may be especially true for the low-income, rural, and minority groups that have borne the greatest risks of infection. Lisa McCorkell, a long-hauler who is part of the Patient-Led Research Collaborative, recently attended a virtual meeting of Bay Area community leaders, and “when I described what it is, some people in the chat said, ‘I just realized I might have it.’”

    Admitting that you could have a life-altering and long-lasting condition, even to yourself, involves a seismic shift in identity, which some people are understandably loath to make. “Everyone I know got Omicron and got over it, so I really didn’t want to concede that I didn’t survive this successfully,” Jennifer Senior, a friend and fellow staff writer at The Atlantic, who has written about her experience with long COVID, told me. Duggal mentioned an acquaintance who, after a COVID reinfection, can no longer walk the quarter mile to pick her kids up from school, or cook them dinner. But she has turned down Duggal’s offer of an appointment; instead, she is moving across the country for a fresh start. “That is common: I won’t call it ‘long COVID’; I’ll just change everything in my life,” Duggal told me. People who accept the condition privately may still be silent about it publicly. “Disability is often a secret we keep,” Laura Mauldin, a sociologist who studies disability, told me. One in four Americans has a disability; one in 10 has diabetes; two in five have at least two chronic diseases. In a society where health issues are treated with intense privacy, these prevalence statistics, like the one-in-10 figure for long COVID, might also intuitively feel like overestimates.

    Some long-haulers are scared to disclose their condition. They might feel ashamed for still being sick, or wary about hearing from yet another loved one or medical professional that there’s nothing wrong with them. Many long-haulers worry that they’ll be perceived as weak or needy, that their friends will stop seeing them, or that employers will treat them unfairly. Such fears are well founded: A British survey of almost 1,000 long-haulers found that 63 percent experienced overt discrimination because of their illness at least “sometimes,” and 34 percent sometimes regretted telling people that they have long COVID. “So many people in my life have reached out and said, ‘I’m experiencing this,’ but they’re not telling the rest of our friends,” McCorkell said.

    Imagine that you interact with 50 people on a regular basis, all of whom got COVID. If 10 percent are long-haulers, that’s five people who are persistently sick. Some might not know what long COVID is or might be unwilling to confront it. The others might have every reason to hide their story. “Numbers like 10 percent are not going to naturally present themselves in front of you,” McCone told me. Instead, “you’ll hear from 45 people that they are completely fine.”

    Illustration by Paul Spella / The Atlantic; Getty

    The same factors that stop people from being public about their condition—ignorance, denial, or concerns about stigma—also make them less likely to file for disability benefits. And that process is, to put it mildly, not easy. Applicants need thorough medical documentation; many long-haulers struggle to find doctors who believe their symptoms are real. Even with the right documents, applicants must hack their way through bureaucratic overgrowth, likely while fighting fatigue or brain fog. For these reasons, attempting to measure long COVID through disability claims is a profoundly flawed exercise. Even if people manage to apply, they face an average wait time of seven months and a two-in-three denial rate. McCone took six weeks to put an application together, and, despite having a lawyer and extensive medical documentation, was denied after one day. McCorkell knows many first-wavers—people who’ve had long COVID since March 2020—“who are just getting their approvals now.”

    An alternative source of data comes from the Census Bureau’s Current Population Survey, which simply asks working-age Americans if they have any of six forms of disability. Using that data, Richard Deitz, an economics-research adviser at the Federal Reserve Bank of New York, calculated that about 1.7 million more people now say they do than in mid-2020, reversing a years-long decline. These numbers are lower than expected if one in 10 people who gets COVID really does become a long-hauler, but the survey doesn’t directly capture many of the condition’s most common symptoms, such as fatigue, neurological problems beyond brain fog, and post-exertional malaise, where a patient’s symptoms get dramatically worse after physical or mental exertion. About 900,000 of the newly disabled people are also still working. David Putrino, who leads a long-COVID rehabilitation clinic at Mount Sinai, told me that many of his patients are refused the accommodations required under the Americans With Disabilities Act. Their employers won’t allow them to work remotely or reduce their hours, because, he said, “you look at them and don’t see an obvious disability.”


    Long COVID can also seem bafflingly invisible when people look at it with the wrong tools. For example, a 2022 study by National Institutes of Health researchers compared 104 long-haulers with 85 short-term COVID patients and 120 healthy people and found no differences in measures of heart or lung capacities, cognitive tests, or levels of common biomarkers—bloodstream chemicals that might indicate health problems. This study has been repeatedly used as evidence that long COVID might be fictitious or psychosomatic, but in an accompanying editorial, Aluko Hope, the medical director of Oregon Health and Science University’s long-COVID program, noted that the study exactly mirrors what long-haulers commonly experience: They undergo extensive testing that turns up little and are told, “Everything is normal and nothing is wrong.”

    The better explanation, Putrino told me, is that “cookie-cutter testing” doesn’t work—a problem that long COVID shares with other neglected complex illnesses, such as myalgic encephalomyelitis/chronic-fatigue syndrome and dysautonomia. For example, the NIH study didn’t consider post-exertional malaise, a cardinal symptom of both ME/CFS and long COVID; measuring it requires performing cardiopulmonary tests on two successive days. Most long-haulers also show spiking heart rates when asked to simply stand against a wall for 10 minutes—a sign of problems with their autonomic nervous system. “These things are there if you know where to look,” Putrino told me. “You need to listen to your patients, hear where the virus is affecting them, and test accordingly.”

    Contrary to popular belief, researchers have learned a huge amount about the biochemical basis of long COVID, and have identified several potential biomarkers for the disease. But because long COVID is likely a cluster of overlapping conditions, there might never be a singular blood test that “will tell you if you have long COVID 100 percent of the time,” Putrino said. The best way to grasp the scale of the condition, then, is still to ask people about their symptoms.

    Large attempts to do this have been relatively consistent in their findings: The U.S. Household Pulse Survey estimates that one in 10 people who’ve had COVID currently have long COVID; a large Dutch study put that figure at one in eight. The former study also estimated that 6 percent of American adults are long-haulers; a similar British survey by the Office for National Statistics estimated that 3 percent of the general population is. These cases vary widely in severity, and about one in five long-haulers is barely affected by their symptoms—but the remaining majority very much is. Another one in four long-haulers (or 4 million Americans) has symptoms that severely limit their daily activities. The others might, at best, wake every day feeling as if they haven’t had any rest, or feel trapped in an endless hangover. They might work or socialize when their tidal symptoms ebb, but only by making big compromises: “If I work a full day, I can’t also then make dinner or parent without significant suffering,” JD Davids, who has both long COVID and ME/CFS, told me.

    Some people do recover. A widely cited Israeli study of 1.9 million people used electronic medical records to show that most lingering COVID symptoms “are resolved within a year from diagnosis,” but such data fail to capture the many long-haulers who give up on the medical system precisely because they aren’t getting better or are done with being disbelieved. Other studies that track groups of long-haulers over time have found less rosy results. A French one found that 85 percent of people who had symptoms two months after their infection were still symptomatic after a year. A Scottish team found that 42 percent of its patients had only partially recovered at 18 months, and 6 percent had not recovered at all. The United Kingdom’s national survey shows that 69 percent of people with long COVID have been dealing with symptoms for at least a year, and 41 percent for at least two.

    The most recent data from the U.S. and the U.K. show that the total number of long-haulers has decreased over the past six months, which certainly suggests that people recover in appreciable numbers. But there’s a catch: In the U.K., the number of people who have been sick for more than a year, or who are severely limited by their illness, has gone up. A persistent pool of people is still being pummeled by symptoms—and new long-haulers are still joining the pool. This influx should be slower than ever, because Omicron variants seem to carry a lower risk of triggering long COVID, while vaccines and the drug Paxlovid can lower that risk even further. But though the odds of developing long COVID are now lower, more people are taking the gamble, because preventive precautions have been all but abandoned.

    Even if prevalence estimates were a tenth as big, that would still mean more than 1 million Americans are dealing with a chronic illness that they didn’t have three years ago. “When long COVID first came on the scene, everyone told us that once we have the prevalence numbers, we can do something about it,” McCorkell told me. “We got those numbers. Now people say, ‘Well, we don’t believe them. Try again.’”


    To a degree, I sympathize with some of the skepticism regarding long COVID, because the condition challenges our typical sense of what counts as solid evidence. Blood tests, electronic medical records, and disability claims all feel like rigorous lines of objective data. Their limitations become obvious only when you consider what the average long-hauler goes through—and those details are often cast aside because they are “anecdotal” and, by implication, unreliable. This attitude is backwards: The patients’ stories are the ground truth against which all other data must be understood. Gaps between the data and the stories don’t immediately invalidate the latter; they just as likely show the holes in the former.

    Laura Mauldin, the disability sociologist, argues that the U.S. is primed to discount those experiences because the country’s values—exceptionalism, strength, self-reliance—have created what she calls the myth of the able-bodied public. “We cannot accept that our bodies are fallible, or that disability is utterly ordinary and expected,” she told me. “We go to great pains to pretend as though that is not the case.” If we believe that a disabling illness like long COVID is rare or mild, “we protect ourselves from having to look at it.” And looking away is that much easier because chronic illnesses like long COVID are more likely to affect women—“who are more likely to have their symptoms attributed to psychological problems,” Mauldin said—and because the American emphasis on work ethic devalues people who can’t work as much or as hard as their peers.

    Other aspects of long COVID make it hard to grasp. Like other similar, neglected chronic illnesses, it defies a simplistic model of infectious disease in which a pathogen causes a predictable set of easily defined symptoms that alleviate when the bug is destroyed. It challenges our belief in our institutions, because truly contending with what long-haulers go through means acknowledging how poorly the health-care system treats chronically ill patients, how inaccessible social support is to them, and how many callous indignities they suffer at the hands of even those closest to them. Long COVID is a mirror on our society, and the image it reflects is deeply unflattering.

    Most of all, long COVID is a huge impediment to the normalization of COVID. It’s an insistent indicator that the pandemic is not actually over; that policies allowing the coronavirus to spread freely still carry a cost; that improvements such as better indoor ventilation are still wanting; that the public emergency may have been lifted but an emergency still exists; and that millions cannot return to pre-pandemic life. “Everyone wants to say goodbye to COVID,” Duggal told me, “and if long COVID keeps existing and people keep talking about it, COVID doesn’t go away.” The people who still live with COVID are being ignored so that everyone else can live with ignoring it.


    This article originally misstated the name of the bank where Richard Deitz works.


    Ed Yong


  • Adult ADHD Is the Wild West of Psychiatry



    In October, when the FDA first announced a shortage of Adderall in America, the agency expected it to resolve quickly. But five months in, the effects of the shortage are still making life tough for people with attention-deficit hyperactivity disorder who rely on the drug. Stories abound of frustrated people going to dozens of pharmacies in search of medication each month, only to come up short every time. Without treatment, students have had a hard time in school, and adults have struggled to keep up at work and maintain relationships. The shortage of brand-name Adderall has ended, but the widely used generic versions of the drug, known as amphetamine mixed salts, are still scarce.

    A “perfect storm” of factors—manufacturing delays, labor shortages, tight regulations—is to blame for the shortage, David Goodman, an ADHD expert and a psychiatry professor at the Johns Hopkins University School of Medicine, told me. And they have all been compounded by the fact that the pandemic produced a surge in Americans who want Adderall. The most dramatic changes occurred among adults, according to a recent CDC report on stimulant prescriptions, with increases in some age groups of more than 10 percent in just a single year, from 2020 to 2021. It’s the nature of the spike in demand for Adderall—among adults—that has some ADHD experts worried about “whether the demand is legitimate,” Goodman said. It’s possible that at least some of these new Adderall patients, he said, are getting prescriptions they do not need.

    The problem is that America has no standard clinical guidelines for how doctors should diagnose and treat adults with ADHD—a gap the CDC has called a “public health concern.” When people come in wanting help for ADHD, providers have “a lot of choices about what to use and when to use it, and those parameters have implications for good care or bad care,” Craig Surman, a psychiatry professor and an ADHD expert at Harvard and the scientific coordinator of adult-ADHD research at Massachusetts General Hospital, told me. The stimulant shortage will end, but even then, adults with ADHD may not get the care they need.

    For more than 200 years, symptoms related to ADHD—such as difficulty focusing, inability to sit still, and fidgeting—have largely been associated with children and teenagers. Doctors widely assumed that kids would grow out of it eventually. Although symptoms become “evident at a very early period of life,” one Scottish physician wrote in 1798, “what is very fortunate [is that] it is generally diminished with age.” For some people, ADHD symptoms really do get better as they enter adulthood, but for most, symptoms continue. The focus on children persists today in part because of parental pressure. Pediatricians have had to build a child-focused ADHD model, Surman said, because parents come in and say, “What are we going to do with our kid?” As a result, treating children ages 4 to 18 for ADHD is relatively straightforward: Clear-cut clinical guidelines from the American Academy of Pediatrics specify the need for rigorous psychiatric testing that rules out other causes and includes reports about the patient from parents and teachers. Treatment usually involves behavior management and, if necessary, medication.

    But there is no equivalent playbook for adults with ADHD in the U.S.—unlike in other developed nations, including the U.K. and Canada. In fact, the disorder was only recently acknowledged within the field of adult psychiatry. One reason it went overlooked for so long is because ADHD can sometimes look different in kids compared with adults: Physical hyperactivity tends to decrease with age as opposed to, say, emotional or organizational problems. “The recognition that ADHD is a life-span disorder that persists into adulthood in most people has really only happened in the last 20 years,” Margaret Sibley, a psychiatry professor at the University of Washington School of Medicine, told me. And the field of adult psychiatry has been slow to catch up. Adult ADHD was directly addressed for the first time in DSM-5—the American Psychiatric Association’s diagnostic bible—in 2013, but the criteria described there still haven’t been translated into practical instructions for clinicians.

    Addressing adult ADHD isn’t as simple as adapting children’s standards for grown-ups. A key distinction is that the disorder impairs different aspects of an adult’s life: Whereas a pediatrician would investigate ADHD’s impact at school or at home, a provider evaluating an adult might delve into its effects at work or in romantic relationships. Sources of information differ too: Parents and teachers can shed light on a child’s situation, but “you wouldn’t call the parent of a 40-year-old to get their take on whether the person has ADHD,” Sibley said. Providers usually rely instead on self-reporting—which isn’t always accurate. Complicating matters, the symptoms of ADHD tend to be masked by other cognitive issues that arise in adulthood, such as those caused by depression, drug use, thyroid problems, or hormonal shifts, Sibley said: “It’s a tough disorder to diagnose, because there’s no objective test.” The best option is to perform a lengthy psychiatric evaluation, which usually involves reviewing symptoms, performing a medical exam, taking the patient’s history, and assessing the patient using rating scales or checklists, according to the APA.

    Without clinical guidelines or an organizational body to enforce them, there is no pressure to uphold that standard. Virtual forms of ADHD care that proliferated during the pandemic, for example, were rarely conducive to lengthy evaluations. A major telehealth platform that dispensed ADHD prescriptions, Cerebral, has been investigated for sacrificing medical rigor for speedy treatment and customer satisfaction, potentially letting people without ADHD get Adderall for recreational use. In one survey, 97 percent of Cerebral users said they’d received a prescription of some kind. Initial consultations with providers lasted just half an hour, reported The Wall Street Journal; former employees feared that the company’s rampant stimulant-prescribing was fueling an addiction crisis. “It’s impossible to do a comprehensive psychiatric evaluation in 30 minutes,” Goodman said. (Cerebral previously denied wrongdoing and no longer prescribes Adderall or other stimulants.)

    The bigger problem is that too few providers are equipped to do those evaluations in the first place. Because adult ADHD was only recently recognized, most psychiatrists working today received no formal training in treating the disorder. “There’s a shortage of expertise,” Surman said. “It’s a confusing space where, at this point, consumers often are educating providers.” The dearth of trained professionals means that many adults seeking help for ADHD are seen by providers, including primary-care doctors, social workers, and nurse practitioners, who lack the experience to offer it. “It’s a systemic issue,” Sibley said, “not that they’re being negligent.”

    The lack of trained providers opens up the potential for inadequate or even dangerous care. Adderall is just one of many stimulants used to treat ADHD, and choosing the right one for a patient can be challenging—and not all people with ADHD need or want to take them. But even the most well-intentioned health-care professionals may be unprepared to evaluate patients properly. The federal government considers Adderall a highly addictive Schedule II drug, like oxycodone and fentanyl, and the risks of prescribing it unnecessarily are high: Apart from dependency, it can also cause issues such as heart problems, mood changes, anxiety, and depression. Some people with ADHD might be better off with behavioral therapy or drugs that aren’t stimulants. Unfortunately, it can be all too easy for inexperienced providers to start a patient on these drugs and continue treatment. “If I give stimulants to the average person, they’ll say their mood, their thinking, and their energy are better,” Goodman said. “It’s very important not to make a diagnosis based on the response to stimulant medication.” But the uptick in adults receiving prescriptions for those drugs since at least 2016 is a sign that this might be happening.

    The fact that adult ADHD is surging may soon lead to change. Last year, the American Professional Society of ADHD and Related Disorders began drafting the long-needed guidelines. The organization’s goal is to standardize care and treatment for adult ADHD across the country, said Goodman, who is APSARD’s treasurer. Establishing standards could have “broad, sweeping implications” beyond patient care, he added: Their existence could compel more medical schools to teach about adult ADHD, persuade insurance companies to cover treatment, and pressure lawmakers to include it in workplace policies.

    A way out of this mess, however long overdue, is only going to become even more necessary. Nearly 5 percent of adults are thought to have the disorder, but less than 20 percent of them have been diagnosed or have received treatment (compared with about 77 percent of children). “You have a much larger market of recognized and untreated adults, and that will continue to increase,” Goodman said. Women—who, like girls, are historically underdiagnosed—will likely make up a substantial share. Adults with ADHD may have suffered in silence in the past, but a growing awareness of the disorder, made possible by ongoing destigmatization, will continue to boost the ranks of people who want help. On social media, ADHD influencers abound, as do dedicated podcasts on Spotify.

    Until guidelines are published—and embedded into medical practice—the adult-ADHD landscape will remain chaotic. Some people will continue to get Adderall prescriptions they don’t need, and others may be unable to get an Adderall prescription they do need. Rules alone couldn’t have prevented the shortage, and they won’t stop it now. But in more ways than one, their absence means that many people who need help for ADHD are unable to receive it.


    Yasmin Tayag


  • Seltzer Is Torture



    I do not like carbonated beverages, plain and simple. I won’t drink soda, and you’ll never catch me with a beer. Gin and tonics are a no. Sparkling water? A beast in disguise. Oh, the cocktail is not that fizzy, you say? I’ve heard that one before. And get your slushie out of my face. As I said, I do not like carbonated beverages. I do not like them at all.

    I don’t just mean that they taste bad to me, the way soap or penicillin does. I mean that they hurt me. They inflict actual, physical pain on my mouth. The sensation is prickly, like having my tongue poked with hundreds of needles. On the handful of foolhardy occasions when I’ve dared take a sip of Coke, it’s felt like what I imagine sipping static electricity would feel like, at least until the pain subsides and I’m left with nothing but the hyper-saturated sweetness of a melted freezer pop. Even after I swallow, my mouth feels raw.

    When I try to explain this aversion, people sometimes struggle to wrap their mind around it. “Even sparkling cider?” they ask incredulously. “Even cream soda?” Yes, even sparkling cider. Yes, even cream soda. Occasionally, people try to relate: “Oh, I hate carbonation too, except in champagne.” Whatever these people mean by “hate” is clearly not the same thing I mean. The specifics of the drink make no difference to me. The carbonation itself is the problem.

    Part of me wonders whether this all traces back to an incident from my childhood. When I was 6 or 7 years old, I accidentally ate a piece of sushi covered in more wasabi than I’d bargained for and, in a panic, took a big gulp of water—except the water wasn’t water; it was seltzer, and I spit it all over the table. A couple of years later, I tried root beer at day camp and spat that out too. By that point, I’d pretty much learned my lesson.

    So why am I like this? It’s not as though my mouth is hypersensitive to all tastes and sensations. I pop Sour Skittles at the movies and have a pretty high spice tolerance. My issue is more specific and, given that Americans consume more than 40 gallons of soda a person each year, very rare. But apparently I’m not the only one: On Reddit’s r/unpopularopinion forum and others like it, never-fizzers find common cause. Drinking carbonated beverages is “kinda masochist.” It’s “pure agony.” It’s like “swallowing battery acid.” “I feel like I’m drinking flesh eating bacteria,” one Redditor writes. “I swear I thought I was the only one who thinks they hurt,” another replies.

    You can find dozens of posts like these online—so many, in fact, that you may begin to wonder: How many times can an unpopular opinion be posted before it ceases to qualify as an unpopular opinion? Scientists, for their part, have documented at least one instance of an anaphylactic reaction to sparkling water. That reaction was not caused by the bubbles themselves, but neither is carbonation’s distinctive mouthfeel. For a long time, people assumed that the fizzy sensation was just the tactile experience of having bubbles pop inside your mouth. Early suspicions to the contrary came from mountaineers, who reported that when they raised a toast at the summit, their bubbly champagne tasted flat. In 2013, researchers confirmed that the “bite” of carbonation is not dependent on bubbles: Even after drinking sparkling water in a pressure chamber, where bubbles cannot form, test subjects still reported feeling the slight “sting, burn, or pungency” associated with fizzy drinks, both on the tip of their tongue and at the back of their throat.

    The source of that bite, scientists determined, is the carbonic acid formed when enzymes in the mouth break down carbon dioxide. (That process happens to be inhibited by a medication commonly taken by mountaineers to stave off altitude sickness.) The acid activates pain receptors, Earl Carstens, a neurobiologist at UC Davis, told me, so the experience of drinking a carbonated beverage should be sharp and irritating for everyone. In that sense, the weird thing is not that some people hate carbonation; it’s that anyone likes it at all. Social conditioning may play a role: We accept the pain of drinking soda because we’re taught that it’s okay. Or perhaps the mild pain is associated with a pleasurable release of endorphins, as can occur when people eat a spicy food. Both of those factors are likely in play, Carstens said.

    But as my experience shows, not everyone experiences carbonic-acid pain the same way. Some people feel a refreshing tickle, others a chemical assault. No one knows why. Scientists have traced other aversions—to cilantro, for example, or tannic wines—to natural variations in human taste and smell receptors. “We are not at the same place in our knowledge of carbonation,” Emily Liman, a neurobiologist at the University of Southern California, told me. The problem faced by sodaphobes may yet turn out to have a genetic explanation, but for the moment, scientists don’t even understand exactly which cells are involved in the sensation. Pain receptors (such as the ones that detect spiciness) and taste cells (such as the ones that detect sourness) seem to play a part in feeling carbonation, Liman said, but it’s unclear exactly which cells contribute.

    In short, there’s no way to know whether I’m the victim of busted mouth biology, or of some long-repressed experience that bubbles up as oral pain, or of something else entirely. In any case, hating carbonation only means that I have to do a lot of polite declining. It’s not a huge deal, yet I sometimes find myself perturbed to be cut off from a whole sector of human experience, to dislike something that almost everyone else seems to like, and to dislike it not because of some contrarian impulse or principled objection but because of my physiology or my psychology. Best not to indulge such musings, though—they can easily give way to temptation. Last summer, after years of strict avoidance, I ordered a cider at a bar, thinking that maybe, after all these years, something had changed. Nope!

    Jacob Stern

  • Quit Your Bucket List


    Years ago, just after I finished my psychiatry residency, a beloved supervisor called to say she had some bad news. At a routine checkup, she had glanced at her chest X-ray up on the viewing box while waiting for her doctor to come into the room. She was a trauma surgeon before becoming a psychiatrist and had spent years reading chest X-rays, so she knew that the coin-size lesion she saw in her lung was almost certainly cancer, given her long history of smoking.

    We had dinner soon after. She was still more than two years away from the end of her life and felt physically fine—vital, even. That’s why I was so surprised when she said she had no desire to spend whatever time she had left on exotic travel or other new adventures. She wanted her husband, her friends, her family, dinner parties, and the great outdoors. “Just more Long Island sunsets. I don’t need Bali,” she told me.

    At the end of life, you might expect people to feel regret for all the things they wanted to do and never made time for. But I have yet to know a patient or friend who, facing the blunt fact of their own mortality, had anything close to a bucket list. This squares with some recent research that shows that people tend to prefer familiar experiences more when they are reminded that their days are limited. The people I know even regretted the novelty they’d chased along the way, whether it was recreational-drug use or dating exciting people who they knew weren’t relationship material.

    Deathbed pronouncements can have limited applications for the rest of life, but this pattern suggests that novelty is perhaps overrated. Chasing the high of new sensations simply isn’t appealing for many people, and can sometimes even be bad for our health. I suspect that’s because, too often, the pursuit of novelty requires sacrificing the things we already know we love.

    It’s a common misconception that people who don’t have a taste for the newest, sexiest experience are dull, incurious, and unimaginative. A 2002 study found that people will switch away from their favorite, habitual choices when they know others are watching, in order to avoid being judged as narrow-minded. And yet, Warren Buffett notoriously eats breakfast at the same fast-food restaurant every day and sticks to a strict work schedule. Taylor Swift’s music can be redundant and predictable. Barack Obama is famous for his strict morning exercise regimen and daily reading time.

    Even when they’re not facing death, many people just don’t seem to like novelty that much. In 2017, a poll by a British soup company found that 77 percent of U.K. workers had consumed the exact same lunch every day for nine months and that one in six people had done so for at least two years. You might think it’s just a matter of convenience or economic exigency (the study didn’t say), but I’m not so sure; wealthy people I know partake in similar behavior, even if they do it at a fancy restaurant. Consider, too, that when people lose a pet, many run out and get a replacement of the same breed with a similar temperament. They repeatedly date people with the same quirks and problems. They return to a favorite vacation spot. They listen to the same musical artists and styles time and again.

    Research shows that humans have an intrinsic preference for things and people they are familiar with, something called the mere exposure effect. Several studies have shown that people who listen to unfamiliar songs repeatedly grow fonder of the songs they hear most often by the end of the experiment, even if they did not initially like them very much. You don’t even have to be aware that you’re growing used to something for the effect to work.

    This tendency toward repetition may seem natural, even lazy, but it runs counter to much of our history. We, along with other animals, evolved to be exquisitely sensitive to novel experiences. Way back in the Paleolithic era, there was a clear survival advantage to being attuned to new situations, which could lead someone to a potential mate or a piece of mastodon, or reveal a deadly threat. Nowadays, though, with every conceivable reward—food, sex, drugs, emotional validation, you name it—just a click, tap, or ChatGPT query away, conventional novelty-seeking has lost much of its adaptive advantage.

    As Arthur Brooks has written in The Atlantic, novelty can be fun and exciting. New and unexpected experiences activate the brain’s reward pathway more powerfully than familiar ones, leading to greater dopamine release and a more intense sense of pleasure. But on its own, excitement won’t bring about enduring happiness. Human beings habituate rapidly to what is new. To achieve a lifetime of stimulation, you would have to embark on an endless search for the unfamiliar, which would inevitably lead to disappointment. Worse, the unfettered pursuit of novelty can lead to harm through excessive thrill-seeking—including antisocial behavior such as reckless driving—particularly when the novelty seeker has poor impulse control and a disregard for others.

    There’s a better way. Research shows that when novelty-seeking is paired with persistence, people are far more likely to be happy, probably because they are able to achieve something meaningful. You might, for example, take a variety of courses in college or try different summer internships if you’re not yet sure what interests you. When one really clicks, you should explore it in depth; it might even become a lifelong passion. This principle relates to less consequential pleasures, too: If you’re checking out a new neighborhood joint, consider ordering different things during your first few visits, then picking your favorite and sticking with it.

    Novelty-seeking is most valuable when you use it as a tool to discover the things and people you love—and once you find them, go deep and long with those experiences and relationships. The siren call that tells you there might be a new and better version of what you already have is likely an illusion, driven by your brain’s relentless reward pathway. When in doubt, pick a beloved activity over an unfamiliar one.

    This golden rule of novelty may help explain why some people at the end of their life regret having spent so much time exploring new things, even if they once brought fleeting pleasure. Age, too, might partly explain this feeling, because older people tend to be less open to new experiences. But that’s probably not the whole story. My colleagues who treat children and adolescents have mentioned that, in the face of life-threatening diagnoses, even young people prefer the familiar. They do so not only because the familiar is known and safe, but because it is more meaningful to them. After all, things become familiar to us because we choose them repeatedly—and we do that because they are deeply rewarding.

    Imagine, just for a moment, that your death is near. What might you miss out on if you put your bucket list on hold? Sure, you won’t make it to Bali or Antarctica. But maybe instead you could fit in one last baseball game with your kids, one last swim in the ocean, one last movie with your beloved, one last Long Island sunset. If you prioritize the activities and people you already love, you won’t reach the end of your life wishing you’d made more time for them.

    Richard A. Friedman

  • Ozempic Is About to Be Old News


    All of a sudden, Ozempic is everywhere. Its active ingredient, semaglutide, is a potent treatment for obesity, and Hollywood and TikTok celebrities have turned it into a sensation. In just a few months, the medication has been branded as “revolutionary” and “game-changing,” with the power to permanently alter society’s conceptions of fatness and thinness. Certainly, a drug like semaglutide could be all of those things: Never in the history of medicine has one so safely led to such dramatic weight loss in so many people.

    But let’s not get ahead of ourselves. As weight-loss medications go, Ozempic is far from perfect. Though the drug has profound impacts, it requires weekly injections, a tolerance for uncomfortable side effects, and the stamina—not to mention the budget—for long-term treatment. (Ozempic has somehow become a catchall term for semaglutide, but technically that product has gotten FDA sign-off only as a diabetes medication. A larger dose of semaglutide, marketed as Wegovy, has been approved for weight loss.)

    Made by the Danish drugmaker Novo Nordisk, semaglutide dominates the U.S. weight-loss market right now, but its reign might be short-lived. The colossal demand for these drugs has spurred a competition in the pharmaceutical industry to develop even more potent and powerful medications. The first of them could become available as soon as this summer. For all its hype, semaglutide is the stepping stone and not the final destination of a new class of obesity drugs. Just how good they get, and how quickly, will go a long way in determining whether this pharmaceutical revolution actually meets its full promise.

    In a sense, semaglutide hardly represents a major step forward in science. Diet drugs are nothing new, and even the category of pharmaceuticals that these new products belong to, called “GLP-1 agonists,” has been around for several years. These drugs mimic the hormone GLP-1 (glucagon-like peptide one) and bind to its receptor in the body. This triggers a sense of fullness associated with having just eaten, and also slows the release of food from the stomach. (It also increases insulin secretion, keeping blood sugar in check, which is why Ozempic is still intended as a diabetes drug.) Already, these pharmaceuticals have gotten better over time: A daily injection called liraglutide and sold as Saxenda, which was approved by the FDA in 2014 for obesity, leads to the loss of 5 to 10 percent of a person’s body weight in most cases. But one reason semaglutide took off in a way that liraglutide didn’t is that it can lead to weight loss of up to 20 percent. “Now you have a shot that’s once a week instead of every day, you’re making dramatic improvements, and people notice more,” Angela Fitch, the president of the Obesity Medicine Association and the chief medical officer of the obesity-care start-up Knownwell, told me.

    But not everyone who takes these drugs can achieve that level of weight loss. More than 60 percent of those on Wegovy experience smaller changes, in part because the drug can’t account for the complex drivers of obesity that aren’t related to food. The next generation of drugs is reaching for more. The first leap forward is Mounjaro, known generically as tirzepatide, a diabetes drug from Eli Lilly that the FDA is expected to approve for weight loss this year. In one study, it led to 20 percent or more weight loss in up to 57 percent of people who took the highest dose; The Wall Street Journal recently called it the “King Kong” of weight-loss drugs. People on Mounjaro tend to lose more weight more quickly and generally have a “better experience” than those on Wegovy, Keith Tapper, a biotech analyst at BMO Capital Markets, told me. It’s also cheaper, though by no means cheap, at roughly $980 for the highest-dose option, he said; a dose of Wegovy costs about $1,350.

    These leaps in potency are happening on the molecular level. Like semaglutide, Mounjaro mimics the effects of GLP-1, but it also hits receptors for another hormone—GIP. That leads to even more weight loss by further attenuating focus on food and potentially also increasing the activity of a fat-burning enzyme, said Tapper. So-called dual-agonist drugs “offer a step change” in both weight loss and blood-sugar control, he added.

    And why stop at two receptors when so many others are involved in regulating hunger? “This area is exploding in terms of research and testing different combinations of hormones,” which are still poorly understood, Shauna Levy, a professor specializing in bariatric surgery at Tulane University School of Medicine, told me. Eli Lilly has another drug in the works that targets three receptors; one from the drugmaker Amgen works by “putting the brakes” on the GIP receptor and “putting the gas” on GLP-1’s, a company spokesperson told me. Several other companies have already joined what some have dubbed a “race” to develop the next great obesity drug, in which Lilly, Pfizer, Amgen, Structure Therapeutics, and Viking Therapeutics are expected to be the front-runners, said Tapper.

    The potency of weight-loss drugs is not the only factor that will determine their trajectory. Wegovy and Mounjaro injections are tolerable for most people, but they are less convenient than a pill. Making oral versions of these drugs isn’t as easy as packing everything into a capsule, though. Semaglutide is a molecule that gets chewed up in the stomach. For this reason, the semaglutide pill Rybelsus, which is already approved for diabetes, leads to far less dramatic weight loss than its injectable kin. But drugmakers are undeterred by this complication, because a pill even more powerful than semaglutide would no doubt have many customers. In January, Pfizer’s CEO Albert Bourla said that an oral weight-loss drug “unlocks the market,” which he estimated could eventually be worth $90 billion. Pfizer doesn’t have any weight-loss drugs yet but is developing a twice-daily GLP-1 agonist pill; Eli Lilly also has an oral version in the works. Tapper expects those drugs to become available in 2026, and a similar offering from Structure Therapeutics is likely to follow a year later.

    Drugmakers will also likely vie to create drugs with fewer side effects. Novo Nordisk notes that gastrointestinal issues are common with semaglutide; accounts of horrible nausea, constipation, and vomiting have proliferated online. As one actor put it to New York Magazine, people on Ozempic are “shitting their brains out.” With Wegovy, more serious issues, such as pancreatitis, thyroid cancer, and kidney failure, are also possible but are considered rare. Although nothing to scoff at, side effects tend to subside with prolonged treatment and can usually be managed with help from a doctor, said both Fitch and Levy, who regularly prescribe semaglutide to patients with obesity. It’s possible, Levy added, that people experiencing really terrible effects may be getting their drugs from shady compounding pharmacies or even from other countries.

    The fact that people are turning to sketchy outlets to get weight-loss drugs underscores the biggest issue with them: access. Medicare and most private insurance companies don’t cover anti-obesity drugs. (Such drugs are classified as “cosmetic” by the Centers for Medicare and Medicaid Services, and thus don’t qualify for coverage.) “I am hopeful that the price will come down with more competition,” Fitch told me. But there’s no guarantee that will happen: Competition typically makes a product cheaper over time, but research suggests that isn’t always the case in pharmaceuticals. Even if the drugs do become cheaper, they may not become cheap enough. The oral forms of these drugs, some of which could be available by 2026, are expected to cost about $500 a month, Tapper said. By 2030, the cost of obesity drugs could come down to about $350 a month, according to a recent Morgan Stanley analysis, which would still be out of reach for many Americans.

    Levy estimates that the next five years will bring about a “huge explosion” of next-gen obesity drugs. In that case, the market will likely expand to accommodate a variety of drugs with different price points and efficacies. Some people may aim to lose 20 percent or more of their body weight; some may be content with less. The market is so diverse that it will likely “support a broad range of options,” said Tapper, such as cheaper, lower-dose oral drugs for people who have milder medical issues, and more expensive injectables for those with more severe medical concerns. That opens up the possibility that medically mediated weight loss could soon be an option for a far greater proportion of people.

    Regardless of how much these drugs’ costs may decrease, they will always add up if people are paying out of pocket for them. They are meant to be taken long term: Once a person stops taking Wegovy, the weight tends to come right back. The current crop of weight-loss medications is essentially a set of maintenance drugs, much like the cholesterol-busting drug Lipitor, which is taken daily to treat long-term disease. But Lipitor, unlike obesity drugs, is generally covered by insurance. Unless obesity drugs receive the same kind of coverage, no level of improvement will lead them to deliver on what Ozempic is promising us now.

    Yasmin Tayag

  • The Woolly-Mammoth Meatball Is an All-Time Great Food Stunt


    On Tuesday, two men at a museum in the Netherlands lifted a black sheet off a table to reveal a cantaloupe-size globe of overcooked meat perspiring under a bell jar. This was no ordinary spaghetti topper: It was a woolly-mammoth meatball, created by an Australian lab-grown-meat company called Vow.

    The meatball, made using real mammoth DNA, supposedly smelled like cooked crocodile meat, and in press photos, it looked oddly furry, like it had been coughed up by a cat or rolled around by a dung beetle. Still, meat from a long-extinct behemoth that lived during the Ice Age—how could I not want to try it? Although some on Twitter were clearly grossed out, many others were also intrigued. “Bet it tastes better than Ikeas,” one user wrote.

    Disappointingly, the meatball was not made for consumption. Because it contains proteins that haven’t been eaten in thousands of years, the scientists who made it aren’t sure it would be safe. It was a marketing ploy cooked up by a creative agency that worked with Vow. I eventually realized that I wanted the meatball for the same reasons I wanted the Doritos Locos Taco, KFC’s Double Down Sandwich, and Van Leeuwen’s ranch-flavored ice cream: sheer, dumb novelty. This was stunt marketing 101 applied to the future of food, and I was the sucker falling for it.

    Food marketers have made an art of using stunt foods to draw attention to brands and court new audiences. Starbucks’s unhinged Unicorn Frappuccino begged to be Instagrammed; Buffalo Wild Wings chicken coated with Mountain Dew–infused sauce pandered to anyone who has ever experienced the late-night munchies. Typically unexpected, funny, or edgy, stunt foods are “pure marketing,” Mark Lang, a marketing professor at the University of Tampa, told me. They work because they’re bonkers enough to break through the noise of social media and get people talking, he said. But so far, they have caught our attention by twisting familiar items. Lab-grown meat, and all the permutations of protein it makes possible, is pushing us into a new era of stunt marketing, one involving foods people may have never tried.

    George Pappou, Vow’s CEO and founder, told me that the meatball was meant to “start a conversation about the food that we’re going to eat tomorrow being different from the food that we eat today.” Although the stunt drew attention toward Vow—I am writing this, and you are reading this, after all—the company doesn’t have any products on the market yet, only plans to introduce lab-made Japanese quail to diners in Singapore later this year. So what did it accomplish, exactly? “I don’t think of this one so much as a stunt as a demonstration,” Lang said. “It’s an exaggeration of the physical capabilities of new science.”

    Because lab-grown meat is still meat, just without animal husbandry and slaughter, it’s often held up as the future of sustainable, ethical carnivory. Beef or chicken made in this way probably won’t be widely available at your grocery store anytime soon, but according to an estimate by McKinsey, the industry as a whole could be worth $25 billion by 2030. Lab-grown meat—or “cultivated” meat, as the industry likes to call it—is made by growing animal cells in a large tank until they form a sizable lump of tissue. Then it’s seasoned and processed in much the same way as conventional meat, forming foods such as patties, nuggets, and meatballs. Vow’s meatball was grown from sheep cells that were engineered to contain a short mammoth DNA sequence, sourced from publicly available data. As a result, the cells produced the mammoth version of myoglobin, a protein that contributes to the metallic, “meaty” taste of muscle.

    Theoretically, this process can be used to create meat from any animal whose cells are readily available or whose DNA has been sequenced. Think of DNA as essentially an IKEA manual for building tissue. Even animals whose sequences are incomplete can be partly resurrected: Gaps in the woolly-mammoth DNA were filled in using sequences from elephants, like using Billy-bookcase instructions to build a Kallax shelf. Growing the mammoth meat, in a relatively small amount, was “ridiculously easy and fast,” Ernst Wolvetang, a scientist who worked with Vow, told the Guardian. The same could eventually be said of any type of cultivated meat if the industry can surmount the significant cost and efficiency-related challenges involved in scaling up.

    Imagine the stunts that could be possible then: nuggets for every dinosaur in Jurassic Park, even human meatballs. Already, a few companies besides Vow are pursuing more exotic fare: The New York–based Primeval Foods plans to release cultivated lion burgers, ground meat, and sausages, followed by meat from giraffes and zebras, founder and CEO Yilmaz Bora told me. Diners are always looking for something new, so food “must go beyond the current beef, chicken, and pork dishes and come without the expense of nature and animals,” he said.

    Using stunt marketing to raise awareness about the potential of cultivated meat isn’t a guarantee that people will want to eat those products if they ever become widely available. Sometimes the creations are too gross to even consider seriously, such as Hellmann’s “mayo-nog” or Oscar Mayer’s “cold dogs,” which were, uh, hot-dog-flavored ice-cream wieners on a stick. But people don’t have the same frame of reference for a meatball made of cultivated mammoth meat that they do for those stunts. “The risk is that it’s off-putting,” Michael Cohen, a marketing professor at NYU, told me. Or enticing.

    If the mammoth meatball made you think They can do that?, then perhaps it will have done some good. If not, then it was, at the very least, a valid attempt to engage with the science. “The meatball thing was a very well-crafted marketing activity for a product”—lab-grown meat as a category—“that I think is going to have very low adoption,” Lang said. A majority of Americans have “food neophobia,” a reluctance to adopt new foods, he said; many don’t even eat seafood. Still, in the past five months, the FDA granted its first two approvals to lab-grown chicken products, clearing a regulatory pathway for even more cultivated goods. If the technology is ever able to scale, perhaps foods like mammoth meatballs will no longer be seen as a stunt. Eventually, they might just be dinner.

    Yasmin Tayag

  • Can Gravity Make People Sick?


    Bad things happen to a human body in zero gravity. Just look at what happens to astronauts who spend time in orbit: Bones disintegrate. Muscles weaken. So does immunity. “When you go up into space,” says Saïd Mekari, who studies exercise physiology at the University of Sherbrooke, in Canada, “it’s an accelerated model of aging.” Earthbound experiments mimicking weightlessness have revealed similar effects. In the 1970s, Russian scientists immersed volunteers in bathtubs covered in a large sheet of waterproof fabric, enabling them to float without being wet. In some of these studies, which lasted up to 56 days, subjects developed serious heart problems and struggled to control their posture and leg movements.

    Weightlessness hurts us because our bodies are fine-tuned to gravity as we experience it here on Earth. It tugs at us from birth to death, and still our intestines stay firmly coiled in their stack, blood flows upward, and our spine is capable of holding up our head. Unnatural contortions can throw things off: People have died from hanging upside down for too long. But as a general rule, the constant push of g-force on our body is a part of life that we rarely notice.

    Or at least, that’s what scientists have always thought. But there is another possibility: that gravity itself is making some people sick. A new, peer-reviewed theory suggests that the body’s relationship with gravity can go haywire, causing a disorder that has long been a troubling mystery: irritable bowel syndrome.

    This is a rogue idea that is far from widely accepted, though one that at least some experts say can’t be dismissed outright. IBS is a very common ailment, affecting up to an estimated 15 percent of people in the United States, and the symptoms can be brutal. People who have IBS experience abdominal pain and gas, feel bloated, and often have diarrhea, constipation, or both. But no exact cause of IBS has been pinned down. There’s evidence behind many competing theories, such as early-life stress, diet, and even gut infections, but none have emerged as the sole explanation. That is a problem for patients—it’s difficult to treat a condition when you don’t know what to target.

    Brennan Spiegel, a gastroenterologist at Cedars-Sinai Medical Center, in Los Angeles, has a different idea: People with IBS are hypersensitive to gravity as a result of any number of factors—stress, weight gain, a change in the gut microbiome, bad sleep patterns, or another behavior or injury. The idea came to him after watching a relative confined to a nursing-home bed develop classic symptoms of IBS. “We’re upright organisms,” he told me. “We’re not really supposed to be lying flat for that long.” The hypothesis, published late last year in The American Journal of Gastroenterology, is just that, a hypothesis. Spiegel hasn’t conducted any experiments or patient surveys that point to a “mismatch” in our body’s reaction to gravity as the cause of IBS, though the mechanics are all based in firm science. But part of what makes the theory so alluring is that it might encompass all of the other conventional explanations for the disease. “It’s meant to be a new way of thinking about old ideas,” he said.

    So exactly how would someone’s relationship with gravity get off-kilter? Consider serotonin, a chemical that carries messages from the brain to the body. Spiegel sees serotonin as an “anti-gravity substance” because of the role it plays in so many important bodily functions influenced by g-force, such as blood flow. Serotonin can cause blood vessels to narrow, slowing circulation. It can make certain muscles contract or relax. It’s also crucial to digestion, helping with bowel function, getting rid of irritating foods, and regulating how much we eat. Without serotonin, gravity would turn our intestines into a “flaccid sac,” Spiegel writes. Because 95 percent of the body’s serotonin is produced in the gut, if levels spike or plummet from factors such as stress, then the chemical’s possible handling of gravity would be thrown into chaos, affecting digestion. The result, he theorizes, is IBS.

    Other parts of our body that respond to gravity can also be in on the problem. We are hardwired to react negatively to situations in which the pull of gravity might harm us; walk to the edge of a cliff and your body will tell you something. The amygdala in our brain is key to fear responses, and stress of various kinds can cause it to go into overdrive. Spiegel thinks that when stress taxes the amygdala, a person begins overreacting to potential threats, including from gravity. The digestive issues that make up IBS are a manifestation of that overreaction. Sure enough, people with IBS have been shown to have a hyperactive amygdala.

    That is hardly proof. The thought that this painful and prolonged condition could be a gravity disorder is a major stretch, relying on a renegade interpretation of basic biology. “People just think I’m crazy,” Spiegel said. Many of his fellow doctors are not sold on the idea. The gravity hypothesis is another in a long parade of unconvincing theories about IBS, Emeran Mayer, a gastroenterologist at UCLA, told me. He’s heard them all: “It doesn’t exist; it’s a hysterical trait of neurotic housewives; it’s abnormal electrical activity in the colon.” He added, “I don’t think there’s any other disease that has gone through these peaks of attention-grabbing new theories.”

    Spiegel’s idea has clear holes. If a faulty reaction to gravity triggers IBS, says David C. Kunkel, a gastroenterologist at UC San Diego, then you would expect to see higher rates of IBS among populations living at sea level versus at high altitudes, where g-force is slightly weaker. But that doesn’t seem to be the case: About a quarter of Peruvians live high in the mountains and most Icelanders live at sea level, yet both countries have high rates of IBS. Likewise, IBS rates appear to decrease with age, “which would not be expected if the disease was caused by a constant gravitational force,” Kunkel told me.

    Spiegel is aware that the gravity hypothesis has little support in the field and no proof. But it does have some logic behind it. The fact that the weightlessness of space travel can drastically change the body lends credence to the idea that other shifts in our relationship to gravity could do the same, says Declan McCole, a biomedical scientist at UC Riverside.

    And the gut may be particularly sensitive to gravity changes. McCole has found that weightlessness made epithelial cells—which line the gut and stop invaders from entering the body—easier to evade. So if our internal chemistry can change in a way that makes us hypersensitive to gravity, then, to McCole, it stands to reason that such a shift could hit the gut hard. He’s less sure of whether that hypersensitivity exists. If it does, then why haven’t we identified any chemicals that help handle gravity, as we have for fear or sex drive or hunger? That molecule may indeed turn out to be serotonin, but right now there’s no proof.

    The gravity hypothesis really matters only if it is meaningful for people with IBS. And that’s not guaranteed. Tying the very real pain of IBS to such a fantastical idea may seem closer to mythology than medicine, leaving patients feeling dismissed or belittled. Or they may throw up their hands in despair and prepare for a lifetime of pain: If the immovable force of gravity is the enemy, then why bother fighting?

    But if there is some truth to it, then the hypothesis could also provide a possible starting place for treatments. Some of Spiegel’s suggestions are already common, such as weight loss and medications that decrease serotonin, but he also advocates for some gravity-specific therapies. “I do talk about it with my patients,” Spiegel said. “I recommend certain yoga poses; I recommend tilt tables.” People who have IBS may balk at his more radical ideas, such as moving to a higher altitude or farther from the equator.

    The gravity hypothesis may never be anything more than a hypothesis. We have a long way to go before truly knowing whether the human body can develop a hypersensitivity to gravity that can make us ill, or whether some of us are better equipped to handle gravity than others. But the weight of evidence is enough to make us think twice before ignoring the idea that our body’s relationship to gravity can go awry—including for those of us not coping with IBS. If gravity might contribute to IBS, why not other ailments too? And then, why can’t it also be harnessed for good? Mekari and his colleagues recently found that lying at a six-degree downward angle sped up response times to cognition tests—pointing to a possible link between gravity and executive functioning. Antigravity treadmills, which help astronauts prepare for weightlessness, are being studied for the treatment of cerebral palsy, Parkinson’s disease, and sports injuries.

    All of these unknowns about gravity can feel haunting. Life on Earth has changed a lot since its first forms appeared about 4 billion years ago, but through it all, gravity has seemingly remained constant—perhaps the single thing that connects every organism that has ever lived. What if there’s still much we have to learn about what it’s doing to us? After all, right now your body is coping with gravity, just as it has been for every other second of your life. Perhaps it would be weirder if gravity wasn’t doing anything to us over time. “Every fiber in our body is straining to manage this force,” Spiegel said. You don’t need to spend 56 days in a bathtub to figure that out.

    Jessica Wapner

  • Milk Has Lost All Meaning

    You overhear a lot of strange things in coffee shops, but an order for an “almond-based dairy-alternative cappuccino” is not one of them. Ditto a “soy-beverage macchiato” or an “oat-drink latte.” Vocalizing such a request elicited a confidence-hollowing glare from my barista when I recently attempted this stunt in a New York City café. To most people, plant-based milk is plant-based milk.

    But though the American public has embraced this naming convention, the dairy industry has not. For more than a decade, companies have sought to convince the FDA that plant-based products shouldn’t be able to use the M-word. An early skirmish played out in 2008 over the name “soy milk,” which, the FDA acknowledged at the time, wasn’t exactly milk; a decade later, then-FDA Commissioner Scott Gottlieb pointed out that nut milk shouldn’t be called “milk” because “an almond doesn’t lactate.” To be safe, some fake-milk products have stuck to vaguer labels such as “drink,” “beverage,” and “dairy alternative.”

    But a few weeks ago, the FDA signaled an end to the debate by proposing long-awaited naming recommendations: Plant-based milk, the agency said, could be called “milk” if its plant origin was clearly identified (for example, “pistachio milk”). In addition, labels could clearly state how the product differs nutritionally from regular milk. A package labeled “rice milk” would be acceptable, but it should note when the product has less calcium or vitamin D than milk.

    Rather than prompt a détente, these recommendations are sucking milk into an existential crisis. Differentiating plant-based milk and milk requires defining what milk actually is, but doing so is at odds with the acknowledgement that plant-based milk is milk. It is impossible to compare plant-based and cow’s milk if there isn’t a standard nutrient content for cow’s milk, which comes in a range of formulations. This awkward moment is the culmination of a decades-long shift in the way the FDA—and consumers—have come to think about and define food in general. At this point, it’s unclear what milk is anymore.

    Technically, milk has an official definition, as do more than 250 other foods, including ketchup and peanut butter. In 1973, the FDA came up with this: “The lacteal secretion, practically free from colostrum, obtained by the complete milking of one or more healthy cows.” (Yum.) The recent guidance doesn’t override this definition but doesn’t uphold it either, so milk’s status remains vague. The agency doesn’t seem to mind; consumers understand that plant-based milk isn’t dairy milk, a spokesperson told me. But the FDA has long allowed for loose interpretations of this standard, which is why the lacteal secretions of sheep and goats can be called “milk.” As time goes on, what can be called “milk” seems to matter less and less.

    At one point, names mattered. In the late 1800s, people began to worry that their food was no longer “normal and natural and pure,” Xaq Frohlich, a food historian at Auburn University who is writing a book on the history of the FDA’s food standards, told me. As food production scaled up in the late 19th century, so did attempts to cut corners with cheap products parading as the real thing, such as margarine made with beef tallow. In 1939, the FDA began establishing so-called standards of identity based on traditional ideas of food.

    But the agency’s food definitions were malleable even before oat milk. The agency hasn’t been very strict about standards of identity, because consumers haven’t either. Around the 1960s, as people became aware of the ills of animal fat and cholesterol—and purchased the low-fat and diet foods that proliferated in response—the agency moved away from defining the identity of food toward a policy of “informative labeling” that provided nutritional information directly on the package so consumers knew exactly what they were eating. It became accepted that food was something that could be “tinkered with,” Frohlich said, and what mattered more than whether something was natural was whether it was healthy. In the midst of this change, milk was assigned its official identity, which came with caveats for added vitamins. Loosely interpreted, “milk” soon came to encompass that of other ruminants, as well as chocolate, strawberry, skim, lactose-free, and calcium-fortified stuff.

    In this context, the FDA’s recent expansion of this standard to accommodate plant-based milk is to be expected; Frohlich doesn’t think the plant-based or dairy industries “are particularly surprised by this proposal.” Very little will change if the new guidance becomes policy. (The decision has to go through a public-comment period before the FDA issues the final word.) If anything, there may be more plant-based products labeled “milk” at the supermarket, and perhaps the new labels will stave off any potential confusion that occurs. Pointing out nutritional differences between plant-based and dairy milk on packaging, the FDA spokesperson said, is meant to address the “potential public-health concern” that people will mistakenly expect these products to be nutritional substitutes for each other. But the nutritional value of dairy milk varies depending on the type, and in some cases, the nutrients are added in. Milk is just confusing, and perhaps that’s okay. For most consumers, milk will continue to be milk—a white-ish fluid, sourced from a variety of plants and animals, and ever-evolving.

    Milk aside, for most modern consumers, what to call a food matters less than other factors, such as what it consists of, where it comes from, how it’s made, and its impact on the planet. “Public understandings of food have really changed since the early 21st century,” Charlotte Biltekoff, a professor of food science and technology at UC Davis, told me. In some cases, people don’t define food by what it is so much as what it does. Many plant-based milks, Biltekoff said, don’t look or taste much like dairy milk but are accepted as milk because they’re used in the same way: splashed in coffee, poured into cereal, or as an ingredient in baked goods. In short, trying to define food with a standard identity can’t capture “the full scope of how most people interact with food and health right now,” she said. A name—or, indeed, a label pointing out nutritional differences between dairy and plant-based milk—can encompass only a fraction of what people want to know about milk, all of which is beyond what the FDA can regulate, Biltekoff added. No wonder its name doesn’t seem to matter much anymore.

    That’s not to say that all food names will eventually become diffuse to the point of meaninglessness. It’s hard to imagine peanut referring to anything but the legume, but then again, a debate over what counted as “peanut butter” lasted for a decade in the ’60s and ’70s. Naming clashes, in all likelihood, will occur over staple foods that already attract a lot of scrutiny and are produced by powerful industries, such as eggs or meat. For example, Americans use the term meat flexibly: In addition to animal flesh, it can also refer to products made from plants, fungi, or even mammal cells grown in a lab. Just as the dairy and plant-based industries fueled the naming debate over milk, there will undoubtedly be pushback from those holding on to and breaking meat conventions: “You will see the meat industry make similar arguments” about what constitutes a hamburger or what lab-grown chicken can be named, Frohlich said.

    So long as technology keeps pushing the boundaries of what food can be, food names will continue to shift, and the results won’t always be neat. Someone can value natural foods plucked from farmers’ markets and served to them at farm-to-table restaurants but at the same time champion technological advances that make different versions of our foods possible. Such a person might exclusively eat free-range organic bacon but demand highly processed oat milk for their cortado. These inner conflicts are inevitable as we undergo what Biltekoff calls “a kind of evolution in our understanding of what good food is.” Milk, for now, remains fluid—simultaneously many things and nothing at all.

    Yasmin Tayag

  • The New Anarchy

    “Blood grows hot, and blood is spilled. Thought is forced from old channels into confusion. Deception breeds and thrives. Confidence dies, and universal suspicion reigns. Each man feels an impulse to kill his neighbor, lest he be first killed by him. Revenge and retaliation follow. And all this … may be among honest men only. But this is not all. Every foul bird comes abroad, and every dirty reptile rises up. These add crime to confusion.”

    — Abraham Lincoln, letter to the Missouri abolitionist Charles D. Drake, 1863

    I. ON THE BRINK

    In the weeks before Labor Day 2020, Ted Wheeler, the mayor of Portland, Oregon, began warning people that he believed someone would soon be killed by extremists in his city. Portland was preparing for the 100th consecutive day of conflict among anti-police protesters, right-wing counterprotesters, and the police themselves. Night after night, hundreds of people clashed in the streets. They attacked one another with baseball bats, Tasers, bear spray, fireworks. They filled balloons with urine and marbles and fired them at police officers with slingshots. The police lobbed flash-bang grenades. One man shot another in the eye with a paintball gun and pointed a loaded revolver at a screaming crowd. The FBI notified the public of a bomb threat against federal buildings in the city. Several homemade bombs were hurled into a group of people in a city park.

    Explore the April 2023 Issue

    Extremists on the left and on the right, each side inhabiting its own reality, had come to own a portion of downtown Portland. These radicals acted without restraint or, in many cases, humanity.

    In early July, when then-President Donald Trump deployed federal law-enforcement agents in tactical gear to Portland—against the wishes of the mayor and the governor—conditions deteriorated further. Agents threw protesters into unmarked vans. A federal officer shot a man in the forehead with a nonlethal munition, fracturing his skull. The authorities used chemical agents on crowds so frequently that even Mayor Wheeler found himself caught in clouds of tear gas. People set fires. They threw rocks and Molotov cocktails. They swung hammers into windows. Then, on the last Saturday of August, a 600-vehicle caravan of Trump supporters rode into Portland waving American flags and Trump flags with slogans like TAKE AMERICA BACK and MAKE LIBERALS CRY AGAIN. Within hours, a 39-year-old man would be dead—shot in the chest by a self-described anti-fascist. Five days later, federal agents killed the suspect—in self-defense, the government claimed—during a confrontation in Washington State.

    What had seemed from the outside to be spontaneous protests centered on the murder of George Floyd were in fact the culmination of a long-standing ideological battle. Some four years earlier, Trump supporters had identified Portland, correctly, as an ideal place to provoke the left. The city is often mocked for its infatuation with leftist ideas and performative politics. That reputation, lampooned in the television series Portlandia, is not completely unwarranted. Right-wing extremists understood that Portland’s reaction to a trolling campaign would be swift, and would guarantee the celebrity that comes with virality. When Trump won the presidency, this dynamic intensified, and Portland became a place where radicals would go to brawl in the streets. By the middle of 2018, far-right groups such as the Proud Boys and Patriot Prayer had hosted more than a dozen rallies in the Pacific Northwest, many of them in Portland. Then, in 2020, extremists on the left hijacked largely peaceful anti-police protests with their own violent tactics, and right-wing radicals saw an opening for a major fight.

    What happened in Portland, like what happened in Washington, D.C., on January 6, 2021, was a concentrated manifestation of the political violence that is all around us now. By political violence, I mean acts of violence intended to achieve political goals, whether driven by ideological vision or by delusions and hatred. More Americans are bringing weapons to political protests. Openly white-supremacist activity rose more than twelvefold from 2017 to 2021. Political aggression today is often expressed in the violent rhetoric of war. People build their political identities not around shared values but around a hatred for their foes, a phenomenon known as “negative partisanship.” A growing number of elected officials face harassment and death threats, causing many to leave politics. By nearly every measure, political violence is seen as more acceptable today than it was five years ago. A 2022 UC Davis poll found that one in five Americans believes political violence would be “at least sometimes” justified, and one in 10 believes it would be justified if it meant returning Trump to the presidency. Officials at the highest levels of the military and in the White House believe that the United States will see an increase in violent attacks as the 2024 presidential election draws nearer.

    In recent years, Americans have contemplated a worst-case scenario, in which the country’s extreme and widening divisions lead to a second Civil War. But what the country is experiencing now—and will likely continue to experience for a generation or more—is something different. The form of extremism we face is a new phase of domestic terror, one characterized by radicalized individuals with shape-shifting ideologies willing to kill their political enemies. Unchecked, it promises an era of slow-motion anarchy.

    Consider recent events. In October 2020, authorities arrested more than a dozen men in Michigan, many of them with ties to a paramilitary group. They were in the final stages of a plan to kidnap the state’s Democratic governor, Gretchen Whitmer, and possessed nearly 2,000 rounds of ammunition and hundreds of guns, as well as silencers, improvised explosive devices, and artillery shells. In January 2021, of course, thousands of Trump partisans stormed the U.S. Capitol, some of them armed, chanting “Where’s Nancy?” and “Hang Mike Pence!” Since then, the headlines have gotten smaller—or perhaps numbness has set in—but the violence has continued. In June 2022, a man with a gun and a knife who allegedly said he intended to kill Supreme Court Justice Brett Kavanaugh was arrested outside Kavanaugh’s Maryland home. In July, a man with a loaded pistol was arrested outside the home of Pramila Jayapal, the leader of the Congressional Progressive Caucus. She had heard someone outside shouting “Fuck you, cunt!” and “Commie bitch!” Days later, a man with a sharp object jumped onto a stage in upstate New York and allegedly tried to attack another member of Congress, the Republican candidate for governor. In August, just after the seizure of documents from Trump’s Mar-a-Lago home, a man wearing body armor tried to breach the FBI’s Cincinnati field office. He was killed in a shoot-out with police. In October, in San Francisco, a man broke into the home of Nancy Pelosi, then the speaker of the House, and attacked her 82-year-old husband with a hammer, fracturing his skull. In January 2023, a failed Republican candidate for state office in New Mexico who referred to himself as a “MAGA king” was arrested for the alleged attempted murder of local Democratic officials in four separate shootings. In one of the shootings, three bullets passed through the bedroom of a state senator’s 10-year-old daughter as she slept.

    Experts I interviewed told me they worry about political violence in broad regions of the country—the Great Lakes, the rural West, the Pacific Northwest, the South. These are places where extremist groups have already emerged, militias are popular, gun culture is thriving, and hard-core partisans collide during close elections in politically consequential states. Michigan, Wisconsin, Pennsylvania, Arizona, and Georgia all came up again and again.

    For the past three years, I’ve been preoccupied with a question: How can America survive a period of mass delusion, deep division, and political violence without seeing the permanent dissolution of the ties that bind us? I went looking for moments in history, in the United States and elsewhere, when society has found itself on the brink—or already in the abyss. I learned how cultures have managed to endure sustained political violence, and how they ultimately emerged with democracy still intact.

    Some lessons are unhappy ones. Societies tend to ignore the obvious warning signs of endemic political violence until the situation is beyond containment, and violence takes on a life of its own. Government can respond to political violence in brutal ways that undermine democratic values. Worst of all: National leaders, as we see today in an entire political party, can become complicit in political violence and seek to harness it for their own ends.

    II. SALAD-BAR EXTREMISM

    If you’re looking for a good place to hide an anarchist, you could do worse than Barre, Vermont. Barre (pronounced “berry”) is a small city in the bowl of a steep valley in the northern reaches of a lightly populated, mountainous state. You don’t just stumble upon a place like this.

    I went to Barre in October because I wanted to understand the anarchist who had fled there in the early 1900s, at the beginning of a new century already experiencing extraordinary violence and turbulence. The conditions that make a society vulnerable to political violence are complex but well established: highly visible wealth disparity, declining trust in democratic institutions, a perceived sense of victimhood, intense partisan estrangement based on identity, rapid demographic change, flourishing conspiracy theories, violent and dehumanizing rhetoric against the “other,” a sharply divided electorate, and a belief among those who flirt with violence that they can get away with it. All of those conditions were present at the turn of the last century. All of them are present today. Back then, few Americans might have guessed that the violence of that era would rage for decades.

    In 1901, an anarchist assassinated President William McKinley—shot him twice in the gut while shaking his hand at the Buffalo World’s Fair. In 1908, an anarchist at a Catholic church in Denver fatally shot the priest who had just given him Communion. In 1910, a dynamite attack on the Los Angeles Times killed 21 people. In 1914, in what officials said was a plot against John D. Rockefeller, a group of anarchists prematurely exploded a bomb in a New York City tenement, killing four people. That same year, extremists set off bombs at two Catholic churches in Manhattan, one of them St. Patrick’s Cathedral. In 1916, an anarchist chef dumped arsenic into the soup at a banquet for politicians, businessmen, and clergy in Chicago; he reportedly used so much that people immediately vomited, which saved their lives. Months later, a shrapnel-filled suitcase bomb killed 10 people and wounded 40 more at a parade in San Francisco. America’s entry into World War I temporarily quelled the violence—among other factors, some anarchists left the country to avoid the draft—but the respite was far from total. In 1917, a bomb exploded inside the Milwaukee Police Department headquarters, killing nine officers and two civilians. In the spring of 1919, dozens of mail bombs were sent to an array of business leaders and government officials, including Supreme Court Justice Oliver Wendell Holmes.

    All of this was prologue. Starting late in the evening on June 2, 1919, in a series of coordinated attacks, anarchists simultaneously detonated massive bombs in eight American cities. In Washington, an explosion at the home of Attorney General A. Mitchell Palmer blasted out the front windows and tore framed photos off the walls. Palmer, in his pajamas, had been reading by his second-story window. He happened to step away minutes before the bomb went off, a decision that authorities believed kept him alive. (His neighbors, the assistant secretary of the Navy and his wife, Franklin and Eleanor Roosevelt, had just gotten home from an evening out when the explosion also shattered their windows. Franklin ran over to Palmer’s house to check on him.) The following year, a horse-drawn carriage drew up to the pink-marble entrance of the J. P. Morgan building on Wall Street and exploded, killing more than 30 people and injuring hundreds more.

    From these episodes, one name leaps out across time: Luigi Galleani. Galleani, who was implicated in most of the attacks, is barely remembered today. But he was, in his lifetime, one of the world’s most influential terrorists, famous for advancing the argument for “propaganda of the deed”: the idea that violence is essential to the overthrow of the state and the ruling class. Born in Italy, Galleani immigrated to the United States and spread his views through his anarchist newspaper, Cronaca Sovversiva, or “Subversive Chronicle.” He told the poor to seize property from the rich and urged his followers to arm themselves—to find “a rifle, a dagger, a revolver.”

    Galleani fled to Barre in 1903 under the name Luigi Pimpino after several encounters with law enforcement in New Jersey. He attracted disciples—“Galleanisti,” they were called—despite shunning all forms of organization and hierarchy. He was quick-witted, with an imposing intellect and a magnetic manner of speaking. Even the police reports described his charisma.

    Left: Mug shot of the anarchist leader Luigi Galleani, 1919. Right: The aftermath of the Wall Street bombing outside the J. P. Morgan building, 1920. (Paul Spella; source images: Paul Avrich Collection, Rare Book and Special Collections, Library of Congress; Bettmann / Getty)

    The population of Barre today is slightly smaller than it was in Galleani’s day—roughly 10,000 then, 8,500 now—and it is the sort of place that is more confused by the presence of strangers than wary of them. The first thing you notice when you arrive is the granite. There is a mausoleum feel to any granite city, and on an overcast day the gray post-office building on North Main Street gives the illusion that all of the color has suddenly vanished from the world. Across the street, at city hall, I wandered into an administrative office where an affable woman—You came to Barre? On purpose?—generously agreed to take me inside the adjacent opera house, which, recently refurbished, looks much as it did on the winter night in 1907 when Galleani appeared there before a packed house to give a speech alongside the anarchist Emma Goldman.

    Galleani almost certainly could have disappeared into Barre with his wife and children and gotten away with it. He did not want that. In his own telling, Galleani’s anger was driven by how poorly the working class was treated, particularly in factories. In Barre, granite cutters spent long hours mired in the sludge of a dark, unheated, and poorly ventilated workspace, breathing in silica dust, which made most of them gravely ill. Seeing the town, even a century after Galleani was there, I could understand why his time in Vermont had not altered his worldview. In the foreword to a 2017 biography, Galleani’s grandson, Sean Sayers, put a hagiographic gloss on Galleani’s legacy: “He was not a narrow and callous nihilist; he was a visionary thinker with a beautiful idea of how human society could be—an idea that still resonates today.” For Galleani and other self-identified “communist anarchists” like him, the beautiful idea was a world without government, without laws, without property. Other anarchists did not share his idealism. The movement was torn by disagreements—they were anarchists, after all.

    In Galleani’s day, as in our own, the lines of conflict were not cleanly delineated. American radicalism can be a messy stew of ideas and motivations. Violence doesn’t need a clear or consistent ideology and often borrows from several. Federal law-enforcement officials use the term salad-bar extremism to describe what worries them most today, and it applies just as aptly to the extremism of a century ago.

    When Galleani had arrived in America, he’d encountered a nation in a terrible mood, one that would feel familiar to us today. Galleani’s children were born into violent times. The nation was divided not least over the cause of its divisions. The gap between rich and poor was colossal—the top 1 percent of Americans possessed almost as much wealth as the rest of the country combined. The population was changing rapidly. Reconstruction had been defeated, and southern states in particular remained horrifically violent toward Black people, for whom the threat of lynching was constant. The Great Migration was just beginning. Immigration surged, inspiring intense waves of xenophobia. America was primed for violence—and to Galleani and his followers, destroying the state was the only conceivable path.

    The spectacular violence of 1919 and 1920 proved a catalyst. A concerted nationwide hunt for anarchists began. This work, which culminated in what came to be known as the Palmer Raids, entailed direct violations of the Constitution. In late 1919 and early 1920, a series of raids—carried out in more than 30 American cities—led to the warrantless arrests of 10,000 suspected radicals, mostly Italian and Jewish immigrants. Attorney General Palmer’s dragnet ensnared many innocent people and has become a symbol of the damage that overzealous law enforcement can cause. Hundreds of people were ultimately deported. Some had fallen afoul of a harsh new federal immigration law that broadly targeted anarchists. One of them was Luigi Galleani. “The law was kind of designed for him,” Beverly Gage, a historian and the author of The Day Wall Street Exploded, told me.

    The violence did not stop immediately after the Palmer Raids—in an irony that frustrated authorities, Galleani’s deportation made it impossible for them to charge him in the Wall Street bombing, which they believed he planned, because it occurred after he’d left the country. Nevertheless, sweeping action by law enforcement helped put an end to a generation of anarchist attacks.

    That is the most important lesson from the anarchist period: Holding perpetrators accountable is crucial. The Palmer Raids are remembered, rightly, as a ham-handed application of police-state tactics. Government actions can turn killers into martyrs. More important, aggressive policing and surveillance can undermine the very democracy they are meant to protect; state violence against citizens only validates a distrust of law enforcement.

    But deterrence conducted within the law can work. Unlike anti-war protesters or labor organizers, violent extremists don’t have an agenda that invites negotiation. “Today’s threats of violence can be inspired by a wide range of ideologies that themselves morph and shift over time,” Deputy Homeland Security Adviser Josh Geltzer told me. Now as in the early 20th century, countering extremism through ordinary debate or persuasion, or through concession, is a fool’s errand. Extremists may not even know what they believe, or hope for. “One of the things I increasingly keep wondering about is—what is the endgame?” Mary McCord, a former assistant U.S. attorney and national-security official, told me. “Do you want democratic government? Do you want authoritarianism? Nobody talks about that. Take back our country. Okay, so you get it back. Then what do you do?”

    III. CREEPING VIOLENCE

    In another country, and in a time closer to our own, a sustained outbreak of domestic terrorism brought decades of attacks—and illustrates the role that ordinary citizens can sometimes play, along with deterrence, in restoring stability.

    On Saturday, August 2, 1980, a bomb hidden inside a suitcase blew up at the Bologna Centrale railway station, killing 85 people and wounding hundreds more, many of them young families setting off on vacation. The explosion flattened an entire wing of the station, demolishing a crowded restaurant, wrecking a train platform, and freezing the station’s clock at the time of the detonation: 10:25 a.m.

    The Bologna massacre remains the deadliest attack in Italy since World War II. By the time it occurred, Italians were more than a decade into a period of intense political violence, one that came to be known as Anni di Piombo, or the “Years of Lead.” From roughly 1969 to 1988, Italians experienced open warfare in the streets, bombings of trains, deadly shootings and arson attacks, at least 60 high-profile assassinations, and a narrowly averted neofascist coup attempt. It was a generation of death and bedlam. Although exact numbers are difficult to come by, during the Years of Lead, at least 400 people were killed and some 2,000 wounded in more than 14,000 separate attacks.

    As I sat at the Bologna Centrale railway station in September, a place where so many people had died, I found myself thinking, somewhat counterintuitively, about how, in the great sweep of history, the political violence in Italy in the 1970s and ’80s now seems but a blip. Things were so terrible for so long. And then they weren’t. How does political violence come to an end?

    No one can say precisely what alchemy of experience, temperament, and circumstance leads a person to choose political violence. But being part of a group alters a person’s moral calculations and sense of identity, not always for the good. Martin Luther King Jr., citing the theologian Reinhold Niebuhr, wrote in his “Letter From Birmingham Jail” that “groups tend to be more immoral than individuals.” People commit acts together that they’d never contemplate alone.

    Vicky Franzinetti was a teenage member of the far-left militant group Lotta Continua during the Years of Lead. “There was a lot of what I would call John Wayneism, and a lot of people fell for that,” she told me. “Whether it’s the Black Panthers or the people who attacked on January 6 on Capitol Hill, violence has a mesmerizing appeal on a lot of people.” A subtle but important shift also took place in Italian political culture during the ’60s and ’70s as people grasped for group identity. “If you move from what you want to who you are, there is very little scope for real dialogue, and for the possibility of exchanging ideas, which is the basis of politics,” Franzinetti said. “The result is the death of politics, which is what has happened.”

    In talking with Italians who lived through the Years of Lead about what brought this period to an end, two common themes emerged. The first has to do with economics. For a while, violence was seen as permissible because for too many people, it felt like the only option left in a world that had turned against them. When the Years of Lead began, Italy was still fumbling for a postwar identity. Some Fascists remained in positions of power, and authoritarian regimes controlled several of the country’s neighbors—Greece, Portugal, Spain, Turkey. Not unlike the labor movements that arose in Galleani’s day, the Years of Lead were preceded by intensifying unrest among factory workers and students, who wanted better social and working conditions. The unrest eventually tipped into violence, which spiraled out of control. Leftists fought for the proletariat, and neofascists fought to wind back the clock to the days of Mussolini. When, after two decades, the economy improved in Italy, terrorism receded.

    The second theme was that the public finally got fed up. People didn’t want to live in terror. They said, in effect: Enough. Lotta Continua hadn’t resorted to violence in the early years. When it did grow violent, it alienated its own members. “I didn’t like it, and I fought it,” Franzinetti told me. Simonetta Falasca-Zamponi, a sociology professor at UC Santa Barbara who lived in Rome at the time, recalled: “It went too far. Really, it reached a point that was quite dramatic. It was hard to live through those times.” But it took a surprisingly long while to reach that point. The violence crept in—one episode, then another, then another—and people absorbed and compartmentalized the individual events, as many Americans do now. They did not understand just how dangerous things were getting until violence was endemic. “It started out with the kneecappings,” Joseph LaPalombara, a Yale political scientist who lived in Rome during the Years of Lead, told me, “and then got worse. And as it got worse, the streets emptied after dark.”

    A turning point in public sentiment, or at least the start of a turning point, came in the spring of 1978, when the leftist group known as the Red Brigades kidnapped the former prime minister and leader of the Christian Democrats Aldo Moro, killing all five members of his police escort and turning him into an example of how We don’t negotiate with terrorists can go terrifically wrong. Moro was held captive and tortured for 54 days, then executed, his body left in the back of a bright-red Renault on a busy Rome street. In a series of letters his captors allowed him to send, Moro had begged Italian officials to arrange for his freedom with a prisoner exchange. They refused. After his murder, the final letter he’d written to his wife, “my dearest Noretta,” roughly 10 days before his death, was published in a local newspaper. “In my last hour I am left with a profound bitterness at heart,” he wrote. “But it is not of this I want to talk but of you whom I love and will always love.” Moro did not want a state funeral, but Italy held one anyway.

    Top: A bodyguard slain by the Red Brigades during the kidnapping of former Italian Prime Minister Aldo Moro, 1978. Bottom: Graffiti in Milan supporting the Red Brigades, 1977. (Paul Spella; source images: Gianni Giansanti / Gamma-Rapho / Getty; Adriano Alecchi / Mondadori Portfolio / Getty)

    The conventional wisdom among terrorism experts had been that terrorists wanted publicity but didn’t really want to kill people—or, as the Rand Corporation’s Brian Jenkins put it in 1975, “Terrorists want a lot of people watching, not a lot of people dead.” But conditions had become so bad by the time Moro was murdered that newspapers around the world were confused when days passed without a political killing or shooting in Italy. “Italians Puzzled by 10-Day Lull in Terrorist Activity,” read one headline in The New York Times a few weeks after Moro’s murder. “When he was killed, it got a lot more serious,” Alexander Reid Ross, who hosts a history podcast about the era called Years of Lead Pod, told me. “People stopped laughing. It was no longer something where you could say, ‘It’s a sideshow.’ ”

    The Moro assassination was followed by an intensification of violence, including the Bologna-station bombing. People who had ignored the violence now paid attention; people who might have been tempted by revolution now stayed home. Meanwhile, the crackdown that followed—which involved curfews, traffic stops, a militarized police presence, and deals with terrorists who agreed to rat out their collaborators—caused violent groups to implode.

    The example of Aldo Moro offers a warning. It shouldn’t take an act like the assassination of a former prime minister to shake people into awareness. But it often does. William Bernstein, the author of The Delusions of Crowds, is not optimistic that anything else will work: “The answer is—and it’s not going to be a pleasant answer—the answer is that the violence ends if it boils over into a containable cataclysm.” What if, he went on—“I almost hesitate to say this”—but what if they actually had hanged Mike Pence or Nancy Pelosi on January 6? “I think that would have ended it. I don’t think it ends without some sort of cathartic cataclysm. I think, absent that, it just boils along for a generation or two generations.” Bernstein wasn’t the only expert to suggest such a thing.

    No wonder some American politicians are terrified. “We’ve had an exponential increase in threats against members of Congress,” Senator Amy Klobuchar, a Democrat from Minnesota, told me in January. Klobuchar thought back to when she was standing at President Joe Biden’s inauguration ceremony, two weeks after the attempted insurrection. At the time, as Democrats and most Republicans came together for a peaceful transfer of power, she felt as though a violent eruption in American history might be ending. But Klobuchar now believes she was “naive” to think that Republicans would break with Trump and restore the party’s democratic values. “We have Donald Trump, his shadow, looming over everything,” she said.

    This past February, Biden sought to dispel that shadow as he stood before Congress to deliver his State of the Union address. “There’s no place for political violence in America,” he said. “And we must give hate and extremism in any form no safe harbor.” Biden’s speech was punctuated by jeers and name-calling by Republicans.

    IV. A BROKEN SOCIAL CONTRACT

    The taxonomy of what counts as political violence can be complicated. One way to picture it is as an iceberg: The part that protrudes from the water represents the horrific attacks on both hard targets and soft ones, in which the attacker has explicitly indicated hatred for the targeted group—fatal attacks at supermarkets and synagogues, as well as assassination attempts such as the shooting at a congressional-Republican baseball practice in 2017. Less visible is the far more extensive mindset that underlies them. “There are a lot of people who are out for a protest, who are advocating for violence,” Erin Miller, the longtime program manager at the University of Maryland’s Global Terrorism Database, told me. “Then there’s a smaller number at the tip of the iceberg that are willing to carry out violent attacks.” You can’t get a grip on political violence just by counting the number of violent episodes. You have to look at the whole culture.

    A society’s propensity for political violence—including cataclysmic violence—may be increasing even as ordinary life, for many people, probably most, continues to feel normal. A drumbeat of violent attacks, by different groups with different agendas, may register as different things. But collectively, as in Italy, they have the power to loosen society’s screws.

    In December, I spoke again with Alexander Reid Ross, who in addition to hosting Years of Lead Pod is a lecturer at Portland State University. We met in Pioneer Courthouse Square, in downtown Portland. I had found the city in a wounded condition. This was tragic to me two times over—first, because I knew what had happened there, and second, because I had immediately absorbed Portland’s charm. You can’t encounter all those drawbridges, or the swooping crows, or the great Borgesian bookstore, or the giant elm trees and do anything but fall in love with the place. But downtown Portland was not at its best. The first day I was there I counted more birds than people, and many of the people I saw were quite obviously struggling badly.

    On the gray afternoon when we met, Ross and I happened to be sitting at the site of the first far-right protest he remembers witnessing in his city, back in 2016; members of a group called Students for Trump, stoked by Alex Jones’s disinformation outlet, Infowars, had gathered to assert their political preferences and provoke their neighbors. Ross is a geographer, a specialty he assumed would keep him focused on land-use debates and ecology, which is one of the reasons he moved to Oregon in the first place. After that 2016 rally, Ross paid closer attention to the political violence unfolding in Portland. We decided to take a walk so that Ross could point out various landmarks from the—well, we couldn’t decide what to call the period of sustained violence that started in 2016 and was reignited in 2020. The siege? The occupation? The revolt? What happened in Portland has a way of being too slippery for precise language.

    We walked southwest from the square before doubling back toward the Willamette River. Over here was the historical society that protesters broke into and vandalized one night. Over there was where the statues got toppled. (“Portland is a city of pedestals now,” Ross said.) A federal building still had a protective fence surrounding it more than a year after the street violence had ended. At one point, the mayor had to order a drawbridge raised to keep combatants apart.

    On the evening of June 30, 2018, Ross found himself in the middle of a violent brawl between hundreds of self-described antifa activists and members of the Proud Boys and Patriot Prayer, a local pro-Trump offshoot. Ross described to me a number of “ghoulish” encounters he’d had with Patriot Prayer, and I asked him which moment was the scariest. “It’s on video,” he told me. “You can see it: me getting punched.” I later watched the video. In it, Ross rushes toward a group of men who are repeatedly kicking and bludgeoning a person dressed all in black, lying in the street. Ross had told me earlier that he’d intervened because he thought he was watching someone being beaten to death. After Ross gets clocked, he appears dazed, then dashes back toward the fight. “That’s enough! That’s enough!” he shouts.

    By the time of this fight, Patriot Prayer had become a fixture in Portland. Its founder, Joey Gibson, has said in interviews that he was inspired to start Patriot Prayer to fight for free speech, but the group’s core belief has always been in Donald Trump. Its first event, in Vancouver, Washington, in October 2016, was a pro-Trump rally. From there, Gibson deliberately picked ultraliberal cities such as Portland, Berkeley, Seattle, and San Francisco for his protests, and in doing so quickly attracted like-minded radicals—the Proud Boys, the Three Percenters, Identity Evropa, the Hell Shaking Street Preachers—who marched alongside Patriot Prayer. These were people who seemed to love Trump and shit-stirring in equal measure. White nationalists and self-described Western chauvinists showed up at Gibson’s events. (Gibson’s mother is Japanese, and he has insisted that he does not share their views.) By August 2018, Patriot Prayer had already held at least nine rallies in Portland, routinely drawing hundreds of supporters—grown men in Boba Fett helmets and other homemade costumes; at least one man with an SS neck tattoo. In 2019, Gibson himself was arrested on a riot charge. Patriot Prayer quickly became the darling of Infowars.

    A masked person runs through a cloud of tear gas. (Paul Spella; source image: Nathan Howard / Getty)

    The morning after I met Ross, I drove across the river to Vancouver, a town of strip-mall churches and ponderosa pine trees, to meet with Lars Larson, who records The Lars Larson Show—tagline: “Honestly Provocative Talk Radio”—from his home studio. Larson greeted me with his two dogs and a big mug of coffee. His warmth, quick-mindedness, and tendency to filibuster make him irresistible for talk radio. And his allegiance to MAGA world helps him book guests like Donald Trump Jr., whom Larson introduced on a recent episode as “the son of the real president of the United States of America.” Over the course of our conversation, he described January 6 as “some ruined furniture in the Capitol”; suggested that the city government of Charlottesville, Virginia, was secretly behind the violent clash at the 2017 “Unite the Right” rally; and made multiple references to George Soros, including suggesting that Soros may have paid for people to come to Portland to tear up the city. When I pressed Larson on various points, he would walk back whatever he had claimed, but only slightly. He does not seem to be a conspiracy theorist, but he plays one on the radio.

    Larson blamed Portland’s troubles on a culture of lawlessness fostered by a district attorney who, he said, repeatedly declined to prosecute left-wing protesters. He sees this as an uneven application of justice that undermined people’s faith in local government. It is more accurate to say that the district attorney chose not to prosecute lesser crimes, focusing instead on serious crimes against people and property; ironically, the complaint about uneven application comes from both the far left and the far right. When I asked Larson whether Patriot Prayer is Christian nationalist in ideology, the question seemed to make him uncomfortable, and he emphasized his belief in pluralism and religious freedom. He also compared Joey Gibson and Patriot Prayer marching on Portland to civil-rights activists marching on Selma in 1965. “What I heard people tell Patriot Prayer is ‘If you get attacked every time you go to Portland, don’t go to Portland,’ ” he told me. “Would you have given that same advice to Martin Luther King?”

    Gibson’s lawyer Angus Lee accused the government of “political persecution”; Gibson was ultimately acquitted of the riot charge. Patriot Prayer, Lee went on, is “not like these other organizations you referenced that have members and that sort of thing. Patriot Prayer is more of an idea.” Gibson himself once put it in blunter terms. “I don’t even know what Patriot Prayer is anymore,” he said in a 2017 interview on a public-access news channel in Portland. “It’s just these two words that people hear and it sparks emotions … All Patriot Prayer is is videos and social-media presence.”

    The more I talked with people about Patriot Prayer, the more it began to resemble a phenomenon like QAnon—a decentralized and amorphous movement designed to provoke reaction, tolerant of contradictions, borrowing heavily from internet culture, overlapping with other extremist movements like the Proud Boys, linked to high-profile episodes of violence, and ultimately focused on Trump. I couldn’t help but think of Galleani, his “beautiful idea,” and the diffuse ideology of his followers. One key difference: Galleani was fighting against the state, whereas movements like QAnon and groups like Patriot Prayer and the Proud Boys have been cheered on by a sitting president and his party.

    When I met with Portland’s mayor, Ted Wheeler, at city hall, he recalled night after night of violence, and at times planning for the very worst, meaning mass casualties. Portlanders had taken to calling him “Tear Gas Ted” because of the police response in the city. One part of any mayor’s job is to absorb the community’s scorn. Few people have patience for unfilled potholes or the complexities of trash collection. Disdain for Wheeler may have been the one thing that just about every person I met in Portland shared, but his job has been difficult even by big-city standards. He confronted a breakdown of the social contract.

    “Political violence, in my opinion, is the extreme manifestation of other trends that are prevalent in our society,” Wheeler told me. “A healthy democracy is one where you can sit on one side of the table and express an opinion, and I can sit on the other side of the table and express a very different opinion, and then we have the contest of ideas … We have it out verbally. Then we go drink a beer or whatever.”

    When extremists began taunting Portlanders online, it was very quickly “game on” for violence in the streets, Wheeler said. In this way, Portland stands as a warning to cities that now seem calm: It takes very little provocation to inflame latent tensions between warring factions. Once order collapses, it is extraordinarily difficult to restore. And it can be dangerous to attempt to do so through the use of force, especially when one violent faction is lashing out, in part, against state authority.

    Aaron Mesh moved to Portland 16 years ago, to take a job as Willamette Week’s film critic, and since then has worked his way up to managing editor. He is sharp-tongued and good-humored, and it is obvious that he loves his city in the way that any good newspaperman does, with a mix of fierce loyalty and heaping criticism. Like Wheeler, he trained attention on the dynamic of action and reaction—on how rising to the bait not only solves nothing but can make things worse. “There was this attitude of We’re going to theatrically subdue your city with these weekend excursions,” Mesh said, describing the confrontations that began in 2016 as a form of cosplay, with right-wing extremists wearing everything from feathered hats to Pepe the Frog costumes and left-wing extremists dressed up in what’s known as black bloc: all-black clothing and facial coverings. “I do want to emphasize,” he said, “that everyone involved in this was a massive fucking loser, on both sides.”

    It was as though all of the most unsavory characters on the internet had crawled out of the computer. The fights were enough of a spectacle that not everyone took them seriously at first. Mesh said it was impossible to overstate “the degree to which Portland became a lodestone in the imagination of a nascent Proud Boys movement,” a place where paramilitary figures on the right went “to prove that they had testicles.” He went on: “You walk into town wearing a helmet and carrying a big American flag” and then wait and see “who throws an egg at your car or who gives you the middle finger, and you beat the living hell out of them.”

    Both sides behaved despicably. But only the right-wingers had the endorsement of the president and the mainstream Republican Party. “Despite being run by utter morons,” Mesh said of Patriot Prayer, “they managed to outsmart most of their adversaries in this city, simply by provoking violent reactions from people who were appalled by their politics.” The argument for violence among people on the left is often, essentially, If you encounter a Nazi, you should punch him. But “what if the only thing the Nazi wants is for you to punch him?” Mesh asked. “What if the Nazis all have cameras and they’re immediately feeding all the videos of you punching them to Tucker Carlson? Which is what they did.”

    The situation in Portland became so desperate, and the ideologies involved so tangled, that the violence began to operate like its own weather system—a phenomenon that the majority of Portlanders could see coming and avoid, but one that left behind tremendous destruction. Most people don’t want to fight. But it takes startlingly few violent individuals to exact generational damage.

    V. THE COMPLICIT STATE

    America was born in revolution, and violence has been an undercurrent in the nation’s politics ever since. People remember the brutal opposition to the civil-rights struggle, and recall the wave of terrorism spawned by the anti-war movement of the 1960s. But the most direct precursor to what we’re experiencing now is the anti-government Patriot movement, which can be traced to the 1980s and eventually led to deadly standoffs between federal agents and armed citizens at Ruby Ridge, Idaho, in 1992, and in Waco, Texas, in 1993. Three people were killed at Ruby Ridge. As many as 80 died in Waco, 25 of them children. Those incidents stirred the present-day militia movement and directly inspired the Oklahoma City bombers, anti-government extremists who killed 168 people at the Alfred P. Murrah Federal Building in 1995. The surge in militia activity, white nationalism, and apocalypticism of the 1990s seemed to peter out in the early 2000s. This once struck me as a bright spot, an earlier success we might learn from today. But when I mentioned this notion to Carolyn Gallaher, a scholar who spent two years following a right-wing paramilitary group in Kentucky in the 1990s, she said, “The militia movement waned very quickly in the 1990s not because of anything we did, but because of Oklahoma City. That bombing really put the movement on the back foot. Some groups went underground. Some groups dispersed. You also saw that happen with white-supremacist groups.”

    A generation later, political violence in America unfolds with little organized guidance and is fed by a mishmash of extremist right-wing views. It predates the emergence of Donald Trump, but Trump served as an accelerant. He also made tolerance of political violence a defining trait of his party, whereas in the past, both political parties condemned it. At the height of the Patriot movement, “there was this fire wall” between extremist groups and elected officials that protected democratic norms, according to Gallaher. Today, “the fire wall between these guys and formal politics has melted away.” Gallaher does not anticipate an outbreak of civil strife in America in a “classic sense”—with Blue and Red armies or militias fighting for territory. “Our extremist groups are nowhere near as organized as they are in other countries.”

    Because it is chaotic, Americans tend to underestimate political violence, as Italians at first did during the Years of Lead. Some see it as merely sporadic, and shift attention to other things. Some say, in effect, Wake me when there’s civil war. Some take heart from moments of supposed reprieve, such as the poor showing by election deniers and other extremists in the 2022 midterm elections. But think of all the ongoing violence that at first glance isn’t labeled as being about politics per se, but is in fact political: the violence, including mass shootings, directed at LGBTQ communities, at Jews, and at immigrants, among others. In November, the Department of Homeland Security issued a bulletin warning that “the United States remains in a heightened threat environment” due to individuals and small groups with a range of “violent extremist ideologies.” It warned of potential attacks against a long list of places and people: “public gatherings, faith-based institutions, the LGBTQI+ community, schools, racial and religious minorities, government facilities and personnel, U.S. critical infrastructure, the media, and perceived ideological opponents.”

    The broad scope of the warning should not be surprising—not after the massacres in Pittsburgh, El Paso, Buffalo, and elsewhere. One month into 2023, the pace of mass shootings in America—all either political or, inevitably, politicized—was at an all-time high. “There’s no place that’s immune right now,” Mary McCord, the former assistant U.S. attorney, observed. “It’s really everywhere.” She added, “Someday, God help us, we’ll come out of this. But it’s hard for me to imagine how.”

    The sociologist Norbert Elias, who left Germany for France and then Britain as the Nazi regime took hold, famously described what he called the civilizing process as “a long sequence of spurts and counter-spurts,” warning that you cannot fix a violent society simply by eliminating the factors that made it deteriorate in the first place. Violence and the forces that underlie it have the potential to take us from the democratic backsliding we already know to a condition known as decivilization. In periods of decivilization, ordinary people fail to find common ground with one another and lose faith in institutions and elected leaders. Shared knowledge erodes, and bonds fray across society. Some people inevitably decide to act with violence. As violence increases, so does distrust in institutions and leaders, and around and around it goes. The process is not inevitable—it can be held in check—but if a period of bloodshed is sustained for long enough, there is no shortcut back to normal. And signs of decivilization are visible now.

    A pro-Trump demonstrator at the U.S. Capitol on January 6, 2021, when insurrectionists stormed the building (Paul Spella; source image: Brendan Smialowski / AFP / Getty)

    “The path out of bloodshed is measured not in years but in generations,” Rachel Kleinfeld writes in A Savage Order, her 2018 study of extreme violence and the ways it corrodes a society. “Once a democracy descends into extreme violence, it is always more vulnerable to backsliding.” Cultural patterns, once set, are durable—the relatively high rates of violence in the American South, in part a legacy of racism and slaveholding, persist to this day. In The Delusions of Crowds, William Bernstein looks further afield, to Germany. He told me, “You can actually predict anti-Semitism and voting for the Nazi Party by going back to the anti-Semitism across those same regions in the 14th century. You can trace it city to city.”

    Three realities mark the current era of political violence in America as different from what has come before, and make dealing with it much harder. The first—obvious—is the universal access to weaponry, including military-grade weapons.

    Second, today’s information environment is simultaneously more sophisticated and more fragmented than ever before. In 2006, the analyst Bruce Hoffman argued that contemporary terrorism had become dangerously amorphous. He was referring to groups like al-Qaeda, but we now witness what he described among domestic American extremists. As Hoffman and others see it, the defining characteristic of post-9/11 terrorism is that it is decentralized. You don’t need to be part of an organization to become a terrorist. Hateful ideas and conspiracy theories are not only easy to find online; they’re actively amplified by social platforms, whose algorithms prioritize the anger and hate that drive engagement and profit. The barriers to radicalization are now almost nonexistent. Luigi Galleani would have loved Twitter, YouTube, and Telegram. He had to settle for publishing a weekly newspaper. Because of social media, conspiracy theories now spread instantly and globally, often promoted by hugely influential figures in the media, such as Tucker Carlson and of course Trump, whom Twitter and Facebook have just reinstated.

    The third new reality goes to the core of American self-governance: people refusing to accept the outcome of elections, with national leaders fueling the skepticism and leveraging it for their own ends. In periods of decivilization, violence often becomes part of a governing strategy. This can happen when weak states acquiesce to violence simply to survive. Or it can happen when politicians align themselves with violent groups in order to bolster authority—a characteristic of what Kleinfeld, in her 2018 book, calls a “complicit state.” This is a well-known tactic among authoritarian incumbents worldwide who wield power by mobilizing state and vigilante violence in tandem.

    Complicity is insidious. It doesn’t require a revolution. You can see complicity, for example, in Trump’s order to the Proud Boys to “stand back and stand by” in the months ahead of January 6. You can see it in the Republican Party’s defense of Trump even after he propelled insurrectionists toward the U.S. Capitol. And you can see it in the way that powerful politicians and television personalities continue to cheer on right-wing extremists as “patriots” and “political prisoners,” rather than condemning them as vigilantes and seditionists.

    Americans sometimes wonder what might have happened if the Civil War had gone the other way—what the nation would be like now, or whether it would even exist, if the South had won. But that thought experiment overlooks the fact that we do know what it looks like for violent extremists to win in the United States. In the 1870s, white supremacists who objected to Reconstruction led a campaign of violence that they perversely referred to as Redemption. They murdered thousands of Black people in terror lynchings. They drove thousands more Black business owners, journalists, and elected officials out of their homes and hometowns, destroying their livelihoods. Sometimes violence ends not because it is overcome, but because it has achieved its goal.

    Norbert Elias’s warnings notwithstanding, dealing seriously with society’s underlying pathologies is part of the answer to political violence in the long term. But so, too, is something we have not had and perhaps can barely imagine anymore: leaders from all parts of the political constellation, and at all levels of government, and from all segments of society, who name the problem of political violence for what it is, explain how it will overwhelm us, and point a finger at those who foment it, either directly or indirectly. Leaders who understand that nothing else will matter if we can’t stop this one thing. The federal government is right to take a hard line against political violence—as it has done with its prosecutions of Governor Whitmer’s would-be kidnappers and the January 6 insurrectionists (almost 1,000 of whom have been charged). But violence must also be confronted where it first takes root, in the minds of citizens.

    Ending political violence means facing down those who use the language of democracy to weaken democratic systems. It means rebuking the conspiracy theorist who uses the rhetoric of truth-seeking to obscure what’s real; the billionaire who describes his privately owned social platform as a democratic town square; the seditionist who proclaims himself a patriot; the authoritarian who claims to love freedom. Someday, historians will look back at this moment and tell one of two stories: The first is a story of how democracy and reason prevailed. The second is a story of how minds grew fevered and blood was spilled in the twilight of a great experiment that did not have to end the way it did.



    This article appears in the April 2023 print edition with the headline “The New Anarchy.”

    Adrienne LaFrance

  • No One Really Knows How Much COVID Is Silently Spreading … Again


    In the early days of the pandemic, one of the scariest and most surprising features of SARS-CoV-2 was its stealth. Initially assumed to transmit only from people who were actively sick—as its predecessor SARS-CoV did—the new coronavirus turned out to be a silent spreader, also spewing from the airways of people who were feeling just fine. After months of insisting that only the symptomatic had to mask, test, and isolate, officials scrambled to retool their guidance; singing, talking, laughing, even breathing in tight quarters were abruptly categorized as threats.

    Three years later, the coronavirus is still silently spreading—but the fear of its covertness again seems gone. Enthusiasm for masking and testing has plummeted; isolation recommendations have been pared down, and may soon entirely disappear. “We’re just not communicating about asymptomatic transmission anymore,” says Saskia Popescu, an infectious-disease epidemiologist and infection-prevention expert at George Mason University. “People think, What’s the point? I feel fine.”

    Although the concern over asymptomatic spread has dissipated, the threat itself has not. And even as our worries over the virus continue to shrink and be shunted aside, the virus—and the way it moves between us—is continuing to change. Which means that our best ideas for stopping its spread aren’t just getting forgotten; they’re going obsolete.

    When SARS-CoV-2 was new to the world and hardly anyone had immunity, symptomless spread probably accounted for most of the virus’s spread—at least 50 percent or so, says Meagan Fitzpatrick, an infectious-disease transmission modeler at the University of Maryland’s School of Medicine. People wouldn’t start feeling sick until four, five, or six days, on average, after being infected. In the interim, the virus would be xeroxing itself at high speed in their airway, reaching potentially infectious levels a day or two before symptoms started. Silently infected people weren’t sneezing and coughing—symptoms that propel the virus more forcefully outward, increasing transmission efficiency. But at a time when tests were still scarce and slow to deliver results, not knowing they had the virus made them dangerous all the same. So symptomless transmission became a norm, as did epic superspreading events.

    Now, though, tests are more abundant, presymptomatic spread is a better-known danger, and repeated rounds of vaccination and infection have left behind layers of immunity. That protection, in particular, has slashed the severity and duration of acute symptoms, lowering the risk that people will end up in hospitals or morgues; it may even be chipping away at long COVID. At the same time, though, the addition of immunity has made the dynamics of symptomless transmission much more complex.

    On an individual basis, at least, silent spread could be happening less often than it did before. One possible reason is that symptoms are now igniting sooner in people’s bodies, just three or so days, on average, after infection—a shift that roughly coincided with the rise of the first Omicron variant and could be a quirk of the virus itself. But Aubree Gordon, an infectious-disease epidemiologist at the University of Michigan, told me that faster-arriving sicknesses are probably being driven in part by speedier immune responses, primed by past exposures. That means that illness might now coincide with or even precede the peak of contagiousness, shortening the average period in which people spread the virus before they feel sick. In that one very specific sense, COVID could now be a touch more flulike. Presymptomatic transmission of the flu does seem to happen on occasion, says Seema Lakdawala, a virologist at Emory University. But in general, “people tend not to hit their highest viral levels until after they develop symptoms,” Gordon told me.

    Coupled with more population-level immunity, this arrangement could be working in our favor. People might be less likely to pass the virus unwittingly to others. And thanks to the defenses we’ve collectively built up, the pathogen itself is also having more trouble exiting infected bodies and infiltrating new ones. That’s almost certainly part of the reason that this winter hasn’t been quite as bad as past ones have, COVID-wise, says Maia Majumder, an infectious-disease modeler at Harvard Medical School and Boston Children’s Hospital.

    That said, a lot of people are still undoubtedly catching the coronavirus from people who aren’t feeling sick. Infection per infection, the risk of superspreading events might now be lower, but at the same time people have gotten chiller about socializing without masks and testing before gathering in groups—a behavioral change that’s bound to counteract at least some of the forward shift in symptoms. Presymptomatic spread might be less likely nowadays, but it’s nowhere near gone. Multiply a small amount of presymptomatic spread by a large number of cases, and that can still seed … another large number of cases.

    There could be some newcomers to the pool of silent spreaders, too—those who are now transmitting the virus without ever developing symptoms at all. With people’s defenses higher than they were even a year and a half ago, infections that might have once been severe are now moderate or mild; ones that might have once been mild are now unnoticeable, says Seyed Moghadas, a computational epidemiologist at York University. At the same time, though, immunity has probably transformed some symptomless-yet-contagious infections into non-transmissible cases, or kept some people from getting infected at all. Milder cases are of course welcome, Fitzpatrick told me, but no one knows exactly what these changes add up to: Depending on the rate and degree of each of those shifts, totally asymptomatic transmission might now be more common, less common, or sort of a wash.

    Better studies on transmission patterns would help cut through the muck; they’re just not really happening anymore. “To get this data, you need to have pretty good testing for surveillance purposes, and that basically has stopped,” says Yonatan Grad, an infectious-disease epidemiologist at Harvard’s School of Public Health.

    Meanwhile, people are just straight-up testing less, and rarely reporting any of the results they get at home. For many months now, even some people who are testing have been seeing strings of negative results days into bona fide cases of COVID—sometimes a week or more past when their symptoms start. That’s troubling on two counts: First, some legit COVID cases are probably getting missed, and keeping people from accessing test-dependent treatments such as Paxlovid. Second, the disparity muddles the start and end of isolation. Per CDC guidelines, people who don’t test positive until a few days into their illness should still count their first day of symptoms as Day 0 of isolation. But if symptoms might sometimes outpace contagiousness, “I think those positive tests should restart the isolation clock,” Popescu told me, or risk releasing people back into society too soon.

    American testing guidelines, however, haven’t undergone a major overhaul in more than a year—right after Omicron blew across the nation, says Jessica Malaty Rivera, an infectious-disease epidemiologist at Boston Children’s Hospital. And even if the rules were to undergo a revamp, they wouldn’t necessarily guarantee more or better testing, which requires access and will. Testing programs have been winding down for many months; free diagnostics are once again growing scarce.

    Through all of this, scientists and nonscientists alike are still wrestling with how to define silent infection in the first place. What counts as symptomless depends not just on biology, but behavior—and our vigilance. As worries over transmission continue to falter and fade, even mild infections may be mistaken for quiet ones, Grad told me, brushed off as allergies or stress. Biologically, the virus and the disease may not need to become that much more muted to spread with ease: Forgetting about silent spread may grease the wheels all on its own.

    Katherine J. Wu

  • Are Colds Really Worse, or Are We All Just Weak Babies Now?


    For the past few weeks, my daily existence has been scored by the melodies of late winter: the drip of melting ice, the soft rustling of freshly sprouted leaves—and, of course, the nonstop racket of sneezes and coughs.

    The lobby of my apartment building is alive with the sounds of sniffles and throats being cleared. Every time I walk down the street, I’m treated to the sight of watery eyes and red noses. Even my work Slack is rife with illness emoji, and the telltale pings of miserable colleagues asking each other why they feel like absolute garbage. “It’s not COVID,” they say. “I tested, like, a million times.” Something else, they insist, is making them feel like a stuffed and cooked goose.

    That something else might be the once-overlooked common cold. After three years of largely being punted out of the limelight, a glut of airway pathogens—among them, adenovirus, RSV, metapneumovirus, parainfluenza, common-cold coronaviruses, and rhinoviruses galore—are awfully common again. And they’re really laying some people out. The good news is that there’s no evidence that colds are actually, objectively worse now than they were before the pandemic started. The less-good news is that after years of respite from a bunch of viral nuisances, a lot of us have forgotten that colds can be a real drag.

    Once upon a time—before 2020, to be precise—most of us were very, very used to colds. Every year, adults, on average, catch two to three of the more than 200 viral strains that are known to cause the illnesses; young kids may contract half a dozen or more as they toddle in and out of the germ incubators that we call “day cares” and “schools.” The sicknesses are especially common during the winter months, when many viruses thrive amid cooler temps, and people tend to flock indoors to exchange gifts and breath. When the pandemic began, masks and distancing drove several of those microbes into hiding—but as mitigations have eased in the time since, they’ve begun their slow creep back.

    For the majority of people, that’s not really a big deal. Common-cold symptoms tend to be pretty mild and usually resolve on their own after a few days of nuisance. The virus infiltrates the nose and throat, but isn’t able to do much damage and gets quickly swept out. Some people may not even notice they’re infected at all, or may mistake the illness for an allergy—snottiness, drippiness, and not much more. Most of us know the drill: “Sometimes, it’s just congestion for a few days and feeling a bit tired for a while, but otherwise you’ll be just fine,” says Emily Landon, an infectious-disease physician at the University of Chicago. As a culture, we’ve long been in the habit of dismissing these symptoms as just a cold, not enough of an inconvenience to skip work or school, or to put on a mask. (Spoiler: The experts I spoke with were adamant that we all really should be doing those things when we have a cold.)

    The general infectious-disease dogma has always been that colds are a big nothing, at least compared with the flu. But gentler than the flu is not saying much. The flu is a legitimately dangerous disease that hospitalizes hundreds of thousands of Americans each year, and, like COVID, can sometimes saddle people with long-term symptoms. Even if colds are generally less severe, people can end up totally clobbered by headaches, exhaustion, and a burning sore throat; their eyes will tear up; their sinuses will clog; they’ll wake up feeling like they’ve swallowed serrated razor blades, or like their heads have been pumped full of fast-hardening concrete. It’s also common for cold symptoms to stretch out beyond a week, occasionally even two; coughs, especially, can linger long after the runny nose and headache resolve. At their worst, colds can lead to serious complications, especially in the very young, very old, and immunocompromised. Sometimes, cold sufferers end up catching a bacterial infection on top of their viral disease, a one-two punch that can warrant a trip to the ER. “The fact of the matter is, it’s pretty miserable to have a cold,” Landon told me. “And that’s how it’s always been.”

    As far as experts can tell, the average severity of cold symptoms hasn’t changed. “It’s about perception,” says Jasmine Marcelin, an infectious-disease physician at the University of Nebraska Medical Center. After skipping colds for several years, “experiencing them now feels worse than usual,” she told me. Frankly, this was sort of a problem even before COVID came onto the scene. “Every year, I have patients who call me with ‘the worst cold they’ve ever had,’” Landon told me. “And it’s basically the same thing they had last year.” Now, though, the catastrophizing might be even worse, especially since pandemic-brain started prompting people to scrutinize every sniffle and cough.

    There’s still a chance that some colds this season might be a shade more unpleasant than usual. Many people falling sick right now are just coming off of bouts with COVID, flu, or RSV, each of which infected Americans (especially kids) by the millions this past fall and winter. Their already damaged tissues may not fare as well against another onslaught from a cold-causing virus.

    It’s also possible that immunity, or lack thereof, could be playing a small role. Many people are now getting their first colds in three-plus years, which means population-level vulnerability might be higher than it normally is this time of year, speeding the rate at which viruses spread and potentially making some infections more gnarly than they’d otherwise be. But higher-than-usual susceptibility seems unlikely to be driving uglier symptoms en masse, says Roby Bhattacharyya, an infectious-disease physician and microbiologist at Massachusetts General Hospital. Not all cold-causing viruses leave behind good immunity—but many of those that do are thought to prompt the body to mount relatively durable defenses against truly severe infections, lasting several years or more.

    Plus, for a lot of viruses going around right now, the immunity question is largely moot, Landon told me. So many different pathogens cause colds that a recent exposure to one is unlikely to do much against the next. A person could catch half a dozen colds in a five-year time frame and not even encounter the same type of virus twice.

    It’s also worth noting that what some people are categorizing as the worst cold they’ve ever had might actually be a far more menacing virus, such as SARS-CoV-2 or a flu virus. At-home rapid tests for the coronavirus often churn out false-negative results in the early days of infection, even after symptoms start. And although the flu can sometimes be distinguished from a cold by its symptoms, they’re often pretty similar. The illnesses can only be definitively diagnosed with a test, which can be difficult to come by.

    The pandemic has steered our perception of illness into a false binary: Oh no, it’s COVID or Phew, it’s not. COVID is undoubtedly still more serious than a run-of-the-mill cold—more likely to spark severe disease or chronic, debilitating symptoms that can last months or years. But the range of severity between them overlaps more than the binary implies. Plus, Marcelin points out, what truly is “just” a cold for one person might be an awful, weeks-long slog for someone else, or worse—which is why, no matter what’s turning your face into a snot factory, it’s still important to keep your germs to yourself. The current outbreak of colds may not be any more severe than usual. But there’s no need to make it bigger than it needs to be.

    Katherine J. Wu

  • Radio Atlantic: This Is Not Your Parents’ Cold War


    During the Cold War, NATO had nightmares of hundreds of thousands of Moscow’s troops pouring across international borders and igniting a major ground war with a democracy in Europe. Western governments feared that such a move by the Kremlin would lead to escalation—first to a world war and perhaps even to a nuclear conflict.

    That was then; this is now.

    Russia’s invasion of Ukraine is nearly a year old, and the Ukrainians are holding on. The Russians, so far, have not only been pushed back but are also taking immense casualties and material losses. For many Americans, the war is now just another conflict in the news. Do we need to worry about the nuclear threat of Putin’s war in Europe the way we worried about such things three decades ago?

    Our staff writer Tom Nichols, an expert on nuclear weapons and the Cold War, counsels Americans not to be obsessed with nuclear escalation, but to be aware of the possibilities for accidents and miscalculations. You can hear his thoughts here:

    The following is a transcript of the episode:

    Tom Nichols: It’s been a year since the Russians invaded Ukraine and launched the biggest conventional war in Europe since the Nazis. One of the things that I think we’ve all worried about in that time is the underlying problem of nuclear weapons.

    This is a nuclear-armed power at war with hundreds of thousands of people in the middle of Europe. This is the nightmare that American foreign policy has dreaded since the beginning of the nuclear age.

    And I think people have kind of put it out of their mind, how potentially dangerous this conflict is, which is understandable, but also, I think, takes us away from thinking about something that is really the most important foreign problem in the world today.

    During the Cold War, we would’ve thought about that every day, but these days, people just don’t think about it, and I think they should.

    My name is Tom Nichols. I’m a staff writer at The Atlantic. And I’ve spent a lot of years thinking about nuclear weapons and nuclear war. For 25 years, I was a professor of national-security affairs at Naval War College.

    For this episode of Radio Atlantic, I want to talk about nuclear weapons and what I think we should have learned from the history of the Cold War about how to think about this conflict today.

    I was aware of nuclear weapons at a pretty young age because my hometown, Chicopee, Massachusetts, was home to a giant nuclear-bomber base, Strategic Air Command’s East Coast headquarters, which had the big B-52s that would fly missions with nuclear weapons directly to the Soviet Union.

    I had a classic childhood of air-raid sirens, and hiding in the basement, and going under the desks, and doing all of that stuff. My high-school biology teacher had a grim sense of humor and told us, you know, because of the Air Force base, we were slated for instant destruction. He said, Yeah, if anything ever happens, we’re gone. We’re gone in seven or eight minutes. So I guess the idea of nuclear war and nuclear weapons was a little more present in my life at an earlier age than for a lot of other kids.

    It’s been a long time since anyone’s really had to worry about global nuclear war. It’s been over 30 years since the fall of the Berlin Wall. I think people who lived through the Cold War were more than happy to forget about it. I know I am glad to have it far in the past. And I think younger people who didn’t experience it have a hard time understanding what it was all about, and what that fear was about, because it’s part of ancient history now.

    But I think people really need to understand that Cold War history to understand what’s going on today, and how decision makers in Washington and in Europe, and even in Moscow, are playing out this war—because many of these weapons are still right where we left them.

    We have fewer of them, but we still have thousands of these weapons, many of them on a very short trigger. We could go from the beginning of this podcast to the end of the world, that short of [a] time. And it’s easy to forget that. During the Cold War, we were constantly aware of it, because it was the central influence on our foreign policy. But it’s important for us to look back at the history of the Cold War because we survived a long and very tense struggle with a nuclear-armed opponent. Now, some of that was through good and sensible policy. And some of it was just through dumb luck.

    Of course, the first big crisis that Americans really faced where they had to think about the existential threat of nuclear weapons was the Cuban missile crisis, in October of 1962.

    I was barely 2 years old. But living next to this big, plump nuclear target in Massachusetts, we actually knew people in my hometown who built fallout shelters. But we got through the Cuban missile crisis, in part because President Kennedy and Soviet Premier Nikita Khrushchev realized what was at stake.

    The gamble to put missiles in Cuba had failed, and—as Khrushchev put it in one of his messages—we had to stop pulling on the ends of the rope and tightening the knot of war. But we also got incredibly lucky.

    There was a moment aboard a Soviet submarine where the sub commander thought they were under attack. And he wanted to use nuclear-tipped torpedoes to take out the American fleet, which would’ve triggered a holocaust.

    I mean, it would’ve been an incredible amount of devastation on the world. Tens, hundreds of millions of people dead. And, um, fortunately a senior commander who had to consent to the captain’s idea vetoed the whole thing. He said, I don’t think that’s what’s happening. I don’t think they’re trying to sink us, and I do not consent. And so by this one lucky break with this one Soviet officer, we averted the end of the world. I mean, we averted utter catastrophe.

    After the Cuban missile crisis, people are now even more aware of this existential threat of nuclear weapons and it starts cropping up everywhere, especially in our pop culture. I mean, they were always there in the ’50s; there were movies about the communist threat and attacks on America. But after the Cuban missile crisis, that’s when you start getting movies like Dr. Strangelove and Fail Safe.

    Both were about an accidental nuclear war, which becomes a theme for most of the Cold War. In Dr. Strangelove, an American general goes nuts and orders an attack on Russia. And in Fail Safe, a piece of machinery goes bad and the same thing happens. And I think this reflected this fear that we now had to live with, this constant threat of something that we and the Soviets didn’t even want to do, but could happen anyway.

    Even in the James Bond movies, which were supposed to be kind of campy and fun, nuclear weapons were really often the source of danger. You know, bad guys were stealing them; people were trying to track our nuclear submarines. Throughout the ’60s, the ’70s, and the ’80s, nuclear weapons became just kind of soaked into our popular culture.

    We all know the Cuban missile crisis because it’s just part of our common knowledge about the world, even for people that didn’t live through it. I think we don’t realize how dangerous other times were. I always think of 1983 as the year we almost didn’t make it.

    1983 was an incredibly tense year. President Ronald Reagan began the year calling the Soviet Union an “evil empire.” And announced that the United States would start pouring billions of dollars into an effort to defend against Soviet missiles, including space-based defenses, which the Soviets found incredibly threatening.

    The relationship between the United States and the Soviet Union had just completely broken down. Really, by the fall of 1983, it felt like war was inevitable. It certainly felt like to me war was inevitable. There was kind of that smell of gunpowder in the air. We were all pretty scared. I was pretty scared. I was a graduate student at that point. I was 23 years old, and I was certain that this war, this cataclysmic war, was going to happen not only in my lifetime, but probably before I was 30 years old.

    And then a lot of things happened in 1983 that elevated the level of tension between the United States and the Soviet Union to extraordinary levels. I would say really dangerous levels. The Soviets did their best to prove they were an evil empire by shooting down a fully loaded civilian airliner, killing 269 people. Just weeks after the shoot-down of the Korean airliner, Soviet Air Defenses got an erroneous report of an American missile launch against them. And this is another one of those cases where we were just lucky. We were just fortunate.

    And in this case, it was a Soviet Air Defense officer, a lieutenant colonel, who saw this warning that the Americans had launched five missiles. And he said, You know, nobody starts World War III with five missiles. That seems wrong.

    And he said, I just, I think the system—which still had some bugs—I just don’t think the system’s right. We’re gonna wait that out. We’re gonna ignore that. He was actually later reprimanded.

    It was almost like he was reprimanded and congratulated at the same time, because if he had called Moscow and said, Look, I’m doing my duty. I’m reporting Soviet Air Defenses have seen American birds in the air. They’re coming at us, and over to you, Kremlin. From there, a lot of bad decisions could have cascaded into World War III, especially after a year in which we had been in such amazingly high conflict with each other.

    Once again, just as after the Cuban missile crisis, the increase in tension in the 1980s really comes through in the popular culture. Music, movies, TV puts this sense of threat into the minds of ordinary Americans in a way that we just don’t have now. So people are going to the movies and they’re seeing movies like WarGames, once again about an accidental nuclear war. They’re seeing movies like Red Dawn, about a very intentional war by the Soviet Union against the United States.

    The Soviets thought that Red Dawn was actually part of Reagan’s attempt to use Hollywood to prepare Americans for World War III. In music, Ronald Reagan as a character made appearances in videos by Genesis or by Men at Work. That November, the biggest television event in history was The Day After, which was a cinematic representation of World War III.

    I mean, it was everywhere. By 1983, ’84, we were soaked in this fear of World War III: nuclear war and Armageddon, no matter where you looked. I remember in the fall of 1983 going to see the new James Bond movie, one of the last Roger Moore movies, called Octopussy. And the whole plot amazed me because, of course, I was studying this stuff at the time; I was studying NATO and nuclear weapons.

    And here’s this opening scene where a mad Soviet general says, If only we can convince the West to give up its nuclear weapons, we can finally invade and take over the world.

    I saw all of these films as either a college student or a young graduate student, and again, it was just kind of woven into my life. Well, of course, this movie is about nuclear war. Of course, this movie is about a Soviet invasion. Of course, this movie is about, you know, the end of the world, because it was always there. It was always in the background. But after the end of the Cold War, that remarkable amount of pop-culture knowledge and just general cultural awareness sort of fades away.

    I think one reason that people today don’t look back at the Cold War with the same sense of threat is that it all ended so quickly. We went from [these] terrifying year[s] of 1983, 1984. And then suddenly Gorbachev comes in; Reagan reaches out to him; Gorbachev reaches back. They jointly agree in 1985—they issue a statement that to this day, is still considered official policy by the Russian Federation and by the United States of America. They jointly declare a nuclear war can never be won and must never be fought.

    And all of a sudden, by the summer of 1985, 1986, it’s just over, and, like, 40 years of tension just came to an end in the space of 20, 24 months. Something I just didn’t think I would see in my lifetime. And I think that’s really created a false sense of security in later generations.

    After the Cold War, in the ’90s we have a Russia that’s basically friendly to the United States but nuclear weapons are still a danger. For example, in 1995, Norway launched a scientific satellite on top of a missile—I think they were gonna study the northern lights—and the scientists gave everybody notice, you know, We’re gonna be launching this satellite. You’re gonna see a missile launch from Norway.

    Somebody in Russia just didn’t get the message, and the Russian defense people came to President Boris Yeltsin and they said, This might be a NATO attack. And they gave him the option to activate and launch Russian nuclear weapons. Yeltsin conferred with his people, and fortunately—because our relations were good, and because Boris Yeltsin and Bill Clinton had a good relationship, and because tensions were low in the world—Yeltsin says, Yeah, okay. I don’t buy that. I’m sure it’s nothing.

    But imagine again, if that had been somebody else.

    And that brings us to today. The first thing to understand is: We are in a better place than we were during the Cold War in many ways. During the Cold War, we had tens of thousands of weapons pointed at each other. Now by treaty, the United States and the Russian Federation each have about 1,500 nuclear weapons deployed and ready to go. Now, that’s a lot of nuclear weapons, but 1,500 is a lot better than 30,000 or 40,000.

    Nonetheless, we are dealing with a much more dangerous Russian regime with this mafia state led by Vladimir Putin.

    Putin is a mafia boss. There is no one to stop him from doing whatever he wants. And he has really convinced himself that he is some kind of great world historical figure who is going to reestablish this Christian Slavic empire throughout the former Soviet Union and remnants of the old Russian empire. And that makes him uniquely dangerous.

    People might wonder why Putin is even bothering with nuclear threats, because we’ve always thought of Russia as this giant conventional power; that’s the legacy of the Cold War. NATO at the time was only 16 countries. We were totally outnumbered by the Soviets and the Warsaw Pact in everything—men, tanks, artillery—and of course, the only way we could have repulsed an attack by the Soviet Union into Europe would’ve been to use nuclear weapons.

    I know earlier I mentioned the movie Octopussy. We’ve come a long way from the days when that mad Russian general could say, If only we got rid of nuclear weapons and NATO’s nuclear weapons, we could roll our tanks from Czechoslovakia to Poland through Germany and on into France.

    What people need to understand is that Russia is now the weaker conventional power. The Russians are now the ones saying, Listen, if things go really badly for us and we’re losing, we reserve the right to use nuclear weapons. The difference between Russia now and NATO then is: NATO was threatening to use nuclear weapons only if it was invaded and being rolled over by Soviet tanks on their way to the English Channel. The Russians today are saying, We started this war, and if it goes badly for us, we reserve the right to use nuclear weapons to get ourselves out of a jam.

    This conventional weakness is actually what makes them more dangerous, because they’re now continually being humiliated in the field. And a country that had gotten by on convincing people that they were a great conventional power, that they had a lot of conventional capability, is now being revealed as a hollow power. They can’t even defeat a country a third of their own size.

    And so when they’re running out of options, you can understand at that point where Putin says, Well, the only way to scramble the deck and to get a do-over here is to use some small nuclear weapon in that area to kind of sober everybody up and shock them into coming to the table or giving me what I want.

    Now, I think that would be incredibly stupid. And I think a lot of people around the world, including China and other countries, have told Putin that would be a really bad idea. But I think one thing we’ve learned from this war is that Putin is a really lousy strategist who takes dumb chances because he’s just not very competent.

    And that comes back to the Cold War lesson—that you don’t worry about someone starting World War III as much as you worry about bumbling into World War III because of a bunch of really dumb decisions by people who thought they were doing something smart and didn’t understand that they were actually doing something really dangerous.

    So where does this leave us? This major war is raging through the middle of Europe, the scenario that we always dreaded during the Cold War: thousands and thousands of Moscow’s troops flooding across borders. What’s the right way to think about this? Perhaps the most important thing to understand is that this really is a war to defend democracy against an aggressive, authoritarian imperial state.

    The front line of the fight for civilization, really, is in Ukraine now. If Ukraine loses this war, the world will be a very different place. That’s what makes it imperative that Americans think about this problem. I think it’s imperative to support Ukraine in this fight, but we should do that with a prudent understanding of real risks that haven’t gone away.

    And so I think the Cold War provides some really good guidance here, which is to be engaged, to be aware, but not to be panicked. Not to become consumed by this fear every day, because that becomes paralyzing, that becomes debilitating. It’s bad for you as a person. And it’s bad for democracies’ ability to make decisions—because then you simply don’t make any decisions at all, out of fear.

    I think it’s important not to fall victim to Cold War amnesia and forget everything we learned. But I also don’t think we should become consumed by a new Cold War paranoia where we live every day thinking that we’re on the edge of Armageddon.

    Tom Nichols

  • The Future of Long COVID


    In the early spring of 2020, the condition we now call long COVID didn’t have a name, much less a large community of patient advocates. For the most part, clinicians dismissed its symptoms, and researchers focused on SARS-CoV-2 infections’ short-term effects. Now, as the pandemic approaches the end of its third winter in the Northern Hemisphere, the chronic toll of the coronavirus is much more familiar. Long COVID has been acknowledged by prominent experts, national leaders, and the World Health Organization; the National Institutes of Health has set up a billion-dollar research program to understand how and in whom its symptoms unfurl. Hundreds of long-COVID clinics now freckle the American landscape, offering services in nearly every state; and recent data hint that well-vetted drugs to treat or prevent long COVID may someday be widespread. Long COVID and the people battling it are commanding more respect, says Hannah Davis, a co-founder of the Patient-Led Research Collaborative, who has had long COVID for nearly three years: Finally, many people “seem willing to understand.”

    But for all the ground that’s been gained, the road ahead is arduous. Long COVID still lacks a universal clinical definition and a standard diagnosis protocol; there’s no consensus on its prevalence, or even what symptoms fall under its purview. Although experts now agree that long COVID does not refer to a single illness, but rather is an umbrella term, like cancer, they disagree on the number of subtypes that fall within it and how, exactly, each might manifest. Some risk factors—among them, a COVID hospitalization, female sex, and certain preexisting medical conditions—have been identified, but researchers are still trying to identify others amid fluctuating population immunity and the endless slog of viral variants. And for people who have long COVID now, or might develop it soon, the interventions are still scant. To this day, “when someone asks me, ‘How can I not get long COVID?’ I can still only say, ‘Don’t get COVID,’” says David Putrino, a neuroscientist and physical therapist who leads a long-COVID rehabilitation clinic at Mount Sinai’s Icahn School of Medicine.

    As the world turns its gaze away from the coronavirus pandemic, with country after country declaring the virus “endemic” and allowing crisis-caliber interventions to lapse, long-COVID researchers, patients, and activists worry that even past progress could be undone. The momentum of the past three years now feels bittersweet, they told me, in that it represents what the community might lose. Experts can’t yet say whether the number of long-haulers will continue to increase, or offer a definitive prognosis for those who have been battling the condition for months or years. All that’s clear right now is that, despite America’s current stance on the coronavirus, long COVID is far from being beaten.


    Despite an influx of resources into long-COVID research in recent months, data on the condition’s current reach remain a mess—and scientists still can’t fully quantify its risks.

    Recent evidence from two long-term surveys has hinted that the pool of long-haulers might be shrinking, even as new infection rates remain sky-high: Earlier this month, the United Kingdom’s Office for National Statistics released data showing that 2 million people self-reported lingering symptoms at the very start of 2023, down from 2.3 million in August 2022. The U.S. CDC’s Household Pulse Survey, another study based on self-reporting, also recorded a small drop in long-COVID prevalence in the same time frame, from about 7.5 percent of all American adults to roughly 6 percent. Against the massive number of infections that have continued to slam both countries in the pandemic’s third year and beyond, these surveys might seem to imply that long-haulers are leaving the pool faster than newcomers are arriving.

    Experts cautioned, however, that there are plenty of reasons to treat these patterns carefully—and to not assume that the trends will be sustained. It’s certainly better that these data aren’t showing a sustained, dramatic uptick in long-COVID cases. But that doesn’t mean the situation is improving. Throughout the pandemic, the size of the long-COVID pool has contracted or expanded for only two reasons: a change in the rate at which people enter, or at which they exit. Both figures are likely to be in constant flux, as surges of infections come and go, masking habits change, and vaccine and antiviral uptake fluctuates. Davis pointed out that the slight downward tick in both studies captured just a half-year stretch, so the downward slope could be one small portion of an undulating wave. A few hours spent at the beach while the tide is going out wouldn’t be enough to prove that the ocean is drying up.

    Recent counts of new long-COVID cases might also be undercounts, as testing slows and people encounter more challenges getting diagnosed. That said, it’s still possible that, on a case-by-case basis, the likelihood of any individual developing long COVID after a SARS-CoV-2 infection may have fallen since the pandemic’s start, says Deepti Gurdasani, a clinical epidemiologist at Queen Mary University of London and the University of New South Wales. Population immunity—especially acquired via vaccination—has, over the past three years, better steeled people’s bodies against the virus, and strong evidence supports the notion that vaccines can moderately reduce the risk of developing long COVID. Treatments and behavioral interventions that have become more commonplace may have chipped away at incidence as well. Antivirals can now help to corral the virus early in infection; ventilation, distancing, and masks—when they’re used—can trim the amount of virus that infiltrates the body. And if overall exposure to the virus can influence the likelihood of developing long COVID, that could help explain why so many debilitating cases arose at the very start of the pandemic, when interventions were few and far between, says Steven Deeks, a physician researcher at UC San Francisco.

    There’s not much comfort to derive from those individual-level stats, though, when considering what’s happening on broader scales. Even if immunity makes the average infected person less likely to fall into the long-COVID pool, so many people have been catching the virus that the inbound rate still feels like a flood. “The level of infection in many countries has gone up substantially since 2021,” Gurdasani told me. The majority of long-COVID cases arise after mild infections, the sort for which our immune defenses fade most rapidly. Now that masking and physical distancing have fallen by the wayside, people may be getting exposed to higher viral doses than they were a year or two ago. In absolute terms, then, the number of people entering the long-COVID pool may not really be decreasing. Even if the pool were getting slightly smaller, its size would still be staggering, an ocean of patients with titanic needs. “Anecdotally, we still have an enormous waitlist to get into our clinic,” Putrino told me.

    Deeks told me that he’s seen another possible reason for optimism: People with newer cases of long COVID might be experiencing less debilitating or faster-improving disease. “The worst cases we’ve seen come from the first wave in 2020,” he said. But Putrino isn’t so sure. “If you put an Omicron long-COVID patient in front of me, versus one from the first wave, I wouldn’t be able to tell you who was who,” he said. The two cases would also be difficult to compare, because they’re separated by so much time. Long COVID’s symptoms can wax, wane, and qualitatively change; a couple of years into the future, some long-haulers who’ve just developed the condition may be in a spot that’s similar to where many veterans with the condition are now.

    Experts’ understanding of how often people depart the long-COVID pool is also meager. Some long-haulers have undoubtedly seen improvement—but without clear lines distinguishing short COVID from medium and long COVID, entry and exit into these various groups is easy to over- or underestimate. What few data exist on the likelihood of recovery or remission are inconsistent, and not always rosy: Investigators of RECOVER, a large national study of long COVID, have calculated that about two-thirds of the long-haulers in their cohort do not return to baseline health. Putrino, who has worked with hundreds of long-haulers since the pandemic began, estimates that although most of his patients experience at least some benefit from a few months of rehabilitation, only about one-fifth to one-quarter of them eventually reach the point of feeling about as well as they did before catching the virus, while the majority hit a middling plateau. A small minority of the people he has treated, he told me, never seem to improve at all.

    Letícia Soares, a long-hauler in Brazil who caught the virus near the start of the pandemic, falls into that final category. Once a disease ecologist who studied parasite transmission in birds, she is now mostly housebound, working when she is able as a researcher for the Patient-Led Research Collaborative. Her days revolve around medications and behavioral modifications she uses for her fatigue, sleeplessness, and chronic pain. Soares no longer has the capacity to cook or frequently venture outside. And she has resigned herself to this status quo until the treatment landscape changes drastically. It is not the life she pictured for herself, Soares told me. “Sometimes I think the person I used to be died in April of 2020.”

    Even long-haulers who have noticed an improvement in their symptoms are wary of overconfidence. Some absolutely do experience what could be called recovery—but for others, the term has gotten loaded, almost a jinx. “If the question is, ‘Are you doing the things you were doing in 2019?’ the answer is largely no,” says JD Davids, a chronic-illness advocate based in New York. For some, he told me, “getting better” has been more defined by a resetting of expectations than a return to good health. Relapses are also not uncommon, especially after repeat encounters with the virus. Lisa McCorkell, a long-hauler and a co-founder of the Patient-Led Research Collaborative, has felt her symptoms partly abate since she first fell ill in the spring of 2020. But, she told me, she suspects that her condition is more likely to deteriorate than further improve—partly because of “how easy it is to get reinfected now.”


    Last week, in his State of the Union address, President Joe Biden told the American public that “we have broken COVID’s grip on us.” Highlighting the declines in the rates of COVID deaths, the millions of lives saved, and the importance of remembering the more than 1 million lost, Biden reminded the nation of what was to come: “Soon we’ll end the public-health emergency.”

    When the U.S.’s state of emergency was declared nearly three years ago, as hospitals were overrun and morgues overflowed, the focus was on severe, short-term disease. Perhaps in that sense, the emergency is close to being over, Deeks told me. But long COVID, though slower to command attention, has since become its own emergency, never formally declared; for the millions of Americans who have been affected by the condition, their relationship with the virus does not yet seem to be in a better place.

    Even with many more health-care providers clued into long COVID’s ills, the waiting lists for rehabilitation and treatment remain untenable, Hannah Davis told me. “I consider myself someone who gets exceptional care compared to other people,” she said. “And still, I hear from my doctor every nine or 10 months.” Calling a wrap on COVID’s “emergency” phase could worsen that already skewed supply-demand ratio. Changes to the nation’s funding tactics could strip resources—among them, access to telehealth; Medicaid coverage; and affordable antivirals, tests, and vaccines—from vulnerable populations, including people of color, that aren’t getting their needs met even as things stand, McCorkell told me. And as clinicians internalize the message that the coronavirus has largely been addressed, attention to its chronic impacts may dwindle. At least one of the country’s long-COVID clinics has, in recent months, announced plans to close, and Davis worries that more could follow soon.

    Scientists researching long COVID are also expecting new challenges. Reduced access to testing will complicate efforts to figure out how many people are developing the condition, and who’s most at risk. Should researchers turn their scientific focus away from studying causes and cures for long COVID when the emergency declaration lifts, Davids and others worry that there will be ripple effects on the scientific community’s interest in other, neglected chronic illnesses, such as ME/CFS (myalgic encephalomyelitis or chronic fatigue syndrome), a diagnosis that many long-haulers have also received.

    The end of the U.S.’s official crisis mode on COVID could stymie research in other ways as well. At Johns Hopkins University, the infectious-disease epidemiologists Priya Duggal, Shruti Mehta, and Bryan Lau have been running a large study to better understand the conditions and circumstances that lead to long COVID, and how symptoms evolve over time. In the past two years, they have gathered online survey data from thousands of people who both have and haven’t been infected, and who have and haven’t seen their symptoms rapidly resolve. But as of late, they’ve been struggling to recruit enough people who caught the virus and didn’t feel their symptoms linger. “I think that the people who are suffering from long COVID will always do their best to participate,” Duggal told me. That may not be the case for individuals whose experiences with the virus were brief. A lot of them “are completely over it,” Duggal said. “Their life has moved on.”

    Kate Porter, a Massachusetts-based marketing director, told me that she worries about her family’s future, should long COVID fade from the national discourse. She and her teenage daughter both caught the virus in the spring of 2020, and went on to develop chronic symptoms; their experience with the disease isn’t yet over. “Just because the emergency declaration is expiring, that doesn’t mean that suddenly people are magically going to get better and this issue is going to go away,” Porter told me. After months of relative improvement, her daughter is now fighting prolonged bouts of fatigue that are affecting her school life—and Porter isn’t sure how receptive people will be to her explanations, should their illnesses persist for years to come. “Two years from now, how am I going to explain, ‘Well, this is from COVID, five years ago’?” she said.

    Once mired in skepticism, scorn, and gaslighting, long COVID now has recognition—but empathy for long-haulers could yet experience a backslide. Nisreen Alwan, a public-health researcher at the University of Southampton, in the U.K., and her colleagues have found that many long-haulers still worry about disclosing their condition, fearing that it could jeopardize their employment, social interactions, and more. Long COVID could soon become just one of many neglected chronic diseases, poorly understood and rarely discussed.

    Davis doesn’t think that marginalization is inevitable. Her reasoning is grim: Other chronic illnesses have been easier to push to the sidelines, she said, on account of their smaller clinical footprint, but the pool of long-haulers is enormous—comprising millions of people in the U.S. alone. “I think it’s going to be impossible to ignore,” she told me. One way or another, the world will have no choice but to look.

    Katherine J. Wu

  • The COVID Emergency Is Ending. Is Vaccine Outreach Over Too?


    Stephen B. Thomas, the director of the Center for Health Equity at the University of Maryland, considers himself an eternal optimist. When he reflects on the devastating pandemic that has been raging for the past three years, he chooses to focus less on what the world has lost and more on what it has gained: potent antiviral drugs, powerful vaccines, and, most important, unprecedented collaborations among clinicians, academics, and community leaders that helped get those lifesaving resources to many of the people who needed them most. But when Thomas, whose efforts during the pandemic helped transform more than 1,000 Black barbershops and salons into COVID-vaccine clinics, looks ahead to the next few months, he worries that momentum will start to fizzle out—or, even worse, that it will go into reverse.

    This week, the Biden administration announced that it would allow the public-health-emergency declaration over COVID-19 to expire in May—a transition that’s expected to put shots, treatments, tests, and other types of care further out of reach for millions of Americans, especially those who are uninsured. The move has been a long time coming, but for community leaders such as Thomas, whose vaccine-outreach project, Shots at the Shop, has depended on emergency funds and White House support, the transition could mean the imperilment of a local infrastructure that he and his colleagues have been building for years. It shouldn’t have been inevitable, he told me, that community vaccination efforts would end up on the chopping block. “A silver lining of the pandemic was the realization that hyperlocal strategies work,” he said. “Now we’re seeing the erosion of that.”

    I called Thomas this week to discuss how the emergency declaration allowed his team to mobilize resources for outreach efforts—and what may happen in the coming months as the nation attempts to pivot back to normalcy.

    Our conversation has been edited for clarity and length.

    Katherine J. Wu: Tell me about the genesis of Shots at the Shop.

    Stephen B. Thomas: We started our work with barbershops and beauty salons in 2014. It’s called HAIR: Health Advocates In-Reach and Research. Our focus was on colorectal-cancer screening. We brought medical professionals—gastroenterologists and others—into the shop, recognizing that Black people in particular were dying from colon cancer at rates that were just unacceptable but were potentially preventable with early diagnosis and appropriate screening.

    Now, if I can talk to you about colonoscopy, I could probably talk to you about anything. In 2019, we held a national health conference for barbers and stylists. They all came from around the country to talk about different areas of health and chronic disease: prostate cancer, breast cancer, others. We brought them all together to talk about how we can address health disparities and give more agency and visibility to this new frontline workforce.

    When the pandemic hit, all the plans that came out of the national conference were on hold. But we continued our efforts in the barbershops. We started a Zoom town hall. And we started seeing misinformation and disinformation about the pandemic being disseminated in our shops, and there were no countermeasures.

    We got picked up by the national media, and then we got the endorsement of the White House. And that’s when we launched Shots at the Shop. We had 1,000 shops signed up in I’d say less than 90 days.

    Wu: Why do you think Shots at the Shop was so successful? What was the network doing differently from other vaccine-outreach efforts that spoke directly to Black and brown communities?

    Thomas: If you came to any of our clinics, it didn’t feel like you were coming into a clinic or a hospital. It felt like you were coming to a family reunion. We had a DJ spinning music. We had catered food. We had a festive environment. Some people showed up hesitant, and some of them left hesitant but fascinated. We didn’t have to change their worldview. But we treated them with dignity and respect. We weren’t telling them they’re stupid and don’t understand science.

    And the model worked. It worked so well that even the health professionals were extremely pleased, because now all they had to do was show up with the vaccine, and the arms were ready for needles.

    The barbers and stylists saw themselves as doing health-related things anyway. They had always seen themselves as doing more than just cutting hair. No self-respecting Black barber is going to say, “We’ll get you in and out in 10 minutes.” It doesn’t matter how much hair you have: You’re gonna be in there for half a day.

    Wu: How big of a difference do you think your network’s outreach efforts made in narrowing the racial gaps in COVID vaccination?

    Thomas: Attribution is always difficult, and success has many mothers. So I will say this to you: I have no doubt that we made a huge difference. With a disease like COVID, you can’t afford to have any pocket unprotected, and we were vaccinating people who would otherwise have never been vaccinated. We were dealing with people at the “hell no” wall.

    We were also vaccinating people who were homeless. They were treated with dignity and respect. At some of our shops, we did a coat drive and a shoe drive. And we had dentists providing us with oral-health supplies: toothbrush, floss, paste, and other things. It made a huge difference. When you meet people where they are, you’ve got to meet all their needs.

    Wu: How big of a difference did the emergency declaration, and the freeing-up of resources, tools, and funds, make for your team’s outreach efforts?

    Thomas: Even with all the work I’ve been doing in the barber shop since 2014, the pandemic got us our first grant from the state. Money flowed. We had resources to go beyond the typical mechanisms. I was able to secure thousands of KN95 masks and distribute them to shops. Same thing with rapid tests. We even sent them Corsi-Rosenthal boxes, a DIY filtration system to clean up indoor air.

    Without the emergency declaration, we would still be in the desert screaming for help. The emergency declaration made it possible to get resources through nontraditional channels, and we were doing things that the other systems—the hospital system, the local health department—couldn’t do. We extended their reach to populations that have historically been underserved and distrustful.

    Wu: The public-health-emergency declaration hasn’t yet expired. What signs of trouble are you seeing right now?

    Thomas: The bridge between the barbershops and the clinical side has been shut down in almost all places, including here in Maryland. I go to the shop and they say to me, “Dr. T, when are we going to have the boosters here?” Then I call my clinical partners, who deliver the shots. Some won’t even answer my phone calls. And when they do, they say, “Oh, we don’t do pop-ups anymore. We don’t do community-outreach clinics anymore, because the grant money’s gone. The staff we hired during the pandemic, they use the pandemic funding—they’re gone.” But people are here; they want the booster. And my clinical partners say, “Send them down to a pharmacy.” Nobody wants to go to a pharmacy.

    You can’t see me, so you can’t see the smoke still coming out of my ears. But it hurts. We got them to trust. If you abandon the community now, it will simply reinforce the idea that they don’t matter.

    Wu: What is the response to this from the communities you’re talking to?

    Thomas: It’s “I told you so, they didn’t care about us. I told you, they would leave us with all these other underlying conditions.” You know, it shouldn’t take a pandemic to build trust. But if we lose it now, it will be very, very difficult to build back.

    We built a bridge. It worked. Why would you dismantle it? Because that’s exactly what’s happening right now. The very infrastructure we created to close the racial gaps in vaccine acceptance is being dismantled. It’s totally unacceptable.

    Wu: The emergency declaration was always going to end at some point. Did it have to play out like this?

    Thomas: I don’t think so. If you talk to the hospital administrators, they’ll tell you the emergency declaration and the money allowed them to add outreach. And when the money went away, they went back to business as usual. Even though the outreach proved you could actually do a better job. And the misinformation and the disinformation campaign hasn’t stopped. Why would you go back to what doesn’t work?

    Wu: What is your team planning for the short and long term, with limited resources?

    Thomas: As long as Shots at the Shop can connect with clinical partners to access vaccines, we will definitely keep that going.

    Nobody wants to go back to normal. So many of our barbers and stylists feel like they’re on their own. I’m doing my best to supply them with KN95 masks and rapid tests. We have kept the conversation going on our every-other-week Zoom town hall. We just launched a podcast. We put out some of our stories in the form of a graphic novel, The Barbershop Storybook. And we’re trying to launch a national association for barbers and stylists, called Barbers and Stylists United for Health.

    The pandemic resulted in a mobilization of innovation, a recognition of the intelligence at the community level, the recognition that you need to culturally tailor your strategy. We need to keep those relationships intact. Because this is not the last pandemic we’re going to see, even in our lifetime. I’m doing my best to knock on doors to continue to put our proposals out there. Hopefully, people will realize that reaching Black and Hispanic communities is worth sustaining.

    Katherine J. Wu

  • Psychedelics Open Your Brain. You Might Not Like What Falls In.


    If you’ve ever been to London, you know that navigating its wobbly grid, riddled with curves and dead-end streets, requires impressive spatial memory. Driving around London is so demanding, in fact, that in 2006 researchers found that it was linked with changes in the brains of the city’s cab drivers: Compared with Londoners who drove fixed routes, cabbies had a larger volume of gray matter in the hippocampus, a brain region crucial to forming spatial memory. The longer the cab driver’s tenure, the greater the effect.

    The study is a particularly evocative demonstration of neuroplasticity: the human brain’s innate ability to change in response to environmental input (in this case, the spatially demanding task of driving a cab all over London). That hard-won neuroplasticity required years of mental and physical practice. Wouldn’t it be nice to get the same effects without so much work?

    To hear some people tell it, you can: Psychedelic drugs such as psilocybin, LSD, ayahuasca, and Ecstasy, along with anesthetics such as ketamine, can enhance a user’s neuroplasticity within hours of administration. In fact, some users take psychedelics for the express purpose of making their brain a little more malleable. Just drop some acid, the thinking goes, and your brain will rewire itself—you’ll be smarter, fitter, more creative, and self-aware. You might even have a transcendent experience. Popular media abound with anecdotes suggesting that microdosing LSD or psilocybin can expand divergent thinking, a freer, more associative type of thinking that some psychologists link with creativity.

    Research suggests that psychedelic-induced neuroplasticity can indeed enhance specific types of learning, particularly in terms of overcoming fear and anxiety associated with past trauma. But claims about the transformative, brain-enhancing effects of psychedelics are, for the most part, overstated. We don’t really know yet how much microdosing, or a full-blown trip, will change the average person’s mental circuitry. And there’s reason to suspect that, for some people, such changes may be actively harmful.

    There is nothing new about the notion that human and animal brains are pliant in response to everyday experience and injury. The philosopher and psychologist William James is said to have first used the term plasticity back in 1890 to describe changes in neural pathways that are linked to the formation of habits. Now we understand that these changes take place not only between neurons but also within them: Individual cells are capable of sprouting new connections and reorganizing in response to all kinds of experiences. Essentially, this is a neural response to learning, which psychedelics can rev up.

    We also understand how potent psychedelic drugs can be in inducing changes to the brain. In mice, injecting psilocybin stimulated neurons in the frontal cortex to grow by about 10 percent and sprout new spines, projections that foster connections to other neurons. It also alleviated the animals' stress-related behaviors, and these effects persisted for more than a month, indicating enduring structural change linked with learning. Presumably, a similar effect takes place in humans. (Comparable studies on humans would be impossible to conduct, because investigating changes in a single neuron would require, well, sacrificing the subject.)

    The thing is, all those changes aren’t necessarily all good. Neuroplasticity just means that your brain—and your mind—is put into a state where it is more easily influenced. The effect is a bit like putting a glass vase back into the kiln, which makes it pliable and easy to reshape. Of course you can make the vase more functional and beautiful, but you might also turn it into a mess. Above all else, psychedelics make us exquisitely impressionable, thanks to their speed of action and magnitude of effect, though their ultimate impact still depends heavily on context and influence.

    We have all experienced heightened neuroplasticity during the so-called sensitive periods of brain development, which typically unfold between the ages of 1 and 4 when the brain is uniquely responsive to environmental input. This helps explain why kids effortlessly learn all kinds of things, like how to ski or speak a new language. But even in childhood, you don’t acquire your knowledge and skills by magic; you have to do something in a stimulating enough environment to leverage this neuroplastic state. If you have the misfortune of being neglected or abused during your brain’s sensitive periods, the effects are likely to be adverse and enduring—probably more so than if the same events happened later in life.

    Being in a neuroplastic state enhances our ability to learn, but it might also burn in negative or traumatic experiences—or memories—if you happen to have them while taking a psychedelic. Last year, a patient of mine, a woman in her early 50s, decided to try psilocybin with a friend. The experience was quite pleasurable until she started to recall memories of her emotionally abusive father, who had an alcohol addiction. In the weeks following her psilocybin exposure, she had vivid and painful recollections of her childhood, which precipitated an acute depression.

    Her experience might have been very different—perhaps even positive—if she’d had a guide or therapist with her while she was tripping to help her reappraise these memories and make them less toxic. But without a mediating positive influence, she was left to the mercy of her imagination. This must have been just the sort of situation legislators in Oregon had in mind last month when they legalized recreational psilocybin use, but only in conjunction with a licensed guide. It’s the right idea.

    In truth, researchers and clinicians haven’t a clue whether people who microdose frequently with psychedelics—and are thus walking around in a state of enhanced neuroplasticity—are more vulnerable to the encoding of traumatic events. To find out, you would have to compare a group of people who microdose against a group of people who don’t over a period of time and see, for example, whether they differ in rates of PTSD. Crucially, you’d have to randomly assign people to either microdose or abstain—not simply let them pick whether they want to try tripping. In the absence of such a study, we are all currently involved in a large, uncontrolled social experiment. The results will inevitably be messy and inconclusive.

    Even if opening your brain to change were all to the good, the promise of neuroplasticity without limit—that you can rejuvenate and remodel the brain at any age—far exceeds scientific evidence. Despite claims to the contrary, each of us has an upper limit to how malleable we can make our brain. The sensitive periods, when we hit our maximum plasticity, are a finite window of opportunity that slams shut as the brain matures. We progressively lose neuroplasticity as we age. Of course we can continue to learn—it just takes more effort than when we were young. Part of this change is structural: At 75, your hippocampus contains neurons that are a lot less connected to one another than they were at 25. That’s one of the major reasons older people find that their memory is not as sharp as it used to be. You may enhance those connections slightly with a dose of psilocybin, but you simply can’t make your brain behave as if it’s five decades younger.

    This reality has never stopped a highly profitable industry from catering to people’s anxieties and hopes—especially seniors’. You don’t have to search long online before you find all kinds of supplements claiming to keep your brain young and sharp. Brain-training programs go even further, purporting to rewire your brain and boost your cognition (sound familiar?), when in reality the benefits are very modest, and limited to whatever cognitive task you’ve practiced. Memorizing a string of numbers will make you better at memorizing numbers; it won’t transfer to another skill and make you better at, say, chess.

    We lose neuroplasticity as we age for good reason. To retain our experience, we don’t want our brain to rewire itself too much. Yes, we lose cognitive fluidity along the way, but we gain knowledge too. That’s not a bad trade-off. After all, it’s probably more valuable to an adult to be able to use all of their accumulated knowledge than to be able to solve a novel mathematical problem or learn a new skill. More important, our very identity is encoded in our neural architecture—something we wouldn’t want to tinker with lightly.

    At their best, psychedelics and other neuroplasticity-enhancing drugs can do some wonderful things, such as speed up the treatment of depression, quell anxiety in terminally ill patients, and alleviate the worst symptoms of PTSD. That’s enough reason to research their uses and let patients know psychedelics are an option for psychiatric treatment when the evidence supports it. But limitless drug-induced self-enhancement is simply an illusion.


    Richard A. Friedman
