ReportWire

Tag: younger people

  • Millennials Have Lost Their Grip on Fashion

    Ballet flats are back. Everyone’s saying it—Vogue, the TikTok girlies, The New York Times, Instagram’s foremost fashion narcs, the whole gang. Shoes from trendsetting brands such as Alaïa and Miu Miu line store shelves, and hundreds of cheap alternatives are available online at fast-fashion juggernauts such as Shein and Temu. You can run from the return of the ballet flat, but you can’t hide. And, depending on how much time your feet spent in the shoes the last time they were trendy, maybe you can’t run either.

    The ballet flat—a slipperlike, largely unstructured shoe style meant to evoke a ballerina’s pointe shoes—never disappears from the fashion landscape entirely, but its previous period of decided coolness was during the mid-to-late 2000s. Back then, teens were swathing themselves in Juicy Couture and Abercrombie & Fitch, Lauren Conrad was ruining her life by turning down a trip to Paris on The Hills, and fashion magazines were full of Lanvin and Chloé and Tory Burch flats. The style was paired with every kind of outfit you could think of—the chunky white sneaker of its day, if you will.

    How you feel about the shoes’ revival likely has a lot to do with your age. If you’re young enough to be witnessing ballet flats’ popularity for the first time, then maybe they seem like a pleasantly retro and feminine departure from lug soles and sneakers. If, like me, you’ve made it past 30(ish), the whole thing might make you feel a little old. Physically, ballet flats are a nightmare for your back, your knees, your arches; when it comes to support, most offer little more than you’d get from a pair of socks. Spiritually, the injury might be even worse. Twenty years is a normal amount of time to have passed for a trend to be revived as retro, but it’s also a rude interval at which to contemplate being punted out of the zeitgeist in favor of those who see your youth as something to be mined for inspiration—and therefore as something definitively in the past.

    Trends are a funny thing. Especially in fashion, people see trends as the province of the very young, but tracing their paths is often less straightforward. Take normcore’s dad sneakers: In the mid-2010s, the shoes became popular among Millennials, who were then hitting their 30s, precisely because they were the sneakers of choice for retired Boomers. But in order for a trend to reach the rare heights of population-level relevance, very young people do eventually need to sign on. In the case of dad sneakers, it took years for Zoomers to come around en masse, but their seal of approval has helped keep bulky New Balances popular for nearly a decade—far past the point when most trends fizzle.

    The return of ballet flats is a signal of this new cohort of fashion consumers asserting itself even more widely in the marketplace. The trends young people endorse tend to swing between extremes. The durable popularity of dad shoes all but guaranteed that some young people would eventually start to look for something sleeker and less substantial. The ballet flat fits perfectly within the turn-of-the-millennium fashion tropes—overplucked eyebrows, low-rise jeans, tiny sunglasses—that Zoomers have been tinkering with for several years.

    Ballet flats are an all-the-more-appropriate sign of a generational shift, in fact, because they are the folly of youth made manifest. Wearing them is an act of violence against podiatry, yes, but their drawbacks go further. Many ballet flats are so flimsy that they look trashed after only a few wears. They’re difficult to pair with socks, so they stink like feet almost as quickly. Ballet flats are impractical shoes that sneak into closets under the guise of practicality—hey, they’re not high heels!—and prey on people who do not yet know better.

    What does that mean, then, for the people who do know better? For one, it means that the extended adolescence that some Millennials experienced following the Great Recession is finally, inarguably over. We’re old, at least relatively speaking. Every generation eventually ages out of the particular cultural power of youth and then watches as younger people make mistakes that seem obvious in hindsight, and the ballet flat is a reminder that people my age are no longer the default main characters in culture that we once were. When I was a middle schooler begging for a pair of wooden-soled Candie’s platform sandals in the mid-’90s, I remember my mother, in a fit of exasperation, telling me that I couldn’t have them because she saw too many people fall off their platforms in the ’70s. This is the first time I remember contemplating my mom as a human being who existed long before I was conscious of her: someone who bought cool but ill-advised clothes and uncomfortable shoes, who went to parties where people sometimes had a hard time remaining upright.

    Even the cool girls with the coolest shoes at some point grow to regard parts of their past selves as a bit silly, and they become the people trying to save the kids from their own fashion hubris. This sensation is undoubtedly acute for Millennials, because this hubris is displayed most prominently in an arena they used to rule: the internet. On TikTok, the world’s hottest trend machine, the over-30 crowd is more onlooker than participant, and the youth are using the platform to encourage one another to dress like they’re going to a party at the Delt house in 2007. Someone has to warn them.

    If you’re realizing that this someone is you, my advice would be to not let the generational responsibilities of aging weigh too heavily on you. The upside of losing your spot at culture’s center stage, after all, is freedom. You can look around at what’s fashionable, pick the things that work for you, and write off the rest as the folly of youth. (The Zoomers are right: The lug-soled combat boots that I wore in high school actually are very cool.) In place of chasing trends, you can cultivate taste. When you fail at taste, at least you can be aware of your own questionable decisions. In the process of writing this article, I realized that French Sole still makes the exact same prim little flats that I must have bought three or four times over during the course of my first post-college job, in the late 2000s. They’re as flimsy as ever, but whatever made me love them 15 years ago is still there, buried under all of my better judgment. I haven’t closed the tab quite yet.

    Amanda Mull

  • A Gaping Hole in Cancer-Therapy Trials

    This article was originally published by Undark Magazine.

    In October 2021, 84-year-old Jim Yeldell was diagnosed with Stage 3 lung cancer. The first drug he tried disrupted his balance and coordination, so his doctor halved the dose to minimize these side effects, Yeldell recalls. In addition, his physician recommended a course of treatment that included chemotherapy, radiation, and a drug targeting a specific genetic mutation. This combination can be extremely effective—at least in younger people—but it can also be “incredibly toxic” in older, frail people, says Elizabeth Kvale, a palliative-care specialist at Baylor College of Medicine, and also Yeldell’s daughter-in-law.

    Older patients are often underrepresented in clinical trials of new cancer treatments, including the one offered to Yeldell. As a result, he learned of the potential for toxicity only because his daughter-in-law had witnessed the treatment’s severe side effects in the older adults at her clinic.

    This dearth of age-specific data has profound implications for clinical care, because older adults are more likely than younger people to be diagnosed with cancer. In the U.S., approximately 42 percent of people with cancer are over the age of 70—a number that could grow in the years to come—yet they make up less than a quarter of the people in clinical trials to test new cancer treatments. Many of those who do participate are the healthiest of the aged, who may not have common age-related conditions like diabetes or poor kidney or heart function, says Mina Sedrak, a medical oncologist and the deputy director of the Center for Cancer and Aging at City of Hope National Medical Center.

    For decades, clinical trials have tended to exclude older participants for reasons that include concerns about preexisting conditions, other medications, and participants’ ability to travel to trial locations. As a result, clinicians cannot be confident that approved cancer drugs will work as well in the people most likely to have cancer as they did in clinical trials. This dearth of data means that older cancer patients must decide if they want to pursue a treatment that might yield fewer benefits—and cause more side effects—than it did for younger people in the clinical trial.

    This evidence gap extends across the spectrum of cancer treatments—from chemotherapy and radiation to immune-checkpoint inhibitors—with sometimes-dire results. Many forms of chemotherapy, for example, have proved to be more toxic in older adults, a discovery that came only after the drugs were approved for use in this population. “This is a huge problem,” Sedrak says. In an effort to minimize side effects, doctors will often tweak the dose or duration of medications that are given to older adults, but these physicians are doing this without any real guidance.

    Despite recommendations from funders and regulators, as well as extensive media coverage, not much has changed in the past three decades. “We’re in this space where everyone agrees this is a problem, but there’s very little guidance on how to do better for older adults,” Kvale says. “The consequences in the real world are stark.”


    Post-approval studies of cancer drugs have helped shed light on the disconnect between how these drugs are used in clinical trials and how they are used in clinics around the country.

    For example, when Cary Gross, a physician and cancer researcher at Yale, set out to study the use of a new kind of cancer drug known as an immune-checkpoint inhibitor, he knew that most clinicians were well aware that clinical trials overlooked older patients. Gross’s research team suspected that some doctors might be wary of offering older adults the treatments, which work by preventing immune cells from switching off, thus allowing them to kill cancer cells. “Maybe they’re going to be more careful,” he says, and offer the intervention to younger patients first.

    But in a 2018 analysis of more than 3,000 patients, Gross and his colleagues found that within four months of approval by the FDA, most patients eligible to receive a class of immune-checkpoint inhibitors were being prescribed the drugs. And the patients receiving this treatment in clinics were significantly older than those in the clinical trials. “Oncologists were very ready to give these drugs to the older patients, even though they’re not as well represented,” Gross says.

    In another analysis, published this year, Gross and his colleagues examined how these drugs helped people diagnosed with certain types of lung cancer. The team found that the drugs extended the life of patients under the age of 55 by a median of four and a half months, but only by a month in those over the age of 75.

    The evidence doesn’t suggest that checkpoint inhibitors aren’t helpful for many patients, Gross says. But it’s important to identify which particular populations are helped the most by these drugs. “I thought that we would see a greater survival benefit than we did,” he says. “It really calls into question how we’re doing research, and we really have to double down on doing more research that includes older patients.”

    People over the age of 65 don’t fare well with other types of cancer treatments either. About half of older patients with advanced cancer experience severe and even potentially life-threatening side effects with chemotherapy, which can lead oncologists to lower medication doses, as in Yeldell’s case.

    There’s a strong connection between the lack of evidence from clinical trials and worse outcomes in the clinic, according to Kvale. “There’s a lot of enthusiasm for these medicines that don’t seem so toxic up front,” she says, “but understanding where they do or don’t work well is key—not just because of the efficacy, but because those drugs are almost toxically expensive sometimes.”

    Since the earliest reports of this data gap, regulators and researchers have tried to fix the problem. Changes to clinical trials have, in principle, made it easier for older adults to sign up. For instance, fewer and fewer studies have an upper age limit for participants. Last year, the FDA issued guidance to industry-funded trials recommending the inclusion of older adults and relaxing other criteria, to allow for participants with natural age-related declines. Still, the problem persists.

    When Sedrak and his colleagues set out to understand why the needle had moved so little over the past few decades, their analysis found a number of explanations, beginning with eligibility criteria that may inadvertently disqualify older adults. Physicians may also be concerned about their older patients’ ability to tolerate unknown side effects of new drugs. Patients and caregivers share these concerns. The logistics of participation can also prove problematic.

    “But of all these, the main driving force, the upstream force, is that trials are not designed with older adults in mind,” Sedrak says. Clinical trials tend to focus on survival, and although older adults do care about this, many of them have other motivations—and concerns—when considering treatment.


    Clinical trials are generally geared toward measuring improvements in health: They may track the size of tumors or months of life gained. These issues aren’t always top of mind for older adults, according to Sedrak. He says he’s more likely to hear questions about how side effects may influence the patient’s cognitive function, ability to live independently, and more. “We don’t design trials that capture the end points that older adults want to know,” he says.

    As a group, older adults do experience more side effects, sometimes so severe that the cure rivals the disease. In the absence of evidence from clinical trials, clinicians and patients have tried to find other ways to predict how a patient’s age might influence their response to treatment. In Yeldell’s case, discussions with Kvale and his care team led him to choose a less intensive course of treatment that has kept his cancer stable since October 2022. He continues to live in his own home and exercises with a trainer three times a week.

    For others trying to weigh their choices, researchers are developing tools that can create a more complete picture by accounting for a person’s physiological age. In a 2021 clinical trial, Supriya Mohile, a geriatric oncologist at the University of Rochester, and her colleagues tested the use of one such tool, known as a geriatric assessment, on the side effects and toxicity of cancer treatments. The tool assesses a person’s biological age based on various physiological tests.

    The team recruited more than 700 people with an average age of 77 who were about to embark on a new cancer-treatment regimen with a high risk of toxicity. Roughly half of the participants received guided treatment-management recommendations based on a geriatric assessment, which their oncologists factored into their treatment decisions. Only half of this group of patients experienced serious side effects from chemotherapy, compared with 71 percent of those who didn’t receive specialized treatment recommendations.

    This type of assessment can help avoid both undertreatment of people who might benefit from chemotherapy and overtreatment of those at risk of serious side effects, Mohile says. It doesn’t compensate for the lack of data on older adults. But in the absence of that evidence, tools such as geriatric assessment can help clinicians, patients, and families make better-informed choices. “We’re kind of going backwards around the problem,” Mohile says. Although geriatric oncologists recognize the need for better ways to make decisions, she says, “I think the geriatric assessment needs to be implemented until we have better clinical-trial data.”

    Since 2018, the American Society of Clinical Oncology has recommended the use of geriatric assessment to guide cancer care for older patients. But clinicians have been slow to follow through in their practice, in part because the assessment doesn’t necessarily show any cancer-specific benefits, such as tumors shrinking and people living longer. Instead, the tool’s main purpose is to improve quality of life. “We need more prospective therapeutic trials in older adults, but we also need all of these other mechanisms to be funded,” Mohile says, “so we actually know what to do for older adults who are in the real world.”

    Jyoti Madhusoodanan

  • Life Is Worse for Older People Now

    Last December, during a Christmas Eve celebration with my in-laws in California, I observed what I now realize was the future of COVID for older people. As everyone crowded around the bagna cauda, a hot dipping sauce shared like fondue, it was clear that we, as a family, had implicitly agreed that the pandemic was over. Our nonagenarian relatives were not taking any precautions, nor was anyone else taking precautions to protect them. Endive spear in hand, I squeezed myself in between my 94-year-old grandfather-in-law and his spry 99-year-old sister and dug into the dip.

    We all knew that older people bore the brunt of COVID, but the concerns seemed like a relic from earlier in the pandemic. The brutal biology of this disease has meant that they have disproportionately fallen sick, been hospitalized, and died. Americans over 65 make up 17 percent of the U.S. population, but they have accounted for three-quarters of all COVID deaths. As the death count among older people began to rise in 2020, “a lot of my patients were really concerned that they were being exposed without anyone really caring about them,” Sharon Brangman, a geriatrician at SUNY Upstate University Hospital, told me.

    But even now, three years into the pandemic, older people are still in a precarious position. While many Americans can tune out COVID and easily fend off an infection when it strikes, older adults continue to face real threats from the illness in the minutiae of their daily life: grocery trips, family gatherings, birthday parties, coffee dates. That is true even with the protective power of several shots and the broader retreat of the virus. “There is substantial risk, even if you’ve gotten all the vaccines,” Bernard Black, a law professor at Northwestern University who studies health policy, told me. More than 300 people still die from COVID each day, and the overwhelming majority of them are older. People ages 65 and up are currently hospitalized at nearly 11 times the rate of adults under 50.

    Compounding this sickness are all the ways that, COVID aside, this pandemic has changed life for older adults. Enduring severe isolation and ongoing caregiver shortages, they have been disproportionately harmed by the past few years. Not all of them have experienced the pandemic in the same way. Americans of retirement age, 65 and older, are a huge population encompassing a range of incomes, health statuses, living situations, and racial backgrounds. Nevertheless, by virtue of their age alone, they live with a new reality: one in which life has become more dangerous—and in many ways worse—than it was before COVID.


    The pandemic was destined to come after older Americans. Their immune systems tend to be weaker, making it harder for them to fight off an infection, and they are more likely to have comorbidities, which further increases their risk of severe illness. The precarity that many of them already faced going into 2020—poverty, social isolation and loneliness, inadequate personal care—left them poorly equipped for the arrival of the novel coronavirus. More than 1 million people lived in nursing homes, many of which were densely packed and short on staff when COVID tore through them.

    A major reason older people are still at risk is that vaccines can’t entirely compensate for their immune systems. A study recently published in the journal Vaccines showed that for vaccinated adults ages 60 and over, the risk of dying from COVID versus other natural causes jumped from 11 percent to 34 percent within a year of completing their primary shot series. A booster dose brings the risk back down, but other research shows that it wears off too. A booster is a basic precaution, but “not one that everyone is taking,” Black, a co-author of the study, told me. Booster uptake among older Americans for the reengineered “bivalent” shots is the highest of all age groups, but still, nearly 60 percent have not gotten one.

    For every COVID death, many more older people develop serious illness. Risk increases with age, and people older than 70 “have a substantially higher rate of hospitalizations” than those ages 60 to 69, Caitlin Rivers, an epidemiologist at Johns Hopkins University, told me. Unlike younger people, most of whom fully recover from a bout with COVID, a return to baseline health is less guaranteed for older adults. In one study, 32 percent of adults over 65 were diagnosed with symptoms that lasted well beyond their COVID infection. Persistent coughs, aches, and joint pain can linger long after serious illness, together with indirect impacts such as loss of muscle strength and flexibility, which can affect older people’s ability to be independent, Rivers said. Older COVID survivors may also have a higher risk of cognitive decline. In some cases, these ailments could be part of long COVID, which may be more prevalent in older people.

    Certainly, some older adults are able to make a full recovery. Brangman said she has “old and frail” geriatric patients who bounced back after flu-like symptoms, and younger ones who still experience weakness and fatigue. Still, these are not promising odds. The antiviral Paxlovid was supposed to help blunt the wave of old people falling sick and ending up in the hospital—and it can reduce severe disease by 50 to 90 percent. But unfortunately, it is not widely used; as of July, just a third of Americans 80 or older took Paxlovid.

    The reality is that as long as the virus continues to be prevalent, older Americans will face these potential outcomes every time they leave their home. That doesn’t mean they will barricade themselves indoors, or that they even should. Still, “every decision that we make now is weighing that balance between risk and socialization,” Brangman said.


    Long before the pandemic, the threat of illness was already very real for older people. Where America has landed is hardly a new way of life but rather one that is simply more onerous. “One way to think about it is that this is a new risk that’s out there” alongside other natural causes of death, such as diabetes and heart failure, Black said. But it’s a risk older Americans can’t ignore, especially as the country has dropped all COVID precautions. Since Christmas Eve, I have felt uneasy about how readily I normalized putting so little effort into protecting my nonagenarian loved ones, despite knowing what might happen if they got sick. For older people, who must contend with the peril of attending similar gatherings, “there’s sort of no good choice,” Black said. “The world has changed.”

    But this new post-pandemic reality also includes insidious effects on older people that aren’t directly related to COVID itself. Those who put off nonemergency visits to the doctor earlier in the pandemic, for example, risked worsening their existing health conditions. The first year of the pandemic plunged nearly everyone into isolation, but being alone created problems for older adults that still persist. Before the pandemic, the association between loneliness and higher mortality rates, increased cardiovascular risks, and dementia among older adults was already well established. Increased isolation during COVID amplified this association.

    The consequences of isolation were especially profound for older adults with physical limitations, Naoko Muramatsu, a community-health professor at the University of Illinois at Chicago, told me. When caregivers or family members were unable to visit, people who required assistance for even the smallest tasks, such as fetching the mail and getting dressed, had no options. “If you don’t walk around and if you don’t do anything, we can expect that cognitive function will decline,” Muramatsu said; she has observed this firsthand in her research. One Chinese American woman, interviewed in a survey of older adults living alone with cognitive impairment during the pandemic, described the debilitating effect of sitting at home all day. “I am so useless now,” she told the interviewer. “I am confused so often. I forget things.”

    Even older adults who have weathered the direct and indirect effects of the pandemic still face other challenges that COVID has exacerbated. Many have long relied on personal caregivers or the staff at nursing facilities. These workers, already scarce before the pandemic, are even more so now because many quit or were affected by COVID themselves. “Long-term care has been in a crisis situation for a long time, but it’s even worse now,” Muramatsu said, noting that many home care workers are older adults themselves. Nursing homes nationwide now have nearly 200,000 fewer employees compared with March 2020, which is especially concerning as the proportion of Americans over age 65 explodes.

    Older people won’t have one single approach to contending with this sad reality. “Everybody is trying to figure out what is the best way to function, to try to have some level of everyday life and activity, but also keep your risk of getting sick as low as possible,” Brangman said. Some of her patients are still opting to be cautious, while others consider this moment their “only chance to see grandchildren or concerts or go to family gatherings.” Either way, older Americans will have to wrestle with these decisions without so many of their peers who have died from COVID.

    Again, many of these people did not have it great before the pandemic, even if the rest of the country wasn’t paying attention. “We often don’t provide the basic social support that older people need,” Kenneth Covinsky, a clinician-researcher at the UCSF Division of Geriatrics, said. Rather, ageism, the willful ignorance of or indifference to the needs of older people, is baked into American life. It is perhaps the main reason older adults were so badly affected by the pandemic in the first place, as illustrated by the delayed introduction of safety precautions in nursing homes and the blithe acceptance of COVID deaths among older adults. If Americans couldn’t bring themselves to care at any point over the past three years, will they ever?

    Yasmin Tayag

  • Radio Atlantic: This Is Not Your Parents’ Cold War

    During the Cold War, NATO had nightmares of hundreds of thousands of Moscow’s troops pouring across international borders and igniting a major ground war with a democracy in Europe. Western governments feared that such a move by the Kremlin would lead to escalation—first to a world war and perhaps even to a nuclear conflict.

    That was then; this is now.

    Russia’s invasion of Ukraine is nearly a year old, and the Ukrainians are holding on. The Russians, so far, have not only been pushed back but are also taking immense casualties and material losses. For many Americans, the war is now just another conflict in the news. Do we need to worry about the nuclear threat of Putin’s war in Europe the way we worried about such things three decades ago?

    Our staff writer Tom Nichols, an expert on nuclear weapons and the Cold War, counsels Americans not to be obsessed with nuclear escalation, but to be aware of the possibilities for accidents and miscalculations. You can hear his thoughts here:

    The following is a transcript of the episode:

    Tom Nichols: It’s been a year since the Russians invaded Ukraine and launched the biggest conventional war in Europe since the Nazis. One of the things that I think we’ve all worried about in that time is the underlying problem of nuclear weapons.

    This is a nuclear-armed power at war with hundreds of thousands of people in the middle of Europe. This is the nightmare that American foreign policy has dreaded since the beginning of the nuclear age.

    And I think people have kind of put it out of their mind, how potentially dangerous this conflict is, which is understandable, but also, I think, takes us away from thinking about something that is really the most important foreign problem in the world today.

    During the Cold War, we would’ve thought about that every day, but these days, people just don’t think about it, and I think they should.

    My name is Tom Nichols. I’m a staff writer at The Atlantic. And I’ve spent a lot of years thinking about nuclear weapons and nuclear war. For 25 years, I was a professor of national-security affairs at Naval War College.

    For this episode of Radio Atlantic, I want to talk about nuclear weapons and what I think we should have learned from the history of the Cold War about how to think about this conflict today.

    I was aware of nuclear weapons at a pretty young age because my hometown, Chicopee, Massachusetts, was home to a giant nuclear-bomber base, Strategic Air Command’s East Coast headquarters, which had the big B-52s that would fly missions with nuclear weapons directly to the Soviet Union.

    I had a classic childhood of air-raid sirens, and hiding in the basement, and going under the desks, and doing all of that stuff. My high-school biology teacher had a grim sense of humor and told us, you know, because of the Air Force base, we were slated for instant destruction. He said, Yeah, if anything ever happens, we’re gone. We’re gone in seven or eight minutes. So I guess the idea of nuclear war and nuclear weapons was a little more present in my life at an earlier age than for a lot of other kids.

    It’s been a long time since anyone’s really had to worry about global nuclear war. It’s been over 30 years since the fall of the Berlin Wall. I think people who lived through the Cold War were more than happy to forget about it. I know I am glad to have it far in the past. And I think younger people who didn’t experience it have a hard time understanding what it was all about, and what that fear was about, because it’s part of ancient history now.

    But I think people really need to understand that Cold War history to understand what’s going on today, and how decision makers in Washington and in Europe, and even in Moscow, are playing out this war—because many of these weapons are still right where we left them.

    We have fewer of them, but we still have thousands of these weapons, many of them on a very short trigger. We could go from the beginning of this podcast to the end of the world, that short of [a] time. And it’s easy to forget that. During the Cold War, we were constantly aware of it, because it was the central influence on our foreign policy. But it’s important for us to look back at the history of the Cold War because we survived a long and very tense struggle with a nuclear-armed opponent. Now, some of that was through good and sensible policy. And some of it was just through dumb luck.

    Of course, the first big crisis that Americans really faced where they had to think about the existential threat of nuclear weapons was the Cuban missile crisis, in October of 1962.

    I was barely 2 years old. But living next to this big, plump nuclear target in Massachusetts, we actually knew people in my hometown who built fallout shelters. But we got through the Cuban missile crisis, in part because President Kennedy and Soviet Premier Nikita Khrushchev realized what was at stake.

    The gamble to put missiles in Cuba had failed, and—as Khrushchev put it in one of his messages—we had to stop pulling on the ends of the rope and tightening the knot of war. But we also got incredibly lucky.

    There was a moment aboard a Soviet submarine where the sub commander thought they were under attack. And he wanted to use nuclear-tipped torpedoes to take out the American fleet, which would’ve triggered a holocaust.

    I mean, it would’ve been an incredible amount of devastation on the world. Tens, hundreds of millions of people dead. And, um, fortunately a senior commander who had to consent to the captain’s idea vetoed the whole thing. He said, I don’t think that’s what’s happening. I don’t think they’re trying to sink us, and I do not consent. And so by this one lucky break with this one Soviet officer, we averted the end of the world. I mean, we averted utter catastrophe.

    After the Cuban missile crisis, people are now even more aware of this existential threat of nuclear weapons and it starts cropping up everywhere, especially in our pop culture. I mean, they were always there in the ’50s; there were movies about the communist threat and attacks on America. But after the Cuban missile crisis, that’s when you start getting movies like Dr. Strangelove and Fail Safe.

    Both were about an accidental nuclear war, which becomes a theme for most of the Cold War. In Dr. Strangelove, an American general goes nuts and orders an attack on Russia. And in Fail Safe, a piece of machinery goes bad and the same thing happens. And I think this reflected this fear that we now had to live with, this constant threat of something that we and the Soviets didn’t even want to do, but could happen anyway.

    Even in the James Bond movies, which were supposed to be kind of campy and fun, nuclear weapons were really often the source of danger. You know, bad guys were stealing them; people were trying to track our nuclear submarines. Throughout the ’60s, the ’70s, the ’80s, nuclear weapons really become just kind of soaked into our popular culture.

    We all know the Cuban missile crisis because it’s just part of our common knowledge about the world, even for people that didn’t live through it. I think we don’t realize how dangerous other times were. I always think of 1983 as the year we almost didn’t make it.

    1983 was an incredibly tense year. President Ronald Reagan began the year calling the Soviet Union an “evil empire.” And announced that the United States would start pouring billions of dollars into an effort to defend against Soviet missiles, including space-based defenses, which the Soviets found incredibly threatening.

    The relationship between the United States and the Soviet Union had just completely broken down. Really, by the fall of 1983, it felt like war was inevitable. It certainly felt to me like war was inevitable. There was kind of that smell of gunpowder in the air. We were all pretty scared. I was pretty scared. I was a graduate student at that point. I was 23 years old, and I was certain that this war, this cataclysmic war, was going to happen not only in my lifetime, but probably before I was 30 years old.

    And then a lot of things happened in 1983 that elevated the level of tension between the United States and the Soviet Union to extraordinary levels. I would say really dangerous levels. The Soviets did their best to prove they were an evil empire by shooting down a fully loaded civilian airliner, killing 269 people. Just weeks after the shoot-down of the Korean airliner, Soviet Air Defenses got an erroneous report of an American missile launch against them. And this is another one of those cases where we were just lucky. We were just fortunate.

    And in this case, it was a Soviet Air Defense officer, a lieutenant colonel, who saw this warning that the Americans had launched five missiles. And he said, You know, nobody starts World War III with five missiles. That seems wrong.

    And he said, I just, I think the system—which still had some bugs—I just don’t think the system’s right. We’re gonna wait that out. We’re gonna ignore that. He was actually later reprimanded.

    It was almost like he was reprimanded and congratulated at the same time, because if he had called Moscow and said, Look, I’m doing my duty. I’m reporting Soviet Air Defenses have seen American birds are in the air. They’re coming at us and over to you, Kremlin. And from there, a lot of bad decisions could have cascaded into World War III, especially after a year where we had been in such amazingly high conflict with each other.

    Once again, just as after the Cuban missile crisis, the increase in tension in the 1980s really comes through in the popular culture. Music, movies, TV put this sense of threat into the minds of ordinary Americans in a way that we just don’t have now. So people are going to the movies and they’re seeing movies like WarGames, once again about an accidental nuclear war. They’re seeing movies like Red Dawn, about a very intentional war by the Soviet Union against the United States.

    The Soviets thought that Red Dawn was actually part of Reagan’s attempt to use Hollywood to prepare Americans for World War III. In music, Ronald Reagan as a character made appearances in videos by Genesis or by Men at Work. That November, the biggest television event in history was The Day After, which was a cinematic representation of World War III.

    I mean, it was everywhere. By 1983, ’84, we were soaked in this fear of World War III. Nuclear war and Armageddon, no matter where you looked. I remember in the fall of 1983 going to see the new James Bond movie, one of the last Roger Moore movies, called Octopussy. And the whole plot amazed me because, of course, I was studying this stuff at the time, I was studying NATO and nuclear weapons.

    And here’s this opening scene where a mad Soviet general says, If only we can convince the West to give up its nuclear weapons, we can finally invade and take over the world.

    I saw all of these films as either a college student or a young graduate student, and again, it was just kind of woven into my life. Well, of course, this movie is about nuclear war. Of course, this movie is about a Soviet invasion. Of course, this movie is about, you know, the end of the world, because it was always there. It was always in the background. But after the end of the Cold War, that remarkable amount of pop-culture knowledge and just general cultural awareness sort of fades away.

    I think one reason that people today don’t look back at the Cold War with the same sense of threat is that it all ended so quickly. We went from [these] terrifying year[s] of 1983, 1984. And then suddenly Gorbachev comes in; Reagan reaches out to him; Gorbachev reaches back. They jointly agree in 1985—they issue a statement that, to this day, is still considered official policy by the Russian Federation and by the United States of America. They jointly declare a nuclear war can never be won and must never be fought.

    And all of a sudden, by the summer of 1985, 1986, it’s just over, and, like, 40 years of tension just came to an end in the space of 20, 24 months. Something I just didn’t think I would see in my lifetime. And I think that’s really created a false sense of security in later generations.

    After the Cold War, in the ’90s we have a Russia that’s basically friendly to the United States, but nuclear weapons are still a danger. For example, in 1995, Norway launched a scientific satellite on top of a missile—I think they were gonna study the northern lights—and the scientists gave everybody notice, you know, We’re gonna be launching this satellite. You’re gonna see a missile launch from Norway.

    Somebody in Russia just didn’t get the message, and the Russian defense people came to President Boris Yeltsin and they said, This might be a NATO attack. And they gave him the option to activate and launch Russian nuclear weapons. Yeltsin conferred with his people, and fortunately—because our relations were good, and because Boris Yeltsin and Bill Clinton had a good relationship, and because tensions were low in the world—Yeltsin says, Yeah, okay. I don’t buy that. I’m sure it’s nothing.

    But imagine again, if that had been somebody else.

    And that brings us to today. The first thing to understand is: We are in a better place than we were during the Cold War in many ways. During the Cold War, we had tens of thousands of weapons pointed at each other. Now by treaty, the United States and the Russian Federation each have about 1,500 nuclear weapons deployed and ready to go. Now, that’s a lot of nuclear weapons, but 1,500 is a lot better than 30,000 or 40,000.

    Nonetheless, we are dealing with a much more dangerous Russian regime with this mafia state led by Vladimir Putin.

    Putin is a mafia boss. There is no one to stop him from doing whatever he wants. And he has really convinced himself that he is some kind of great world historical figure who is going to reestablish this Christian Slavic empire throughout the former Soviet Union and remnants of the old Russian empire. And that makes him uniquely dangerous.

    People might wonder why Putin is even bothering with nuclear threats, because we’ve always thought of Russia as this giant conventional power because that’s the legacy of the Cold War. We were outnumbered. NATO at the time was only 16 countries. We were totally outnumbered by the Soviets and the Warsaw Pact in everything—men, tanks, artillery—and of course, the only way we could have repulsed an attack by the Soviet Union into Europe would’ve been to use nuclear weapons.

    I know earlier I mentioned the movie Octopussy. We’ve come a long way from the days when that mad Russian general could say, If only we got rid of nuclear weapons and NATO’s nuclear weapons, we could roll our tanks from Czechoslovakia to Poland through Germany and on into France.

    What people need to understand is that Russia is now the weaker conventional power. The Russians are now the ones saying, Listen, if things go really badly for us and we’re losing, we reserve the right to use nuclear weapons. The difference between Russia now and NATO then is: NATO was threatening to use these nuclear weapons if they were invaded and they were being just rolled over by Soviet tanks on their way to the English Channel. The Russians today are saying, We started this war, and if it goes badly for us, we reserve the right to use nuclear weapons to get ourselves out of a jam.

    This conventional weakness is actually what makes them more dangerous, because they’re now continually being humiliated in the field. And a country that had gotten by by convincing people that they were a great conventional power, that they had a lot of conventional capability, they’re being revealed now as a hollow power. They can’t even defeat a country a third of their own size.

    And so when they’re running out of options, you can understand at that point where Putin says, Well, the only way to scramble the deck and to get a do-over here is to use some small nuclear weapon in that area to kind of sober everybody up and shock them into coming to the table or giving me what I want.

    Now, I think that would be incredibly stupid. And I think a lot of people around the world, including China and other countries, have told Putin that would be a really bad idea. But I think one thing we’ve learned from this war is that Putin is a really lousy strategist who takes dumb chances because he’s just not very competent.

    And that comes back to the Cold War lesson—that you don’t worry about someone starting World War III as much as you worry about bumbling into World War III because of a bunch of really dumb decisions by people who thought they were doing something smart and didn’t understand that they were actually doing something really dangerous.

    So where does this leave us? This major war is raging through the middle of Europe, the scenario that we always dreaded during the Cold War; thousands and thousands of Moscow’s troops flooding across borders. What’s the right way to think about this? Perhaps the most important thing to understand is that this really is a war to defend democracy against an aggressive, authoritarian imperial state.

    The front line of the fight for civilization, really, is in Ukraine now. If Ukraine loses this war, the world will be a very different place. That’s what makes it imperative that Americans think about this problem. I think it’s imperative to support Ukraine in this fight, but we should do that with a prudent understanding of real risks that haven’t gone away.

    And so I think the Cold War provides some really good guidance here, which is to be engaged, to be aware, but not to be panicked. Not to become consumed by this fear every day, because that becomes paralyzing, that becomes debilitating. It’s bad for you as a person. And it’s bad for democracies’ ability to make decisions—because then you simply don’t make any decisions at all, out of fear.

    I think it’s important not to fall victim to Cold War amnesia and forget everything we learned. But I also don’t think we should become consumed by a new Cold War paranoia where we live every day thinking that we’re on the edge of Armageddon.

    Tom Nichols

  • Why Can’t I Stop Rooting for a God-Awful Basketball Team?

    When I attended a Washington Wizards open practice at D.C.’s Capital One Arena earlier this month, the focus was more on spectator entertainment than on Rocky-style workouts. The season opener was a week away, and the players ran drills at half speed and engaged in silly skills competitions for fans, including a basketball version of Connect Four. But as a lifelong Wiz devotee, I was having an awestruck, love-you-man moment. Here I was posing for a photo with Phil freakin’ Chenier. Franchise royalty. My childhood idol. Back in the 1970s, when Chenier was draining jumpers and sporting a Richard Pryor mustache, the team routinely chased titles. These days? Not so much.

    Being an NBA fan who loves the Wizards is a little like being a foodie who adores turnips: It just doesn’t make sense. Since the 2000–01 season, only the Knicks and Timberwolves have lost more games. The franchise last advanced beyond the second round of the playoffs in 1979 (back when they were called the Bullets), and they’ve missed the playoffs 16 of the past 25 years. We fans have endured 40-plus years of frustration and disappointment, mainly from the typical issues—bad defense, bad draft picks, bad trades—but sometimes from … weirder ones: One All-Star player was charged with a gun felony involving a teammate, and another was once suspended without pay for being overweight. It’s all #SoWizards, to use a Twitter hashtag.

    And yet, I made it out to the open practice with a few hundred fans on a Tuesday night, wearing a Wizards T-shirt and feeling the faint, irrational warmth of preseason hope. Anyone can root for a winner. That’s easy. Last season, the NFL teams with the top-selling merchandise were the Cowboys, 49ers, Patriots, Steelers, and Chiefs. Each team finished with a winning record. In Philadelphia, the currently undefeated Eagles and the World Series–bound Phillies have generated a 20 percent or more increase in business for local restaurants, sports bars, and memorabilia stores.

    But rooting for the middling Wizards takes guts at best and is downright masochism at worst. Still, even though the team is more likely to bring me agony than elation, I can’t fathom supporting any other franchise. The same is surely true of my fellow Wizards fans—and many fans of other perennial losers (hey, the Detroit Lions somehow still have fans). So why do we stay hooked?

    My Wizards fandom began in the D.C. suburbs in the ’70s, when I was a Bullets-crazed kid devouring box scores on the sports page, shooting jumpers on a backyard dirt court, and pretending to be Chenier. I was 12 when the Bullets paraded down Pennsylvania Avenue to celebrate their only title, and the subsequent 44 years have brought lots of bad memories: Last season, the Wizards somehow blew a 35-point lead against the L.A. Clippers. The worst part? I wasn’t surprised.

    Recent pain should feel stronger than childhood joy, I would think—even for fans like me, whose support was passed down geographically. But these deep, die-hard roots can influence our adult behavior. “Early learning is incredibly powerful and hard to erase,” Chris Crandall, a psychology professor at the University of Kansas who has studied fan allegiance, told me. The team’s success 50 years ago may have boosted my childhood loyalty, Crandall explained, and their subsequent failures did not remove it. A new attitude (“Wow, these guys stink”) essentially “lays over the old one, but the old one is still there,” Crandall said. “And it’s very difficult to get rid of it.”

    I’m at least old enough to remember the team’s lone championship. The top memory for Wizards fans in their 30s is probably John Wall’s dramatic game-winning three-pointer in Game 6 of the Eastern Conference semifinals. The Wizards, of course, then lost Game 7. But one reason fans stick around is the perverse pride they have in their fandom, Edward Hirt, a professor at Indiana University who has studied sports-fan psychology, told me. Rooting for the Lakers or the Dallas Cowboys is like wearing khakis: You hardly stand out in a crowd. Loving the Wizards gives me a defiant sense of individuality. “Do you want to be like everybody else, or do you want to be different?” Hirt said. “The answer is neither. We want to be a little bit of both. We like feeling like we belong, but we don’t want to be seen as a clone of everybody else, either.”

    Supporting a loser satisfies both of those desires. I can commune with fellow fans at a sports bar or game, but when I walk through an airport, even in D.C., I’m often the only guy wearing a Wizards cap. And honestly, I like that. My Wiz fandom, Andrew Billings, a sports-media professor at the University of Alabama, told me, sends a message to the world: “How loyal am I? I root for the Washington Wizards.” (Which, let’s be real, would be a great T-shirt.) In a 2015 study of students from seven universities, football fans were 55 percent less likely to wear team apparel following a defeat compared with a win. But those who do are making a statement: I’m not a fair-weather fan; I’m dedicated and trustworthy.

    Those noble qualities explain why fans of lousy teams despise fair-weather fans, Hirt added. Bandwagon fans skip the suffering but embrace the glory. If the Wizards somehow reached the NBA Finals this year, I’d be both thrilled and infuriated by the mobs of rapturous fans at downtown watch parties. Where were these bandwagon yahoos in 2001, when the team finished 19–63?

    But maybe winning matters less than we think—even for die-hard fans who react to each loss with a primal scream. In one 2019 study, fans of a college football team felt a two-day rise in self-esteem after a victory. But self-esteem levels didn’t drop significantly among losing fans. One of the reasons: Even if your team loses, you can raise your self-esteem simply by commiserating with friends, Billings, a co-author, said.

    Yes, suffering sucks, but suffering together has some upsides. It can be a social glue that intensifies bonds with the team and fellow fans. “Going through this hardship with your sports team makes you much more likely to stick with them,” Omri Gillath, a psychology professor at the University of Kansas, told me. Fans don’t just bask in reflected glory, or BIRG, as psychologists call it; they also BIRF—bask in reflected failure. “It’s about having a community of people that understand you and like the same thing that you do,” Gillath said.

    Last season, a friend and I attended the Wizards’ home finale, and they got shellacked by the equally lousy Knicks. But my friend and I enjoyed laughs over pregame beers. We made sarcastic comments as the Wiz turned a 10–0 lead into a 22-point deficit. I bought an end-of-the-season discounted T-shirt at the team store. Listening to Knicks fans hoot about their victory was annoying, but we had fun. And we bonded.

    But rooting for a losing team may be a dying phenomenon. Sports betting and streaming have made sports more solitary and less tied to where you live—undercutting some of the reasons fans endure their god-awful teams. “Geographic loyalty is particularly powerful for older generations, partly because they weren’t nearly as mobile with their jobs or their careers as younger people are,” Billings said. “I live in Alabama. If I wanted to be a Golden State Warriors fan, I could access all 82 of their regular-season games in a way that was not possible for older generations when they built their fandom.” Younger fans may also be more likely to follow a single player than a particular team, Billings believes.

    Let’s be clear: Winning is way better than losing. A 2013 study found that on the Monday after NFL games, fans of losing teams were more likely to consume saturated fats and sugars compared with fans of winning teams. But I truly believe—and maybe this is loser talk—that my decades of Wizards fandom have made me a better human. I have well-developed coping skills. My friends and I are like Statler and Waldorf, the crusty hecklers on The Muppet Show: We manage head-smacking losses with well-timed quips. I don’t get too elated after a victory—although victories mean more when they’re rare—or too down after a defeat. Hell, maybe it’s even made me more empathetic to people’s challenges. After all, most of us in life can relate more to the constantly struggling Wizards than to the trophy-hoisting Warriors.

    Even though I know better, I’m optimistic this season won’t be a #SoWizards year. Maybe the team will jell. Maybe the young players will develop. Maybe the veterans will stay healthy. Or, you know, maybe not. A struggling sports franchise, I’ve decided, is like your idiot brother or jackass uncle. Despite all their obvious flaws, you still love them. And so I’ll cherish disco-era Bullets memories, celebrate the unexpected victories, cling to foolish hope, and brace myself for the worst. If they miss the playoffs—again—well, there’s always next year.

    Ken Budd
