ReportWire

Tag: minority groups

  • A Life Without Nature Is a Lonely One

    My Brooklyn apartment is designed for sterility. The windows have screens to keep out bugs; I chose my indoor plants specifically because they don’t attract pests. While commuting to other, similarly aseptic indoor spaces—co-working offices, movie theaters, friends’ apartments—I’ll skirt around pigeons, avert my eyes from a gnarly rat, shudder at the odd scuttling cockroach. But once I’m back inside, the only living beings present (I hope, and at least as far as I know) are the ones I’ve chosen to interact with: namely, my partner and the low-maintenance snake plant on the windowsill.

    My aversion to pigeons, rats, and cockroaches is somewhat justifiable, given their cultural associations with dirtiness and disease. But such disgust is part of a larger estrangement between humanity and the natural world. As nature grows unfamiliar, separate, and strange to us, we are more easily repelled by it. These feelings can lead people to avoid nature further, in what some experts have called “the vicious cycle of biophobia.”

    The feedback loop bears telling resemblance to another vicious cycle of modern life. Psychologists know that lonely individuals tend to think more negatively of others and see them as less trustworthy, which encourages even more isolation. Although our relationship to nature and our relationships with one another may feel like disparate phenomena, they are both parallel and related. A life without nature, it seems, is a lonely life—and vice versa.

    The Western world has been trending toward both biophobia and loneliness for decades. David Orr, an environmental-studies researcher and advocate for climate action, wrote in a 1993 essay that “more than ever we dwell in and among our own creations and are increasingly uncomfortable with the nature that lies beyond our direct control.” This discomfort might manifest as a dislike of camping, or annoyance at the scratchy touch of grass at the park. It might also show up as disgust in the presence of insects, which a 2021 paper from Japanese scholars found is partially driven by urbanization. Ousting nature from our proximity—with concrete, walls, window screens, and lifestyles that allow us to remain at home—also increases the likelihood that the experiences we do have with other lifeforms will be negative, Orr writes. You’re much less likely to love birds if the only ones around are the pigeons you perceive as dirty.

    The rise of loneliness is even better documented. Americans are spending more time inside at home and alone than they did a few decades ago. In his book Bowling Alone, the political scientist Robert Putnam cites data showing that, from the 1970s to the late 1990s, Americans went from entertaining friends at home about 15 times a year to just eight. No wonder, then, that nearly a fifth of U.S. adults reported feeling lonely for much of the previous day in an April Gallup poll. Loneliness has become a public-health buzzword; Surgeon General Vivek Murthy calls it an “epidemic” that affects both mental and physical health. At least in the United States, COVID-19 has made things worse by expanding our preferred radius of personal space, and when that space is infringed upon, reactions now more often turn violent.

    That loneliness and biophobia are rising in tandem may be more than a coincidence. Orr wrote in his 1993 essay that appreciation of nature will flourish mostly in “places in which the bonds between people, and those between people and the natural world create a pattern of connectedness, responsibility, and mutual need.” The literature suggests that he’s right. Our sense of community certainly affects how comfortable and desirable we find time in nature, Viniece Jennings, a senior fellow in the JPB Environmental Health Fellowship Program at Harvard who studies these relationships, told me. In one 2017 study across four European cities, having a greater sense of community trust was linked to more time spent in communal green spaces. A 2022 study showed that, during COVID-related shutdowns, Asians in Australia were more likely to walk outside if they lived in close-knit neighborhoods with high interpersonal trust.

    Relationships between racial and ethnic groups can have an especially strong influence on time spent in nature. In the 2022 study from Australia, Asians were less likely to go walking than white people, which the study authors attributed to anti-Asian racism. Surveys consistently show that minority groups in the U.S., especially Black and Hispanic Americans, are less likely to participate in outdoor recreation, commonly citing racism, fear of racist encounters, or lack of easy access as key factors. Inclusive messaging in places like urban parks, by contrast, may motivate diverse populations to spend time outdoors.

    On the flip side, being in nature or even just remembering times you spent there can increase feelings of belonging, says Katherine White, a behavioral scientist at the University of British Columbia who co-wrote a 2021 paper on the subject. The authors of one 2022 paper found that “people who strongly identify with nature, who enjoy being in nature, and who had more frequent garden visits were more likely to have a stronger sense of social cohesion.” In a 2018 study from Hong Kong, preschool children who were more engaged with nature had better relationships with their peers and demonstrated more kindness and helpfulness. A 2014 experiment in France showed that people who had just spent time walking in a park were more likely to pick up and return a glove dropped by a stranger than people who were just about to enter the park. The results are consistent, White told me: “Being in nature makes you more likely to help other people,” even at personal cost.

    Time spent in natural spaces might contribute to a greater sense of belonging in part because it usually requires you to be in public space. Unlike homes and offices, natural spaces provide a setting for unpredictable social interactions—such as running into a new neighbor at the dog park or starting a spontaneous conversation with a stranger on your walking path—which “can be a great space for forming connections and building social networks,” Jennings said. In a study in Montreal, Canada, researchers found that time in public parks and natural spaces allowed immigrant families to converse with neighbors, make new friends, and feel better integrated in their new communities, all for free. Similarly, there’s some reason to suspect that strong human relationships can help extinguish any disgust we feel toward the natural world. We learn fear through one another, Daniel Blumstein, an evolutionary biologist at UCLA, told me. The more safe and enjoyable experiences we accumulate in groups, the better our tolerance for new and unfamiliar things.

    It would be a stretch to say that just getting people to touch more grass will solve all societal ills, or that better social cohesion will guarantee that humankind unites to save the planet. Our relationships with the Earth and one another fluctuate throughout our lives, and are influenced by a number of variables difficult to capture in any one study. But this two-way phenomenon is a sign that, if you’ve been meaning to go outside more or connect with your neighbors, you might as well work on both. “Natural ecosystems rely on different people” and vice versa, Jennings said. “You don’t have to go on long hikes every day to understand that.”



    Hannah Seo

  • The Pandemic’s Legacy Is Already Clear

    Recently, after a week in which 2,789 Americans died of COVID-19, President Joe Biden proclaimed that “the pandemic is over.” Anthony Fauci described the controversy around the proclamation as a matter of “semantics,” but the facts we are living with can speak for themselves. COVID still kills roughly as many Americans every week as died on 9/11. It is on track to kill at least 100,000 a year—triple the typical toll of the flu. Despite gross undercounting, more than 50,000 infections are being recorded every day. The CDC estimates that 19 million adults have long COVID. Things have undoubtedly improved since the peak of the crisis, but calling the pandemic “over” is like calling a fight “finished” because your opponent is punching you in the ribs instead of the face.

    American leaders and pundits have been trying to call an end to the pandemic since its beginning, only to be faced with new surges or variants. This mindset not only compromises the nation’s ability to manage COVID, but also leaves it vulnerable to other outbreaks. Future pandemics aren’t hypothetical; they’re inevitable and imminent. New infectious diseases have regularly emerged throughout recent decades, and climate change is quickening the pace of such events. As rising temperatures force animals to relocate, species that have never coexisted will meet, allowing the viruses within them to find new hosts—humans included. Dealing with all of this again is a matter of when, not if.

    In 2018, I wrote an article in The Atlantic warning that the U.S. was not prepared for a pandemic. That diagnosis remains unchanged; if anything, I was too optimistic. America was ranked as the world’s most prepared country in 2019—and, bafflingly, again in 2021—but accounts for 16 percent of global COVID deaths despite having just 4 percent of the global population. It spends more on medical care than any other wealthy country, but its hospitals were nonetheless overwhelmed. It helped create vaccines in record time, but is 67th in the world in full vaccinations. (This trend cannot solely be attributed to political division; even the most heavily vaccinated blue state—Rhode Island—still lags behind 21 nations.) America experienced the largest life-expectancy decline of any wealthy country in 2020 and, unlike its peers, continued declining in 2021. If it had fared as well as just the average peer nation, 1.1 million people who died last year—a third of all American deaths—would still be alive.

    America’s superlatively poor performance cannot solely be blamed on either the Trump or Biden administrations, although both have made egregious errors. Rather, the new coronavirus exploited the country’s many failing systems: its overstuffed prisons and understaffed nursing homes; its chronically underfunded public-health system; its reliance on convoluted supply chains and a just-in-time economy; its for-profit health-care system, whose workers were already burned out; its decades-long project of unweaving social safety nets; and its legacy of racism and segregation that had already left Black and Indigenous communities and other communities of color disproportionately burdened with health problems. Even in the pre-COVID years, the U.S. was still losing about 626,000 people more than expected for a nation of its size and resources. COVID simply toppled an edifice whose foundations were already rotten.

    In furiously racing to rebuild on this same foundation, America sets itself up to collapse once more. Experience is reputedly the best teacher, and yet the U.S. repeated mistakes from the early pandemic when faced with the Delta and Omicron variants. It got early global access to vaccines, and nonetheless lost almost half a million people after all adults became eligible for the shots. It has struggled to control monkeypox—a slower-spreading virus for which there is already a vaccine. Its right-wing legislators and courts have passed laws and issued rulings that curtail the possibility of important public-health measures like quarantines and vaccine mandates. It has made none of the broad changes that would protect its population against future pathogens, such as better ventilation or universal paid sick leave. Its choices virtually guarantee that everything that’s happened in the past three years will happen again.


    The U.S. will continue to struggle against infectious diseases in part because some of its most deeply held values are antithetical to the task of besting a virus. Since its founding, the country has prized a strain of rugged individualism that prioritizes individual freedom and valorizes self-reliance. According to this ethos, people are responsible for their own well-being, physical and moral strength are equated, social vulnerability results from personal weakness rather than policy failure, and handouts or advice from the government are unwelcome. Such ideals are disastrous when handling a pandemic, for two major reasons.

    First, diseases spread. Each person’s choices inextricably affect their community, and the threat to the collective always exceeds that to the individual. The original Omicron variant, for example, posed slightly less risk to each infected person than the variants that preceded it, but spread so quickly that it inundated hospitals, greatly magnifying COVID’s societal costs. To handle such threats, collective action is necessary. Governments need policies, such as vaccine requirements or, yes, mask mandates, that protect the health of entire populations, while individuals have to consider their contribution to everyone else’s risk alongside their own personal stakes. And yet, since the spring of 2021, pundits have mocked people who continue to think this way for being irrational and overcautious, and government officials have consistently framed COVID as a matter of personal responsibility.

    Second, a person’s circumstances always constrain their choices. Low-income and minority groups find it harder to avoid infections or isolate when sick because they’re more likely to live in crowded homes and hold hourly-wage jobs without paid leave or the option to work remotely. Places such as prisons and nursing homes, whose residents have little autonomy, became hot spots for the worst outbreaks. Treating a pandemic as an individualist free-for-all ignores how difficult it is for many Americans to protect themselves. It also leaves people with vulnerabilities that last across successive pathogens: The groups that suffered most during the H1N1 influenza pandemic of 2009 were the same ones that took the brunt of COVID, a decade later.

    America’s individualist bent has also shaped its entire health-care system, which ties health to wealth and employment. That system is organized around treating sick people at great and wasteful expense, instead of preventing communities from falling sick in the first place. The latter is the remit of public health rather than medicine, and has long been underfunded and undervalued. Even the CDC—the nation’s top public-health agency—changed its guidelines in February to prioritize hospitalizations over cases, implicitly tolerating infections as long as hospitals are stable. But such a strategy practically ensures that emergency rooms will be overwhelmed by a fast-spreading virus; that, consequently, health-care workers will quit; and that waves of chronically ill long-haulers who are disabled by their infections will seek care and receive nothing. All of that has happened and will happen again. America’s pandemic individualism means that it’s your job to protect yourself from infection; if you get sick, your treatment may be unaffordable, and if you don’t get better, you will struggle to find help, or even anyone who believes you.


    In the late 19th century, many scholars realized that epidemics were social problems, whose spread and toll are influenced by poverty, inequality, overcrowding, hazardous working conditions, poor sanitation, and political negligence. But after the advent of germ theory, this social model was displaced by a biomedical and militaristic one, in which diseases were simple battles between hosts and pathogens, playing out within individual bodies. This paradigm conveniently allowed people to ignore the social context of disease. Instead of tackling intractable social problems, scientists focused on fighting microscopic enemies with drugs, vaccines, and other products of scientific research—an approach that sat easily with America’s abiding fixation on technology as a panacea.

    The allure of biomedical panaceas is still strong. For more than a year, the Biden administration and its advisers have reassured Americans that, with vaccines and antivirals, “we have the tools” to control the pandemic. These tools are indeed effective, but their efficacy is limited if people can’t access them or don’t want to, and if the government doesn’t create policies that shift that dynamic. A profoundly unequal society was always going to struggle with access: People with low incomes, food insecurity, eviction risk, and no health insurance struggled to make or attend vaccine appointments, even after shots were widely available. A profoundly mistrustful society was always going to struggle with hesitancy, made worse by political polarization and rampantly spreading misinformation. The result is that just 72 percent of Americans have completed their initial course of shots and just half have gotten the first of the boosters necessary to protect against current variants. At the same time, almost all other protections have been stripped away, and COVID funding is evaporating. And yet the White House’s recent pandemic-preparedness strategy still focuses heavily on biomedical magic bullets, paying scant attention to the social conditions that could turn those bullets into duds.

    Technological solutions also tend to rise into society’s penthouses, while epidemics seep into its cracks. Cures, vaccines, and diagnostics first go to people with power, wealth, and education, who then move on, leaving the communities most affected by diseases to continue shouldering their burden. This dynamic explains why the same health inequities linger across the decades even as pathogens come and go, and why the U.S. has now normalized an appalling level of COVID death and disability. Such suffering is concentrated among elderly, immunocompromised, working-class, and minority communities—groups that are underrepresented among political decision makers and the media, who get to declare the pandemic over. Even when inequities are highlighted, knowledge seems to suppress action: In one study, white Americans felt less empathy for vulnerable communities and were less supportive of safety precautions after learning about COVID’s racial disparities. This attitude is self-destructive and limits the advantage that even the most privileged Americans enjoy. Measures that would flatten social inequities, such as universal health care and better ventilation, would benefit everyone—and their absence harms everyone, too. In 2021, young white Americans died at lower rates than Black and Indigenous Americans, but still at three times the rate of their counterparts in other wealthy countries.

    By failing to address its social weaknesses, the U.S. accumulates more of them. An estimated 9 million Americans have lost close loved ones to COVID; about 10 percent will likely experience prolonged grief, which the country’s meager mental-health services will struggle to address. Because of brain fog, fatigue, and other debilitating symptoms, long COVID is keeping the equivalent of 2 million to 4 million Americans out of work; between lost earnings and increased medical costs, it could cost the economy $2.6 trillion a year. The exodus of health-care workers, especially experienced veterans, has left hospitals with a shortfall of staff and know-how. Levels of trust—one of the most important predictors of a country’s success at controlling COVID—have fallen, making pandemic interventions harder to deploy, while creating fertile ground in which misinformation can germinate. This is the cost of accepting the unacceptable: an even weaker foundation that the next disease will assail.


    In the spring of 2020, I wrote that the pandemic would last for years, and that the U.S. would need long-term strategies to control it. But America’s leaders consistently acted as if they were fighting a skirmish rather than a siege, lifting protective measures too early and then reinstating them too slowly. They have skirted the responsibility of articulating what it would actually look like for the pandemic to be over, which has meant that whenever citizens managed to flatten the curve, the time they bought was wasted. Endemicity was equated with inaction rather than active management. This attitude removed any incentive or will to make the sort of long-term changes that would curtail the current disaster and prevent future ones. And so America has little chance of effectively countering the inevitable pandemics of the future; it cannot even focus on the one that’s ongoing.

    If change happens, it will likely occur slowly and from the ground up. In the vein of ACT UP—the extraordinarily successful activist group that changed the world’s approach to AIDS—grassroots organizations of long-haulers, grievers, immunocompromised people, and others disproportionately harmed by the pandemic have formed, creating the kind of vocal constituency that public health has long lacked.

    More pandemics will happen, and the U.S. has spectacularly failed to contain the current one. But it cannot afford the luxury of nihilism. It still has time to address its bedrocks of individualism and inequality, to create a health system that effectively prevents sickness instead of merely struggling to treat it, and to enact policies that rightfully prioritize the needs of disabled and vulnerable communities. Such changes seem unrealistic given the relentless disappointments of the past three years, but substantial social progress always seems unfeasible until it is actually achieved. Normal led to this. It is not too late to fashion a better normal.


    Ed Yong
