ReportWire

Tag: Indigenous communities

  • Vatican will return dozens of artifacts to Indigenous groups in Canada as part of reconciliation

    VATICAN CITY (AP) — The Vatican is expected to soon announce that it will return a few dozen artifacts sought by Indigenous communities in Canada as part of its reckoning with the Catholic Church’s troubled role in helping suppress Indigenous culture in the Americas, officials said Wednesday.

    The items, including an Inuit kayak, are part of the Vatican Museums’ ethnographic collection, known as the Anima Mundi museum. The collection has been a source of controversy for the Vatican amid the broader museum debate over the restitution of cultural artifacts taken from Indigenous peoples during colonial periods.

    Negotiations on returning the Vatican items accelerated after Pope Francis in 2022 met with Indigenous leaders who had traveled to the Vatican to receive his apology for the church’s role in Canada’s disastrous residential schools. During their visit, they were shown some objects in the collection, including wampum belts, war clubs and masks, and asked for them to be returned.

    Francis later said he was in favor of returning the items and others in the Vatican collection on a case-by-case basis, saying: “In the case where you can return things, where it’s necessary to make a gesture, better to do it.”

    The Canadian Conference of Catholic Bishops said Wednesday it has been working with Indigenous groups on returning the items to their “originating communities.” It said it expected the Holy See to announce the return. Vatican and Canadian officials said they expected an announcement in the coming weeks, and that the items could arrive on Canadian soil before the end of the year.

    The Globe and Mail newspaper first reported on the progress in the restitution negotiations.

    Most of the items in the Vatican collection were sent to Rome by Catholic missionaries for a 1925 exhibition in the Vatican gardens that was a highlight of that year’s Holy Year.

    Doubt cast on whether the items were freely given

    The Vatican insists the items were “gifts” to Pope Pius XI, who wanted to celebrate the church’s global reach, its missionaries and the lives of the Indigenous peoples they evangelized.

    But historians, Indigenous groups and experts have long questioned whether the items could really have been offered freely, given the power imbalances at play in Catholic missions at the time. In those years, Catholic religious orders were helping to enforce the Canadian government’s policy of eliminating Indigenous traditions, which Canada’s Truth and Reconciliation Commission has called “cultural genocide.”

    The return of the items will follow the “church-to-church” model the Vatican used in 2023, when it gave its fragments of the Parthenon Marbles to the Orthodox Church of Greece. The three fragments were described by the Vatican as a “donation” to the Orthodox church, not a state-to-state repatriation to the Greek government.

    In this case, the Vatican is expected to hand over the items to the Canadian bishops conference, with the explicit understanding that the ultimate keepers will be the Indigenous communities, a Canadian official said Wednesday, speaking on condition of anonymity because the negotiations are not concluded.

    What happens after the items are returned

    The items, accompanied by whatever provenance information the Vatican has, will be taken first to the Canadian Museum of History in Gatineau, Quebec. There, experts and Indigenous groups will try to identify where the items originated, down to the specific community, and what should be done with them, the official said.

    The official declined to say how many items were under negotiation or who decided what would be returned, but said the total numbered “a few dozen.”

    The aim is to get the items back this year, the official said, noting the 2025 Jubilee celebrating hope, and the centenary of the 1925 Holy Year that brought the items to Rome in the first place.

    The 1925 exhibit is now so controversial that its 100th anniversary has been virtually ignored by the Vatican, which celebrates a lot of anniversaries.

    The Assembly of First Nations said some logistical issues need to be finalized before the objects can be returned, including establishing protocols.

    “For First Nations, these items are not artifacts. They are living, sacred pieces of our cultures and ceremonies and must be treated as the invaluable objects that they are,” National Chief Cindy Woodhouse Nepinak told The Canadian Press.

    ___

    Associated Press religion coverage receives support through the AP’s collaboration with The Conversation US, with funding from Lilly Endowment Inc. The AP is solely responsible for this content.


  • Deer Are Beta-Testing a Nightmare Disease

    Scott Napper, a biochemist and vaccinologist at the University of Saskatchewan, can easily envision humanity’s ultimate doomsday disease. The scourge would spread fast, but the progression of illness would be slow and subtle. With no immunity, treatments, or vaccines to halt its progress, the disease would eventually find just about every single one of us, spreading via all manner of body fluids. In time, it would kill everyone it infected. Even our food and drink would not be safe, because the infectious agent would be hardy enough to survive common disinfectants and the heat of cooking; it would be pervasive enough to infest our livestock and our crops. “Imagine if consuming a plant could cause a fatal, untreatable neurodegenerative disorder,” Napper told me. “Any food grown within North America would be potentially deadly to humans.”

    This nightmare illness doesn’t yet exist. But for inspiration, Napper needs to look only at the very real contagion in his own lab: chronic wasting disease (CWD), a highly lethal, highly contagious neurodegenerative disease that is devastating North America’s deer, elk, and other cervids.

    In the half century since it was discovered in a captive deer colony in Colorado, CWD has worked its way into more than 30 U.S. states and four Canadian provinces, as well as South Korea and several countries in Europe. In some captive herds, the disease has been detected in more than 90 percent of individuals; in the wild, Debbie McKenzie, a biologist at the University of Alberta, told me, “we have areas now where more than 50 percent of the bucks are infected.” And CWD kills indiscriminately, gnawing away at deer’s brains until the tissue is riddled with holes. “The disease is out of control,” Dalia Abdelaziz, a biochemist at the University of Calgary, told me.

    What makes CWD so formidable is its cause: infectious misfolded proteins called prions. Prion diseases, which include mad cow disease, have long been known as terrifying and poorly understood threats. And CWD is, in many ways, “the most difficult” among them to contend with—more transmissible and widespread than any other known, Marcelo Jorge, a wildlife biologist at the University of Georgia, told me. Scientists are quite certain that CWD will be impossible to eradicate; even limiting its damage will be a challenge, especially if it spills into other species, which could include us. CWD is already a perfect example of how dangerous a prion disease can be. And it has not yet hit the ceiling of its destructive potential.


    Among the world’s known infectious agents, prions are an anomaly, more like zombies than living entities. Unlike standard-issue microbes—viruses, bacteria, parasites, fungi—prions are just improperly folded proteins, devoid of genetic material, unable to build more of themselves from scratch or cleave themselves in two. To reproduce, they simply find properly formed proteins that share their base composition and convert those to their aberrant shape, through mostly mysterious means. And because prions are slightly malformed versions of molecules that our bodies naturally make, they’re difficult to defend against. The immune system reads them as benign and ignores them, even as disease rapidly unfolds. “This is an entirely new paradigm of infectious disease,” Napper told me. “It’s a part of your own body that’s turning against you.”

    And yet, we’ve managed to keep many prion diseases in check. Kuru, once common in the highlands of Papua New Guinea, was transmitted through local rituals of funerary cannibalism; the disease fizzled out after people stopped those practices. Mad cow disease (more formally known as bovine spongiform encephalopathy) was contained by culling infected animals and eliminating the suspected source, cow feed made with infected tissues. Even scrapie, a highly contagious prion disease of sheep and goats, is limited to livestock, making it feasible to pare down infected populations, or breed them toward genetic resistance.

    CWD, meanwhile, is a fixture of wild animals, many of them migratory. And whereas most other prion diseases primarily keep to the central nervous system, CWD “gets in pretty much every part of the body,” Jorge told me. Deer then pass on the molecules, often through direct contact; they’ll shed prions in their saliva, urine, feces, reproductive fluids, and even antler velvet long before they start to show symptoms. Candace Mathiason, a pathobiologist at Colorado State University, and her colleagues have found that as little as 100 nanograms of saliva can seed an infection. Her studies suggest that deer can also pass prions in utero from doe to growing fawn.

    Deer also ingest prions from their environment, where the molecules can linger in soil, on trees, and on hunting bait for years or decades. A team led by Sandra Pritzkow, a biochemist at UTHealth Houston, has found that plants can take up prions from the soil, too. And unlike the multitude of microbes that are easily done in by UV, alcohol, heat, or low humidity, prions are so structurally sound that they can survive nearly any standard environmental assault. In laboratories, scientists must blast their equipment with temperatures of about 275 degrees Fahrenheit for 60 to 90 minutes, under extreme pressure, to rid it of prions—or drench their workspaces with bleach or sodium hydroxide, at concentrations high enough to rapidly corrode flesh.

    Infected deer are also frustratingly difficult to detect. The disease typically takes years to fully manifest, while the prions infiltrate the brain and steadily destroy neural tissue. The molecules kill insidiously: “This isn’t the kind of disease where you might get a group of deer that are all dead around this watering hole,” Jorge told me. Deer drift away from the herd; they forage at odd times. They become braver around us. They drool and urinate more, stumble about, and begin to lose weight. Eventually, a predator picks them off, or a cold snap freezes them, or they simply starve; in all cases, though, the disease is fatal. Because of CWD, deer populations in many parts of North America are declining; “there is definitely some concern that local populations will disappear,” McKenzie told me. Researchers worry the disease will soon overwhelm caribou in Canada, imperiling the Indigenous communities who rely on them for food. Hunters and farmers, too, are losing vital income. Deer are unlikely to go extinct, but the disease is depriving their habitats of key grazers, and their predators of food.

    In laboratory experiments, CWD has proved capable of infecting rodents, sheep, goats, cattle, raccoons, ferrets, and primates. But so far, jumps into non-cervid species don’t seem to be happening in the wild—and although people eat an estimated 10,000 CWD-infected cervids each year, no human cases have been documented. Still, lab experiments indicate that human proteins, at least when expressed by mice, could be susceptible to CWD too, Sabine Gilch, a molecular biologist at the University of Calgary, told me.

    And the more prions transmit, and the more hosts they find themselves in, the more opportunities they may have to infect creatures in new ways. Prions don’t seem to evolve as quickly as many viruses or bacteria, Gilch told me. But “they’re not as static as we would like them to be.” She, McKenzie, and other researchers have detected a multitude of CWD strains bopping around in the wild—each with its own propensity for interspecies spread. With transmission so unchecked, and hosts so numerous, “this is kind of like a ticking time bomb,” Surachai Supattapone, a biochemist at Dartmouth, told me.


    The world is unlikely to ever be fully rid of CWD; even the options to slow its advance are so far limited. Efforts to survey for infection depend on funding and researchers’ time, or the generosity of local hunters for samples; environmental decontamination is still largely experimental and tricky to do at scale; treatments—which don’t yet exist—would be nearly impossible to administer en masse. And culling campaigns, although sometimes quite effective, especially at the edges of the disease’s reach, often spark public backlash.

    Deer that carry certain genetic variants do seem less susceptible to prions, and progress more slowly to full-blown disease and death. But because none so far seems able to fully block infection, or completely curb shedding, prolonging life may simply prolong transmission. “Once an animal gets infected,” Abdelaziz told me, there’s almost a “hope it dies right away.” Even if sturdier prion resistance is someday found, “it’s probably just a matter of time until prions start to adapt to that as well,” Gilch said.

    Vaccines, in theory, could help, and in recent years, several research groups—including Napper’s and Abdelaziz’s—have made breakthroughs in overcoming the immune system’s inertia in attacking proteins that look like the body’s own. Some strategies try to target the problematic, invasive prions only; others are going after both the prion and the native, properly folded protein, so that the vaccine can do double duty, waylaying the infectious invader and starving it of reproductive fodder. (So far, lab animals seem to do mostly fine even when they’re bred to lack the native prion protein, whose function is still mostly mysterious.) In early trials, both teams’ vaccines have produced promising immune responses in cervids. But neither team yet fully knows how effective their vaccines are at cutting down on shedding, how long that protection might last, or whether these strategies will work across cervid species. One of Napper’s vaccine candidates, for instance, seemed to hasten the progression of disease in elk.

    Vaccines for wildlife are also tough to deliver, especially the multiple doses likely needed in this case. “It’s not like you can just run around injecting every elk and deer,” Napper told me. Instead, he and other researchers plan to compound their formula with a salty apple-cider slurry that he hopes wild cervids might eat with some regularity. “The deer absolutely love it,” he said.

    Should any CWD vaccines come to market, though, they will almost certainly be the first prion vaccines that clear the experimental stage. That could be a boon for more than just deer. Another prion disease may spill over from one species to another; others may arise spontaneously. CWD is not, and may never be, the prion disease that most directly affects us. But it is, for now, the most urgent—and the one from which we have the most to lose, and maybe gain.

    Katherine J. Wu


  • The Pandemic’s Legacy Is Already Clear

    Recently, after a week in which 2,789 Americans died of COVID-19, President Joe Biden proclaimed that “the pandemic is over.” Anthony Fauci described the controversy around the proclamation as a matter of “semantics,” but the facts we are living with can speak for themselves. COVID still kills roughly as many Americans every week as died on 9/11. It is on track to kill at least 100,000 a year—triple the typical toll of the flu. Despite gross undercounting, more than 50,000 infections are being recorded every day. The CDC estimates that 19 million adults have long COVID. Things have undoubtedly improved since the peak of the crisis, but calling the pandemic “over” is like calling a fight “finished” because your opponent is punching you in the ribs instead of the face.

    American leaders and pundits have been trying to call an end to the pandemic since its beginning, only to be faced with new surges or variants. This mindset not only compromises the nation’s ability to manage COVID, but also leaves it vulnerable to other outbreaks. Future pandemics aren’t hypothetical; they’re inevitable and imminent. New infectious diseases have regularly emerged throughout recent decades, and climate change is quickening the pace of such events. As rising temperatures force animals to relocate, species that have never coexisted will meet, allowing the viruses within them to find new hosts—humans included. Dealing with all of this again is a matter of when, not if.

    In 2018, I wrote an article in The Atlantic warning that the U.S. was not prepared for a pandemic. That diagnosis remains unchanged; if anything, I was too optimistic. America was ranked as the world’s most prepared country in 2019—and, bafflingly, again in 2021—but accounts for 16 percent of global COVID deaths despite having just 4 percent of the global population. It spends more on medical care than any other wealthy country, but its hospitals were nonetheless overwhelmed. It helped create vaccines in record time, but is 67th in the world in full vaccinations. (This trend cannot solely be attributed to political division; even the most heavily vaccinated blue state—Rhode Island—still lags behind 21 nations.) America experienced the largest life-expectancy decline of any wealthy country in 2020 and, unlike its peers, continued declining in 2021. If it had fared as well as just the average peer nation, 1.1 million people who died last year—a third of all American deaths—would still be alive.

    America’s superlatively poor performance cannot solely be blamed on either the Trump or Biden administrations, although both have made egregious errors. Rather, the new coronavirus exploited the country’s many failing systems: its overstuffed prisons and understaffed nursing homes; its chronically underfunded public-health system; its reliance on convoluted supply chains and a just-in-time economy; its for-profit health-care system, whose workers were already burned out; its decades-long project of unweaving social safety nets; and its legacy of racism and segregation that had already left Black and Indigenous communities and other communities of color disproportionately burdened with health problems. Even in the pre-COVID years, the U.S. was still losing about 626,000 more people than expected for a nation of its size and resources. COVID simply toppled an edifice whose foundations were already rotten.

    In furiously racing to rebuild on this same foundation, America sets itself up to collapse once more. Experience is reputedly the best teacher, and yet the U.S. repeated mistakes from the early pandemic when faced with the Delta and Omicron variants. It got early global access to vaccines, and nonetheless lost almost half a million people after all adults became eligible for the shots. It has struggled to control monkeypox—a slower-spreading virus for which there is already a vaccine. Its right-wing legislators and courts have passed laws and issued rulings that curtail the possibility of important public-health measures like quarantines and vaccine mandates. It has made none of the broad changes that would protect its population against future pathogens, such as better ventilation or universal paid sick leave. Its choices virtually guarantee that everything that’s happened in the past three years will happen again.


    The U.S. will continue to struggle against infectious diseases in part because some of its most deeply held values are antithetical to the task of besting a virus. Since its founding, the country has prized a strain of rugged individualism that prioritizes individual freedom and valorizes self-reliance. According to this ethos, people are responsible for their own well-being, physical and moral strength are equated, social vulnerability results from personal weakness rather than policy failure, and handouts or advice from the government are unwelcome. Such ideals are disastrous when handling a pandemic, for two major reasons.

    First, diseases spread. Each person’s choices inextricably affect their community, and the threat to the collective always exceeds that to the individual. The original Omicron variant, for example, posed slightly less risk to each infected person than the variants that preceded it, but spread so quickly that it inundated hospitals, greatly magnifying COVID’s societal costs. To handle such threats, collective action is necessary. Governments need policies, such as vaccine requirements or, yes, mask mandates, that protect the health of entire populations, while individuals have to consider their contribution to everyone else’s risk alongside their own personal stakes. And yet, since the spring of 2021, pundits have mocked people who continue to think this way for being irrational and overcautious, and government officials have consistently framed COVID as a matter of personal responsibility.

    Second, a person’s circumstances always constrain their choices. Low-income and minority groups find it harder to avoid infections or isolate when sick because they’re more likely to live in crowded homes and hold hourly-wage jobs without paid leave or the option to work remotely. Places such as prisons and nursing homes, whose residents have little autonomy, became hot spots for the worst outbreaks. Treating a pandemic as an individualist free-for-all ignores how difficult it is for many Americans to protect themselves. It also leaves people with vulnerabilities that last across successive pathogens: The groups that suffered most during the H1N1 influenza pandemic of 2009 were the same ones that took the brunt of COVID, a decade later.

    America’s individualist bent has also shaped its entire health-care system, which ties health to wealth and employment. That system is organized around treating sick people at great and wasteful expense, instead of preventing communities from falling sick in the first place. The latter is the remit of public health rather than medicine, and has long been underfunded and undervalued. Even the CDC—the nation’s top public-health agency—changed its guidelines in February to prioritize hospitalizations over cases, implicitly tolerating infections as long as hospitals are stable. But such a strategy practically ensures that emergency rooms will be overwhelmed by a fast-spreading virus; that, consequently, health-care workers will quit; and that waves of chronically ill long-haulers who are disabled by their infections will seek care and receive nothing. All of that has happened and will happen again. America’s pandemic individualism means that it’s your job to protect yourself from infection; if you get sick, your treatment may be unaffordable, and if you don’t get better, you will struggle to find help, or even anyone who believes you.


    In the late 19th century, many scholars realized that epidemics were social problems, whose spread and toll are influenced by poverty, inequality, overcrowding, hazardous working conditions, poor sanitation, and political negligence. But after the advent of germ theory, this social model was displaced by a biomedical and militaristic one, in which diseases were simple battles between hosts and pathogens, playing out within individual bodies. This paradigm conveniently allowed people to ignore the social context of disease. Instead of tackling intractable social problems, scientists focused on fighting microscopic enemies with drugs, vaccines, and other products of scientific research—an approach that sat easily with America’s abiding fixation on technology as a panacea.

    The allure of biomedical panaceas is still strong. For more than a year, the Biden administration and its advisers have reassured Americans that, with vaccines and antivirals, “we have the tools” to control the pandemic. These tools are indeed effective, but their efficacy is limited if people can’t access them or don’t want to, and if the government doesn’t create policies that shift that dynamic. A profoundly unequal society was always going to struggle with access: People with low incomes, food insecurity, eviction risk, and no health insurance struggled to make or attend vaccine appointments, even after shots were widely available. A profoundly mistrustful society was always going to struggle with hesitancy, made worse by political polarization and rampantly spreading misinformation. The result is that just 72 percent of Americans have completed their initial course of shots and just half have gotten the first of the boosters necessary to protect against current variants. At the same time, almost all other protections have been stripped away, and COVID funding is evaporating. And yet the White House’s recent pandemic-preparedness strategy still focuses heavily on biomedical magic bullets, paying scant attention to the social conditions that could turn those bullets into duds.

    Technological solutions also tend to rise into society’s penthouses, while epidemics seep into its cracks. Cures, vaccines, and diagnostics first go to people with power, wealth, and education, who then move on, leaving the communities most affected by diseases to continue shouldering their burden. This dynamic explains why the same health inequities linger across the decades even as pathogens come and go, and why the U.S. has now normalized an appalling level of COVID death and disability. Such suffering is concentrated among elderly, immunocompromised, working-class, and minority communities—groups that are underrepresented among political decision makers and the media, who get to declare the pandemic over. Even when inequities are highlighted, knowledge seems to suppress action: In one study, white Americans felt less empathy for vulnerable communities and were less supportive of safety precautions after learning about COVID’s racial disparities. This attitude is self-destructive and limits the advantage that even the most privileged Americans enjoy. Measures that would flatten social inequities, such as universal health care and better ventilation, would benefit everyone—and their absence harms everyone, too. In 2021, young white Americans died at lower rates than Black and Indigenous Americans, but still at three times the rate of their counterparts in other wealthy countries.

    By failing to address its social weaknesses, the U.S. accumulates more of them. An estimated 9 million Americans have lost close loved ones to COVID; about 10 percent will likely experience prolonged grief, which the country’s meager mental-health services will struggle to address. Because of brain fog, fatigue, and other debilitating symptoms, long COVID is keeping the equivalent of 2 million to 4 million Americans out of work; between lost earnings and increased medical costs, it could cost the economy $2.6 trillion a year. The exodus of health-care workers, especially experienced veterans, has left hospitals with a shortfall of staff and know-how. Levels of trust—one of the most important predictors of a country’s success at controlling COVID—have fallen, making pandemic interventions harder to deploy, while creating fertile ground in which misinformation can germinate. This is the cost of accepting the unacceptable: an even weaker foundation that the next disease will assail.


    In the spring of 2020, I wrote that the pandemic would last for years, and that the U.S. would need long-term strategies to control it. But America’s leaders consistently acted as if they were fighting a skirmish rather than a siege, lifting protective measures too early, and then reenacting them too slowly. They have skirted the responsibility of articulating what it would actually look like for the pandemic to be over, which has meant that whenever citizens managed to flatten the curve, the time they bought was wasted. Endemicity was equated with inaction rather than active management. This attitude removed any incentive or will to make the sort of long-term changes that would curtail the current disaster and prevent future ones. And so America has little chance of effectively countering the inevitable pandemics of the future; it cannot even focus on the one that’s ongoing.

    If change happens, it will likely occur slowly and from the ground up. In the vein of ACT UP—the extraordinarily successful activist group that changed the world’s approach to AIDS—grassroots organizations of long-haulers, grievers, immunocompromised people, and others disproportionately harmed by the pandemic have formed, creating the kind of vocal constituency that public health has long lacked.

    More pandemics will happen, and the U.S. has spectacularly failed to contain the current one. But it cannot afford the luxury of nihilism. It still has time to address its bedrocks of individualism and inequality, to create a health system that effectively prevents sickness instead of merely struggling to treat it, and to enact policies that rightfully prioritize the needs of disabled and vulnerable communities. Such changes seem unrealistic given the relentless disappointments of the past three years, but substantial social progress always seems unfeasible until it is actually achieved. Normal led to this. It is not too late to fashion a better normal.

    Ed Yong
