ReportWire

  • The Calendar of Human Fertility Is Changing

    As the chair of the department of obstetrics and gynecology at UT Southwestern Medical Center, Catherine Spong is used to seeing a lot of baby bumps. But through her decades of practice, she’s been fascinated by a different kind of bump: Year after year after year, she and her colleagues deliver a deluge of babies from June through September, as much as a 10 percent increase in monthly rates over what they see from February through April. “We call it the summer surge,” Spong told me.

    Her hospital isn’t alone in this trend. For decades, demographers have documented a lift in American births in late summer, and a trough in the spring. I see it myself in my own corner of the world: In the past several weeks, the hospital across the street from me has become a revolving door of new parents and infants. When David Lam, an economist at the University of Michigan who helped pioneer several early U.S. studies on seasonal patterns of fertility, first analyzed his data decades ago, “we were kind of surprised how big it was,” he told me. Compare the peak of some years to their nadir, he said, and it was almost like looking at the Baby Boom squished down into 12 months.

    Birth seasonality has been documented since the 1820s, if not earlier. But despite generations of study, we still don’t fully understand the reasons it exists, or why it differs so drastically among even neighboring countries. Teasing apart the contributions of biology and behavior to seasonality is messy because of the many factors involved, says Micaela Martinez, the director of environmental health at the nonprofit WE ACT for Environmental Justice, who has been studying seasonality for years. And even while researchers try to track it, the calendar of human fertility has been changing. As our species has grown more industrialized, claimed more agency over reproduction, and reshaped the climate we are living in, seasonality, in many places, is shifting or weakening.

    There is no doubt that a big part of human birth seasonality is behavioral. People have more sex when they have more free time; they have less sex when they’re overworked or overheated or stressed. Certain holidays have long been known to carry this effect: In parts of the Western world with a heavy Christian presence, baby boomlets fall roughly nine months after Christmas; the same patterns have been spotted with Spring Festival and Lunar New Year in certain Chinese communities. (Why these holidays strike such a note, and not others, isn’t entirely clear, experts told me.)

    In addition to free time, family-focused celebrations probably help set the mood, Luis Rocha, a systems scientist at Binghamton University, told me. Cold weather might help people get snuggly around Christmastime, too, but it’s not necessary; Rocha’s studies and others have shown the so-called Christmas effect in southern-hemisphere countries as well. No matter whether Christmas falls in the winter or summer, around the end of December, Google searches for sex skyrocket and people report more sexual activity on health-tracking apps. In a few countries, including the U.S., condom sales rise too.

    But cultural norms have never been able to explain everything about the Homo sapiens birth calendar. “It’s pretty common for mammals to have a specific breeding season” dictated by all sorts of environmental cues, Martinez told me. Deer, for instance, mate in the fall, triggered by the shortening length of daylight, effectively scheduling their fawns to be born in the spring; horses, whose gestations are longer, breed as the days lengthen in the spring and into summer, so they can foal the following year.

    Humans, of course, aren’t horses or deer. Our closest relatives among primates “are much more flexible” about when they mate, Élise Huchard, a behavioral ecologist at the University of Montpellier, in France, told me. But those apes are not immune to their surroundings, and neither are we. All sorts of hormones in the human body, including reproductive ones, wax and wane with the seasons. Researchers in the United States and Australia have found that couples hoping to conceive via in vitro fertilization have a higher chance of success if the eggs are retrieved during the summer. At the same time, summer conceptions appear to be less common, or less successfully carried to term, in some countries, a trend that sharpens at lower latitudes and, Lam told me, during hotter years. The subsequent spring lulls may be explained in part by heat waves dissuading people from sex. But Alan Barreca, an economist at UCLA, suspects that ultrahigh temperatures may also physiologically compromise fertility, potentially by affecting factors such as sperm quantity and quality, ovulation success, or the likelihood of early fetal loss.

    No matter its exact drivers, seasonality is clearly weakening in many countries, Martinez told me; in some parts of the world, it may be entirely gone. The change isn’t uniform or entirely understood, but it’s probably to some extent a product of just how much human lifestyles have changed. In many communities that have historically planted and harvested their own food, people may have been more disinclined to, and less physically able to, conceive a child when labor demands were high or when crops were scarce—trends that are still prominent in certain countries today. People in industrial and high-income areas of the modern world, though, are more shielded from those stressors and others, in ways that may even out the annual birth schedule, Kathryn Grace, a geographer at the University of Minnesota, told me. The heat-driven dip in America’s spring births, for instance, has softened substantially in recent decades, likely due in part to increased access to air-conditioning, Lam said. And as certain populations get more relaxed about religion, the cultural drivers of birth times may be easing up, too, several experts told me. Sweden, for example, appears to have lost the “Christmas effect” of December sex boosting September births.

    Advances in contraception and fertility treatments have also put much more of fertility under personal control. People in well-resourced parts of the world can now, to a decent degree, realize their preferences for when they want their babies to be born. In Sweden, parents seem to avoid November and December deliveries because that would make their child among the youngest in their grade, a status stereotypically linked to worse behavioral health, social skills, academic performance, and athletic success. In the U.S., people have reported preferring to give birth in the spring; there’s also a tax incentive to deliver early-winter babies before January 1, says Neel Shah, the chief medical officer of Maven Clinic, a women’s health and fertility clinic in New York.

    Humans aren’t yet, and never will be, completely divorced from the influences of our surroundings. We are also constantly altering the environment in which we reproduce—which could, in turn, change the implications of being born during a particular season. Births are not only more common at certain times of the year; they can also be riskier, because of the seasonal perils posed to fetuses and newborns, Mary-Alice Doyle, a social-policy researcher at the London School of Economics, told me. Babies born during summer may be at higher risk of asthma, for instance—a trend that’s likely to get only stronger as heat waves, wildfires, and air pollution become more routine during the year’s hottest months.

    The way we manage infectious disease matters too. Being born shortly after the peak of flu season—typically winter, in temperate parts of the world—can also be dangerous: Infections during pregnancy have been linked to lower birth weight, preterm delivery, even an increased likelihood of the baby developing certain mental-health issues later on. Comparable concerns exist in the tropics, where mosquitoes, carrying birth-defect-causing viruses such as dengue or Zika, can wax and wane with the rainy season. The more humans allow pathogens to spill over from wildlife and spread, the bigger these effects are likely to be.

    Children born in the spring—in many countries, a more sparsely populated group—tend to be healthier on several metrics, Barreca told me. It’s possible that they’re able to “thread the needle,” he said, between the perils of flu in winter and extreme heat in summer. But these infants might also thrive because they are born to families with more socioeconomic privilege, who could afford to beat the heat that might have compromised other conceptions. As heat waves become more intense and frequent, people without access to air-conditioning might have an even harder time getting pregnant in the summer.

    The point of all this isn’t that there is a right or wrong time of year to be born, Grace told me. If seasonality continues to hold sway over when we conceive and give birth, health-care systems and public-health experts might be able to use that knowledge to improve outcomes, shuttling resources to maternity wards and childhood-vaccination clinics, for instance, during the months they might be in highest demand.

    Humans may never have had as strict a breeding season as horses and deer. But the fact that so many people can now deliver safely throughout the year is a testament to our ingenuity—and to our sometimes-inadvertent power to reshape the world we live in. We have, without always meaning to, altered a fundamental aspect of human reproduction. And we’re still not done changing it.

    Katherine J. Wu

  • The Pandemic’s Legacy Is Already Clear

    Recently, after a week in which 2,789 Americans died of COVID-19, President Joe Biden proclaimed that “the pandemic is over.” Anthony Fauci described the controversy around the proclamation as a matter of “semantics,” but the facts we are living with can speak for themselves. COVID still kills roughly as many Americans every week as died on 9/11. It is on track to kill at least 100,000 a year—triple the typical toll of the flu. Despite gross undercounting, more than 50,000 infections are being recorded every day. The CDC estimates that 19 million adults have long COVID. Things have undoubtedly improved since the peak of the crisis, but calling the pandemic “over” is like calling a fight “finished” because your opponent is punching you in the ribs instead of the face.

    American leaders and pundits have been trying to call an end to the pandemic since its beginning, only to be faced with new surges or variants. This mindset not only compromises the nation’s ability to manage COVID, but also leaves it vulnerable to other outbreaks. Future pandemics aren’t hypothetical; they’re inevitable and imminent. New infectious diseases have regularly emerged throughout recent decades, and climate change is quickening the pace of such events. As rising temperatures force animals to relocate, species that have never coexisted will meet, allowing the viruses within them to find new hosts—humans included. Dealing with all of this again is a matter of when, not if.

    In 2018, I wrote an article in The Atlantic warning that the U.S. was not prepared for a pandemic. That diagnosis remains unchanged; if anything, I was too optimistic. America was ranked as the world’s most prepared country in 2019—and, bafflingly, again in 2021—but accounts for 16 percent of global COVID deaths despite having just 4 percent of the global population. It spends more on medical care than any other wealthy country, but its hospitals were nonetheless overwhelmed. It helped create vaccines in record time, but is 67th in the world in full vaccinations. (This trend cannot solely be attributed to political division; even the most heavily vaccinated blue state—Rhode Island—still lags behind 21 nations.) America experienced the largest life-expectancy decline of any wealthy country in 2020 and, unlike its peers, continued declining in 2021. If it had fared as well as just the average peer nation, 1.1 million people who died last year—a third of all American deaths—would still be alive.

    America’s superlatively poor performance cannot solely be blamed on either the Trump or Biden administrations, although both have made egregious errors. Rather, the new coronavirus exploited the country’s many failing systems: its overstuffed prisons and understaffed nursing homes; its chronically underfunded public-health system; its reliance on convoluted supply chains and a just-in-time economy; its for-profit health-care system, whose workers were already burned out; its decades-long project of unweaving social safety nets; and its legacy of racism and segregation that had already left Black and Indigenous communities and other communities of color disproportionately burdened with health problems. Even in the pre-COVID years, the U.S. was still losing about 626,000 people more than expected for a nation of its size and resources. COVID simply toppled an edifice whose foundations were already rotten.

    In furiously racing to rebuild on this same foundation, America sets itself up to collapse once more. Experience is reputedly the best teacher, and yet the U.S. repeated mistakes from the early pandemic when faced with the Delta and Omicron variants. It got early global access to vaccines, and nonetheless lost almost half a million people after all adults became eligible for the shots. It has struggled to control monkeypox—a slower-spreading virus for which there is already a vaccine. Its right-wing legislators have passed laws and rulings that curtail the possibility of important public-health measures like quarantines and vaccine mandates. It has made none of the broad changes that would protect its population against future pathogens, such as better ventilation or universal paid sick leave. Its choices virtually guarantee that everything that’s happened in the past three years will happen again.


    The U.S. will continue to struggle against infectious diseases in part because some of its most deeply held values are antithetical to the task of besting a virus. Since its founding, the country has prized a strain of rugged individualism that prioritizes individual freedom and valorizes self-reliance. According to this ethos, people are responsible for their own well-being, physical and moral strength are equated, social vulnerability results from personal weakness rather than policy failure, and handouts or advice from the government are unwelcome. Such ideals are disastrous when handling a pandemic, for two major reasons.

    First, diseases spread. Each person’s choices inextricably affect their community, and the threat to the collective always exceeds that to the individual. The original Omicron variant, for example, posed slightly less risk to each infected person than the variants that preceded it, but spread so quickly that it inundated hospitals, greatly magnifying COVID’s societal costs. To handle such threats, collective action is necessary. Governments need policies, such as vaccine requirements or, yes, mask mandates, that protect the health of entire populations, while individuals have to consider their contribution to everyone else’s risk alongside their own personal stakes. And yet, since the spring of 2021, pundits have mocked people who continue to think this way for being irrational and overcautious, and government officials have consistently framed COVID as a matter of personal responsibility.

    Second, a person’s circumstances always constrain their choices. Low-income and minority groups find it harder to avoid infections or isolate when sick because they’re more likely to live in crowded homes and hold hourly-wage jobs without paid leave or the option to work remotely. Places such as prisons and nursing homes, whose residents have little autonomy, became hot spots for the worst outbreaks. Treating a pandemic as an individualist free-for-all ignores how difficult it is for many Americans to protect themselves. It also leaves people with vulnerabilities that last across successive pathogens: The groups that suffered most during the H1N1 influenza pandemic of 2009 were the same ones that took the brunt of COVID, a decade later.

    America’s individualist bent has also shaped its entire health-care system, which ties health to wealth and employment. That system is organized around treating sick people at great and wasteful expense, instead of preventing communities from falling sick in the first place. The latter is the remit of public health rather than medicine, and has long been underfunded and undervalued. Even the CDC—the nation’s top public-health agency—changed its guidelines in February to prioritize hospitalizations over cases, implicitly tolerating infections as long as hospitals are stable. But such a strategy practically ensures that emergency rooms will be overwhelmed by a fast-spreading virus; that, consequently, health-care workers will quit; and that waves of chronically ill long-haulers who are disabled by their infections will seek care and receive nothing. All of that has happened and will happen again. America’s pandemic individualism means that it’s your job to protect yourself from infection; if you get sick, your treatment may be unaffordable, and if you don’t get better, you will struggle to find help, or even anyone who believes you.


    In the late 19th century, many scholars realized that epidemics were social problems, whose spread and toll are influenced by poverty, inequality, overcrowding, hazardous working conditions, poor sanitation, and political negligence. But after the advent of germ theory, this social model was displaced by a biomedical and militaristic one, in which diseases were simple battles between hosts and pathogens, playing out within individual bodies. This paradigm conveniently allowed people to ignore the social context of disease. Instead of tackling intractable social problems, scientists focused on fighting microscopic enemies with drugs, vaccines, and other products of scientific research—an approach that sat easily with America’s abiding fixation on technology as a panacea.

    The allure of biomedical panaceas is still strong. For more than a year, the Biden administration and its advisers have reassured Americans that, with vaccines and antivirals, “we have the tools” to control the pandemic. These tools are indeed effective, but their efficacy is limited if people can’t access them or don’t want to, and if the government doesn’t create policies that shift that dynamic. A profoundly unequal society was always going to struggle with access: People with low incomes, food insecurity, eviction risk, and no health insurance struggled to make or attend vaccine appointments, even after shots were widely available. A profoundly mistrustful society was always going to struggle with hesitancy, made worse by political polarization and rampantly spreading misinformation. The result is that just 72 percent of Americans have completed their initial course of shots and just half have gotten the first of the boosters necessary to protect against current variants. At the same time, almost all other protections have been stripped away, and COVID funding is evaporating. And yet the White House’s recent pandemic-preparedness strategy still focuses heavily on biomedical magic bullets, paying scant attention to the social conditions that could turn those bullets into duds.

    Technological solutions also tend to rise into society’s penthouses, while epidemics seep into its cracks. Cures, vaccines, and diagnostics first go to people with power, wealth, and education, who then move on, leaving the communities most affected by diseases to continue shouldering their burden. This dynamic explains why the same health inequities linger across the decades even as pathogens come and go, and why the U.S. has now normalized an appalling level of COVID death and disability. Such suffering is concentrated among elderly, immunocompromised, working-class, and minority communities—groups that are underrepresented among political decision makers and the media, who get to declare the pandemic over. Even when inequities are highlighted, knowledge seems to suppress action: In one study, white Americans felt less empathy for vulnerable communities and were less supportive of safety precautions after learning about COVID’s racial disparities. This attitude is self-destructive and limits the advantage that even the most privileged Americans enjoy. Measures that would flatten social inequities, such as universal health care and better ventilation, would benefit everyone—and their absence harms everyone, too. In 2021, young white Americans died at lower rates than Black and Indigenous Americans, but still at three times the rate of their counterparts in other wealthy countries.

    By failing to address its social weaknesses, the U.S. accumulates more of them. An estimated 9 million Americans have lost close loved ones to COVID; about 10 percent will likely experience prolonged grief, which the country’s meager mental-health services will struggle to address. Because of brain fog, fatigue, and other debilitating symptoms, long COVID is keeping the equivalent of 2 million to 4 million Americans out of work; between lost earnings and increased medical costs, it could cost the economy $2.6 trillion a year. The exodus of health-care workers, especially experienced veterans, has left hospitals with a shortfall of staff and know-how. Levels of trust—one of the most important predictors of a country’s success at controlling COVID—have fallen, making pandemic interventions harder to deploy, while creating fertile ground in which misinformation can germinate. This is the cost of accepting the unacceptable: an even weaker foundation that the next disease will assail.


    In the spring of 2020, I wrote that the pandemic would last for years, and that the U.S. would need long-term strategies to control it. But America’s leaders consistently acted as if they were fighting a skirmish rather than a siege, lifting protective measures too early, and then reinstating them too slowly. They have skirted the responsibility of articulating what it would actually look like for the pandemic to be over, which has meant that whenever citizens managed to flatten the curve, the time they bought was wasted. Endemicity was equated with inaction rather than active management. This attitude removed any incentive or will to make the sort of long-term changes that would curtail the current disaster and prevent future ones. And so America has little chance of effectively countering the inevitable pandemics of the future; it cannot even focus on the one that’s ongoing.

    If change happens, it will likely occur slowly and from the ground up. In the vein of ACT UP—the extraordinarily successful activist group that changed the world’s approach to AIDS—grassroots organizations of long-haulers, grievers, immunocompromised people, and others disproportionately harmed by the pandemic have formed, creating the kind of vocal constituency that public health has long lacked.

    More pandemics will happen, and the U.S. has spectacularly failed to contain the current one. But it cannot afford the luxury of nihilism. It still has time to address its bedrocks of individualism and inequality, to create a health system that effectively prevents sickness instead of merely struggling to treat it, and to enact policies that rightfully prioritize the needs of disabled and vulnerable communities. Such changes seem unrealistic given the relentless disappointments of the past three years, but substantial social progress always seems unfeasible until it is actually achieved. Normal led to this. It is not too late to fashion a better normal.

    Ed Yong
