ReportWire

Tag: climate desk

  • The Climate Impact of Owning a Dog

    This story originally appeared on Grist and is part of the Climate Desk collaboration.

    I’ve been a vegetarian for over a decade. It’s not because of my health, or because I dislike the taste of chicken or beef: It’s a lifestyle choice I made because I wanted to reduce my impact on the planet. And yet, twice a day, every day, I lovingly scoop a cup of meat-based kibble into a bowl and set it down for my 50-pound rescue dog, a husky mix named Loki.

    Until recently, I hadn’t devoted a huge amount of thought to that paradox. Then I read an article in the Associated Press headlined “People often miscalculate climate choices, a study says. One surprise is owning a dog.”

    The study, led by environmental psychology researcher Danielle Goldwert and published in the journal PNAS Nexus, examined how people perceive the climate impact of various behaviors—options like “adopt a vegan diet for at least one year,” or “shift from fossil fuel car to renewable public transport.” The team found that participants generally overestimated the impact of a number of low-impact actions like recycling and using efficient appliances, and vastly underestimated the impact of other personal decisions, including the decision to “not purchase or adopt a dog.”

    The real objective of the study was to see whether certain types of climate information could help people commit to more effective actions. But mere hours after the AP published its article, the study’s aim had been recast as something else entirely: an attack on people’s furry family members. “Climate change is actually your fault because you have a dog,” one Reddit user wrote. Others in the community chimed in with ire, ridiculing the idea that a pet Chihuahua could be driving the climate crisis and calling on researchers and the media to stop pointing fingers at everyday individuals.

    Goldwert and her fellow researchers watched the reactions unfold with dismay. “If I saw a headline that said, ‘Climate scientists want to take your dogs away,’ I would also feel upset,” she said. “They definitely don’t,” she added. “You can quote me on that.”

    Loki grinning on a hike in the Pacific Northwest.

    Photograph: Claire Elise Thompson/Grist

    Claire Elise Thompson

  • An Invasive Disease-Carrying Mosquito Has Spread to the Rocky Mountains

    This story originally appeared on Inside Climate News and is part of the Climate Desk collaboration.

    It can carry life-threatening diseases. It’s difficult to find and hard to kill. And it’s obsessed with human blood.

    The Aedes aegypti is a species of mosquito that people like Tim Moore, district manager of a mosquito control district on the Western Slope of Colorado, really don’t want to see.

    “Boy, they are locked into humans,” Moore said. “That’s their blood meal.”

    This mosquito species is native to tropical and subtropical climates, but as climate change pushes up temperatures and warps precipitation patterns, the Aedes aegypti—which can spread Zika, dengue, chikungunya and other potentially deadly viruses—is on the move.

    It’s popping up all over the Mountain West, where conditions have historically been far too harsh for it to survive. In the last decade, towns in New Mexico and Utah have begun catching Aedes aegypti in their traps year after year, and just this summer, one was found for the first time in Idaho.

    Now, an old residential neighborhood in Grand Junction, Colorado, has emerged as one of the latest frontiers for this troublesome mosquito.

    The city, with a population of about 70,000, is the largest in Colorado west of the Continental Divide. In 2019, the local mosquito control district spotted one wayward Aedes aegypti in a trap. It was odd, but the mosquitoes had already been found in Moab, Utah, about 100 miles to the southwest. Moore, the district manager, figured they’d caught a hitchhiker and that the harsh Colorado climate would quickly eliminate the species.

    “I concluded it was a one-off, and we don’t have to worry too much about this,” Moore said.

    Tim Moore, district manager of Grand River Mosquito Control District, explains that managing a new invasive species of mosquito in Grand Junction has required the district to increase spending on new mosquito traps and staff.

    Photograph: Isabella Escobedo

    Erin Douglas

  • Trump’s Hatred of EVs Is Making Gas Cars More Expensive

    This story originally appeared on Mother Jones and is part of the Climate Desk collaboration.

    As President Donald Trump sees it, environmental regulations that attempt to improve efficiency and address climate change only make products more expensive and worse performing. He has long blamed efficiency regulations for his frustrations with things like toilets and showerheads. He began his second term in office with a pledge to “unleash prosperity through deregulation.”

    But there’s at least one big way that American companies and households may end up paying more, not less, for the president’s anti-environment policy moves.

    If you’re in the market for a vehicle, you’ve probably noticed: Cars are getting more expensive. Kelley Blue Book reported that the average sticker price for a new car topped $50,000 for the first time in September.

    And they aren’t just getting more expensive to buy; cars are getting more expensive to own. For most Americans, gasoline is their single-largest energy expenditure, around $2,930 per household each year on average.

    While a more efficient dishwasher, light bulb, or faucet may have a higher sticker price up front—especially as manufacturers adjust to new rules—cars, appliances, solar panels, and electronics can more than pay for themselves with lower operating costs over their lifetimes. And Trump’s agenda of suddenly rolling back efficiency rules has simultaneously made it harder for many industries to do business while raising costs for ordinary Americans.

    No one knows this better than the US auto industry, which has whiplashed between competing environmental regulations for over a decade.

    President Barack Obama tightened vehicle efficiency and pollution standards. In his first term, Trump loosened them. President Joe Biden reinstated and strengthened them. Now Trump is reversing course again—leaving the $1.6 trillion US auto industry unsure what turn to take next.

    Regulation Whiplash

    In July, the Environmental Protection Agency began undoing a foundational legal basis that lets the agency limit climate pollution from cars. Without it, the EPA has far less power to require automakers to manufacture cleaner vehicles, which hampers efforts to reduce one of the single biggest sources of carbon emissions.

    Trump’s transportation secretary, Sean P. Duffy, said in a statement over the summer that these moves “will lower vehicle costs and ensure the American people can purchase the cars they want.”

    But in reality, the shift may have the opposite effect. That’s because when the rules change every few years, automakers struggle to meet existing benchmarks and can’t plan ahead. The Alliance for Automotive Innovation, a trade group representing companies like Ford, Toyota, and Volkswagen, sent a letter to the EPA in September saying that the administration’s moves and the repeal of incentives for electric cars mean that the current car pollution rules established under Biden and stretching out to 2027 “are simply not achievable.” The Trump administration responded by zeroing out any penalties for violations—but the industry is already planning for a post-Trump world where rules could drastically change yet again.

    Because it takes years and billions of dollars to develop new cars that comply with stricter rules, carmakers would prefer if regulations stayed put one way or the other. Every rule change adds time and expense to the development lifecycle, which ultimately gets baked into a car’s price tag.

    Changing rules are also vexing for electric car makers, whose models are gaining traction both in the US and around the world, even as the Trump administration has ended tax incentives for EVs. Trump is making things even more difficult by pulling support for domestic battery production that would help US car companies build electric cars.

    It all adds up to a huge headache for the industry. “Particularly in the last six months, I think ‘chaos’ is a good word because they’re getting hit from every angle,” said David Cooke, senior associate director at the Center for Automotive Research at Ohio State University.

    And all that uncertainty is making cars more expensive to buy and run, with even more expensive long-term consequences for people’s health and the environment.

    How Trump’s Policies Are Costing Drivers More

    As the government relaxes efficiency targets, progress will stall and car buyers will get stuck with cars that cost more to operate.

    Energy Innovation, a think tank, found that repealing tailpipe standards could cost households an extra $310 billion by 2050, mainly through more spending on gasoline. Undoing the standards would also increase air pollution and shrink the job market for US electric vehicle manufacturing due to lower demand.

    The EPA’s fuel mileage rating of a large SUV.

    Photograph: D. Lentz/Getty Images/iStockphoto

    Even the Trump administration’s own analysis of the effects of undoing the EPA’s greenhouse-gas emission regulations found that his moves would drive up gasoline prices due to more fuel consumption from less efficient vehicles.

    “Repealing these standards in particular would set America back decades,” said Sara Baldwin, senior director for electrification at Energy Innovation.

    Umair Irfan

  • Climate Change Made Hurricane Melissa 4 Times More Likely, Study Suggests

    This story originally appeared on Inside Climate News and is part of the Climate Desk collaboration.

    Fueled by unusually warm waters, Hurricane Melissa this week turned into one of the strongest Atlantic storms ever recorded. Now a new rapid attribution study suggests human-induced climate change made the deadly tropical cyclone four times more likely.

    Hurricane Melissa collided with Jamaica on Tuesday, wreaking havoc across the island before tearing through nearby Haiti and Cuba. The storm, which reached Category 5, reserved for the hurricanes with the most powerful winds, has killed at least 40 people across the Caribbean so far. Now weakened to a Category 2, it continues its path toward Bermuda, where landfall is likely on Thursday night, according to the National Hurricane Center.

    Early reports of the damage are cataclysmic, particularly in hardest-hit western Jamaica. Winds reaching speeds of 185 miles per hour and torrential rain flattened entire neighborhoods, decimated large swaths of agricultural lands and forced more than 25,000 people—locals and tourists alike—to seek cover in shelters or hotel ballrooms. According to the new attribution study from Imperial College London, climate change ramped up Melissa’s wind speeds by 7 percent, which increased damages by 12 percent.

    Losses could add up to tens of billions of dollars, experts say.

    The findings echo similar reports released earlier this week on how global warming contributed to the likelihood and severity of Hurricane Melissa. Each of the analyses adds to a growing body of research showing how ocean warming from climate change is fueling the conditions necessary for stronger tropical storms.

    Hurricane Melissa is “kind of a textbook example of what we expect in terms of how hurricanes respond to a warming climate,” said Brian Soden, a professor of atmospheric sciences at the University of Miami, who was not involved in the recent analyses. “We know that the warming ocean temperatures [are] being driven almost exclusively by increasing greenhouse gases.”

    The storm has disrupted every aspect of life in this part of the Caribbean.

    “There’s been massive dislocation of services. We have people living in shelters across the country,” Dennis Zulu, United Nations resident coordinator in Jamaica, said in a press conference on Wednesday. “What we are seeing in preliminary assessments is a country that’s been devastated to levels never seen before.”

    The Climate Connection

    For the rapid attribution study, researchers at Imperial College used the peer-reviewed Imperial College Storm Model, known as IRIS, which has created a database of millions of synthetic tropical cyclone tracks that can help fill in gaps on how storms operate in the real world.

    The model essentially runs simulations on the likelihood of a given storm’s wind speed—often the most damaging factor—in a pre-industrial climate versus the current climate. Applying IRIS to Hurricane Melissa is how the researchers determined that human-induced warming supercharged the cyclone’s wind speed by 7 percent.
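    The pre-industrial-versus-present comparison described above amounts to a probability ratio: how often storms exceed a given intensity in each climate, and the quotient of those two frequencies. A minimal toy sketch of that idea follows; it is not the IRIS model, and every number in it (the Gumbel distribution choice, the location and scale parameters) is illustrative, chosen only so the toy calculation lands near a fourfold ratio.

```python
import math

def gumbel_exceedance(threshold, loc, scale):
    """P(peak wind speed > threshold) under a Gumbel distribution,
    a common parametric choice for modeling extremes."""
    return 1.0 - math.exp(-math.exp(-(threshold - loc) / scale))

# Hypothetical parameters: warming shifts the distribution of peak
# wind speeds upward. All values are illustrative, not from IRIS.
THRESHOLD = 157  # mph, the lower bound of Category 5 strength
preindustrial = gumbel_exceedance(THRESHOLD, loc=85, scale=15)
present_day = gumbel_exceedance(THRESHOLD, loc=106, scale=15)

ratio = present_day / preindustrial  # "N times more likely"
print(f"pre-industrial exceedance probability: {preindustrial:.4f}")
print(f"present-day exceedance probability:    {present_day:.4f}")
print(f"probability ratio: {ratio:.1f}x")
```

    The real analysis derives these probabilities from millions of simulated cyclone tracks rather than a single fitted distribution, but the final "four times more likely" statement is a ratio of exactly this kind.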

    Kiley Price

  • Australia’s March Toward 100 Percent Clean Energy

    “[The clutch] is like 1950s technology—it’s really boring,” Westerman said (“boring,” for grid operators, is the highest form of praise). ​“The marginal cost of putting this in is like nothing compared to the cost of the plant.”

    A company called SSS has built these clutches for decades. One is nearly operational in the state of Queensland at the Townsville gas-fired plant, which Siemens Energy is converting into what it calls a ​“hybrid rotating grid stabilizer.” Siemens says this project is the world’s first such conversion of a gas turbine of this size.

    That particular retrofit took about 18 months and involved some relocating of auxiliary components at Townsville to make room for the new clutch. So it’s not instantaneous, but far easier than building a new synchronous condenser from scratch, and about half the cost, per Siemens.

    Some novel long-duration storage techniques also provide their own spinning mass. Canadian startup Hydrostor expects to break ground early next year on a fully permitted and contracted project in Broken Hill, a city deep in the Outback of New South Wales.

    Broken Hill lent its name to BHP, which started there as a silver mine in 1885 and has grown to one of the largest global mining companies. More recently, the desert landscape played host to the postapocalyptic car chases of Mad Max 2. Now, roughly 18,000 people live there, at the end of one long line connecting to the broader grid.

    Hydrostor will shore up local power by excavating an underground cavity and compressing air into it; releasing the compressed air turns a turbine to regenerate up to 200 megawatts for up to eight hours, serving the community if the grid connection goes down and otherwise shipping clean power to the broader grid.

    But unlike batteries, Hydrostor’s technology uses old-school generators, and its compressors contribute additional spinning metal.

    “We have a clutch spec’d in for New South Wales, because they need the inertia,” Hydrostor CEO Jon Norman said. ​“It’s so simple; it’s like the same clutches on your standard car.”

    Transmission grid operator Transgrid ran a competitive process to determine the best way to provide system security to Broken Hill in the event it had to operate apart from the grid, Norman said. That analysis chose Hydrostor’s bid to simply insert a clutch when it installs its machinery.

    The project still needs to get built, but if up-and-coming clean storage technologies could step in to provide that grid security, it wouldn’t all have to come from ghostly gas plants lingering on the system.

    “It’s a different feeling [in Australia]—there’s a can do, go get ​’em, ​‘put me in coach’ attitude,” said Audrey Zibelman, the American grid expert who ran AEMO before Westerman. ​“When you’re determined to say how best to go about this, as opposed to why it’s hard or why it doesn’t work, the solutions appear.”

    Julian Spector

  • Why US Power Bills Are Surging

    Now, electricity prices are surging, driven in part by demand uncorked after the Covid-19 pandemic, when the global economic slowdown and pressure from policymakers kept a lid on utility bills.

    “I think if we were to repeat this analysis for next year, there would probably be a little bit of an uptick this year, but the data that I’m looking at doesn’t suggest a really significant increase in the historical context,” said Geoffrey Blanford, the lead author of EPRI’s report.

    But there isn’t just one story unfolding across the country.

    The US has a particularly chaotic energy system. How much people pay to light their homes, stay warm, and get around varies a lot from state to state and even among neighbors. For example, Texas households tend to spend a larger share of their budgets on keeping their pickup trucks running, while families in Massachusetts spend a greater portion on staying warm.

    So, no—we’re not in an energy crisis, but it’s unlikely that your power bills will come down anytime soon. There is some good news though: In the years ahead, Americans are actually poised to spend a smaller share of their incomes on energy overall as technology makes it more cost-effective to shift away from fossil fuels.

    “In our forward-looking scenarios, one of the key drivers for change is electrification, particularly light-duty vehicles,” Blanford said. “This tends to actually reduce the energy wallet in real terms per household over time even as you’re spending more on electricity.” Though electric car sales have slowed down in the US, they are still rolling into more driveways. And as homes and appliances become more efficient, that will help reduce energy bills as well. Based on current trends, the average US household energy wallet will shrink by 36 percent by 2050, with state-level declines anywhere from 10 to 50 percent, according to the report.

    Umair Irfan

  • The LA Fires Spewed Out Toxic Nanoparticles. He Made It His Mission to Trace Them

    Spada’s is one of the hardest projects to tune the beam for. “The beam is way over-powered to run my samples, at baseline,” Spada said, comparing the amount of power he needs to a couple drops of water, “but the beam, it’s like Niagara Falls.”

    The technique Spada relies on, particle-induced x-ray emission (PIXE), uses a focused stream of protons to knock electrons out of atoms embedded in the sample. As those atoms stabilize, they emit x-rays—and each element gives off a signature energy. “It’s like a fingerprint,” Spada said. “Every metal shows up in a different color of x-ray.”

    Because PIXE is nondestructive, Spada can scan the same filter multiple times, looking for metals like lead, arsenic, cadmium, and antimony—elements he frequently finds in urban wildfire debris. The beam line at Crocker is one of only a handful in the country equipped for this kind of environmental work.

    “It’s not fast,” Spada said. “Sometimes it takes a couple of minutes just to scan a pinhead-sized area. But it’s precise, and it tells us what’s really in the air people are breathing.”

    Spada is still in the process of running each of the filters from his monitoring areas through thermal-optical analysis for organic carbon, and spectroscopy that could detect molecular structures, in addition to the PIXE process.

    Just the thermal-optical carbon analysis alone takes an hour per sample and gives just two numbers—how much elemental carbon and how much organic carbon.

    Spada had droves of samples to get through.

    “We turn everything into methane. We use a methanator, which sounds like something out of Phineas and Ferb, but it’s how we detect the organic carbon fractions,” said Spada. Each type of carbon burns off at a different temperature, revealing its origin—wildfire, diesel, gasoline, building materials. Early on, he noticed a strange pattern in one of the samples—high sulfur, high chlorine—signatures that weren’t consistent with typical wildland burns.

    “We think it was from PVC pipes,” he said. “That’s one of the only materials that would give you both those elements. And it was from the Altadena set, so in a residential area.”

    He flagged the findings for Baalousha. They have been reviewing each other’s results as an expedited substitute for formal peer review, and drafting community updates together.

    “It was really important to him that we not just publish something academic,” Knack said. “He wanted it readable—like, for families, not scientists.”

    Spada has been releasing reports on the ash samples on a rolling basis since he and Baalousha got the first results back in March. Each report went out with links to cleanup guidance, recommendations on protective gear, and a glossary.

    He hopes to be able to release a preliminary report on the air conditions during the fires shortly. In mid-August, over seven months after they tore through LA, Spada was finally able to review his preliminary PIXE data while on leave from work, recovering from a routine outpatient surgery.

    So far he’s found that the majority of nanoparticles were created and circulated in the air during the active fire phase, and once the fire had been contained and transitioned to the smoldering phase, the number dropped off steeply. “For example, in Pasadena, silicon in the 0.09- to 0.26-micrometer size range was 8 times higher during the active fire period,” Spada said via email.

    Nina Dietz

  • Big Tech Dreams of Putting Data Centers in Space

    For one thing, the systems he imagines process data relatively slowly compared to those on terra firma. They’d be constantly bombarded by radiation, and “obsolescence would be a problem” because making repairs or upgrades would be confoundingly difficult. Hajimiri believes that data centers in space could, someday, be a viable solution but hesitates to say when that day might come. “Definitely it would be doable in a few years,” he said. “The question is how effective they would be, and how cost-effective they would become.”

    The idea of simply putting data centers in orbit is not limited to the offhand musings of techies or the deeper thought of academics. Even some elected officials in cities where companies like Amazon hope to build data centers are raising the point. Tucson, Arizona, councilmember Nikki Lee waxed poetic about their potential during an August hearing, in which the council unanimously voted down a proposed data center in their city.

    “A lot of people are saying data centers don’t belong in the desert,” Lee said. But “if this is truly a national priority,” then the focus must be on “putting federal research and development dollars into looking at data centers that will exist in space. And that may sound wild to you all and a little science fiction, but it’s actually happening.”

    That’s true, but it’s happening on an experimental scale, not an industrial one. A startup called Starcloud hoped to launch a refrigerator-sized satellite housing a few Nvidia chips in August, but the launch date was pushed back. Lonestar Data Systems landed a miniature data center, carrying precious information like an Imagine Dragons song, on the moon a few months ago, though the lander tipped over and died in the attempt. More such launches are planned for the coming months. But it’s “very hard to predict how quickly this idea will become economically feasible,” said Matthew Weinzierl, a Harvard University economist who studies market forces in space. “Space-based data centers may well have some niche uses, such as for processing space-based data and providing national security capabilities,” he said. “To be a meaningful rival to terrestrial centers, however, they will need to compete on cost and service quality like anything else.”

    For now, it’s much more expensive to put a data center in space than it is to put one in, say, Virginia’s Data Center Valley, where power demand could double in the next decade if left unregulated. And as long as staying on Earth remains cheaper, profit-motivated companies will favor terrestrial data-center expansion.

    Still, there is one factor that might encourage OpenAI and others to look toward the heavens: There isn’t much regulation up there. Building data centers on Earth requires obtaining municipal permits, and companies can be stymied by local governments whose residents worry that data center development might siphon their water, raise their electricity bills, or overheat their planet. In space, there aren’t any neighbors to complain, said Michelle Hanlon, a political scientist and lawyer who leads the Center for Air and Space Law at the University of Mississippi. “If you are a US company seeking to put data centers in space, then the sooner the better, before Congress is like, ‘Oh, we need to regulate that.’”

    Sophie Hurwitz

  • Big Businesses Are Doing Carbon Dioxide Removal All Wrong

    Amazon, Google, Microsoft, and H&M are currently investing in durable CDR. A spokesperson for H&M described the fast-fashion company’s purchase of 10,000 metric tons of durable CDR from the Swiss company Climeworks, one of the largest purchases to date, and said H&M plans to use them to neutralize residual emissions. The tech companies affirmed their commitment to reduce emissions first and then use carbon removal to offset residual emissions, though none of them addressed NewClimate Institute’s concerns that they would use large amounts of durable and nondurable CDR to claim progress toward net-zero.

    A statement provided to Grist from TotalEnergies did not address CDR. It instead described the company’s support for carbon capture and storage and “nature-based solutions.” The latter refers to short-lived offsets, such as tree-planting, that the NewClimate Institute does not believe are appropriate for offsetting fossil fuel emissions.

    Apple, Duke Energy, and Shein declined to comment after seeing the report. The remaining 24 companies did not respond to inquiries from Grist.

    Jonathan Overpeck, a climate scientist at the University of Michigan and the dean of its School for Environment and Sustainability, said the NewClimate Institute report is timely. “Right now the whole idea of CDR … is kind of a Wild West scene, with lots of actors promising to do things that may or may not be possible,” he said. He added that companies appear to be using CDR as an alternative to mitigating their climate pollution.

    “The priority has to be on reducing emissions, not on durable CDR at this point,” he told Grist.

    In the near term, durable CDR is doing virtually nothing to offset emissions. As of 2023, only 0.0023 gigatons of CO2 were removed from the atmosphere each year using these methods. That’s roughly one fifteen-thousandth of the annual amount of climate pollution from fossil fuels and cement manufacturing.
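    The scale of that gap is easy to verify with back-of-the-envelope arithmetic. The sketch below assumes annual fossil-fuel and cement CO2 emissions of roughly 37 gigatons (a commonly cited recent estimate, and an assumption here rather than a figure from the report); that assumption puts the ratio around 16,000, the same order as the article’s fifteen-thousandfold comparison.

```python
# Back-of-the-envelope check of the comparison in the text.
durable_cdr_gt = 0.0023   # gigatons of CO2 removed per year (2023 figure)
fossil_cement_gt = 37.0   # gigatons of CO2 emitted per year (assumed)

ratio = fossil_cement_gt / durable_cdr_gt
print(f"Annual emissions outpace durable CDR by roughly {ratio:,.0f}x")
```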

    According to the NewClimate Institute, voluntary initiatives are no substitute for government-mandated emissions reduction targets and investments in durable CDR. To the extent that these initiatives exist, however, the organization says they should provide a clearer definition of what constitutes “durable” carbon removal; determine companies’ responsibility for scaling up durable CDR based on their ongoing and historical emissions, or—perhaps more realistically—on their ability to pay; and require companies to set separate targets for emissions reductions and support for durable CDR. The last recommendation is intended to reinforce a climate action hierarchy that puts mitigation before offsetting. Companies should not “hide inaction on decarbonization behind investments in removals,” as the report puts it.

    Mooldijk said voluntary initiatives can incentivize investments in durable CDR by recognizing “climate contributions.” These might manifest as simple statements about companies’ monetary contributions to durable CDR, instead of claims about the amount of CO2 that they have theoretically neutralized.

    Some of these recommendations were submitted earlier this year to the Science-Based Targets initiative, the world’s most respected verifier of private sector climate targets. The organization is getting ready to update its corporate net-zero standard with new guidance on the use of CDR. Another standard-setter, the International Organization for Standardization, is similarly preparing to release new standards on net-zero, which could curtail some of the most questionable corporate climate claims while also drumming up support for durable CDR.

    John Reilly, a senior lecturer emeritus at the MIT Sloan School of Management, said that ultimately, proper regulation of corporate climate commitments—including of durable CDR—will fall on governments. Companies “are happy to throw a little money into these things,” he said, “but I don’t think voluntary guidelines are ever going to get you there.”

    Joseph Winters

  • Real Estate Speculators Are Swooping In to Buy Disaster-Hit Homes

    “Hi there Gina, hope you’re having a great day,” said another exactly two weeks later. “My name is Christine, I am a land buyer. I’m reaching out to see if you have any plans to sell the lot.” The text was signed by “Twin Acres.” Twin Acres is not a registered real estate broker. Grist’s attempt to text the number back went unanswered.

    Sometimes, Miceli said, she answers the texts. “It depends on my mood. I think there’s been a time or two I’ve said, ‘Go to hell.’” She has no plans to leave. She’s raising her family in the home her husband’s grandparents bought, and she owns a local brewery.

    Some theorists call this phenomenon “disaster gentrification,” in which real estate investors flood a disaster zone to buy up damaged properties for cheap.

    Samantha Montano, a professor of emergency management and author of the book Disasterology, spent years living and working in New Orleans after Hurricane Katrina and saw it happen with her own eyes. In areas like the Lower Ninth Ward, some people displaced by the storm didn’t have the resources to return. Speculators rushed in. Some landowners became instant millionaires, selling their properties to out-of-state developers hoping to rebuild and flip them.

    “The issue of gentrification in New Orleans was there from the beginning,” Montano said. “There were many groups who were warning about that, advocating for housing policy and other recovery policies to account for gentrification. [They] tried to prevent it.” Twenty years later, the demographics of New Orleans have shifted: Lower-income and Black residents have been displaced, and whiter, wealthier new residents took their place. “Certainly that is all very much intertwined in the recovery and in who had access to the resources to return and rebuild—and who didn’t,” she said.

    In the wake of the Eaton Fire in Altadena, California, earlier this year, half of home purchases were by limited liability corporations, according to Dwell, the home design news site. That’s nearly double the share of purchases such corporations typically account for relative to individual homebuyers. Just six companies—among them Ocean Development Inc. and Black Lion Properties LLC—dominated those transactions in Altadena, spending millions of dollars to purchase destroyed properties in historically Black neighborhoods. It’s difficult to find out who these companies are: Often, they contact potential sellers through fake phone numbers or under names that aren’t necessarily attached to real corporations.

    The value of disaster-struck land consistently bounces back fast, meaning that buyers can flip the land or homes—sometimes even without making repairs. As climate change fuels more frequent severe natural disasters across the United States, “disaster investors” seem set to make greater profits than ever—and communities like North St. Louis stand to bear the burden.

    A for-sale sign in Altadena, California, in March, three months after wildfires swept through the area.

    Photograph: Juliana Yamada/Getty Images

    Sophie Hurwitz

  • Antarctica Is Changing Rapidly. The Consequences Could Be Dire

    This story originally appeared on Grist and is part of the Climate Desk collaboration.

    Seen from space, Antarctica looks so much simpler than the other continents—a great sheet of ice set in contrast to the dark waters of the encircling Southern Ocean. Get closer, though, and you’ll find not a simple cap of frozen water, but an extraordinarily complex interplay between the ocean, sea ice, and ice sheets and shelves.

    That relationship is in serious peril. A new paper in the journal Nature catalogs how several “abrupt changes,” like the precipitous loss of sea ice over the last decade, are unfolding in Antarctica and its surrounding waters, reinforcing one another and threatening to send the continent past the point of no return—and flood coastal cities everywhere as the sea rises several feet.

    “We’re seeing a whole range of abrupt and surprising changes developing across Antarctica, but these aren’t happening in isolation,” said climate scientist Nerilie Abram, lead author of the paper. (She conducted the research while at Australian National University but is now chief scientist at the Australian Antarctic Division.) “When we change one part of the system, that has knock-on effects that worsen the changes in other parts of the system. And we’re talking about changes that also have global consequences.”

    Scientists define abrupt change as a bit of the environment changing much faster than expected. In Antarctica these can occur on a range of time scales, from days or weeks for an ice shelf collapse to centuries and beyond for the ice sheets. Unfortunately, these abrupt changes can self-perpetuate and become unstoppable as humans continue to warm the planet. “It’s the choices that we’re making right now, and this decade and the next, for greenhouse gas emissions that will set in place those commitments to long-term change,” Abram said.

    A major driver of Antarctica’s cascading crises is the loss of floating sea ice, which forms during winter. In 2014, it hit a peak extent (at least since satellite observations began in 1978) around Antarctica of 20.11 million square kilometers, or 7.76 million square miles. But since then, the coverage of sea ice has fallen not just precipitously but almost unbelievably, with the ice edge contracting some 75 miles closer to the coast. During winters, when sea ice reaches its maximum coverage, it has declined 4.4 times faster around Antarctica than it has in the Arctic in the last decade.

    Put another way: The loss of winter sea ice in Antarctica over just the past decade is similar to what the Arctic has lost over the last 46 years. “People always thought the Antarctic was not changing compared to the Arctic, and I think now we’re seeing signs that that’s no longer the case,” said climatologist Ryan Fogt, who studies Antarctica at Ohio University but wasn’t involved in the new paper. “We’re seeing just as rapid—and in many cases, more rapid—change in the Antarctic than the Arctic lately.”
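
    To put rough numbers on that comparison (an illustrative back-of-envelope check, not a calculation from the paper): if Antarctica shed in one decade roughly what the Arctic shed over 46 years, the ratio of annual decline rates falls straight out of the two time spans.

    ```python
    # Back-of-envelope check: roughly equal total winter sea-ice loss over
    # very different time spans implies a ratio of annual decline rates.
    # The spans come from the article; the comparison is illustrative only.
    antarctic_span_years = 10   # loss accumulated over the past decade
    arctic_span_years = 46      # comparable loss accumulated since 1978

    rate_ratio = arctic_span_years / antarctic_span_years
    print(f"Antarctic winter decline is roughly {rate_ratio:.1f}x the Arctic rate")
    ```

    The simple span ratio comes out near 4.6; the 4.4-times figure cited above differs slightly because it compares the two regions over the same recent decade rather than their full satellite records.
    
    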

    While scientists need to collect more data to determine if this is the beginning of a fundamental shift in Antarctica, the signals so far are ominous. “We’re starting to see the pieces of the picture begin to emerge that we very well might be in this new state of dramatic loss of Antarctic sea ice,” said Zachary M. Labe, a climate scientist who studies the region at the research group Climate Central, which wasn’t involved in the new paper.

    Matt Simon

  • Climate Change Is Bringing Legionnaire’s Disease to a Town Near You

    This story originally appeared on Vox and is part of the Climate Desk collaboration.

    Air conditioners have been working overtime this hot summer, from those tiny window units to the massive AC towers that serve the tightly packed apartment buildings in major cities. And while they bring the relief of cool air, these contraptions also create the conditions for dangerous bacteria to multiply and spread.

    One particularly nasty bacteria-borne illness is currently spreading in New York City using those enormous cooling units as its vector: Legionnaire’s disease. The bacterial pneumonia, which usually recurs each summer in the US’s largest city, has sickened more than 100 people and killed five in a growing outbreak.

    If you don’t live in New York City or the Northeast, you may never have heard of Legionnaire’s, but this niche public health threat may not be niche for much longer.

    Climate change is helping to make Legionnaire’s disease both more plentiful in the places where it already exists and creating the potential for it to move to new places where the population may not be accustomed to it. Cities in the Northeast and Midwest, where hotter weather meets older infrastructure, have reported more cases in recent years. Recently, Legionella bacteria was discovered in a nursing home’s water system in Dearborn, Michigan—one of the states, along with Ohio, Pennsylvania, Illinois, and Wisconsin, that have seen more activity in the past few years.

    Anyone can contract Legionnaire’s disease by inhaling tiny drops containing the bacteria, and the symptoms—fever, headache, shortness of breath—appear within days. It can cause a severe lung infection, with a death rate of around 10 percent.

    While healthier people often experience few symptoms, the more vulnerable—young children, the elderly, pregnant people, and those with compromised immune systems—face serious danger from the illness. Around 5,000 people die every year in the United States from Legionnaire’s disease, many of them living in low-income housing with outdated cooling equipment where the bacteria can more readily grow and spread.

    Legionnaire’s disease is a microcosm of climate change’s impact on low-income communities. As warmer temperatures facilitate the spread of disease, the most socially vulnerable populations are going to pay the steepest price.

    The Collision of Legionnaire’s Disease, Climate Change, and Economic Disparities

    Legionnaire’s disease was first documented after an unusually aggressive pneumonia outbreak during an American Legion conference in Philadelphia in 1976. Soon, Centers for Disease Control and Prevention scientists confirmed the cause of the mysterious illness: a previously unknown bacterium that was accordingly named Legionella. Legionella, unfortunately, is everywhere—in streams, lakes, and water pipes across the country.

    But it usually occurs in concentrations so low, and in places so remote, that it doesn’t pose a threat to humans. Usually.

    Now, city health officials have found the bacteria in the large cooling tanks that serve massive apartment buildings across New York City, particularly in Harlem. Cooling tanks are ideal places for Legionella to grow and spread: They’re filled with stagnant, warm water that is hospitable to bacterial growth. Like an evaporative cooler, these systems convert that warm water into cool air for apartment dwellers, and in the process they can spray bacteria-laden mist into the open air, dispersing it across the surrounding area, where it can enter people’s lungs. According to the Environmental Protection Agency, 80 percent of Legionnaire’s cases are linked to potable water systems.

    Dylan Scott

  • California Is Flooding School Cafeterias With Vegan Meals—and Kids Like It

    Student nutrition directors like Primer say the foundation that allows schools to experiment with new recipes is California’s universal free lunch program. She notes that, when school lunch is free, students are more likely to actually try and enjoy it: “Free food plus good food equals a participation meal increase every time.”

    Nora Stewart, the author of the Friends of the Earth report, says the recent increase in vegan school lunch options has also been in response to a growing demand for less meat and dairy in cafeterias from climate-conscious students. “We’re seeing a lot of interest from students and parents to have more plant-based [meals] as a way to really help curb greenhouse gas emissions,” she said. A majority of Gen Zers—79 percent—say they would eat meatless at least once or twice a week, according to research conducted by Aramark, a company that provides food services to school districts and universities, among other clients. And the food-service company that recently introduced an all-vegetarian menu in the San Francisco Unified School District credits students with having “led the way” in asking for less meat in their cafeterias. The menu includes four vegan options: an edamame teriyaki bowl, a bean burrito bowl, a taco bowl with a pea-based meat alternative, and marinara pasta.

    Stewart theorizes that school nutrition directors are also increasingly aware of other benefits to serving vegan meals. “A lot of school districts are recognizing that they can integrate more culturally diverse options with more plant-based meals,” said Stewart. In the past five years, the nonprofit found, California school districts have added 41 new vegan dishes to their menus, including chana masala bowls, vegan tamales, and falafel wraps. Dairy-free meals also benefit lactose-intolerant students, who are more likely to be students of color.

    Still, vegan meals are hardly the default in California cafeterias, and in many places, they’re unheard of. Out of the 25 largest school districts in the state, only three elementary districts offer daily vegan options, the same number as did in 2019. According to Friends of the Earth, a fourth of the California school districts they reviewed offer no plant-based meal options; in another fourth, the only vegan option for students is a peanut butter and jelly sandwich. “I was surprised to see that,” said Stewart.

    Frida Garza

  • The Origins of the Climate Haven Myth

    The real estate industry has taken notice. Quite coincidentally, as Hurricane Helene was bearing down on the Southeast last week, Zillow announced a new feature that displays climate risk scores on listing pages alongside interactive maps and insurance requirements. Now, you can look up an address and see, on a scale of 1 to 10, the risk of flooding, extreme temperatures, and wildfires for that property, based on data provided by the climate risk modeling firm First Street. Redfin, a Zillow competitor, launched its own climate risk index using First Street data earlier this year.

    The new climate risk scores on Zillow and Redfin can’t tell you with any certainty whether you’ll be affected by a natural disaster if you move into any given house. But this is a tool that can help guide decisions about how you might want to insure your property and think about its long-term value.

    It’s almost fitting that Zillow and Redfin, platforms designed to help people find the perfect home, are doing the work to show that climate risk is not binary. There are no homes completely free of risk for the same reasons that there’s no such thing as a perfect climate haven.

    Climate risk is a complicated equation that complicates the already difficult and complex calculus of buying a home. Better access to data about risk can help, and a bit more transparency about the insurance aspect of homeownership is especially useful, as the industry struggles to adapt to our warming world and the disasters that come with it.

    “As we start to see insurance costs increase, all that starts to impact that affordability question,” Skylar Olsen, Zillow’s chief economist, told me. “It’ll help the housing market move towards a much healthier place, where buyers and sellers understand these risks and then have options to meet them.”

    That said, knowledge of risk isn’t keeping people from moving to disaster-prone parts of the country right now. People move for countless reasons, including an area’s natural beauty, job prospects, and affordable housing. Those are a few of the reasons why high-risk counties across the country are growing faster than low-risk counties, even in the face of future climate catastrophes, which are both unpredictable and inevitable. It’s almost impossible to know how to prepare ourselves properly for the worst-case scenario.

    “The scale of these events that we’re seeing are so beyond what humans have ever seen,” said Vivek Shandas, an urban planning professor at Portland State University. “No matter what we think might be a manageable level of preparedness and infrastructure, we’re still going to see cracks, and we’re still going to see breakages.”

    That doesn’t mean we shouldn’t build sea walls or find new ways to fight wildfires. In a sense, we have the opportunity to create our own climate havens by making cities more resilient to the risks they face. We can be optimistic about that future.

    Adam Clark Estes

  • Taiwan Makes the Majority of the World’s Computer Chips. Now It’s Running Out of Electricity

    It is not just a case of building more capacity. Taiwan’s energy dilemma is a combination of national security, climate, and political challenges. The island depends on imported fossil fuel for around 90 percent of its energy and lives under the growing threat of blockade, quarantine, or invasion from China. In addition, for political reasons, the government has pledged to close its nuclear sector by 2025.

    Taiwan regularly attends UN climate meetings, though never as an official participant. Excluded at China’s insistence from membership in the United Nations, Taiwan asserts its presence on the margins, convening side events and adopting the Paris Agreement targets of peak emissions before 2030 and achieving net zero by 2050. Its major companies, TSMC included, have signed up to RE100, a corporate renewable-energy initiative, and pledged to achieve net-zero production. But right now, there is a wide gap between aspiration and performance.

    Angelica Oung, a journalist and founder of the Clean Energy Transition Alliance, a nonprofit that advocates for a rapid energy transition, has studied Taiwan’s energy sector for years. When we met in a restaurant in Taipei, she cheerfully ordered an implausibly large number of dishes that crowded onto the small table as we talked. Oung described two major blackouts—one in 2021 that affected TSMC and 6.2 million households for five hours, and one in 2022 that affected 5.5 million households. It is a sign, she says, of an energy system running perilously close to the edge.

    Nicholas Chen argues that government is failing to keep up even with existing demand. “In the past eight years there have been four major power outages,” he said, and “brownouts are commonplace.”

    The operating margin on the grid—the buffer between supply and demand—ought to be 25 percent in a secure system. In Taiwan, Oung explained, there have been several occasions this year when the margin was down to 5 percent. “It shows that the system is fragile,” she said.

    Taiwan’s current energy mix illustrates the scale of the challenge. Last year, Taiwan’s power sector was 83 percent dependent on fossil fuels: Coal accounted for around 42 percent of generation, natural gas 40 percent, and oil 1 percent. Nuclear supplied 6 percent, and solar, wind, hydro, and biomass together nearly 10 percent, according to the Ministry of Economic Affairs.

    Taiwan’s fossil fuels are imported by sea, which leaves the island at the mercy both of international price fluctuations and of a potential blockade by China. The government has sought to shield consumers from rising global prices, but that has resulted in growing debt for the Taiwan Electric Power Company (Taipower), the national provider. In the event of a naval blockade by China, Taiwan could count on about six weeks’ reserves of coal but not much more than a week of liquefied natural gas (LNG). Given that LNG supplies more than a third of electricity generation, the impact would be severe.
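
    The reserve figures can be sketched out with the article’s own round numbers (reserve durations and generation shares here are approximations, not official Taipower data): once LNG stocks run dry, roughly 40 percent of generation disappears almost at once.

    ```python
    # Illustrative blockade arithmetic using the article's approximate figures.
    # Reserve days and generation shares are rounded; a sketch, not a model.
    reserve_days = {"coal": 42, "lng": 8}            # ~six weeks coal, ~one week LNG
    generation_share = {"coal": 0.42, "lng": 0.40,   # shares of power generation
                        "oil": 0.01, "nuclear": 0.06,
                        "renewables": 0.10}

    # Once LNG reserves are exhausted, the grid loses the LNG-fired
    # share of generation almost immediately.
    lost_week_one = generation_share["lng"]
    print(f"Generation lost after ~{reserve_days['lng']} days: {lost_week_one:.0%}")
    ```

    Coal’s six-week cushion buys more time, but by then the grid would already be running without nearly half its usual supply.
    
    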

    The government has announced ambitious energy targets. The 2050 net-zero road map released by Taiwan’s National Development Council in 2022 promised to shut down its nuclear sector by 2025. By the same year, the share of coal would have to come down to 30 percent, gas would have to rise to 50 percent, and renewables would have to leap to 20 percent. None of those targets is on track.

    Isabel Hilton

  • A Lawsuit From Backers of a ‘Startup City’ Could Bankrupt Honduras

    The flurry of private contracts became part of a “kleptocratic” regime, according to one 2017 report by the Carnegie Endowment for International Peace. Nearly all of the ISDS claims have their roots in contracts, laws or other agreements made during this period.

    For the farmers and villagers being pushed off their land, or having their water resources privatized, the development rush converged with spiraling violence.

    “Nowhere are you more likely to be killed for standing up to companies that grab land and trash the environment,” the international watchdog group Global Witness wrote in 2017, “than in Honduras.”

    An opponent of a project that became the subject of two ISDS claims was murdered the following year.

    At the center of these new laws and contracts was Juan Orlando Hernández, who was president of the congress when the ZEDE law was passed and was elected president of Honduras later in 2013. Hernández would serve two terms as president—a step prohibited by the Constitution. The US Department of Justice would later charge that Hernández used millions of dollars in payments from drug cartels to help buy off local officials to secure his electoral victories.

    Eventually, Hernández, his brother and his chief of the national police would be extradited to the United States and convicted of drug trafficking and weapons charges. Hernández, US Attorney General Merrick B. Garland said, used his time in power to run “one of the largest and most violent drug-trafficking conspiracies in the world.”

    Hernández was convicted in March of this year and sentenced to 45 years in prison; his brother is serving a life sentence, and the former national police chief was sentenced to 19 years. Hernández did not reply to a request for an interview from prison.

    Brimen, Honduras Próspera’s CEO, who immigrated to the United States from Venezuela, has said his goal is to provide a model that would foster prosperity, helping alleviate poverty by streamlining unnecessary bureaucracies that hobble governments, especially in parts of Latin America.

    Rosa Danelia Hendrix.

    Photograph: Nicholas Kusnetz/Inside Climate News

    Honduras Próspera said it “has no connection to any corruption in Honduras whatsoever.” The company has not been publicly accused of being involved in corruption or in passing the ZEDE law. But some residents, activists and members of the current government criticize the company for taking advantage of the law, given how it was passed, and for working with Hernández’s administration.

    “They came and did business with the darkest side of our country,” said Rosa Danelia Hendrix, speaking in Spanish. Hendrix serves as president of the federation of patronatos for Roatán and the other Bay Islands, and helped lead the fight against the ZEDEs.

    Up Against an Economic Superpower

    The Castro administration’s fight against the ZEDEs is being waged from Tegucigalpa’s Government Civic Center, a set of gleaming buildings erected by Hernández’s government. The neat, modern plaza sits next to the presidential palace and houses many government offices, but its pedestrian entrance opens onto a busy street without a turn-off, resulting in a chaotic scene of double-parked taxis and honking, as if its architects failed to imagine that citizens would visit.

    There, Fernando Garcia and a team of half-a-dozen young staffers compile documents and compose fervent social media posts denouncing the ZEDEs—there are two others apart from Próspera, focused on agricultural exports and mixed-use development, neither of which has filed an ISDS claim.

    Nicholas Kusnetz, Katie Surma

  • California Can Slake the Thirst of Its Farms by Storing Water Underground

    For example, two winters’ worth of snow followed by intense heat created a flood risk in 2023. State officials decided to release water from Lake Oroville and other reservoirs across Southern California and the Central Valley. Although this helped prevent flooding and sent water downstream, many Californians were upset that fresh water was being wasted. To reduce overflow releases, water agencies and irrigation districts built recharge basins to capture precipitation. But it wasn’t enough. Constant overpumping and a changing climate leave aquifers depleted to this day.

    Their natural recharge process—precipitation accumulating as surface water that percolates through the soil to recharge groundwater aquifers—can also be disrupted by urbanization or impervious covers like pavement, said Bruk Berhanu, a senior researcher in water efficiency and reuse at the Pacific Institute.

    The study suggests more managed aquifer recharge (MAR) infrastructure is needed to adequately catch large amounts of water in short time periods and avoid similar water-loss situations.

    MAR is an intentional method of recharging aquifers, especially those at low levels. Already commonly implemented in California, MAR infrastructure includes conveyance structures that redistribute water to depleted areas, plus two ways of getting water underground: spreading it over land to percolate down or, the more costly option, injecting it directly into wells.

    Yet, to ensure an effective recharge of the aquifers, more monitoring and measurement is required. “Through 2014, growers were not required to monitor or report any withdrawals or injections to aquifers,” said Schwabe.

    Even so, California has more monitoring practices than other states, mainly because water availability is not as big a concern elsewhere, said Berhanu. Monitoring standards vary by state and region, and regulations for urban areas differ from those for agricultural or industrial areas. In his work assessing the country’s volumetric potential for water-use efficiency at the municipal level, Berhanu found that “there is no federal regulatory framework for monitoring or reporting. In a lot of cases, water supplies aren’t even metered.”

    Even in areas that did have regulations, the reports were often infrequent or incomplete; the UC Riverside researchers are working on expanding the few accurate monitoring systems put in place in Southern California by proactive growers.

    Additionally, the study proposes voluntary water markets where farmers with a surplus of water can trade it to another farmer in need. It’s a win-win process: The selling farmer makes extra profit and the other gets much-needed water. “With prices based on scarcity plus delivery costs, such a marketplace would have incentives for storage and efficient use,” Schwabe said in a press release.

    Berhanu added that water-trading markets can work in some areas but not in others. “It needs a very strong governance framework to make sure all of the players are playing according to the rules.” The process will need to have improved monitoring practices, transparent data, and clear external costs, he said. “The more decentralized you get with how these transactions are being made, it becomes very difficult to coordinate the overall watershed-scale system benefits.”

    The study also mentions the value of reusing wastewater. Historically, wastewater has been treated to an environmental safety standard then released into the ocean or groundwater system. Over time, natural processes will clean it. Instead of waiting for the environment to purify it, water treatment facilities can repurpose the wastewater for irrigation, commercial use, or recharging purposes.

    As of 2023, water treatment plants can purify wastewater so well that people can drink it. “At some point, the water that we use will become someone else’s water for drinking or irrigation,” said Berhanu. Whether wastewater is for drinking or recharging aquifers, California plants are expanding their operations to include recycling methods so they can produce a sufficient supply.

    “The overall volume of water in the world doesn’t really change. We need to shift our thinking from looking at how much water is available at one point of time to trying to better integrate our practices with the entire water cycle,” said Berhanu.

    The study goes on to mention numerous efficiency-based and management solutions, like sustainable farming practices, land repurposing, and desalination to help the agriculture industry adjust.

    “Now is the time to think about possibilities and opportunities for collaboration across agriculture, municipalities, and the environment to invest in smart investments that capture more water and put it in the ground,” said Schwabe.

    Caroline Marshall Reinhart

  • AI Has Helped Shein Become Fast Fashion’s Biggest Polluter

    This story originally appeared in Grist and is part of the Climate Desk collaboration.

    In 2023, the fast-fashion giant Shein was everywhere. Crisscrossing the globe, airplanes ferried small packages of its ultra-cheap clothing from thousands of suppliers to tens of millions of customer mailboxes in 150 countries. Influencers’ “#sheinhaul” videos advertised the company’s trendy styles on social media, garnering billions of views.

    At every step, data was created, collected, and analyzed. To manage all this information, the fast fashion industry has begun embracing emerging AI technologies. Shein uses proprietary machine-learning applications — essentially, pattern-identification algorithms — to measure customer preferences in real time and predict demand, which it then services with an ultra-fast supply chain.

    As AI makes the business of churning out affordable, on-trend clothing faster than ever, Shein is among the brands under increasing pressure to become more sustainable, too. The company has pledged to reduce its carbon dioxide emissions by 25 percent by 2030 and achieve net-zero emissions no later than 2050.

    But climate advocates and researchers say the company’s lightning-fast manufacturing practices and online-only business model are inherently emissions-heavy — and that the use of AI software to catalyze these operations could be cranking up its emissions. Those concerns were amplified by Shein’s third annual sustainability report, released late last month, which showed the company nearly doubled its carbon dioxide emissions between 2022 and 2023.

    “AI enables fast fashion to become the ultra-fast fashion industry, Shein and Temu being the fore-leaders of this,” said Sage Lenier, the executive director of Sustainable and Just Future, a climate nonprofit. “They quite literally could not exist without AI.” (Temu is a rapidly rising ecommerce titan, with a marketplace of goods that rival Shein’s in variety, price, and sales.)

    In the 12 years since Shein was founded, it has become known for its uniquely prolific manufacturing, which reportedly generated over $30 billion of revenue for the company in 2023. Although estimates vary, a new Shein design may take as little as 10 days to become a garment, and up to 10,000 items are added to the site each day. The company reportedly offers as many as 600,000 items for sale at any given time with an average price tag of roughly $10. (Shein declined to confirm or deny these reported numbers.) One market analysis found that 44 percent of Gen Zers in the United States buy at least one item from Shein every month.

    That scale translates into massive environmental impacts. According to the company’s sustainability report, Shein emitted 16.7 million total metric tons of carbon dioxide in 2023 — more than what four coal power plants spew out in a year. The company has also come under fire for textile waste, high levels of microplastic pollution, and exploitative labor practices. According to the report, polyester — a synthetic textile known for shedding microplastics into the environment — makes up 76 percent of its total fabrics, and only 6 percent of that polyester is recycled.
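
    The coal-plant comparison is easy to sanity-check. Assuming, as a hypothetical round figure, that a typical coal plant emits on the order of 4 million metric tons of CO2 per year (the per-plant number is not from Shein’s report), the reported total works out to a little over four plants’ worth:

    ```python
    # Rough equivalence check; the per-plant emissions figure is an
    # assumed round number, not a value from Shein's sustainability report.
    shein_total_mt = 16.7            # million metric tons of CO2, 2023 report
    coal_plant_mt_per_year = 4.0     # assumed typical annual plant emissions

    equivalent_plants = shein_total_mt / coal_plant_mt_per_year
    print(f"Roughly {equivalent_plants:.1f} coal plants' annual emissions")
    ```

    That lines up with the article’s framing of “more than what four coal power plants spew out in a year.”
    
    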

    Sachi Mulkey

  • The Mosquito-Borne Disease ‘Triple E’ Is Spreading in the US as Temperatures Rise

    The disease is spread by two types of mosquito. The first is a species called Culiseta melanura, or the black-tailed mosquito. This mosquito tends to live in hardwood bogs and feeds on birds like robins, herons, and wrens, spreading the virus among them. But the melanura mosquito doesn’t often bite mammals. A different mosquito species, Coquillettidia perturbans, is responsible for most of the human cases of the disease reported in the US. The perturbans mosquito picks up the EEE virus when it feeds on birds and then infects the humans and horses that it bites. Toward the end of the summer, when mosquitoes have reached their peak numbers and start jostling for any available blood meal, human cases start cropping up.

    A pest control employee checks a swamp for mosquitoes in Stratham, New Hampshire.

    Photograph: Darren McCollester/Getty Images

    Andreadis, who published a historical retrospective on the progression of triple E in the northeastern US in 2021, said climate change has emerged as a major driver of the disease.

    “We’ve got milder winters, we’ve got warmer summers, and we’ve got extremes in both precipitation and drought,” he said. “The impact that this has on mosquito populations is probably quite profound.”

    Warmer global average temperatures generally produce more mosquitoes, no matter the species.

    Studies have shown that warmer air temperatures up to a certain threshold, around 90 degrees Fahrenheit, shorten the amount of time it takes for C. melanura eggs to hatch. Higher temperatures in the spring and fall extend the number of days mosquitoes have to breed and feed. And they’ll feed more times in a summer season if it’s warmer—mosquitoes are ectothermic, meaning their metabolism speeds up in higher temperatures.

    Rainfall, too, plays a role in mosquito breeding and activity, since mosquito eggs need water to hatch. A warmer atmosphere holds more moisture, which means that even small rainfall events dump more water today than they would have last century. The more standing water there is in roadside ditches, abandoned car tires, ponds, bogs, and potholes, the more opportunities mosquitoes have to breed. And warmer water decreases the incubation period for C. melanura eggs, leading one study to conclude that warmer-than-average water temperatures “increase the probability for amplification of EEE.”


    Zoya Teirstein


  • Scientists Plan ‘Doomsday’ Vault on Moon



    Thanga and his team have sketched a system that would use solar panels and batteries to provide the power to push temperatures inside a lava tube down to the deep freeze needed to create their lunar ark. This is the defining difference between Thanga’s design and Hagedorn’s thought experiment. Where Thanga’s group would aim to actively cool the ark, Hagedorn and the Smithsonian team have envisioned a repository that uses natural features of the moon to keep the samples cryogenic.

    “The idea behind our proposal is that, to the extent we could make it, it would be passive,” Parenti said. She pointed out that people have long speculated about the idea of building something that stores materials on the moon, but all the ideas have required a crew to maintain them.

To passively maintain a perpetual deep freeze, they’ve proposed building the repository at the south pole of the moon where, inside some craters, coincidences of celestial geometry have aligned to create areas of permanent shadow, and temperatures can be as low as –196 degrees Celsius. Those conditions would mean that the samples could be stored without need for a crew, and they could be maintained with rovers and robotics alone.

While in theory all of this makes these permanent polar shadows ideal for such a project, “we don’t know the basics of what that place is,” Thanga countered. Just last month, NASA canceled a mission that would have sent the first rover to explore the pole, in part because of the technical challenges involved. “This is one of the ironic things,” Thanga said. “It’s nearby Earth, but it’s perhaps one of the most extreme places in the entire solar system.”

    Fitzpatrick feels confident, however, that NASA’s current lunar roadmap will provide ample opportunity to explore and understand those dark polar realms, including a mission scheduled for later this year that plans to land on a ridge overlooking a polar shadow. But as NASA looks to explore those regions, Thanga pointed out, it’s possible that we might merely learn more about how hard it is to exist and operate in that level of cold.

    “Just operating in cryogenic conditions, that’s not trivial at all,” Thanga said. “Mechanical things do weird things. They may freeze up, latch up, you name it, under spacelike conditions. Even from moderately cold conditions in a vacuum, we have a phenomenon called cold welding,” where two pieces of metal fuse on contact.

Thanga argues that the more sensible approach, then, is to build the ark in a lava tube. His colleagues in planetary science expect those tubes to closely resemble the ones on Earth, albeit much colder, which gives researchers and engineers a baseline for what to expect and how to plan for it.

As with Hagedorn’s concept, however, the price and schedule have yet to be refined. But Thanga expects that, once the design is finalized (which could still take years), the ark could be built and assembled faster and more cheaply than the International Space Station.


    Ayurella Horn-Muller, Syris Valentine
