ReportWire

Tag: Environmental Health

  • Untouched Brazilian Amazon Regions Lack Ecological Study.

    Newswise — Many parts of the Brazilian Amazon are neglected in ecological research, for several reasons, according to an article published in the journal Current Biology. Authored by Joice Ferreira of the Federal University of Pará (UFP) and colleagues from many countries who also belong to the Synergize Consortium, the article identifies the areas missing from ecological research and the factors that have determined these gaps, pinpointing opportunities for the planning of new investments in research in the region.

    The researchers analyzed data from 7,694 ecological research sites to understand how logistics and human influence on the forests could explain the probability of research being done in different parts of the Amazon region. The period analyzed was 2010-20, and the survey covered nine groups of organisms: benthic invertebrates (organisms living on the bottom sediments of water bodies), heteropterans (true bugs), odonates (dragonflies and damselflies), fish, macrophytes (aquatic plants), birds, woody vegetation, ants, and dung beetles.
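
    To illustrate the kind of analysis described, the sketch below fits a logistic model that predicts whether a grid cell hosts a research site from accessibility and human-influence predictors. All variable names, coefficients, and data are hypothetical stand-ins, not the consortium’s actual specification.

    ```python
    # Minimal sketch: probability of ecological research in a grid cell as a
    # function of logistics and human influence. All data are synthetic.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    n_cells = 5000

    X = np.column_stack([
        rng.exponential(50, n_cells),   # hypothetical: distance to road/river (km)
        rng.exponential(200, n_cells),  # hypothetical: distance to research facility (km)
        rng.uniform(0, 1, n_cells),     # hypothetical: human footprint index
    ])
    # Synthetic outcome: research more likely near access routes and facilities
    logit = 1.0 - 0.02 * X[:, 0] - 0.005 * X[:, 1] + 1.5 * X[:, 2]
    y = rng.random(n_cells) < 1 / (1 + np.exp(-logit))

    model = LogisticRegression().fit(X, y)
    print("coefficients:", model.coef_)  # signs show each factor's direction of effect
    ```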

    “The consortium contacted people who had contributed to databases, standardized inventories and studies involving sampling efforts. Information was thereby compiled on three groups that represent Amazonian biodiversity: vertebrates, invertebrates, and plants in upland forests, flooded forests and aquatic environments – rivers, lakes, etc. This is the first paper published by the group,” said Mario Ribeiro de Moura, a researcher at the State University of Campinas’s Institute of Biology (IB-UNICAMP) in São Paulo, Brazil. He is a co-author of the article and a member of the consortium.

    The findings showed that 15%-18% of the most neglected areas in the Brazilian Amazon are highly susceptible to climate change by 2050. The least studied areas are also the most threatened, lying in the vicinity of the “deforestation arc”, a swathe of territory extending along the southern, southeastern and eastern borders of Amazonia, mostly in the states of Acre, Amazonas, Maranhão, Mato Grosso, Pará, Rondônia and Tocantins.

    The main gaps in Amazonian ecological research were in upland areas. “This was expected and probably reflects the role played by navigable waterways in facilitating access to blackwater and whitewater inundation forest, as well as other aquatic environments,” Moura said.

    Not by chance, the least pessimistic scenarios appeared along rivers in northeast Pará and Roraima, southeastern Acre and northern Rondônia. “In these areas, the future impact of climate change will be less severe, and we have more knowledge of the species that live there,” Moura said.

    The study was supported by FAPESP via two postdoctoral fellowships in Brazil. One of the fellowships was awarded to Raquel de Carvalho, and the other to Angélica Faria de Resende. Moura was supported by a Young Investigator Grant and a scholarship in Brazil.

    Research biases

    The scientists mapped the most neglected areas of the Amazon region in terms of ecological research and superimposed on this map the areas most likely to be affected by climate change, based on a metric they developed to reflect its intensity. Deforestation and degradation data were taken from a recent study published in Science on the drivers of deforestation in the Amazon. The correlations between datasets showed that ecological research in the Amazon is more frequent in already deforested areas than in areas where deforestation is predicted in the next three decades.

    “Environmental change is happening at a very fast pace, including climate change and landscape transformation. To understand how these changes affect biodiversity, we need to know what was in a given area before they happened. The Amazon is one of the last significantly conserved refuges of tropical biodiversity and essential to an understanding of the isolated effect of climate change and habitat destruction on biodiversity,” Moura said. “The study highlighted the areas at risk of environmental change in the coming years and not yet explored by scientists. Without sufficient ecological research, we won’t be able to know what’s changing and what’s being lost.”

    With regard to logistics, accessibility and distance to research facilities were key predictors of the probability of research being done. “Access is a mixed blessing, as evidenced by the deforestation arc. Easy access enables researchers to reach more areas, so part of this immense arc has been thoroughly studied, but it also enables those responsible for deforestation and other malefactors to reach these areas. Little information is available on the threatened areas at the edges of the deforestation arc,” Moura said.

    Access, and hence research probability, increased with proximity to transportation and research facilities for all upland organisms and most representatives of wetlands and aquatic habitats. “The length of the dry season determines ease of access by water. In flooded forest areas, the shorter the dry season, the easier it is to gain access by river, and this increases the likelihood of research. In upland areas, more severe dry seasons facilitate overland access, with less mud and inundation,” Moura said.

    Forest degradation and land tenure were only moderate predictors, but their importance was consistent across all organism groups. Both factors affected ecological research in the same direction, with research probability slightly declining in more degraded areas and Indigenous territories, but increasing in conservation units.

    In short, less research is done in degraded areas and Indigenous territories, and more in conservation units. “It’s harder to obtain access to Indigenous communities, or there may be a lack of administrative mechanisms that connect researchers with the bodies that regulate such access and with the communities themselves. We need to improve integration between the parties involved, and above all engage local communities in the knowledge creation process. Far more research goes on in conservation units than Indigenous territories, although both are types of protected area,” Moura said.

    In Carvalho’s opinion, this is a distribution problem, since Indigenous territories account for some 23% of the total area of the Brazilian Amazon. “At the same time, several Indigenous territories are the best conserved parts of the Amazon biome. It would be very valuable if we could do research there,” she said.

    Novel strategies

    According to Moura, the Amazon Rainforest is under-represented in global databases used as a source for research on biodiversity. “As noted in the article, we need to integrate the information we have about the Amazon with global databases. The Synergize Consortium has projects that could contribute to global assessments. The information reviewed for this study mostly complies with the requirements of other databases and could be used to improve the representativeness of Amazonian biodiversity in future research on global change. The consortium plans to use this study as a basis for establishing itself as an important collaborative network for other research groups interested in analyzing environmental changes in the Amazon,” he said.

    The Synergize Consortium’s principal investigators are Ferreira, who is affiliated with EMBRAPA Amazônia Oriental, a unit of the Brazilian Agricultural Research Corporation (EMBRAPA); and Filipe França, a researcher at the University of Bristol in the United Kingdom. Jos Barlow, a professor at Lancaster University, also in the UK, is a co-author of the article and a member of the consortium’s steering committee.

    Moura believes the group’s findings can be used to develop novel funding strategies for the Amazon. “Once you’ve identified the gaps, you can target them for investment in conservation and research, or give more weight to research in these areas in future calls for proposals. Public policy and action plans can take these results into consideration, especially as far as biodiversity monitoring and inventorying are concerned,” he said.

    São Paulo Research Foundation (FAPESP)

  • RUDN Ecologists Describe Strong Desertification in Northern Algeria

    Newswise — RUDN University ecologists and colleagues from Algeria, Greece, Egypt, and Russia have determined the scale and causes of desertification in northern Algeria. The analysis was carried out using satellite images in different spectral ranges. Over six years, the area of usable land shrank by a factor of 1.5 to 9. The results were published in The Egyptian Journal of Remote Sensing and Space Science.

    The loss of the biological function of land is called desertification. The composition of the soil changes, the sand content increases, and the vegetation becomes poorer. Such lands can no longer be cultivated; livestock cannot graze on them. There are several regions on Earth with a high risk of desertification. One of them is North Africa. Remote monitoring using satellite images helps track desertification. However, different soil types may be difficult to distinguish by satellite data if they have high sand content. It is important to interpret the images correctly. RUDN University ecologists and colleagues from Algeria, Greece, Egypt, and Russia determined which satellite data is best suited for determining soil composition.

    “There is a problem with the similarity of reflectivity between different soils with high sand content. These are, for example, sand, loamy sand and clay. Therefore, it is necessary to develop more accurate spectral indicators to distinguish soil structures easily,” said Dmitry Kucher, Ph.D., head of the Scientific Center for Research, Integrated Design and Development of Urban and Agricultural Development of the RUDN University.

    Ecologists conducted the study in the Nemamcha region in northern Algeria. This region has undergone rapid desertification. To trace spatiotemporal changes in the topsoil, RUDN University ecologists used satellite images from 2013 and 2019 and soil samples. Then they calculated the correlation between these data and analyzed the possible causes of desertification.

    It turned out that blue and near-infrared images are best suited for determining the proportion of sand and clay. Using them, RUDN University ecologists built a regression model that determines the composition of the soil with sufficient accuracy—the coefficient of determination (an indicator of model quality) reached 89%.
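
    The reported band-based regression can be sketched as follows. Only the modeling pattern (two predictor bands, the coefficient of determination as the quality metric) is taken from the article; the reflectance values, coefficients, and noise level are invented for illustration.

    ```python
    # Sketch: regress measured sand fraction on blue and near-infrared
    # reflectance, then score the fit with R^2. Data are synthetic.
    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.metrics import r2_score

    rng = np.random.default_rng(1)
    n = 120                                   # soil sampling points (hypothetical)
    blue = rng.uniform(0.05, 0.25, n)         # blue-band surface reflectance
    nir = rng.uniform(0.2, 0.5, n)            # near-infrared surface reflectance
    sand = 20 + 120 * blue + 60 * nir + rng.normal(0, 4, n)  # % sand (synthetic)

    X = np.column_stack([blue, nir])
    model = LinearRegression().fit(X, sand)
    print("R^2 =", r2_score(sand, model.predict(X)))  # the study reports R^2 of 0.89
    ```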

    Changes in soil composition between 2013 and 2019 indicate noticeable desertification: the share of land suitable for agriculture in the region fell from 31% in 2013 to 4% in 2019, and the grazing area fell from 21% to 13%. Ecologists also named the main cause of desertification in this area – aeolian processes, that is, wind erosion and the transport of sand by the wind. These processes are intensified, among other things, by human activity – overly intensive cattle breeding and agriculture.

    “We found a dominant role for aeolian processes, which are exacerbated by low topography, overgrazing, climate change, and over-intensive agriculture. We recommend investigating the protective role of dry grasslands and desert shrublands against erosion and restoring degraded lands. We urge legislators to implement remote monitoring strategies and restore vegetation to combat desertification,” said Dmitry Kucher, Ph.D., Head of the Scientific Center for Research, Integrated Design and Development of Urban and Agriculture at RUDN University.

    Russian Foundation for Basic Research

  • Organic nitrogen aerosol plays a vital role in global nitrogen deposition.

    Newswise — This study, led by Dr Yumin Li of Southern University of Science and Technology (SUSTech), was a collaboration between Professor Tzung-May Fu’s team at SUSTech and Professor Jian Zhen Yu’s team at Hong Kong University of Science and Technology (HKUST). The research emphasized the previously underestimated significance of atmospheric organic nitrogen (ON) aerosol deposition on ecosystems. Additionally, the ecological effects of ON aerosol deposition are anticipated to increase due to global warming and the decrease in nitrogen oxide emissions from human activities.

    Atmospheric deposition of organic nitrogen (ON) plays a crucial role in the global nitrogen cycle. Surface measurements showed that 2% to 70% of the local atmospheric deposition flux of total nitrogen was organic. However, previous models have largely neglected the spatial and chemical variations of atmospheric ON, leading to inadequate assessment of its global impacts.

    The scientists from SUSTech and HKUST developed a comprehensive global model of atmospheric gaseous and particulate ON, incorporating the latest knowledge on emissions and secondary formation. Their simulated surface concentrations of atmospheric particulate ON (ONp) were highly consistent with global observations, a feat that had not been achieved previously. Additionally, their simulated atmospheric deposition flux aligned with global observations within an order of magnitude. The scientists estimated that the global atmospheric ON deposition was 26 Tg N yr⁻¹. The majority of this deposition (23 Tg N yr⁻¹) occurred in the form of ON aerosol and accounted for 19% of the global atmospheric total N deposition (124 Tg N yr⁻¹). The main sources of ON aerosols were wildfires, ocean emissions, and secondary formation.
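
    As a quick back-of-the-envelope check (ours, not the study’s), the reported percentages follow directly from the fluxes:

    ```python
    # Consistency check of the reported nitrogen deposition budget (Tg N per year).
    total_on = 26    # global atmospheric organic nitrogen (ON) deposition
    aerosol_on = 23  # portion deposited as ON aerosol
    total_n = 124    # global atmospheric total nitrogen deposition

    print(f"ON aerosol share of ON deposition:      {aerosol_on / total_on:.0%}")  # ~88%
    print(f"ON aerosol share of total N deposition: {aerosol_on / total_n:.0%}")   # ~19%
    ```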

    “Our simulation showed that the deposition of ON aerosol from the atmosphere is a crucial external source of nitrogen to nitrogen-limited ecosystems worldwide, such as the boreal forests, tundras, and the Arctic Ocean,” Fu says. In a future warming climate, wildfires will likely become more frequent and intense. Climate warming will also lead to surface ocean stratification, making atmospheric ON deposition an increasingly important source of nitrogen to these ecosystems. “We need to further examine the environmental impacts of atmospheric ON aerosol and how those impacts respond to climate change.”

    Science China Press

  • A cheaper, safer alternative to lithium-ion batteries: aqueous rechargeable batteries

    Newswise — This summer, the planet is suffering from unprecedented heat waves and heavy rainfall. Developing renewable energy and expanding the associated infrastructure have become an essential survival strategy for a planet in crisis, but renewables have an obvious limitation: electricity production is volatile because it depends on uncertain variables such as changeable weather. For this reason, demand for energy storage systems (ESS) that can store and supply electricity as needed is ever-increasing, but the lithium-ion batteries (LIBs) currently employed in ESS are not only highly expensive but also prone to fire, so there is an urgent need to develop cheaper and safer alternatives.

    A research team led by Dr. Oh, Si Hyoung of the Energy Storage Research Center at the Korea Institute of Science and Technology (KIST) has developed a highly safe aqueous rechargeable battery that offers a timely substitute meeting these cost and safety needs. Despite their lower achievable energy density, aqueous rechargeable batteries have a significant economic advantage, as their raw materials cost much less than those of LIBs. However, hydrogen gas persistently generated by parasitic water decomposition causes a gradual rise in internal pressure and eventual depletion of the electrolyte, which poses a sizeable threat to battery safety and has made commercialization difficult.

    Until now, researchers have often tried to sidestep this issue by installing a surface protection layer that minimizes the contact area between the metal anode and the electrolyte. However, corrosion of the metal anode and the accompanying decomposition of water in the electrolyte are inevitable in most cases, and the incessant accumulation of hydrogen gas can cause a detonation in long-term operation.

    To cope with this critical issue, the research team developed a composite catalyst consisting of manganese dioxide and palladium, which automatically converts the hydrogen gas generated inside the cell back into water, ensuring both the performance and the safety of the cell. Manganese dioxide does not react with hydrogen gas under normal circumstances, but when a small amount of palladium is added, hydrogen is readily absorbed by the catalyst and converted back into water. In a prototype cell loaded with the newly developed catalyst, the internal pressure of the cell remained well below the safety limit, and no electrolyte depletion was observed.

    These results effectively solve one of the most concerning safety issues in aqueous batteries, marking a major stride toward their commercial application in ESS. Replacing LIBs with cheaper and safer aqueous batteries could even trigger rapid growth of the global ESS market.

    “This technology pertains to a customized safety strategy for aqueous rechargeable batteries, based on a built-in active safety mechanism through which risk factors are automatically controlled,” said Dr. Oh, Si Hyoung of KIST. “Moreover, it can be applied to various industrial facilities where hydrogen gas leakage is a major safety concern (for instance, hydrogen gas stations, nuclear power plants, etc.) to protect public safety.”

    ###

    KIST was established in 1966 as the first government-funded research institute in Korea. KIST now strives to solve national and social challenges and secure growth engines through leading and innovative research. For more information, please visit KIST’s website at https://eng.kist.re.kr/

    This research was supported by the Ministry of Science and ICT (Minister Lee Jong-ho) through the Nano Future Material Source Technology Development Project and the Mid-Career Researcher Support Project, and the results were published on August 1 in the international journal Energy Storage Materials (IF 20.4).

    National Research Council of Science and Technology

  • Commonly Used Herbicide is Harmful to Adolescent Brain Function

    Newswise — Herbicides are the most used class of pesticides worldwide, with uses in agriculture, homes and industry. Exposures to two of the most popular herbicides were associated with worse brain function among adolescents, according to a study led by researchers at the Herbert Wertheim School of Public Health and Human Longevity Science at University of California San Diego.

    In the Oct. 11, 2023 online issue of Environmental Health Perspectives, the researchers reported measuring metabolite concentrations of two commonly used herbicides — glyphosate and 2,4-dichlorophenoxyacetic acid (2,4-D) — and the insect repellent DEET in urine samples collected in 2016 from 519 adolescents, aged 11 to 17, living in the agricultural county of Pedro Moncayo, Ecuador. Researchers also assessed neurobehavioral performance in five areas: attention and inhibitory control, memory and learning, language, visuospatial processing, and social perception.

    “Many chronic diseases and mental health disorders in adolescents and young adults have increased over the last two decades worldwide, and exposure to neurotoxic contaminants in the environment could explain a part of this increase,” said senior author Jose Ricardo Suarez, M.D., Ph.D., M.P.H., associate professor in the Herbert Wertheim School of Public Health.

    Among the findings:

    • Glyphosate, a nonselective herbicide used in many crops, including corn and soy, and for vegetation control in residential settings, was detected in 98 percent of participants.
    • 2,4-D, a broadleaf herbicide used on lawns, aquatic sites, and agricultural crops, was detected in 66 percent of participants.
    • Higher amounts of 2,4-D in urine were associated with lower neurobehavioral performance in the domains of attention and inhibitory control, memory and learning, and language.
    • Glyphosate concentration in urine was associated with lower scores in social perception only, while DEET metabolites were not associated with neurobehavioral performance.

    Following the introduction of genetically modified, glyphosate-resistant “Roundup-ready” crops in 1996 and 2,4-D resistant crops in 2014, there have been substantial increases in glyphosate and 2,4-D use, making them the most widely used herbicides in the world, wrote the authors.

    “There is considerable use of herbicides and insecticides in agricultural industries in both developed and developing nations around the world, raising exposure potential for children and adults, especially if they live in agricultural areas, but we don’t know how it impacts each stage of life,” said first author Briana Chronister, doctoral candidate in the UC San Diego – San Diego State University Joint Doctoral Program in Public Health.

    Previous studies have linked exposure to some of the most used insecticides to altered neurocognitive performance, while other insecticides may also affect mood and brain development. Today, 20 percent of adolescents and 26 percent of young adults have diagnosable mental health conditions such as anxiety, depression, impulsivity, aggression or learning disorders.

    The authors reported that 2,4-D was negatively associated with performance in all five neurobehavioral areas, but statistically significant associations were observed with attention and inhibitory control, memory and learning, and language. Glyphosate had a significant negative association only with social perception, a test that measures the ability to recognize emotions, while DEET metabolites were not associated with neurobehavioral alterations.
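
    The kind of association analysis reported here can be sketched as a covariate-adjusted linear regression of a neurobehavioral score on a log-transformed urinary metabolite concentration. Everything below is synthetic and only illustrates the pattern; the study’s actual models, covariates, and data are described in the paper.

    ```python
    # Sketch: association between urinary 2,4-D and a neurobehavioral score,
    # adjusted for one covariate. All numbers are synthetic.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(2)
    n = 519                                 # number of adolescent participants
    log_24d = rng.normal(0, 1, n)           # log urinary 2,4-D concentration
    age = rng.uniform(11, 17, n)            # covariate: age in years
    score = 100 - 1.2 * log_24d + 0.5 * age + rng.normal(0, 5, n)

    X = sm.add_constant(np.column_stack([log_24d, age]))
    fit = sm.OLS(score, X).fit()
    print(fit.params[1])  # negative coefficient: lower score with higher 2,4-D
    ```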

    “Hundreds of new chemicals are released into the market each year, and more than 80,000 chemicals are registered for use today,” said Suarez. “Sadly, very little is known about the safety and long-term effects on humans for most of these chemicals. Additional research is needed to truly understand the impact.”

    This research is a study within ESPINA: The Study of Secondary Exposures to Pesticides Among Children and Adolescents, a prospective cohort study funded by the National Institute of Environmental Health Sciences, part of the National Institutes of Health, the National Institute of Occupational Safety and Health, and other private funding sources. ESPINA aims to understand the effects of pesticide exposure on human development from childhood through adulthood.

    In 2022, Suarez and his team completed year 14 of follow-up of study participants with plans to evaluate whether the observed associations persist into early adulthood.

    Co-authors include: Kun Yang, Audrey R. Yang, Tuo Lin, Xin Tu, Harvey Checkoway, Jose Suarez-Torres, Sheila Gahagan, and Raeanne C. Moore, UC San Diego; Dolores Lopez-Paredes and Danilo Martinez, Fundación Cimas del Ecuador; and Dana Barr, Emory University.

    This research was funded, in part, by the National Institutes of Health (R01ES025792, R01ES030378, R21ES026084, U2CES026560, P30ES019776, 5T32MH122376).

    Disclosures: The authors do not have any conflicts of interest to report.

    DOI: 10.1289/EHP11383

    University of California San Diego

  • Climate-driven heat may render parts of Earth uninhabitable

    Newswise — If global temperatures increase by 1 degree Celsius (C) or more above current levels, billions of people will be exposed each year to heat and humidity so extreme that they will be unable to naturally cool themselves, according to interdisciplinary research from the Penn State College of Health and Human Development, Purdue University College of Sciences and Purdue Institute for a Sustainable Future.

    Results from a new article published today (Oct. 9) in Proceedings of the National Academy of Sciences indicated that warming of the planet beyond 1.5 C above preindustrial levels will be increasingly devastating for human health across the planet.  

    Humans can only withstand certain combinations of heat and humidity before their bodies begin to experience heat-related health problems, such as heat stroke or heart attack. As climate change pushes temperatures higher around the world, billions of people could be pushed beyond these limits.  

    Since the start of the industrial revolution, when humans began to burn fossil fuels in machines and factories, temperatures around the world have increased by about 1 C, or 1.8 degrees Fahrenheit (F). In 2015, 196 nations signed the Paris Agreement, which aims to limit worldwide temperature increases to 1.5 C above pre-industrial levels.

    The research team modeled global temperature increases ranging from 1.5 C to 4 C — the worst-case scenario, in which warming would begin to accelerate — to identify areas of the planet where warming would lead to heat and humidity levels that exceed human limits.

    “To understand how complex, real-world problems like climate change will affect human health, you need expertise both about the planet and the human body,” said co-author W. Larry Kenney, professor of physiology and kinesiology and the Marie Underhill Noll Chair in Human Performance at Penn State. “I am not a climate scientist, and my collaborators are not physiologists. Collaboration is the only way to understand the complex ways that the environment will affect people’s lives and begin to develop solutions to the problems that we all must face together.”

    A threat to billions 

    The ambient wet-bulb temperature limit for young, healthy people is about 31 C, which is equal to 87.8 F at 100% humidity, according to work published last year by Penn State researchers. However, in addition to temperature and humidity, the specific threshold for any individual at a specific moment also depends on their exertion level and other environmental factors, including wind speed and solar radiation. In human history, temperatures and humidity that exceed human limits have been recorded only a limited number of times — and only for a few hours at a time — in the Middle East and Southeast Asia, according to the researchers.  
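
    To make the wet-bulb concept concrete, the sketch below uses Stull’s (2011) empirical approximation, a standard published formula that is not from this study. It shows why saturated air near 31 C sits at the identified limit while much hotter, drier air can remain below it.

    ```python
    # Wet-bulb temperature from air temperature and relative humidity, using
    # Stull's (2011) empirical fit (valid near standard surface pressure).
    import math

    def wet_bulb_stull(t_c: float, rh_pct: float) -> float:
        """Wet-bulb temperature (C) from air temperature (C) and RH (%)."""
        return (t_c * math.atan(0.151977 * math.sqrt(rh_pct + 8.313659))
                + math.atan(t_c + rh_pct)
                - math.atan(rh_pct - 1.676331)
                + 0.00391838 * rh_pct ** 1.5 * math.atan(0.023101 * rh_pct)
                - 4.686035)

    print(wet_bulb_stull(31.0, 99.0))  # ~31 C: near-saturated air at the limit
    print(wet_bulb_stull(40.0, 40.0))  # ~28.6 C: hotter but drier air stays below 31 C
    ```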

    Results of the study indicate that if global temperatures increase by 2 C above pre-industrial levels, the 2.2 billion residents of Pakistan and India’s Indus River Valley, the one billion people living in eastern China and the 800 million residents of sub-Saharan Africa will annually experience many hours of heat that surpass human tolerance. 

    These regions would primarily experience high-humidity heatwaves. Heatwaves with higher humidity can be more dangerous because the air cannot absorb excess moisture, which limits the evaporation of sweat from human bodies and of moisture from some infrastructure, like evaporative coolers. Troublingly, researchers said, these regions are also in lower-to-middle income nations, so many of the affected people may not have access to air conditioning or any effective way to mitigate the negative health effects of the heat.

    If warming of the planet continues to 3 C above pre-industrial levels, the researchers concluded, heat and humidity levels that surpass human tolerance would begin to affect the Eastern Seaboard and the middle of the United States — from Florida to New York and from Houston to Chicago. South America and Australia would also experience extreme heat at that level of warming.  

    At current levels of heating, the researchers said, the United States will experience more heatwaves, but these heatwaves are not predicted to surpass human limits as often as in other regions of the world. Still, the researchers cautioned that these types of models often do not account for the worst, most unusual weather events.  

    “Models like these are good at predicting trends, but they do not predict specific events like the 2021 heatwave in Oregon that killed more than 700 people or London reaching 40 C last summer,” said lead author Daniel Vecellio, a bioclimatologist who completed a postdoctoral fellowship at Penn State with Kenney. “And remember, heat levels then were all below the limits of human tolerance that we identified. So, even though the United States will escape some of the worst direct effects of this warming, we will see deadly and unbearable heat more often. And — if temperatures continue to rise — we will live in a world where crops are failing and millions or billions of people are trying to migrate because their native regions are uninhabitable.” 

    Understanding human limits and future warming 

    Over the last several years, Kenney and his collaborators have conducted 462 separate experiments to document the combined levels of heat, humidity and physical exertion that humans can tolerate before their bodies can no longer maintain a stable core temperature.  

    “As people get warmer, they sweat, and more blood is pumped to their skin so that they can maintain their core temperatures by losing heat to the environment,” Kenney said. “At certain levels of heat and humidity, these adjustments are no longer sufficient, and body core temperature begins to rise. This is not an immediate threat, but it does require some form of relief. If people do not find a way to cool down within hours, it can lead to heat exhaustion, heat stroke and strain on the cardiovascular system that can lead to heart attacks in vulnerable people.” 

    In 2022, Kenney, Vecellio and their collaborators demonstrated that the limits of heat and humidity people can withstand are lower than were previously theorized.  

    “The data collected by Kenney’s team at Penn State provided much needed empirical evidence about the human body’s ability to tolerate heat. Those studies were the foundation of these new predictions about where climate change will create conditions that humans cannot tolerate for long,” said co-author Matthew Huber, professor of earth, atmospheric and planetary sciences at Purdue University. 

    When this work was published, Huber, who had already begun work on mapping the impacts of climate change, contacted Vecellio about a potential collaboration. Huber had previously published widely cited work proposing a theoretical limit on the heat and humidity humans can tolerate.

    The researchers, along with Huber’s graduate student, Qinqin Kong, decided to explore how people would be affected in different regions of the world if the planet warmed by between 1.5 C and 4 C. The researchers said that 3 C is the best estimate of how much the planet will warm by 2100 if no action is taken. 

    “Around the world, official strategies for adapting to the weather focus on temperature only,” Kong said. “But this research shows that humid heat is going to be a much bigger threat than dry heat. Governments and policymakers need to re-evaluate the effectiveness of heat-mitigation strategies to invest in programs that will address the greatest dangers people will face.” 

    Staying safe in the heat 

    Regardless of how much the planet warms, the researchers said that people should always be concerned about extreme heat and humidity — even when they remain below the identified human limits. In preliminary studies of older populations, Kenney found that older adults experience heat stress and the associated health consequences at lower heat and humidity levels than young people. 

    “Heat is already the weather phenomenon that kills the most people in the United States,” Vecellio, now a postdoctoral researcher at George Mason University’s Virginia Climate Center, said. “People should care for themselves and their neighbors — especially the elderly and sick — when heatwaves hit.” 

    The data used in this study examined the body’s core temperatures, but the researchers said that during heatwaves, people experience health problems from other causes as well. For example, Kenney said that most of the 739 people who died during Chicago’s 1995 heatwave were over 65 and experienced a combination of high body temperature and cardiovascular problems, leading to heart attacks and other cardiovascular causes of death. 

    Looking to the future 

    To stop temperatures from increasing, the researchers cite decades of research indicating that humans must reduce the emission of greenhouse gases, especially the carbon dioxide emitted by burning fossil fuels. If changes are not made, middle-income and low-income countries will suffer the most, Vecellio said.  

    As one example, the researchers pointed to Al Hudaydah, Yemen, a port city of more than 700,000 people on the Red Sea. Results of the study indicated that if the planet warms by 4 C, this city can expect more than 300 days when temperatures exceed the limits of human tolerance every year, making it almost uninhabitable.  

    “The worst heat stress will occur in regions that are not wealthy and that are expected to experience rapid population growth in the coming decades,” Huber said. “This is true despite the fact that these nations generate far fewer greenhouse gas emissions than wealthy nations. As a result, billions of poor people will suffer, and many could die. But wealthy nations will suffer from this heat as well, and in this interconnected world, everyone can expect to be negatively affected in some way.” 

    This research was supported by grants from the National Institute on Aging, the National Aeronautics and Space Administration, and the National Science Foundation. 

    Penn State University

  • Studying Grand Canyon’s Past for Climate Insights

    Newswise — The Grand Canyon’s valleys and millions of years of rock layers spanning Earth’s history have earned it a designation as one of the Seven Natural Wonders of the World. But, according to a new UNLV and University of New Mexico study, its marvels extend to vast cave systems that lie beneath the surface, which just might hold clues to better understand the future of climate change — by studying nature’s past.

    A research team led by UNLV paleoclimatologist and Professor Matthew Lachniet, joined by University of New Mexico Department of Earth & Planetary Sciences Distinguished Professor Yemane Asmerom, Research Scientist Victor Polyak and other collaborators, studied an ancient stalagmite from the floor of an undisturbed Grand Canyon cave. By studying the mineral deposit’s geochemistry, they were able to analyze precipitation patterns during the rapidly warming period following the last Ice Age to improve understanding of the potential impact of future climate change on summer monsoon rains in the U.S. Southwest and northwestern Mexico.

    Their findings, “Elevated Grand Canyon groundwater recharge during the warm Early Holocene,” published Oct. 2 in Nature Geoscience, revealed that increasing amounts of water seeped into the cave between 8,500 and 14,000 years ago, during a period known as the early Holocene when temperatures rose throughout the region. Using a paleoclimate model, the researchers determined that this was likely caused by intensified and expanded summer rainfall: changes in atmospheric circulation melted the winter snowpacks more quickly and sped up the evaporation that fuels monsoon rains.

    This is significant, authors say, because most of the water currently infiltrating through the bedrock and into caves and aquifers — and contributing to groundwater recharge — comes from winter snowmelt. During the early Holocene, however, when peak temperatures were only slightly warmer than today, both summer and winter moisture contributed to groundwater recharge in the region.

    The authors suggest that future warming, which could cause temperatures to rise above those of the early Holocene, may also lead to greater rates of summer rainfall on the high-elevation Colorado Plateau and an intensifying North American monsoon, the pattern of pronounced and increased thunderstorms and precipitation that typically occur between June and mid-September.

    “What was surprising about our results is that during this past warm period, both the summer monsoon and infiltration into the cave increased, which suggests that summer was important for Grand Canyon groundwater recharge, even though today it is not an important season for recharge,” said Lachniet, who personally retrieved the stalagmite from a cave in the Redwall Formation on the South Rim of eastern Grand Canyon in 2017. “While we still expect the region to dry in the future, more intense summer rainfall may actually infiltrate into the subsurface more than it does today.”

    Stalagmites are common cave formations that act as ancient rain gauges, recording historic climate change. They grow as mineral-rich waters seep through the ground above and drip from the tips of stalactites on cave ceilings. Calcite minerals from tiny drops of water accumulate over thousands of years and, much like tree rings, accurately record the rainfall history of an area. Three natural forms of oxygen are found in water, and the quantity of one form decreases as rainfall increases. This information is locked into the stalagmites over time. Because of the distinct difference in oxygen isotope composition between summer and winter precipitation, it is possible to estimate the relative contributions from each season. Variation in the uranium-234 isotope and changes in the growth thickness of the stalagmite also indicate changes in the amount of precipitation.
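
    The seasonal apportionment described above amounts to a two-endmember isotope mass balance. A minimal sketch, with hypothetical δ18O values rather than the study’s measured endmembers:

    ```python
    # Two-endmember mixing: estimate the summer share of recharge from the
    # oxygen isotope composition of drip water. Values are hypothetical.
    def summer_fraction(d18o_sample: float, d18o_summer: float, d18o_winter: float) -> float:
        """Fraction of recharge attributable to summer precipitation."""
        return (d18o_sample - d18o_winter) / (d18o_summer - d18o_winter)

    # Example: winter rain at -13 permil, summer monsoon rain at -7 permil,
    # and drip water inferred from the stalagmite at -10 permil
    print(summer_fraction(-10.0, -7.0, -13.0))  # -> 0.5, i.e. ~50% summer recharge
    ```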

    “We were able to validate the oxygen record with the growth data and the uranium isotope data, confirming that we see significant increases in summer moisture during this warm period, which we attribute to the monsoon,” said Asmerom. “Obviously, we know things very precisely in terms of timing because we know how to date things. This is something that we are known for around the world using these methods,” Polyak added.

    The research team used stalagmite samples to reconstruct groundwater recharge rates — or the amount of water that penetrates the aquifers — in the Grand Canyon area during the early years of the Holocene period. High groundwater recharge rates likely occurred on other high-elevation plateaus in the region, too, they said, though it’s unclear how the activity applies to hotter, low-elevation deserts.   

    What is clear is that ongoing human-caused climate change is leading to hotter temperatures throughout southwestern North America, including the Grand Canyon region. Alongside population growth and agricultural pressures, this warming can reduce the infiltration of surface water into groundwater aquifers. Groundwater recharge rates also depend on the frequency and intensity of summer rains associated with monsoon season.

    Though summer infiltration isn’t a significant contributor to groundwater recharge in the region today, these latest findings suggest that could change in the future as the climate warms and monsoonal moisture increases. What’s unknown is how a projected decrease in winter precipitation and snowpack could impact overall groundwater reserves.

    In a previous study led by UNM’s Asmerom and published in the Proceedings of the National Academy of Sciences, they found that the North American monsoon is likely to intensify with increased warming. But there were other, mostly model-based studies that suggested otherwise. The new study is consistent with Asmerom and colleagues’ previous study. 

    “Unfortunately, effective moisture is the balance between precipitation and evaporation. Unlike the more temperate Grand Canyon climate, the dry southern part of the region is likely to become drier as a result of the increased temperatures,” said Asmerom.

    University of New Mexico

  • Smaller carbon, more comfort

    Newswise — Osaka, Japan – As organizations work to reduce their energy consumption and associated carbon emissions, one area that remains to be optimized is indoor heating and cooling. In fact, HVAC – which stands for Heating, Ventilation, and Air Conditioning – represents, on average, about 40% of a building’s total energy use. Methods that conserve electricity while still providing a comfortable indoor environment for workers could make a significant difference in the fight against climate change.

    Now, researchers from Osaka University have demonstrated significant energy savings through the application of a new, AI-driven algorithm for controlling HVAC systems. This method does not require complex physics modelling, or even detailed previous knowledge about the building itself.

    During cold weather, it is sometimes challenging for conventional sensor-based systems to determine when the heating should be shut off. This is due to thermal interference from lighting, equipment, or even the heat produced by the workers themselves. This can lead to the HVAC being activated when it should not be, wasting energy.

    To overcome these obstacles, the researchers employed a control algorithm that predicts the thermodynamic response of the building from collected data. This approach can be more effective than attempting to explicitly calculate the impact of the multitude of complex factors that might affect the temperature, such as insulation and heat generation. Thus, with enough information, ‘data-driven’ approaches can often outperform even sophisticated models. Here, the HVAC control system was designed to ‘learn’ the relationships between the variables, including power consumption, from a large dataset.

    The algorithm was able to save energy while still allowing the building occupants to work in comfort. “Our autonomous system showed significant energy savings, of 30% or more for office buildings, by leveraging the predictive power of machine learning to optimize the times the HVAC should operate,” says lead author Dafang Zhao. “Importantly, the rooms were comfortably warm despite it being winter.”

    The algorithm worked to minimize the total energy consumed, the difference between the actual and desired room temperature, and the change in the rate of power output at peak demand. “Our system can be easily customized to prioritize energy conservation or temperature accuracy, depending on the needs of the situation,” adds senior author Ittetsu Taniguchi.
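
    A minimal sketch of that multi-objective idea, written as a weighted-sum cost over a short control horizon; the weights, horizon, and inputs are illustrative assumptions, not the authors’ learned model or exact objective.

    ```python
    # Weighted cost penalizing energy use, setpoint deviation, and power
    # ramping during peak-demand steps. All numbers are illustrative.
    import numpy as np

    def hvac_cost(power, temps, setpoint, peak_mask,
                  w_energy=1.0, w_comfort=2.0, w_ramp=0.5):
        """Total cost of a sequence of HVAC power decisions over a horizon."""
        energy_term = w_energy * np.sum(power)
        comfort_term = w_comfort * np.sum(np.abs(temps - setpoint))
        ramp_term = w_ramp * np.sum(np.abs(np.diff(power))[peak_mask[1:]])
        return energy_term + comfort_term + ramp_term

    # Tiny usage example over a 4-step horizon
    power = np.array([2.0, 2.5, 1.0, 0.5])      # kW drawn at each step
    temps = np.array([20.5, 21.0, 21.2, 20.8])  # predicted room temperature (C)
    peak_mask = np.array([False, True, True, False])
    print(hvac_cost(power, temps, setpoint=21.0, peak_mask=peak_mask))
    ```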

    To collectively achieve the goal of a carbon-neutral economy, it is highly likely that corporations will need to be at the vanguard of innovation. The researchers note that their approach may see rapid adoption during times of rising energy costs, which makes their findings good for both the environment and company viability.

    ###

    Osaka University

  • American University and Football for Peace Join Forces to Promote Sports Diplomacy, Launch Peace Center

    Newswise — American University’s School of International Service (SIS) and Football for Peace (FfP), an international sports diplomacy non-government organization headquartered in London, UK, with the support of the Maryland Sports Commission, are launching the first Football for Peace Center in the United States. The Peace Center will address pressing social and environmental challenges in the U.S. and around the world, focusing on youth empowerment, water prosperity, and societal advocacy.

    “SIS has a long history of promoting leadership in peace and conflict resolution and addressing issues like poverty; geography; and water justice, including access to clean water, that contribute to conflict,” said SIS Dean Shannon Hader, MD, MPH. “Through this partnership and the growth of the Peace Center, we will host a variety of programs and events, reaffirming our dedication to creating positive change and ‘waging peace’ worldwide.”

    The FfP Peace Center will serve as a platform for community service, global campaigns, advocacy, and youth engagement for marginalized communities in the Washington, D.C., Maryland, and Virginia area, as well as for AU students, alumni, and partners, uniquely leveraging the power and popularity of both soccer and American football. As one of the new Center’s initiatives, SIS faculty and students will share their expertise in water politics and justice to support Football for Peace’s Rehydrate the Earth campaign, which will be formally launched ahead of World Water Day 2024. The campaign is the world’s first global football-led water campaign.

    “It’s great for Football for Peace U.S.A. to be partnering with such a prestigious university as American University and its School of International Service,” said Josh R. Norman, NFL cornerback and founding board member of FfP USA. “Our heritage comes from professional sports, and we consider football, both soccer and American football, to have a unique ability to reach far beyond ethnic, religious, social, or environmental differences. We hope to make a lot of positive impact in the U.S.A.”

    “I am so proud to come back to the States and work with some amazing partners after spending many years playing college soccer, which taught me positive values on and off the pitch. This partnership aligns perfectly with the upcoming World Cup; soccer touches five billion people and has the power to move masses,” said Kash Siddiqi, FfP co-founder and former professional soccer player. “Through this dynamic partnership, we’re not just coming together; we’re playing a pivotal role in promoting peace through soccer and football. Together, we’re turning our shared commitment into advancing Sports Diplomacy Actions locally and internationally. The announcement of the inaugural Capitol Region Football for Peace Center is a significant step toward making this vision a reality.”

    The partnership will provide American University students the opportunity to become involved in sports diplomacy through FfP’s Most Valuable Peacemakers (MVP) Award, an initiative that honours young leaders, renowned athletes, and dignitaries for their efforts in tackling local and global issues and making a positive impact in their communities. Launched in 2015, the MVP Award allows youth to hear from professional athletes and offers soccer training opportunities and community service through soccer. This transformative experience empowers participants to cultivate their peace-building skills through empathy, compassion, and service to others.

    The new Center will also create internship opportunities for students to participate in the Football for Peace projects with a global focus, including Peace Matches. The partnership will also aim to offer opportunities to AU faculty to lead and assist with initiatives to further AU’s mission of creating positive change around the world.

    “Today’s announcement with American University is the first major step for Football for Peace, in an ongoing effort, to partner strategically with a distinguished academic institution while fostering and advocating the growth and mission of the organization in the United States,” said Terry Hasseltine, Executive Director, Maryland Sports Commission and President of the Sport & Entertainment Corporation of Maryland. “Working with a global initiative like Football for Peace, and now their Peace Center at American University, will elevate our long-term legacy footprint for the next generation here in Maryland, while creating the potential to expand regionally and nationally.”

    The agreement between AU SIS and FfP was celebrated during a special event on the AU campus that focused on the impact of sports diplomacy and featured prominent speakers, including Brenda Abdelall, Assistant Secretary, U.S. Department of Homeland Security; George Atallah, assistant executive director of external affairs for the NFL Players Association; Terry Hasseltine, President of the Maryland Sports Commission; Josh Norman, NFL former Washington Commanders’ top cornerback; Oguchi Onyewu, former US Men’s soccer national team Captain and Vice President of Sporting, United States Soccer Federation; tennis star Francis Tiafoe; and Brenden Varma, Deputy Director, UN Information Center.

    About American University’s School of International Service

    American University’s School of International Service (SIS) is a top-10 school of international affairs located in Washington, D.C. Since the school’s founding in 1957, we have answered President Dwight D. Eisenhower’s call to prepare students of international affairs to “wage peace.” SIS produces transformational research and prepares more than 3,000 graduate and undergraduate students for global careers in government, nonprofits, and business. Our students take advantage of Washington’s wealth of resources and professional opportunities—and an active international network of more than 25,000 alumni. They graduate prepared to combine knowledge and practice and to serve the global community as emerging leaders, waging peace and building understanding in our world.

    About Football for Peace

    Football for Peace (FfP) as an organization was inspired by the work of FIFA and Chilean legend Elias Figueroa. In 2013, Kashif Siddiqi, a former international soccer player and soccer diplomat, launched Football for Peace internationally. FfP is a sports diplomacy NGO. Its mission is to advance sports diplomacy initiatives that address pressing social and environmental issues, leveraging the unique combination of football and soccer to serve communities in the United States and around the world.

    American University

  • Lawrence Livermore grabs two spots in DOE’s Energy Earthshot program

    Newswise — Lawrence Livermore National Laboratory scientists will lead and co-lead projects in support of the Department of Energy’s (DOE) new Energy Earthshot program.

    The Energy Earthshots Initiative calls for innovation and collaboration to tackle the toughest topics in energy-related research. In January, DOE announced Office of Science funding for the Energy Earthshot Research Centers (EERCs), which will build on a concept the DOE successfully demonstrated in the earlier Energy Frontier Research Centers (EFRCs) and the Scientific Discovery Through Advanced Computing (SciDAC) program. The new EERCs will support fundamental research to accelerate breakthroughs in support of the Energy Earthshots Initiative.

    The Energy Earthshots are designed to stimulate integrated program development and execution across the DOE’s basic science and energy technology offices. They are part of an all-hands-on-deck approach to provide science and technology innovations that the nation needs to address tough technological challenges required to achieve our climate goals. The Energy Earthshots will accelerate breakthroughs toward more abundant, affordable and reliable clean energy solutions and the carbon dioxide removal needed to counterbalance hard-to-abate greenhouse gas emissions.

    Six Energy Earthshots have been announced so far: Hydrogen Shot™, Long Duration Storage Shot™, Carbon Negative Shot™, Enhanced Geothermal Shot™, Floating Offshore Wind Shot™ and Industrial Heat Shot™. They are supported by the three primary Office of Science program offices: Advanced Scientific Computing Research, Basic Energy Sciences and Biological and Environmental Research.

    Jennifer Pett-Ridge, head of LLNL’s Carbon Initiative, will lead a $19 million center called “Terraforming Soil,” which will support the Carbon Negative Shot. Of the total award, LLNL will receive ~$17 million.

    LLNL scientist Jiaqi Li will serve as the deputy director for the “Center for Coupled Chemo-Mechanics of Cementitious Composites,” which will support the Enhanced Geothermal Shot. Brookhaven National Laboratory (BNL) leads this center, and LLNL will receive $1.9 million over four years for its role in the project.

    Terraforming Soil

    To reduce the United States’ net carbon dioxide (CO2) emissions to zero and limit the impacts of global warming, it is essential to actively remove CO2 from the atmosphere. Soils store a vast amount of carbon in both organic and inorganic forms — on the order of 3,000 billion tons globally, more carbon than is found in the atmosphere and land plants combined.

    While the United States’ 166 million hectares of agricultural soils have lost a vast amount of carbon in the past century due to cultivation and erosion, there is clear potential to reverse this trend and actively manage agricultural lands with strategies that capture CO2 from the atmosphere. The Terraforming Soil Energy Earthshot Research Center (EERC) will research new bio- and geo- engineered techniques to understand, predict and accelerate scalable and affordable CO2 drawdown in soils, via both organic and inorganic carbon cycle pathways.

    “Our goal is to advance the fundamental understanding of CO2 drawdown in soils through both organic and inorganic pathways, measuring soil C storage capacity, durability and regional variations that affect needed land-management practices,” Pett-Ridge said.

    The Terraforming Soil EERC team includes 50 world-class experts in soil carbon cycling, photosynthesis biochemistry, plant/microbial gene engineering and genomics, mineral geochemistry, machine learning, exascale modeling and computing, additive manufacturing and in situ isotope-based characterization.

    The center will bridge cutting-edge analytical and computational studies with a commitment to engage with community stakeholders, exploring the technical, social and economic implications of engineered soil CO2 drawdown. In addition, the team will emphasize diverse training opportunities for students and early career scientists and amplify equity and inclusion throughout the research pipeline.

    Collaborators include the University of California Berkeley, University of California Davis, Rice University, Princeton University, Yale University, Carleton College, Massachusetts Institute of Technology, Northern Arizona University, Colorado State University, Lawrence Berkeley National Laboratory, Pacific Northwest National Laboratory, Andes Ag, Inc. and the Woodwell Climate Research Center.

    Center for Coupled Chemo-Mechanics of Cementitious Composites

    LLNL will conduct fundamental research to understand and predict chemo-mechanics of sustainable materials within Enhanced Geothermal System (EGS) environments and develop new materials to overcome major challenges in deploying cost effective EGS.

    Geothermal well environments are arguably the most challenging for cement to survive, and multiple problems of well durability and performance are associated with cementing materials and well-cementing methods. These include, but are not limited to, poor cement acid resistance; poor resistance to thermal and mechanical stress under cyclic thermo-mechanical loads; and poor bonding with metal casing and, as a result, poor casing corrosion protection. Well integrity issues linked to cement degradation and failure are more severe under the high-temperature conditions that EGS wells undergo during hydraulic stimulation operations and thermal shocks. Furthermore, cementing operations during geothermal well construction suffer from losses of cement slurry into formations, long waiting times for cement to solidify, or rapid uncontrolled cement solidification followed by drill-out operations or abandonment of the well.

    “To address the durability and sustainability issues of enhanced geothermal wells, a fundamental understanding of the chemo-mechanics of alternative cementitious materials that could provide cost-effective and sustainable solutions for EGS is required,” Li said.

    The proposed work will focus on gaining a fundamental understanding of the reaction mechanisms, equilibrium and phase compositions, and mechanical properties of cementitious composites designed from industrial wastes under EGS conditions. The knowledge generated by the project will form a comprehensive framework for the informed development and commercialization of 1) environmentally sustainable, durable, cost-effective well materials, including cementitious composites and inorganic coatings, and 2) new well designs forgoing the use of cementitious materials in EGS wells. To achieve this goal, advanced high-energy analytical and computational techniques will be used in the design, monitoring and characterization of model systems.

    Besides BNL and LLNL, collaborators include Sandia National Laboratories, Lawrence Berkeley National Laboratory, Los Alamos National Laboratory, Cornell University, Princeton University, The University of Texas at Austin and the University of Illinois Urbana-Champaign.

    Lawrence Livermore National Laboratory

    Source link

  • Naming and Shaming Can be Effective to Get Countries to Act on Climate

    Naming and Shaming Can be Effective to Get Countries to Act on Climate

    Newswise — Enforcement is one of the biggest challenges to international cooperation on mitigating climate change in the Paris Agreement. The agreement has no formal enforcement mechanism; instead, it is designed to be transparent so countries that fail to meet their obligations will be named and thus shamed into changing behavior. A new study from the University of California San Diego’s School of Global Policy and Strategy shows that this naming-and-shaming mechanism can be an effective incentive for many countries to uphold their pledges to reduce emissions.

    The study, appearing in the Proceedings of the National Academy of Sciences (PNAS), assesses the naming and shaming built into the 2015 Paris Agreement through its Enhanced Transparency Framework (ETF). The ETF requires nations to publicly report their goals and progress toward meeting those goals. The study suggests that the ETF is most effective at motivating countries with the strongest commitments to slowing climate change.

    “The architects of the Paris Agreement knew that powerful enforcement mechanisms, like trade sanctions, wouldn’t be feasible,” said study coauthor David Victor, professor of industrial innovation at UC San Diego’s School of Global Policy and Strategy and co-director of the Deep Decarbonization Initiative. “Most analysts assumed the agreement would fail to be effective without strong enforcement and are skeptical of naming and shaming. Our research suggests that pessimism is wrong. Naming and shaming is built into the system and our study shows that the policy experts who are most knowledgeable about Paris see this mechanism working well—at least for some countries.”

    Naming and shaming doesn’t work everywhere, the study shows; however, it is particularly important for countries that are already highly motivated to act. Even those countries need a spotlight on their behavior, lest they slip and fail to comply with the obligations they set for themselves under the Paris Agreement. 

    In Europe—where countries have the most ambitious and credible climate pledges—the surge in energy prices and interruptions in Russian gas supply created incentives to retain higher-emission energy technologies, such as coal. International visibility and political pressures within those countries plausibly help explain why European policymakers have kept emissions in alignment with their previously committed climate goals.

    In the U.S., naming and shaming is likely to be effective as well, but not to the same degree as in Europe, the study shows.

    “This raises some concern about the ability to maintain the momentum generated by the Inflation Reduction Act under less favorable conditions, such as rising interest rates,” said Emily Carlton, study coauthor and UC San Diego School of Global Policy and Strategy alum.

    Study taps expert opinions of top climate negotiators from around the world

    The findings in the new PNAS study are derived from responses from a sample of registrants of the Conference of Parties (COP), consisting of more than 800 diplomatic and scientific experts who, for decades, have participated in climate policy debates. This expert group is critical to understanding how political institutions shape climate policy because they are the people “in the room” when key policy decisions are made. They are in a unique position to evaluate what is most likely to motivate their countries to act on climate.

They were asked questions such as: Is the ETF in the agreement effective? Do they support the use of the ETF, and is it a legitimate way to enforce the Paris Agreement?

    Overall, 77% of the sample agreed with using naming and shaming—that is, using the ETF for comparing countries’ mitigation efforts. The results further indicate that 57% of all respondents expect naming and shaming to substantially affect the climate policy performance of their home country—where they know the policy environment best.

While survey respondents’ country of origin was kept anonymous to elicit the most candid responses possible, the respondents who think naming and shaming is most effective are more likely to be from democracies with high-quality political institutions. In addition, these individuals come from countries with strong internal concern about climate change and ambitious and credible international climate commitments, such as countries in Europe.

    The study finds naming and shaming is likely least effective for countries that lack strong democratic institutions, such as some large emitters like China.

While the ineffectiveness of naming and shaming in the countries least motivated to act on climate creates tension, the study does provide a hopeful narrative for enforcing cooperation on climate, according to the authors.

“It is a really good thing that naming and shaming can keep the most climate-motivated countries on track because decarbonizing is hard and changes in circumstances and energy markets can make it even harder,” said Carlton. “Countries in Europe are some of the biggest emitters and, as we saw recently, policymakers could have easily switched back to coal after Russia’s invasion of Ukraine, but they did not.”

    Who should be the “namers and shamers” and who is most effective at it?

The survey respondents were also asked which institutions should be responsible for naming and shaming. The results overwhelmingly indicated a preference for the namers and shamers to be scientists, as well as neutral international organizations such as the United Nations (U.N.) and the Intergovernmental Panel on Climate Change (IPCC). However, past studies have found that both diplomatic and science organizations like the U.N. and IPCC are actually ineffective at naming and shaming.

    “It is not something that these organizations do,” Carlton said. “They are positioned to try to get countries to cooperate and it’s just not a function of theirs to put countries on blast in a judgmental way. That is something you see done more effectively from non-governmental organizations (NGOs) and the media.”

    While naming and shaming is a mechanism that makes cooperation work, the authors believe that other strategies such as trade sanctions may be useful as well. They explored this topic in a recent study.  

Coauthors of the PNAS paper, “Naming and Shaming as a Strategy for Enforcing the Paris Agreement: The Role of Political Institutions and Public Concern,” include Astrid Dannenberg of the University of Kassel and the University of Gothenburg, and Marcel Lumkowsky of the University of Kassel.

    University of California San Diego

    Source link

  • Enhancing Chemical Identification Challenges

    Enhancing Chemical Identification Challenges

    Newswise — What chemicals are we exposed to on a daily basis? That is the central question of ‘non-targeted analysis’ or NTA, an emerging field of analytical science that aims to identify all chemicals around us. A daunting task, because how can you be sure to detect everything if you don’t know exactly what you’re looking for? In a paper in Environmental Science and Technology, researchers at the Universities of Amsterdam (UvA, the Netherlands) and Queensland (UQ, Australia) assess this problem. In a meta-analysis of NTA results published over the past six years, they estimate that less than 2% of all chemicals have been identified.

According to Viktoriia Turkina, who performed the research as a PhD student with Dr Saer Samanipour at the UvA’s Van ‘t Hoff Institute for Molecular Sciences, this limitation underscores the urgent need for a more proactive approach to chemical monitoring and management. “We need to incorporate more data-driven strategies into our studies to be able to effectively protect human and environmental health,” she says.

    Samanipour explains that current monitoring of chemicals is rather limited since it’s expensive, time consuming, and requires specialized experts. “As an example, in the Netherlands we have one of the most sophisticated monitoring programs for chemicals known to be of concern to human health. Yet we target less than 1000 chemicals. There are far more chemicals out there that we don’t know about.”

    A vast chemical space

To deal with those chemicals, the concept of non-targeted analysis was introduced some 15 to 20 years ago to look at possible exposure in an unbiased manner. The idea is to take a sample from the environment (air, water, soil, sewer sludge) or the human body (hair, blood, etc.) and analyse it using well-established analytical techniques such as chromatography coupled with high-resolution mass spectrometry. The challenge then is to trace the obtained signal back to the structures of chemicals that may be present in the sample. This will include already known chemicals, but also chemicals whose potential presence in the environment is as yet unknown.

In theory, this ‘chemical space’ includes as many as 10^60 compounds, an incomprehensible number that far exceeds the number of stars in the universe. On the other hand, the number of organic and inorganic substances published in the scientific literature and public databases is estimated at around 180 million. To keep their research manageable, Turkina, Samanipour and co-workers focused on a subset of 60,000 well-described compounds from the NORMAN database. Turkina: “This served as the reference to establish what is covered in NTA studies, and more importantly, to develop an idea about what is being overlooked.”
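To make the coverage idea concrete, here is a minimal, hypothetical sketch in Python of how such an estimate can be made: observed accurate masses are matched against a reference list within a ppm tolerance. The reference and observed masses are random stand-ins, and matching on mass alone is a deliberate simplification; real NTA annotation also uses retention times, isotope patterns and MS/MS spectra.

```python
# Toy coverage estimate: what fraction of a reference compound list is
# "covered" by masses observed in an NTA experiment? Matching by exact
# monoisotopic mass within a ppm tolerance only is a simplification of
# real annotation workflows.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical reference list (a NORMAN-like subset) and observed masses.
reference_masses = rng.uniform(100.0, 1000.0, size=60_000)   # Da
observed_masses = rng.uniform(100.0, 1000.0, size=5_000)     # Da

ppm_tolerance = 5.0  # typical high-resolution MS mass accuracy

# For each reference mass, find the nearest observed mass via binary search.
observed_sorted = np.sort(observed_masses)
idx = np.searchsorted(observed_sorted, reference_masses)
idx = np.clip(idx, 1, len(observed_sorted) - 1)
nearest = np.minimum(
    np.abs(observed_sorted[idx] - reference_masses),
    np.abs(observed_sorted[idx - 1] - reference_masses),
)

covered = nearest / reference_masses * 1e6 <= ppm_tolerance
print(f"Coverage: {covered.mean():.1%} of reference compounds matched")
```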

The vast ‘exposome’ of chemicals that humans are exposed to on a daily basis is a sign of our times, according to Samanipour. “These days we are soaking in a giant ocean of chemicals. The chemical industry is part of that, but nature, too, runs a whole host of reactions that result in exposure. And we expose ourselves to chemicals through the stuff we use – think for instance of the problem of microplastics. To solve all this we have to be able to go beyond pointing fingers. With our research, we hope to contribute to finding a solution together. Because we are all in the same boat.”

    Much room for improvement

The meta-analysis, which included 57 NTA papers, revealed that only around 2% of the estimated chemical space was covered. This could indicate that actual exposure to chemicals is indeed quite low; however, it could also point to shortcomings in the applied analyses. According to Turkina and Samanipour, the latter is indeed the case. They focused on NTA studies applying liquid chromatography coupled with high-resolution mass spectrometry (LC-HRMS), one of the most comprehensive methods for the analysis of complex environmental and biological samples.

It turned out that there was much room for improvement. In sample preparation, for instance, they observed a bias towards specific compounds rather than capture of a more diverse set of chemicals. They also observed poor selection and inconsistent reporting of LC-HRMS parameters and data acquisition methods. “In general,” Samanipour says, “the chemical analysis community is to a great extent driven by the available technology that vendors have developed for specific analysis purposes. Thus the instrumental set-up and data processing methods are rather limited when it comes to non-targeted analysis.”

To Samanipour, the NTA approach is definitely worth pursuing. “But we need to develop it further and push it forward. Together with vendors we can develop new, powerful and more versatile analytical technologies, as well as effective data analysis protocols.” He also advocates a data-driven approach in which the theoretical chemical space is ‘back-calculated’ towards a subset of chemicals that are highly likely to be present in our environment. “Basically we have to better understand what the true chemical space of exposure is. And once those boundaries are defined, it becomes a lot easier to assess that figure of 2% we have determined.”

    Universiteit van Amsterdam

    Source link

  • Training Birds for Climate Adaptation

    Training Birds for Climate Adaptation

    Newswise — One result of climate change is that spring is arriving earlier. However, migratory birds are not keeping up with these developments and arrive too late for the peak in food availability when it is time for breeding. By getting the birds to fly a little further north, researchers in Lund, Sweden, and the Netherlands have observed that these birds can give their chicks a better start in life.

    Global warming is causing problems for birds in Sweden and elsewhere. Warmer springs mean that caterpillars hatch, grow and pupate earlier compared with just a few decades ago. This has consequences for birds that cannot eat caterpillars that have entered the pupal stage. Therefore, when the food supply runs out at an ever earlier time in the spring, more and more chicks starve during the breeding season. This is a big problem for migratory birds that spend winters in Africa, as they do not know how early spring arrives in Sweden. Could the problem be solved if the migratory birds simply came home and started breeding earlier?

    “It seems that our non-migratory birds are doing this to a certain extent. But, of course, they are present and can feel how early spring will come. We thought that perhaps the migratory birds could fly further north until they find a place with suitable well-developed caterpillars,” says Jan-Åke Nilsson, biology researcher at Lund University in Sweden.

    To test this in practice, the researchers decided to help some Pied Flycatchers along the way. The biologists caught Pied Flycatchers that had arrived prior to breeding in the Netherlands. The birds were then driven during the night to Vombs Fure, an area of pine forest outside Lund in Skåne, where they were released. The peak of caterpillar availability in Skåne is about two weeks later than in the Netherlands – a distance of around 600 kilometres that a Pied Flycatcher could cover in just two nights.

“The birds that were given a lift from the Netherlands to Skåne synchronised very well with the food peak! As they started to breed about 10 days earlier than the “Swedish” Pied Flycatchers, they had dramatically better breeding success than the Swedish birds, as well as better success than the Pied Flycatchers that remained in the Netherlands,” says Jan-Åke Nilsson.

    In addition, it was shown that the chicks of the Dutch Pied Flycatchers that had received migration assistance did not stop in the Netherlands when they returned after their first spring migration. Instead, they continued on to the area of pine forest outside Lund where they were born. Furthermore, they arrived earlier than the Swedish Pied Flycatchers and thereby had more well-fed chicks at Vombs Fure the year after the researchers gave the Pied Flycatchers a helping hand to find Skåne.

    “The number of small birds, particularly migratory birds, has decreased drastically throughout Europe. By flying a little further north, these birds, at least in principle, could synchronise with their food resources and there is hope that robust populations of small birds can be maintained, even though springs are arriving ever earlier,” concludes Jan-Åke Nilsson.

    Lund University

    Source link

  • AI boosts plant observation precision

    AI boosts plant observation precision

    Newswise — Artificial intelligence (AI) can help plant scientists collect and analyze unprecedented volumes of data, which would not be possible using conventional methods. Researchers at the University of Zurich (UZH) have now used big data, machine learning and field observations in the university’s experimental garden to show how plants respond to changes in the environment.

Climate change is making it increasingly important to know how plants can survive and thrive in a changing environment. Conventional experiments in the lab have shown that plants accumulate pigments in response to environmental factors. To date, such measurements have been made by taking samples, which requires removing, and thus damaging, part of the plant. “This labor-intensive method isn’t viable when thousands or millions of samples are needed. Moreover, taking repeated samples damages the plants, which in turn affects observations of how plants respond to environmental factors. There hasn’t been a suitable method for the long-term observation of individual plants within an ecosystem,” says Reiko Akiyama, first author of the study.

    With the support of UZH’s University Research Priority Program (URPP) “Evolution in Action”, a team of researchers has now developed a method that enables scientists to observe plants in nature with great precision. PlantServation is a method that incorporates robust image-acquisition hardware and deep learning-based software to analyze field images, and it works in any kind of weather.

    Millions of images support evolutionary hypothesis of robustness

    Using PlantServation, the researchers collected (top-view) images of Arabidopsis plants on the experimental plots of UZH’s Irchel Campus across three field seasons (lasting five months from fall to spring) and then analyzed the more than four million images using machine learning. The data recorded the species-specific accumulation of a plant pigment called “anthocyanin” as a response to seasonal and annual fluctuations in temperature, light intensity and precipitation.
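As a rough illustration of what such image analysis extracts, the sketch below computes a red-to-green pixel ratio, a crude proxy for anthocyanin-related reddening, over plant pixels in a synthetic top-view image. PlantServation's actual pipeline relies on deep learning for segmentation and careful color handling; the synthetic image, the green-dominance rule and all values here are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic 64x64 top-view "image": brownish soil with a greenish rosette.
img = np.stack([
    rng.uniform(0.30, 0.50, (64, 64)),   # R channel: soil is reddish-brown
    rng.uniform(0.20, 0.35, (64, 64)),   # G channel
    rng.uniform(0.10, 0.20, (64, 64)),   # B channel
], axis=-1)
yy, xx = np.mgrid[0:64, 0:64]
rosette = (yy - 32) ** 2 + (xx - 32) ** 2 < 20 ** 2
img[rosette] = (0.35, 0.55, 0.15)        # leaves: green with a slight red tint

# Stand-in for learned segmentation: call a pixel "plant" if green dominates.
r, g, b = img[..., 0], img[..., 1], img[..., 2]
plant = (g > r) & (g > b)

# Red/green ratio over plant pixels as a crude anthocyanin proxy; tracking
# this value per plant across a season would mirror the study's time series.
anthocyanin_proxy = (r[plant] / g[plant]).mean()
print(f"Mean red/green ratio over plant pixels: {anthocyanin_proxy:.2f}")
```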

    PlantServation also enabled the scientists to experimentally replicate what happens after the natural speciation of a hybrid polyploid species. These species develop from a duplication of the entire genome of their ancestors, a common type of species diversification in plants. Many wild and cultivated plants such as wheat and coffee originated in this way.

    In the current study, the anthocyanin content of the hybrid polyploid species A. kamchatica resembled that of its two ancestors: from fall to winter its anthocyanin content was similar to that of the ancestor species originating from a warm region, and from winter to spring it resembled the other species from a colder region. “The results of the study thus confirm that these hybrid polyploids combine the environmental responses of their progenitors, which supports a long-standing hypothesis about the evolution of polyploids,” says Rie Shimizu-Inatsugi, one of the study’s two corresponding authors.

    From Irchel Campus to far-flung regions

    PlantServation was developed in the experimental garden at UZH’s Irchel Campus. “It was crucial for us to be able to use the garden on Irchel Campus to develop PlantServation’s hardware and software, but its application goes even further: when combined with solar power, its hardware can be used even in remote sites. With its economical and robust hardware and open-source software, PlantServation paves the way for many more future biodiversity studies that use AI to investigate plants other than Arabidopsis – from crops such as wheat to wild plants that play a key role for the environment,” says Kentaro Shimizu, corresponding author and co-director of the URPP Evolution in Action.

    The project is an interdisciplinary collaboration with LPIXEL, a company that specializes in AI image analysis, and Japanese research institutes at Kyoto University and the University of Tokyo, among others, under the Global Strategy and Partnerships Funding Scheme of UZH Global Affairs and the International Leading Research grant program of the Japan Society for the Promotion of Science (JSPS). The project also received funding from the Swiss National Science Foundation (SNSF).

    Strategic Partnership with Kyoto University

Kyoto University is one of UZH’s strategic partner universities. The strategic partnership ensures that high-potential research collaborations receive the necessary support to thrive, for instance through the UZH Global Strategy and Partnership Funding Scheme. In recent years, several joint research projects between Kyoto University and UZH have already received funding, among them “PlantServation”.

    University of Zurich

    Source link

  • Cheaper, Abundant Recycled Plastics Can Be Sound Ingredients for Plastic Bottles, Food Packaging

    Cheaper, Abundant Recycled Plastics Can Be Sound Ingredients for Plastic Bottles, Food Packaging

    Newswise — Washington D.C. – New research on the growing uses of recycled polypropylene in plastic packaging finds it performs well and has the potential to meet environmental goals and reduce raw material costs.

The conclusions suggest that increasing the use of post-consumer recycled (PCR) materials in food packaging can both improve the cost optimization of additives and help meet sustainability goals.

These findings come as many packagers rely more heavily on polypropylene and similar plastics than on the widely used polyethylene terephthalate (PET) for bottled water and similar beverages.

Polypropylene (PP), which carries resin identification code #5, has a high melting point and is often used in containers for hot liquids. It can also be found in yogurt containers, syrup and medicine bottles, caps and straws.

The new study was conducted by Iowa State University scientists, who gathered a small collection of recycled plastics and, after testing, found the materials performed well mechanically in terms of strength, flexibility, integrity and other indicators such as heat resistance.

    If future performance studies support these findings, and outside chemicals remain below regulatory limits, it could be a win-win for those seeking more sustainable packaging and efficiencies in their packaging recycling programs.

    According to the authors, “This study demonstrates the viability of a significant source of polypropylene and its notable long-term impacts, increasing profits by using PCR materials.” But the potential upside doesn’t end there.

    “This approach will produce environmentally responsible food plastic packaging in compliance with legislation in the circular economy,” the paper concludes.

    According to Iowa State’s Drs. Keith Vorst and Greg Curtzwiler, the findings are important because “they demonstrate PCR plastics can have higher value than just sustainability alone. PCR materials can also be used as a source of critical additives that would not need to be added to virgin plastics when blended together.”

    According to lead author Dr. Ma. Cristine Concepcion D. Ignacio, the research is unique in that it focuses on “determining the compliance and physical performance of extrusion blow molded material recovery facility (MRF)-recovered post-consumer PP bottle for direct food-contact applications.”

    The article appeared in a recent issue of the peer-reviewed journal Polymers and was supported by IAFNS’ Food Packaging Safety and Sustainability Committee.

    The study is available here.

    The Institute for the Advancement of Food and Nutrition Sciences (IAFNS) is committed to leading positive change across the food and beverage ecosystem. This paper was supported in part by IAFNS’ Food Packaging Safety and Sustainability Committee. IAFNS is a 501(c)(3) science-focused nonprofit uniquely positioned to mobilize government, industry and academia to drive, fund and lead actionable research. iafns.org

    Institute for the Advancement of Food and Nutrition Sciences

    Source link

  • Lithium Sustainability for Decades

    Lithium Sustainability for Decades

Newswise — On the way towards climate neutrality, Europe will need large amounts of lithium for battery storage systems. So far, however, its share of the worldwide lithium extraction volume has been only one percent. For this reason, researchers at KIT are studying ways to extract lithium from geothermal sources. “In theory, geothermal power plants in the Upper Rhine Valley and the Northern German Basin might cover between 2 and 12 percent of Germany’s annual lithium demand,” says Valentin Goldberg from KIT’s Institute of Applied Geosciences (AGW). With his team, he calculated this potential based on an extensive data analysis. However, it has not been clear for how long extraction will be possible. Another study by the researchers now offers an optimistic perspective. “According to our findings, lithium extraction will be possible for many years at low environmental cost,” Goldberg says. “The model developed for our study describes lithium extraction in the Upper Rhine Valley, but the parameters are chosen such that they can also be transferred to other joint systems.”

    Modeling of Geothermal Lithium Production

Extracting lithium from thermal waters is not a conventional type of mining, so conventional analysis methods could not be applied. “The lithium dissolved in water exists in a widely branched network of joints and cavities in the rock. However, it can only be accessed at certain points via individual wells,” says Dr. Fabian Nitschke, AGW, who was also involved in this study. “The reservoir dimension, hence, depends on the amount of water that can be accessed hydraulically via wells.” To calculate the lithium production potential, the researchers had to consider the potential water extraction volume, its lithium concentration, and the lithium extracted per unit time. “We use a dynamic transport model adapted to the underground conditions in the Upper Rhine Valley. It couples thermal, hydraulic, and chemical processes. Similar models are known from the oil and gas industry but have not yet been applied to lithium,” Nitschke points out.

When using geothermal energy, the extracted water is pumped back into the ground via a second borehole. The researchers wanted to find out whether the lithium concentration of the deep water decreases over time. The results show that the lithium concentration at the extraction borehole decreases by 30 to 50 percent during the first third of the 30-year investigation period, as the deep water is diluted by the returned water. After that, the lithium concentration remains constant. “This can be attributed to the open joint system that continuously supplies fresh deep water from other directions,” Nitschke says. The modeling suggests that continuous lithium extraction will be possible for decades: “Actually, extraction of this unconventional resource shows the classical cyclic behavior. Yields of hydrocarbon extraction or ore mining are also highest in the beginning and then start to decrease gradually.”
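The decline-then-plateau behavior can be mimicked with a much simpler mixing model than the coupled transport model used in the study. The Python sketch below is such a toy: a growing share of the produced water is recirculated, lithium-stripped reinjectate, while fresh deep water continues to arrive through the joint network. Every parameter value is an assumption chosen only to reproduce the reported 30 to 50 percent decline.

```python
# Minimal mixing-model sketch of lithium dilution at a geothermal doublet.
# All parameters are invented for illustration; the KIT study uses a coupled
# thermal-hydraulic-chemical transport model, not this simple balance.
import numpy as np

C0 = 160.0       # initial lithium concentration in deep water, mg/L (assumed)
eff = 0.90       # fraction of lithium removed before reinjection (assumed)
f_max = 0.43     # long-term share of reinjected water at the production well
tau = 5.0        # years for the reinjected plume to break through (assumed)

years = np.arange(0, 31)
f = f_max * (1.0 - np.exp(-years / tau))   # recirculated share over time

# Quasi-steady balance at each step: C = (1 - f) * C0 + f * (1 - eff) * C,
# solved for C. Concentration drops while f grows, then levels off.
C = (1.0 - f) * C0 / (1.0 - f * (1.0 - eff))

for t in (0, 5, 10, 30):
    print(f"year {t:2d}: {C[t]:6.1f} mg/L ({C[t] / C0:5.1%} of initial)")
```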

    Sensible Investment in a Sustainable Future

Thomas Kohl from AGW, who directs the corresponding research activities as Professor for Geothermal Energy and Reservoir Technology, considers the research results another argument in favor of a wide use of geothermal energy. “We already knew that geothermal sources can supply baseload-capable, renewable energy for decades. Our study now reveals that a single power plant in the Upper Rhine Valley could additionally cover up to 3 percent of the annual German lithium consumption.” Kohl’s group is now working on solutions for practical implementation. Recently, it published a study in Desalination on the preliminary treatment of thermal water for resource extraction. “The next step now is to transfer this technology to the industrial scale,” Kohl says.

    Karlsruhe Institute of Technology (KIT)

    Source link

  • LLNL scientists among finalists for new Gordon Bell climate modeling award

    LLNL scientists among finalists for new Gordon Bell climate modeling award

    Newswise — A team from Lawrence Livermore and seven other Department of Energy (DOE) national laboratories is a finalist for the new Association for Computing Machinery (ACM) Gordon Bell Prize for Climate Modeling for running an unprecedented high-resolution global atmosphere model on the world’s first exascale supercomputer.

    The Gordon Bell submission, led by Energy Exascale Earth System Model (E3SM) chief computational scientist Mark Taylor, details the team’s record-setting demonstration of the Simple Cloud Resolving E3SM Atmosphere Model (SCREAM) on Oak Ridge National Laboratory’s 1.2 exaFLOP (1.2 quintillion computing operations per second) Frontier machine.

    Incorporating state-of-the-art parameterizations for fluid dynamics, microphysics, moist turbulence and radiation, SCREAM is a full-featured atmospheric general-circulation model developed for very fine-resolution simulations on exascale machines. The effort is led by LLNL staff scientist Peter Caldwell, who also heads the Lab’s Climate Modeling group.

A cornerstone of SCREAM development is its computationally efficient, performance-portable design. This feature allows SCREAM to become — as far as the team is aware — the first nonhydrostatic global atmospheric model with resolution finer than 5 kilometers to run on an exascale supercomputer, the first to run at scale on both NVIDIA and AMD Graphics Processing Unit (GPU) systems, and the first to exceed 1 simulated year per day of throughput. SCREAM earned its Gordon Bell finalist position with a record-setting run performed earlier this year, and a revised submission boasts results 54% faster than the original entry, reaching 1.26 simulated years per day on 8,192 Frontier nodes.
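A quick plausibility check on those throughput figures, assuming "54% faster" means a 1.54x throughput ratio (an interpretation for illustration, not a statement from the team):

```python
# Back-of-the-envelope check on the reported speedup: if the revised SCREAM
# submission reaches 1.26 simulated years per day (SYPD) and is 54% faster
# than the original entry, the original run was roughly 0.82 SYPD.
revised_sypd = 1.26
speedup = 1.54                      # "54% faster" read as a throughput ratio

original_sypd = revised_sypd / speedup
print(f"Implied original throughput: {original_sypd:.2f} SYPD")

# At 1.26 SYPD, a century-long simulation takes about 79 wall-clock days.
print(f"Days per simulated century: {100 / revised_sypd:.0f}")
```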

    “The Gordon Bell Prize is the highest honor in high performance computing,” said LLNL’s Caldwell. “E3SM is very proud and excited to be finalists for the inaugural year of the Gordon Bell Climate Award. We worked extremely hard for five years to develop a model which makes efficient use of exascale computers, providing more trustworthy and higher-fidelity predictions of future climate than were previously possible. The aim of this new prize matches our goals exactly, so we were hopeful about our chances.”

    What separates SCREAM from other climate models is that it was written in C++ and uses the Kokkos library, enabling it to perform efficiently across the spectrum of computer architectures, Caldwell explained. The design choice allowed the SCREAM team to run on Frontier faster than any other climate model, he said, adding that “most climate and weather models are struggling to take advantage of the GPUs that power most of today’s top powerful supercomputers. SCREAM is of huge interest to other modeling centers as a successful example of how to make this transition.”

The SCREAM effort grew out of the E3SM team, a multi-lab DOE partnership led by LLNL scientist Dave Bader that is tasked with developing a state-of-the-art climate modeling, simulation and prediction capability for exascale supercomputers. Sponsored by DOE’s Office of Biological and Environmental Research (BER), the E3SM team includes researchers and computational scientists at LLNL and the Sandia, Argonne, Brookhaven, Los Alamos, Lawrence Berkeley, Oak Ridge and Pacific Northwest national laboratories. Other LLNL staff named in the Gordon Bell entry include scientists Aaron Donahue, Chris Terai and Renata McCoy.

    Team members said the achievement represents a breakthrough in climate modeling and a significant milestone for the E3SM project, which aims to bring DOE’s cutting-edge computer science to bear on the climate simulation challenge by simulating the climate system at very high resolution. Such high resolution permits explicit resolution of large convective circulations and other important atmospheric phenomena, thereby avoiding critical sources of uncertainty in traditional climate models, according to researchers. Fine resolution is also necessary to capture critical aspects of the climate that might impact conditions in the United States in the coming decades, such as extreme temperatures, storms and sea-level rise.

    The Gordon Bell Prize for Climate Modeling “aims to recognize innovative parallel computing contributions toward solving the global climate crisis,” according to ACM. It will be awarded for the first time this year at the International Conference for High Performance Computing, Networking, Storage, and Analysis (SC23) in Denver, and accompanied by a $10,000 award provided by Gordon Bell. Winners will be selected based on their potential to impact climate modeling and related fields.

    For more on E3SM, visit https://e3sm.org/.

    Lawrence Livermore National Laboratory

    Source link

  • Polar experiments reveal seasonal cycle in Antarctic sea ice algae

    Polar experiments reveal seasonal cycle in Antarctic sea ice algae

Newswise — In the frigid waters surrounding Antarctica, an unusual seasonal cycle occurs. During winter, from March to October, the sun barely rises. As seawater freezes, it rejects salts, creating pockets of extra-salty brine where microbes live through the winter. In summer, the sea ice melts under constant daylight, producing warmer, fresher water at the surface.

    This remote ecosystem is home to much of the Southern Ocean’s photosynthetic life. A new University of Washington study provides the first measurements of how sea-ice algae and other single-celled life adjust to these seasonal rhythms, offering clues to what might happen as this environment shifts under climate change. 

    The study, published Sept. 15 in the International Society for Microbial Ecology’s ISME Journal, contains some of the first measurements of how sea-ice microbes respond to changing conditions. 

    “We know very little about how sea-ice microbes respond to changes in salinity and temperature,” said lead author Hannah Dawson, a UW postdoctoral researcher who did the work while pursuing her doctorate in oceanography at the UW. “And until now we knew almost nothing about the molecules they produce and use in chemical reactions to stay alive, which are important for supporting higher organisms in the ecosystem as well as for climate impacts, like carbon storage and cloud formation.” 

    The polar oceans play an important role in global ocean currents and in supporting marine ecosystems. Microbes form the base of the food web, supporting larger life forms. 

    “Polar oceans make up a significant portion of the world’s oceans, and these are very productive waters,” said senior author Jodi Young, a UW assistant professor of oceanography. “These waters support big swarms of krill, the whales that come to feed on those krill, and either polar bears or penguins. And the start of that whole ecosystem are these single-celled microscopic algae. We just know so little about them.” 

    The tiny organisms are also important for the climate, since they quietly perform photosynthesis and soak up carbon from the atmosphere. Polar algae are especially good at producing sulfur-containing molecules that give beaches their distinctive smell and, when lofted into the air in sea spray, promote formation of clouds that can reduce penetration of solar rays. 

Antarctic sea ice, though long stable, is at a record low this year.

    In other oceans, satellite instruments can capture dramatic seasonal phytoplankton blooms from space — but that isn’t possible for microbes hidden under sea ice. And Antarctic waters are particularly challenging to visit, leaving researchers with almost no measurements in winter. 

    In late 2018, Dawson and co-author Susan Rundell traveled to Palmer Station, a U.S. research station on the West Antarctic Peninsula. They used a small boat to sample seawater and sea ice at the same nearby sites every three days. 

    Back on shore, the two graduate students performed 10-day experiments in tanks to see which microbes grew as temperature and salinity were adjusted to mimic sea-ice formation and melt. They also shipped samples back to Seattle for more complex measurements of the samples’ genetics and metabolites, the small organic molecules produced by the cell. 

    Results revealed how single-celled algae deal with their fluctuating environments. As temperatures drop, the cells produce cryoprotectants, similar to antifreeze, to prevent their cellular fluid from crystallizing. Many of the most common cryoprotectant molecules were the same across different microbial lifeforms. 

As salinity changes, the cells adjust the concentration of salt-like organic molecules to avoid either bursting in freshening waters or desiccating like raisins in salty conditions. Many of these molecules serve a dual role as cryoprotectants, balancing conditions inside and outside the cell to maintain water balance.

The results show that under short-term temperature and salinity changes, the community structure in each sample remained stable while the organisms adjusted their production of protective molecules. Different microbe species showed consistent responses to changing conditions, which should simplify modeling future responses to climate change, Young said.

    Results also hint that the production of omega-3 fatty acids may decline in lower-salinity environments. This would be bad news for consumers of krill oil supplements, and for the marine ecosystem that relies on those algae-derived nutrients. Future research now underway by the UW group aims to confirm that result — especially with the prospect of increasing freshwater input from melting sea ice and glaciers. 

    “We’re interested in how these sea-ice algae contend with changes in temperature, salinity and light under normal conditions,” Dawson said. “But then we also have climate change, which is completely remodeling the landscape in terms of when sea ice is forming, how much sea ice forms, how long it stays before it melts, as well as the quantity of freshwater input from glaciers. So we’re both trying to capture what’s happening now, and also asking how that can inform what might happen in the future.”

The study was funded by the National Science Foundation, the Simons Foundation, and the Alfred P. Sloan Foundation. Other co-authors are Anitra Ingalls, Jody Deming, Joshua Sacks and Laura Carlson at the UW; Natalia Erazo, Elizabeth Connors and Jeff Bowman at Scripps Institution of Oceanography; and Veronica Mierzejewski at Arizona State University.

###

    University of Washington

    Source link

  • Ohio’s droughts are worse than often recognized, study finds

    Ohio’s droughts are worse than often recognized, study finds

    Newswise — COLUMBUS, Ohio – A new type of analysis suggests that droughts in Ohio were more severe from 2000 to 2019 than standard measurements have suggested.

    Researchers at The Ohio State University developed impacts-based thresholds for drought in Ohio, looking specifically at how corn yield and streamflow were affected by various drought indicators, such as notable changes in soil moisture, crops, and even livestock losses in the state.

    The results suggest this impacts-based approach could give Ohio farmers earlier and more accurate notice when drought conditions are approaching, said Steven Quiring, co-author of the study and a professor of geography at Ohio State.

    “We want to better understand what steps should be taken so that Ohio can better prepare for and also monitor the onset of drought conditions because a lot of the best ways to respond to drought is taking action early,” said Quiring. Moreover, with a more precise early warning system, agriculture producers might be able to save time and money by implementing water restrictions, or by switching to different or more drought-resistant crops. 

    The study was published in the Journal of Hydrometeorology. 

The Ohio State researchers compared how their method performed at predicting droughts with data from the U.S. Drought Monitor (USDM).

    The problem with the USDM is that it uses fixed drought thresholds, or guidelines that use the same parameters to measure changes in all seasons and climate regions of the country. Unfortunately, this one-size-fits-all approach can cause monitoring plans to inaccurately gauge local weather conditions and how they impact those in certain communities, Quiring said.

    By analyzing data from four drought indices commonly used in previous studies to monitor drought intensity across the United States, researchers were able to show that fixed thresholds tend to indicate milder drought conditions in Ohio than are indicated by the impacts-based thresholds identified in their study. 

    It’s why Quiring and his team want to use the impacts-based method to revamp those thresholds to better reflect drought conditions in Ohio, a move that starts by updating The Ohio Emergency Management Agency’s state drought plan. 

To accomplish their goal, the researchers investigated how data from the four indices related to streamflow, or the volume of water passing a designated point in a fixed period of time, and to Ohio’s total corn yield, mainly because the crop covers an extensive area of the state and nearly every county grows it.
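To make the notion of an impacts-based threshold concrete, the Python sketch below derives one from synthetic data: it scans candidate index values from wettest to driest and keeps the first threshold at which historically dry years average a chosen level of corn-yield loss. The data, the loss level and the scanning rule are all invented for illustration and are not the study's method.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic 40-year record: a standardized drought index (SPI-like, 0 = normal)
# and corn-yield anomalies in bushels/acre that partly track the index.
n_years = 40
drought_index = rng.normal(0.0, 1.0, n_years)
yield_anomaly = 8.0 * drought_index + rng.normal(0.0, 6.0, n_years)

# Scan thresholds from wettest to driest; keep the first (highest) value at
# which years at or below the threshold average a meaningful yield loss.
loss_level = -10.0            # bushels/acre counted as a drought impact
impact_based = None
for thr in np.linspace(0.0, -2.0, 41):
    dry = drought_index <= thr
    if dry.sum() >= 5 and yield_anomaly[dry].mean() <= loss_level:
        impact_based = thr
        break

if impact_based is None:
    print("No impacts-based threshold found at this loss level")
else:
    print(f"Impacts-based threshold: index <= {impact_based:.2f}")
```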

Identifying agricultural drought thresholds that are specific to Ohio is important, said Quiring. Because the impacts of drought vary from region to region, using the same drought thresholds in California as in Ohio is absurd, he said. Additionally, the types of drought that occur can differ: Ohio, for example, is prone to “flash droughts” — shortages caused by warm weather that can develop over just a few days or weeks.

    “These rapid-onset droughts can be particularly challenging for the agricultural community because they arrive quickly and conditions can rapidly go from normal to drier than normal,” said Quiring. “All of a sudden soil moisture is depleted, the crops are stressed and yield losses and impacts on the ecosystem occur.”  

    The last time severe drought caused major losses in the United States was in 2012 when a record-breaking heat wave resulted in $34.2 billion in economic losses, 123 direct deaths and a 26% decrease in total corn crop yield across the country. 

As large areas of the country dried out, Ohio’s corn yield dropped from about 160 bushels per acre to 120 bushels per acre within a year. While such considerable losses have not happened since, according to the State Climate Office of Ohio, some areas of the state have experienced abnormally dry conditions this year.

    What’s more, the researchers’ impacts-based method of drought monitoring also takes into account how climate change can worsen flash drought events.

    “One of the impacts that we found to be counterintuitive in Ohio is that with climate change, we do expect more rainfall overall, but we also expect to see more droughts because there are longer periods of time where no rain occurs,” said Quiring. 

    The results of this study suggest that following guidelines that aren’t specific to a region’s issues can end up either systematically underestimating the impacts of severe drought conditions in some locations or overestimating them in others, Quiring said. 

    While it’ll be some time before Quiring’s team can get their research incorporated into the next edition of the state drought plan, the study emphasizes that its methods could easily be applied to other regions beyond Ohio where long-term streamflow and crop yield data are readily available. Optimistically, it could help to improve drought monitoring worldwide and provide useful information to future agriculture producers and decision-makers, said Quiring. 

    “This work is actually timely because it will provide a basis for decision-making in Ohio, rather than using research that’s been done in other parts of the country,” said Quiring. “Hopefully we can give better guidance to those who are making decisions on the ground.”

    This study was supported by the National Integrated Drought Information System (NIDIS). Co-authors were Ning Zhang and Zhiying Li, who were both at Ohio State when the study was conducted. Zhang is now at the University of California, Davis and Li is at Indiana University. 

###

    Ohio State University

    Source link

  • Peak hurricane season is September, October: MSU experts can comment

    Peak hurricane season is September, October: MSU experts can comment

    Newswise — EAST LANSING, Mich. – Hurricanes Idalia and Lee have already packed a punch, but climatologists are now predicting more hurricanes this season, which doesn’t end until Nov. 30. Though previous projections suggested a milder hurricane season, we’re now on track for the eighth consecutive year of above-average activity. Michigan State University experts provide comments on the scientific, economic and government issues surrounding hurricanes.

Lifeng Luo is the director of MSU’s Environmental Science and Policy Program, as well as a professor in the Department of Geography, Environment and Spatial Sciences in the College of Social Science. Luo is an expert in the variability and predictability of climate, hydroclimatology, and resource management, among other areas.

    Contact: [email protected]

    “A number of factors are at play in the formation and intensification of tropical storms, and the most important one is the warm ocean. More specifically, the sea surface temperature needs to be at least 80 F or 26.5 C for storms to develop. As the ocean has absorbed a large amount of heat due to global warming, the sea surface temperature has been going up gradually over the last century. Trends can be stronger locally in some regions, such as the North Atlantic and Gulf of Mexico. Other factors include circulation patterns and modes of climate variability like El Nino. Additionally, La Nina tends to increase the number of tropical storms in the Atlantic basin due to reduced vertical wind shear. With three La Ninas in a row in the last three years, climate variability may also contribute to the fact that you see consecutive above-normal hurricane seasons.

    “In terms of natural disasters, Michigan is among the safest states in the US. The impact of Atlantic hurricanes here has been limited given how far we are from the east coast at this latitude and the typical storm tracks. We can still see rainfall (sometimes heavy) associated with a hurricane after it makes landfall and if it moves northward, but it can hardly produce torrential rainfall as typically seen in the rain bands of the hurricane.”

    Mark Skidmore is the Morris Chair in State and Local Government Finance and Policy as well as the resident fellow at MSU Extension’s Center for Local Government Finance and Policy. Additionally, Skidmore is an economics professor in both the colleges of Social Science and Agriculture and Natural Resources. He is an expert in the relationship between government activities and economic development, including incentives, as well as the economics of natural disasters.

    Contact: [email protected]

    “According to the National Oceanic and Atmospheric Administration, or NOAA, the United States experienced 363 weather-related disasters over the 1980-2023 period. Estimates indicate that these disasters resulted in $2.59 trillion in damages of which roughly half are attributable to hurricanes and tropical storms. Though there is significant variability in damages from storm to storm, on average each storm results in about $1 billion in damages.

“There are several tiers of support that help communities rebuild. As an immediate response, the priority is to provide access to basic needs such as food, water, shelter, fuel and the restoration of electricity and communications. As core needs are met, authorities may focus on rebuilding damaged public infrastructure. Finally, resources flowing in from insurance, private savings and governments help households and businesses regain a foothold and reestablish operations. Longer-term, it is often helpful to review weaknesses in infrastructure and preparations to reduce vulnerability in the future.

    “Federal government assistance sometimes weakens incentives for households, businesses and subnational governments to take disaster risk-reduction measures. Why engage in otherwise appropriate risk-reduction measures when federal assistance is available? For example, a property owner may be more inclined to build a vacation home on an exposed beach if it is known that the government will help pay for repairs. Thus, there is tension between providing a safety net for those exposed to disasters and increasing exposure to disasters.”

    Seven Mattes is an assistant professor at the Center for Integrative Studies in the College of Social Science. Mattes is an expert in disaster preparedness and multispecies resiliency, as well as animal studies.

    Contact: [email protected]

    “While hurricanes are a part of life for coastal residents, both the storms and the local populations have increased in number and intensity. As anthropogenic climate change increases the number of storms and human population grows in coastal regions, how we approach preparedness is an ongoing adaptive effort to the new conditions. Thus, while improvements in preparedness have been implemented in coastal states across the U.S., numerous vulnerabilities remain. There are innumerable recommendations for improving hurricane preparedness in the U.S.

    • Strengthening those natural structures that have historically shielded the habitats of humans and nonhumans alike — wetlands, salt marshes, reefs, dunes, mangrove forests, etc. — is an effective means to improve resilience to hurricane impacts. Preserving and valuing natural structures protect against storm surges, flooding and other damaging forces while also supporting the wildlife that reside within.  
    • Improving existing infrastructure to withstand intensified impacts — especially in low-income communities — is urgently needed.  
    • Funding programs and incentives to educate and organize on the local level are essential — learning from, building on and sharing local knowledge ensures community preparedness.  
    • Addressing the preventable vulnerability that results from developing hurricane-prone zones, like building homes and structures in low-lying coastal areas, drains resources at all stages of disaster preparedness.  
• Including companion species in planning and policy insofar as they affect human safety and decision-making, as the PETS Act did following Hurricane Katrina, is also essential. Agricultural animals are especially vulnerable to hurricane impacts, as we saw with Hurricane Florence — millions of chickens and thousands of hogs were killed in the resulting floods. Approaching disaster preparedness with an awareness of the broader multispecies communities in which humans live can aid in building resiliency for all within.”

    Simone Theresa Peinkofer is an associate professor in the Department of Supply Chain Management in the Broad College of Business, and she also serves as the director of the college’s Logistics Doctoral Program. Peinkofer is an expert in retail supply chain management, consumer-based strategy in supply chain management and omnichannel fulfillment operations.

    Contact: [email protected]

    “Depending on the path of the hurricane, it can delay freight movement. For example, ports and airports might shut down for an extended period, and the high winds and rainfall can make the movement of freight via train and trucks impossible and unsafe. Hurricanes can also damage goods that are in transit or stored in a warehouse if the warehouse is in the path of the storm. Hence, hurricanes can lead to loss in revenue and potentially higher prices for businesses. Depending on the region in the world, hurricanes or typhoons or cyclones can impact global supply chains. For example, Vietnam’s typhoon season is year-round and so typhoons can also shut down key manufacturing plants and delay or damage international freight. 

    “Companies should have a risk management plan in place that helps guide them through the disaster and especially through the recovery efforts. Additionally, companies in the path of a hurricane would want to closely monitor the situation and prepare accordingly by, for example, rerouting freight to a different port or airport. It’s important to act early on.”

    ###

    Michigan State University has been advancing the common good with uncommon will for more than 165 years. One of the world’s leading research universities, MSU pushes the boundaries of discovery to make a better, safer, healthier world for all while providing life-changing opportunities to a diverse and inclusive academic community through more than 400 programs of study in 17 degree-granting colleges.

    For MSU news on the Web, go to MSUToday. Follow MSU News on Twitter at twitter.com/MSUnews.

    Michigan State University

    Source link