ReportWire

Tag: Environmental Science

  • Lawrence Livermore grabs two spots in DOE’s Energy Earthshot program

    Newswise — Lawrence Livermore National Laboratory scientists will lead and co-lead projects in support of the Department of Energy’s (DOE) new Energy Earthshot program.

    The Energy Earthshots Initiative calls for innovation and collaboration to tackle the toughest topics in energy-related research. In January, DOE announced Office of Science funding for the Energy Earthshot Research Centers (EERCs)—they will build off a concept the DOE successfully demonstrated in the previous Energy Frontier Research Centers (EFRCs) and the Scientific Discovery Through Advanced Computing (SciDAC) program. The new EERCs will support fundamental research to accelerate breakthroughs in support of the Energy Earthshots Initiative.

    The Energy Earthshots are designed to stimulate integrated program development and execution across the DOE’s basic science and energy technology offices. They are part of an all-hands-on-deck approach to provide science and technology innovations that the nation needs to address tough technological challenges required to achieve our climate goals. The Energy Earthshots will accelerate breakthroughs toward more abundant, affordable and reliable clean energy solutions and the carbon dioxide removal needed to counterbalance hard-to-abate greenhouse gas emissions.

    Six Energy Earthshots have been announced so far: Hydrogen Shot™, Long Duration Storage Shot™, Carbon Negative Shot™, Enhanced Geothermal Shot™, Floating Offshore Wind Shot™ and Industrial Heat Shot™. They are supported by the three primary Office of Science program offices: Advanced Scientific Computing Research, Basic Energy Sciences and Biological and Environmental Research.

    Jennifer Pett-Ridge, head of LLNL’s Carbon Initiative, will lead a $19 million center called “Terraforming Soil,” which will support the Carbon Negative Shot. Of the total award, LLNL will receive ~$17 million.

    LLNL scientist Jiaqi Li will serve as the deputy director for the “Center for Coupled Chemo-Mechanics of Cementitious Composites,” which will support the Enhanced Geothermal Shot. Brookhaven National Laboratory (BNL) leads this center, and LLNL will receive $1.9 million over four years for its role in the project.

    Terraforming Soil

    To reduce the United States’ net carbon dioxide (CO2) emissions to zero and limit the impacts of global warming, it is essential to actively remove CO2 from the atmosphere. Soils store a vast amount of carbon in both organic and inorganic forms, on the order of 3,000 billion tons globally, which is more carbon than is found in the atmosphere and land plants combined.

    While the United States’ 166 million hectares of agricultural soils have lost a vast amount of carbon in the past century due to cultivation and erosion, there is clear potential to reverse this trend and actively manage agricultural lands with strategies that capture CO2 from the atmosphere. The Terraforming Soil Energy Earthshot Research Center (EERC) will research new bio- and geo-engineered techniques to understand, predict and accelerate scalable and affordable CO2 drawdown in soils, via both organic and inorganic carbon cycle pathways.

    “Our goal is to advance the fundamental understanding of CO2 drawdown in soils through both organic and inorganic pathways, measuring soil C storage capacity, durability and regional variations that affect needed land-management practices,” Pett-Ridge said.

    The Terraforming Soil EERC team includes 50 world-class experts in soil carbon cycling, photosynthesis biochemistry, plant/microbial gene engineering and genomics, mineral geochemistry, machine learning, exascale modeling and computing, additive manufacturing and in situ isotope-based characterization.

    The center will bridge cutting-edge analytical and computational studies with a commitment to engage with community stakeholders, exploring the technical, social and economic implications of engineered soil CO2 drawdown. In addition, the team will emphasize diverse training opportunities for students and early career scientists and amplify equity and inclusion throughout the research pipeline.

    Collaborators include the University of California Berkeley, University of California Davis, Rice University, Princeton University, Yale University, Carleton College, Massachusetts Institute of Technology, Northern Arizona University, Colorado State University, Lawrence Berkeley National Laboratory, Pacific Northwest National Laboratory, Andes Ag, Inc. and the Woodwell Climate Research Center.

    Center for Coupled Chemo-Mechanics of Cementitious Composites

    LLNL will conduct fundamental research to understand and predict chemo-mechanics of sustainable materials within Enhanced Geothermal System (EGS) environments and develop new materials to overcome major challenges in deploying cost effective EGS.

    Geothermal well environments are arguably the most challenging for cement to survive, and many well durability and performance problems are associated with cementing materials and well-cementing methods. These include, but are not limited to, poor cement acid resistance, poor thermal and mechanical stress resistance under cyclic thermo-mechanical loads, and poor bonding with metal casing and, as a result, poor casing corrosion protection. Well integrity issues linked to cement degradation and failure are more severe under the high-temperature conditions that EGS wells undergo during hydraulic stimulation operations and thermal shocks. Furthermore, cementing operations during geothermal well construction suffer from cement slurry losses into formations, long waiting times for cement to solidify, or rapid uncontrolled cement solidification followed by drill-out operations or abandonment of the well.

    “To address the durability and sustainability issues of enhanced geothermal wells, a fundamental understanding of the chemo-mechanics of alternative cementitious materials that could provide cost-effective and sustainable solutions for EGS is required,” Li said.

    The proposed work will focus on gaining a fundamental understanding of the reaction mechanisms, equilibrium and phase compositions, and mechanical properties of cementitious composites designed from industrial wastes under EGS conditions. The knowledge generated by the project will form a comprehensive framework for the informed development and commercialization of 1) environmentally sustainable, durable, cost-effective well materials, including cementitious composites and inorganic coatings, and 2) new well designs forgoing the use of cementitious materials in EGS wells. To achieve this goal, advanced high-energy analytical and computational techniques will be used in the design, monitoring and characterization of model systems.

    Besides BNL and LLNL, collaborators include Sandia National Laboratories, Lawrence Berkeley National Laboratory, Los Alamos National Laboratory, Cornell University, Princeton University, The University of Texas at Austin and the University of Illinois Urbana-Champaign.

    Lawrence Livermore National Laboratory

  • Bioparticles vital in Arctic cloud ice formation

    Newswise — An international team of scientists from Sweden, Norway, Japan, and Switzerland, has presented research findings that reveal a crucial role of biological particles, including pollen, spores, and bacteria, in the formation of ice within Arctic clouds. These findings, published today in Nature Communications, have far-reaching implications for climate science and our understanding of the rapidly changing Arctic climate.

    The research, which unveiled the connection between biological particles and the formation of ice in Arctic clouds, was conducted over multiple years at the Zeppelin Observatory, situated on Svalbard, a remote Norwegian archipelago in the High Arctic. Gabriel Freitas, lead author and PhD student at Stockholm University, detailed their innovative approach:
    “We have individually identified and counted these biological particles using a sensitive optical technique reliant on light scattering and UV-induced fluorescence. This precision is essential as we navigate through the challenge of detecting these particles in minuscule concentrations, akin to finding a needle in a haystack.”

    Sugar alcohols as indicators of fungal spores
    The study delved into the seasonal dynamics of biological particles, establishing correlations with variables such as snow cover, temperature, and meteorological parameters. Furthermore, the presence of biological particles was confirmed through various methodologies, including electron microscopy and the detection of specific substances, such as the sugar alcohol compounds arabitol and mannitol.
    Karl Espen Yttri, senior scientist at the Climate and Environmental Research Institute NILU and a co-author of the study, underscored: “While arabitol and mannitol are present in various microorganisms, their presence in air is related to fungal spores, and they might originate both from local sources and from long-range atmospheric transport.”

    Microbes contribute to ice nucleation at Zeppelin Observatory
    The quantification of ice nucleating particles and understanding their properties proved to be a cumbersome challenge. Researchers employed two distinct methods, involving the collection of particles on filters over a week, followed by rigorous laboratory analysis.

    Yutaka Tobo, Associate Professor at the National Institute of Polar Research in Japan and co-author of the study, described their strategy: “Our method can quantify the ice nucleating ability of aerosol particles immersed in water droplets at temperatures ranging from 0°C down to about -30°C, thereby revealing the concentration of ambient ice nucleating particles active in Arctic low-level clouds.”
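    The article does not spell out the underlying calculation, but droplet-freezing assays of this kind commonly convert the fraction of droplets frozen at a given temperature into a cumulative ice nucleating particle (INP) concentration using the standard Vali (1971) formula. A minimal sketch, with hypothetical droplet counts and volume (not figures from this study):

```python
import math

def inp_concentration(n_frozen, n_total, drop_volume_litres):
    """Cumulative INP concentration per litre of suspension at a given
    temperature, via the Vali formula: n(T) = -ln(1 - f) / V,
    where f is the fraction of droplets frozen at that temperature."""
    f = n_frozen / n_total
    return -math.log(1.0 - f) / drop_volume_litres

# Hypothetical assay: 40 of 96 droplets of 50 microlitres each frozen by -15 degC
print(round(inp_concentration(40, 96, 50e-6)))
```

    Note that this yields INPs per litre of suspension; converting to a concentration in ambient air additionally requires the sampled air volume and the volume of water used to wash the filter.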

    Franz Conen, Research Fellow at the University of Basel, Switzerland, added, “By subjecting the filters to additional heating at 95°C, we could identify the proteinaceous component of ice nucleating particles, shedding light on their potential biological origin. Our findings unequivocally establish the prevalence of biological particles contributing to ice nucleation at Zeppelin Observatory.”

    Paul Zieger, Associate Professor at Stockholm University and co-author, emphasized the important implication of these findings for climate science: “This research offers critical insights into the origin and properties of biological and ice nucleating particles in the Arctic that could enable climate model developers to improve the representation of aerosol-cloud interactions in models and reduce uncertainties related to anthropogenic radiative forcing estimates.”

    Increases in open ocean areas and snow-free tundra, both sources of biological particles in the Arctic, are expected in the coming decades. Therefore, gaining a deeper understanding of the relationship between these particles and clouds may provide valuable insights into the ongoing and future transformations occurring in the Arctic.

    Read article in Nature Communications: Regionally sourced bioaerosols drive high-temperature ice nucleating particles in the Arctic

    Stockholm University

  • Naming and Shaming Can be Effective to Get Countries to Act on Climate

    Newswise — Enforcement is one of the biggest challenges to international cooperation on mitigating climate change in the Paris Agreement. The agreement has no formal enforcement mechanism; instead, it is designed to be transparent so countries that fail to meet their obligations will be named and thus shamed into changing behavior. A new study from the University of California San Diego’s School of Global Policy and Strategy shows that this naming-and-shaming mechanism can be an effective incentive for many countries to uphold their pledges to reduce emissions.

    The study, appearing in the Proceedings of the National Academy of Sciences (PNAS), assesses the naming and shaming built into the 2015 Paris Agreement through its Enhanced Transparency Framework (ETF). The ETF requires nations to publicly report their goals and progress toward meeting those goals. The study suggests that the ETF is most effective at motivating countries with the strongest commitments to slowing climate change.

    “The architects of the Paris Agreement knew that powerful enforcement mechanisms, like trade sanctions, wouldn’t be feasible,” said study coauthor David Victor, professor of industrial innovation at UC San Diego’s School of Global Policy and Strategy and co-director of the Deep Decarbonization Initiative. “Most analysts assumed the agreement would fail to be effective without strong enforcement and are skeptical of naming and shaming. Our research suggests that pessimism is wrong. Naming and shaming is built into the system and our study shows that the policy experts who are most knowledgeable about Paris see this mechanism working well—at least for some countries.”

    Naming and shaming doesn’t work everywhere, the study shows; however, it is particularly important for countries that are already highly motivated to act. Even those countries need a spotlight on their behavior, lest they slip and fail to comply with the obligations they set for themselves under the Paris Agreement. 

    In Europe—where countries have the most ambitious and credible climate pledges—the surge in energy prices and interruptions in Russian gas supply created incentives to retain higher-emission energy technologies, such as coal. International visibility and political pressures within those countries plausibly help explain why European policymakers have kept emissions in alignment with their previously committed climate goals.

    In the U.S., naming and shaming is likely to be effective as well, but not to the same degree as in Europe, the study shows.

    “This raises some concern about the ability to maintain the momentum generated by the Inflation Reduction Act under less favorable conditions, such as rising interest rates,” said Emily Carlton, study coauthor and UC San Diego School of Global Policy and Strategy alum.

    Study taps expert opinions of top climate negotiators from around the world

    The findings in the new PNAS study are derived from responses from a sample of registrants of the Conference of Parties (COP), consisting of more than 800 diplomatic and scientific experts who, for decades, have participated in climate policy debates. This expert group is critical to understanding how political institutions shape climate policy because they are the people “in the room” when key policy decisions are made. They are in a unique position to evaluate what is most likely to motivate their countries to act on climate.

    They were asked questions such as: is the ETF in the agreement effective? Do they support the use of the ETF, and is it a legitimate way to enforce the Paris Agreement?

    Overall, 77% of the sample agreed with using naming and shaming—that is, using the ETF for comparing countries’ mitigation efforts. The results further indicate that 57% of all respondents expect naming and shaming to substantially affect the climate policy performance of their home country—where they know the policy environment best.

    While survey respondents’ country of origin was kept anonymous to elicit the most candid responses possible, the respondents that think naming and shaming is most effective are more likely to be from democracies with high-quality political institutions. In addition, these individuals come from countries with strong internal concern about climate change and ambitious and credible international climate commitments, such as countries in Europe.

    The study finds naming and shaming is likely least effective for countries that lack strong democratic institutions, such as some large emitters like China.

    While the inability for naming and shaming to work effectively within the countries least motivated for climate action creates tension, the study does provide a hopeful narrative for enforcing cooperation on climate, according to the authors.

    “It is a really good thing that naming and shaming can keep the most climate-motivated countries on track because decarbonizing is hard and changes in circumstances and energy markets can make it even harder,” said Carlton. “Countries in Europe are some of the biggest emitters and, as we saw recently, policymakers could have easily switched back to coal after Russia’s invasion of Ukraine, but they did not.”

    Who should be the “namers and shamers” and who is most effective at it?

    The survey respondents were also asked which institutions should be responsible for naming and shaming. The results overwhelmingly indicated the preference for namers and shamers to be scientists, as well as neutral international organizations such as the United Nations (U.N.) and Intergovernmental Panel on Climate Change (IPCC). However, past studies have found that both diplomatic and science organizations like the U.N. and IPCC are actually ineffective at naming and shaming.

    “It is not something that these organizations do,” Carlton said. “They are positioned to try to get countries to cooperate and it’s just not a function of theirs to put countries on blast in a judgmental way. That is something you see done more effectively from non-governmental organizations (NGOs) and the media.”

    While naming and shaming is a mechanism that makes cooperation work, the authors believe that other strategies such as trade sanctions may be useful as well. They explored this topic in a recent study.  

    Coauthors of the PNAS paper, “Naming and Shaming as a Strategy for Enforcing the Paris Agreement: The Role of Political Institutions and Public Concern,” include Astrid Dannenberg of University of Kassel and the University of Gothenburg and Marcel Lumkowsky of the University of Kassel.

    University of California San Diego

  • Enhancing Chemical Identification Challenges

    Newswise — What chemicals are we exposed to on a daily basis? That is the central question of ‘non-targeted analysis’ or NTA, an emerging field of analytical science that aims to identify all chemicals around us. A daunting task, because how can you be sure to detect everything if you don’t know exactly what you’re looking for? In a paper in Environmental Science and Technology, researchers at the Universities of Amsterdam (UvA, the Netherlands) and Queensland (UQ, Australia) assess this problem. In a meta-analysis of NTA results published over the past six years, they estimate that less than 2% of all chemicals have been identified.

    According to Viktoriia Turkina, who performed the research as a PhD student with Dr Saer Samanipour at the UvA’s Van ‘t Hoff Institute for Molecular Sciences, this limitation underscores the urgent need for a more proactive approach to chemical monitoring and management. “We need to incorporate more data-driven strategies into our studies to be able to effectively protect human and environmental health,” she says.

    Samanipour explains that current monitoring of chemicals is rather limited since it’s expensive, time consuming, and requires specialized experts. “As an example, in the Netherlands we have one of the most sophisticated monitoring programs for chemicals known to be of concern to human health. Yet we target less than 1000 chemicals. There are far more chemicals out there that we don’t know about.”

    A vast chemical space

    To deal with those chemicals, some 15 to 20 years ago the concept of non-targeted analysis was introduced to look at possible exposure in an unbiased manner. The idea is to take a sample from the environment (air, water, soil, sewer sludge) or the human body (hair, blood, etc.) and analyse it using well-established analytical techniques such as chromatography coupled with high-resolution mass spectrometry. The challenge then is to trace the obtained signal back to the structures of chemicals that may be present in the sample. These will include already known chemicals, but also chemicals whose potential presence in the environment is as yet unknown.

    In theory, this ‘chemical space’ includes as many as 10⁶⁰ compounds, an incomprehensible number that far exceeds the number of stars in the universe. On the other hand, the number of organic and inorganic substances published in the scientific literature and public databases is estimated at around 180 million. To make their research more manageable, Turkina, Samanipour and co-workers focused on a subset of 60,000 well-described compounds from the NORMAN database. Turkina: “This served as the reference to establish what is covered in NTA studies, and more importantly, to develop an idea about what is being overlooked.”
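    To put these figures side by side, here is a quick back-of-the-envelope comparison. The counts are the numbers quoted above; the script itself is purely illustrative and not part of the study:

```python
theoretical_space = 10**60        # estimated size of the theoretical chemical space
known_substances = 180_000_000    # substances in literature and public databases
norman_subset = 60_000            # well-described NORMAN reference compounds

# Even everything ever catalogued is a vanishing fraction of the theoretical space
print(f"known / theoretical: {known_substances / theoretical_space:.1e}")

# The ~2% coverage reported by the meta-analysis, applied to the reference subset
print(f"covered reference compounds: ~{0.02 * norman_subset:.0f}")
```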

    The vast ‘exposome’ of chemicals that humans are exposed to on a daily basis is a sign of our times, according to Samanipour. “These days we are soaking in a giant ocean of chemicals. The chemical industry is part of that, but nature is also running a whole host of reactions that result in exposure. And we expose ourselves to chemicals through the stuff we use – think for instance of the problem of microplastics. To solve all this we have to be able to go beyond pointing fingers. With our research, we hope to contribute to finding a solution together. Because we are all in the same boat.”

    Much room for improvement

    The meta-analysis, which included 57 NTA papers, revealed that only around 2% of the estimated chemical space was covered. This could indicate that actual exposure to chemicals is indeed quite low; however, it can also point to shortcomings in the applied analyses. According to Turkina and Samanipour, the latter is indeed the case. They focused on NTA studies applying liquid chromatography coupled with high-resolution mass spectrometry (LC-HRMS), one of the most comprehensive methods for the analysis of complex environmental and biological samples.

    It turned out that there was much room for improvement. For instance, in sample preparation they observed a bias towards specific compounds rather than the capture of a more diverse set of chemicals. They also observed poor selection and inconsistent reporting of LC-HRMS parameters and data acquisition methods. “In general,” Samanipour says, “the chemical analysis community is to a great extent driven by the available technology that vendors have developed for specific analysis purposes. Thus the instrumental set-up and data processing methods are rather limited when it comes to non-targeted analysis.”

    To Samanipour, the NTA approach is definitely worth pursuing. “But we need to develop it further and push it forward. Together with vendors we can develop new, powerful and more versatile analytical technologies, as well as effective data analysis protocols.” He also advocates a data-driven approach where the theoretical chemical space is ‘back-calculated’ toward a subset of chemicals that are highly likely to be present in our environment. “Basically we have to better understand what the true chemical space of exposure is. And once those boundaries are defined, it becomes a lot easier to assess that figure of 2% we have determined.”

    Universiteit van Amsterdam

  • Training Birds for Climate Adaptation

    Newswise — One result of climate change is that spring is arriving earlier. However, migratory birds are not keeping up with these developments and arrive too late for the peak in food availability when it is time for breeding. By getting the birds to fly a little further north, researchers in Lund, Sweden, and the Netherlands have observed that these birds can give their chicks a better start in life.

    Global warming is causing problems for birds in Sweden and elsewhere. Warmer springs mean that caterpillars hatch, grow and pupate earlier compared with just a few decades ago. This has consequences for birds that cannot eat caterpillars that have entered the pupal stage. Therefore, when the food supply runs out at an ever earlier time in the spring, more and more chicks starve during the breeding season. This is a big problem for migratory birds that spend winters in Africa, as they do not know how early spring arrives in Sweden. Could the problem be solved if the migratory birds simply came home and started breeding earlier?

    “It seems that our non-migratory birds are doing this to a certain extent. But, of course, they are present and can feel how early spring will come. We thought that perhaps the migratory birds could fly further north until they find a place with suitable well-developed caterpillars,” says Jan-Åke Nilsson, biology researcher at Lund University in Sweden.

    To test this in practice, the researchers decided to help some Pied Flycatchers along the way. The biologists caught Pied Flycatchers that had arrived prior to breeding in the Netherlands. The birds were then driven during the night to Vombs Fure, an area of pine forest outside Lund in Skåne, where they were released. The peak of caterpillar availability in Skåne is about two weeks later than in the Netherlands – a distance of around 600 kilometres that a Pied Flycatcher could cover in just two nights.

    “The birds that were given a lift from the Netherlands to Skåne synchronised very well with the food peak! As they started to breed about 10 days earlier than the ‘Swedish’ Pied Flycatchers, they had a dramatically better breeding success than the Swedish ones, as well as better success than the Pied Flycatchers that remained in the Netherlands,” says Jan-Åke Nilsson.

    In addition, it was shown that the chicks of the Dutch Pied Flycatchers that had received migration assistance did not stop in the Netherlands when they returned after their first spring migration. Instead, they continued on to the area of pine forest outside Lund where they were born. Furthermore, they arrived earlier than the Swedish Pied Flycatchers and thereby had more well-fed chicks at Vombs Fure the year after the researchers gave the Pied Flycatchers a helping hand to find Skåne.

    “The number of small birds, particularly migratory birds, has decreased drastically throughout Europe. By flying a little further north, these birds, at least in principle, could synchronise with their food resources and there is hope that robust populations of small birds can be maintained, even though springs are arriving ever earlier,” concludes Jan-Åke Nilsson.

    Lund University

  • AI boosts plant observation precision

    Newswise — Artificial intelligence (AI) can help plant scientists collect and analyze unprecedented volumes of data, which would not be possible using conventional methods. Researchers at the University of Zurich (UZH) have now used big data, machine learning and field observations in the university’s experimental garden to show how plants respond to changes in the environment.

    Climate change is making it increasingly important to know how plants can survive and thrive in a changing environment. Conventional experiments in the lab have shown that plants accumulate pigments in response to environmental factors. To date, such measurements have been made by taking samples, which requires part of the plant to be removed and thus damages it. “This labor-intensive method isn’t viable when thousands or millions of samples are needed. Moreover, taking repeated samples damages the plants, which in turn affects observations of how plants respond to environmental factors. There hasn’t been a suitable method for the long-term observation of individual plants within an ecosystem,” says Reiko Akiyama, first author of the study.

    With the support of UZH’s University Research Priority Program (URPP) “Evolution in Action”, a team of researchers has now developed a method that enables scientists to observe plants in nature with great precision. PlantServation is a method that incorporates robust image-acquisition hardware and deep learning-based software to analyze field images, and it works in any kind of weather.

    Millions of images support evolutionary hypothesis of robustness

    Using PlantServation, the researchers collected top-view images of Arabidopsis plants on the experimental plots of UZH’s Irchel Campus across three field seasons (each lasting five months, from fall to spring) and then analyzed the more than four million images using machine learning. The data recorded the species-specific accumulation of a plant pigment called anthocyanin as a response to seasonal and annual fluctuations in temperature, light intensity and precipitation.

    PlantServation also enabled the scientists to experimentally replicate what happens after the natural speciation of a hybrid polyploid species. These species develop from a duplication of the entire genome of their ancestors, a common type of species diversification in plants. Many wild and cultivated plants such as wheat and coffee originated in this way.

    In the current study, the anthocyanin content of the hybrid polyploid species A. kamchatica resembled that of its two ancestors: from fall to winter its anthocyanin content was similar to that of the ancestor species originating from a warm region, and from winter to spring it resembled the other species from a colder region. “The results of the study thus confirm that these hybrid polyploids combine the environmental responses of their progenitors, which supports a long-standing hypothesis about the evolution of polyploids,” says Rie Shimizu-Inatsugi, one of the study’s two corresponding authors.

    From Irchel Campus to far-flung regions

    PlantServation was developed in the experimental garden at UZH’s Irchel Campus. “It was crucial for us to be able to use the garden on Irchel Campus to develop PlantServation’s hardware and software, but its application goes even further: when combined with solar power, its hardware can be used even in remote sites. With its economical and robust hardware and open-source software, PlantServation paves the way for many more future biodiversity studies that use AI to investigate plants other than Arabidopsis – from crops such as wheat to wild plants that play a key role for the environment,” says Kentaro Shimizu, corresponding author and co-director of the URPP Evolution in Action.

    The project is an interdisciplinary collaboration with LPIXEL, a company that specializes in AI image analysis, and Japanese research institutes at Kyoto University and the University of Tokyo, among others, under the Global Strategy and Partnerships Funding Scheme of UZH Global Affairs and the International Leading Research grant program of the Japan Society for the Promotion of Science (JSPS). The project also received funding from the Swiss National Science Foundation (SNSF).

    Strategic Partnership with Kyoto University

    Kyoto University is one of UZH’s strategic partner universities. The strategic partnership ensures that high-potential research collaborations receive the necessary support to thrive, for instance through the UZH Global Strategy and Partnership Funding Scheme. In recent years, several joint research projects between Kyoto University and UZH have received funding, among them “PlantServation”.

    University of Zurich

  • Cheaper, Abundant Recycled Plastics Can Be Sound Ingredients for Plastic Bottles, Food Packaging

    Cheaper, Abundant Recycled Plastics Can Be Sound Ingredients for Plastic Bottles, Food Packaging

    Newswise — Washington D.C. – New research on the growing uses of recycled polypropylene in plastic packaging finds it performs well and has the potential to meet environmental goals and reduce raw material costs.

The conclusions suggest that increasing the use of post-consumer recycled (PCR) materials in food packaging can both improve the cost optimization of additives and help meet sustainability goals.

These findings come as many packagers increasingly rely on polypropylene and similar plastics, rather than the widely used polyethylene terephthalate (PET), for bottled water and similar beverages.

Polypropylene (PP) carries resin identification code #5 and has a high melting point, so it is often used in containers for hot liquids. It can also be found in yogurt containers, syrup and medicine bottles, caps and straws.

The new study was conducted by Iowa State University scientists, who gathered a small collection of recycled plastics and, after testing, found that they performed well mechanically in terms of strength, flexibility, integrity and other indicators such as heat resistance.

    If future performance studies support these findings, and outside chemicals remain below regulatory limits, it could be a win-win for those seeking more sustainable packaging and efficiencies in their packaging recycling programs.

    According to the authors, “This study demonstrates the viability of a significant source of polypropylene and its notable long-term impacts, increasing profits by using PCR materials.” But the potential upside doesn’t end there.

    “This approach will produce environmentally responsible food plastic packaging in compliance with legislation in the circular economy,” the paper concludes.

    According to Iowa State’s Drs. Keith Vorst and Greg Curtzwiler, the findings are important because “they demonstrate PCR plastics can have higher value than just sustainability alone. PCR materials can also be used as a source of critical additives that would not need to be added to virgin plastics when blended together.”

    According to lead author Dr. Ma. Cristine Concepcion D. Ignacio, the research is unique in that it focuses on “determining the compliance and physical performance of extrusion blow molded material recovery facility (MRF)-recovered post-consumer PP bottle for direct food-contact applications.”

    The article appeared in a recent issue of the peer-reviewed journal Polymers and was supported by IAFNS’ Food Packaging Safety and Sustainability Committee.

    The study is available here.

    The Institute for the Advancement of Food and Nutrition Sciences (IAFNS) is committed to leading positive change across the food and beverage ecosystem. This paper was supported in part by IAFNS’ Food Packaging Safety and Sustainability Committee. IAFNS is a 501(c)(3) science-focused nonprofit uniquely positioned to mobilize government, industry and academia to drive, fund and lead actionable research. iafns.org

    Institute for the Advancement of Food and Nutrition Sciences

  • Lithium Sustainability for Decades

    Lithium Sustainability for Decades

Newswise — On the way towards climate neutrality, Europe will need large amounts of lithium for battery storage systems. So far, however, its share of the worldwide lithium extraction volume has been only one percent. For this reason, researchers at KIT are studying ways to extract lithium from geothermal sources. “In theory, geothermal power plants in the Upper Rhine Valley and Northern German Basin might cover between 2 and 12 percent of Germany’s annual lithium demand,” says Valentin Goldberg from KIT’s Institute of Applied Geosciences (AGW). With his team, he calculated this potential based on an extensive data analysis. Until now, however, it was not clear for how long extraction would be possible. A further study by the researchers offers an optimistic perspective. “According to our findings, lithium extraction will be possible for many years at low environmental cost,” Goldberg says. “The model developed for our study describes lithium extraction in the Upper Rhine Valley, but its parameters are chosen such that they can also be transferred to other joint systems.”

    Modeling of Geothermal Lithium Production

Extracting lithium from thermal waters is not a conventional type of mining, so conventional methods of analysis could not be applied. “The lithium dissolved in water exists in a widely branched network of joints and cavities in the rock. However, it can only be accessed at certain points via individual wells,” says Dr. Fabian Nitschke, AGW, who was also involved in this study. “The reservoir dimension hence depends on the amount of water that can be accessed hydraulically via wells.” To calculate the lithium production potential, the researchers had to consider the potential water extraction volume, its lithium concentration, and the lithium extracted per unit time. “We use a dynamic transport model adapted to underground conditions in the Upper Rhine Valley. It couples thermal, hydraulic, and chemical processes. Similar models are known from the petroleum and gas industry, but they have not yet been applied to lithium,” Nitschke points out.

    When using geothermal energy, the extracted water is pumped back into the ground via a second borehole. Researchers wanted to find out whether lithium concentration of the deep water decreases with time. The results show that lithium concentration in the extraction borehole decreases by 30 to 50 percent in the first third of the investigation period of 30 years, as the deep water is diluted by the returned water. Then, lithium concentration remains constant. “This can be attributed to the open joint system that continuously supplies fresh deep water from other directions,” Nitschke says. Modeling suggests that continuous lithium extraction will be possible for decades: “Actually, extraction of this unconventional resource shows the classical cyclic behavior. Yields of hydrocarbon extraction or ore mining are also highest in the beginning and then start to decrease gradually.”
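The dilution-then-plateau behavior described here can be illustrated with a deliberately simplified, well-mixed reservoir sketch. All parameter values below are invented for illustration; the researchers’ actual model couples thermal, hydraulic and chemical processes in a joint network and is far more detailed.

```python
def simulate_lithium(years=30, c0=160.0, extraction_frac=0.10, recharge_frac=0.125):
    """Toy model of lithium concentration (mg/L) at the production well.

    Each year a fraction of the reservoir water is cycled through the plant
    and reinjected lithium-depleted (dilution), while the open joint system
    resupplies a fraction of fresh deep water at the original concentration.
    All numbers here are illustrative assumptions, not values from the study.
    """
    c = c0
    series = [c]
    for _ in range(years):
        c *= 1 - extraction_frac       # reinjected, depleted water dilutes the brine
        c += recharge_frac * (c0 - c)  # fresh deep water flows in from other directions
        series.append(c)
    return series

series = simulate_lithium()
decline_year10 = 1 - series[10] / series[0]   # roughly 0.37, within the reported 30-50%
plateau_drift = abs(series[30] - series[20])  # well under 1 mg/L: the curve has leveled off
```

With these assumed rates, the concentration falls by roughly 37 percent over the first ten years and then stays nearly constant, qualitatively matching the decline-then-plateau pattern reported for the 30-year simulation.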

    Sensible Investment in a Sustainable Future

Thomas Kohl from AGW, who directs the corresponding research activities as Professor for Geothermal Energy and Reservoir Technology, considers the research results another argument in favor of a wide use of geothermal energy. “We already knew that geothermal sources can supply baseload-capable, renewable energy for decades. Our study now reveals that a single power plant in the Upper Rhine Valley could additionally cover up to 3 percent of the annual German lithium consumption.” Kohl’s group is now working on solutions for practical implementation. Recently, it published a study in Desalination on the preliminary treatment of thermal water for resource extraction. “The next step now is to transfer this technology to the industrial scale,” Kohl says.

    Karlsruhe Institute of Technology (KIT)

  • UK manufacturers need more support to help us reach net zero, new report says

    UK manufacturers need more support to help us reach net zero, new report says

    • A world leading group of engineers has called for UK manufacturers to be given more support in making electrical machines to help us reach net zero

    • Future Electrical Machines Manufacturing Hub (FEMM) – a consortium of academics led by the University of Sheffield – has published a new technology roadmap calling for the UK to reflect on the way it makes electrical machines

    • Electrifying systems currently powered by fossil fuels is seen as critical to the UK reaching net zero by 2050, but this drive to electrification is set to place huge demands on manufacturers and scarce materials that are key to electrical technologies

    • New roadmap highlights crucial need to help UK manufacturers develop new processes and secure a sustainable supply of key resources 

    Newswise — The UK needs to make better electrical machines and make its manufacturing processes more efficient if it is to reach net zero by 2050, according to a world leading group of engineers.

The Future Electrical Machines Manufacturing Hub (FEMM), a consortium of academics led by the University of Sheffield that focuses on addressing key manufacturing challenges in the production of high-integrity, high-value electrical machines, has made the call as part of its new technology roadmap.

    Led by Professor Michael Ward from the University of Strathclyde, the roadmap has set out the changes the UK needs to make to electrical machines and the way it manufactures them in order to reach net zero.

    Electrifying systems that are currently powered by fossil fuels is seen as one of the major ways the world can decarbonise. In the UK, the government’s plans to reach net zero are heavily reliant on emerging and developing electrification technologies. 

However, according to the new roadmap, this drive to electrification is set to place huge pressure on manufacturers, who will need to produce more electrical machines to support this inexorable growth in demand and the ever-increasing requirements on performance and sustainability.

    Given this upcoming huge increase in pressure, the roadmap says it is crucial that manufacturers are given more support. 

    Electrification is also set to increase competition for scarce resources that are critical components of electrical machines, so the roadmap is calling for the UK to develop a circular economy, grow its recycling and remanufacturing industries, and reuse materials in order to reduce its reliance on depleting virgin stock. 

    Professor Geraint Jewell, Director of the Future Electrical Machines Manufacturing Hub and Professor of Electrical Engineering at the University of Sheffield, said: “Electrification is universally recognised as one of the cornerstones of the transition towards net-zero. High efficiency electrical machines are a central element in the electrification of many market sectors such as transportation and renewables. 

“To meet the challenges of cost reduction, reliability and the ever-increasing demands on performance requires a combination of new materials, designs and manufacturing. This roadmap sets out the FEMM Hub team’s appraisal of the many challenges and opportunities for new technologies and manufacturing R&D as the move to electrification gathers pace.”

    Professor Michael Ward, Director of Industrial Strategy at the University of Strathclyde and lead author of the FEMM Hub’s new technology roadmap, said: “Electrical machines are foundational to decarbonisation, both in replacing combustion engines and in generating renewable energy. However we look at it, we are going to need more of them, and we need to improve many aspects of their performance, including cost. 

    “Assembling this roadmap has underlined these needs, but above all it has demonstrated the challenges that result from finite global availability of materials. In the UK we don’t have the benefit of indigenous sources of globally scarce materials. That means we need to think creatively about products, manufacturing techniques, and approaches that extend product life and re-use material at the end of life. 

    “Research, such as the work of the FEMM Hub, is key to this but even more important is recognition of this need by policy makers, product developers, and supply chain actors such that the UK can both act as a responsible global player, and achieve economic benefits from the transition to net zero.”

    The Future Electrical Machines Manufacturing Hub aims to help put UK manufacturing at the forefront of the electrification revolution. 

    Made up of researchers from the University of Sheffield and its Advanced Manufacturing Research Centre (AMRC), in collaboration with academics from the Universities of Strathclyde, Newcastle and the National Manufacturing Institute Scotland (NMIS), the Hub helps UK manufacturers capture significant value in the electrical machine supply chain and improve productivity.

    The Hub works with manufacturers such as Rolls-Royce, Airbus, Siemens Gamesa, Dyson, McLaren, and the Aerospace Technology Institute to help shape its research to ensure it meets the needs of industry. 

    By focusing on early Technology Readiness Levels (TRL), the Hub drives innovation, develops new and emerging technologies and helps address long-term industrial challenges.

    To work with FEMM Hub, visit: https://www.electricalmachineshub.ac.uk/get-involved 

    To read the FEMM Hub’s technology roadmap, visit: https://www.electricalmachineshub.ac.uk/our-research/roadmap

    University of Sheffield

  • LLNL scientists among finalists for new Gordon Bell climate modeling award

    LLNL scientists among finalists for new Gordon Bell climate modeling award

    Newswise — A team from Lawrence Livermore and seven other Department of Energy (DOE) national laboratories is a finalist for the new Association for Computing Machinery (ACM) Gordon Bell Prize for Climate Modeling for running an unprecedented high-resolution global atmosphere model on the world’s first exascale supercomputer.

    The Gordon Bell submission, led by Energy Exascale Earth System Model (E3SM) chief computational scientist Mark Taylor, details the team’s record-setting demonstration of the Simple Cloud Resolving E3SM Atmosphere Model (SCREAM) on Oak Ridge National Laboratory’s 1.2 exaFLOP (1.2 quintillion computing operations per second) Frontier machine.

    Incorporating state-of-the-art parameterizations for fluid dynamics, microphysics, moist turbulence and radiation, SCREAM is a full-featured atmospheric general-circulation model developed for very fine-resolution simulations on exascale machines. The effort is led by LLNL staff scientist Peter Caldwell, who also heads the Lab’s Climate Modeling group.

A cornerstone of SCREAM development is its computationally efficient, performance-portable design. This feature allows SCREAM to become — as far as the team is aware — the first nonhydrostatic global atmospheric model with resolution finer than 5 kilometers to run on an exascale supercomputer, the first to run at scale on both NVIDIA and AMD graphics processing unit (GPU) systems, and the first to exceed one simulated year per day of throughput. SCREAM earned its Gordon Bell finalist position with a record-setting run performed earlier this year, and a revised submission boasts results 54% faster than the original entry, achieving 1.26 simulated years per day on 8,192 Frontier nodes.
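Taken at face value, these throughput figures imply the following back-of-envelope numbers (reading “54% faster” as a 1.54x throughput ratio; the two runs may of course differ in configuration):

```python
revised_sypd = 1.26                      # simulated years per day, revised submission
speedup = 1.54                           # "54% faster than the original entry"
original_sypd = revised_sypd / speedup   # implied throughput of the original run, ~0.82
hours_per_sim_year = 24 / revised_sypd   # wall-clock hours per simulated year, ~19
```

In other words, at the revised rate a full simulated year completes in under a day of wall-clock time on those 8,192 nodes.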

    “The Gordon Bell Prize is the highest honor in high performance computing,” said LLNL’s Caldwell. “E3SM is very proud and excited to be finalists for the inaugural year of the Gordon Bell Climate Award. We worked extremely hard for five years to develop a model which makes efficient use of exascale computers, providing more trustworthy and higher-fidelity predictions of future climate than were previously possible. The aim of this new prize matches our goals exactly, so we were hopeful about our chances.”

What separates SCREAM from other climate models is that it was written in C++ and uses the Kokkos library, enabling it to perform efficiently across the spectrum of computer architectures, Caldwell explained. The design choice allowed the SCREAM team to run on Frontier faster than any other climate model, he said, adding that “most climate and weather models are struggling to take advantage of the GPUs that power most of today’s most powerful supercomputers. SCREAM is of huge interest to other modeling centers as a successful example of how to make this transition.”

The SCREAM effort grew out of the E3SM team, a multi-lab DOE partnership led by LLNL scientist Dave Bader that is tasked with developing a state-of-the-art climate modeling, simulation and prediction project for exascale supercomputers. Sponsored by DOE’s Office of Biological and Environmental Research (BER), the E3SM team includes researchers and computational scientists at LLNL and the Sandia, Argonne, Brookhaven, Los Alamos, Lawrence Berkeley, Oak Ridge and Pacific Northwest national laboratories. Other LLNL staff named in the Gordon Bell entry include scientists Aaron Donahue, Chris Terai and Renata McCoy.

    Team members said the achievement represents a breakthrough in climate modeling and a significant milestone for the E3SM project, which aims to bring DOE’s cutting-edge computer science to bear on the climate simulation challenge by simulating the climate system at very high resolution. Such high resolution permits explicit resolution of large convective circulations and other important atmospheric phenomena, thereby avoiding critical sources of uncertainty in traditional climate models, according to researchers. Fine resolution is also necessary to capture critical aspects of the climate that might impact conditions in the United States in the coming decades, such as extreme temperatures, storms and sea-level rise.

    The Gordon Bell Prize for Climate Modeling “aims to recognize innovative parallel computing contributions toward solving the global climate crisis,” according to ACM. It will be awarded for the first time this year at the International Conference for High Performance Computing, Networking, Storage, and Analysis (SC23) in Denver, and accompanied by a $10,000 award provided by Gordon Bell. Winners will be selected based on their potential to impact climate modeling and related fields.

    For more on E3SM, visit https://e3sm.org/.

    Lawrence Livermore National Laboratory

  • Polar experiments reveal seasonal cycle in Antarctic sea ice algae

    Polar experiments reveal seasonal cycle in Antarctic sea ice algae

Newswise — In the frigid waters surrounding Antarctica, an unusual seasonal cycle occurs. During winter, from March to October, the sun barely rises. As seawater freezes, it rejects salts, creating pockets of extra-salty brine where microbes live through the winter. In summer, the sea ice melts under constant daylight, producing warmer, fresher water at the surface.

    This remote ecosystem is home to much of the Southern Ocean’s photosynthetic life. A new University of Washington study provides the first measurements of how sea-ice algae and other single-celled life adjust to these seasonal rhythms, offering clues to what might happen as this environment shifts under climate change. 

    The study, published Sept. 15 in the International Society for Microbial Ecology’s ISME Journal, contains some of the first measurements of how sea-ice microbes respond to changing conditions. 

    “We know very little about how sea-ice microbes respond to changes in salinity and temperature,” said lead author Hannah Dawson, a UW postdoctoral researcher who did the work while pursuing her doctorate in oceanography at the UW. “And until now we knew almost nothing about the molecules they produce and use in chemical reactions to stay alive, which are important for supporting higher organisms in the ecosystem as well as for climate impacts, like carbon storage and cloud formation.” 

    The polar oceans play an important role in global ocean currents and in supporting marine ecosystems. Microbes form the base of the food web, supporting larger life forms. 

    “Polar oceans make up a significant portion of the world’s oceans, and these are very productive waters,” said senior author Jodi Young, a UW assistant professor of oceanography. “These waters support big swarms of krill, the whales that come to feed on those krill, and either polar bears or penguins. And the start of that whole ecosystem are these single-celled microscopic algae. We just know so little about them.” 

    The tiny organisms are also important for the climate, since they quietly perform photosynthesis and soak up carbon from the atmosphere. Polar algae are especially good at producing sulfur-containing molecules that give beaches their distinctive smell and, when lofted into the air in sea spray, promote formation of clouds that can reduce penetration of solar rays. 

Antarctic sea ice, though long stable, is at a record low this year.

    In other oceans, satellite instruments can capture dramatic seasonal phytoplankton blooms from space — but that isn’t possible for microbes hidden under sea ice. And Antarctic waters are particularly challenging to visit, leaving researchers with almost no measurements in winter. 

    In late 2018, Dawson and co-author Susan Rundell traveled to Palmer Station, a U.S. research station on the West Antarctic Peninsula. They used a small boat to sample seawater and sea ice at the same nearby sites every three days. 

    Back on shore, the two graduate students performed 10-day experiments in tanks to see which microbes grew as temperature and salinity were adjusted to mimic sea-ice formation and melt. They also shipped samples back to Seattle for more complex measurements of the samples’ genetics and metabolites, the small organic molecules produced by the cell. 

    Results revealed how single-celled algae deal with their fluctuating environments. As temperatures drop, the cells produce cryoprotectants, similar to antifreeze, to prevent their cellular fluid from crystallizing. Many of the most common cryoprotectant molecules were the same across different microbial lifeforms. 

As salinity changes, the cells adjust the concentration of salt-like organic molecules to avoid either bursting in freshening waters or becoming desiccated like raisins in salty conditions. Many of these molecules serve a dual role as cryoprotectants, balancing concentrations inside and outside the cell to maintain water balance.

The results show that under short-term temperature and salinity changes, the community structure in each sample remained stable while the organisms adjusted their production of protective molecules. Different microbe species showed consistent responses to changing conditions, which should simplify modeling future responses to climate change, Young said.

    Results also hint that the production of omega-3 fatty acids may decline in lower-salinity environments. This would be bad news for consumers of krill oil supplements, and for the marine ecosystem that relies on those algae-derived nutrients. Future research now underway by the UW group aims to confirm that result — especially with the prospect of increasing freshwater input from melting sea ice and glaciers. 

    “We’re interested in how these sea-ice algae contend with changes in temperature, salinity and light under normal conditions,” Dawson said. “But then we also have climate change, which is completely remodeling the landscape in terms of when sea ice is forming, how much sea ice forms, how long it stays before it melts, as well as the quantity of freshwater input from glaciers. So we’re both trying to capture what’s happening now, and also asking how that can inform what might happen in the future.”

    The study was funded by the National Science Foundation, the Simons Foundation, and the Alfred P. Sloan Foundation. Other co-authors are Anitra Ingalls, Jody Deming, Joshua Sacks and Laura Carlson at the UW; Natalia Erazo, Elizabeth Connors and Jeff Bowman at Scripps Institution of Oceanography; and Veronica Mierzejewski at Arizona State University.

     

    University of Washington

  • Ohio’s droughts are worse than often recognized, study finds

    Ohio’s droughts are worse than often recognized, study finds

    Newswise — COLUMBUS, Ohio – A new type of analysis suggests that droughts in Ohio were more severe from 2000 to 2019 than standard measurements have suggested.

    Researchers at The Ohio State University developed impacts-based thresholds for drought in Ohio, looking specifically at how corn yield and streamflow were affected by various drought indicators, such as notable changes in soil moisture, crops, and even livestock losses in the state.

    The results suggest this impacts-based approach could give Ohio farmers earlier and more accurate notice when drought conditions are approaching, said Steven Quiring, co-author of the study and a professor of geography at Ohio State.

    “We want to better understand what steps should be taken so that Ohio can better prepare for and also monitor the onset of drought conditions because a lot of the best ways to respond to drought is taking action early,” said Quiring. Moreover, with a more precise early warning system, agriculture producers might be able to save time and money by implementing water restrictions, or by switching to different or more drought-resistant crops. 

    The study was published in the Journal of Hydrometeorology. 

The Ohio State researchers compared how their method performed at predicting droughts with data from the U.S. Drought Monitor (USDM).

    The problem with the USDM is that it uses fixed drought thresholds, or guidelines that use the same parameters to measure changes in all seasons and climate regions of the country. Unfortunately, this one-size-fits-all approach can cause monitoring plans to inaccurately gauge local weather conditions and how they impact those in certain communities, Quiring said.

    By analyzing data from four drought indices commonly used in previous studies to monitor drought intensity across the United States, researchers were able to show that fixed thresholds tend to indicate milder drought conditions in Ohio than are indicated by the impacts-based thresholds identified in their study. 

    It’s why Quiring and his team want to use the impacts-based method to revamp those thresholds to better reflect drought conditions in Ohio, a move that starts by updating The Ohio Emergency Management Agency’s state drought plan. 

To accomplish their goal, the researchers investigated how data from the four indices related to streamflow, or how much water flows past a designated point in a fixed period of time, and to Ohio’s total corn yield, mainly because the crop covers an extensive area within the state and nearly every county grows it.
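The core idea of an impacts-based threshold can be sketched in a few lines. This is a hypothetical illustration with invented data and an invented helper function; the study’s actual statistical procedure is more involved.

```python
def impacts_based_threshold(index_values, yield_anomalies):
    """Return the wettest (largest) drought-index value at which losses occurred.

    index_values: one drought-index value per year (lower = drier).
    yield_anomalies: same-year corn-yield anomaly (negative = loss).
    Years drier than the returned threshold have historically seen yield losses.
    """
    loss_years = [idx for idx, anom in zip(index_values, yield_anomalies) if anom < 0]
    return max(loss_years) if loss_years else None

# Invented example data: six years of index values and yield anomalies.
index_vals = [-1.8, -0.5, 0.3, -1.2, 1.0, -2.1]
anomalies = [-12.0, -1.5, 4.0, -6.0, 8.0, -15.0]
threshold = impacts_based_threshold(index_vals, anomalies)   # -0.5 for this data
```

A fixed nationwide cutoff of, say, -1.0 would have missed the loss year at -0.5; tying the threshold to observed local impacts is what lets the method flag milder index values that still translate into real losses in a given region.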

Identifying agricultural drought thresholds that are specific to Ohio is important, said Quiring. Because the impacts of drought vary from region to region, using the same drought thresholds in California as in Ohio is absurd, he said. Additionally, the types of drought that occur can differ: Ohio, for example, is particularly prone to “flash droughts” — shortages caused by warm weather that can develop quickly over a few days or weeks.

    “These rapid-onset droughts can be particularly challenging for the agricultural community because they arrive quickly and conditions can rapidly go from normal to drier than normal,” said Quiring. “All of a sudden soil moisture is depleted, the crops are stressed and yield losses and impacts on the ecosystem occur.”  

The last time severe drought caused major losses in the United States was in 2012, when a record-breaking heat wave resulted in $34.2 billion in economic losses, 123 direct deaths and a 26% decrease in total corn crop yield across the country.

    As large areas of the country dried out, Ohio’s corn yield dropped from about 160 bushels per acre to 120 bushels per acre within a year. While such considerable losses have not happened since, according to the State Climate Office of Ohio, some areas of the state have experienced abnormally dry drought conditions this year.  
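For context, those Ohio figures work out to a decline very close to the national one:

```python
before, after = 160, 120             # Ohio corn yield, bushels per acre, 2012 drought
decline = (before - after) / before  # 0.25, i.e. a 25% drop
# Close to the 26% national decrease in total corn crop yield that year.
```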

    What’s more, the researchers’ impacts-based method of drought monitoring also takes into account how climate change can worsen flash drought events.

    “One of the impacts that we found to be counterintuitive in Ohio is that with climate change, we do expect more rainfall overall, but we also expect to see more droughts because there are longer periods of time where no rain occurs,” said Quiring. 

    The results of this study suggest that following guidelines that aren’t specific to a region’s issues can end up either systematically underestimating the impacts of severe drought conditions in some locations or overestimating them in others, Quiring said. 

While it will be some time before Quiring’s team can get their research incorporated into the next edition of the state drought plan, the study emphasizes that its methods could easily be applied to regions beyond Ohio where long-term streamflow and crop-yield data are readily available. Ultimately, it could help improve drought monitoring worldwide and provide useful information to future agriculture producers and decision-makers, said Quiring.

    “This work is actually timely because it will provide a basis for decision-making in Ohio, rather than using research that’s been done in other parts of the country,” said Quiring. “Hopefully we can give better guidance to those who are making decisions on the ground.”

    This study was supported by the National Integrated Drought Information System (NIDIS). Co-authors were Ning Zhang and Zhiying Li, who were both at Ohio State when the study was conducted. Zhang is now at the University of California, Davis and Li is at Indiana University. 

    Ohio State University

  • Peak hurricane season is September, October: MSU experts can comment

    Peak hurricane season is September, October: MSU experts can comment

    Newswise — EAST LANSING, Mich. – Hurricanes Idalia and Lee have already packed a punch, but climatologists are now predicting more hurricanes this season, which doesn’t end until Nov. 30. Though previous projections suggested a milder hurricane season, we’re now on track for the eighth consecutive year of above-average activity. Michigan State University experts provide comments on the scientific, economic and government issues surrounding hurricanes.

Lifeng Luo is the director of MSU’s Environmental Science and Policy Program, as well as a professor in the Department of Geography, Environment and Spatial Sciences in the College of Social Science. Luo is an expert in the variability and predictability of climate, hydroclimatology and resource management, among other areas.

    Contact: [email protected]

“A number of factors are at play in the formation and intensification of tropical storms, and the most important one is the warm ocean. More specifically, the sea surface temperature needs to be at least 80 F or 26.5 C for storms to develop. As the ocean has absorbed a large amount of heat due to global warming, the sea surface temperature has been going up gradually over the last century. Trends can be stronger locally in some regions, such as the North Atlantic and Gulf of Mexico. Other factors include circulation patterns and modes of climate variability like El Niño. Additionally, La Niña tends to increase the number of tropical storms in the Atlantic basin due to reduced vertical wind shear. With three La Niña events in a row in the last three years, climate variability may also contribute to the fact that you see consecutive above-normal hurricane seasons.

    “In terms of natural disasters, Michigan is among the safest states in the US. The impact of Atlantic hurricanes here has been limited given how far we are from the east coast at this latitude and the typical storm tracks. We can still see rainfall (sometimes heavy) associated with a hurricane after it makes landfall and if it moves northward, but it can hardly produce torrential rainfall as typically seen in the rain bands of the hurricane.”
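As a quick unit check on the threshold Luo cites, 80 F converts to about 26.7 C, consistent with the commonly quoted 26.5 C figure:

```python
def fahrenheit_to_celsius(f):
    """Standard Fahrenheit-to-Celsius conversion."""
    return (f - 32) * 5 / 9

sst_threshold_c = fahrenheit_to_celsius(80)   # ~26.7 C
```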

    Mark Skidmore is the Morris Chair in State and Local Government Finance and Policy as well as the resident fellow at MSU Extension’s Center for Local Government Finance and Policy. Additionally, Skidmore is an economics professor in both the colleges of Social Science and Agriculture and Natural Resources. He is an expert in the relationship between government activities and economic development, including incentives, as well as the economics of natural disasters.

    Contact: [email protected]

    “According to the National Oceanic and Atmospheric Administration, or NOAA, the United States experienced 363 weather and climate disasters over the 1980-2023 period in which damages reached or exceeded $1 billion. Estimates indicate that these disasters resulted in $2.59 trillion in damages, of which roughly half is attributable to hurricanes and tropical storms, though there is significant variability in damages from storm to storm.

    “There are several tiers of support that help communities rebuild. As an immediate response, the priority is to provide access to basic needs such as food, water, shelter and fuel, and to restore electricity and communications. As core needs are met, authorities may focus on rebuilding damaged public infrastructure. Finally, resources flowing in from insurance, private savings and governments help households and businesses regain a foothold and reestablish operations. Longer term, it is often helpful to review weaknesses in infrastructure and preparations to reduce future vulnerability.

    “Federal government assistance sometimes weakens incentives for households, businesses and subnational governments to take disaster risk-reduction measures. Why engage in otherwise appropriate risk-reduction measures when federal assistance is available? For example, a property owner may be more inclined to build a vacation home on an exposed beach if it is known that the government will help pay for repairs. Thus, there is tension between providing a safety net for those exposed to disasters and increasing exposure to disasters.”

    Seven Mattes is an assistant professor at the Center for Integrative Studies in the College of Social Science. Mattes is an expert in disaster preparedness and multispecies resiliency, as well as animal studies.

    Contact: [email protected]

    “While hurricanes are a part of life for coastal residents, both the storms and the coastal populations have grown in recent decades, the storms in number and intensity and the populations in size. As anthropogenic climate change increases the number of storms and the human population grows in coastal regions, preparedness must continually adapt to these new conditions. Thus, while coastal states across the U.S. have improved their preparedness, numerous vulnerabilities remain. There are innumerable recommendations for improving hurricane preparedness in the U.S.

    • Strengthening those natural structures that have historically shielded the habitats of humans and nonhumans alike — wetlands, salt marshes, reefs, dunes, mangrove forests, etc. — is an effective means to improve resilience to hurricane impacts. Preserving and valuing natural structures protect against storm surges, flooding and other damaging forces while also supporting the wildlife that reside within.  
    • Improving existing infrastructure to withstand intensified impacts — especially in low-income communities — is urgently needed.  
    • Funding programs and incentives to educate and organize on the local level are essential — learning from, building on and sharing local knowledge ensures community preparedness.  
    • Addressing the preventable vulnerability that results from developing in hurricane-prone zones is essential; building homes and other structures in low-lying coastal areas drains resources at all stages of disaster preparedness.  
    • Including companion species in planning and policy, insofar as they impact human safety and decision-making, as the PETS Act did following Hurricane Katrina. Agricultural animals are especially vulnerable to hurricane impacts, as we saw with Hurricane Florence, when millions of chickens and thousands of hogs were killed in the resulting floods. Approaching disaster preparedness with an awareness of the broader multispecies communities in which we live can aid in building resiliency for all within.”

    Simone Theresa Peinkofer is an associate professor in the Department of Supply Chain Management in the Broad College of Business, and she also serves as the director of the college’s Logistics Doctoral Program. Peinkofer is an expert in retail supply chain management, consumer-based strategy in supply chain management and omnichannel fulfillment operations.

    Contact: [email protected]

    “Depending on the path of the hurricane, it can delay freight movement. For example, ports and airports might shut down for an extended period, and high winds and rainfall can make moving freight by train or truck unsafe or impossible. Hurricanes can also damage goods that are in transit or stored in a warehouse if the warehouse is in the path of the storm. Hence, hurricanes can lead to loss in revenue and potentially higher prices for businesses. Depending on the region of the world, these storms, whether called hurricanes, typhoons or cyclones, can impact global supply chains. For example, Vietnam’s typhoon season is year-round, so typhoons can also shut down key manufacturing plants and delay or damage international freight.

    “Companies should have a risk management plan in place that helps guide them through the disaster and especially through the recovery efforts. Additionally, companies in the path of a hurricane would want to closely monitor the situation and prepare accordingly by, for example, rerouting freight to a different port or airport. It’s important to act early on.”

    ###

    Michigan State University has been advancing the common good with uncommon will for more than 165 years. One of the world’s leading research universities, MSU pushes the boundaries of discovery to make a better, safer, healthier world for all while providing life-changing opportunities to a diverse and inclusive academic community through more than 400 programs of study in 17 degree-granting colleges.

    For MSU news on the Web, go to MSUToday. Follow MSU News on Twitter at twitter.com/MSUnews.

    Michigan State University

  • Arctic beavers boost methane emissions

    Newswise — The climate-driven advance of beavers into the Arctic tundra is causing the release of more methane — a greenhouse gas — into the atmosphere.

    Beavers, as everyone knows, like to make dams. Those dams cause flooding, which inundates vegetation and turns Arctic streams and creeks into a series of ponds. Those beaver ponds and surrounding inundated vegetation can be devoid of oxygen and rich with organic sediment, which releases methane as the material decays.

    Methane is also released when organics-rich permafrost thaws as the result of heat carried by the spreading water.

    A study linking Arctic beavers to an increase in the release of methane was published in July in Environmental Research Letters.

    The lead author is Jason Clark, a former postdoctoral fellow at the University of Alaska Fairbanks Geophysical Institute. Research Professor Ken Tape, also of the Geophysical Institute, was Clark’s adviser and is a co-author. Other co-authors include Benjamin Jones, a research assistant professor at the UAF Institute of Northern Engineering; and researchers from the National Park Service and NASA’s Jet Propulsion Laboratory.

    Tape has done extensive research about the northward migration of beavers and their resultant impact on the Arctic environment.

    “What we found is that there are lots of methane hotspots right next to ponds and they start to diminish as you go away from the pond,” he said.

    The new study is the first to link large numbers of new beaver ponds to methane emissions at the landscape scale. It suggests that beaver engineering in the Arctic will at least initially increase methane release. 

    “We say ‘initially’ because that’s the data we have,” Tape said. “What the longer-term implications are, we don’t know.” 

    As a greenhouse gas, methane is 25 times more potent than carbon dioxide at trapping heat in Earth’s atmosphere.

    It accounts for about 20 percent of global greenhouse gas emissions, according to the U.S. Environmental Protection Agency. The agency says human activities have more than doubled atmospheric methane concentrations in the past two centuries.
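    The 25-to-1 potency figure is what allows emissions of different gases to be compared on a common scale. A minimal sketch of that arithmetic, where only the factor of 25 comes from the text and the emission quantity is invented for illustration:

```python
# CO2-equivalent arithmetic using the 100-year global warming potential
# (GWP) cited above. The emission quantity below is invented for
# illustration; only the factor of 25 comes from the article.

GWP_CH4 = 25  # methane traps ~25x as much heat as CO2 over 100 years

def co2_equivalent(tonnes_ch4: float) -> float:
    """Express a methane emission in tonnes of CO2-equivalent."""
    return tonnes_ch4 * GWP_CH4

print(co2_equivalent(4.0))  # 4 t CH4 -> 100.0 t CO2e
```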

    The new research focused on 166 square miles of the lower Noatak River basin in Northwest Alaska. Data was obtained by airborne hyperspectral imaging through NASA’s Arctic-Boreal Vulnerability Experiment program. That program and the National Science Foundation funded the research.

    Hyperspectral cameras image an area in hundreds of wavelengths across the electromagnetic spectrum, including many not visible to the human eye. That differs from other cameras, which typically only image in the primary colors of red, green and blue.

    The researchers compared the location of methane hot spots to the locations of 118 beaver ponds and to a number of nearby unaffected stream reaches and lakes. They analyzed the area up to approximately 200 feet from the perimeter of each water body and found a “significantly greater” number of methane hot spots around beaver ponds.
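    The comparison described above is essentially a buffer analysis: count hot spots within a fixed distance of each water body, then compare ponds against controls. A minimal sketch under simplifying assumptions (circular water bodies, invented coordinates; not the study's data or code):

```python
# Toy buffer analysis: count methane hot spots within a fixed distance
# of a water body's perimeter. Coordinates are invented for illustration.
import math

BUFFER_FT = 200.0  # analysis distance beyond each water body's perimeter

def hotspots_within(center, radius_ft, hotspots, buffer_ft=BUFFER_FT):
    """Count hot spots within buffer_ft of a circular water body's edge."""
    cx, cy = center
    return sum(
        1
        for hx, hy in hotspots
        if math.hypot(hx - cx, hy - cy) <= radius_ft + buffer_ft
    )

# A pond of radius 100 ft at the origin and three candidate hot spots;
# two fall inside the 200-ft buffer, one is too far away.
hotspots = [(150.0, 0.0), (0.0, 280.0), (400.0, 400.0)]
print(hotspots_within((0.0, 0.0), 100.0, hotspots))  # 2
```

Repeating the same count around control lakes and stream reaches would then support the kind of pond-versus-control comparison the researchers made.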

    “We have these datasets that largely overlap, in space and mostly in time,” Tape said. “It’s kind of a simple design relying on a new tool.”

    Additional research about the relationship between beaver migration and Arctic methane release will occur next year.

    University of Alaska Fairbanks

  • Artificial Intelligence: a step change in climate modelling predictions for climate adaptation

    Newswise — Climate models today face the challenge of providing the high-resolution predictions, with quantified uncertainties, needed by a growing number of adaptation planners, from local decision-makers to the private sector, who require detailed assessments of the climate risks they may face locally.

    This calls for a step change in the accuracy and usability of climate predictions that, according to the authors of the paper “Harnessing AI and computing to advance climate modelling and prediction,” artificial intelligence can deliver. The Comment was published in Nature Climate Change by a group of leading international climate scientists, including CMCC Scientific Director Giulio Boccaletti and CMCC President Antonio Navarra.

    One proposed approach for a step change in climate modelling is to focus on global models with 1-km horizontal resolution. However, the authors explain, although kilometre-scale models have been referred to as ‘digital twins’ of Earth, they still have limitations and biases similar to current models. Moreover, given their high computational costs, they limit the size of simulation ensembles, which are needed both to calibrate the unavoidable empirical models of unresolved processes and to quantify uncertainties. Overall, kilometre-scale models do not offer the step change in accuracy that would justify accepting the limitations they impose.

    Rather than prioritizing kilometre-scale resolution, the authors propose a balanced approach focused on generating large ensembles of simulations at moderately high resolution (10–50 km, up from the roughly 100 km that is standard today) that capitalizes on advances in computing and AI to learn from data. By moderately increasing global resolution while extensively harnessing observational and simulated data, this approach is more likely to achieve the objective of climate modelling for risk assessment, namely minimizing model errors and quantifying uncertainties, and it enables wider adoption.

    A thousand simulations at 10-km resolution cost about the same as a single simulation at 1-km resolution. “Although we should push the resolution frontier as computer performance increases, climate modelling in the next decade needs to focus on resolutions in the 10–50 km range”, write the authors. “Importantly, climate models must be developed so that they can be used and improved on through rapid iteration in a globally inclusive and distributed research programme that does not concentrate resources in the few monolithic centres that would be needed if the focus is on kilometre-scale global modelling.”
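    The cost equivalence quoted above follows from how model cost scales with grid spacing: refining resolution tenfold multiplies the horizontal grid points by 100 and, via the time-step stability limit, the number of steps by roughly 10. A rough sketch of that arithmetic (an illustration of the standard scaling argument, not taken from the paper):

```python
# Rough cost-scaling arithmetic for global climate models: cost grows
# roughly as (reference spacing / grid spacing) ** 3 -- a factor of 100
# from the denser horizontal grid times a factor of 10 from the shorter
# time step required for numerical stability. Illustrative only.

def relative_cost(dx_km: float, dx_ref_km: float = 100.0) -> float:
    """Cost of a run at grid spacing dx_km relative to a dx_ref_km run."""
    return (dx_ref_km / dx_km) ** 3

print(relative_cost(10.0))         # 1000.0: one 10-km run vs. a 100-km run
print(1000 * relative_cost(10.0))  # an ensemble of 1,000 members at 10 km
print(relative_cost(1.0))          # 1000000.0: a single 1-km run
```

The last two numbers match, which is the equivalence the authors invoke: a 1,000-member ensemble at 10 km costs about as much as one 1-km simulation.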

    CMCC Foundation – Euro-Mediterranean Center on Climate Change

  • Marine plankton and ecosystems affected by climate change

    BYLINE: Naoki Namba

    Marine plankton plays an important role in the food chain, which is said to be undergoing a transformation due to climate change. Assistant Professor Kohei Matsuno of the Faculty of Fisheries Sciences spoke about how climate change is changing the distribution and ecology of marine plankton and what impact this will have on higher-trophic predators, including humans.

    The major role of microscopic plankton

    My research focuses on the effects of climate change on marine plankton in polar regions such as the Arctic and the Antarctic. Plankton are small, drifting creatures with weak swimming ability that are basically just swept along by the water. They cannot move against ocean currents like fish, so they are directly affected by changes in ocean currents and the environment caused by climate change. In other words, the effects of climate change on plankton are easily observable.

    Another important aspect is the ability of phytoplankton to synthesize organic matter from inorganic matter through photosynthesis. Phytoplankton are preyed upon by zooplankton, which are in turn preyed upon by fish, which are in turn eaten by higher predators such as birds. These small plankton support the marine ecosystem as primary producers. Beyond that, we humans also feed on marine products, so we, too, are connected to plankton.

    Climate change as reflected by plankton

    Arctic ice is now rapidly decreasing due to global warming. The reduction of sea ice and the inflow of melting water from glaciers should cause major changes in the marine environment, and we are investigating the effects of such changes on plankton. We actually go to the Arctic Ocean to collect plankton and continually study the number and types of plankton. Currently, satellite observations and mathematical modeling are being actively used for research, but field observations are an essential part of understanding the plankton in the ocean, and fieldwork is what I am best at.

    My research focuses on the Chukchi Sea, which is a part of the Arctic Ocean close to the Pacific Ocean. In this region, we know that warm water from the Pacific Ocean flows into the Arctic Ocean through the Bering Strait. This flow has strengthened in recent years, and we have found that there is a large influx of zooplankton from the Pacific Ocean into the Chukchi Sea. It has been shown that the increase in zooplankton due to the influx may increase the productivity of predators such as fish, and may also displace endemic plankton species that were originally present in the Arctic Ocean. Our research also showed that copepods, a type of zooplankton, brought from the Pacific Ocean, spawn and hatch in the Arctic Ocean. If the sea ice continues to decrease in the future and the influx of Pacific species increases and becomes established, this could drastically change the local ecosystem, including the impact on seals and polar bears.

    Untapped plankton research

    In fact, there are few studies on the relationship between climate change and plankton. Plankton research is difficult because you need to go to the field, and if you want to investigate the relationship with climate change, you need to accumulate data over a period of 20 or 30 years, which is time-consuming and very unglamorous research. That is why I find it interesting to work on it, because there are many things that no one has discovered yet.

    The situation has only recently become more data-intensive, but it is not easy to see the link with climate change because of the limited data available from the past. Conversely, if we don’t study the plankton in the Arctic and the Antarctic now, we won’t be able to make comparisons 10 or 20 years from now, so I think it is important that we do our research now.

    The consequences of earlier sea ice melting

    The northern Bering Sea, the gateway to the Arctic Ocean, is rich in plankton and is one of the world’s leading fishing grounds for snow crab, king crab and cod. This area is usually covered by sea ice from December to April each year; however, in spring 2018, the sea ice melted about a month earlier than usual, affecting a wide range of organisms. In particular, fish and birds died, and poorer nutritional status was reported. However, it remained a mystery why such biological effects occur when sea ice melts earlier.

    We were able to unravel part of this mystery based on surveys in the northern Bering Sea carried out in 2017 and 2018 on the Oshoro Maru, a training vessel of the School of Fisheries Sciences at Hokkaido University. In 2018, when the sea ice melted earlier, the onset of the phytoplankton bloom—a massive increase in phytoplankton—was delayed. Even if the sea ice melts earlier, a bloom is unlikely at that time of year because sunlight is inadequate and the water column is more prone to mixing. We found that the delayed bloom caused zooplankton, which prey on phytoplankton, to lay their eggs later, delaying their growth and resulting in smaller individuals. This in turn affected the growth of fish and birds. We were able to clarify this because we had been continuously collecting data on the Oshoro Maru. However, it is not yet known how these phenomena will affect the following year and beyond.

    Realizing how small your world is

    I have been interested in biology and environmental issues since I was in high school, but it wasn’t until I entered the fisheries department at university that I became interested in marine plankton. For example, I was surprised to see a plankton that is less than a millimeter long but has 20 legs; I guess you could call it ‘biological beauty’, a very reasonable and functional form. The more research I do, the more I realize that the forms of all living organisms have meaning, which I really enjoy. At the same time, as a researcher, I want to take as neutral a view of things as possible and be a voice for plankton and nature.

    Looking back on it now, I would like young people to get out and about. If you are interested in something, immerse yourself in it and actively communicate with seniors and researchers who are active in that field. This will make you feel closer to the world of experts, and conversely, you will realize how small your own world is. Through these experiences, I think you can also discover what you really like and what you want to do. Even if this is not connected to work, I think it will enrich your life.

    Column: The joys of fieldwork

    I like fieldwork—where I do research in the field. I enjoy the unexpected discoveries I make when I look at the plankton there, and the fact that these discoveries become the seeds for my next research project. But the preparation is hard work. Once you leave the port by boat, you have to make do with what you already have on board for two months, so I really think preparation is everything. Once we get there, we spend almost every waking moment sampling. That’s part of the fun.

    In addition, researchers from many different fields live and work together on the ship, so many ideas are born from such interactions and joint research can begin. The oceans are connected all over the world, so we cannot have closed discussions. Researchers from different countries can also conduct research together and put data together to get a fuller picture.

    Hokkaido University

  • Engineering of plant cell wall modifying enzymes opens new horizons

    Newswise — A newly discovered way of optimising plant enzymes through bioengineering has increased knowledge of how plant material can be converted into biofuels, biochemicals and other high-value products.

    The University of Adelaide-led study presents innovative ideas for how the walls of plant cells can be assembled, structured and remodelled by controlling specific enzymes’ catalytic function.

    Fundamental plant cell properties – such as structure, integrity, cytoskeletal organisation and stability – are now viewed differently, suggesting new alternatives.

    Studying the catalytic function of specific enzymes termed ‘xyloglucan xyloglucosyl transferases’ allowed researchers to better understand how they link diverse polysaccharides to form structural components of plant cell walls.

    “This work contributes to the essential knowledge of how xyloglucan xyloglucosyl transferases can be understood and their fundamental properties controlled – for example, to improve their catalytic rates and stability,” said project leader Professor Maria Hrmova.

    For plant material to be used in the production of biofuels, plant cell walls need to be deconstructed and the resultant materials chemically processed. The properties of the cell walls can be altered to be less rigid, therefore making biofuel production more efficient and cost-effective.

    The finding also has applications for the pharmaceutical industry, where enzymes are sought as environmentally friendly and cost-effective options in bioremediation and other applications.

    Bioremediation is the removal of contaminants, pollutants and toxins from the environment through the use of living organisms.

    “Although the definition of the catalytic function of xyloglucan xyloglucosyl transferases has significantly progressed during the past 15 years, there are limitations, and still a lack of information, in how this knowledge can be organically implemented in the functionality of plant cell walls,” she said.

    This teamwork builds on 60 years of chemical and biochemical research on xyloglucan by this and other research groups.

    The research team used sensitive high-performance liquid chromatography with fluorescent reagents to monitor complex biochemical reactions of polysaccharides in an efficient way.

    “We also applied 3D molecular modelling and molecular dynamics simulations to gain insights into the mode of action of these enzymes on fast time scales,” Professor Hrmova said.

    “Our findings are supported by plant and cellular biology approaches we used to offer novel ideas on the function of these enzymes in vivo.”

    The study was published in the prestigious Plant Journal and was conducted with an international, multidisciplinary team of researchers from the Institute of Chemistry of the Slovak Academy of Sciences and the Huaiyin Normal University in China.

    It also received funding support from the VEGA Scientific Grant Agency and the Australian Research Council.

    A visualisation of reactant movements in a plant xyloglucan transferase enzyme can be seen here.

    University of Adelaide

  • Pioneering research sheds surprising new light on evolution of plant kingdom

    Newswise — A new study has uncovered intriguing insights into the evolution of plant biology, effectively rewriting the history of how plants evolved over the past billion years.

    The research, published today in Nature Plants, shows plants have gradually developed their range of anatomical designs throughout the passage of time, punctuated by episodic bursts of innovation to overcome and adapt to environmental challenges.

    Such findings overturn the long-held belief that, much like animals, the fundamental range of plant types evolved in a big burst of sudden change early in their evolutionary history.

    Co-lead author Philip Donoghue, Professor of Palaeobiology at the University of Bristol, said: “Although plants are extraordinarily diverse in their design and organisation, they share a common ancestor which originated at sea more than a billion years ago.

    “We wanted to test whether they really evolved with a big bang early on in their history or whether their evolution was a slower and more continual process. Surprisingly, the results revealed plant evolution was a bit of a mix, with long periods of gradual change interrupted by short bursts of large-scale innovation, overcoming the challenges of living on dry land.”

    To test this theory the team of scientists analysed the similarities and differences of 248 groups of plants, ranging from single-celled pond scum and seaweed to land plants including everything from mosses and ferns, to pines, conifers and flowering plants. They also looked at 160 extinct groups known only from the fossil record, including species from the Devonian Rhynie Chert which lived more than 400 million years ago.

    More than 130,000 observations were generated by breaking plant designs down into their components and recording which were present or absent in each of the main groups, living and fossil. Computerised statistical techniques measured the overall similarities and differences between groups and how they have varied over time.
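    One common way to turn such presence/absence records into a measure of anatomical disparity is a pairwise dissimilarity score. A minimal sketch of the idea, with invented characters and scores (the study's actual character matrix and statistical methods are not reproduced here):

```python
# Toy disparity calculation: plant groups coded as presence/absence
# vectors of anatomical characters, compared by the fraction of
# characters on which they differ. All scores below are invented.

def hamming_dissimilarity(a, b):
    """Fraction of recorded characters on which two groups differ."""
    assert len(a) == len(b)
    return sum(x != y for x, y in zip(a, b)) / len(a)

# Hypothetical scores for five characters (e.g. spores, seeds,
# roots, flowers, swimming gametes) in three groups.
moss      = [1, 0, 0, 0, 1]
fern      = [1, 0, 1, 0, 1]
flowering = [1, 1, 1, 1, 0]

print(hamming_dissimilarity(moss, fern))       # 0.2
print(hamming_dissimilarity(moss, flowering))  # 0.8
```

Computed over all living and fossil groups, such a matrix of dissimilarities is what lets overall similarity, and its change through time, be measured statistically.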

    The scientists also tried to work out what led to these evolutionary innovations, like the introduction of spores, seeds, roots, leaves, pollen and flowers.   

    Co-lead author Dr James Clark, Research Associate in Biological Sciences at the University of Bristol, said: “We found changes in plant anatomical design occur in association with events in which the entire cellular genetic make-up was doubled. This has happened many times in plant evolutionary history, as a result of errors in the genome-copying process, creating duplicate copies of genes that are free to mutate and evolve new functions.”

    But the major pulses of plant anatomical evolution were found to be associated with the challenge of living and reproducing in increasingly dry environments, connected to the progressive emergence of plants from sea on to land.

    Co-lead author Dr Sandy Hetherington became fascinated with the evolution of land plants as a budding geologist at the University of Bristol; his work now continues at the University of Edinburgh.

    He said: “Overall the pattern of episodic pulses in the evolution of plant anatomical designs matches that seen in other multi-cellular kingdoms of complex life, like animals and fungi. This suggests it is a general pattern and blueprint for complex multicellular life from its inception.”

    Paper

    ‘Evolution of phenotypic disparity in the plant kingdom’ by James W. Clark et al in Nature Plants

    Notes to editors

    Professor Philip Donoghue, Dr James Clark and Dr Sandy Hetherington are available for interview and advance copies of the embargoed paper can be requested. Please contact Victoria Tagg, Media & PR Manager (Research) at the University of Bristol: [email protected]

    Images

    https://fluff.bris.ac.uk/fluff/u2/oc20541/_mcB3ejZQJjOMTnnuN0oqgELk/

    Caption: The moss, Polytrichum commune, which is one of the closest living relatives of the ancestral land plant

    Credit: Silvia Pressel, The Natural History Museum

    https://fluff.bris.ac.uk/fluff/u3/oc20541/y_2cDGSW92fm1yF6LDdirgELg/

    Caption: The evolution of plant anatomical variety. Each dot represents a living or fossil species and the connecting lines reflect their evolutionary relationships, branching from a universal ancestor (bottom left) to the most recently evolved group, the flowering plants (bottom right).

    Credit: James Clark and colleagues, University of Bristol, UK

    https://fluff.bris.ac.uk/fluff/u3/oc20541/QllcweUjKzmlC4sFggkdVwELV/

    Caption: A diverse community of land plants, ranging from mosses to flowering species, grow together in boggy stream in the Cairngorms National Park, Scotland.

    Credit: Sandy Hetherington, The University of Edinburgh, UK

    University of Bristol

  • Greening cities cuts carbon

    Newswise — Dozens of European cities could reach net zero carbon emissions over the next 10 years by incorporating nature into their infrastructure, according to a new study.

    Published recently in the journal, Nature Climate Change, the analysis shows the ways cities can orchestrate a wide range of green solutions like parks, streetscaping and roof gardens to not only capture carbon emissions, but help reduce them.

    The study was undertaken by researchers from Sweden, the U.S. and China. It recommends the most effective approaches for natural carbon sequestration in 54 cities in the EU. And it shows how blending these steps with other climate actions can enable cities to reach net-zero carbon and actually reduce emissions by an average of 17.4 percent.

    Zahra Kalantari, an associate professor in Water and Environmental Engineering at KTH Royal Institute of Technology, says the researchers focused on the indirect ways that so-called “nature-based solutions” can contribute to carbon neutrality.

    “Nature-based solutions not only offset a proportion of a city’s emissions, but can contribute to reduction in emissions and resource consumption too,” Kalantari says.

    The results are based on integrating data from previous studies on the effects of nature-based solutions. These include urban farming; permeable pavements, which enable rainwater to be absorbed into the ground; narrower roads with more greenery and trees; wildlife habitat preservation; and more agreeable environments for walking and bicycling.

    For example, urban parks, greenspace and trees promote more walking, bicycling and other environmentally positive habits that replace automobile driving. Combined with other solutions like green infrastructure, these measures can also improve urban microclimates by buffering heat and cold, and as a result reduce energy use in buildings.

    The study also provides guidance on which measures should be prioritized and where to locate them for the best effect, Kalantari says. For example, in Berlin the study recommends prioritizing green buildings and urban green spaces, which could reduce emissions by 6 percent for residences, 13 percent in industry and 14 percent in transportation.

    “There are many studies that examine the effects of individual nature-based solutions, but this merges all of them and analyzes the potential systemic effect,” she says. “That’s new.”

    The study was a collaboration by researchers from KTH Royal Institute of Technology in Stockholm, MIT, Stockholm University, University of Gävle, Linköping University, Royal Swedish Academy of Sciences and Shanghai Jiao Tong University.

    Kungliga Tekniska Hogskolan (KTH) [Royal Institute of Technology]

  • Paleoclimate Lab Researchers Use National Science Foundation Support to Study Climate Change Past

    Newswise — ALBANY, N.Y. (Aug. 31, 2023) — Last summer, the University at Albany’s Paleoclimate Lab opened its doors, offering a new way to analyze samples of natural materials, such as coral and lake sediment, to help reconstruct Earth’s climate history.

    Now, through nearly $800,000 in new support from the National Science Foundation (NSF) this summer, lab researchers are focused on South Asia and the Middle East.

    Aubrey Hillman, an assistant professor in UAlbany’s Department of Atmospheric and Environmental Sciences (DAES), was awarded $417,242 from the NSF for a collaborative research project to create a 50,000-year continuous record of the Indian summer monsoon by analyzing lake sediment collected from Loktak Lake in Northeast India.

    Sujata Murty, a DAES assistant professor, was awarded $339,771 from the NSF to lead another collaborative research project that aims to reconstruct Red Sea surface hydrology since the 1700s by analyzing coral cores along its eastern edge.

    Both projects are now active and will run through the summer of 2026.

    “The NSF Paleoclimate program is highly competitive; therefore, it is notable that both of these projects were funded,” said Ryan Torn, DAES chair and professor. “Aubrey and Sujata’s work will provide greater insight into Earth’s past climate and offer new research opportunities for both undergraduate and graduate students.”

    Changes in the Indian Summer Monsoon

    The Indian summer monsoon typically lasts from June to September, with much of India, along with other parts of South Asia, receiving a significant amount of its total annual precipitation during this period.

    Hillman’s new NSF project proposes to create new paleoclimate records from Loktak Lake that will provide insight into the causes and consequences of abrupt changes in Indian summer monsoon rainfall over the last 50,000 years.

    To do so, Hillman and her research team, which includes collaborators at the University of Pittsburgh, Manipur University in India and Washington University in St. Louis, are using the Paleoclimate Lab to analyze lake sediments collected through the project.

    In 2018, the research team traveled to Loktak Lake to start the collection process, using a UWITEC coring device that lowers a long tube into the lakebed and fills it with sediment. The recovered cores are then brought back to the lab, preserved and analyzed.

    The team plans to return within the next year to collect a total of 30 meters of lake sediment.
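    As a back-of-the-envelope illustration (not a figure from the article): 30 meters of sediment spanning a 50,000-year record implies an average sedimentation rate of roughly 0.6 millimeters per year, the kind of depth-to-age scaling that paleoclimate labs refine with dating methods such as radiocarbon.

    ```python
    # Illustrative only: average sedimentation rate implied by the figures above.
    core_length_m = 30       # total lake sediment the team plans to collect
    record_years = 50_000    # target span of the monsoon record

    rate_mm_per_year = core_length_m * 1000 / record_years
    print(f"Average sedimentation rate: {rate_mm_per_year} mm/year")  # 0.6 mm/year
    ```

    In practice sedimentation rates vary through a core, which is why age models are anchored with independent dates rather than a single average.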

    “The lake sediments will offer us new data to analyze changes in the Indian summer monsoon season over tens of thousands of years,” said Hillman. “There are few records that currently exist at this long of a scale.

    “We believe our findings will offer new insight into the timing, direction, magnitude, and rate of changes in the Indian monsoon season through history, all of which are important to the more than one billion people who rely on it to deliver water and support agriculture,” she added.

    Following the sample collections, the research team plans to hold a series of public engagement workshops with colleagues in India regarding topics such as lake water balance, paleoclimate and monsoons. The project is also supporting graduate student researchers from partnering institutions.

    Climate of the Red Sea

    Our oceans play a critical role in influencing regional and global climate by absorbing much of the solar energy that reaches Earth and releasing heat back into the atmosphere.

    While there’s significant research around the climate history of the Atlantic and Pacific Oceans, the Indian Ocean, the third largest of the world’s five modern oceans, is much less understood.

    Murty’s NSF research project, which includes collaborators from the Woods Hole Oceanographic Institution and Union College, will focus on analyzing coral samples to determine how climate variability over the last 300 years has impacted ocean circulation in the Red Sea, a marginal sea of the Indian Ocean.

    “The Indian Ocean is one of the most under-observed tropical ocean regions in the world,” Murty said. “We do not have a strong understanding of past Indian Ocean climate or ocean circulation patterns, so I’ve been slowly moving my research over to this area, beginning with the marginal seas, such as the Red Sea.”

    “Our research findings will lead to improved understanding of Red Sea hydrographic variability and interactions with regional climate, aiding in climate and ocean circulation prediction efforts in the region,” she added.

    Corals have annual growth layers, similar to tree rings, that can offer valuable information on how environmental conditions have changed over time and provide insight for future climate modeling.
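    To illustrate the band-counting idea (a sketch with an assumed growth rate, not a figure from the article): massive reef-building corals often add skeleton on the order of a centimeter per year, so a 300-year record like the one targeted here would correspond to roughly 3 meters of core.

    ```python
    # Illustrative only: relating annual coral growth bands to record length.
    # The growth rate is an assumed typical value, not from the article.
    growth_cm_per_year = 1.0   # assumed annual band thickness
    record_years = 300         # period targeted by the Red Sea study

    core_length_m = growth_cm_per_year * record_years / 100
    print(f"Approximate core length: {core_length_m} m")  # 3.0 m
    ```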

    Oceanographers like Murty scuba dive in the ocean and drill cores from massive boulder corals, taking care not to harm them. The samples for the new research were collected prior to this project and are now in the Paleoclimate Lab. 

    Along with analyzing the corals, project researchers also plan to participate in art-science outreach initiatives such as Synergy II, a collaborative project between Art League RI and the Woods Hole Oceanographic Institution that offers a unique opportunity to share ocean science research through artistic expression.

    The NSF funding also supports graduate and undergraduate students assisting with the coral analysis.


    University at Albany, State University of New York
