ReportWire

Tag: Climate Science

  • Direct air capture technology licensed to Knoxville-based Holocene

    Newswise — An innovative and sustainable chemistry developed at the Department of Energy’s Oak Ridge National Laboratory for capturing carbon dioxide from air has been licensed to Holocene, a Knoxville-based startup focused on designing and building plants that remove carbon dioxide from atmospheric air.

    “ORNL is tackling climate change by developing numerous technologies that reduce or eliminate emissions,” said Susan Hubbard, ORNL deputy for science and technology. “But with billions of tons of carbon dioxide already in the air, we must capture carbon dioxide from the atmosphere to slow and reverse the effects of climate change.”

    “Direct air capture allows us to collect legacy emissions,” said Radu Custelcean, a scientist in ORNL’s Chemical Sciences Division and inventor of the licensed technology. “Our technology is one of the few approaches that can do that. It offers a new, energy-efficient approach to removing CO2 directly from air.”

    In direct air capture, a large fan pulls air through a contacting chamber where the air interacts with chemical compounds that filter and capture carbon dioxide. The CO2 can then be released from the capture material and stored deep underground.
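
    To get a sense of the scale such plants must operate at, here is a rough back-of-envelope calculation (not from the article; the ~420 ppm atmospheric CO2 concentration, the CO2 density, and the idealized assumption of complete capture are all assumptions for illustration):

```python
# Back-of-envelope, illustrative only: volume of air a DAC plant must
# process to capture one metric ton of CO2, assuming ~420 ppm CO2 by
# volume and idealized 100% capture efficiency.
CO2_PPM_BY_VOLUME = 420e-6        # assumed atmospheric concentration
CO2_DENSITY_KG_M3 = 1.84          # approx. density of pure CO2 at 20 C, 1 atm
TARGET_KG = 1000.0                # one metric ton

co2_kg_per_m3_air = CO2_PPM_BY_VOLUME * CO2_DENSITY_KG_M3
air_volume_m3 = TARGET_KG / co2_kg_per_m3_air
print(f"air processed per ton of CO2: {air_volume_m3:,.0f} m^3")
```

    The answer comes out above a million cubic metres of air per ton of CO2, which is why the fans and contacting chambers dominate the engineering of these plants.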

    Holocene’s founder and chief executive officer Anca Timofte said there are several chemical approaches to direct air capture, or DAC, each with benefits and drawbacks.

    “ORNL’s chemistry combines the best features of existing approaches to DAC to create a water-based, low-temperature process,” she said.

    Custelcean’s process uses an aqueous solution containing ORNL-discovered receptors called bis-iminoguanidines, or BIGs, to absorb carbon dioxide. As the CO2 is absorbed, the BIGs turn into an insoluble crystalline salt that can easily be separated from the liquid solution. Custelcean and his research team discovered this chemistry by chance while conducting fundamental crystallization experiments. The resulting Bis-Iminoguanidine Negative Emission Technology, or BIG-NET, received an R&D 100 Award in 2021.

    The BIGs discovery propelled Custelcean’s research in a new direction.

    “Doing basic research under DOE’s Basic Energy Sciences program, I have the flexibility to change direction if I find something interesting,” Custelcean said. “The basic research allows us to better understand all the elementary reactions and processes involved. But through licensing, we get to see a progression with our partners in the development of the technology. We’re involved in the full spectrum of research.”

    Timofte, originally from Romania, has a background in chemical engineering and worked at one of the world’s first direct air capture companies, Switzerland-based Climeworks. She contributed to the design of the company’s largest plant, which is in Iceland. With a growing interest in the market and finance aspects of carbon capture, she left Climeworks to enroll in the Master of Business Administration program at Stanford University to focus on climate technology and entrepreneurship.

    Timofte avidly followed the published literature around carbon capture. Custelcean’s publications caught her eye — she recognized the name as being Romanian — and she saw how his chemistry could address the major hurdles of the two established direct air capture processes.

    “The more I learned about his research, the more I saw the potential and the more I wanted to start my own company to pursue it,” she said. “With the encouragement of my professors, I founded Holocene and licensed the technology so I could work on it in a lab and think more about commercialization.”

    With Holocene established and the ORNL technology licensed, Timofte is further developing her business plans through Innovation Crossroads, a DOE Lab-Embedded Entrepreneurship Program funded by DOE’s Advanced Materials and Manufacturing Technologies Office, Building Technologies Office and the Tennessee Valley Authority.

    “When you’re in the position of starting a new company, having a group of mentors like the ones at Innovation Crossroads and the ability to work with ORNL is very appealing,” Timofte said. “I was happy to get into the program. It helps with the normal challenges that all startups have, but also very importantly, it connects us with the local ecosystem in Knoxville and gives us access to the scientists who developed the chemistry. We can work together and transfer knowledge — we can learn more about how the licensed technology works, work on features, troubleshoot issues, de-risk and optimize the chemistry. It’s a nice continuation of the collaboration.”

    Innovation Crossroads provides Holocene with a two-year cooperative research and development agreement to continue working with Custelcean and ORNL. Through this partnership, Holocene staff learn more about the science behind the technology, troubleshoot issues in testing and scale-up, and connect with mentors at the lab and in the community.

    “Holocene is a great example of how the interconnected climate tech ecosystem can support a new company through the stages of development,” said Dan Miller, Innovation Crossroads program lead.

    Timofte is a Breakthrough Energy Fellow. The fellowship program, launched by Breakthrough Energy, the organization founded by Bill Gates, focuses on accelerating innovation in sustainable energy and other technologies to reach net-zero emissions by 2050. Holocene is also part of the Spark Incubator Program, an entrepreneurial support program at the University of Tennessee Research Park’s Spark Innovation Center.

    Next up, Holocene and ORNL will conduct bench-scale testing funded by DOE’s Office of Fossil Energy and Carbon Management with the aim of using ORNL’s chemistry to further develop and deploy direct air capture at a commercial scale.

    ORNL senior commercialization manager Alex DeTrana negotiated the terms of the license.

    The invention development team includes ORNL’s Costas Tsouris, Gyoung Gug Jang and Diana Stamberga. Charles Seipp and Neil Williams, formerly of ORNL, also participated. Read more about Custelcean’s carbon-removal research work.

    UT-Battelle manages ORNL for the Department of Energy’s Office of Science, the single largest supporter of basic research in the physical sciences in the United States. The Office of Science is working to address some of the most pressing challenges of our time. For more information, please visit energy.gov/science.

    Oak Ridge National Laboratory

  • China experiences scorching heatwaves and droughts in 2022

    Newswise — Weather and climate are important factors affecting economic and social development. In China, the country’s National Climate Center releases an annual climate report that comprehensively covers China’s achievements and progress that year in climate system monitoring, climate impact assessment, and other aspects. This series of reports has been published in Atmospheric and Oceanic Science Letters for five consecutive years since 2019, and the “State of China’s climate in 2022” is now available.

    This year’s report provides a comprehensive summary of the main climate characteristics and high-impact weather and climate events in China in 2022. As introduced by Director Li Wei of the Climate Service Office of the National Climate Center, China’s overall climate in 2022 was worse than normal: warm and dry, with the second-highest annual mean temperature on record. Annual precipitation was the lowest recorded since 2012, the numbers of hot days and extreme high-temperature events were the highest on record, and the national average number of rainy days was the lowest. Precipitation in summer and autumn was below normal, with average summer precipitation the second lowest since 1961. In summer, Northeast and North China received more rainfall than usual during the flood season, while the Yangtze River Basin received less, resulting in extreme heatwaves and severe droughts there.

    In 2022, drought developed in distinct regional stages, with southern China heavily affected by long droughts in summer and autumn. Rainstorms occurred frequently, especially over the Pearl River and Songliao River basins, causing severe flooding in South China and Northeast China. In summer, the strongest heatwave since 1961 occurred in central and eastern China. Persistent cold, rainy, snowy, and sunless weather affected southern China in February, and a strong cold wave from late November to early December brought severe cooling over a large area. Sandstorms appeared less frequently and later than normal, and landfalling typhoons were far less frequent than usual.

    Institute of Atmospheric Physics, Chinese Academy of Sciences

  • Expert available to discuss new report that puts globe on course for breaching benchmark high temperature

    Newswise — A new report from the World Meteorological Organization (WMO) shows that the world’s average temperature could breach the benchmark of 1.5 degrees Celsius of warming compared with pre-Industrial Revolution levels.

    News reports call the WMO announcement a critical warning that the world is approaching a key average-temperature limit in the face of climate change. Researchers indicate the threshold could be broken as early as 2027. A caveat: the breach will likely be only temporary. Nonetheless, as temperatures rise, ice in Antarctica and elsewhere melts, making rises in sea levels all but certain. The problem will be further complicated by sinking coastal lands, such as those around Chesapeake Bay in Virginia.

    Virginia Tech geophysicist and environmental security expert Manoochehr Shirzaei studies climate change and uses publicly available satellite imagery to build maps of millions of instances of rising sea levels and coastal land subsidence.

    “Sea level rise and land subsidence increase the hazards associated with hurricanes, storm surges, shoreline erosion, and inundation of low-lying coastal areas, where the high density of population and assets amplifies the region’s exposure to hazards,” Shirzaei said. He explains that land subsidence can also affect the integrity of coastal structures and increase the likelihood of failure.

    Shirzaei says the solution varies from place to place based on the individual situation. It may involve upgrading protective infrastructure (e.g., dams), raising land, maintaining and restoring nature-based protection (e.g., wetlands), controlling subsidence, improving flood resiliency, selectively relocating important infrastructure, or installing flood warning systems.

    About Shirzaei

    Manoochehr Shirzaei is an associate professor and geophysicist in the Department of Geosciences, part of the Virginia Tech College of Science, and director of the Earth Observation Lab at Virginia Tech. His research recently has focused on promoting environmental security by quantifying the impact of the human system and climate change on the availability of water and energy resources in the U.S. He is an affiliated member of the Virginia Tech Global Change Center. Shirzaei has been quoted in WIRED, WHRO (NPR Norfolk), Coastal News Today, Smart Water Magazine and others.

    Virginia Tech

  • Western Drought Forecasts: High-Resolution Coming Soon

    Newswise — A new computer modeling technique developed by scientists at the National Center for Atmospheric Research (NCAR) offers the potential to generate months-ahead summertime drought forecasts across the Western United States with the capability of differentiating between dry conditions at locations just a couple of miles apart.

    The technique uses statistical methods and machine learning to analyze key drought indicators during the winter and spring and correlate them with the likelihood of dryness throughout the landscape the following summer. The scientists say this approach, if adapted for use by forecasters, could provide important information for such priorities as management of water resources, wildland fire and fuels, and agriculture.

    “This approach forecasts drought conditions before they have the largest impact,” said NCAR scientist Ronnie Abolafia-Rosenzweig, the lead author of a new paper describing the technique. “It gives managers an additional tool that they can use to prepare and guide the decisions they are making.”

    Abolafia-Rosenzweig and his co-authors found that predictions issued one to three months in advance could correctly identify the occurrence of summer drought in about 81-94% of cases at a resolution of 4 kilometers (2.5 miles) across the rugged and often parched western third of the United States. The predictions proved most accurate in regions of persistent drought, showing how upcoming dry conditions may vary from a cultivated field to a nearby mountainside or forested area. In regions where dry spells were punctuated by periods of heavy summer precipitation, however, the predictions proved less accurate.

    The scientists detailed their findings in a recent article in Water Resources Research, a journal published by the American Geophysical Union. The research was funded by NOAA, the U.S. Geological Survey, and the U.S. National Science Foundation, which sponsors NCAR.

    Strengthening societal resilience

    Droughts can have devastating health and economic impacts, costing the United States at least $249 billion since 1980 and setting the stage for widespread fires. In the West, the period from 2000-2021 was the driest 22-year stretch since at least the year 800, according to tree ring data. In 2021 alone, the drought and associated heat waves led to hundreds of deaths in the region.

    To strengthen societal resilience, scientists are working to improve computer modeling techniques that produce months-ahead predictions of drought. Current drought forecasts, however, have a relatively coarse resolution of, at best, about 10 kilometers, which does not adequately capture the varying degrees of drying across different landscape features in the West.

    But a new dataset that NCAR scientists recently produced in collaboration with the U.S. Geological Survey helped open the way for more detailed drought forecasts. The dataset is named CONUS404 because it contains simulations of hydrological and climate conditions at 4-kilometer resolution across the continental United States (or CONUS) over the past 40-plus years. Abolafia-Rosenzweig and his co-authors also drew on an equally high-resolution U.S. Department of Agriculture dataset, known as PRISM (Parameter elevation Regression on Independent Slopes Model) for meteorological observations.

    These datasets enabled the scientists to identify complex relationships, at a 4-kilometer resolution, between climate and drought conditions in late fall and winter and the extent of drying during the following summer. To identify these relationships, they used machine learning techniques that trained specialized statistical models.

    The scientists focused on pre-summer climate variables such as temperature, precipitation, and humidity, as well as distant ocean-atmosphere patterns, such as the Pacific Decadal Oscillation, that have far-reaching effects on climate. They found that commonly used drought measures, the Palmer Drought Severity Index and soil moisture percentiles, persist strongly from winter and spring into the summer, making pre-summer drought severity an especially important predictor of summer drought conditions.
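
    The paper’s actual models are not reproduced in the article. As a toy illustration of the general approach it describes, training a statistical model on pre-summer indicators to predict whether summer drought occurs, here is a minimal sketch on synthetic data; the variables, coefficients, and sample size are all invented and stand in for the study’s real predictors:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 500

# Hypothetical, synthetic "grid cells": standardized winter indicators
# standing in for the predictors named in the article (pre-summer drought
# severity and soil moisture). None of these numbers come from the study.
winter_pdsi = rng.normal(0.0, 1.0, n)   # negative = drier winter
winter_soil = rng.normal(0.0, 1.0, n)   # standardized soil moisture
# Persistence assumption: drier pre-summer conditions raise drought odds.
true_logit = -1.5 * winter_pdsi - 1.0 * winter_soil
summer_drought = rng.uniform(size=n) < 1 / (1 + np.exp(-true_logit))

# Train a plain logistic-regression "statistical model" by gradient descent
# to map pre-summer indicators to summer drought occurrence.
X = np.column_stack([np.ones(n), winter_pdsi, winter_soil])
y = summer_drought.astype(float)
w = np.zeros(3)
for _ in range(3000):
    p = 1 / (1 + np.exp(-X @ w))
    w -= 0.5 * X.T @ (p - y) / n

hit_rate = (((1 / (1 + np.exp(-X @ w))) > 0.5) == summer_drought).mean()
print(f"in-sample hit rate: {hit_rate:.2f}")
```

    On this synthetic data the fitted model recovers the persistence signal (a negative coefficient on the winter drought indicator) and classifies most cells correctly, echoing, at cartoon scale, the hit rates the study reports.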

    Abolafia-Rosenzweig said the drought forecasting method can augment a fire prediction technique that he and his co-authors had developed last year. Combining the drought and fire models offers the potential for a very detailed look at fire hazard across the West.

    “The West is in a very unique period in terms of both drought and fire with records being broken that go back thousands of years,” he said. “The climate projections are showing drought conditions will continue to intensify in the future. Having tools that can better inform management is becoming increasingly important.”

    This material is based upon work supported by the National Center for Atmospheric Research, a major facility sponsored by the National Science Foundation and managed by the University Corporation for Atmospheric Research. Any opinions, findings and conclusions or recommendations expressed in this material do not necessarily reflect the views of the National Science Foundation.

    National Center for Atmospheric Research (NCAR)

  • North-West Europe’s scorching days heat up 2x faster

    Newswise — New research led by the University of Oxford has found that climate change is causing the hottest days in North-West Europe to warm at double the rate of average summer days. The difference in trends is most pronounced for England, Wales, and Northern France. Worryingly, while current climate models accurately predict the rate of warming for average days, they underestimate the rate at which the hottest days are warming compared to observations.

    According to lead researcher Dr Matthew Patterson, from the University of Oxford’s Department of Physics, the results indicate that extreme heat events – such as the UK’s record-breaking heatwave last summer – are likely to become more regular. Dr Patterson said: ‘These findings underline the fact that the UK and neighbouring countries are already experiencing the effects of climate change, and that last year’s heatwave was not a fluke. Policy makers urgently need to adapt their infrastructure and health systems to cope with the impacts of higher temperatures.’

    For the study, published today in Geophysical Research Letters, Dr Patterson analysed data from the past 60 years (1960-2021) recording the maximum daily temperature, provided by the European Centre for Medium-Range Weather Forecasts.

    Although the maximum recorded temperature varied between years, the overall trend clearly showed that the hottest days for North-West Europe had warmed at twice the rate of average summer days. For England and Wales, the average summer day increased by approximately 0.26°C per decade, whilst the hottest day increased by around 0.58°C per decade. However, this faster warming of the hottest days was not observed to this extent elsewhere in the Northern Hemisphere.
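
    Trends like these are ordinary least-squares slopes of a yearly temperature series, scaled to degrees per decade. The sketch below illustrates the calculation on synthetic series built to have roughly the trends reported for England and Wales; the baselines and noise levels are invented for illustration and are not the observational data used in the study:

```python
import numpy as np

def trend_per_decade(years, temps):
    """Least-squares linear trend of a temperature series, in degrees C per decade."""
    slope_per_year = np.polyfit(years, temps, 1)[0]
    return 10.0 * slope_per_year

# Synthetic stand-ins with the reported trends baked in:
# ~0.26 C/decade for mean summer days, ~0.58 C/decade for the hottest day.
years = np.arange(1960, 2022)
rng = np.random.default_rng(1)
mean_summer = 16.0 + 0.026 * (years - 1960) + rng.normal(0, 0.3, years.size)
hottest_day = 29.0 + 0.058 * (years - 1960) + rng.normal(0, 0.8, years.size)

print(f"mean summer days: {trend_per_decade(years, mean_summer):+.2f} C/decade")
print(f"hottest days:     {trend_per_decade(years, hottest_day):+.2f} C/decade")
```

    Even with year-to-year noise, the fitted slope of the hottest-day series comes out roughly twice that of the mean-summer series, which is the comparison at the heart of the study.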

    The reason for this faster warming of the hottest days relative to average summer days is not yet fully understood. According to Dr Patterson, it may be that the hottest summer days in North-West Europe are often linked to hot air transported north from over Spain. Because Spain is warming faster than North-West Europe, air carried in from that region is ever more extreme relative to the ambient air in North-West Europe. The hottest days of 2022, for instance, were driven by a plume of hot air carried north from Spain and the Sahara. However, further research is needed to verify this.

    Dr Patterson added: ‘Understanding the warming rate of the hottest days will be important if we are to improve climate model simulation of extreme events and make accurate predictions about the future intensity of such events. If our models underestimate the rise in extreme temperatures over the coming decades, we will underestimate the impacts this will have.’

    Extreme heat has significant negative impacts on many different aspects of society, including energy and transport infrastructure, and agriculture. It also exacerbates conditions including respiratory and cardiovascular diseases, putting a strain on health services.

    The current UK Government has been criticised by the Climate Change Committee (CCC) for failing to act quickly enough to adapt for the impacts of global heating. These new findings add even more urgency for policy makers to adapt infrastructure and systems vulnerable to extreme heat.

    Notes to editors:

    For media requests and interviews, contact Dr Matthew Patterson, Department of Physics, University of Oxford: [email protected]

    The study ‘North-West Europe hottest days are warming twice as fast as mean summer days’ will be published in Geophysical Research Letters, DOI 10.1029/2023GL102757. This link will go live when the embargo lifts. To view a pre-embargo copy of the study, contact Dr Matthew Patterson, Department of Physics, University of Oxford: [email protected]

    About the University of Oxford

    Oxford University has been placed number 1 in the Times Higher Education World University Rankings for the seventh year running, and number 2 in the QS World Rankings 2022. At the heart of this success are the twin pillars of our ground-breaking research and innovation and our distinctive educational offer.

    Oxford is world-famous for research and teaching excellence and home to some of the most talented people from across the globe. Our work helps the lives of millions, solving real-world problems through a huge network of partnerships and collaborations. The breadth and interdisciplinary nature of our research alongside our personalised approach to teaching sparks imaginative and inventive insights and solutions.

    Through its research commercialisation arm, Oxford University Innovation, Oxford is the highest university patent filer in the UK and is ranked first in the UK for university spinouts, having created more than 200 new companies since 1988. Over a third of these companies have been created in the past three years. The university is a catalyst for prosperity in Oxfordshire and the United Kingdom, contributing £15.7 billion to the UK economy in 2018/19, and supports more than 28,000 full time jobs.

    University of Oxford

  • Immigration experts on Title 42, analysis of immigration policies, and other migrant news in the Immigration Channel

    Title 42, the United States pandemic rule that had been used to immediately deport hundreds of thousands of migrants who crossed the border illegally over the last three years, has expired. Those migrants will have the opportunity to apply for asylum. President Biden’s new rules to replace Title 42 are facing legal challenges. The US Homeland Security Department announced a rule to make it extremely difficult for anyone who travels through another country, like Mexico, to qualify for asylum. Border crossings have already risen sharply, as many migrants attempted to cross before the measure expired on Thursday night. Some have said they worry about tighter controls and uncertainty ahead. Immigration is once again a major focus of the media as we examine the humanitarian, political, and public health issues migrants must face. 

    Below are some of the latest headlines in the Immigration channel on Newswise.

    Expert Commentary

    Experts Available on Ending of Title 42

    George Washington University Experts on End of Title 42

    ‘No one wins when immigrants cannot readily access healthcare’

    URI professor discusses worsening child labor in the United States

    Biden ‘between a rock and a hard place’ on immigration

    University of Notre Dame Expert Available to Comment on House Bill Regarding Immigration Legislation, Border Safety and Security Act

    American University Experts Available to Discuss President Biden’s Visit to U.S.-Mexico Border

    Title 42 termination ‘overdue’, not ‘effective’ to manage migration

    Research and Features

    Study: Survey Methodology Should Be Calibrated to Account for Negative Attitudes About Immigrants and Asylum-Seekers

    A study analyses racial discrimination in job recruitment in Europe

    DACA has not had a negative impact on the U.S. job market

    ASBMB cautions against drastic immigration fee increases

    Study compares NGO communication around migration

    Collaboration, support structures needed to address ‘polycrisis’ in the Americas

    TTUHSC El Paso Faculty Teach Students While Caring for Migrants

    Immigrants Report Declining Alcohol Use during First Two Years after Arriving in U.S.

    How asylum seeker credibility is assessed by authorities

    Speeding up and simplifying immigration claims urgently needed to help with dire situation for migrants experiencing homelessness

    Training Individuals to Work in their Communities to Reduce Health Disparities

    ‘Regulation by reputation’: Rating program can help combat migrant abuse in the Gulf

    Migration of academics: Economic development does not necessarily lead to brain drain

    How has the COVID-19 pandemic affected immigration?

    Immigrants with Darker Skin Tones Perceive More Discrimination

    Newswise

  • Accurate measurements of black carbon in the atmosphere

    Newswise — Our industrialized society releases many and varied pollutants into the world. Combustion in particular produces aerosols, including black carbon. Although black carbon accounts for only a few percent of aerosol particles, it is especially problematic because it absorbs heat and impedes the heat-reflecting capability of surfaces such as snow. It is therefore essential to know how black carbon interacts with sunlight. Researchers have now quantified the refractive index of black carbon to the most accurate degree yet, a result that may affect climate models.

    There are many factors driving climate change; some are very familiar, such as carbon dioxide emissions from burning fossil fuels, sulfur dioxide from cement manufacture or methane emissions from animal agriculture. Black carbon aerosol particles, also from combustion, are less covered in the news but are particularly important. Essentially soot, black carbon is very good at absorbing heat from sunlight and storing it, adding to atmospheric heat. At the same time, given dark colors are less effective at reflecting light and therefore heat, as black carbon covers lighter surfaces including snow, it reduces the potential of those surfaces to reflect heat back into space.

    “Understanding the interaction between black carbon and sunlight is of fundamental importance in climate research,” said Assistant Professor Nobuhiro Moteki from the Department of Earth and Planetary Science at the University of Tokyo. “The most critical property of black carbon in this regard is its refractive index, basically how it redirects and disperses incoming light rays. However, existing measurements of black carbon’s refractive index were inaccurate. My team and I undertook detailed experiments to improve this. With our improved measurements, we now estimate that current climate models may be underestimating the absorption of solar radiation due to black carbon by a significant 16%.”

    Previous measurements of the optical properties of black carbon were often confounded by factors such as lack of pure samples, or difficulties in measuring light interactions with particles of differing complex shapes. Moteki and his team improved this situation by capturing the black carbon particles in water, then isolating them with sulfates or other water-soluble chemicals. By isolating the particles, the team was better able to shine light on them and analyze the way they scatter, which gave researchers the data to calculate the value of refractive index.

    “We measured the amplitude, or strength, and phase, or step, of the light scattered from black carbon samples isolated in water,” said Moteki. “This allowed us to calculate what is known as the complex refractive index of black carbon. Complex because rather than being a single number, it’s a value that contains two parts, one of which is ‘imaginary’ (concerned with absorption), though its impact is very, very real. Such complex numbers with imaginary components are actually very common in the field of optical science and beyond.”
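
    The link between the imaginary part of the refractive index and absorption can be made concrete with a standard optics formula. The sketch below uses the widely cited literature value for black carbon at 550 nm, m = 1.95 + 0.79i, purely as an assumed illustration; it is not the revised value from this study:

```python
import math

# Illustrative numbers only: an often-quoted literature refractive index
# for black carbon at 550 nm is m = 1.95 + 0.79i. The imaginary part k
# sets absorption via the standard relation alpha = 4*pi*k / wavelength.
wavelength_m = 550e-9
k = 0.79

alpha = 4.0 * math.pi * k / wavelength_m   # absorption coefficient, 1/m
penetration_depth_nm = 1e9 / alpha         # depth where intensity falls to 1/e

print(f"absorption coefficient: {alpha:.2e} per metre")
print(f"1/e penetration depth:  {penetration_depth_nm:.0f} nm")
```

    The penetration depth comes out at a few tens of nanometres, comparable to the size of individual soot spherules, which is why even small changes in the measured value of k matter for how much sunlight black carbon particles absorb.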

    As the new optical measurements of black carbon imply that current climate models are underestimating its contribution to atmospheric warming, the team hopes that other climate researchers and policymakers can make use of their findings. The method developed by the team to ascertain the complex refractive index of particles can be applied to materials other than black carbon. This allows for the optical identification of unknown particles in the atmosphere, ocean or ice cores, and the evaluation of optical properties of powdered materials, not just those related to the ongoing problem of climate change.

    ###

    Journal article: Nobuhiro Moteki, Sho Ohata, Atsushi Yoshida & Kouji Adachi. “Constraining the complex refractive index of black carbon particles using the complex forward-scattering amplitude”, Aerosol Science and Technology. DOI: 10.1080/02786826.2023.2202243

    Funding:
    Funds were provided by the Environment Research and Technology Development Fund (JPMEERF20202003) of the Environmental Restoration and Conservation Agency, the Japan Society for the Promotion of Science (JSPS) KAKENHI program (JP19H04236, JP19KK0289, Accepted Manuscript JP19H04259, JP19H05699, 22H03722, and 22H01294), and the Arctic Challenge for Sustainability ArCS II project (JPMXD1420318865) of the Ministry of Education, Culture, Sports, Science, and Technology (MEXT) of Japan.

    Useful links:
    Graduate School of Science – https://www.s.u-tokyo.ac.jp/en/
    Department of Earth and Planetary Science – https://www.eps.s.u-tokyo.ac.jp/en/

    About The University of Tokyo
    The University of Tokyo is Japan’s leading university and one of the world’s top research universities. The vast research output of some 6,000 researchers is published in the world’s top journals across the arts and sciences. Our vibrant student body of around 15,000 undergraduate and 15,000 graduate students includes over 4,000 international students. Find out more at www.u-tokyo.ac.jp/en/ or follow us on Twitter at @UTokyo_News_en.

    University of Tokyo

  • Will clean energy incentives, EV tax credits survive debt ceiling showdown?

    As the nation prepares for a showdown between President Biden and House Republican leadership ahead of the impending federal debt default date, a House Republican proposal to avoid the country’s first default would raise the federal debt limit but would undermine President Biden’s climate law, the Inflation Reduction Act.

    Joshua Basseches, a climate change policy and politics expert at Tulane’s School of Liberal Arts, believes a big part of the Republicans’ proposed solution is to speed up the permitting process for fossil fuel projects and control the energy supply.

    “When you step back and look at the big picture, this is an effort to undermine the goals of the Inflation Reduction Act. This is a way to keep the fossil fuel industry afloat. But even if the Republicans were to get this through, which I don’t think they will in its current form, it would not undo all the positives from the Inflation Reduction Act.”

    “The bill also contains the full text of the energy package the GOP passed in March, which would expand domestic energy production by allowing more oil, gas and mineral exploration on public lands and make dramatic changes to the National Environmental Policy Act by speeding up permitting for energy projects.”

    Basseches can speak on the following:

-The potential effects of the GOP’s debt-limit plan on the clean energy transition

    -Permitting reform, its opportunities and pitfalls

    -The Inflation Reduction Act’s impact on electric vehicles, clean electricity and ongoing state-level climate policy efforts

    [ad_2]

    Tulane University

    Source link

  • Exploring the underground connections between trees

    Exploring the underground connections between trees

    [ad_1]

The fungal networks interconnecting trees in a forest are a key factor determining the nature of forests and their response to climate change. These networks have also been viewed as a means for trees to help their offspring and other tree-friends, according to the increasingly popular “mother-tree hypothesis”. An international group of researchers re-examined the evidence for and against this hypothesis in a new study.

    Trees in a forest are interconnected through thread-like structures of symbiotic fungi, called hyphae, which together form an underground network called a mycorrhizal network. While it is well known that the mycorrhizal fungi deliver nutrients to trees in exchange for carbon supplied by the trees, the so-called mother-tree hypothesis implies a whole new purpose of these networks. Through the network, the biggest and oldest trees, also known as mother trees, share carbon and nutrients with the saplings growing in particularly shady areas where there is not enough sunlight for adequate photosynthesis. The network structure should also enable mother trees to detect the ill health of their neighbors through distress signals, alerting them to send these trees the nutrients they need to heal. In this way, mother trees are believed to act as central hubs, communicating with both young seedlings and other large trees around them to increase their chances of survival.

This is a very appealing concept that has attracted the attention of not only scientists but also the media, where the hypothesis is often presented as fact. According to the authors of the study just published in New Phytologist, however, the hypothesis is hard to reconcile with theory, prompting the researchers to re-examine data and conclusions from publications for and against the mother-tree hypothesis.

    The study, led by Nils Henriksson at the Swedish University of Agricultural Sciences, found that the empirical evidence for the mother tree hypothesis is actually very limited and theoretical explanations for the mechanisms are largely lacking. While big trees and their interconnections with their neighbors are still essential for the forest ecosystem, the fungal network does not work as a simple pipeline for resource sharing among trees. This means that apparent resource sharing among trees is more likely to be a result of trading between fungi and trees rather than directed transfer from one tree to another. Very often, this even results in aggravated competition between trees rather than support of seedlings. 

    “We found that mycorrhizal networks are indeed essential for the stability of many forest ecosystems, but rarely through sharing and caring among trees. Rather, it works like a trading ground for individual trees and fungi, each trying to make the best deal to survive,” explains Oskar Franklin, a study author and a researcher in the Agriculture, Forestry, and Ecosystem Services Research Group of the IIASA Biodiversity and Natural Resources Program. “The forest is not a super organism or a family of trees helping each other. It is a complex ecosystem with trees, fungi, and other organisms, which are all interdependent but not guided by a common purpose.” 

    “Although the narrative of the mother tree hypothesis is scarcely supported by scientific evidence and is controversial in the scientific community, it has inspired both research and public interest in the complexity of forests. It is vital that the future management and study of forests take the real complexity of these important ecosystems into account,” Franklin concludes.

    Reference

Henriksson, N., Marshall, J., Högberg, M.N., Högberg, P., Polle, A., Franklin, O., Näsholm, T. (2023). Re-examining the evidence for the mother tree hypothesis – resource sharing among trees via ectomycorrhizal networks. New Phytologist. DOI: 10.1111/nph.18935


    New Phytologist is a leading international journal focusing on high quality, original research across the broad spectrum of plant sciences, from intracellular processes through to global environmental change. The journal is owned by the New Phytologist Foundation, a not-for-profit organisation dedicated to the promotion of plant science. https://www.newphytologist.org/


    About IIASA:
    The International Institute for Applied Systems Analysis (IIASA) is an international scientific institute that conducts research into the critical issues of global environmental, economic, technological, and social change that we face in the twenty-first century. Our findings provide valuable options to policymakers to shape the future of our changing world. IIASA is independent and funded by prestigious research funding agencies in Africa, the Americas, Asia, and Europe. www.iiasa.ac.at

    [ad_2]

    International Institute for Applied Systems Analysis (IIASA)

    Source link

  • Archaea Diversity Drops in Warming Climate

    Archaea Diversity Drops in Warming Climate

    [ad_1]

Newswise — Led by Jizhong Zhou, Ph.D., director of the Institute for Environmental Genomics at the University of Oklahoma, an international research team conducted a long-term experiment that found that climate warming reduced the diversity of soil archaea and significantly altered their community structure. Their findings are published in the journal Nature Climate Change.

At the microbiological level, life can be described as belonging to one of three domains, which describe how species are related to one another. Eukarya contains complex organisms like animals and plants as well as microorganisms such as fungi. The other two domains, Bacteria and Archaea, are composed only of microorganisms. Archaea are prevalent in a range of environments, including some of the most hostile, such as volcanoes and permafrost. However, archaea are also common in the human microbiome and are an important part of soil ecology.

    “As temperature is a major driver of biological processes, climate warming will impact various ecological communities,” Zhou said. “Based on long-term time-series data, our previous studies revealed that experimental warming leads to the divergent succession of soil bacterial and fungal communities, accelerates microbial temporal scaling, reduces the biodiversity of soil bacteria, fungi and protists, but increases bacterial network complexity and stability. However, how climate warming affects the temporal succession of the archaeal community remains elusive. Archaea are ubiquitously present in soil and are vital to soil functions, e.g., nitrification and methanogenesis.”

Using a long-term multifactor experimental field site at OU’s Kessler Atmospheric and Ecological Field Station, the researchers showed that experimental warming of a tallgrass prairie ecosystem significantly altered the community structure of soil archaea and reduced their taxonomic and phylogenetic diversity. In contrast to the researchers’ previous observations in bacteria and fungi, their findings show that climate warming leads to convergent succession of the soil archaeal community, suggesting that archaeal community structures would become more predictable in a warmer world.

    ###

    About the Project

The article, “Experimental Warming Leads to Convergent Succession of Grassland Archaeal Community,” was published May 3, 2023, in Nature Climate Change (DOI: 10.1038/s41558-023-01664-x). Zhou, who is also a George Lynn Cross Research Professor of Microbiology in the Dodge Family College of Arts and Sciences, is the corresponding author. The first author is Ya Zhang of the Institute for Environmental Genomics and the Department of Microbiology and Plant Biology at OU.

    About the University of Oklahoma Office of the Vice President for Research and Partnerships 

    The University of Oklahoma is a leading research university classified by the Carnegie Foundation in the highest tier of research universities in the nation. Faculty, staff and students at OU are tackling global challenges and accelerating the delivery of practical solutions that impact society in direct and tangible ways through research and creative activities. OU researchers expand foundational knowledge while moving beyond traditional academic boundaries, collaborating across disciplines and globally with other research institutions as well as decision makers and practitioners from industry, government and civil society to create and apply solutions for a better world. Find out more at ou.edu/research.

    [ad_2]

    University of Oklahoma

    Source link

  • Sustaining U.S. Nuclear Power Plants Could be Key to Decarbonization

    Sustaining U.S. Nuclear Power Plants Could be Key to Decarbonization

    [ad_1]

    Newswise — Nuclear power is the single largest source of carbon-free energy in the United States and currently provides nearly 20 percent of the nation’s electrical demand. Many analyses have investigated the potential of future nuclear energy contributions in addressing climate change. However, few assess the value of existing nuclear power reactors.

    Research led by Pacific Northwest National Laboratory (PNNL) Earth scientist Son H. Kim with the Joint Global Change Research Institute (JGCRI), a partnership between PNNL and the University of Maryland, has added insight to the scarce literature and is the first to evaluate nuclear energy for meeting deep decarbonization goals. Kim sought to answer the question: Just how much do our existing nuclear reactors contribute to the mission of meeting the country’s climate goals, both now and if their operating licenses were extended?

    As the world races to discover solutions for reaching net zero, Kim’s report quantifies the economic value of bringing the existing nuclear fleet into the year 2100 and outlines its significant contributions in limiting global warming.

    Plants slated to close by 2050 could be among the most important players in a challenge that requires all carbon-free technology solutions that are available—emerging and existing—the report finds. New nuclear technology also has a part to play, and its contributions could be boosted by driving down construction costs.  

    “Even modest reductions in capital costs could bring big climate benefits,” said Kim. “Significant effort has been incorporated into the design of advanced reactors to reduce the use of all materials in general, such as concrete and steel, because that directly translates into reduced costs and carbon emissions.”

    Nuclear power reactors face an uncertain future

The nuclear power fleet in the United States consists of 93 operating reactors across 28 states. Most of these plants were constructed and deployed between 1970 and 1990. This means half of the fleet has outlived its original operating license lifetime of 40 years. While most reactors have had their licenses renewed for an additional 20 years, and some for yet another 20, the total number of reactors that will receive a lifetime extension to operate a full 80 years from deployment is uncertain.

Other countries also rely on nuclear energy. In France, for example, nuclear energy provides 70 percent of the country’s power supply. France and other countries will also have to consider whether to extend lifetimes, retire reactors, or build new, modern ones. However, the U.S. faces the potential retirement of a large share of its reactors in a short period of time, which could have a far stronger impact than the staggered closures other countries may experience.

“Our existing nuclear power plants are aging, and with their current 60-year lifetimes, nearly all of them will be gone by 2050. It’s ironic. We have a net-zero goal to reach by 2050, yet our single largest source of carbon-free electricity is at risk of closure,” said Kim.

    Exploring scenarios of lifetime extensions for nuclear power reactors

    Kim has built computational models that explore the interplay between economic processes, energy demand, and Earth’s climate since joining PNNL and JGCRI in 1995, when he was a doctoral intern with a fresh PhD in nuclear engineering. At JGCRI, researchers explore interactions between human, energy, and environmental systems to provide data for managing risks and analyzing options. His research is inspired by a drive to solve the energy and environmental crisis using modeling capabilities and tools like the Global Change Analysis Model (GCAM), developed at PNNL.

Kim used GCAM to model multiple scenarios of extending the lifetime of the existing nuclear fleet out to 2100. The article, published in Nuclear Technology, put the value of lifetime license extensions from 40 to 100 years at $330 billion to $500 billion in mitigation cost savings under a scenario that limits global warming to 2°C. Mitigation cost savings, or the carbon value, are the dollars saved by reducing greenhouse gas emissions. Legacy nuclear reactors alone have a carbon value of $500 billion if operational for 100 years. Every gigawatt of capacity, roughly one nuclear power reactor, translates to $5 billion saved later. Because that gigawatt was produced without any carbon emitted into Earth’s atmosphere, no money would need to be spent to mitigate its effects.
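The per-gigawatt figure can be sanity-checked against the fleet-wide carbon value with simple arithmetic. The sketch below uses illustrative assumptions (one reactor equated to roughly 1 GW of capacity), not output from the GCAM model itself:

```python
# Back-of-the-envelope check of the fleet carbon value cited in the article.
# Assumptions (illustrative, not from GCAM): one reactor ~ 1 GW of capacity.

FLEET_REACTORS = 93      # operating U.S. reactors, per the article
GW_PER_REACTOR = 1.0     # assumed capacity per reactor
SAVINGS_PER_GW = 5e9     # $5 billion in avoided mitigation cost per gigawatt

fleet_gw = FLEET_REACTORS * GW_PER_REACTOR
carbon_value = fleet_gw * SAVINGS_PER_GW

print(f"Estimated fleet carbon value: ${carbon_value / 1e9:.0f} billion")
# ≈ $465 billion, consistent with the article's ~$500 billion figure
```

The rough total lands near the $500 billion carbon value the article attributes to the legacy fleet over 100 years of operation.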

    Maintaining existing nuclear power plants avoids replacing reactors with electricity sources that produce carbon emissions. In states where nuclear reactors have been shut down, carbon emissions have increased from replacing the carbon-free electricity with natural gas-generated electricity.

Kim determined that lifetime extensions of existing nuclear power reactors from 60 to 80 years, without adding new nuclear capacity, contributed to a reduction of approximately 0.4 gigatons of carbon dioxide (GtCO2) emissions per year by 2050. The total cumulative difference in CO2 emissions between 2020 and 2100, in a scenario with lifetime extensions and future deployment of nuclear power plants (as compared to a scenario with a moratorium on new nuclear power plants), amounts to as much as 57 GtCO2.

    How much is 57 GtCO2? According to the International Energy Agency, U.S. carbon emissions in 2022 were 4.7 Gt, which means nuclear energy could save approximately 12 years’ worth of carbon emissions.
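The "12 years" comparison follows directly from the two figures in the article; a minimal check:

```python
# Express the cumulative CO2 savings as years of total U.S. emissions,
# using the two figures cited in the article.

cumulative_savings_gt = 57.0  # GtCO2 saved between 2020 and 2100
us_emissions_2022_gt = 4.7    # U.S. CO2 emissions in 2022 (IEA figure)

years_equivalent = cumulative_savings_gt / us_emissions_2022_gt
print(f"{years_equivalent:.1f} years of U.S. emissions")  # ≈ 12.1 years
```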

    An Intergovernmental Panel on Climate Change report on nuclear energy stated, “Nuclear power is therefore an effective greenhouse gas (GHG) mitigation option, especially through license extensions of existing plants enabling investments in retro-fitting and upgrading.”

    However, in a follow-on report to his research, Kim addresses the additional savings potential of driving down capital costs of building new nuclear power plants.

    Removing the uncertainty in nuclear power costs can increase emissions savings

    Building new nuclear power plants is expensive and construction takes a long period of time. The largest costs are often capital costs: the one-time price paid to build new structures and equipment.

    Advanced reactors—including small modular reactors and microreactors—are being developed with new technologies, enhanced security features, smaller physical footprints, and more flexible deployment options. They are expected to play an important role in the future U.S. electricity system and carbon mitigation efforts.

    “One of most important attributes of small modular reactors and microreactors is the reduced construction time,” Kim said. “SMRs and microreactors will be factory fabricated and delivered to site on trucks, and the uncertainty associated with financing cost should be reduced or eliminated.”

    Kim used GCAM to investigate a range of nuclear plant capital costs with scenarios of alternative carbon mitigation policies, and U.S. economy-wide net-zero emission goals by 2050, 2060, and 2070.

    Among the multiple findings in the report for DOE’s Office of Nuclear Energy, Kim found that an aggressive reduction of nuclear construction costs has a clear and pronounced impact on the expanded deployment of nuclear power under all scenarios, even without an explicit carbon mitigation policy.

    Continuing to generate electricity while removing all emissions of greenhouse gases by mid-century is a difficult challenge. “We must utilize all carbon-free technologies that are available to us,” said Kim, “and one of the great values of nuclear energy is that it doesn’t emit carbon while it’s generating power.”

    ###

    About PNNL

    Pacific Northwest National Laboratory draws on its distinguishing strengths in chemistry, Earth sciences, biology and data science to advance scientific knowledge and address challenges in sustainable energy and national security. Founded in 1965, PNNL is operated by Battelle for the Department of Energy’s Office of Science, which is the single largest supporter of basic research in the physical sciences in the United States. DOE’s Office of Science is working to address some of the most pressing challenges of our time. For more information, visit https://energy.gov/science. For more information on PNNL, visit PNNL’s News Center. Follow us on Twitter, Facebook, LinkedIn and Instagram.

    [ad_2]

    Pacific Northwest National Laboratory

    Source link

  • Suffering from allergies already? Blame climate change.

    Suffering from allergies already? Blame climate change.

    [ad_1]

Reports indicate that pollen patterns, magnitude and flowering timing are changing as Earth’s temperature rises

Human-caused climate change is exacerbating pollen seasons, asthma and even wildfires in certain areas around the nation. Over the past three decades across the U.S., pollen seasons have not only started sooner and lasted longer but have also increased in pollen concentration. This trajectory shows that it’s more than just a seasonal nuisance. Allergies to airborne pollen are tied to respiratory health and will impact a vulnerable population very similar to the one that suffered during the COVID-19 pandemic.

    For expert commentary on allergies and asthma that have been categorized as a health outcome linked to climate change, Andrea De Vizcaya Ruiz, PhD, associate professor and Shahir Masri, ScD, associate specialist, both with the environmental and occupational health department at UC Irvine Program in Public Health, are available for interviews.

More pollen circulating in our air for longer is contributing to the onset and aggravation of allergies (rhinitis, eye irritation, headaches, cough, post-nasal drip). Coupled with indoor air pollution and climate change, this means our communities are experiencing unprecedented exposure to harmful air pollutants. The evidence is alarming, and it is imperative that we take action to adopt effective, evidence-based regulations, spread awareness of lifestyle changes, and work together to clean our air.

    [ad_2]

    University of California, Irvine

    Source link

  • All time high temperatures are causing more injury deaths

    All time high temperatures are causing more injury deaths

    [ad_1]

Newswise — UCI Public Health’s Tim Bruckner, PhD, a professor of health, society, and behavior, joined a research team to analyze death certificate data from the Pacific Northwest heat wave and found an association with higher injury death rates. Injury deaths, a category that includes drownings, traffic accidents, assaults, and suicides, exceeded expectations as a result of the unprecedented heat wave.

They found that in June of 2021, injury deaths exceeded predictions by 21, and by July of that year, they exceeded predictions by 93. These results align with other evidence-based research showing that injury death rates vary by season in the U.S., confirming that temperature can notably influence injury death rates.

    Findings are published in the American Journal of Public Health.

Even though the Pacific Northwest heat wave was a 1-in-1,000-year event, it can happen again as the effects of climate change worsen. We need better public health interventions and awareness to manage the impact that rising temperatures can have on alcohol consumption, driving behaviors, levels of anger and despair, and increased swimming activity. Older adults, agricultural workers, and others undertaking strenuous physical activity in uncooled spaces are disproportionately at risk from the effects of heat.

    [ad_2]

    University of California, Irvine

    Source link

  • How Argonne makes the power grid more reliable and resilient

    How Argonne makes the power grid more reliable and resilient

    [ad_1]

    Newswise — Through innovative methods of deeply understanding the complexities of the grid, the lab helps secure the nation’s energy future.

    The U.S. power grid is almost incomprehensibly large. Comprising nearly 12,000 power plants, 200,000 miles of high-voltage transmission lines, 60,000 substations and 3 million miles of power lines, it may well be the most massive and complex machine ever assembled. Households, businesses, governments and essential infrastructure — including water, telecommunications, food supply, health care and wastewater treatment — rely on the grid around the clock. The power it generates fuels the U.S. economy.

    All this complexity makes it critical to understand the vulnerabilities of the nation’s electric transmission and distribution systems and to protect the grid from an evolving set of human-caused and natural hazards. Those can include cyberattacks from foreign governments and terrorists as well as extreme weather events driven by climate change. Record-setting heat waves, unprecedented storms and flooding, historic droughts and wildfires all pose hazards to the grid.

    “What sets Argonne apart is that we are very good at looking at all these problems from a multidisciplinary perspective. There are no research silos here.” — Mark Petri, head of Argonne’s Electric Power Grid Program

    The U.S. Department of Energy’s (DOE) Argonne National Laboratory plays a vital role in maintaining and developing a stable and secure grid. At the nation’s first national lab, located in southwest suburban Chicago, scientists and engineers bring to bear collective expertise in economics, threat assessment and mitigation, system vulnerability analysis, critical infrastructure interdependency modeling, proactive cybersecurity defense and emergency readiness and response support. The lab also leverages cutting-edge high performance computing hardware, mathematical software technologies, and artificial intelligence and machine learning resources.

“What sets Argonne apart is that we are very good at looking at all these problems from a multidisciplinary perspective,” says Mark Petri, head of the lab’s Electric Power Grid Program, who leads security and resilience activities. Petri also serves as technical team lead for the Markets, Policies & Regulations pillar of DOE’s Grid Modernization Initiative. “We bring together engineers, infrastructure analysts, computer scientists and modelers, artificial intelligence experts, economists, battery researchers and others in a focused effort to tackle these critical national challenges. There are no research silos here.”

    Argonne also collaborates with local, state, regional, tribal and territorial stakeholders, as well as academia, utilities and other national laboratories. This helps Argonne develop and deploy innovative solutions and advanced technologies that enhance the grid’s ability to withstand and recover from threats. Argonne is a key contributor to the Grid Modernization Laboratory Consortium, a strategic partnership between DOE and the national labs to bring together leading experts, technologies and resources to collaborate on the goal of modernizing the nation’s grid.

    Specialized models and training help design and defend an evolving grid

For more than two decades, Argonne has pioneered the analysis of grid infrastructure. That includes identifying natural and man-made external threats to the system — everything from hail to hackers — and homing in precisely on system vulnerabilities. “If I have flooding, high winds, ice — what are the things that are likely to break on the system?” Petri asks. “Are transmission towers going to go out? Are substations going to be under water? Am I going to lose power generation? Knowing the weak links in the chain is key.”

    Researchers are also interested in deeply examining the complex interdependencies that exist between electricity infrastructure and other energy systems such as natural gas. Understanding the interconnections, the ways the systems operate in concert and how disruption in one sector has the potential to cause cascading failures across the entire complex, allows researchers to anticipate potential disruptions, manage impacts and develop adaptation measures for the future.

    Argonne scientists have developed specialized computer modeling tools to enable decision makers to make informed, data-backed choices when proactively hardening the grid or responding to threats in real time. For instance, they developed one of the highest resolution climate models covering North America, which projects the impacts of climate change 50 years into the future. While most climate modeling is done at the scale of 100-kilometer grid blocks on a map, Argonne’s model behind its Climate Risk and Resilience Portal, driven by some of the nation’s most powerful supercomputers, zooms in to the level of 12 kilometers. (Argonne’s next climate models will have a resolution closer to four kilometers, which approaches the size of a large urban neighborhood or small rural town.)

“Developing hazard and climate risk models that leverage the latest science and the leadership-class computational resources at Argonne and DOE has enabled us to work with a multitude of private and public sector utilities,” said Rao Kotamarthi, science director of the Center for Climate Resilience and Decision Science and a senior scientist in Argonne’s Environmental Science division.

    Kotamarthi explained that the breakthrough offers more actionable hyperlocal information for leaders thinking through climate resiliency planning. Companies including AT&T and ComEd, as well as government agencies like the New York Power Authority, already see the model’s value. Looking to improve the resilience of their grid-level infrastructure and keep critical services up and running, they can see which pieces of valuable equipment sit in likely future climate-related danger zones. This helps them to identify locations that may need to be stabilized or relocated altogether.

    Argonne has also developed several other leading modeling tools, including the Hurricane Electric Assessment Damage Outage, which forecasts likely power outages after a storm. The EPfast tool examines power outage impacts on large electric grid systems. The Restore tool provides insights into repair times for outages at critical infrastructure facilities. And the Electric Grid Resilience Improvement Program models power system restoration after a major blackout.

    Moreover, to help system operators respond more quickly to grid failures, limit impacts on customers and speed recovery, Argonne supports system operator training so they can effectively respond to major grid disruptions. Stakeholders responsible for resilience are put through readiness exercises that replicate real-world threat, response and recovery scenarios — hurricanes, blizzards, earthquakes, cyberattacks — and hone their in-the-moment decision-making skills.

    New tools predict outcomes from emergent grid resources

    Adding yet another layer of complexity to the grid, distributed energy resources (DERs) like rooftop solar panels and generators have emerged as significant power generation sources. DERs contribute to a power system’s overall capacity, but operators must assess their impact and forecast their potential, especially during extreme weather events. That’s why Argonne created TDcoSim, a cutting-edge transmission and distribution co-simulation software tool that enables high-fidelity modeling of DERs. It’s the first model capable of simulating both transmission (the high-voltage network used to transfer power long distances) and distribution (the localized low-voltage network used by the utilities to deliver power to consumers).

“This is a totally new paradigm in grid modeling. Nobody has done this before,” says Vladimir Koritarov, director of the lab’s Center for Energy, Environmental and Economic Systems Analysis. “At Argonne, we specialize in developing these kinds of new, advanced grid models, algorithms, optimization methods and approaches that are more efficient, faster and more accurate than previously available ones.”

    Among those models is the Argonne Low-Carbon Electricity Analysis Framework, known as A-LEAF, an integrated national-scale simulation framework for power system operations and planning. It allows operators to evaluate different pathways to decarbonization of electric grids. A related Argonne-developed interactive tool called the Geospatial Energy Mapper helps users identify sites across the country best suited for renewable energy infrastructure projects.

As the U.S. aims to meet a goal of net-zero carbon emissions by 2050, the grid’s energy mix will likely include far more renewables than today. But sources such as solar and wind are variable in their production, and output may be reduced in extreme weather. Adapting to this variability interests Argonne energy systems engineer Neal Mann. At a time when long-term planning decisions are being made about which energy infrastructure technologies will be invested in and built, and which will be retired, Mann focuses on the role nuclear power might play in the future grid. “If we rely too much on weather-driven generation, do we end up compromising reliability under stressed climate-related conditions?” he asks. “In those cases, having nuclear and other so-called dispatchable technologies available could be the difference between widespread outages or not.”

    Grid-level energy storage is focus of materials and manufacturing R&D

    To compensate for the uncertainty of variable renewables and to capture excess generation, researchers across Argonne are focused on low-cost, high-efficiency energy storage. Those efforts include research into various novel battery technologies such as advanced sodium-ion cathodes and new flow cell chemistries; chemical and thermal storage; and pumped storage hydropower, a common type of hydroelectric energy storage that can provide power even during extended lulls in solar and wind generation.

One project involves the development of a model called “EverGrid,” based on the R&D 100 Award-winning EverBatt model. The free-to-use model will help determine the end-of-life impacts of stationary energy storage technologies such as flow batteries and advanced lead-acid batteries, including recycling. It will also help researchers make better decisions during the technology development process and find hot spots in processing that can lead to optimization and scale-up.

    “In order to reduce greenhouse gas emissions and hit U.S. climate goals, we’re going to be increasingly relying on renewable energy, which is not a constant source of energy,” says Chris Heckle, director of the Materials Manufacturing Innovation Center at Argonne. ​“We need to develop grid-level energy storage solutions, which will need to be large in scale. That will involve manufacturing challenges, transportation challenges and systems challenges, all of which Argonne is well positioned to meet.”

    For Petri, the growing complexity of the grid and the evolving threats against it make Argonne’s interdisciplinary approach more necessary than ever to help secure the nation’s energy future.

    “Our ability to understand how the grid’s complex systems behave, how they might be disrupted, and how operators can improve response is vitally important,” he says. ​“It’s important to people’s lives, it’s important to our economy, it’s important to our national security. And here at Argonne, we are right in the middle of improving these systems from a reliability and resilience perspective.”

    Argonne National Laboratory seeks solutions to pressing national problems in science and technology. The nation’s first national laboratory, Argonne conducts leading-edge basic and applied scientific research in virtually every scientific discipline. Argonne researchers work closely with researchers from hundreds of companies, universities, and federal, state and municipal agencies to help them solve their specific problems, advance America’s scientific leadership and prepare the nation for a better future. With employees from more than 60 nations, Argonne is managed by UChicago Argonne, LLC for the U.S. Department of Energy’s Office of Science.

    The U.S. Department of Energy’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time. For more information, visit https://energy.gov/science.


    Argonne National Laboratory


  • Arif Efendi applauds global renewable energy efforts



    Newswise — The demand for renewable energy is continuously growing worldwide. Arif Efendi, previously of Doyen Sports, notes that the latest efforts in renewables will be crucial to the future of power sourcing. Current efforts include innovations in solar energy, wind power, nuclear energy, hydrogen fuel, and more.

    According to Euro News, “Renewable energy is to become the world’s top source of electricity by 2025.” The benefits of cleaner and more efficient energy are becoming increasingly attractive to top businesses and corporations globally. These entities turn to renewables not only for efficiency but also for long-term cost savings and reduced environmental impact, factors that point toward greater stability and energy security.

    Arif Efendi is a passionate businessman and investor whose work spans various industries. Because of this, Efendi makes a concerted effort to stay current on events and propose solutions for the future. He applauds renewable energy efforts, as renewables are among the only paths to a safer and cleaner world.

    How to move forward with renewable energy has been in question for far too long. Over the years, people and companies have asked, “How will we switch to renewable energy?” Fortunately, there are now many tangible answers to this long-running debate. According to the United Nations (UN), “while about 5 million jobs in fossil fuel production could be lost by 2030, an estimated 14 million new jobs would be created in clean energy.”

    The positive environmental effects

    In February, UN Secretary-General António Guterres briefed the General Assembly meeting on the organization’s top priorities for 2023. In his speech, he noted the importance of renewable energy and how it will change the course of the year ahead.

    He noted, “We must focus on two urgent priorities: cutting emissions and achieving climate justice.” There is no other option: if companies and manufacturers do not implement solid plans to reduce emissions and reach net zero, the world will face environmental consequences it cannot afford to bear.

    Another powerful statement from the Secretary-General warned fossil-fuel producers. He dedicated these words to those who manage the field: “I have a special message for fossil-fuel producers and their enablers scrambling to expand production and raking in monster profits: If you cannot set a credible course for net-zero, with 2025 and 2030 targets covering all your operations, you should not be in business. Your core product is our core problem. We need a renewables revolution, not a self-destructive fossil-fuel resurgence.” With statements like these, the UN is raising the importance of renewable energy to a new level this year.

    We must rely on these sources

    Renewable energy is the only way the world can move forward to sustain communities and global corporate operations. Many are already making the change, demonstrating the significance and relative ease of implementing such measures.

    Instead of simply depleting natural resources, renewable energy provides regenerated power for years. For the health of humans, animals, and nature alike, renewable energy must be used to preserve the environment.

    Solar energy adoption in the Amazon rainforest is an excellent example of impactful renewable energy implementation. Thanks to dedicated efforts, many communities in the region now have access to the internet and larger amounts of clean water. New access to daily necessities, such as ice and electricity, was especially helpful throughout the COVID-19 pandemic, when isolated communities were at higher risk from viral spread. Providing more resources like this is strongly encouraged to preserve the natural state of communities in places like the Amazon rainforest.

    Within the next three years, renewables are predicted to be the top energy source globally. Predictions like these offer promise for limiting harmful emissions. Moreover, Efendi reiterates that renewable energy is vital to the continuation of human activity and health.


    Social Media Experts


  • Researchers Combat Lake Algae Blooms with Floating Filtration



    Newswise — Climate change and human activity have been putting pressure on water bodies worldwide, and Canada’s vast network of lakes is no exception. Over the past decades, increasing nutrient levels have led to a process called eutrophication in the shallow lakes dotting Quebec’s Laurentian region north of Montreal. These changes have led to a surge in algae blooms, rendering the lakes unusable and possibly disrupting the natural ecosystem.

    Restoring these lakes to a healthier condition is a complicated and expensive undertaking, but a new method being investigated by Concordia researchers in the Department of Building, Civil and Environmental Engineering may cut down on both costs and labour in an environmentally friendly way.

    Writing in the journal Water, the researchers describe a system of floating geotextile filters that efficiently removes suspended solids, algae and nutrients from a shallow lake. While the project is still in development, the researchers believe it has the potential to scale up. This technology could then benefit the health of larger bodies of water such as ponds, rivers, coastal areas and bays.

    The study is led by PhD student Antônio Cavalcante Pereira and Professor Catherine Mulligan. Research associate Dileep Palakkeel Veetil and Sam Bhat of Titan Environmental Containment are also contributors.

    Non-chemical solutions

    Over the summer and early fall seasons of 2019 and 2020, the researchers placed six geotextile layers in a floating filtration unit at Lac Caron, a shallow eutrophic lake with a maximum depth of 2.6 metres located in Ste-Anne-des-Lacs, about 75 kilometres north of Montreal. The lake has been under a recreational advisory since 2008 due to excessive algae growth.

    The Plexiglas filtration device was floated on an inflatable rubber tube placed at the centre of an enclosed area cordoned off by geotextile turbidity curtains. These specialized curtains hang from the water surface down to the lakebed, or near it, preventing suspended solids from mixing with the rest of the lake.

    Water samples from the lake and the contained areas were collected every two to three days. The specimens were then analyzed for levels of turbidity, total suspended solids (TSS), phosphorus, blue-green-algae phycocyanin (BGA-PC), chlorophyll-a and more.


    The analysis results were encouraging. Comparing the filtered lake water to the non-filtered lake water, the researchers found the following average removal efficiencies (2019/2020):

    • Turbidity: 53 per cent / 17 per cent
    • TSS: 22 per cent / 36 per cent
    • Phosphorus: 49 per cent / 18 per cent
    • BGA-PC: 57 per cent / 34 per cent
    • Chlorophyll-a: 56 per cent / 32 per cent
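
    Each figure above is an average removal efficiency: the percentage drop in a constituent’s concentration in filtered versus non-filtered water. A minimal sketch of the calculation, using illustrative concentrations rather than study data:

```python
def removal_efficiency(unfiltered: float, filtered: float) -> float:
    """Per cent of a constituent removed, comparing filtered to unfiltered water."""
    if unfiltered <= 0:
        raise ValueError("unfiltered concentration must be positive")
    return 100.0 * (unfiltered - filtered) / unfiltered

# Illustrative values only (e.g., mg/L of total suspended solids), not study data:
print(round(removal_efficiency(10.0, 4.7), 1))  # 53.0, i.e. a 53 per cent reduction
```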

    Pereira says the year-over-year differences result from heterogeneous water quality in lakes, driven by distinct climate and algae growth patterns. A large algae bloom was visible in 2019, while in 2020 the algae were more dispersed throughout the whole water body.

    “Expanding our system for large lake remediation is a long-term goal. But the novelty of this project is that we use in-situ water filtration as a remediation method for eutrophic water bodies,” says Pereira. “We did not add any chemicals to the lake, but we still managed to get good results: algae suppression and turbidity decreases for an entire recreational season.”

    An evolving long-term project

    Mulligan adds that this paper is part of a series that is based on work that first began back in 2008. The project has gone through subsequent iterations over the years and in other lakes in the region.

    The shallow lakes studied in the past were often created by developers excavating existing lakes and incompletely cutting down trees. However, several recent factors are contributing to recurring excessive algae growth. These factors include the continual degradation of fragmentary tree stumps, along with possible nutrient release from runoff and the lack of natural hydrological patterns.

    “It can be a challenge because water quality changes from year to year,” says Mulligan. “When those eutrophic water bodies are subjected to warmer temperatures, they tend to be much more affected by excessive algal blooms.”

    This research was funded by NSERC, Concordia University and Titan Environmental Containment.

    Read the cited paper: “An In-Situ Geotextile Filtration Method for Suspended Solids Attenuation and Algae Suppression in a Canadian Eutrophic Lake.”


    Concordia University


  • Rising Temps Impact Streams in Northeast US



    Newswise — Over the past 25 years, the Northeast has experienced the largest increase in extreme precipitation nationally. Prior research has shown that the amount of extreme precipitation (rain or snow that results in one to two inches of water in a day) over the past 25 years has been almost 50% greater than from 1901 to 1995.

    A new Dartmouth study provides insight into how changes in precipitation and temperature due to global warming affect streamflow and flooding in the Northeast. The findings are published in the Journal of the American Water Resources Association.

    The researchers examined how precipitation (including snowfall and winter rain-on-snow events), springtime snowmelt, and soil conditions impact streamflow. They focused on four watersheds in the Northeast: the Mattawamkeag River in northeastern Maine; the Dead Diamond River in northern New Hampshire; the White River in eastern Vermont; and the Shenandoah River in West Virginia.

    Streamflow in the three northern watersheds is strongly affected by snowmelt, while the Shenandoah River watershed is affected more by rainfall. All four watersheds were selected because they are unregulated rivers, meaning the streamflow is not controlled by a dam, and span a range of latitudes.

    For the first part of the study, the team created a machine learning model from the historical relationships between streamflow and factors that included: temperature; precipitation (rainfall versus snow); the “antecedent precipitation index” or how much moisture is stored in the soil before a storm; the “standardized precipitation index,” which is used to characterize wet and dry spells; and streamflow.

    They drew on more than 95 years of historical climate data spanning from 1915 to 2011, as well as on streamflow data from the U.S. Geological Survey and snow depth observations from the Northeast Regional Climate Center.

    “Both the antecedent precipitation index and the standardized precipitation index are basically measures of how wet the land surface is already, which affects runoff and streamflow,” says first author Charlotte Cockburn, Guarini ’21, who was a master’s student in earth sciences at Dartmouth at the time of the research.

    “If you have a really big rainstorm on a relatively dry surface, a lot of that water can be absorbed by the soil, but if you have multiple rainstorms leading up to the really big rainstorm, there’s no room in the soil for the water, which creates higher streamflow.”
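
    The antecedent precipitation index Cockburn describes is commonly computed as an exponentially decaying running sum of daily precipitation, API_t = k · API_(t-1) + P_t, with k an empirical decay constant (often 0.85 to 0.98). A minimal sketch with made-up rainfall values; the study’s exact formulation may differ:

```python
def antecedent_precipitation_index(daily_precip, k=0.9):
    """Recursive API: API_t = k * API_(t-1) + P_t.

    The result is an exponentially weighted memory of recent precipitation,
    a proxy for how wet the land surface already is.
    """
    api = 0.0
    series = []
    for p in daily_precip:
        api = k * api + p
        series.append(api)
    return series

# Hypothetical rainfall (mm/day): storms before a big event keep the index high.
wet_lead_up = antecedent_precipitation_index([10, 12, 8, 0, 25])
dry_lead_up = antecedent_precipitation_index([0, 0, 0, 0, 25])
assert wet_lead_up[-1] > dry_lead_up[-1]  # wetter soil before the same 25 mm storm
```

    A wetter lead-up means less room in the soil for the final storm’s water, which is exactly the higher-streamflow scenario Cockburn describes.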

    That was what happened in August 2011, when Hurricane Irene, known as Tropical Storm Irene in much of New England, caused devastating flooding, multiple deaths, and billions of dollars in damage, Cockburn notes.

    To predict streamflow in the cold season months of November to May, the team used average temperature, three-day and 30-day rainfall, and three-day and 30-day snowfall as variables in their model. They created a sub-model to simulate snowmelt. The model would look at a particular date, for example April 1, 2009, and would then predict streamflow based on the model variables.
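
    As an illustration of that train-then-predict workflow, here is a deliberately oversimplified single-predictor stand-in, fit on made-up data; the study’s actual model used machine learning over many predictors (temperature, three- and 30-day rainfall and snowfall, and soil-moisture indices):

```python
def fit_line(x, y):
    """Ordinary least squares for y = slope * x + intercept."""
    n = len(x)
    mean_x = sum(x) / n
    mean_y = sum(y) / n
    sxx = sum((xi - mean_x) ** 2 for xi in x)
    sxy = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y))
    slope = sxy / sxx
    return slope, mean_y - slope * mean_x

# Made-up training pairs: three-day precipitation (cm) vs. streamflow (m^3/s).
precip_3day = [1.0, 2.5, 4.0, 6.0, 8.0]
streamflow = [12.0, 20.0, 31.0, 44.0, 57.0]
slope, intercept = fit_line(precip_3day, streamflow)

# "Predict" streamflow for a date with 5 cm of three-day precipitation:
predicted = slope * 5.0 + intercept
assert 30.0 < predicted < 45.0  # roughly interpolates the made-up observations
```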

    “For context, the highest streamflow in Northeast watersheds tends to occur in the spring, actually right around now, when there is snowmelt, larger rainfall events than in the winter, no vegetation to pull water out of the soil, and when the soil is either saturated or frozen,” says senior author Jonathan Winter, an associate professor of geography at Dartmouth.

    As the researchers explain in the study, one of the conundrums with the model is that it is based on historical data and is trained to rely on snowpack as an important driver for projecting streamflow in the cold season.

    So when the model runs into future dates when there will be reduced snowpack due to global warming, it predicts decreases in streamflow. But as Cockburn explains, “The models don’t exactly capture the dynamics of winter changes in streamflow because they are trained on the past and in a world that is warmer due to climate change, we expect rain to be a much more important driver of winter streamflow.”

    For the second part of the study, the team forced the machine learning model with a projection of climate from 2070 to 2099, to see what happens to streamflow in a future climate.

    The key findings are:

    • Across watersheds and seasons, three-day precipitation and initial soil moisture are the most important variables that determine streamflow in the Northeast.
       
    • Thirty-day snowmelt and 30-day rainfall are important to Mattawamkeag River streamflow because the watershed is both the largest and most northern, making it less sensitive to short extreme precipitation events and more sensitive to snow.
       
    • Future cold season streamflow depends on how New England watersheds respond to the change from more snowfall dominated winters to more rainfall dominated winters.
       
    • Future warm season streamflow depends almost exclusively on changes in rainfall.

    “If the Northeast gets wetter soils and more heavy rainfall events, as climate models predict it will, the Northeast will have increased streamflow and higher flood risk,” says Winter.

    This past winter the Northeast had below normal snowpack due to temperatures that were more than 4 degrees Fahrenheit warmer than average.

    “The winter we just had is what we are going to experience more often in the future. It’s a glimpse of what’s to come,” says Winter. “Our analysis, however, surprisingly reveals that in the Northeast, snow matters relatively little in comparison to how sensitive streamflow is to precipitation.”

    Winter says, “With climate change, understanding how streamflow may change in a warmer and wetter climate is important as these dynamics have implications for flooding, ecosystems, water resources, and hydropower.”

    Erich Osterberg, an associate professor in earth sciences, and Frank Magilligan, the Frank J. Reagan ’09 Chair of Policy Studies and a professor of geography at Dartmouth, also served as co-authors of the study.



    Dartmouth College


  • New SLAC-Stanford Battery Center targets roadblocks to a sustainable energy transition



    Newswise — Menlo Park, Calif. – The Department of Energy’s SLAC National Accelerator Laboratory and Stanford University today announced the launch of a new joint battery center at SLAC. It will bring together the resources and expertise of the national lab, the university and Silicon Valley to accelerate the deployment of batteries and other energy storage solutions as part of the energy transition that’s essential for addressing climate change.

    A key part of this transition will be to decarbonize the world’s transportation systems and electric grids – to power them without fossil fuels. To do so, society will need to develop the capacity to store several hundred terawatt-hours of sustainably generated energy. Only about 1% of that capacity is in place today.

    Filling the enormous gap between what we have and what we need is one of the biggest challenges in energy research and development. It will require that experts in chemistry, materials science, engineering and a host of other fields join forces to make batteries safer, more efficient and less costly and manufacture them more sustainably from earth-abundant materials, all on a global scale. 

    The SLAC-Stanford Battery Center will address that challenge. It will serve as the nexus for battery research at the lab and the university, bringing together large numbers of faculty, staff scientists, students and postdoctoral researchers from SLAC and Stanford for research, education and workforce training. 

     “We’re excited to launch this center and to work with our partners on tackling one of today’s most pressing global issues,” said interim SLAC Director Stephen Streiffer. “The center will leverage the combined strengths of Stanford and SLAC, including experts and industry partners from a wide variety of disciplines, and provide access to the lab’s world-class scientific facilities. All of these are important to move novel energy storage technologies out of the lab and into widespread use.”

    Expert research with unique tools

    Research and development at the center will span a vast range of systems – from understanding chemical reactions that store energy in electrodes to designing battery materials at the nanoscale, making and testing devices, improving manufacturing processes and finding ways to scale up those processes so they can become part of everyday life. 

    “It’s not enough to make a game-changing battery material in small amounts,” said Jagjit Nanda, a SLAC distinguished scientist, Stanford adjunct professor and executive director of the new center, whose background includes decades of battery research at DOE’s Oak Ridge National Laboratory. “We have to understand the manufacturing science needed to make it in larger quantities on a massive scale without compromising on performance.”

    Longstanding collaborations between SLAC and Stanford researchers have already produced many important insights into how batteries work and how to make them smaller, lighter, safer and more powerful. These studies have used machine learning to quickly identify the most promising battery materials from hundreds made in the lab, and measured the properties of those materials and the nanoscale details of battery operation at the lab’s synchrotron X-ray facility. SLAC’s X-ray free-electron laser is available, as well, for fundamental studies of energy-related materials and processes. 

    SLAC and Stanford also pioneered the use of cryogenic electron microscopy (cryo-EM), a technique developed to image biology in atomic detail, to get the first clear look at finger-like growths that can degrade lithium-ion batteries and set them on fire. This technique has also been used to probe squishy layers that build up on electrodes and must be carefully managed, in research performed at the Stanford Institute for Materials and Energy Sciences (SIMES).

    Nanda said the center will also focus on making energy storage more sustainable, for instance by choosing materials that are abundant, easy to recycle and can be extracted in a way that’s less costly and produces fewer emissions.

    A unique collaboration in the heart of Silicon Valley 

    Battery Center Director Will Chueh, an associate professor at Stanford and faculty scientist at SLAC, emphasized that the center sits in the middle of Silicon Valley’s entrepreneurial culture, two miles from the Stanford campus and a short walk from large, world-class scientific facilities that only a national lab can provide. This generates advantages that would be impossible for any single partner to achieve, including outstanding educational and training opportunities for Stanford students and postdocs, who will play an outsized role in shaping the next generation of energy researchers.

    “There’s no other place in the world,” Chueh said, “where all of this comes together.”

    A pilot project for the center began in 2020 with two battery laboratories in SLAC’s Arrillaga Science Center where Stanford students and postdoctoral researchers have been synthesizing battery materials and evaluating devices. 

    The center is operated by SLAC’s Applied Energy Division and Stanford’s Precourt Energy Institute. Major funding for battery research at SLAC comes from the DOE Office of Science and Office of Energy Efficiency and Renewable Energy. SLAC’s Stanford Synchrotron Radiation Lightsource (SSRL) and Linac Coherent Light Source (LCLS) X-ray free-electron laser are DOE Office of Science user facilities.


    SLAC National Accelerator Laboratory


  • Stripped to the bone



    Natural disasters can devastate a region, abruptly killing the species that form an ecosystem’s structure. But how this transpires can influence recovery. While fires scorch the landscape to the ground, a heatwave leaves an army of wooden staves in its wake. Storm surges and coral bleaching do something similar underwater.


    University of California, Santa Barbara


  • Hidden ice melt in Himalaya: Study



    Newswise — A new study reveals that the mass loss of lake-terminating glaciers in the greater Himalaya has been significantly underestimated, due to the inability of satellites to see glacier changes occurring underwater, with critical implications for the region’s future projections of glacier disappearance and water resources.

    Published in Nature Geoscience on April 3, the study was conducted by an international team including researchers from the Chinese Academy of Sciences (CAS), Graz University of Technology (Austria), the University of St. Andrews (UK), and Carnegie Mellon University (USA).

    The researchers found that a previous assessment underestimated the total mass loss of lake-terminating glaciers in the greater Himalaya by 6.5%. The most significant underestimation of 10% occurred in the central Himalaya, where glacial lake growth was the most rapid. A particularly interesting case is Galong Co in this region, with a high underestimation of 65%.

    This oversight was largely due to the limitations of satellite imaging in detecting underwater changes, which has left a gap in our understanding of the full extent of glacier loss. From 2000 to 2020, proglacial lakes in the region increased by 47% in number, 33% in area, and 42% in volume. This expansion corresponds to an estimated glacier mass loss of around 2.7 Gt, equivalent to 570 million elephants, or over 1,000 times the total number of elephants in the world. Previous studies did not account for this loss because the satellite data they used can measure only the lake’s water surface, not the underwater ice that has been replaced by water.
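
    The subaqueous portion of such a loss can be approximated by converting growth in proglacial lake volume into the mass of glacier ice the water replaced. A back-of-envelope sketch, assuming a standard glacier-ice density of about 917 kg/m³ and a hypothetical volume figure (the study’s underlying lake-volume data are not reproduced here):

```python
ICE_DENSITY_GT_PER_KM3 = 0.917  # ~917 kg/m^3, a standard glacier-ice density

def subaqueous_ice_loss_gt(lake_volume_growth_km3: float) -> float:
    """Ice mass (Gt) replaced by a given growth in proglacial lake volume.

    Assumes the lake expanded entirely at the expense of glacier ice, the
    simplification behind subaqueous mass-loss estimates.
    """
    return lake_volume_growth_km3 * ICE_DENSITY_GT_PER_KM3

# A hypothetical volume growth of ~3 km^3 gives a mass on the order of the
# study's 2.7 Gt figure:
print(round(subaqueous_ice_loss_gt(3.0), 2))  # 2.75
```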

    “These findings have important implications for understanding the impact of regional water resources and glacial lake outburst floods,” said lead author ZHANG Guoqing from the Institute of Tibetan Plateau Research, CAS.

    By accounting for the mass loss from lake-terminating glaciers, the researchers can more accurately assess the annual mass balance of these glaciers compared to land-terminating ones, thus further highlighting the accelerated glacier mass loss across the greater Himalaya.

    The study also highlights the need to understand the mechanisms driving glacier mass loss and the underestimated mass loss of lake-terminating glaciers globally, which is estimated to be around 211.5 Gt, or roughly 12%, between 2000 and 2020.

    “This emphasizes the importance of incorporating subaqueous mass loss from lake-terminating glaciers in future mass-change estimates and glacier evolution models, regardless of the study region,” said co-corresponding author Tobias Bolch from Graz University of Technology.

    David Rounce, a co-author from Carnegie Mellon University, noted that in the long run, the mass loss from lake-terminating glaciers may continue to be a major contributor to total mass loss throughout the 21st century as glaciers with significant mass loss may disappear more rapidly compared to existing projections.

    “By more accurately accounting for glacier mass loss, researchers can better predict future water resource availability in the sensitive mountain region,” said co-author YAO Tandong, who also co-chairs Third Pole Environment (TPE), an international science program for interdisciplinary study of the relationships among water, ice, climate, and humankind in the region and beyond.


    Chinese Academy of Sciences
