ReportWire

Tag: DOE Science News Source

  • Artificially intelligent software provides a detailed look at jets of plasma used to treat cancer

    Artificially intelligent software has been developed to enhance medical treatments that use jets of electrified gas known as plasma. The computer code predicts the chemicals emitted by plasma devices, which can be used to treat cancer, promote healthy tissue growth and sterilize surfaces.

    The software learned to predict the cocktail of chemicals coming out of the jet based on data gathered during real-world experiments and using the laws of physics for guidance. This type of artificial intelligence (AI) is known as machine learning because the system learns based on the information provided. The researchers involved in the project published a paper about their code in the Journal of Physics D: Applied Physics.

    The plasma studied in the experiments is known as cold atmospheric plasma (CAP). When the CAP jet is turned on, numerous chemical species in the plasma take part in thousands of reactions. These chemicals modify the cells undergoing treatment in different ways, depending on the chemical composition of the jet. While scientists know that CAPs can be used to kill cancer cells, treat wounds and kill bacteria on food, it’s not fully understood why. 

    “This research is a step toward gaining a deeper understanding of how and why CAP jets work and could also one day be used to refine their use,” said Yevgeny Raitses, a managing principal research physicist at the U.S. Department of Energy’s Princeton Plasma Physics Laboratory (PPPL). 

    The project was completed by the Princeton Collaborative Low Temperature Plasma Research Facility (PCRF), a collaboration between researchers at PPPL and the George Washington University (GWU).

    PPPL has a growing body of work that combines its 70 years of pioneering plasma research with its expertise in AI to solve societal problems. The Lab’s mission extends beyond using plasma to generate fusion power; it also encompasses applying plasma in fields such as medicine and manufacturing.

    The software uses an approach known as a physics-informed neural network (PINN). In a PINN, computation is organized into interconnected units called nodes, or neurons, and data flows through them in a way that mimics how information is processed in the human brain. The laws of physics are also built into the code.
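
    The idea is easiest to see in a toy example. The sketch below is a generic, minimal PINN, not the PPPL/GWU code: a small network is fit to measured data while a physics residual, here a hypothetical decay law dy/dt = -k*y, is added to the loss so that known physics constrains the fit.

    ```python
    # Minimal PINN sketch (illustration only, not the published code).
    # The loss combines a data-fitting term with a physics-residual term.
    import torch
    import torch.nn as nn

    net = nn.Sequential(nn.Linear(1, 32), nn.Tanh(), nn.Linear(32, 1))
    k = 2.0  # hypothetical rate constant in the assumed physics model

    t_data = torch.rand(64, 1)       # stand-in measurement times
    y_data = torch.exp(-k * t_data)  # stand-in measured values

    opt = torch.optim.Adam(net.parameters(), lr=1e-3)
    for step in range(2000):
        opt.zero_grad()
        # Data term: match the experimental measurements.
        loss_data = ((net(t_data) - y_data) ** 2).mean()
        # Physics term: penalize violations of dy/dt + k*y = 0.
        t = torch.rand(64, 1, requires_grad=True)
        y = net(t)
        dydt = torch.autograd.grad(y, t, torch.ones_like(y), create_graph=True)[0]
        loss_phys = ((dydt + k * y) ** 2).mean()
        (loss_data + loss_phys).backward()
        opt.step()
    ```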

    “Knowing what comes out of the jet is very important. Knowing what comes out accurately is very difficult,” said Sophia Gershman, a lead PPPL research engineer from the PCRF who worked on this collaborative project. The process would require several different devices to collect different kinds of information about the jet. “In practical studies, it is difficult to go and utilize all of the various technologically advanced diagnostics all at once for each device and for various types of surfaces that we treat,” Gershman explained.

    Calculating the chemical composition one nanosecond at a time

    Li Lin, a research scientist from GWU and the paper’s primary author, said it’s also difficult to calculate the chemicals in a CAP jet because the interactions need to be considered a nanosecond at a time. “When you consider that the device is in operation for several minutes, the number of calculations makes the problem more than simply computationally intensive. It’s practically impossible,” Lin said. “Machine learning allows you to bypass the complicated part.”

    The project began with a small set of real-world data that was gathered using a technique known as Fourier-transform infrared absorption spectroscopy. The researchers used that small dataset to create a broader set of data. That data was then used to train the neural network using an evolutionary algorithm, which is a type of computer code inspired by nature that searches for the best answers using a survival-of-the-fittest approach. Several successive batches of data are generated using slightly different approaches, and only the best datasets from each round are carried through to the next round of training until the desired results are achieved.
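
    That survival-of-the-fittest loop can be sketched in a few lines. This is a schematic (mu + lambda)-style illustration with a stand-in objective, not the authors' training code:

    ```python
    # Schematic evolutionary search: perturb candidates, score them, and
    # carry only the fittest into the next round. Illustrative objective.
    import numpy as np

    rng = np.random.default_rng(0)

    def fitness(candidate):
        # Placeholder objective: how well a candidate matches a target.
        target = np.array([1.0, -2.0, 0.5])
        return -np.sum((candidate - target) ** 2)

    population = [rng.normal(size=3) for _ in range(20)]
    for generation in range(200):
        # Mutate survivors to produce a new batch of candidates.
        offspring = [p + rng.normal(scale=0.1, size=3) for p in population]
        # Keep only the best performers from parents and offspring combined.
        combined = population + offspring
        combined.sort(key=fitness, reverse=True)
        population = combined[:20]

    print("best candidate:", population[0])
    ```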

    Ultimately, the team was able to accurately calculate the chemical concentrations, gas temperature, electron temperature and electron concentration of the cold atmospheric plasma jet based on data gathered during real-world experiments. In a cold atmospheric plasma, the electrons — small, negatively charged particles — can be very hot, though the other particles are close to room temperature. The electrons can be at a low enough concentration that the plasma doesn’t feel hot or burn the skin while still being able to have a significant effect on the targeted cells. 

    On the path to personalized plasma treatment

    Michael Keidar, the A. James Clark Professor of Engineering at GWU and a frequent collaborator with PPPL who also worked on this project, said the long-term goal is to be able to perform these calculations fast enough that the software can automatically adjust the plasma during a procedure to optimize treatment. Keidar is currently working on a prototype of such a “plasma adaptive” device in his lab. “Ideally, it can be personalized. The way we envision it, you treat the patient, and the response of every patient will be different,” Keidar explained. “So, you can measure the response in real-time and then try to inform, using feedback and machine learning, the right settings in the plasma-producing device.” 

    More research needs to be done to perfect such a device. For example, this study looked at the CAP jet over time but at only one point in space. Further research would need to broaden the work so it considers multiple points along the jet’s output stream. The study also looked at the plasma plume in isolation. Future experiments would need to integrate the surfaces treated by the plasma to see how that impacts the chemical composition at the treatment site. 

    This research was funded by the U.S. Department of Energy, Grant DE-SC0022349, and by the Princeton Collaborative Research Facility, which is supported by the Department of Energy under Contract No. DE-AC02-09CH11466.

    PPPL is mastering the art of using plasma — the fourth state of matter — to solve some of the world’s toughest science and technology challenges. Nestled on Princeton University’s Forrestal Campus in Plainsboro, New Jersey, our research ignites innovation in a range of applications including fusion energy, nanoscale fabrication, quantum materials and devices, and sustainability science. The University manages the Laboratory for the U.S. Department of Energy’s Office of Science, which is the nation’s single largest supporter of basic research in the physical sciences. Feel the heat at https://energy.gov/science and http://www.pppl.gov.



    Source: Princeton Plasma Physics Laboratory

  • Some mosquitoes like it hot

    Newswise — Certain populations of mosquitoes are more heat tolerant and better equipped to survive heat waves than others, according to new research from Washington University in St. Louis.

    This is bad news in a world where vector-borne diseases are an increasingly global health concern. Most models that scientists use to estimate vector-borne disease risk currently assume that mosquito heat tolerances do not vary. As a result, these models may underestimate mosquitoes’ ability to spread diseases in a warming world.

    Researchers led by Katie M. Westby, a senior scientist at Tyson Research Center, Washington University’s environmental field station, conducted a new study that measured the critical thermal maximum (CTmax), an organism’s upper thermal tolerance limit, of eight populations of the globally invasive tiger mosquito, Aedes albopictus. The tiger mosquito is a known vector for many viruses including West Nile, chikungunya and dengue.

    “We found significant differences across populations for both adults and larvae, and these differences were more pronounced for adults,” Westby said. The new study is published Jan. 8 in Frontiers in Ecology and Evolution.

    Westby’s team sampled mosquitoes from eight different populations spanning four climate zones across the eastern United States, including mosquitoes from locations in New Orleans; St. Augustine, Fla.; Huntsville, Ala.; Stillwater, Okla.; St. Louis; Urbana, Ill.; College Park, Md.; and Allegheny County, Pa.

    The scientists collected eggs in the wild and raised larvae from the different geographic locations to adult stages in the lab, tending the mosquito populations separately as they continued to breed and grow. The scientists then used adults and larvae from subsequent generations of these captive-raised mosquitoes in trials to determine CTmax values, ramping up air and water temperatures at a rate of 1 degree Celsius per minute using established research protocols.
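
    With a constant ramp rate, recovering the knockdown temperature from a trial is simple arithmetic. The helper below is a hypothetical illustration of that bookkeeping; the start temperature and timing are made up, not values from the study:

    ```python
    # Hypothetical helper for a dynamic CTmax assay of the kind described
    # above: knockdown temperature = start temperature + rate * elapsed time.
    RAMP_RATE_C_PER_MIN = 1.0  # ramp rate used in the established protocol

    def ctmax(start_temp_c: float, minutes_to_knockdown: float) -> float:
        """Temperature at which the mosquito lost function during the ramp."""
        return start_temp_c + RAMP_RATE_C_PER_MIN * minutes_to_knockdown

    # Example: a trial starting at 30 C with knockdown after 14.5 minutes.
    print(ctmax(30.0, 14.5))  # -> 44.5
    ```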

    The team then tested the relationship between climatic variables measured near each population source and the CTmax of adults and larvae. The scientists found significant differences among the mosquito populations.

    The differences did not appear to follow a simple latitudinal or temperature-dependent pattern, but there were some important trends. Mosquito populations from locations with higher precipitation had higher CTmax values. Overall, the results reveal that mean and maximum seasonal temperatures, relative humidity and annual precipitation may all be important climatic factors in determining CTmax.

    “Larvae had significantly higher thermal limits than adults, and this likely results from different selection pressures for terrestrial adults and aquatic larvae,” said Benjamin Orlinick, first author of the paper and a former undergraduate research fellow at Tyson Research Center. “It appears that adult Ae. albopictus are experiencing temperatures closer to their CTmax than larvae, possibly explaining why there are more differences among adult populations.”

    “The overall trend is for increased heat tolerance with increasing precipitation,” Westby said. “It could be that wetter climates allow mosquitoes to endure hotter temperatures due to decreases in desiccation, as humidity and temperature are known to interact and influence mosquito survival.”

    Little is known about how different vector populations, like those of this kind of mosquito, are adapted to their local climate, nor the potential for vectors to adapt to a rapidly changing climate. This study is one of the few to consider the upper limits of survivability in high temperatures — akin to heat waves — as opposed to the limits imposed by cold winters.

    “Standing genetic variation in heat tolerance is necessary for organisms to adapt to higher temperatures,” Westby said. “That’s why it was important for us to experimentally determine if this mosquito exhibits variation before we can begin to test how, or if, it will adapt to a warmer world.”

    Future research in the lab aims to determine the upper temperature limits at which mosquitoes will still seek out hosts for blood meals in the field, where they spend the hottest parts of the day when temperatures rise above those thresholds, and whether they are already adapting to higher temperatures. “Determining this is key to understanding how climate change will impact disease transmission in the real world,” Westby said. “Mosquitoes in the wild experience fluctuating daily temperatures and humidity that we cannot fully replicate in the lab.”

    Source: Washington University in St. Louis

  • ‘The Human Element’

    Newswise — Sometimes, the best way to see what you’re made of is facing a challenge. Andrew Broadbent, an accomplished project manager at the National Synchrotron Light Source II (NSLS-II), a U.S. Department of Energy (DOE) Office of Science User Facility located at DOE’s Brookhaven National Laboratory, took on such a challenge earlier this year through DOE’s Project Leadership Institute (PLI) and emerged from the yearlong endeavor with his team victorious.

    Cultivating Leadership

    Every year, PLI selects around 25 experienced project leaders endorsed by DOE national laboratories, program offices, and site offices to participate in their intensive, yearlong leadership development program. This program is designed to cultivate the necessary skills to effectively take on and execute high-risk projects. The cohort is split into five teams that work together over the course of the year to conduct a case study analysis of a recent DOE project. Throughout the program, the cohort travels to different national labs across the country to attend events and participates in self-paced learning during the summer. These modules cover important concepts, like leading innovative teams, that often highlight real-life success stories.

    “Everyone on the team has extensive project management experience,” said Broadbent, “so we were all, largely, in the same boat here, and we learned a lot from each other along the way. Each event provided something useful to take away, making it a really valuable program for anyone in the DOE involved with project management.”

    As the program concludes, each team creates a final report and presentation capturing the successes and failures of the project, analyzes the lessons to be learned, and submits them for judging. The judges confer on the analysis they found to be the most impactful and present the winning team with a shared plaque that travels to each teammate’s home institution. Broadbent’s contributions ensured that the plaque would make its final stop at Brookhaven later next year.

    Transforming cUlture Through inclusiOn (TUTO)

    Broadbent’s team, dubbed TUTO, consisted of members from different national laboratories—Jessica Bentley (Sandia National Laboratories), Lisa Ehlers (Lawrence Berkeley National Laboratory), Vincente Guiseppe (Oak Ridge National Laboratory), and Hiro Tanaka (SLAC National Accelerator Laboratory). Each member strengthened the team with their diverse backgrounds, talents, and project experiences. Broadbent drew plenty of inspiration from the projects he has helped manage. For 16 years at NSLS-II, he has been instrumental in the design, installation, and commissioning of several beamlines that are currently serving users performing cutting-edge research, as well as future beamlines that will offer the facility new capabilities.

    For their project, the team explored the execution of DOE’s Facility for Rare Isotope Beams (FRIB) project at Michigan State University (MSU). FRIB’s mission is to produce and research rare isotopes for advancing knowledge in nuclear physics, material science, medicine, defense, and industry. The project was completed in June 2022.

    “FRIB is a unique project not only for its one-of-a-kind mission and technological success but also for its leadership. They successfully navigated an unusual funding and regulatory framework to project completion within budget and five months ahead of schedule,” remarked Broadbent.

    While they explored the project through PLI’s core concepts, they also sought out the values employed by the FRIB team that made their project so successful. In their analysis, they narrowed it down to four main concepts: curation, fluidity, character, and engagement.

    “Curation” was reflected in several aspects of project management, from assembling a team of people with diverse talents to selecting only those processes within the project that were predicted to add value.

    “Fluidity” goes hand in hand with curation. As much as one can try to control a project, unexpected changes are bound to happen at any stage. Things that were carefully curated can suddenly take a different shape. Fluidity is about having that expectation and being able to adapt strategically without compromising on core needs, like safety.

    “Character” fueled these concepts, describing how respectful relationships with effective and empathetic leaders fostered trust, good communication, and conflict resolution that allowed work to be performed smoothly and safely.

    Lastly, there was the concept of “engagement,” teams taking pride and ownership in their work, creating a positive safety culture, sparking community and stakeholder involvement, and promoting inclusivity. All of these concepts link together in such a way that each reinforces the others.

     

    While the presentation covered a lot of ground and sparked some productive discussions, the competition was formidable. There was one more Brookhaven employee in 2023’s cohort: Angelika Drees, collider group leader for the Collider-Accelerator Department. While she was working with another team, she enjoyed comparing and contrasting her experiences with Broadbent as the program concluded and brought back a lot of insight to her current role.

    “I have never looked at another DOE project that closely before and I feel like I learned so much just from making comparisons,” recalled Drees. “It made me think about the new Electron-Ion Collider project in a different way. In some sense, there are a lot of similarities; it’s an accelerator and it has complex physics. And though it may not be the same in terms of scale and scope, there were general concepts that translate from one project to the other. Looking at this project so closely taught us a lot.”

    The scoring between teams was reported to be closer than it had ever been in the past. Regardless of the outcome, the exercise was valuable to all involved and provided a lot to think about for future projects.

    “We really enjoyed doing this,” remarked Broadbent. “Even though writing reports like this tends to be a lot of work, we worked together very well as a team and managed to have fun. It was a very different kind of experience and really made us think. The human side is something everyone can understand, and something everyone can improve upon. That thought came to mind very early on in the project and never went away. Each attribute we uncovered was very human-focused.”

    Brookhaven National Laboratory is supported by the Office of Science of the U.S. Department of Energy. The Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time. For more information, visit science.energy.gov.

    Follow @BrookhavenLab on social media. Find us on Instagram, LinkedIn, X, and Facebook.

     

    Source: Brookhaven National Laboratory

  • Scientists Probe the Emergent Structure of the Carbon Nucleus

    The Science

    Newswise — The element carbon is critical to organic chemistry and life as we know it. The physics of its most common isotope, carbon-12, are extremely complex. Many experimental and theoretical investigations have been devoted to determining the energies and underlying structures of the nuclear states of carbon-12. In this work, researchers computed these states from first principles—the most basic components of physics theory. The approach used supercomputers and nuclear lattice simulations to calculate the three-dimensional shape formed by the protons and neutrons comprising the nucleus. The results show that all of the low-lying energy states of carbon-12 have a substructure where the six protons and six neutrons cluster together into alpha particles. Alpha particles are helium-4 nuclei, which contain two protons and two neutrons.

    The Impact

    One well-known nuclear state of carbon-12 is the Hoyle state. Its energy sits near the threshold for three alpha particles, or helium nuclei, and this proximity greatly enhances the production of carbon in helium-burning stars, helping to explain the presence of carbon in the Universe. The results obtained in this research show that the Hoyle state is composed of a “bent arm” or obtuse triangular arrangement of alpha particles. All the low-lying energy states of carbon-12 have an intrinsic shape composed of three alpha particles forming either an equilateral triangle or an obtuse triangle. The new results give information about the possible geometrical shapes of nuclear states.

    Summary

    The carbon atom provides the backbone for the complex organic chemistry composing the building blocks of life. The physics of the carbon nucleus in its predominant isotope, carbon-12, are also full of complexity. Researchers from the University of Bonn, Forschungszentrum Jülich in Germany, the Gaziantep Islamic Science and Technology University in Turkey, the Graduate School of China Academy of Engineering Physics, Tbilisi State University, and the Facility for Rare Isotope Beams at Michigan State University calculated the structure of the nuclear states of carbon-12 using the ab initio framework of nuclear lattice effective field theory.

    The research found that all the low-lying states of carbon-12 have an intrinsic shape composed of three alpha clusters forming either an equilateral triangle or an obtuse triangle. The states with the equilateral triangle shape also have a dual description in terms of particle-hole excitations in a mean-field picture. The results agree with experimental data and provide the first model-independent density map of the nuclear states of carbon-12. The results help to explain the origins of carbon from the helium and hydrogen that made up the Universe shortly after the Big Bang.

    Funding

    This research was funded by the Deutsche Forschungsgemeinschaft (the German Research Foundation), the National Natural Science Foundation of China, the Chinese Academy of Sciences President’s International Fellowship Initiative, the National Security Academic Fund of China, Volkswagen Stiftung, the European Research Council, the Department of Energy, and the Nuclear Computational Low-Energy Initiative SciDAC-4 project, as well as computational resources provided by the Gauss Centre for Supercomputing e.V. and the Oak Ridge Leadership Computing Facility.


    Journal Link: Nature Communications, May-2023

    Source: Department of Energy, Office of Science

  • “Energy Droughts” in Wind and Solar Can Last Nearly a Week, Research Shows

    Newswise — Solar and wind power may be free, renewable fuels, but they also depend on natural processes that humans cannot control. It’s one thing to acknowledge the familiar risks of renewable energy: the sun doesn’t always shine, and the wind doesn’t always blow. But what happens when the grid loses both of these energy sources at the same time?

    This phenomenon is known as a compound energy drought. In a new paper, researchers at Pacific Northwest National Laboratory (PNNL) found that in some parts of the country, these energy droughts can last nearly a week.

    “When we have a completely decarbonized grid and depend heavily on solar and wind, energy droughts could have huge amounts of impact on the grid,” said Cameron Bracken, an Earth scientist at PNNL and lead author on the paper. Grid operators need to know when energy droughts will occur so they can prepare to pull energy from different sources. On top of that, understanding where, when, and for how long energy droughts occur will help experts manage grid-level battery systems that can store enough electricity to deploy during times when energy is needed most.

    The team published the findings October 31 in the journal Renewable Energy and will be presenting at this week’s annual meeting of the American Geophysical Union.

    Hunting for cloudy, windless days

    In the past, researchers studied compound energy droughts on a state or regional scale, but little work has examined them nationwide. To find out more about the risk of energy droughts over the entire continental U.S., the researchers dug into weather data and then used historical energy demand data to understand how often an energy drought occurs when that energy is needed the most.

    The team examined four decades of hourly weather data for the continental U.S. and homed in on geographical areas where actual solar and wind energy plants operate today. Weather data included wind speeds at the height of wind turbines as well as the intensity of solar energy falling on solar panels. Times when the weather data showed stagnant air and cloudy skies translated into lower energy generation from the wind and solar plants—a compound energy drought.
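
    The flagging logic implied here can be sketched directly: mark the hours in which both resources fall below some cutoff, then measure the consecutive runs. The thresholds and data below are stand-ins, not the paper's definitions:

    ```python
    # Illustrative compound-drought flagging on hourly data: hours where
    # both wind and solar resources are low, grouped into consecutive runs.
    import numpy as np

    rng = np.random.default_rng(1)
    wind_speed = rng.uniform(0, 12, size=24 * 365)  # m/s at hub height
    solar_ghi = rng.uniform(0, 900, size=24 * 365)  # W/m^2 on the panels

    drought = (wind_speed < 3.0) & (solar_ghi < 100.0)  # both resources low

    # Collect the length of each consecutive run of drought hours.
    runs, current = [], 0
    for flagged in drought:
        if flagged:
            current += 1
        elif current:
            runs.append(current)
            current = 0
    if current:
        runs.append(current)

    print("longest drought (hours):", max(runs) if runs else 0)
    ```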

    “We essentially took a snapshot of the infrastructure as of 2020 and ran it through the 40 years of weather data, starting in 1980,” Bracken said. “We are basically saying ‘here is how the current infrastructure would have performed under historical weather conditions.’”

    The researchers found that energy droughts can occur in any season across the continental U.S., though they vary widely in frequency and duration. In California, for instance, cloudy and windless conditions might last several days, whereas the same conditions might last for only a few hours in Texas. Utah, Colorado, and Kansas experience frequent energy droughts both over several-hour timescales as well as several-day timescales. The Pacific Northwest and Northeast, meanwhile, seem to experience energy droughts that last several hours more frequently than several days. The different timescales (hourly versus daily) will help inform the energy drought’s impact on the grid—will it last just a few hours, or several days?

    Overall, researchers found that the longest potential compound energy drought on an hourly timescale was 37 hours (in Texas), while the longest energy drought on a daily timescale was six days (in California).

    Energy drought at peak demand

    Simply knowing the where and how of energy droughts is just one piece of the puzzle, Bracken said. He also stressed that a drought of solar and wind power won’t necessarily cause an energy shortage. Grid operators can turn to other sources of energy like hydropower, fossil fuels, or energy transmitted from other regions in the U.S.

    But as the nation aims to move away from fossil fuels and rely more on solar and wind power, grid operators must understand whether energy droughts will occur during times when the demand for electricity might exceed supply. Climate change brings hotter summers and more intense winter storms; at those times, not only do people use more energy to stay safe (for cooling or heating), but access to electricity might mean life or death.

    To understand the possible connection between energy droughts and energy demand, the team mapped their historical, hypothetical generation data onto 40 years of historical energy demand data that also covered real power plants across the continent.

    The data showed that “wind and solar droughts happen during peak demand events more than you would expect due to chance,” Bracken said, meaning that more often than not, windless and overcast periods occurred during times when demand for power was high. For now, Bracken isn’t certain that the correlation implies causation.

    “This could be due to well-understood meteorological phenomena, such as inversions suppressing wind and increasing temperatures, but further study is needed,” Bracken said.
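
    The phrase “more than you would expect due to chance” has a concrete reading: under independence, droughts and peak-demand hours should co-occur at a rate equal to the product of their separate rates. A toy check of that comparison, with random stand-in flags:

    ```python
    # Back-of-the-envelope independence check: compare the observed
    # co-occurrence rate with the product of the marginal rates.
    import numpy as np

    rng = np.random.default_rng(2)
    drought = rng.random(24 * 365) < 0.05      # drought-hour flags (stand-in)
    peak_demand = rng.random(24 * 365) < 0.10  # peak-demand flags (stand-in)

    observed = np.mean(drought & peak_demand)
    expected = drought.mean() * peak_demand.mean()
    print(f"observed {observed:.4f} vs expected-by-chance {expected:.4f}")
    ```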

    Energy storage for energy droughts

    Studying patterns in the frequency and duration of energy droughts will also help inform the deployment of long-duration energy storage projects, said Nathalie Voisin, an Earth scientist at PNNL and coauthor on the paper. The paper is the first to provide a uniform standard of what a compound energy drought is and how long it can last in different parts of the country.

    “We’re providing insight on how to adequately design and manage multi-day storage. So when you know an energy drought is going to last for five hours or five days, you can incentivize storage to be managed accordingly,” Voisin said.

    Next, Bracken and the team will extrapolate weather and demand data into the future to see how climate change will affect the frequency and duration of energy droughts. The team plans to model energy droughts all the way to the end of the century combined with evolving infrastructure.

    This research was funded by PNNL through its internal GODEEEP initiative.

    Source: Pacific Northwest National Laboratory

  • Advisory panel issues field-defining recommendations for investments in particle physics research

    Newswise — Yesterday marked the release of a highly anticipated report from the Particle Physics Project Prioritization Panel (P5), unveiling an exciting new roadmap for unlocking the secrets of the cosmos through particle physics.

    The report was released by the High Energy Physics Advisory Panel to the High Energy Physics program of the Office of Science of the U.S. Department of Energy (DOE) and the National Science Foundation’s Division of Physics. It outlines particle physicists’ recommendations for research priorities in a field whose projects — such as building new accelerator facilities — can take years or decades and require contributions from thousands of scientists and billions of dollars.

    The 2023 P5 report represents the major activity in the field of particle physics that delivers recommendations to U.S. funding agencies. This year’s report builds on the output of the 2021 Snowmass planning exercise — a process organized by the American Physical Society’s (APS) Division of Particles and Fields that convened particle physicists and cosmologists from around the world to outline research priorities. This membership division constitutes the only independent body in the U.S. that represents particle physics as a whole.

    “With our state-of-the-art facilities and community of dedicated scientists, Argonne’s contributions are shaping the global trajectory of high-energy physics.” — Rik Yoshida, Argonne High Energy Physics Division Director

    “The P5 report will lay the foundation for a very bright future in the field,” said R. Sekhar Chivukula, 2023 chair of the APS Division of Particles and Fields and a distinguished professor of physics at the University of California, San Diego. “There are extraordinarily important scientific questions remaining in particle physics, which the U.S. particle physics community has both the capability and opportunity to help address, within our own facilities and as a member of the global high energy physics community.”

    The report includes a range of budget-conscious recommendations for federal investments in research programs, the U.S. technical workforce and the technology and infrastructure needed to realize the next generation of transformative discoveries related to fundamental physics and the origin of the universe. For example, the report recommends continued support for the Deep Underground Neutrino Experiment (DUNE), based out of DOE’s Fermilab in Illinois, for CMB-S4, a network of ground-based telescopes designed to observe the cosmic microwave background (CMB), and for the planned expansion of the South Pole’s neutrino observatory, an international collaboration known as IceCube-Gen2, in a facility operated by the University of Wisconsin–Madison.

    Researchers at DOE’s Argonne National Laboratory stand at the forefront of high energy physics and are poised to contribute significantly to the advancement of the field over the next decade. They are exploring the fundamental nature of the universe and pioneering innovative technologies with far-reaching implications. In particular, Argonne’s High Energy Physics (HEP) division leverages the laboratory’s suite of multidisciplinary facilities and equipment — including world-class scientific computing capabilities — to further scientific discovery and advance accelerator technology. For example, Argonne’s contributions to key high energy physics collaborations include the design and fabrication of components for DUNE, the development of cutting-edge detectors for CMB-S4 and more.

    “With our state-of-the-art facilities and community of dedicated scientists, Argonne’s contributions are helping to shape the global trajectory of high-energy physics,” said Rik Yoshida, director of Argonne’s HEP division. “This report reflects the collective wisdom of the high energy physics community, and we look forward to leveraging our expertise and capabilities here at Argonne to help uncover the mysteries of the universe, drive innovation, inspire future generations of scientists and bolster our nation’s vital role in the future of particle physics.”

    “In the P5 exercise, it’s really important that we take this broad look at where the field of particle physics is headed, to deliver a report that amounts to a strategic plan for the U.S. community with a 10-year budgetary timeline and a 20-year context. The panel thought about where the next big discoveries might lie and how we could maximize impact within budget, to support future discoveries and the next generation of researchers and technical workers who will be needed to achieve them,” said Karsten Heeger, P5 panel deputy chair and Eugene Higgins Professor and chair of physics at Yale University.

    New knowledge, and new technologies, set the stage for the most recent Snowmass and P5 convenings. “The Higgs boson had just been discovered before the previous P5 process, and now our continued study of the particle has greatly informed what we think may lie beyond the standard model of particle physics,” said Hitoshi Murayama, P5 panel chair and the MacAdams Professor of physics at the University of California, Berkeley. “Our thinking about what dark matter might be has also changed, forcing the community to look elsewhere — to the cosmos. And in 2015, the discovery of gravitational waves was reported. Accelerator technology is changing too, which has shifted the discussion to the technology R&D needed to build the next-generation particle collider.”

    The U.S. participates in several major international scientific collaborations in high energy physics and cosmology, including at the European Organization for Nuclear Research (CERN), which operates the Large Hadron Collider, where the Higgs boson was discovered in 2012. The P5 report recommends that the U.S. support a significant in-kind contribution to a new international facility, the “Higgs factory,” to further our understanding of the Higgs boson.

    It also recommends that the U.S. study the possibility of hosting the next most-advanced particle collider facility to reinforce the country’s leading role in international high energy physics for decades to come.

    Activities of the P5 are supported in part by the APS’s Division of Particles and Fields.

    The American Physical Society is a nonprofit membership organization working to advance and diffuse the knowledge of physics through its outstanding research journals, scientific meetings, and education, outreach, advocacy, and international activities. APS represents more than 50,000 members, including physicists in academia, national laboratories, and industry in the United States and throughout the world.

    Source: Argonne National Laboratory

  • Nature Inspires a New Wave of Biotechnology

    The Science

    Newswise — Biological molecules called peptides play a key role in many biological activities, including the transport of oxygen and electrons. Peptides consist of short chains of amino acids, the building blocks of proteins. They are also the inspiration for new kinds of biotechnology. Researchers are developing a synthetic form of a peptide that self-assembles into nanoscale fibers that conduct electricity when combined with heme. Heme is a substance that helps proteins in nature move electrons from one place to another. The researchers determined how the electrical conductivity of their peptide nanofibers was affected by the length and identity of the amino acid sequence in the peptide.

    The Impact

    Structural parameters of peptides in nature determine their function and their promise for biotechnology. These parameters include sequence length—the length of the peptide segments that make up complete peptide chains. They also include how some amino acids are arranged in a peptide. This study’s results help researchers design peptide assemblies that form nanoscale fibers and transport electrons over long distances, which could make these fibers useful in medical devices, biosensors for a wide range of applications, and robotics. They also have promise in the development of new enzymes, which companies use to make and improve things such as medical-grade and household cleaning products.

    Summary

    Fields in materials and biochemistry research explore protein and peptide nanostructures found in nature. These nanostructures show great promise as bioelectronic materials. The development of a synthetic analog capable of forming one-dimensional (1D) nanostructures would greatly improve scientists’ understanding of the natural system and provide a platform for developing new materials. Researchers in the Center for Nanoscale Materials at Argonne National Laboratory investigated a series of peptides that self-assemble into 1D layered nanostructures. The peptides PA-(Kx)n are denoted simply as PA-Kxn, where PA is c16-AH with c16-A being modified alanine (A) and H is histidine, K is lysine, n is the sequence repeat length (1-4), and x is the amino acid leucine (L), isoleucine (I), or phenylalanine (F).

    The team determined how the length of the peptide sequence (n) and the identity of the hydrophobic amino acid affect key factors: the binding affinity of heme to pre-assembled peptides, the heme density, and the electronic properties. With a sequence length of 2, the peptide assembly yielded the greatest binding affinity. The resulting nanoscale assemblies produced ordered arrays of the electroactive molecule heme. All the peptides, with the exception of PA-KL1, had nanofibers with a long aspect ratio regardless of repeat unit length and sequence. Such structures have potential utility as supramolecular bioelectronic materials useful in biomedical sensing and the development of enzymatic materials.

    Funding

    Research at the Center for Nanoscale Materials, a Department of Energy (DOE) Office of Science user facility, was supported by DOE Office of Science, Office of Basic Energy Sciences.


    Journal Link: Nanoscale, Jun-2022

    Source: Department of Energy, Office of Science

  • Want Better AI? Get Input From a Real (Human) Expert

    Newswise — Can AI be trusted? The question pops up wherever AI is used or discussed—which, these days, is everywhere.

    It’s a question that even some AI systems ask themselves. 

    Many machine-learning systems create what experts call a “confidence score,” a value that reflects how confident the system is in its decisions. A low score tells the human user that there is some uncertainty about the recommendation; a high score indicates that the system, at least, is quite sure of its decisions. Savvy humans know to check the confidence score when deciding whether to trust the recommendation of a machine-learning system.
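
    One common way such a score is produced, shown purely as a generic illustration rather than the method of any system discussed here, is the probability a classifier assigns to its own predicted class:

    ```python
    # Generic confidence score: the softmax probability behind the
    # predicted class. Illustration only.
    import numpy as np

    def confidence(logits: np.ndarray) -> tuple[int, float]:
        """Return the predicted class and the probability assigned to it."""
        exps = np.exp(logits - logits.max())
        probs = exps / exps.sum()
        pred = int(probs.argmax())
        return pred, float(probs[pred])

    print(confidence(np.array([2.0, 0.1, -1.0])))  # high confidence
    print(confidence(np.array([0.3, 0.2, 0.25])))  # low confidence
    ```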

    Scientists at the Department of Energy’s Pacific Northwest National Laboratory have put forth a new way to evaluate an AI system’s recommendations. They bring human experts into the loop to view how the machine-learning system performed on a set of data. The expert learns which types of data the system typically classifies correctly, and which data types lead to confusion and system errors. Armed with this knowledge, the experts then offer their own confidence score on future system recommendations.

    The result of having a human look over the shoulder of the AI system? Humans predicted the AI system’s performance more accurately.

    Minimal human effort—just a few hours spent evaluating some of the decisions made by the AI program—allowed researchers to vastly improve the assessment of the program’s decisions. In some analyses by the team, the accuracy of the confidence score doubled when a human provided the score.

    The PNNL team presented its results at a recent meeting of the Human Factors and Ergonomics Society in Washington, D.C., part of a session on human-AI robot teaming.

    “If you didn’t develop the machine-learning algorithm in the first place, then it can seem like a black box,” said Corey Fallon, the lead author of the study and an expert in human-machine interaction. “In some cases, the decisions seem fine. In other cases, you might get a recommendation that is a real head-scratcher. You may not understand why it’s making the decisions it is.”

    The grid and AI

    It’s a dilemma that power engineers working with the electric grid face. Their decisions, based on reams of data that change every instant, keep the lights on and the nation running. But power engineers may be reluctant to turn over decision-making authority to machine-learning systems.

    “There are hundreds of research papers about the use of machine learning in power systems, but almost none of them are applied in the real world. Many operators simply don’t trust ML. They have domain experience—something that ML can’t learn,” said coauthor Tianzhixi “Tim” Yin.

    The researchers at PNNL, which has a world-class team modernizing the grid, took a closer look at one machine-learning algorithm applied to power systems. They trained the SVM (support-vector machine) algorithm on real data from the grid’s Eastern Interconnection in the U.S. The program looked at 124 events, deciding whether a generator was malfunctioning, or whether the data was showing other types of events that are less noteworthy.
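
    A schematic of that setup, with synthetic stand-in features in place of the real Eastern Interconnection measurements, might look like the following:

    ```python
    # Schematic SVM training on labeled grid events. The data here is
    # random stand-in; the real study used 124 recorded events.
    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.svm import SVC

    rng = np.random.default_rng(3)
    X = rng.normal(size=(124, 8))     # features derived from grid signals
    y = rng.integers(0, 2, size=124)  # 1 = generator malfunction, 0 = other

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25,
                                              random_state=0)
    # probability=True makes the model emit confidence-like scores.
    model = SVC(probability=True).fit(X_tr, y_tr)
    print("accuracy:", model.score(X_te, y_te))
    ```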

    The algorithm was 85% reliable in its decisions. Many of its errors occurred when there were complex power bumps or frequency shifts. Confidence scores created with a human in the loop were a marked improvement over the system’s assessment of its own decisions. The human expert’s input predicted the algorithm’s decisions with much greater accuracy.

     

    More human, better machine learning

    Fallon and Yin call the new score an “Expert-Derived Confidence” score, or EDC score.

    They found that, on average, when humans weighed in on the data, their EDC scores predicted model behavior that the algorithm’s confidence scores couldn’t predict.

    “The human expert fills in gaps in the ML’s knowledge,” said Yin. “The human provides information that the ML did not have, and we show that that information is significant. The bottom line is that we’ve shown that if you add human expertise to the ML results, you get much better confidence.”
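
    PNNL's published EDC procedure is not detailed here, but the underlying idea, an expert's per-category track record standing in for the model's self-assessment, can be sketched as follows. The event categories and review data are hypothetical:

    ```python
    # One plausible reading of the EDC idea, for illustration only: an
    # expert reviews a sample of model decisions, and the per-category
    # accuracy becomes the confidence attached to future recommendations.
    from collections import defaultdict

    def edc_table(reviewed):
        """reviewed: list of (event_kind, model_was_correct) pairs."""
        hits, totals = defaultdict(int), defaultdict(int)
        for kind, correct in reviewed:
            totals[kind] += 1
            hits[kind] += int(correct)
        return {kind: hits[kind] / totals[kind] for kind in totals}

    reviewed = [("frequency_shift", False), ("frequency_shift", False),
                ("steady_state", True), ("steady_state", True),
                ("power_bump", True), ("power_bump", False)]
    table = edc_table(reviewed)
    print(table["frequency_shift"])  # low EDC: expert distrusts these calls
    ```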

    The work by Fallon and Yin was funded by PNNL through an initiative known as MARS—Mathematics for Artificial Reasoning in Science. The initiative is part of a broader push in artificial intelligence at PNNL. It brought together Fallon, an expert on human-machine teaming and human factors research, and Yin, a data scientist and an expert on machine learning.

    “This is the type of research needed to prepare and equip an AI-ready workforce,” said Fallon. “If people don’t trust the tool, then you’ve wasted your time and money. You’ve got to know what will happen when you take a machine learning model out of the laboratory and put it to work in the real world.

    “I’m a big fan of human expertise and of human-machine teaming. Our EDC scores allow the human to better assess the situation and make the ultimate decision.”

    # # #

    Source: Pacific Northwest National Laboratory

  • Topology, Algebra, and Geometry Give Math Respect in Data Science

    By John Roach

    Newswise — In the computer vision field of object detection, deep learning models are trained to identify objects of interest within an image of a scene. For example, such models can be trained to detect viruses in microscopy images or pick out airplanes parked on tarmacs in overhead aerial imagery.

    “In many cases, like microscopy or overhead images, a user would want to ensure that the objects are found regardless of their orientation,” said Tegan Emerson, a senior data scientist and leader of the mathematics, statistics, and data science group at Pacific Northwest National Laboratory (PNNL). “However, this property is not inherent in all deep learning models.”

    In some cases, the deep learning model can pick out the airplanes with noses pointed north but fail to detect the airplanes pointed south, for instance.

    Emerson and her colleagues explored solutions to address this problem by applying the algebraic concept of group action to a deep learning model for object detection. Group action describes how things are changed under a collection of operations such as rotation. With these algebra-based architecture changes applied to the model, objects are more reliably detected in imagery no matter their orientation.
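
    One blunt way to get this behavior, shown only to illustrate the group-action concept, is to average a model's score over the four 90-degree rotations (the C4 group) of the input. The study instead builds the symmetry into the architecture itself:

    ```python
    # Concept demo: make any scoring function invariant under the C4 group
    # by averaging over the group's action (rotations) on the input.
    import numpy as np

    def c4_invariant_score(model, image: np.ndarray) -> float:
        """Average the model's score over all four rotations of the image."""
        scores = [model(np.rot90(image, k)) for k in range(4)]
        return float(np.mean(scores))

    # Toy "model": total brightness (already rotation-invariant, so the
    # averaged score matches any single orientation).
    model = lambda img: img.sum()
    print(c4_invariant_score(model, np.arange(9.0).reshape(3, 3)))
    ```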

    “If you constrain the model to have this type of mathematical invariance to it, you’re able to maintain your ability to detect and appropriately identify the objects within your scene, which makes this a much more trustworthy tool for people to use,” Emerson said. “That matters in operational environments where a lot of our algorithms are going to be deployed.”

    Giving math respect in data science

    In recent years, mathematicians were pushed to the sidelines in data science disciplines as computer power and datasets used to train machine learning (ML) models grew exponentially and led to a step-change in capabilities such as artificial intelligence (AI) systems that can generate fluid prose in natural language, noted Timothy Doster, a senior data scientist at PNNL.

    “The mathematics community felt a little behind the times as massive amounts of funding went into these computer science fields,” he said. “But now they’re seeing research around explainability or dependability of these algorithms, and that’s where math can really come in and address these areas.”

    In 2022, Doster, Emerson, and PNNL data scientist colleague Henry Kvinge co-founded the Topology, Algebra, and Geometry in Data Science (TAG-DS) community to help spur interest in the application of math to address specific topics in data science and ML.

    The community hosts workshops and conferences as well as provides publishing opportunities to drive awareness of mathematically principled solutions to data science problems. Most recently, the team hosted the second annual TAG in ML workshop at the International Conference on Machine Learning (ICML) on July 28, 2023, in Honolulu, Hawaii, and attracted more than 200 participants.

    Part of the interest in the TAG-DS community stems from the growing complexity of ML systems, which operate on high-dimensional, complex datasets using models that have thousands to billions of learnable parameters, noted Kvinge.

    “Such settings transcend human intuition, which quickly degrades beyond three dimensions,” he said. “Modern topology, algebra, and geometry were designed to allow mathematicians to understand exotic spaces, making them natural toolboxes to investigate when studying state-of-the-art machine learning.”

    Proof of math in data science

    In some cases, the application of math to data science can improve the rigor of AI models trained with massive datasets and computer power. For example, the mathematical study of symmetry, or representation theory, is used in some of the models capable of predicting how proteins fold and twist into their three-dimensional shapes, according to Kvinge.

    Protein folding models help scientists understand the structure of proteins, which are the building blocks of life—they are molecular machines that play a fundamental role in the structure, function, and regulation of nearly every biological process.

    “We know that how a protein folds should not depend on its location in space nor its orientation, and consequently a deep learning model should ignore these factors of variation when processing representations of proteins,” he explained. “Building model architectures can be done far more accurately when you understand how to capture the symmetries intrinsic to three-dimensional space.”

    In other cases, mathematical techniques can improve the data used in more niche data science tasks. For example, topological data analysis can extract shape-based features for ML models used to understand the structure and properties of materials, such as the metal rods, tubes, and cubes that give cars and trucks their shape, strength, and fuel economy.

    “Topology is the study of shape and there is a widely used quote from a leader in the field that states, ‘Data has shape, shape has meaning’ and what shape means for different formats of data can be nuanced,” noted Emerson.

    In one study, researchers applied topology to scanning electron microscopy images that were used to support research and development in advanced manufacturing. In this case, white precipitates, or solid materials, that formed during a metal manufacturing process were visible throughout the image. By looking at the topology of the precipitates at multiple threshold values, the team was able to capture physically meaningful features, summarize the information, and use it as input to the ML model.
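
    A stripped-down sketch of that multi-threshold idea: binarize the image at several gray levels and count the connected bright regions at each level, yielding one shape-based feature per threshold. The data here is random stand-in, and the study's topological features were richer:

    ```python
    # Crude multi-threshold shape features: count connected bright regions
    # (candidate precipitates) at several gray levels. Illustration only.
    import numpy as np
    from scipy import ndimage

    rng = np.random.default_rng(4)
    image = rng.random((128, 128))  # stand-in for a microscopy image

    features = []
    for threshold in np.linspace(0.5, 0.9, 5):
        labeled, num_regions = ndimage.label(image > threshold)
        features.append(num_regions)  # one feature per threshold level

    print(features)  # feature vector to feed an ML model
    ```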

    “Part of the difference in the paradigm for TAG-DS both at PNNL and in the scientific community is that you’re not just trying to train a model. What you’re trying to do is build a solution,” said Emerson. “You want something that actually addresses a need or a way to support a human who is involved in the processing pipeline.”

    Growing the TAG-DS community

    Engagement with the TAG-DS community has more than doubled in its first year of existence, according to Doster. For example, the TAG-ML workshop at ICML in 2022 had about 40 published submissions. This year’s workshop received more than 90 submissions and included four keynotes by world leaders in geometric and topological deep learning, two poster sessions, six spotlight talks, and other activities.

    Looking forward, the group is planning to host more workshops at computer science and mathematics conferences and is aiming to host a standalone TAG-DS conference in 2025.

    According to Emerson, the ability of TAG-DS to increase the rigor, trustworthiness, and explainability of AI systems will only grow in importance as technologies such as generative AI become widespread.

    “From a national laboratory’s perspective with our interest for the nation, but also for the average person in daily life, the mathematical rigor that the TAG-DS community can bring to understanding the ways these tools can support you, when they will work, how they will fail, and when they are not an appropriate technique to be using is critical,” she said.

    ###

    About PNNL

    Pacific Northwest National Laboratory draws on its distinguishing strengths in chemistry, Earth sciences, biology and data science to advance scientific knowledge and address challenges in sustainable energy and national security. Founded in 1965, PNNL is operated by Battelle for the Department of Energy’s Office of Science, which is the single largest supporter of basic research in the physical sciences in the United States. DOE’s Office of Science is working to address some of the most pressing challenges of our time. For more information, visit https://energy.gov/science. For more information on PNNL, visit PNNL’s News Center. Follow us on Twitter, Facebook, LinkedIn and Instagram.

    Source: Pacific Northwest National Laboratory

  • Probing the Intricate Structures of 2D Materials at the Nanoscale

    The Science

    Two-dimensional (2D) materials are just a single or a few layers of atoms thick. These materials often have exotic properties that may be useful for next-generation technologies. When layers of these materials are stacked, the electronic properties that emerge can be manipulated by, for example, twisting the layers with respect to one another. To fully understand these properties and correlate them with the twist angle, scientists need advanced microscopy techniques. Researchers developed a novel operating mode for the interferometric four-dimensional scanning transmission electron microscopy (4D-STEM) technique. This special technique allows researchers to measure the atomic-scale structural distortions, twist angle, and interlayer spacings that influence the unique electronic properties of layered 2D materials.

    The Impact

    Layered 2D materials have special properties that can advance technology beyond existing capabilities. For example, they could lead to faster and more energy efficient computers or more reliable electricity storage. The individual layers that make up these materials may each be oriented differently. This creates challenges in fully understanding their 3D atomic structures with existing microscopy techniques. Interferometric 4D-STEM can reveal the relative positions of atoms within separate layers of stacked and twisted 2D materials. The technique opens avenues to the design and development of materials with useful properties.

    Summary

    Layered 2D materials have attracted considerable attention due to their interesting electronic properties, which can be modified by changing the twist angle of bilayer materials, the stacking sequence of trilayer materials, or other factors. To fully understand and control the properties of these materials, researchers need to study their atomic structures. However, visualizing the atomic structure of few-layered materials is often challenging using conventional microscopy techniques, such as when working with materials composed of light elements or when 3D information is needed. Researchers need new techniques to improve precision and locally measure distortions and interlayer spacings in twisted materials composed of two or three layers, especially when they contain light elements or high twist angles.

    Researchers developed a new interferometric 4D-STEM modality that can provide information about local structural deformations within layers, twist direction and magnitude between layers, and interlayer distances for few-layered 2D materials. This new operating mode of 4D-STEM is still based on Bragg interferometry but uses a defocused electron probe to directly provide information about the relative positions of atoms within separate layers, as demonstrated in this study in bilayer and trilayer graphene. The technique sheds new light on the interplay between electronic properties and the precise structural arrangements of few-layer 2D materials.

     

    Funding

    The research was supported by the Center for Nanophase Materials Sciences, a DOE Office of Science user facility, and by the DOE Office of Science Early Career Award Program. Additional support was provided by the European Research Council and resources at the Vienna Scientific Cluster.


    Journal Link: Small, Jun-2021

    Source: Department of Energy, Office of Science

  • Zeroing in on EV batteries with more storage and faster charging

    Newswise — Currently, the biggest hurdle for electric vehicles, or EVs, is the development of advanced battery technology to extend driving range and improve safety and reliability.

    New research has shown how a novel lithium-based electrolyte material (Li9N2Cl3) can be used to develop solid-state batteries that charge faster and store more energy than conventional designs. Experiments revealed the solid electrolyte was not only stable in normal air environments but also inhibited the growth of dendrites — dangerous, branchlike formations that cause batteries to catch fire.

    Oak Ridge National Laboratory scientist Jue Liu conducted neutron experiments to observe how lithium moved through the material.

    “The material’s dry air stability, efficient lithium-ion transport, and high compatibility toward metallic lithium are crucial advances. It’s the best of both worlds,” he said. “It offers all the performance benefits of liquid-electrolyte batteries that we use every day, but it’s safer and more reliable.”


    Journal Link: Science Advances

    Oak Ridge National Laboratory

  • Scientists Amplify Superconducting Sensor Array Signals Near the Quantum Limit

    The Science

    Newswise — Understanding how energy moves in materials is fundamental to the study of quantum phenomena, catalytic reactions, and complex proteins. Measuring how energy moves involves shining special X-ray light onto a sample to start a reaction. Detectors then collect the radiation the reaction emits. Conventional sensors usually lack the sensitivity needed for these studies. One solution is to use superconducting sensors. But amplifying the signals from these sensors is a major challenge. Building on advances from quantum computing, researchers added a special type of amplifier, the superconducting traveling-wave parametric amplifier. While most amplifiers add noise to the measurement, these amplifiers are almost noiseless. In a major advance, researchers recently showed that the amplifiers can operate at 4 Kelvin, a relatively high operating temperature for such devices.

    The Impact

    Reducing the noise that is added during signal processing can improve a sensor’s performance. Amplification allows each sensor to operate faster and be more sensitive. Recent experiments have shown that parametric amplifiers can potentially analyze signals from many superconducting sensors at the same time. Superconducting sensors work at very low temperatures. At these temperatures, parametric amplifiers have very good noise performance, close to the limit of quantum mechanics. The advance paves the way to integrate such amplifiers with a variety of sensor technologies.

    Summary

    A superconducting sensor consists of a superconducting thermometer and an absorber. When X-rays are stopped in the absorber, they change the superconducting state of the sensor. This generates a small current in an electrical circuit. To make the detector more sensitive, many sensors are arranged into an array, like in a digital camera. Superconducting sensors operate at very cold temperatures (approximately 0.09 Kelvin), so they require specialized readout electronics and amplifiers. These amplifiers need to combine the signals from multiple sensors on a single readout line. Combining signals is known as multiplexing. One efficient way to do this is to couple each sensor in an array to a resonator. All of the resonators are coupled to a single output line. The current produced by an absorbed photon shifts the resonant frequency in a unique way for each sensor.

    Because these resonators work at microwave frequencies, the electronic chip that contains all the resonators as well as the output feedline is called the microwave multiplexer. Researchers are preparing to measure the signals from an array of sensors and a microwave multiplexer with a readout chain whose first amplifier is a kinetic-inductance traveling-wave parametric amplifier instead of a conventional semiconductor amplifier. Using the parametric amplifier will reduce readout noise and enable larger arrays of faster sensors.
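
    As a rough picture of how this frequency-division scheme attributes an event to a single sensor on a shared line, the following idealized Python sketch may help. It is not the instrument’s readout software; the resonator frequencies, quality factor, and photon-induced frequency shift are assumed values chosen only for illustration.

    ```python
    import numpy as np

    # Idealized multiplexer: each sensor owns one resonator on a shared feedline.
    resonators_ghz = {f"sensor_{i}": 5.0 + 0.002 * i for i in range(8)}  # 2 MHz spacing

    def transmission(f_ghz, f0_ghz, q=20_000):
        """Lorentzian dip of a single resonator (toy model)."""
        return 1.0 - 1.0 / (1.0 + (2 * q * (f_ghz - f0_ghz) / f0_ghz) ** 2)

    def feedline_response(f_ghz, shifts_khz):
        """Product of all resonator dips; an absorbed photon shifts one resonance."""
        s = np.ones_like(f_ghz)
        for name, f0 in resonators_ghz.items():
            f0_shifted = f0 + shifts_khz.get(name, 0.0) * 1e-6  # kHz -> GHz
            s *= transmission(f_ghz, f0_shifted)
        return s

    f = np.linspace(4.999, 5.016, 200_001)
    baseline = feedline_response(f, {})
    event = feedline_response(f, {"sensor_3": 50.0})  # photon shifts sensor 3 by 50 kHz

    # The largest change in transmission appears near sensor 3's resonator,
    # identifying which sensor absorbed the photon.
    hit_ghz = f[np.argmax(np.abs(event - baseline))]
    print(f"event near {hit_ghz:.6f} GHz (sensor_3 sits at {resonators_ghz['sensor_3']:.3f} GHz)")
    ```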

     

    Funding

    This work was funded by the Department of Energy Office of Science, Basic Energy Sciences Accelerator and Detector Research Program, the National Institute of Standards and Technology’s Innovations in Measurement Science Program, and NASA.


    Journal Link: Physical Review Applied, Apr-2022

    Department of Energy, Office of Science

  • ORNL scientists close the cycle on recycling mixed plastics

    Newswise — Little of the mixed consumer plastic thrown away or placed in recycling bins actually ends up being recycled. Nearly 90% is buried in landfills or incinerated at commercial facilities that generate greenhouse gases and airborne toxins. Neither outcome is ideal for the environment.

    Why aren’t more mixed plastics recycled? It’s usually easier and less expensive to make new plastic products than reclaim, sort and recycle used ones. Conventional recycling of mixed plastics has previously meant manually or mechanically separating the plastics according to their constituent polymers.

    Addressing the issue, scientists at the Department of Energy’s Oak Ridge National Laboratory used carefully planned chemical design, neutron scattering and high-performance computing to help develop a new catalytic recycling process. The catalyst selectively and sequentially deconstructs multiple polymers in mixed plastics into pristine monomers — molecules that react with other monomer molecules to form a polymer. The process offers a promising strategy for combating global plastic waste, such as bottles, packaging, foams and carpets.

    The researchers’ analysis, published in Materials Horizons, compared using the new multipurpose catalyst to using individual catalysts for each type of plastic. The new catalyst would generate up to 95% fewer greenhouse gases, require up to 94% less energy input, and result in up to a 96% reduction in fossil fuel consumption.

    “Our approach involves a tailored synthetic organocatalyst — a compound comprised of small organic molecules that facilitate organic chemical transformations. The organocatalyst can convert batches of mixed plastic waste into valuable monomers for reuse in producing commercial-grade plastics and other valuable materials,” said Tomonori Saito, an ORNL synthetic polymer chemist and corresponding author. “This exceptionally efficient chemical process can help close the loop for recycling mixed plastics by replacing first-use monomers with recycled monomers.

    “Today, nearly all plastics are made from fossil fuels using first-use monomers made by energy-intensive processes. Establishing this kind of closed-loop recycling, if used globally, could reduce annual energy consumption by about 3.5 billion barrels of oil,” Saito added.

    A recycling solution for over 30% of all plastics

    The new organocatalyst has proven to efficiently and quickly deconstruct multiple polymers — in around two hours. Such polymers include those used in materials such as safety goggles (polycarbonates), foams (polyurethanes), water bottles (polyethylene terephthalates) and ropes or fishing nets (polyamides), which together comprise more than 30% of global plastic production. Until now, no single catalyst has been shown to be effective on all four of these polymers.

    The process provides many environmental advantages: it replaces the harsh chemicals conventionally used to deconstruct polymers while offering good selectivity, thermal stability, nonvolatility and low flammability. Its effectiveness against multiple polymers also makes it useful for deconstructing the increasing amounts of multicomponent plastics, such as composites and multilayer packaging.

    Small-angle neutron scattering at ORNL’s Spallation Neutron Source was used to help confirm the formation of deconstructed monomers from the waste plastics. The method scatters neutrons at small angles to characterize the structure at different levels of detail, from nanometers to fractions of a micrometer.
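
    For a sense of the length scales involved, the probed distance follows from the momentum transfer q = 4π sin(θ/2)/λ, with the real-space scale d ≈ 2π/q. The quick check below uses an illustrative cold-neutron wavelength, not parameters reported for this study.

    ```python
    import numpy as np

    def probed_length_nm(wavelength_nm, scattering_angle_deg):
        """Real-space scale d = 2*pi/q for momentum transfer
        q = 4*pi*sin(theta/2)/lambda (theta = full scattering angle)."""
        theta = np.radians(scattering_angle_deg)
        q = 4 * np.pi * np.sin(theta / 2) / wavelength_nm  # nm^-1
        return 2 * np.pi / q

    # Assumed 0.6 nm cold-neutron wavelength; smaller angles probe larger structures.
    for angle_deg in (2.0, 0.5, 0.1):
        print(f"{angle_deg:4.1f} deg -> ~{probed_length_nm(0.6, angle_deg):.0f} nm")
    # ~17 nm, ~69 nm, ~344 nm: nanometers up to fractions of a micrometer.
    ```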

    Converting mixed-plastic polymers into true recycled plastics

    The organocatalyst deconstructs the plastics at different temperatures, which facilitates sequentially recovering the individual monomers separately, in reusable form. Polycarbonates deconstruct at 266 F (130 C), polyurethanes at 320 F (160 C), polyethylene terephthalates at 356 F (180 C) and polyamides at 410 F (210 C). Other plastics, additives and associated materials such as cotton and plastic bags are left intact because of the differences in their reactivity and can subsequently be recovered.
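
    The staged recovery can be pictured as a temperature ramp that takes one polymer family to monomers at each step. The sketch below is a schematic model; the temperatures come from the article, but the all-or-nothing selection logic is a deliberate simplification of the real chemistry.

    ```python
    # Deconstruction temperatures reported for the organocatalyst (deg C).
    DECONSTRUCTION_C = {
        "polycarbonate": 130,
        "polyurethane": 160,
        "polyethylene terephthalate": 180,
        "polyamide": 210,
    }

    def staged_recovery(mixture, ramp_c=(130, 160, 180, 210)):
        """Schematic model: at each temperature step, any polymer whose
        threshold has been reached deconstructs to monomers; everything
        else (including non-reactive contaminants) is left intact."""
        remaining = set(mixture)
        for temp in ramp_c:
            done = {p for p in remaining if DECONSTRUCTION_C.get(p, float("inf")) <= temp}
            if done:
                print(f"{temp} C: recovered monomers from {', '.join(sorted(done))}")
            remaining -= done
        if remaining:
            print(f"left intact: {', '.join(sorted(remaining))}")

    staged_recovery(["polycarbonate", "polyamide", "cotton", "polyurethane",
                     "polyethylene terephthalate", "plastic bag"])
    ```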

    “The deconstructed monomers and the organocatalyst are water soluble, so we can transfer them into water, where any impurities such as pigments can be removed by filtration,” said Md Arifuzzaman, the study’s lead author and former postdoctoral synthetic organic chemist at ORNL. He is now an Innovation Crossroads Fellow and CEO and Founder of the Re-Du Company. “The nearly pure monomers are then extracted, leaving the catalyst, which is almost entirely recovered by evaporating the water and can be directly reused for multiple deconstruction cycles.”

    The study included researchers from ORNL’s Chemical Sciences Division and Center for Nanophase Materials Sciences within the Physical Sciences Directorate, the Neutron Sciences Directorate and the Department of Chemical Engineering at the University of Virginia, Charlottesville.

    CNMS and SNS are Department of Energy Office of Science user facilities.
    UT-Battelle manages ORNL for DOE’s Office of Science, the single largest supporter of basic research in the physical sciences in the United States. The Office of Science is working to address some of the most pressing challenges of our time. For more information, please visit energy.gov/science

     

    Oak Ridge National Laboratory

  • Scientists illuminate the mechanics of solid-state batteries

    Newswise — As current courses through a battery, its materials erode over time. Mechanical influences such as stress and strain affect this trajectory, although their impacts on battery efficacy and longevity are not fully understood.

    A team led by researchers at the Department of Energy’s Oak Ridge National Laboratory developed a framework for designing solid-state batteries, or SSBs, with mechanics in mind. Their paper, published in Science, reviewed how these factors change SSBs during their cycling.

    “Our goal is to highlight the importance of mechanics in battery performance,” said Sergiy Kalnaus, a scientist in ORNL’s Multiphysics Modeling and Flows group. “A lot of studies have focused on chemical or electric properties but have neglected to show the underlying mechanics.”

    The team spans several ORNL research areas including computation, chemistry and materials science. Together, their review painted a more cohesive picture of the conditions that affect SSBs by using perspectives from across the scientific spectrum. “We’re trying to bridge the divide between disciplines,” said Kalnaus.

    In batteries, charged particles flow through materials known as electrolytes. Most are liquids, like in the lithium-ion batteries found in electric cars — but solid electrolytes also are being developed. These conductors are typically made from glass or ceramic and could offer advantages such as enhanced safety and strength.

    “True solid-state batteries don’t have flammable liquids inside,” said Kalnaus. “This means that they would be less hazardous than the batteries commonly used today.”

    However, solid electrolytes are still in the early stages of development due to the challenges associated with these novel materials. SSB components swell and shrink during charge and mass transport, which alters the system. “Electrodes constantly deform during the battery operation, creating delamination and voids at the interfaces with the solid electrolyte,” said Kalnaus. “In today’s systems, the best solution is applying a large amount of pressure to keep everything together.”

    These dimensional changes damage solid electrolytes, which are made from brittle materials. They often break in response to strain and pressure. Making these materials more ductile would allow them to withstand stress by flowing instead of cracking. This behavior can be achieved with some techniques that introduce small crystal defects into ceramic electrolytes.

    Electrons leave a system through anodes. In SSBs, this component can be made from pure lithium, the most energy-dense metal. Although this material offers advantages for a battery’s power, it also creates pressure that can damage electrolytes.

    “During charging, nonuniform plating and an absence of stress-relief mechanisms can create stress concentrations. These can support large amounts of pressure, enabling the flow of lithium metal,” said Erik Herbert, the leader of ORNL’s Mechanical Properties and Mechanics group. “In order to optimize the performance and longevity of SSBs, we need to engineer the next generation of anodes and solid electrolytes that can maintain mechanically stable interfaces without fracturing the solid electrolyte separator.”

    The team’s work is part of ORNL’s long history of researching materials for SSBs. In the early 1990s, a glassy electrolyte known as lithium phosphorus oxynitride, or LiPON, was developed at the lab. LiPON has become widely used as an electrolyte in thin-film batteries that have a metallic lithium anode. This component can withstand many charge-discharge cycles without failure, largely due to the ductility of LiPON. When met with mechanical stressors, it flows instead of cracking.

    “In recent years we have learned that LiPON has robust mechanical properties to complement its chemical and electrochemical durability,” said Nancy Dudney, an ORNL scientist who led the team that developed the material.

    The team’s effort highlights an under-studied aspect of SSBs — understanding the factors that shape their lifespan and efficacy. “The research community needed a road map,” said Kalnaus. “In our paper, we outlined the mechanics of materials for solid-state electrolytes, encouraging scientists to consider these when designing new batteries.”

    UT-Battelle manages Oak Ridge National Laboratory for the Department of Energy’s Office of Science. The single largest supporter of basic research in the physical sciences in the United States, the Office of Science is working to address some of the most pressing challenges of our time. For more information, please visit energy.gov/science

    Oak Ridge National Laboratory

  • ORNL is poised to have a major role in the future of nuclear physics

    Newswise — The Department of Energy’s Oak Ridge National Laboratory, a bastion of nuclear physics research for the past 80 years, is poised to strengthen its programs and service to the United States over the next decade if national recommendations of the Nuclear Science Advisory Committee, or NSAC, are enacted.

    “The 2023 Long Range Plan lays out a compelling vision for nuclear science in the United States under multiple budget scenarios,” said Gail Dodge, physicist at Old Dominion University and chair of the NSAC. “Implementation of the Long Range Plan’s recommendations will maintain the nation’s leadership and workforce in nuclear science.”

    On Wednesday the NSAC, which advises DOE and the National Science Foundation on nuclear physics, approved a 10-year roadmap, or Long Range Plan. It includes four key priorities that would advance the nation’s nuclear science research program and set the direction of research for another generation of scientists.

    The recommendations would give ORNL a continuing critical role in helping maintain the nation’s leadership in nuclear physics for at least the next decade — solving mysteries of how the smallest particles in the universe behave and using that understanding to advance medicine, quantum science, energy, national security and other areas that improve the lives of people everywhere.

    Research in nuclear physics — the science of atomic nuclei and their constituents — helps us understand how virtually all ordinary matter in the universe originated and evolved. The cutting-edge research on particles is also used in isotope production, medical diagnosis, national security, energy, nuclear treaty verification, the environment and nuclear applications.

    The highest priority, according to the plan, is increasing the budget for nuclear physics in theoretical, experimental and computational research “to capitalize on the extraordinary opportunities for scientific discovery made possible by the substantial and sustained investment of the United States.” This would expand “discovery potential, technological innovation, and workforce development to the benefit of society.” This recommendation, if adopted, would ensure user facilities throughout the country would continue to operate at the highest level and reap the most scientific benefit.

    “Each one of the four recommendations has a large impact for ORNL,” said David Radford, ORNL physicist and head of the lab’s Fundamental Nuclear and Particle Physics Section. For example, another recommendation is for funding of multiple large experiments to search for neutrinoless double beta decay; one of these experiments has leadership and significant participation from ORNL scientists. The advisory committee recommends that construction of ton-scale detectors addressing fundamental physics should be a top budgetary priority.

    That research, which aims to solve the problem of how matter came to dominate over antimatter, will provide insight into the origin and mass of the neutrino, and in so doing could rewrite the Standard Model of particle physics. The research includes experiments known as CUPID, LEGEND and nEXO proposed by international collaborations. ORNL scientists, including Radford, are leading DOE’s contribution to building LEGEND.

    “This could help explain the matter-antimatter imbalance in the universe,” Radford said. “This plan reiterates that the experiment should go forward. That’s very important for this extremely compelling and exciting physics.”

    Radford and Cynthia Jenks, ORNL’s associate laboratory director for the Physical Sciences Directorate, discussed the impacts on ORNL at a rollout event on Friday, after the plan was released to the public on Wednesday.

    Another committee recommendation calls for the “expeditious completion” of the Electron-Ion Collider, a massive particle accelerator that would be built at Brookhaven National Laboratory. Already, ORNL physicists are hard at work designing and building a detector for the system, which, like a precision microscope, will illuminate three-dimensional images of nuclear matter, uncovering how particles like quarks and gluons interact and behave. Experiments on the machine could help answer longstanding questions about the fundamental particles of matter.

    An additional recommendation is to advance discovery science for society by investing in scientific projects that offer new strategic opportunities. Such opportunities advance computing, nuclear data for medicine, clean energy, national security, nonproliferation, the environment and space — all areas that are in ORNL’s wheelhouse of research and that would bolster ORNL’s research programs, Radford said.

    “ORNL certainly does work in these areas, using emerging technologies to meet national needs,” Radford said, adding that programs in nuclear data, advanced computing, sensing and quantum information all make use of not only physicists but engineers, data scientists and other experts. An example is ORNL’s Advanced Radiation Detection, Imaging, Data Science and Applications group, which is invested in these research areas. Also, high-performance computing research impacts physics experiments around the world, including at CERN in Switzerland and elsewhere.

    DOE facilities, such as ORNL’s Spallation Neutron Source, or SNS, an Office of Science user facility, are critical to fundamental nuclear physics research by ORNL researchers and other laboratory and university scientists around the world. An important experiment at SNS is the neutron electric dipole moment experiment, which aims to make the world’s best measurement of this property, an accomplishment that would be “paradigm shifting,” the committee says. Similarly, ORNL scientists use DOE’s Facility for Rare Isotope Beams, or FRIB, also a DOE Office of Science user facility, at Michigan State University, which is producing exciting results on decays of never-before-produced isotopes. ORNL helped lead construction of a day-one detector for that facility that has already produced high-impact results.

    Such scientific advances rely on a workforce trained in science, and the plan calls for resources to help build the next generation of STEM researchers. This includes ensuring graduate students are fairly compensated and “expanding policies and resources to ensure an environment that is safe and respectful to everyone,” said Shelly Lesher, a physicist at the University of Wisconsin-La Crosse. One of the architects of the workforce development section of the plan, Lesher added that the plan calls for exposure of the field to broader populations to increase representation. Like all 17 of America’s DOE national laboratories, ORNL stands to benefit from policies that make it possible for people from all walks of life to join the field, Radford said.

    Said Radford, “The training of the future workforce at this lab will help the security and economic prosperity of the country. This is the voice of the community saying what its priorities are and that the nation would benefit tremendously by buying into that and funding nuclear physics at the appropriate level.”

    UT-Battelle manages ORNL for the Department of Energy’s Office of Science, the single largest supporter of basic research in the physical sciences in the United States. The Office of Science is working to address some of the most pressing challenges of our time. For more information, please visit energy.gov/science.

    — Lawrence Bernard

    Oak Ridge National Laboratory

  • Scientists Discover a New Phase of High-Density, Ultra-Hot Ice

    The Science

    Newswise — The outer planets of our solar system, like Uranus and Neptune, are water-rich ice giants. These planets have extreme pressures of 2 million times the Earth’s atmosphere. They also have interiors as hot as the surface of the Sun. Under these conditions, water exhibits exotic, high-density ice phases. Researchers recently observed one of these phases, called Ice XIX, for the first time, using high-power lasers to reproduce the necessary extreme conditions. They measured the Ice XIX structure using the Matter at Extreme Conditions instrument at the Linac Coherent Light Source, a pioneering X-ray laser facility, to show that the oxygen atoms pack in a body-centered cubic structure while the hydrogen atoms move freely like a fluid, dramatically increasing conductivity.

    The Impact

    Voyager II, a NASA solar system exploration spacecraft launched in 1977, measured highly unusual magnetic fields around Uranus and Neptune. Scientists considered exotic states of so-called superionic ice as a possible explanation due to these states’ increased electrical conductivity. This work demonstrates the existence of the previously undiscovered Ice XIX phase. It shows that this phase could form at the right depths and help explain the Voyager II magnetic data.

    Summary

    Water, a compound that is ubiquitous in our solar system and necessary for life, exhibits an exceptionally complex pressure-temperature phase diagram, with 18 crystalline ice phases already identified. Nowhere are dense ice phases more important than in the interiors of ice giants like Uranus and Neptune. Scientists hypothesize that these planets’ complex magnetic fields are produced by exotic high-pressure states of water ice with superionic properties. However, the structure of ice at these extreme conditions is notoriously challenging to measure.

    Using the Matter at Extreme Conditions instrument at the Linac Coherent Light Source, an ultrafast X-ray free-electron laser and a Department of Energy (DOE) Office of Science user facility, researchers probed the ice structure during laser-driven dynamic compression and found the first direct evidence of a new phase of high-density, ultra-hot water ice. At 200 GPa (2 million atmospheres) and 5,000 K (8,500 degrees Fahrenheit), this new high-pressure ice phase, deemed Ice XIX, has a body-centered cubic (BCC) lattice structure. Though other structures have been theorized to be stable at these conditions, Ice XIX’s BCC structure would enable increased electrical conductivity much deeper into the interiors of ice giants than previously thought. The results provide an important and compelling origin for the multipolar magnetic fields of Uranus and Neptune measured by the Voyager II spacecraft.
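
    The quoted conditions convert as follows; this is a quick arithmetic check of the units, not new data from the experiment.

    ```python
    # Sanity check on the quoted pressure and temperature.
    GPA_TO_ATM = 1e9 / 101_325  # pascals per gigapascal / pascals per atmosphere

    def kelvin_to_fahrenheit(t_k):
        return (t_k - 273.15) * 9 / 5 + 32

    pressure_gpa = 200
    temperature_k = 5_000

    print(f"{pressure_gpa} GPa ~ {pressure_gpa * GPA_TO_ATM:,.0f} atm")         # ~1,970,000
    print(f"{temperature_k} K ~ {kelvin_to_fahrenheit(temperature_k):,.0f} F")  # ~8,540
    ```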

     

    Funding

    Funding for this research included the DOE National Nuclear Security Administration; the DOE Office of Science, Fusion Energy Science; the Laboratory Directed Research & Development program of Los Alamos National Laboratory; and the National Science Foundation. The experimental measurements were conducted at the Matter at Extreme Conditions instrument (operated by the DOE Office of Science, Fusion Energy Science program) of the Linac Coherent Light Source, a DOE Office of Science, Basic Energy Sciences user facility operated by SLAC National Accelerator Laboratory.


    Journal Link: Scientific Reports, Jan-2022

    Department of Energy, Office of Science

  • Argonne joins Illinois manufacturers for ​“Makers on the Move” tour

    Newswise — Manufacturers throughout Illinois will have the chance to learn about working with the Materials Manufacturing Innovation Center (MMIC) at the U.S. Department of Energy’s (DOE) Argonne National Laboratory, when the MMIC gets on the bus for the second annual Makers on the Move tour. 

    The Illinois Manufacturing Association and Illinois Manufacturing Excellence Center (IMEC) sponsor the eight-day, 1,000-mile tour, designed to showcase high-tech, clean, diverse and sustainable modern manufacturing. The branded Makers on the Move bus will stand out on the state’s roadways as it visits facilities, colleges and high schools in all corners of Illinois, starting Friday, Oct. 6 at Boeing in Mascoutah and ending on Oct. 13 at various Chicago facilities.   

    Meeting Illinois manufacturers face to face and learning their stories is a great opportunity for the MMIC, which exists to support industry partners in solving enduring manufacturing R&D challenges, identifying commercialization opportunities, licensing new technologies and introducing transformational discoveries to the marketplace, said MMIC Director Chris Heckle.

    “Celebrating October as Manufacturing Month is important to us at Argonne,” she said. ​“We look forward to experiencing the diversity of manufacturing in Illinois and helping deliver on the lab’s mission of accelerating science and technology to drive U.S. prosperity and security.” 

    In 2021, manufacturing contributed $2.3 trillion to the U.S. gross domestic product (GDP), amounting to 12.0% of total U.S. GDP, according to the National Institute of Standards and Technology (NIST). IMEC is an approved center in NIST’s Manufacturing Extension Partnership national network.

    IMEC CEO and President David Boulay said Illinois manufacturers are keen to innovate and will be interested in how MMIC can connect them to Argonne’s cutting-edge research, capabilities and facilities, including the Materials Engineering Research Facility and Argonne Leadership Computing Facility, a DOE Office of Science user facility.

    “Argonne’s expertise in the materials and chemical processing spaces can help solve complex problems,” he said. ​“Argonne is a national laboratory with the MMIC as a great regional resource. Their commitment to traveling across the state with our team, in the spirit of supporting industry, demonstrates the lab’s commitment to partnership in manufacturing innovation.”  

    Launched last year, MMIC executes on Argonne’s commitment to advancing U.S. manufacturing by de-risking and accelerating the scale-up and commercialization of new, complex materials critically important to U.S. competitiveness. With MMIC as a first point of contact, industry can engage with scientists working on a new frontier of advanced manufacturing techniques and access facilities and equipment essential for inventing processes for transformative materials.

    The Argonne Leadership Computing Facility provides supercomputing capabilities to the scientific and engineering community to advance fundamental discovery and understanding in a broad range of disciplines. Supported by the U.S. Department of Energy’s (DOE’s) Office of Science, Advanced Scientific Computing Research (ASCR) program, the ALCF is one of two DOE Leadership Computing Facilities in the nation dedicated to open science.

    Argonne National Laboratory seeks solutions to pressing national problems in science and technology. The nation’s first national laboratory, Argonne conducts leading-edge basic and applied scientific research in virtually every scientific discipline. Argonne researchers work closely with researchers from hundreds of companies, universities, and federal, state and municipal agencies to help them solve their specific problems, advance America’s scientific leadership and prepare the nation for a better future. With employees from more than 60 nations, Argonne is managed by UChicago Argonne, LLC for the U.S. Department of Energy’s Office of Science.

    The U.S. Department of Energy’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time. For more information, visit https://energy.gov/science.

    Argonne National Laboratory

  • Argonne’s STEM mapping project highlights opportunities on Chicago’s south side

    Newswise — To become the diverse and talented workforce of today and tomorrow, learners of all ages and from every community need access to educational and training resources in science, technology, engineering and mathematics (STEM). There are many schools and organizations working to inspire, motivate and train learners of all ages in historically underserved neighborhoods of Chicago. To better understand these current resources and to grow and sustain a robust STEM ecosystem, the U.S. Department of Energy’s Argonne National Laboratory has undertaken a STEM mapping project, called the STEM Opportunity Landscape Project, in nine south side Chicago neighborhoods.

    STEM asset mapping consists of an information gathering process that involves identifying all STEM programming, community learning spaces, workforce development programs and STEM employment opportunities in a neighborhood. The collected data is then depicted in the form of maps and other visualizations, creating a comprehensive and interactive STEM opportunity landscape.

    STEM mapping provides communities a holistic view of their community assets and collective strengths, enabling them to leverage these resources effectively. The maps and visualizations reflect STEM assets and opportunities that serve students from kindergarten to their careers, and they have just been made fully accessible to the public.

    “Argonne’s STEM Opportunity Landscape Project provides a free website that elevates the STEM learning, workforce and employment opportunities within these nine communities for learners of all ages. This tool provides valuable insight into crafting deliberate STEM learning pathways K-Career, addressing and closing existing gaps, fostering strategic partnerships, and optimizing available resources to enrich STEM opportunities,” said Meridith Bruozas, the institutional partnership director at Argonne.

    As part of the Argonne in Chicago initiative that includes an office space in Hyde Park, the STEM mapping project focuses on the following nine communities: Douglas, Grand Boulevard, Greater Grand Crossing, Hyde Park, Kenwood, Oakland, South Shore, Washington Park and Woodlawn.  The mapping project collected survey data from learning spaces, including schools, within these communities to identify potential linkages between them. “There are places that already exist in these communities, like makerspaces, computer labs and instructional kitchens, that a lot of people are generally not aware of,” said Argonne STEM Education Partnerships and Outreach Manager Jessica Burgess.

    According to Burgess, the STEM inventory being performed as part of the mapping project helps fulfill a need for a unified approach. “There’s been a call for a STEM ecosystem in which we can bring people together,” she said. “Through the Argonne in Chicago office, the laboratory has the ability to be a convener, building bridges within and between communities to maximize the connections that learners can make as they embark on their educational and career pathways.”

    Various organizations have historically offered valuable programming in these communities. However, these programs do not always connect into a larger STEM ecosystem. “The STEM mapping project offers us a really good view of the current state of the landscape, so that the schools, organizations and employers that work in these communities can identify strengths and weaknesses and ultimately drive connected learner pathways that provide skill development for learners that will eventually lead to STEM careers,” Burgess said.

    In addition to STEM education in schools, Burgess also described ways in which the STEM mapping initiative would be helpful for workforce development. “By including employers, particularly those that demand math- or engineering-related skills, we can help develop various routes by which members of these communities can achieve new STEM-related possibilities,” she said.

    “We are excited to introduce this comprehensive STEM resource to the participating communities,” Bruozas said. “With the tool launched, we are excited about the next phase of the project — diving into the data with the community — this will include hosting data-driven community conversations and co-creating a plan for what STEM learning looks like on the south side.”

    By highlighting existing resources, facilitating collaboration, and engaging communities in decision-making, the STEM mapping initiative seeks to create a more equitable and inclusive STEM ecosystem. The project’s impact extends beyond the immediate communities on Chicago’s south side, serving as a model for other regions striving to provide equal access to STEM opportunities.

    Argonne National Laboratory

  • High-performance, Earth-friendly Materials for Geothermal Wells

    Newswise — UPTON, NY—The U.S. Department of Energy (DOE) has announced $19 million in funding over four years for a new research center focused on exploring the chemical and mechanical properties of cement composites and other materials used in enhanced geothermal systems (EGS). The “Center for Coupled Chemo-Mechanics of Cementitious Composites for EGS” (C4M)—one of 11 Energy Earthshot Research Centers (EERCs) just announced by DOE as part of its Energy Earthshots™ Initiative—will be located in the Interdisciplinary Science Department at DOE’s Brookhaven National Laboratory. Research there and at partner institutions will inform the design of Earth-friendly varieties of cement composites, coatings, and other barriers designed to protect geothermal wells. The ultimate goal is to expand the use of this abundant, sustainable form of energy.

    “Our Energy Earthshots are game-changing endeavors to unleash the technologies of the clean energy transition and make them accessible, affordable, and abundant,” said U.S. Secretary of Energy Jennifer M. Granholm. “The Energy Earthshot Research Centers and the related work happening on college campuses around the country will be instrumental in developing the clean energy and decarbonization solutions we need to establish a 100% clean grid and beat climate change.”

    Brookhaven Lab materials scientist Tatiana Pyatina, who leads the geothermal materials research effort at Brookhaven Lab and will direct the new C4M EERC, said, “Geothermal energy has the potential to transform abundant heat trapped deep underground into gigawatts of electricity for powering millions of American homes. It is renewable, has a small geographical footprint, and, unlike other green energies [e.g., wind and solar], is available around-the-clock.”

    But there are a few sticking points: The materials used to construct the wells—including cement composites that support and insulate the pipelike metal casings that carry Earth-heated fluids from subterranean depths to the surface—must withstand extreme temperatures and corrosive conditions and last for many years. Enhanced geothermal systems, which force more fluid than is naturally present through hot underground rocks to increase the extraction of heat, experience even greater thermo-mechanical stresses. Such stringent materials requirements can drive up construction costs.

    In addition, the cement currently used in well-supporting composites is an extreme carbon dioxide (CO2) emitter. Almost a pound of the heat-trapping gas is released for every pound of cement produced—through cement-making chemical reactions and the use of fossil fuels to power the process.

    “To realize geothermal energy’s potential, it is therefore essential to rationally design cost-effective, sustainable well-construction materials with a net-zero CO2 footprint,” Pyatina said.

    To achieve that goal, the C4M team will conduct extensive studies of the chemical and mechanical properties of new forms of cementitious composite materials. Their goals are to understand the chemical changes that take place in these materials under high temperature and pressure so they can design reliable and durable composites for use in the extremely challenging underground environments. By quantifying the effects of these chemical changes on materials’ performance, they will learn to control the solidification and transformations of these materials so they can be deployed successfully and economically in well construction and operation.

    “This work will build on a long history of award-winning research at Brookhaven Lab on materials for sustainable energy applications, including geothermal energy,” Pyatina said. “Our hope is that this research will achieve our goal of developing net-zero CO2 materials that will cut the cost of enhanced geothermal systems by 90% by 2035.” 

    Amy Marschilok, the energy systems and energy storage division manager of the Interdisciplinary Science Department, noted, “To meet our Nation’s energy goals we need new approaches to harness green energy and release it on demand. The new C4M EERC epitomizes the Interdisciplinary Science Department mission, leveraging Brookhaven Lab’s expertise across the innovation cycle from fundamental materials science to functional energy systems. I look forward to significant advances under Tatiana’s leadership.”

    New material needs

    In the process of cement production, limestone (calcium carbonate) and other materials are heated to very high temperatures in cement kilns. The high heat triggers a chemical reaction that decomposes the limestone, transforming the calcium carbonate and other ingredients into the compounds that ultimately make up cement powder. The limestone decomposition reaction and the heating that drives it (if powered by fossil fuels) release CO2. To avoid these CO2 emissions, the C4M team will be exploring the use of alternate minerals, possibly even the mud used to drill the wells, which would form its own cement in place.
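
    The near pound-for-pound CO2 figure quoted earlier is consistent with simple stoichiometry for the calcination reaction, CaCO3 -> CaO + CO2. The back-of-the-envelope estimate below assumes a typical clinker CaO content and a rough fuel-emissions share; both numbers are illustrative assumptions, not figures from the article.

    ```python
    # Molar masses (g/mol).
    M_CAO, M_CO2 = 56.08, 44.01

    # Calcination (CaCO3 -> CaO + CO2) releases CO2 in proportion to the lime made.
    co2_per_kg_cao = M_CO2 / M_CAO  # ~0.78 kg CO2 per kg CaO

    cao_fraction = 0.65  # assumed typical CaO mass fraction of cement clinker
    process_co2 = co2_per_kg_cao * cao_fraction  # ~0.51 kg CO2 / kg clinker, chemistry alone

    fuel_co2 = 0.4  # assumed rough share from fossil-fueled kiln heating
    print(f"calcination alone: ~{process_co2:.2f} kg CO2 per kg clinker")
    print(f"with kiln fuel:    ~{process_co2 + fuel_co2:.2f} kg CO2 per kg cement")
    ```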

    To ensure well durability, they’ll be seeking to identify materials with geologically stable mineral phases. They will also investigate the use of inorganic coatings that make the pipe-like well casings more resistant to high temperatures and aggressive environments. Some coatings may protect the metal casings so well that cement would no longer be needed.

    The team will use both laboratory experiments and computational modeling to elucidate and predict the performance of these new cements and composite materials from the atomic to the macroscopic scale, and for a time span ranging from seconds to years. They expect to use information identified through these studies and the use of artificial intelligence and high-performance computing to design advanced materials with long durability for geothermal applications.

    “We have assembled a multi-disciplinary team of leading researchers with complementary expertise,” Pyatina said, noting that the team will leverage expertise and DOE Office of Science user facilities at Brookhaven—including the National Synchrotron Light Source II (NSLS-II) and Center for Functional Nanomaterials (CFN)—as well as at partner institutions, including the Advanced Light Source at DOE’s Lawrence Berkeley National Laboratory. Additional partners include DOE’s Sandia National Laboratories, DOE’s Lawrence Livermore National Laboratory, DOE’s Los Alamos National Laboratory, and four universities: University of Texas at Austin (a minority-serving institution), Cornell University, University of Illinois Urbana-Champaign, and Princeton University.

    “Through this Center, an incredibly talented team has been assembled to develop the fundamental understanding of the materials needed to push back the pressure and temperature boundaries of geothermal power production,” said Thomas Butcher, a research engineer who leads the energy conversion group in Brookhaven Lab’s Interdisciplinary Science Department. “Each member has been leading research in this area for a long time, but this project will allow them to focus on this important challenge in a truly collaborative way.”

    Another group of Brookhaven Lab scientists will participate as partners in one of the other Energy Earthshot Research Centers. That center—“Degradation Reactions in Electrothermal Energy Storage (DEGREES)”—will be led by DOE’s National Renewable Energy Laboratory (NREL). James Wishart, Simerjeet Gill, and Yu-chen (Karen) Chen-Wiegart, staff scientists at Brookhaven, will be partners in this center. They will explore the interactions of molten salts (used here as heat transfer fluids) with thermal energy storage materials and investigate how contact with molten salt affects the thermal materials’ stability and performance over time. This research will make use of multimodal x-ray synchrotron techniques at NSLS-II and will include studies on samples brought to NSLS-II from other partner institutions.

    Brookhaven National Laboratory is supported by the Office of Science of the U.S. Department of Energy. The Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time. For more information, visit science.energy.gov.

    Brookhaven National Laboratory

  • World-class neutron source takes a break for major Proton Power Upgrade

    Newswise — The Spallation Neutron Source at Oak Ridge National Laboratory — already the world’s most powerful accelerator-based neutron source — will be on a planned hiatus through June 2024 as crews work to upgrade the facility.

    Much of the work — part of the facility’s Proton Power Upgrade project — will involve building a connector between the accelerator and the planned Second Target Station at SNS. When complete, the PPU project will bring the accelerator up to 2.8 megawatts from its current record-breaking 1.7 megawatts of beam power.

    Workers will add about 3,000 square feet of concrete tunnel, the “stub,” which will integrate with an existing tunnel. Construction tasks include associated structures, roofing, geomembrane liner, tunnel waterproofing, electrical, fire alarm, ventilation systems and controls.

    “The construction crews have performed all of the excavation work and are transitioning to tunnel base and wall construction,” said ORNL’s Mark Champion, PPU project manager.

    The stub is scheduled to be completed within six months, by the end of February, and most of the rest of this outage will involve installing new components and systems to complete the PPU project.

    That work includes:

    • Installing three new cryomodules, adding more radio-frequency stations and upgrading two high-voltage units to support new 3.0 megawatt klystrons.
    • Installing an injection dump imaging system and new magnets and upgrading deionized water systems, power supplies and a beam power limit system.
    • Installing a new liquid hydrogen refill system, mercury overflow tank and target complete with gas injection and recirculation system.
    • Completing controls integration.

    The upgrade will increase the flow of neutrons — known as the neutron flux — to the First Target Station, or FTS, and will eventually also power the STS.

    Power to the FTS — which produces thermal neutrons to analyze samples down to the atomic scale — will increase to 2.0 megawatts, enabling new scientific discoveries in areas such as superconductors, energy materials like those used in batteries, and basic physics. The remaining power will be routed through the stub to the STS, which will have the world’s highest peak brightness of neutrons, tailored for probing soft matter such as polymers and biological materials, as well as complex engineering materials. These capabilities are used in vaccine research, advanced battery development and decarbonization studies.
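
    The beam-power arithmetic behind these figures is straightforward. The sketch below uses the numbers quoted in the article and assumes the STS share is simply the remainder after the FTS allocation.

    ```python
    # Beam-power accounting for the Proton Power Upgrade (figures from the article).
    current_mw = 1.7   # today's record-setting beam power
    upgraded_mw = 2.8  # total beam power after the PPU project
    fts_mw = 2.0       # share delivered to the First Target Station

    sts_mw = upgraded_mw - fts_mw  # assumes the split is the simple remainder
    print(f"increase: {upgraded_mw - current_mw:.1f} MW "
          f"({(upgraded_mw / current_mw - 1) * 100:.0f}% over today)")
    print(f"available for the Second Target Station via the stub: {sts_mw:.1f} MW")
    ```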

    “It’s very gratifying to reach one of the final stages of the project after several years of planning, design and engineering,” said ORNL’s John Galambos, PPU project director. “It’s a huge tribute to the skills and dedication of the entire PPU team and our partner labs that the project has remained on schedule and on budget despite unprecedented challenges, including Covid-19 and subsequent supply chain issues.”

    The Spallation Neutron Source is an Office of Science user facility at ORNL.

    UT-Battelle manages ORNL for the Department of Energy’s Office of Science, the single largest supporter of basic research in the physical sciences in the United States. The Office of Science is working to address some of the most pressing challenges of our time. For more information, please visit energy.gov/science.

    Oak Ridge National Laboratory