ReportWire


  • Spallation Neutron Source accelerator achieves world-record 1.7-megawatt power level to enable more scientific discoveries


    Newswise — The Spallation Neutron Source at the Department of Energy’s Oak Ridge National Laboratory set a world record when its particle accelerator beam operating power reached 1.7 megawatts, substantially improving on the facility’s original design capability.

    The accelerator’s higher power provides more neutrons for researchers who use the facility to study and improve a wide range of materials for more efficient solar panels, longer-lasting batteries and stronger, lighter materials for transportation. The achievement marks a new operational milestone for neutron scattering in the United States and opens the door to tackling more difficult questions and problems in materials science research.

    “This increase in beam power represents another milestone in the Proton Power Upgrade project, an essential component in enabling new science at the SNS, including insights into advanced materials for clean energy applications,” said interim ORNL Director Jeff Smith. “I commend our staff for their efforts in accomplishing this new record.”

    Since construction was completed in 2006, the SNS has been a world-leading DOE Office of Science user facility that provides powerful advanced scientific capabilities for thousands of researchers from around the world to study energy phenomena and materials down to the atomic scale.

    The facility produces neutrons by accelerating protons down a 300-meter-long linear accelerator, around an accumulator ring and into a liquid mercury target. Upon impact, a “spall” of neutrons is routed to surrounding research instruments, which enables scientists to study the atomic structure and behavior of various materials. Neutrons scatter off atoms within the material and are captured by high-speed detectors, revealing fundamental information for research teams to analyze.

    A megawatt is a unit of measure of the beam power of a particle accelerator. The SNS’ 1.7-megawatt power level was reached after the recent installation of additional accelerating systems, part of the ongoing Proton Power Upgrade project at the accelerator.
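    For a rough sense of what the number means (a back-of-the-envelope relation with illustrative figures, not official SNS operating parameters): beam power is approximately the kinetic energy carried by each proton, in electron volts, multiplied by the average beam current, in amperes,

        P\,[\mathrm{W}] \;\approx\; E\,[\mathrm{eV}] \times I\,[\mathrm{A}],
        \qquad 1.3\times10^{9}\ \mathrm{eV} \;\times\; 1.3\times10^{-3}\ \mathrm{A} \;\approx\; 1.7\times10^{6}\ \mathrm{W} = 1.7\ \mathrm{MW},

    so reaching higher power means raising the proton energy, the average current, or both.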

    ORNL’s Proton Power Upgrade will continue to push the particle accelerator’s beam power up to 2.8 megawatts. This will increase the number of neutrons available for experiments at the existing First Target Station to enable new discoveries and will power the planned Second Target Station, a complementary third neutron source at ORNL. STS will address emerging science challenges through experiments that are not currently feasible or routine, with the ability to study smaller or less-concentrated samples, or samples under more extreme environmental conditions.

    Besides SNS, ORNL is home to the High Flux Isotope Reactor. Completed in 1965 and operating at 85 megawatts, HFIR produces the strongest steady-state reactor-based neutron beams in the United States.

    The SNS and HFIR facilities produce neutron beams that help spur innovations that lead to improvements in daily life, such as more powerful computers, cleaner air, more effective drugs and longer-lasting batteries.

    SNS and HFIR are DOE Office of Science user facilities.

    UT-Battelle manages ORNL for the Department of Energy’s Office of Science, the single largest supporter of basic research in the physical sciences in the United States. The Office of Science is working to address some of the most pressing challenges of our time. For more information, please visit energy.gov/science.

    Oak Ridge National Laboratory

  • Establishing Ethical Nanobiotechnology


    By Rebekah Orton

    Newswise — Prosthetics moved by thoughts. Targeted treatments for aggressive brain cancer. Soldiers with enhanced vision or bionic ears.

    These powerful technologies sound like science fiction, but they’re becoming possible thanks to nanoparticles.

    And, as with any great power, it comes with great responsibility.

    “In medicine and other biological settings, nanotechnology is amazing and helpful, but it could be harmful if used improperly,” said Pacific Northwest National Laboratory (PNNL) chemist Ashley Bradley, part of a team of researchers who conducted a comprehensive survey of nanobiotechnology applications and policies.

    Their research, available now in Health Security, works to sum up the very large, active field of nanotechnology in biology applications, draw attention to regulatory gaps, and offer areas for further consideration.

    “In our research, we learned there aren’t many global regulations yet,” said Bradley. “And we need to create a common set of rules to figure out the ethical boundaries.”

    Nanoparticles, big differences

    Nanoparticles are clusters of molecules with properties different from those of the same substances in bulk. In medicine and other biology applications, these properties allow nanoparticles to act as the packaging that delivers treatments through cell walls and across the difficult-to-cross blood-brain barrier.

    “You can think of the nanoparticles a little bit like the plastic around shredded cheese,” said PNNL chemist Kristin Omberg. “It makes it possible to get something perishable directly where you want it, but afterwards you’ve got to deal with a whole lot of substance where it wasn’t before.”

    Unfortunately, dealing with nanoparticles in new places isn’t straightforward. Bulk carbon is pencil lead; nanoscale carbon conducts electricity. The same material may have different properties at the nanoscale, but most countries still regulate it the same as the bulk material, if the material is regulated at all.
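    One simple geometric reason for this difference (a textbook relation, not drawn from the article) is that a smaller particle has far more of its atoms at the surface, where they can interact. For a sphere of radius r, the surface-to-volume ratio is

        \frac{A}{V} \;=\; \frac{4\pi r^{2}}{\tfrac{4}{3}\pi r^{3}} \;=\; \frac{3}{r},

    so shrinking a particle from 1 micrometer to 10 nanometers raises that ratio a hundredfold.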

    For example, zinc oxide, which was stable and unreactive as a pigment in white paint, is now accumulating in oceans when used as nanoparticles in sunscreen, prompting calls for alternative reef-safe sunscreens. And although fats and lipids aren’t regulated, the researchers suggest which agencies could weigh in on regulation if fats were to become after-treatment byproducts.

    The article also inventories national and international agencies, organizations, and governing bodies with an interest in understanding how nanoparticles break down or react in a living organism and the environmental life cycle of a nanoparticle. Because nanobiotechnology spans materials science, biology, medicine, environmental science, and technology, these disparate research and regulatory disciplines must come together, often for the first time, to fully understand the impact on humans and the environment.

    Dual use: Good for us, bad for us

    Like other quickly growing fields, there’s a time lag between the promise of new advances and the possibilities of unintended uses.

    “There were so many more applications than we thought there were,” said Bradley, who collected exciting nanobio examples such as Alzheimer’s treatment, permanent contact lenses, organ replacement, and enhanced muscle recovery, among others.

    The article also highlights concerns about crossing the blood-brain barrier, thought-initiated control of computers, and nano-enabled DNA editing, areas where the researchers suggest more caution, questioning, and attention could be warranted. This attention spans everything from deep fundamental research and regulations all the way to what Omberg called “the equivalent of tattoo removal” if home DNA-splicing attempts go south.

    The researchers draw parallels to more established fields such as synthetic biology and pharmacology, which offer lessons from current concerns such as the unintended consequences of fentanyl and opioids. They believe these fields also offer examples of innovative coordination between science and ethics, such as synthetic biology’s iGEM student competition, for thinking not just about how to create new technologies, but also about how to shape their use and control.

    Omberg said unusually enthusiastic early reviewers of the article contributed even more potential uses and concerns, demonstrating that experts in many fields recognize ethical nanobiotechnology is an issue to get in front of. “This is a train that’s going. It will be sad if 10 years from now, we haven’t figured out how to talk about it.”

    The team’s research was supported by PNNL’s Biorisk Beyond the List National Security Directorate Objective.

    ###

    About PNNL

    Pacific Northwest National Laboratory draws on its distinguishing strengths in chemistry, Earth sciences, biology and data science to advance scientific knowledge and address challenges in sustainable energy and national security. Founded in 1965, PNNL is operated by Battelle for the Department of Energy’s Office of Science, which is the single largest supporter of basic research in the physical sciences in the United States. DOE’s Office of Science is working to address some of the most pressing challenges of our time. For more information, visit https://energy.gov/science. For more information on PNNL, visit PNNL’s News Center. Follow us on Twitter, Facebook, LinkedIn and Instagram.

    Pacific Northwest National Laboratory

  • Thanks to Trapped Electrons, a Material Expected to be a Conducting Metal Remains an Insulator


    The Science

    New research sheds light on the mechanism behind how a special material changes from an electrically conducting metal to an electric insulator. The researchers studied lanthanum strontium nickel oxide (La1.67Sr0.33NiO4), derived from the quantum material La2NiO4. Quantum materials have unusual properties that result from how their electrons interact. Below a critical temperature, the strontium-doped material is an insulator. This is due to the separation of introduced holes from the magnetic regions, forming “stripes.” As the temperature increases, these stripes fluctuate and melt at 240 K. At this temperature, researchers expected the material to become a conducting metal. Instead, it remains an insulator. Neutron scattering sheds light on this intriguing phenomenon. The results indicate that the material stays an insulator because certain atomic vibrations trap electrons and thus impede electrical conduction.

    The Impact

    Quantum materials have properties that aren’t predicted by the parts that make up those materials. For example, they can transition from metals to insulators or act as superconductors. They hold tremendous promise for applications in science and technology. This research describes the tunability of electron-phonon interaction on the metal-insulator transition in one quantum material. The results will help validate theoretical models of materials that have strongly interacting electrons. These theories will help scientists design new quantum materials for future technologies.

    Summary

    In metals, electrons can be considered as free particles flying along trajectories enforced by the crystal structure. In recent decades, scientists discovered new materials in which electrons strongly repel each other and bounce off atomic vibrations in the host crystal. These materials exhibit unusual and technologically useful properties, including a dramatic drop in electrical resistance in magnetic fields, electron conduction only on the surface, and high-temperature superconductivity. Understanding these properties in different materials remains a grand challenge for the scientific community.

    This work used high-intensity neutron beams at the Spallation Neutron Source, a Department of Energy user facility at Oak Ridge National Laboratory (ORNL), to look deep inside an archetypal quantum material, La2NiO4, in which one sixth of the lanthanum (La) atoms are replaced with strontium (Sr) atoms (La1.67Sr0.33NiO4). The team included researchers from the University of Colorado Boulder, ORNL, Brookhaven National Laboratory, and the RIKEN Center for Emergent Matter Science in Japan. These materials are insulating at low temperatures due to the so-called “stripe” order that results from the complex interplay between electronic spins and the holes introduced by strontium doping. The doped material is expected to become metallic above 240 K, when the stripes melt. However, the material remains insulating. The collaboration uncovered strong friction between the holes and certain vibrations of oxygen ions and found evidence for this interaction in other materials of similar structure. The microscopic mechanism could pave the way for the design of new materials with unusual properties useful for various quantum technologies.

     

    Funding

    Work at the University of Colorado Boulder was supported by the Department of Energy Office of Science, Basic Energy Sciences program. One of the researchers was supported by a Japan Science and Technology Agency CREST Grant. Work at Brookhaven National Laboratory was supported by the Department of Energy Office of Science, Basic Energy Sciences program.


    Journal Link: Scientific Reports, Jul-2020

    Department of Energy, Office of Science

  • Nuclear Charge Distribution Measurements May Solve Outstanding Puzzle In Particle Physics


    The Science

    What scientists call the “nuclear weak distribution” describes the distribution of “active” protons in a nucleus. These are protons that are eligible to transition into neutrons through what scientists call the “weak interaction.” Researchers recently reviewed the existing standard procedure to determine this distribution. In the study, they abandoned previous treatments using nuclear shell models. Instead, they related the weak distribution to the distribution of electric charges in a nucleus. Scientists can measure this distribution by scattering electrons off nuclear targets or by studying energy levels through atomic spectroscopy.

    The Impact

    The new data-driven analysis found significant differences with the results of previous model-based determinations of the nuclear weak distribution. This result advances the search for new physics based on nuclear beta decay experiments. It does so by providing a partial explanation for a discrepancy between predictions from particle physics theory and experimental measurement of a fundamental quantity called “Vud.” This term describes how quarks transition from one type to another inside a proton or neutron, thus changing the particles.
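    For context, Vud is the largest element of the first row of the quark-mixing (Cabibbo-Kobayashi-Maskawa, or CKM) matrix, and in the Standard Model that row must satisfy the unitarity condition (a textbook relation, not stated in the article)

        |V_{ud}|^{2} + |V_{us}|^{2} + |V_{ub}|^{2} \;=\; 1,

    so any shift in the extracted value of Vud feeds directly into this precision test of the Standard Model.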

    Summary

    The extracted value of “Vud” from nuclear beta decays seems to be substantially smaller than what is required by the Standard Model of Particle Physics, the commonly acknowledged best theory for elementary particle physics. This observed anomaly has stimulated vibrant discussion of possible new physics discoveries. To study its origin, researchers at the Facility for Rare Isotope Beams (FRIB) at Michigan State University investigated the so-called “nuclear weak distribution.” They found that the current understanding of this distribution is based on rather simple models that assume non-interacting nucleons inside a nucleus. Furthermore, they showed that this distribution may be determined independently of models using measurements of nuclear charge distributions, which can be made either through electron-nucleus scattering experiments or through atomic spectroscopy.

    Upon analyzing existing data, the researchers found that the value of Vud moved closer to the Standard Model prediction. Future measurements of nuclear charge distributions, for example at FRIB, may provide further insights towards the resolution of the Vud anomaly.

     

    Funding

    This work is supported in part by the Department of Energy Office of Science, Office of Nuclear Physics.


    Journal Link: Physical Review Letters, Apr-2023

    Department of Energy, Office of Science

  • U.S. Department of Energy Releases Plan to Ensure Free, Immediate, and Equitable Access to Federally Funded Research


    Newswise — WASHINGTON, D.C. — The U.S. Department of Energy (DOE) today released a plan to ensure the Department’s Federally funded research is more open and accessible to the public, researchers, and journalists as part of a broader effort by the Biden-Harris Administration to make government data more transparent. With 17 National Laboratories and scores of programs that fund university and private research, DOE directly supports thousands of research papers per year, and, when this plan goes into effect, those findings will be available immediately and at no cost.

    “Science and innovation cannot flourish in the dark—they require openness, scrutiny, and reexamination so that we can build on them to create the knowledge and technologies that will change the world,” said U.S. Secretary of Energy Jennifer M. Granholm. “As one of the Federal Government’s leading sponsors of research, DOE is proud and excited to get our data and research out into the public’s hands faster and more efficiently, and we look forward to expanding and accelerating that access by engaging the American public in DOE’s mission.”

    DOE’s public access plan supports the August 2022 White House Office of Science and Technology Policy (OSTP) memo that called for Federal agencies to “make publications and their supporting data resulting from federally funded research publicly accessible without an embargo on their free and public release.” The new plan describes the steps DOE will take to enable equitable access to the unclassified and unrestricted results of its multi-billion dollar annual investments in climate, energy, environment, and basic and applied research and development.

    Since 2014, when DOE released its first plan to grant the public more access to research, the Department has provided free public access to nearly 200,000 articles and accepted manuscripts and has enabled broader access to scientific data through rigorous data management planning requirements.

    Key elements of the new DOE public access plan, as laid out by OSTP, will include elimination of any “embargo” period before the public gains free access to journal articles or final accepted manuscripts resulting from federal funding; immediate access to scientific data displayed in or underlying publications and expanded access to scientific data not displayed in publications; and broad adoption of persistent identifiers (PIDs) for research outputs, organizations, awards and contracts, and people.

    Most requirements and guidance will be in place by the end of 2024 with implementation by the end of 2025. DOE’s model for implementing access to publications and scientific data will be similar to existing practices—for publications, through submissions of accepted manuscripts or open access articles which will be made available through DOE’s public access repository, and for data, through submission of data management and sharing plans to DOE.

    Key changes include the requirement to submit accepted manuscripts or open access journal articles immediately upon publication and an increased focus on immediate and broader sharing of scientific data.

    DOE has played a leading role in the assignment and use of PIDs among Federal research agencies, and the new plan builds on this record and expands DOE’s support of PIDs for research outputs, such as data and software, research and sponsoring organizations, and for researchers themselves. DOE will work internally, and with other agencies, to develop options for PIDs for research and development awards and contracts and will update its public access plan when those details are finalized. 

    The Department engaged with numerous communities in developing its plan and will continue to encourage participation and input from researcher communities, libraries, professional societies, publishers, Federal agency partners, and the public.

    Department of Energy, Office of Science

  • New Insights on the Prevalence of Drizzle in Marine Stratocumulus Clouds


    The Science

    Drizzle is light precipitation with droplets smaller than raindrops. Detecting drizzle in its early stages in marine stratocumulus clouds is important for studying how water in these clouds becomes rainfall. These clouds form off the west coasts of large land areas and are important to the Earth’s energy balance. Drizzle and rain formation can alter the clouds’ lifetime, structure, and how much sunlight they reflect to space. However, detecting the initial stages of drizzle is challenging for ground-based remote-sensing observations. Researchers developed a machine learning-based approach using unique radar Doppler spectra observations to identify the early stage of drizzle drops.

    The Impact

    The results demonstrate that drizzle is far more frequent than previously recognized by traditional methods. The method also provides essential information on light precipitation. This information challenges the detection limits of satellite-borne observations used in precipitation climate analyses for global climate model (GCM) evaluation.

    Summary

    Researchers commonly use radar reflectivity from millimeter-wavelength radar for drizzle detection, but it is unable to identify weak drizzle signals. Doppler skewness, a measure of the asymmetry of the Doppler spectrum, has proven to be a more sensitive quantity for detecting drizzle embryos. In this study, researchers from Brookhaven National Laboratory and Stony Brook University detected small drizzle droplets using a newly developed machine-learning technique built on drizzle retrievals that combine radar reflectivity and skewness from millimeter-wavelength radars operated by the Department of Energy’s Atmospheric Radiation Measurement (ARM) user facility. The researchers evaluated the drizzle detection algorithm against aircraft in situ measurements and then applied it to ARM observational campaigns at three different sites (Eastern North Atlantic [ENA], Measurements of Aerosols, Radiation, and Clouds over the Southern Ocean [MARCUS], and Marine ARM GPCI Investigation of Clouds [MAGIC]) to investigate drizzle occurrence in marine stratocumulus clouds.
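    As a rough illustration of the skewness quantity itself (a minimal sketch, not the authors’ retrieval code; the function and variable names are invented for this example), spectral skewness is the power-weighted third central moment of Doppler velocity, normalized by the cube of the spectrum width:

        import numpy as np

        def doppler_moments(velocity, power):
            """Mean velocity, spectrum width, and skewness of one radar Doppler spectrum."""
            w = power / power.sum()                                    # normalize spectral power to weights
            mean_v = np.sum(w * velocity)                              # first moment: mean Doppler velocity
            width = np.sqrt(np.sum(w * (velocity - mean_v) ** 2))      # second moment: spectrum width
            skew = np.sum(w * (velocity - mean_v) ** 3) / width ** 3   # standardized third moment
            return mean_v, width, skew

        # Toy spectrum: a narrow cloud-droplet peak plus a faint, faster-falling drizzle tail
        v = np.linspace(-4.0, 4.0, 256)                                # Doppler velocity bins (m/s)
        p = np.exp(-(v + 0.2) ** 2 / 0.3) + 0.05 * np.exp(-(v - 1.5) ** 2 / 1.0)
        print(doppler_moments(v, p))

    A nonzero skewness flags the asymmetry introduced by the faint drizzle tail even when reflectivity alone changes very little.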

    The results show that drizzle is far more ubiquitous than previously recognized, and that the traditional approach significantly underestimates the drizzle occurrence, especially in thin clouds with low liquid water paths. Drizzle occurrence in marine boundary-layer clouds differs among the three ARM campaigns, indicating that drizzle formation and distribution is regime dependent, controlled by microphysical and dynamical processes in the local region. Further, spaceborne radar (i.e., CloudSat) observations used to generate precipitation climatologies have low sensitivity in the light precipitation region. The new method provides essential information in this region to challenge the conventional light precipitation climatology and can be used to improve the warm rain parameterization in GCMs.

     

    Funding

    Zeen Zhu’s contributions have been supported by the Department of Energy (DOE) Office of Science, Atmospheric System Research (ASR) program’s Eastern North Atlantic Site Science award. Pavlos Kollias, Edward Luke, and Fan Yang have been supported by the DOE Office of Science ASR Program (contract no. DE-SC0012704).


    Journal Link: Atmospheric Chemistry and Physics, Jun-2022

    Department of Energy, Office of Science

  • Tips for future Argonne interns, from past Argonne interns


    Newswise — Student STEM Ambassadors draw upon their experiences as interns to offer advice on how to make the most out of an internship at Argonne National Laboratory.

    Starting your first science, technology, engineering or mathematics (STEM) internship can feel like a daunting task for college students, especially for those interning at national labs such as the U.S. Department of Energy’s Argonne National Laboratory.

    For students embarking on an internship at Argonne, here are some tips from Argonne’s Student STEM Ambassadors (SSA), who themselves completed internships at the lab.

    The first step toward success starts before the internship even begins. During the period between being accepted into a program and starting work at the lab, future interns should take the initiative to reach out to their mentors and brush up on the current state of the science they will study.


    “If you’re sitting at home and have nothing to do, just send your mentor an email,” said Alice Gao, who participated in the 2022 Science Undergraduate Laboratory Internships (SULI) program.

    “It can feel intimidating to reach out to your mentor, especially if you don’t know what you’re doing,” said David Lopez. He first interned at Argonne for the Community College Internships (CCI) in 2022, and he is returning for a second CCI program this year. ​“But the whole point of this internship is to learn, and the first step is realizing that you won’t be penalized for not knowing something; I wish I had known that before I started the internship.”

    The learning process continues when students begin their internships. SSAs encourage interns to take things nice and slow for their first week at the lab.

    “Be patient, and don’t try to rush into the research,” said Justin Griffith, a 2022 SULI intern. “Building a solid theoretical foundation for your work is what the first couple of weeks are about — developing your understanding of the material, training, and plenty of reading.”

    “For any questions you have about your research project’s goals, it’s best to ask your mentor about them early, because the longer you wait, the harder it becomes to fix the trajectory you’ve already set,” said Gao.

    Though research remains a strong priority for internships, the SSAs emphasized the importance of social interactions throughout the experience. Students should try to attend as many events as they can and talk with others, even if it takes them out of their comfort zone.

    “Go to as many events as you can, and talk to people about the cool science they’re working on,” said Griffith. ​“I got to know quite a few other interns that I still talk with now, a year later. It can be really easy to be caught up in your research, but being able to take an hour or half-hour of your day to attend a seminar, be part of a social event, or just grab lunch with an intern can be really helpful.”

    Interns’ projects culminate in their greatest challenge: giving professional STEM presentations on their research at the lab-wide Learning on the Lawn.

    “The best way to prepare is to practice casually; the more you talk about your research with others, the better you’ll be able to present your poster,” said Griffith. “It’s a great way to develop your communication skills while also showing off what you’ve learned. There is a satisfaction in being able to describe what you have done in the last 10 weeks. It feels very official, like you’ve done meaningful work.”

    Finally, the biggest tip that the SSAs could give students is to apply for more internships. An internship can make a positive difference in students’ futures, as it has for these past interns.

    “Argonne has helped me in my school life; I’ve actually used what I’ve learned from Argonne in my courses,” said Lopez. ​“Interning at Argonne is a rare chance at something new, and if you don’t take advantage of the opportunity, you may regret missing out.”

    This work was supported in part by the U.S. Department of Energy, Office of Science, Office of Workforce Development for Teachers and Scientists (WDTS).

    Argonne National Laboratory seeks solutions to pressing national problems in science and technology. The nation’s first national laboratory, Argonne conducts leading-edge basic and applied scientific research in virtually every scientific discipline. Argonne researchers work closely with researchers from hundreds of companies, universities, and federal, state and municipal agencies to help them solve their specific problems, advance America’s scientific leadership and prepare the nation for a better future. With employees from more than 60 nations, Argonne is managed by UChicago Argonne, LLC for the U.S. Department of Energy’s Office of Science.

    The U.S. Department of Energy’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time. For more information, visit https://energy.gov/science.

    Argonne National Laboratory

  • New nationwide modeling points to widespread racial disparities in urban heat stress


    Newswise — RICHLAND, Wash.— From densely built urban cores to sprawling suburbia, cities are complex. This complexity can lead to temperature hot spots within cities, with some neighborhoods (and their residents) facing more heat than others.

    Understanding this environmental disparity forms the spirit of new research led by scientists at the Department of Energy’s Pacific Northwest National Laboratory. In a new paper examining all major cities in the U.S., the authors find that the average Black resident is exposed to air that is warmer by 0.28 degrees Celsius relative to the city average. In contrast, the average white urban resident lives where air temperature is cooler by 0.22 degrees Celsius relative to the same average.

    The new work, published last week in the journal One Earth, involved a two-part effort. The study’s authors aimed to produce a more useful nationwide estimate of urban heat stress—a more accurate account of how our body responds to outdoor heat. By creating and comparing these estimates against demographic data, they also tried to better understand which populations are most exposed to urban heat stress.

    The findings reveal pervasive income- and race-based disparities within U.S. cities. Nearly all the U.S. urban population—94 percent, or roughly 228 million people—live in cities where summertime peak heat stress exposure disproportionately burdens the poor.

    The study’s authors also find that people who now live within historically redlined neighborhoods, where loan applicants were once denied on racially discriminatory grounds, would be exposed to higher outdoor heat stress than their neighbors living in originally non-redlined parts of the city. 

    The work also highlights shortcomings in the typical approach scientists take in estimating urban heat stress at these scales, which frequently relies on satellite data. This conventional satellite-based method can overestimate such disparities, according to the new work. As the world warms, the findings stand to inform urban heat response plans put forward by local governments who seek to help vulnerable groups. 

    What is heat stress? 

    The human body has evolved to operate within a relatively narrow temperature range. Raise your core body temperature beyond just six or seven degrees and drastic physiological consequences soon follow. Cellular processes break down, the heart is taxed, and organs begin to fail.

    Sweating helps. But the cooling power of sweating depends partly on how humid the environment is. When both heat and humidity are omnipresent and difficult to escape, the body struggles to adapt.

    How is heat stress measured? 

    To measure heat stress, scientists use a handful of indicators, many of which depend on air temperature and humidity. Weather stations provide such data. Because most weather stations are outside of cities, though, scientists often rely on other means to get some idea about urban heat stress, including using sensors on satellites.

    Those sensors infer the temperature of the land surface from measurements of thermal radiation. But such measurements fall short of delivering a full picture of heat stress, said lead author and Earth scientist TC Chakraborty. Measuring just the skin of the Earth, like the surface of a sidewalk or a patch of grass, said Chakraborty, offers only an idea of what it’s like to lie flat on that surface.

    “Unless you’re walking around barefoot or lying naked on the ground, you’re not really feeling that,” said Chakraborty. “Land surface temperature is, at best, a crude proxy of urban heat stress.” 

    Indeed, most of us are upright, moving through a world where air temperature and moisture dictate how heat actually feels. And these satellite data are only available for clear-sky days—another limiting factor. More complete and physiologically relevant estimates of heat stress incorporate a blend of factors, which models can provide, said Chakraborty.

    To better understand differences between satellite-derived land surface temperature and ambient heat exposure within cities, Chakraborty’s team examined 481 urbanized areas across the continental United States using both satellites and model simulations.

    NASA’s Aqua satellite provided the land surface temperature, and through model simulations that account for urban areas, the authors generated nationwide estimates of all variables required to calculate moist heat stress. Two such metrics of heat stress—the National Weather Service’s heat index and the Humidex, often used by Canadian meteorologists—allowed the scientists to capture the combined impacts of air temperature and humidity on the human body.
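    For reference, the Humidex folds air temperature and humidity (via the dew point) into a single “feels like” value. A minimal sketch using the standard Environment Canada formulation (illustrative only, not the study’s code):

        import math

        def humidex(temp_c, dewpoint_c):
            """Humidex from air temperature and dew point, both in degrees Celsius."""
            dew_k = dewpoint_c + 273.15
            # water vapour pressure (hPa) estimated from the dew point
            e = 6.11 * math.exp(5417.7530 * (1.0 / 273.16 - 1.0 / dew_k))
            return temp_c + 0.5555 * (e - 10.0)

        print(round(humidex(30.0, 24.0), 1))  # a hot, humid afternoon: roughly 41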

    They then identified heat stress hotspots across the country for summer days between 2014 and 2018. Overlaying maps of both historically redlined neighborhoods and census tracts, the team identified relationships between heat exposure and communities.

    How is heat distributed within cities?

    Residents in poorer neighborhoods often face greater heat stress. And a greater degree of income inequality in any given city often means greater heat stress exposure for its poorer residents.

    Most U.S. cities, including heavily populated cities like New York, Los Angeles, Chicago, and Philadelphia, show this disparity. But the relationship between heat stress and race-based residential segregation is even more stark. 

    Roughly 87.5 percent of the cities studied show that Black populations live in parts of the city with higher land surface temperatures, warmer air, and greater moist heat stress. Moreover, the association between the degree of heat stress disparity and the degree of segregation between white and non-white populations across cities is particularly striking, said Chakraborty.

    “The majority—83 percent—of non-white U.S. urban residents live in cities where outdoor moist heat stress disproportionately burdens them,” said Chakraborty. “Further, higher percentages of all races other than white are positively correlated with greater heat exposure no matter which variable you use to assess it.”

    In the 1930s, the U.S. federal government’s Home Owners’ Loan Corporation graded neighborhoods in an effort to rank the suitability of real estate investments. This practice is known as “redlining,” where lower grades (and consequently fewer loans) were issued to neighborhoods composed of poorer and minority groups. The authors find that these redlined neighborhoods still show worse environmental conditions.

    Neighborhoods with lower ratings face higher heat exposure than their non-redlined neighbors. Neighborhoods with higher ratings, in contrast, generally get less heat exposure. 

    This is consistent with previous research on originally redlined urban neighborhoods showing lower tree cover and higher land surface temperature. Chakraborty, however, notes that using land surface temperature would generally overestimate these disparities across neighborhood grades compared to using air temperature or heat index.

    “Satellites give us estimates of land surface temperature, which is a different variable from the temperature we feel while outdoors, especially within cities,” said Chakraborty. “Moreover, the physiological response to heat also depends on humidity, which satellites cannot directly provide, and urbanization also modifies.”

    What can be done?

    Planting more trees often comes up as a potential solution to heat stress, said Chakraborty. But densely built urban cores, where poorer and minority populations in the U.S. often live, have limited space for trees. And many previous estimates of vegetation’s potential to cool city surroundings are also based solely on land surface temperature—they are perhaps prone to similar overestimation, the authors suggest.

    More robust measurements of urban heat stress would help, they added. Factors like wind speed and solar insolation contribute to how heat actually affects the human body. But those factors are left out of most scientific assessments of urban heat stress because they are difficult to measure or model at neighborhood scales.

    In addition to Chakraborty, PNNL authors of the new work include Yun Qian. Andrew Newman at the National Center for Atmospheric Research, Angel Hsu at the University of North Carolina-Chapel Hill, and Glenn Sheriff at Arizona State University are also authors. This work was supported by DOE’s Office of Science and the National Institutes of Health.

    Pacific Northwest National Laboratory

  • IBM’s Jason Orcutt moves the world toward an interconnected quantum future


    Newswise — Jason Orcutt of IBM provides an industry perspective on quantum simulation research at the Q-NEXT quantum research center and works to connect quantum information systems around the globe.

    Glance around Jason Orcutt’s office at IBM Quantum, and you’ll see circuit boards, hiking trail maps, qubit probes and his kids’ artwork. Part office, part lab, part gallery: It’s a cross section of a life of rigorous research and vigorous recreation.

    The scene also captures the kind of activity balancing that characterizes his work as a quantum information researcher, switching between hands-on investigation and high-level research strategy. He uses these wide-ranging skills in his role as a co-design engineer for Q-NEXT, the National Quantum Information Science Research Center led by the U.S. Department of Energy’s (DOE) Argonne National Laboratory.

    A principal research scientist at IBM Quantum, Orcutt provides an industry perspective on one of the pillars of Q-NEXT research: developing simulations to better design quantum information systems.


    Q-NEXT collaborators use quantum computers and classical supercomputers to simulate the behaviors of materials used for quantum applications, which are expected to be revolutionary. In the decades ahead, scientists will deploy quantum sensors that can detect an earthquake from space and run powerful quantum computers that can rapidly suss out solutions to intractable problems.

    “We’re using simulations to better design materials and adapting those simulations to an interconnected quantum system,” Orcutt said. ​“IBM brings a future-looking perspective on the problems we need to solve to develop a really useful quantum computer. And Q-NEXT really aligns with our vision on creating new types of quantum interconnects to scale quantum computers into the future.”

    “Quantum interconnect” is a fancy way of referring to the components that link quantum devices. It could be the instruments connecting a sensor to a computer, or it could be a line on a printed circuit board. Without interconnects, quantum devices can’t talk to each other, and quantum information can’t be shared.

    At IBM Quantum, Orcutt coordinates the development of long-range quantum interconnects, which link devices separated by meters to kilometers, such as the nodes in a future quantum data center.

    “How do we extend quantum information or connect quantum systems over physical distance?” he said. ​“Right now, our IBM quantum systems are really restricted to a single chip. I and the people I work with, as well as the academic researchers such as those at Q-NEXT, are looking to develop connections between qubits that will extend beyond more than one chip.”

    Sending quantum information over longer distances is an obstacle course of physics challenges. For starters, quantum information is fragile. Qubits — the fundamental units of quantum information — fall apart at the smallest disturbance. Distance complicates matters. How do you provide qubits with safe, noise-free passage over a kilometer or more? The proposition is like asking a soap bubble not to pop as it travels down a galley of knives.

    “You can’t use the same tools to pattern a centimeter size chip as you would a meter-scale cable,” Orcutt said.

    Qubits must also be continually converted and reconverted to the right frequencies to be read by the devices they encounter on their journey. The most fundamental frequency conversion requirements arise from the different levels of thermal noise at different frequencies. For example: IBM Quantum focuses on a type of qubit that lives in the microwave frequency range. In this range, the quantum information must be cooled to a few hundredths of a degree from absolute zero to be protected from thermal noise. To be transported in room temperature materials — a requirement for long distance communication — the quantum information must be converted to the optical-wave range, a whopping 10,000 times the frequency of microwaves.
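    A back-of-the-envelope calculation (not from the article) makes the point. The average number of thermal photons in a mode of frequency f at temperature T follows the Bose-Einstein occupancy 1/(exp(hf/kT) - 1), which is enormous for gigahertz microwaves at room temperature but vanishingly small for optical frequencies:

        import math

        def thermal_photons(freq_hz, temp_k):
            """Mean thermal photon number in a mode: 1 / (exp(h*f / (k*T)) - 1)."""
            h = 6.62607015e-34   # Planck constant, J*s
            k = 1.380649e-23     # Boltzmann constant, J/K
            return 1.0 / math.expm1(h * freq_hz / (k * temp_k))

        print(thermal_photons(5e9, 300.0))    # 5 GHz microwave mode at room temperature: ~1,250 photons
        print(thermal_photons(5e9, 0.02))     # the same mode at 20 millikelvin: ~6e-6 photons
        print(thermal_photons(2e14, 300.0))   # ~200 THz optical mode at room temperature: ~1e-14 photons

    That contrast is why microwave qubits must be kept at millikelvin temperatures while optical photons can carry quantum information through room-temperature fiber.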

    The way that materials respond to the two frequency ranges is massively different. How do you engineer materials to successfully conduct information that starts as a murmur and ends in a trill?

    Such challenges are part of the growing pains of the field of quantum information science, which is working to tap the potential of information that, until recently, was kept cozily inside tiny instruments such as microchips.

    “We’re taking quantum information into places it traditionally doesn’t live,” Orcutt said. Instead of moving through chips built in clean rooms, qubits are having to find their way through ​“the messy world of macroscopic objects,” he said, such as meter-long coaxial cables or fiber optic cables that connect nodes that are miles apart.

    The scientific community is working to build quantum systems that will eventually connect the globe. Simulating them from soup to nuts is key to ensuring that the interconnected systems of the future will be successful. Orcutt draws on his experience at IBM to inform Q-NEXT’s quantum simulations work.

    “We have to reengineer our systems, and to do that, we have to simulate them,” he said. ​“But how do we reengineer our systems around quantum interconnects instead of a monolithic computing device? Systems where there are different levels of connectivity? We have to rethink not just how we build the systems, but also how we adapt our algorithms to best use them.”

    Orcutt began his journey into quantum information science at Columbia University, planning initially to be a patent lawyer, combining interests in debate and technology.

    “What I quickly realized was that there are many other ways to pursue science and have a fulfilling career that was closer to creating new technical ideas,” he said.

    He pivoted to a bachelor’s in electrical engineering, with no intention of attending graduate school. But, again, he changed his mind after a couple of happy lab experiences working on electronics and photonics. For his Ph.D. research at MIT, Orcutt built the first optical interconnects in the commercial manufacturing processes used for microprocessor and memory chips.

    “This was a wonderful project because it wasn’t just about the devices — it was connected to the systems, which is something that has always been a key draw for me throughout my life,” he said.

    In 2013, Orcutt joined IBM. It was a major shift for someone who started his career as ​“the one soldering the circuit, the one simulating the physics or coding the program,” he said. And while he continues to work directly with the technology, 10 years later, he’s also the one asking how quantum computers should be wired, what components are required to connect the qubits and what direction IBM should take to tackle these strategic and technology questions.

    Orcutt’s experience both at the bench and at the center of operations made him a valuable contributor to Q-NEXT’s 2022 quantum technology report ​“A Roadmap for Quantum Interconnects,” which outlines the discoveries needed to build practical quantum information technologies in one or two decades.

    “It was a useful exercise to define the important challenges and potential solutions that are emerging within the community and define it so it could be addressed by the center on a 10-year scale,” he said.

    Producing the roadmap is just one example of IBM’s collaborative effort with Q-NEXT.

    “The next phase of quantum information science will involve creating new materials and refined products that have superior quantum information performance. And to address that, we need a whole bunch of forces coming together, which is another reason why the shared infrastructure at centers like Q-NEXT is critical,” Orcutt said. “Trying to tackle these really hard problems is one of the main reasons we like to work with other industrial players, national labs and a broad consortium of academic groups. To us — to me and to IBM in general — that is a paramount reason to get involved in Q-NEXT: to be able to tackle the really hard problems together with the best people in the field.”

    Building the quantum workforce through education and outreach is another goal for IBM Quantum. IBM creates connections to the students, postdocs and other early-career scientists conducting research at centers like Q-NEXT, widening opportunities to grow its own quantum workforce.

    For those thinking of entering the field, Orcutt notes the excitement of quantum research.

    “When I have a new task or project, I initially have absolutely no idea how we’re going to solve it. The wonderful thing is, we’ve been able to make significant progress against our goals,” he said. ​“It’s been a wonderful journey of figuring out ways to contribute to the quantum effort and trying to solve problems along the way.”

    This work was supported by the DOE Office of Science National Quantum Information Science Research Centers as part of the Q-NEXT center.

    About Q-NEXT

    Q-NEXT is a U.S. Department of Energy National Quantum Information Science Research Center led by Argonne National Laboratory. Q-NEXT brings together world-class researchers from national laboratories, universities and U.S. technology companies with the goal of developing the science and technology to control and distribute quantum information. Q-NEXT collaborators and institutions will create two national foundries for quantum materials and devices, develop networks of sensors and secure communications systems, establish simulation and network test beds, and train the next-generation quantum-ready workforce to ensure continued U.S. scientific and economic leadership in this rapidly advancing field. For more information, visit https://q-next.org/.

    Argonne National Laboratory seeks solutions to pressing national problems in science and technology. The nation’s first national laboratory, Argonne conducts leading-edge basic and applied scientific research in virtually every scientific discipline. Argonne researchers work closely with researchers from hundreds of companies, universities, and federal, state and municipal agencies to help them solve their specific problems, advance America’s scientific leadership and prepare the nation for a better future. With employees from more than 60 nations, Argonne is managed by UChicago Argonne, LLC for the U.S. Department of Energy’s Office of Science.

    The U.S. Department of Energy’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time. For more information, visit https://energy.gov/science.

    Argonne National Laboratory

  • Direct air capture technology licensed to Knoxville-based Holocene


    Newswise — An innovative and sustainable chemistry developed at the Department of Energy’s Oak Ridge National Laboratory for capturing carbon dioxide from air has been licensed to Holocene, a Knoxville-based startup focused on designing and building plants that remove carbon dioxide from atmospheric air.

    “ORNL is tackling climate change by developing numerous technologies that reduce or eliminate emissions,” said Susan Hubbard, ORNL deputy for science and technology. “But with billions of tons of carbon dioxide already in the air, we must capture carbon dioxide from the atmosphere to slow and reverse the effects of climate change.”

    “Direct air capture allows us to collect legacy emissions,” said Radu Custelcean, a scientist in ORNL’s Chemical Sciences Division and inventor of the licensed technology. “Our technology is one of the few approaches that can do that. It offers a new, energy-efficient approach to removing CO2 directly from air.”

    In direct air capture, a large fan pulls air through a contacting chamber where the air interacts with chemical compounds that filter and capture carbon dioxide. The CO2 can then be released from the capture material and stored deep underground.

    Holocene’s founder and chief executive officer Anca Timofte said there are several chemical approaches to direct air capture, or DAC, each with benefits and drawbacks.

    “ORNL’s chemistry combines the best features of existing approaches to DAC to create a water-based, low-temperature process,” she said.

    Custelcean’s process uses an aqueous solution containing ORNL-discovered receptors called bis-iminoguanidines, or BIGs, to absorb carbon dioxide. As this happens, the BIGs turn into an insoluble crystalline salt, which can easily be separated from the liquid solution. Custelcean and his research team discovered this new chemistry by chance while conducting fundamental crystallization experiments. The resulting Bis-Iminoguanidine Negative Emission Technology, or BIG-NET, received an R&D 100 Award in 2021.
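    In simplified terms (an illustrative scheme; the exact stoichiometry and crystal composition of the BIG salt are more specific than shown here), absorbed CO2 hydrates and dissociates in water, and the basic BIG receptor takes up the released protons and crystallizes with the carbonate, pulling the equilibrium toward further absorption:

        \mathrm{CO_2(g)} \;\rightleftharpoons\; \mathrm{CO_2(aq)}, \qquad
        \mathrm{CO_2(aq)} + \mathrm{H_2O} \;\rightleftharpoons\; \mathrm{HCO_3^-} + \mathrm{H^+} \;\rightleftharpoons\; \mathrm{CO_3^{2-}} + 2\,\mathrm{H^+},
        \qquad \mathrm{BIG} + 2\,\mathrm{H^+} + \mathrm{CO_3^{2-}} \;\longrightarrow\; \mathrm{[BIGH_2][CO_3]}\,\mathrm{(s)}.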

    The BIGs discovery propelled Custelcean’s research in a new direction.

    “Doing basic research under DOE’s Basic Energy Sciences program, I have the flexibility to change direction if I find something interesting,” Custelcean said. “The basic research allows us to better understand all the elementary reactions and processes involved. But through licensing, we get to see a progression with our partners in the development of the technology. We’re involved in the full spectrum of research.”

    Timofte, originally from Romania, has a background in chemical engineering and worked at one of the world’s first direct air capture companies, Switzerland-based Climeworks. She contributed to the design of the company’s largest plant, which is in Iceland. With a growing interest in the market and finance aspects of carbon capture, she left Climeworks to enroll in the Master of Business Administration program at Stanford University to focus on climate technology and entrepreneurship.

    Timofte avidly followed the published literature around carbon capture. Custelcean’s publications caught her eye — she recognized the name as being Romanian — and she saw how his chemistry could address the major hurdles of the two established direct air capture processes.

    “The more I learned about his research, the more I saw the potential and the more I wanted to start my own company to pursue it,” she said. “With the encouragement of my professors, I founded Holocene and licensed the technology so I could work on it in a lab and think more about commercialization.”

    With Holocene established and the ORNL technology licensed, Timofte is further developing her business plans through Innovation Crossroads, a DOE Lab-Embedded Entrepreneurship Program funded by DOE’s Advanced Materials and Manufacturing Technologies Office, Building Technologies Office and the Tennessee Valley Authority.

    “When you’re in the position of starting a new company, having a group of mentors like the ones at Innovation Crossroads and the ability to work with ORNL is very appealing,” Timofte said. “I was happy to get into the program. It helps with the normal challenges that all startups have, but also very importantly, it connects us with the local ecosystem in Knoxville and gives us access to the scientists who developed the chemistry. We can work together and transfer knowledge — we can learn more about how the licensed technology works, work on features, troubleshoot issues, de-risk and optimize the chemistry. It’s a nice continuation of the collaboration.”

    Innovation Crossroads provides Holocene with a two-year cooperative research and development agreement to continue working with Custelcean and ORNL. Through this partnership, Holocene staff learn more about the science behind the technology, troubleshoot issues in testing and scale-up and connect with mentors at the lab and in the community.

    “Holocene is a great example of how the interconnected climate tech ecosystem can support a new company through the stages of development,” said Dan Miller, Innovation Crossroads program lead.

    Timofte is a Breakthrough Energy Fellow, a program launched by Breakthrough Energy — which was founded by Bill Gates — focused on accelerating innovation in sustainable energy and other technologies to reach net-zero emissions by 2050. Holocene is also part of the Spark Incubator Program, an entrepreneurial support program at the University of Tennessee Research Park’s Spark Innovation Center.

    Next up, Holocene and ORNL will conduct bench-scale testing funded by DOE’s Office of Fossil Energy and Carbon Management with the aim of using ORNL’s chemistry to further develop and deploy direct air capture at a commercial scale.

    ORNL senior commercialization manager Alex DeTrana negotiated the terms of the license.

    The invention development team includes ORNL’s Costas Tsouris, Gyoung Gug Jang and Diana Stamberga. Charles Seipp and Neil Williams, formerly of ORNL, also participated.

    UT-Battelle manages ORNL for the Department of Energy’s Office of Science, the single largest supporter of basic research in the physical sciences in the United States. The Office of Science is working to address some of the most pressing challenges of our time. For more information, please visit energy.gov/science.

    Oak Ridge National Laboratory

  • Resolving a Mathematical Puzzle in Quarks and Gluons in Nuclear Matter


    The Science

    The building blocks of atomic nuclei are protons and neutrons, which are themselves made of even more fundamental particles: quarks and gluons. These particles interact via the “strong” force, one of the four fundamental forces of nature. They make up the nuclei at the heart of every atom. They also make up forms of hot or dense nuclear matter that exhibit exotic properties. Scientists study the properties of hot and cold nuclear matter in relativistic heavy ion collision experiments and will continue to do so using the future Electron-Ion Collider. The ultimate goal is to understand how complex forms of matter emerge from elementary particles affected by strong forces.

    The Impact

    Theoretical calculations involving the strong force are complex. One aspect of this complexity arises because there are many ways to perform these calculations. Scientists refer to some of these as “gauge choices.” All gauge choices should produce the same result for the calculation of any quantity that can be measured in an experiment. However, one particular choice, called “axial gauge,” has puzzled scientists for years because of difficulties in obtaining consistent results upon making this choice. This recent study resolves this puzzle and paves the way for reliable calculations of hot and cold nuclear matter properties that can be tested in current and future experiments.
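    As general background (not a result of this study), a gauge choice amounts to imposing an extra condition on the gluon field. In axial gauge the condition is set by a fixed four-vector n, and any measurable quantity must come out the same no matter which gauge is chosen; schematically, in standard notation,

        n^{\mu} A^{a}_{\mu}(x) = 0, \qquad \langle \mathcal{O}_{\text{observable}} \rangle_{\text{axial gauge}} = \langle \mathcal{O}_{\text{observable}} \rangle_{\text{any other gauge}} .

    The puzzle described above arose because calculations made directly in this gauge appeared to violate the second, gauge-independence requirement.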

    Summary

    The exotic form of nuclear matter that physicists study in relativistic heavy ion collisions is called the quark-gluon plasma (QGP). This form of matter existed in the early universe. Physicists explore its properties in heavy ion collision experiments by recreating the extremely high temperatures last seen microseconds after the Big Bang. By analyzing experimental data from the collisions and comparing them with theoretical calculations, physicists can ascertain various properties of the QGP. Calculations made in the so-called “axial gauge” had previously seemed to imply that two QGP properties that describe how heavy quarks move through the QGP were the same.

    Researchers at the Massachusetts Institute of Technology and the University of Washington have now found this implication to be incorrect. The study also carefully analyzed the subtle conditions for when axial gauge can be employed and explained why the two properties are different. Finally, it showed that two distinct methods for measuring how gluons are distributed inside nuclei must yield different results. Gluons are the particles that carry the strong force. This prediction will be tested at the future Electron-Ion Collider.

     

    Funding

    This work is supported by the Department of Energy Office of Science, Office of Nuclear Physics and by the Office of Science, Office of Nuclear Physics, InQubator for Quantum Simulation (IQuS).


    Journal Link: Physical Review Letters, Feb-2023

    [ad_2]

    Department of Energy, Office of Science

    Source link

  • New Insights on the Interplay of Electromagnetism and the Weak Nuclear Force

    New Insights on the Interplay of Electromagnetism and the Weak Nuclear Force

    [ad_1]

    The Science

    Outside atomic nuclei, neutrons are unstable particles, with a lifetime of about fifteen minutes. The neutron disintegrates due to the weak nuclear force, leaving behind a proton, an electron, and an antineutrino. The weak nuclear force is one of the four fundamental forces in the universe, along with the strong force, the electromagnetic force, and the gravitational force. Comparing experimental measurements of neutron decay with theoretical predictions based on the weak nuclear force can reveal so-far undiscovered interactions. To do so, researchers must achieve extremely high levels of precision. A team of nuclear theorists has uncovered a new, relatively large effect in neutron decay that arises from the interplay of the weak and electromagnetic forces. 

    The Impact

    This research identified a shift in the strength with which a spinning neutron experiences the weak nuclear force. This has two major implications. First, scientists have known since 1956 that due to the weak force, a system and one built like its mirror image do not behave in the same way. In other words, mirror reflection symmetry is broken. This research affects the search for new interactions, technically known as “right-handed currents,” that, at very short distances of less than one hundred quadrillionths of a centimeter, restore the universe’s mirror-reflection symmetry. Second, this research points to the need to compute electromagnetic effects with higher precision. Doing so will require the use of future high-performance computers.

    Summary

    A team of researchers computed the impact of electromagnetic interactions on neutron decay due to the emission and absorption of photons, the quanta of light. The team included nuclear theorists from the Institute for Nuclear Theory at the University of Washington, North Carolina State University, the University of Amsterdam, Los Alamos National Laboratory, and Lawrence Berkeley National Laboratory. 

    The calculation was performed with a modern method, known as “effective field theory,” that efficiently organizes the importance of fundamental interactions in phenomena involving strongly interacting particles. The team identified a new percent-level shift to the nucleon axial coupling, gA, which governs the strength of decay of a spinning neutron. The new correction originates from the emission and absorption of electrically charged pions, which are mediators of the strong nuclear force. While effective field theory provides an estimate of the uncertainties, improving on the current precision will require advanced calculations on Department of Energy supercomputers. The researchers also assessed the impact on searches for right-handed currents. They found that after including the new correction, experimental data and theory are in good agreement and current uncertainties still allow for new physics at a relatively low mass scale.
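    For context on why a percent-level shift in gA matters, the leading-order textbook relation (given here as background, not a formula taken from this study) ties the neutron decay rate directly to the axial coupling:

        \Gamma_{n} = \frac{1}{\tau_{n}} \propto G_{F}^{2}\,|V_{ud}|^{2}\,\left(1 + 3 g_{A}^{2}\right), \qquad g_{A} \approx 1.27 ,

    so even a one percent change in gA shifts the predicted lifetime at a comparable percent level, which is significant at the precision of modern neutron-decay experiments.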

     

    Funding

    This research was supported by the Department of Energy Office of Science, Office of Nuclear Physics; the Laboratory Directed Research and Development program at Los Alamos National Laboratory; the National Science Foundation; and the Dutch Research Council.

    [ad_2]

    Department of Energy, Office of Science

    Source link

  • Ultralow temperature terahertz microscope capabilities enable better quantum technology

    Ultralow temperature terahertz microscope capabilities enable better quantum technology

    [ad_1]

    Newswise — A team of scientists from the Department of Energy’s Ames National Laboratory has developed a way to collect terahertz imaging data on materials under extreme magnetic and cryogenic conditions. They accomplished this with a new scanning probe microscope recently developed at Ames Lab. The team used the ultralow temperature terahertz microscope to take measurements on superconductors and topological semimetals. These materials were exposed to high magnetic fields and temperatures below that of liquid helium (below 4.2 kelvin, or about -452 degrees Fahrenheit).
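    As a quick check on the quoted unit conversion, here is a minimal sketch; the helper function name is ours, chosen only for illustration:

        # Convert kelvin to degrees Fahrenheit: F = K * 9/5 - 459.67
        def kelvin_to_fahrenheit(temp_k: float) -> float:
            return temp_k * 9.0 / 5.0 - 459.67

        print(kelvin_to_fahrenheit(4.2))  # about -452.1, matching the figure quoted above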

    According to Jigang Wang, a scientist at Ames Lab, professor of Physics and Astronomy at Iowa State University, and the team leader, the team has been improving their terahertz microscope since it was first completed in 2019. “We have improved the resolution in terms of the space, time and energy,” said Wang. “We have also simultaneously improved operation to very low temperatures and high magnetic fields.”

    To expand their terahertz microscope’s capabilities to operate in extreme cryogenic and magnetic environments, Wang explained that his team developed a custom microscopy insert for a cryostat, a device used to maintain extremely cold temperatures. This insert was designed specifically for use with the cryogenic terahertz microscope.

    The new microscope capabilities allowed the team to examine superconductors and topological semimetals, both of which operate at these low temperatures. These materials can also move electricity with almost zero energy loss and are important for furthering quantum computing technology.

    Based on their research so far, Wang said that the microscope could lead to development of new, improved materials for highly coherent quantum devices and a better understanding of superconducting and topological materials.

    This research is further discussed in the paper, “A sub-2 Kelvin Cryogenic Magneto-Terahertz Scattering-type Scanning Near-Field Optical Microscope (cm-THz-sSNOM),” written by R. H. J. Kim, J.-M. Park, S. J. Haeuser, L. Luo, and J. Wang, and published in Review of Scientific Instruments.

     

     

    Ames National Laboratory is a U.S. Department of Energy Office of Science National Laboratory operated by Iowa State University. Ames Laboratory creates innovative materials, technologies, and energy solutions. We use our expertise, unique capabilities, and interdisciplinary collaborations to solve global problems.

    Ames Laboratory is supported by the Office of Science of the U.S. Department of Energy. The Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time. For more information, please visit https://energy.gov/science.

    [ad_2]

    Ames National Laboratory

    Source link

  • Rouven Essig: Then and Now / 2012 Early Career Award Winner

    Rouven Essig: Then and Now / 2012 Early Career Award Winner

    [ad_1]

    WHAT DID THE 2012 EARLY CAREER AWARD ALLOW YOU TO DO?

    Extensive evidence suggests that a staggering 85% of the matter in our universe is dark matter. However, its identity remains unknown. Even its most basic properties – such as how much it weighs and how it interacts with known matter – remain unknown.

    As a theoretical particle physicist, I conceive of new ideas for what constitutes dark matter. I also develop new experimental concepts for how to detect dark matter particles and any unknown forces that allow dark matter to interact with ordinary matter. My theoretical research impacts how other scientists do research at the cosmic, intensity, and energy frontiers.

    For the cosmic frontier, I have helped pioneer novel detection concepts to search for dark matter below the mass of protons. This has led to new exciting experiments, such as SENSEI, as well as new DOE research and development efforts, such as OSCURA. These projects have unprecedented sensitivity to a multitude of dark matter candidates that were previously unexplored.

    For the intensity frontier, I have conceived of new experiments. In these experiments, intense beams of electrons hit a target and potentially create particles that mediate new forces beyond the known electromagnetic, strong, and weak forces. This has led to new fixed-target experiments at DOE’s Jefferson Lab, including APEX and HPS.  

    For the energy frontier, I have shown how the Higgs bosons produced in high-energy proton-proton collisions at the Large Hadron Collider can shed light on dark matter and new forces. This has led to new searches for non-standard decays of the Higgs boson, in which it disintegrates in ways not expected by our standard theory.

    Most importantly, the DOE Early Career Award allowed me to train several graduate students and a postdoctoral researcher in particle physics. It enabled me to establish a strong research program and research group early on in my career, providing a strong foundation from which I continue to benefit.

    ABOUT:

    Rouven Essig is a professor in the C.N. Yang Institute for Theoretical Physics at Stony Brook University.

    SUPPORTING THE DOE SC MISSION:

    The Early Career Research Program provides financial support that is foundational to early career investigators, enabling them to define and direct independent research in areas important to DOE missions. The development of outstanding scientists and research leaders is of paramount importance to the Department of Energy Office of Science. By investing in the next generation of researchers, the Office of Science champions lifelong careers in discovery science.

    For more information, please go to the Early Career Research Program.

    THE 2012 PROJECT ABSTRACT:

    Title: Particle Physics at the Cosmic, Intensity, and Energy Frontiers

    Abstract

    Major efforts at the intensity, cosmic, and energy frontiers of particle physics are rapidly furthering our understanding of the fundamental constituents of Nature and their interactions. The overall objectives of this research project are (1) to interpret and develop the theoretical implications of the data collected at these frontiers and (2) to provide the theoretical motivation, basis, and ideas for new experiments and for new analyses of experimental data.

    Within the Intensity Frontier, an experimental search for a new force mediated by a GeV‐scale gauge boson will be carried out with the A’ Experiment (APEX) and the Heavy Photon Search (HPS), both at Jefferson Laboratory. Within the Cosmic Frontier, contributions are planned to the search for dark matter particles with the Fermi Gamma‐ray Space Telescope and other instruments. A detailed exploration will also be performed of new direct detection strategies for dark matter particles with sub‐GeV masses to facilitate the development of new experiments. In addition, the theoretical implications of existing and future dark‐ matter‐related anomalies will be examined. Within the Energy Frontier, the implications of the data from the Large Hadron Collider will be investigated. Novel search strategies will be developed to aid the search for new phenomena not described by the Standard Model of particle physics. By combining insights from all three particle physics frontiers, this research aims to increase our understanding of fundamental particle physics.

    RESOURCES:

    D Curtin, R Essig, S Gori, P Jaiswal, A Katz, T Liu, Z Liu, D McKeen, J Shelton, M Strassler, Z Surujon, B Tweedie, and YM Zhong, “Exotic decays of the 125 GeV Higgs boson.” Phys Rev D 90, 75004 (2014). [DOI: 10.1103/PhysRevD.90.075004]

    R Essig, M Fernández-Serra, J Mardon, A Soto, T Volansky, and TT Yu, “Direct detection of sub-GeV dark matter with semiconductor targets.” Journal of High Energy Physics 2016, 46 (2016). [DOI: 10.1007/JHEP05(2016)046]

    R Essig, T Volansky, and TT Yu, “New constraints and prospects for sub-GeV dark matter scattering off electrons in xenon.” Phys Rev D 96, 043017 (2017). [DOI: 10.1103/PhysRevD.96.043017] 

    [ad_2]

    Department of Energy, Office of Science

    Source link

  • Princeton University awards plasma physics graduate student Suying Jin a highly selective honorific fellowship

    Princeton University awards plasma physics graduate student Suying Jin a highly selective honorific fellowship

    [ad_1]

    Jin expressed deep appreciation on receiving the fellowship. “I feel truly honored, and I’m fortunate to be at an institution that lifts up its students in this way,” she said. “I am also deeply grateful for all the support, academic and otherwise, that has made this possible.”

    The Program in Plasma Physics is based at the Department of Energy’s Princeton Plasma Physics Laboratory (PPPL) and is a graduate program within the Department of Astrophysical Sciences at Princeton University. Graduates of the program have shaped the field of plasma physics in recent decades, working in academia, national laboratories, industry and beyond.

    Spontaneously arising order

    Jin’s dissertation investigates the challenging question of how plasmas self-organize in the presence of magnetic fields. “You see it happening all the time, everywhere in the universe, where you have order spontaneously arising from turbulence or chaos,” she said. “I like to go after things that defy intuition, and much about the mechanism by which this self-organization occurs remains mysterious.”

    When her advisor, principal research physicist Ilya Dodin, offered Jin several thesis topics to choose from, “Suying fearlessly chose the most challenging project over low-hanging fruits,” Dodin said. “She felt that although immediate rewards were not to be expected, the results of that project would be more important in the long run. I have much respect for that attitude,” he said. “Suying is an outstanding researcher and a classic role model who strongly deserves a Princeton honorific fellowship.”

    Jin traces her passionate interest in plasma science to her preparation for a final exam at the University of California, Los Angeles  (UCLA), where she graduated in physics with honors in 2018. “I was working my way through an electrodynamics textbook, and I came across this problem that introduced me to the whole idea of plasma,” she said. “It was my first time thinking about what would happen if you had a bunch of charged particles together and it seemed like anything would be possible in a medium like that.”

    Basic Science

    While her thesis topic “is basic science and not fusion focused, ultimately, I think the fusion effort will benefit greatly from just fundamental plasma research,” she said. “There’s a lot we still need to understand about plasmas, period.”

    Her dedication to learning extends to teaching, which she has pursued as a teaching assistant at the graduate and undergraduate levels. She’s taught in Dodin’s graduate class in plasma waves, where “she was very proactive and did a great job,” he recalls. She also helped teach an undergraduate course in fusion and fission that has expanded her interest in real-world problems.

    Her research has led to frequent peer-reviewed publications, including five papers as a first author and two as a co-author. In addition, she shares a patent disclosure with two PPPL physicists.

    Outside the classroom, Jin has been an active participant in plasma programs. She was a cofounder of Princeton Women in Plasma Physics (PWiPP), whose mission includes promoting “a supportive community for women and gender minorities in plasma physics at Princeton.”  She has lectured at plasma physics workshops and been a panelist and discussion leader at a local conference for undergraduate women in physics.

    Tae Kwon Do

    When not deeply engaged in plasma physics, Jin pursues long-time hobbies including the Korean martial art Tae Kwon Do, in which she holds a black belt and has practiced for 15 years. She also enjoys cooking and playing the piano.

    Looking ahead, Jin says she would prefer a teaching job to a purely research position and sees herself “continuing down the path of academia.” “I’ve had such fantastic mentors from day one when I entered this field, and I would really like to work with students to pass that mentorship along.”

    The Program in Plasma Physics has graduated more than 300 students since it began in 1959. In an environment that, over the past few decades, has seen enormous changes in the fields of plasma physics and controlled fusion, the program has consistently focused on fundamentals in physics and mathematics and on intense exposure to contemporary experimental and theoretical research in plasma physics.

    [ad_2]

    Princeton Plasma Physics Laboratory

    Source link

  • Sustaining U.S. Nuclear Power Plants Could be Key to Decarbonization

    Sustaining U.S. Nuclear Power Plants Could be Key to Decarbonization

    [ad_1]

    Newswise — Nuclear power is the single largest source of carbon-free energy in the United States and currently meets nearly 20 percent of the nation’s electricity demand. Many analyses have investigated the potential of future nuclear energy contributions in addressing climate change. However, few assess the value of existing nuclear power reactors.

    Research led by Pacific Northwest National Laboratory (PNNL) Earth scientist Son H. Kim with the Joint Global Change Research Institute (JGCRI), a partnership between PNNL and the University of Maryland, has added insight to the scarce literature and is the first to evaluate nuclear energy for meeting deep decarbonization goals. Kim sought to answer the question: Just how much do our existing nuclear reactors contribute to the mission of meeting the country’s climate goals, both now and if their operating licenses were extended?

    As the world races to discover solutions for reaching net zero, Kim’s report quantifies the economic value of bringing the existing nuclear fleet into the year 2100 and outlines its significant contributions in limiting global warming.

    Plants slated to close by 2050 could be among the most important players in a challenge that requires all carbon-free technology solutions that are available—emerging and existing—the report finds. New nuclear technology also has a part to play, and its contributions could be boosted by driving down construction costs.  

    “Even modest reductions in capital costs could bring big climate benefits,” said Kim. “Significant effort has been incorporated into the design of advanced reactors to reduce the use of all materials in general, such as concrete and steel, because that directly translates into reduced costs and carbon emissions.”

    Nuclear power reactors face an uncertain future

    The nuclear power fleet in the United States consists of 93 operating reactors across 28 states. Most of these plants were constructed and deployed between 1970 and 1990. This means half of the fleet has outlived its original operating license lifetime of 40 years. While most reactors have had their licenses renewed for an additional 20 years, and some for yet another 20, the total number of reactors that will receive a lifetime extension to operate a full 80 years from deployment is uncertain.

    Other countries also rely on nuclear energy. In France, for example, nuclear energy provides 70 percent of the country’s power supply. France and other countries will also have to consider whether to extend the lifetimes of their reactors, retire them, or build new, modern ones. However, the U.S. faces the potential retirement of a large share of its reactors in a short period of time—this could have a far stronger impact than the staggered closures other countries may experience.

    “Our existing nuclear power plants are aging, and with their current 60-year lifetimes, nearly all of them will be gone by 2050. It’s ironic. We have a net zero goal to reach by 2050, yet our single largest source of carbon-free electricity is at risk of closure,” said Kim.

    Exploring scenarios of lifetime extensions for nuclear power reactors

    Kim has built computational models that explore the interplay between economic processes, energy demand, and Earth’s climate since joining PNNL and JGCRI in 1995, when he was a doctoral intern with a fresh PhD in nuclear engineering. At JGCRI, researchers explore interactions between human, energy, and environmental systems to provide data for managing risks and analyzing options. His research is inspired by a drive to solve the energy and environmental crisis using modeling capabilities and tools like the Global Change Analysis Model (GCAM), developed at PNNL.

    Kim used GCAM to model multiple scenarios of extending the lifetime of the existing nuclear fleet into 2100. The article, published in Nuclear Technology, put the value of lifetime license extensions from 40 to 100 years at $330 billion to $500 billion in mitigation cost savings under a scenario that limits global warming to 2°C. Mitigation cost savings, or the carbon value, are the dollars saved by reducing greenhouse gas emissions. Legacy nuclear reactors alone have a carbon value of $500 billion if operational for 100 years. Every gigawatt of capacity, or roughly one nuclear power reactor, translates to $5 billion saved. Because that gigawatt generates electricity without emitting carbon into Earth’s atmosphere, no money needs to be spent to mitigate its effects.

    Maintaining existing nuclear power plants avoids replacing reactors with electricity sources that produce carbon emissions. In states where nuclear reactors have been shut down, carbon emissions have increased from replacing the carbon-free electricity with natural gas-generated electricity.

    Kim determined that lifetime extensions of existing nuclear power reactors from 60 to 80 years, without adding new nuclear capacity, contributed to a reduction of approximately 0.4 gigatons of carbon dioxide (GtCO2) emissions per year by 2050. The total cumulative difference in CO2 emissions between 2020 and 2100, in a scenario with lifetime extensions and future deployment of nuclear power plants (as compared to a scenario with a moratorium on new nuclear power plants), amounts to as much as 57 GtCO2.

    How much is 57 GtCO2? According to the International Energy Agency, U.S. carbon emissions in 2022 were 4.7 Gt, which means nuclear energy could save approximately 12 years’ worth of carbon emissions.
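    A back-of-envelope check of the figures quoted above, using only numbers from the article (the variable names and the roughly 100-gigawatt fleet approximation are ours, for illustration):

        # Rough consistency checks on the quoted figures
        carbon_value_total = 500e9      # $500 billion carbon value of the legacy fleet over 100 years
        fleet_gigawatts = 100           # approximate U.S. fleet size: 93 reactors of roughly 1 GW each
        print(carbon_value_total / fleet_gigawatts / 1e9)        # ~5  -> about $5 billion per gigawatt

        cumulative_avoided_gtco2 = 57   # cumulative CO2 difference, 2020-2100, quoted above
        us_emissions_2022_gt = 4.7      # U.S. CO2 emissions in 2022 (IEA figure quoted above)
        print(cumulative_avoided_gtco2 / us_emissions_2022_gt)   # ~12 -> about 12 years' worth of emissions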

    An Intergovernmental Panel on Climate Change report on nuclear energy stated, “Nuclear power is therefore an effective greenhouse gas (GHG) mitigation option, especially through license extensions of existing plants enabling investments in retro-fitting and upgrading.”

    However, in a follow-on report to his research, Kim addresses the additional savings potential of driving down capital costs of building new nuclear power plants.

    Removing the uncertainty in nuclear power costs can increase emissions savings

    Building new nuclear power plants is expensive and construction takes a long period of time. The largest costs are often capital costs: the one-time price paid to build new structures and equipment.

    Advanced reactors—including small modular reactors and microreactors—are being developed with new technologies, enhanced security features, smaller physical footprints, and more flexible deployment options. They are expected to play an important role in the future U.S. electricity system and carbon mitigation efforts.

    “One of the most important attributes of small modular reactors and microreactors is the reduced construction time,” Kim said. “SMRs and microreactors will be factory fabricated and delivered to site on trucks, and the uncertainty associated with financing cost should be reduced or eliminated.”

    Kim used GCAM to investigate a range of nuclear plant capital costs with scenarios of alternative carbon mitigation policies, and U.S. economy-wide net-zero emission goals by 2050, 2060, and 2070.

    Among the multiple findings in the report for DOE’s Office of Nuclear Energy, Kim found that an aggressive reduction of nuclear construction costs has a clear and pronounced impact on the expanded deployment of nuclear power under all scenarios, even without an explicit carbon mitigation policy.

    Continuing to generate electricity while removing all emissions of greenhouse gases by mid-century is a difficult challenge. “We must utilize all carbon-free technologies that are available to us,” said Kim, “and one of the great values of nuclear energy is that it doesn’t emit carbon while it’s generating power.”

    ###

    About PNNL

    Pacific Northwest National Laboratory draws on its distinguishing strengths in chemistry, Earth sciences, biology and data science to advance scientific knowledge and address challenges in sustainable energy and national security. Founded in 1965, PNNL is operated by Battelle for the Department of Energy’s Office of Science, which is the single largest supporter of basic research in the physical sciences in the United States. DOE’s Office of Science is working to address some of the most pressing challenges of our time. For more information, visit https://energy.gov/science. For more information on PNNL, visit PNNL’s News Center. Follow us on Twitter, Facebook, LinkedIn and Instagram.

    [ad_2]

    Pacific Northwest National Laboratory

    Source link

  • Zeroing in on a Fundamental Property of the Proton’s Internal Dynamics

    Zeroing in on a Fundamental Property of the Proton’s Internal Dynamics

    [ad_1]

    The Science

    Inside the proton are elementary particles called quarks. Quarks and protons have an intrinsic angular momentum called spin. Spin can point in different directions. When it is perpendicular to the proton’s momentum, it is called a transverse spin. Just like the proton carries an electric charge, it also has another fundamental charge called the tensor charge. The tensor charge is the net transverse spin of quarks in a proton with transverse spin. The only way to obtain the tensor charge from experimental data is using the theory of quantum chromodynamics (QCD) to extract the “transversity” function. This universal function encodes the difference between the number of quarks with their spin aligned and anti-aligned to the proton’s spin when it is in a transverse direction. Using state-of-the-art data science techniques, researchers recently made the most precise empirical determination of the tensor charge.
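    In the notation commonly used for this quantity (a schematic definition given as background; conventions vary slightly between papers), the tensor charge of a quark flavor q is the first moment of the transversity distribution h1:

        \delta q = \int_{0}^{1} dx \, \left[\, h_{1}^{q}(x) - h_{1}^{\bar{q}}(x) \,\right] ,

    that is, the net transverse polarization of quarks minus antiquarks in a transversely polarized proton, summed over momentum fractions x.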

    The Impact

    Due to the phenomenon known as “confinement,” quarks are always bound in the proton or other hadrons (particles with multiple quarks). The challenge is to connect the theory of quark interactions (QCD) to experimental measurements of high-energy collisions involving hadrons. In this study, researchers used a complete collection of transverse-spin data from electron-positron, electron-proton, and proton-proton scattering in the first global analysis of its kind. They employed this data to make the most precise known empirical calculation of the tensor charge. Scientists need a precise and accurate determination of the proton’s tensor charge to understand the proton’s internal structure and the dynamics of QCD strong interactions. This information is also very important in searches for new physics.

    Summary

    Researchers from the Coordinated Theoretical Approach to Transverse Momentum Dependent Hadron Structure in QCD Topical Collaboration (TMD Collaboration), working in conjunction with the Thomas Jefferson National Accelerator Facility (Jefferson Lab) Angular Momentum Collaboration (JAM Collaboration), analyzed data from a wide range of experiments where protons and/or quarks were transversely polarized. This allowed for the most precise empirical determination of the proton’s tensor charge. The tensor charge is not only a fundamental property of the proton but also needed in searches for new physics. The results were then compared to computations of the proton’s tensor charge by lattice QCD, which simulates the proton’s structure on a supercomputer. After about a decade of results showing disagreement between empirical methods and lattice QCD for the proton’s tensor charge, researchers for the first time found agreement between the two.

    The empirical study was performed using QCD theory and state-of-the-art numerical methods. A crucial part of the analysis was the utilization of data from electron-positron, electron-proton, and proton-proton scattering. This opens a new frontier in QCD global analyses to simultaneously include all possible measurements, like those from the future Electron-Ion Collider and Jefferson Lab 12 GeV, to continue to increase the precision and accuracy of extracting the proton’s tensor charge.

     

    Funding

    This work was supported by the Department of Energy Office of Science, Nuclear Physics program under the Coordinated Theoretical Approach to Transverse Momentum Dependent Hadron Structure in QCD (TMD Topical Collaboration). This work was also supported in part by the Department of Energy and the National Science Foundation and the agencies’ Early Career Programs.


    Journal Link: Physical Review D, Aug-2022

    [ad_2]

    Department of Energy, Office of Science

    Source link

  • Oak Ridge National Laboratory’s Robert Wagner receives 2023 SAE Medal of Honor

    Oak Ridge National Laboratory’s Robert Wagner receives 2023 SAE Medal of Honor

    [ad_1]

    Newswise — SAE International has awarded Oak Ridge National Laboratory Buildings and Transportation Science Division Director Robert Wagner the SAE Medal of Honor for his dedication to and support of the organization’s mission of advancing mobility solutions.

    Wagner was presented with the award at a ceremony in Detroit. This is the most prestigious award that the automotive engineering society annually bestows upon one of its more than 128,000 members for individual achievement.

    Wagner has been a champion of SAE since the late 1990s and has founded, organized or chaired more than 20 SAE International symposiums, panels and conferences. In 2011, during a time of significant and rapid breakthroughs in engine technologies, he co-founded the High Efficiency Internal Combustion Engine Symposium and helped develop it into a premier global transportation event. Building on that success, in 2016 he co-founded a symposium that focused on range extenders and brought together leaders from the U.S. government and across the profession to share insights into the challenges and opportunities of integrating high-efficiency engines into electrified drivetrains. This was one of the first SAE events focused on electric vehicles and hybridization, helping the organization remain at the forefront of new technology innovations on the global stage.

    More recently, as transportation research focus has shifted to net-zero-carbon fuels, Wagner recognized the need to create an opportunity for a diverse group of global stakeholders to have a neutral forum setting in which to convene and exchange ideas. This led to a series of symposiums dedicated to net-zero initiatives and the interface with engines and fuel.

    Wagner has focused on passing down the legacy of planning and organizing to other researchers, inspiring and mentoring a new generation of engineers to understand the importance that symposiums can have on securing SAE International’s reputation as a world authority on automotive engineering.

    “I am pleased to see that SAE has recognized Robert for his contributions to advancing transportation research, both nationally and internationally,” said Xin Sun, associate laboratory director for ORNL’s Energy Science and Technology Directorate. “His leadership at ORNL has been instrumental in maintaining the laboratory’s reputation as a leader in transportation and mobility research and development.”

    At ORNL, Wagner has led transportation research and initiatives for more than 20 years and is well regarded as a scientific leader, strategic planner, mentor and collaborator, working with the Department of Energy, other national laboratories, academia and industry partners. Within this role, he stewards two DOE user facilities — the National Transportation Research Center and the Building Technologies Research and Integration Center. He originally came to ORNL as an undergraduate student in 1992 and then joined ORNL as a postdoctoral research fellow in 1999, advancing to a distinguished research staff role followed by leadership roles in which he directed a diverse portfolio of transportation research. For 10 years, Wagner served as DOE’s laboratory relationship manager for advanced combustion, emissions and fuels and was a founding member of the DOE initiative on the Co-Optimization of Fuels and Engines.

    He is an SAE Fellow, two-time winner of the SAE International Forest R. McFarland Award, and a recipient of the SAE International Leadership Citation. In 2019, Wagner was named in the Inaugural SAE Top Contributor Class based on his volunteer and engagement contributions. He has co-authored 40 SAE publications, presented 14 invited talks at SAE International events and served on multiple committees and the editorial board of the SAE International Journal of Engines. Wagner is also a Senior Member of the Institute of Electrical and Electronics Engineers, a Fellow of the American Society of Mechanical Engineers and the American Association for the Advancement of Science, and has won numerous awards from other organizations for research, leadership and service.

    A native of Missouri and first-generation college graduate, he earned his doctoral degree in mechanical engineering from the Missouri University of Science & Technology, where he delivered the commencement address to Ph.D. graduates in December 2022.

    UT-Battelle manages ORNL for the Department of Energy’s Office of Science, the single largest supporter of basic research in the physical sciences in the United States. The Office of Science is working to address some of the most pressing challenges of our time. For more information, please visit energy.gov/science.

    [ad_2]

    Oak Ridge National Laboratory

    Source link

  • Understanding the Origin of Matter with the CUORE Experiment

    Understanding the Origin of Matter with the CUORE Experiment

    [ad_1]

    The Science

    There is much that we do not yet know about neutrinos. Neutrinos are very light, chargeless, and elusive particles that are involved in a process named beta decay and that can help us understand the origin of matter in the Universe. Beta decay is a type of radioactive decay in which a neutron converts into a proton while emitting an electron and an antineutrino. Beta decay is very common; it occurs about a dozen times per second in a banana. There might also be an ultra-rare kind of beta decay that emits two electrons but no neutrinos. Nuclear physicists around the world are searching for this neutrinoless double beta decay (NLDBD) in different nuclei. The interest in these decays arises from their potential to reveal unsolved mysteries related to the Universe’s creation of matter. They can also provide hints toward our understanding of the currently unknown mass of neutrinos.

    The Impact

    The Cryogenic Underground Observatory for Rare Events (CUORE) can search for these rare NLDBD processes using different nuclei. Scientists rely on complementarity among searches using different nuclei to better understand the underlying physics of the process. Complementarity in physics involves theories that contrast with each other but that both explain part of the same phenomena. CUORE recently searched for NLDBD using a nucleus that had not previously been studied with CUORE, Tellurium-128. The researchers have so far found no evidence for NLDBD. However, they show that the half-life of Tellurium-128 for decay by NLDBD is longer than 3.6 septillion years (ultra-rare decays have very long half-lives). This lower limit is about 30 times higher than those from prior experiments using the same technique. This new search pushes forward scientists’ knowledge of these rare nuclear decays and opens another path to our understanding of the origin of matter in our Universe.
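    To get a feel for what a half-life limit of 3.6 septillion years means, here is a small illustrative calculation using the ordinary radioactive-decay law; the sample size below is a round, hypothetical number, not a description of the CUORE detector:

        import math

        half_life_years = 3.6e24   # lower limit on the Tellurium-128 NLDBD half-life quoted above
        atoms = 6.0e23             # roughly one mole of Tellurium-128 nuclei (hypothetical sample)

        # Expected decays per year for N nuclei with half-life T: rate = N * ln(2) / T
        decays_per_year = atoms * math.log(2) / half_life_years
        print(decays_per_year)     # ~0.1, i.e., roughly one decay per decade in a whole mole of nuclei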

    Summary

    CUORE is one of the world-leading experiments searching for extremely rare nuclear processes. Because these processes are so rare, CUORE needs a very low-radioactivity environment, achieved by using extremely clean materials and by the Gran Sasso Mountain, which shields the experiment from cosmic rays. CUORE consists of almost 1,000 crystals that are kept at a temperature close to absolute zero by a dedicated refrigeration structure. The temperature of the crystals is measured 1,000 times per second, saved to disk, and analyzed to spot the tiny temperature variations caused by the rare decays. Since the beginning of its operation in 2017, CUORE has collected a huge amount of data, and it will continue for at least two more years. Researchers expect better results in the search for NLDBD processes in the nucleus Tellurium-128 in the near future. After CUORE, next-generation experiments have the potential to unravel several nuclear and particle physics mysteries through the exploration of these elusive processes.

     

    Funding

    This work was supported by the Department of Energy Office of Science, Office of Nuclear Physics; the National Science Foundation; the Alfred P. Sloan Foundation; the University of Wisconsin Foundation; Yale University; and the Istituto Nazionale di Fisica Nucleare. 


    Journal Link: Physical Review Letters, Nov-2022

    [ad_2]

    Department of Energy, Office of Science

    Source link

  • How Argonne makes the power grid more reliable and resilient

    How Argonne makes the power grid more reliable and resilient

    [ad_1]

    Newswise — Through innovative methods of deeply understanding the complexities of the grid, the lab helps secure the nation’s energy future.

    The U.S. power grid is almost incomprehensibly large. Comprising nearly 12,000 power plants, 200,000 miles of high-voltage transmission lines, 60,000 substations and 3 million miles of power lines, it may well be the most massive and complex machine ever assembled. Households, businesses, governments and essential infrastructure — including water, telecommunications, food supply, health care and wastewater treatment — rely on the grid around the clock. The power it generates fuels the U.S. economy.

    All this complexity makes it critical to understand the vulnerabilities of the nation’s electric transmission and distribution systems and to protect the grid from an evolving set of human-caused and natural hazards. Those can include cyberattacks from foreign governments and terrorists as well as extreme weather events driven by climate change. Record-setting heat waves, unprecedented storms and flooding, historic droughts and wildfires all pose hazards to the grid.

    “What sets Argonne apart is that we are very good at looking at all these problems from a multidisciplinary perspective. There are no research silos here.” — Mark Petri, head of Argonne’s Electric Power Grid Program

    The U.S. Department of Energy’s (DOE) Argonne National Laboratory plays a vital role in maintaining and developing a stable and secure grid. At the nation’s first national lab, located in southwest suburban Chicago, scientists and engineers bring to bear collective expertise in economics, threat assessment and mitigation, system vulnerability analysis, critical infrastructure interdependency modeling, proactive cybersecurity defense and emergency readiness and response support. The lab also leverages cutting-edge high performance computing hardware, mathematical software technologies, and artificial intelligence and machine learning resources.

    “What sets Argonne apart is that we are very good at looking at all these problems from a multidisciplinary perspective,” says Mark Petri, head of the lab’s Electric Power Grid Program, who leads security and resilience activities. Petri also serves as technical team lead for the Markets, Policies & Regulations pillar of DOE’s Grid Modernization Initiative. “We bring together engineers, infrastructure analysts, computer scientists and modelers, artificial intelligence experts, economists, battery researchers and others in a focused effort to tackle these critical national challenges. There are no research silos here.”

    Argonne also collaborates with local, state, regional, tribal and territorial stakeholders, as well as academia, utilities and other national laboratories. This helps Argonne develop and deploy innovative solutions and advanced technologies that enhance the grid’s ability to withstand and recover from threats. Argonne is a key contributor to the Grid Modernization Laboratory Consortium, a strategic partnership between DOE and the national labs to bring together leading experts, technologies and resources to collaborate on the goal of modernizing the nation’s grid.

    Specialized models and training help design and defend an evolving grid

    For more than two decades, Argonne has pioneered the analysis of grid infrastructure. That includes identifying natural and man-made external threats to the system — everything from hail to hackers — and homing in precisely on system vulnerabilities. “If I have flooding, high winds, ice — what are the things that are likely to break on the system?” Petri asks. “Are transmission towers going to go out? Are substations going to be under water? Am I going to lose power generation? Knowing the weak links in the chain is key.”

    Researchers are also interested in deeply examining the complex interdependencies that exist between electricity infrastructure and other energy systems such as natural gas. Understanding the interconnections, the ways the systems operate in concert and how disruption in one sector has the potential to cause cascading failures across the entire complex, allows researchers to anticipate potential disruptions, manage impacts and develop adaptation measures for the future.

    Argonne scientists have developed specialized computer modeling tools to enable decision makers to make informed, data-backed choices when proactively hardening the grid or responding to threats in real time. For instance, they developed one of the highest resolution climate models covering North America, which projects the impacts of climate change 50 years into the future. While most climate modeling is done at the scale of 100-kilometer grid blocks on a map, Argonne’s model behind its Climate Risk and Resilience Portal, driven by some of the nation’s most powerful supercomputers, zooms in to the level of 12 kilometers. (Argonne’s next climate models will have a resolution closer to four kilometers, which approaches the size of a large urban neighborhood or small rural town.)
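    To see why finer resolution is computationally demanding, here is an illustrative cell count; the continent-sized domain dimensions are round numbers we assume for illustration, not the model’s actual grid:

        # Horizontal cell counts for an assumed 7,000 km x 5,000 km domain
        domain_km = (7000, 5000)

        for resolution_km in (100, 12, 4):
            cells = (domain_km[0] // resolution_km) * (domain_km[1] // resolution_km)
            print(resolution_km, cells)
        # Going from 100 km to 12 km multiplies the cell count by roughly (100/12)^2, about 70x;
        # going to 4 km multiplies it by (100/4)^2 = 625x, before counting vertical levels and time steps.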

    “Developing the hazard and climate risk models that leverage the latest science and the leadership-class computational resources at Argonne and DOE has enabled us to work with a multitude of private and public sector utilities,” said Rao Kotamarthi, science director of the Center for Climate Resilience and Decision Science and a senior scientist in Argonne’s Environmental Science division.

    Kotamarthi explained that the breakthrough offers more actionable hyperlocal information for leaders thinking through climate resiliency planning. Companies including AT&T and ComEd, as well as government agencies like the New York Power Authority, already see the model’s value. Looking to improve the resilience of their grid-level infrastructure and keep critical services up and running, they can see which pieces of valuable equipment sit in likely future climate-related danger zones. This helps them to identify locations that may need to be stabilized or relocated altogether.

    Argonne has also developed several other leading modeling tools, including the Hurricane Electric Assessment Damage Outage, which forecasts likely power outages after a storm. The EPfast tool examines power outage impacts on large electric grid systems. The Restore tool provides insights into repair times for outages at critical infrastructure facilities. And the Electric Grid Resilience Improvement Program models power system restoration after a major blackout.

    Moreover, to help system operators respond more quickly to grid failures, limit impacts on customers and speed recovery, Argonne supports system operator training so they can effectively respond to major grid disruptions. Stakeholders responsible for resilience are put through readiness exercises that replicate real-world threat, response and recovery scenarios — hurricanes, blizzards, earthquakes, cyberattacks — and hone their in-the-moment decision-making skills.

    New tools predict outcomes from emergent grid resources

    Adding yet another layer of complexity to the grid, distributed energy resources (DERs) like rooftop solar panels and generators have emerged as significant power generation sources. DERs contribute to a power system’s overall capacity, but operators must assess their impact and forecast their potential, especially during extreme weather events. That’s why Argonne created TDcoSim, a cutting-edge transmission and distribution co-simulation software tool that enables high-fidelity modeling of DERs. It’s the first model capable of simulating both transmission (the high-voltage network used to transfer power long distances) and distribution (the localized low-voltage network used by the utilities to deliver power to consumers).
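    Conceptually, a transmission-distribution co-simulation alternates between the two models within each time step, exchanging boundary conditions until they agree. The sketch below is a generic illustration of that loop under simplified assumptions (a toy voltage-dependent load and fixed-point iteration); it is not TDcoSim’s actual interface, and every name in it is a placeholder:

        # Generic T&D co-simulation loop (illustrative only; not TDcoSim's API)
        def solve_transmission(loads_mw):
            # Placeholder transmission solve: boundary voltages (per unit) given feeder loads
            return {bus: 1.0 - 0.0001 * mw for bus, mw in loads_mw.items()}

        def solve_distribution(bus, voltage_pu):
            # Placeholder distribution solve: feeder net load (MW) at the given boundary voltage
            return 50.0 * voltage_pu ** 2

        boundary_buses = ["feeder_A", "feeder_B"]
        loads = {bus: 50.0 for bus in boundary_buses}        # initial load guess

        for iteration in range(10):                          # iterate within one simulated time step
            voltages = solve_transmission(loads)             # transmission model sets boundary voltages
            new_loads = {b: solve_distribution(b, v) for b, v in voltages.items()}
            if all(abs(new_loads[b] - loads[b]) < 1e-6 for b in boundary_buses):
                break                                        # the two models agree at the boundary
            loads = new_loads

        print(voltages, loads)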

    “This is a totally new paradigm in grid modeling. Nobody has done this before,” says Vladimir Koritarov, director of the lab’s Center for Energy, Environmental and Economic Systems Analysis. “At Argonne, we specialize in developing these kinds of new, advanced grid models, algorithms, optimization methods and approaches that are more efficient, faster and more accurate than previously available ones.”

    Among those models is the Argonne Low-Carbon Electricity Analysis Framework, known as A-LEAF, an integrated national-scale simulation framework for power system operations and planning. It allows operators to evaluate different pathways to decarbonization of electric grids. A related Argonne-developed interactive tool called the Geospatial Energy Mapper helps users identify sites across the country best suited for renewable energy infrastructure projects.

    As the U.S. aims to meet a goal of net-zero carbon emissions by 2050, the grid’s energy mix will likely include far more renewables than today. But sources such as solar and wind are variable in their production, and their output may be reduced in extreme weather. Adapting to this variability interests Argonne energy systems engineer Neal Mann. At a time when long-term planning decisions are being made about which energy infrastructure technologies are invested in and built, and which will be retired, Mann focuses on the role nuclear power might play in the future grid. “If we rely too much on weather-driven generation, do we end up compromising reliability under stressed climate-related conditions?” he asks. “In those cases, having nuclear and other so-called dispatchable technologies available could be the difference between widespread outages or not.”

    Grid-level energy storage is focus of materials and manufacturing R&D

    To compensate for the uncertainty of variable renewables and to capture excess generation, researchers across Argonne are focused on low-cost, high-efficiency energy storage. Those efforts include research into various novel battery technologies such as advanced sodium-ion cathodes and new flow cell chemistries; chemical and thermal storage; and pumped storage hydropower, a common type of hydroelectric energy storage that can provide power even during extended lulls in solar and wind generation.

    One project involves the development of a model called “EverGrid,” based on the R&D 100 Award-winning EverBatt model. The free-to-use model will help determine the end-of-life impacts, including recycling, of stationary energy storage technologies such as flow batteries and advanced lead-acid batteries. The model will help researchers make better decisions during the technology development process and help find hot spots in processing that can lead to optimization and scale-up.

    “In order to reduce greenhouse gas emissions and hit U.S. climate goals, we’re going to be increasingly relying on renewable energy, which is not a constant source of energy,” says Chris Heckle, director of the Materials Manufacturing Innovation Center at Argonne. “We need to develop grid-level energy storage solutions, which will need to be large in scale. That will involve manufacturing challenges, transportation challenges and systems challenges, all of which Argonne is well positioned to meet.”

    For Petri, the growing complexity of the grid and the evolving threats against it make Argonne’s interdisciplinary approach more necessary than ever to help secure the nation’s energy future.

    “Our ability to understand how the grid’s complex systems behave, how they might be disrupted, and how operators can improve response is vitally important,” he says. “It’s important to people’s lives, it’s important to our economy, it’s important to our national security. And here at Argonne, we are right in the middle of improving these systems from a reliability and resilience perspective.”

    Argonne National Laboratory seeks solutions to pressing national problems in science and technology. The nation’s first national laboratory, Argonne conducts leading-edge basic and applied scientific research in virtually every scientific discipline. Argonne researchers work closely with researchers from hundreds of companies, universities, and federal, state and municipal agencies to help them solve their specific problems, advance America’s scientific leadership and prepare the nation for a better future. With employees from more than 60 nations, Argonne is managed by UChicago Argonne, LLC for the U.S. Department of Energy’s Office of Science.

    The U.S. Department of Energy’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time. For more information, visit https://energy.gov/science.

    [ad_2]

    Argonne National Laboratory

    Source link