ReportWire

Tag: Nuclear Physics

  • Better simulations of neutron scattering

    Newswise — A new simulation approach named eTLE aims to improve the precision of a primary tool for estimating neutron behaviours in 3D space. This study examines the approach in detail – validating its reliability in predicting the scattering of neutrons in crystalline media.

    Tripoli-4® is a tool used by researchers to simulate the behaviours of interacting neutrons in 3D space. Recently, researchers developed a new ‘next-event estimator’ (NEE) for Tripoli-4®. Named eTLE, this approach aims to increase Tripoli-4®’s precision using Monte Carlo simulations: a class of algorithms that estimate the characteristics of a whole population – here, of neutrons – by repeatedly sampling random subsets of it. Through new research published in EPJ Plus, a team led by Henri Hutinet at the French Alternative Energies and Atomic Energy Commission implements and validates eTLE for the first time.
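    To illustrate the Monte Carlo principle described above – a toy sketch only, not the eTLE estimator, and with an illustrative cross-section value – one can estimate the fraction of neutrons that cross a slab without colliding and compare it with the analytic result:

```python
import math
import random

def transmission_mc(sigma_t, thickness, n_samples=100_000, seed=1):
    """Monte Carlo estimate of the fraction of neutrons crossing a slab.

    Free-path lengths are sampled from an exponential distribution with
    mean 1/sigma_t (the macroscopic total cross-section); a neutron is
    counted as transmitted if its first flight exceeds the slab thickness.
    """
    rng = random.Random(seed)
    transmitted = sum(
        1 for _ in range(n_samples)
        if rng.expovariate(sigma_t) > thickness
    )
    return transmitted / n_samples

# The analytic answer for uncollided transmission is exp(-sigma_t * x),
# so the random-sampling estimate should converge on it as n_samples grows.
estimate = transmission_mc(sigma_t=0.5, thickness=2.0)
exact = math.exp(-0.5 * 2.0)
print(estimate, exact)
```

    Because free paths are exponentially distributed, the estimate converges on the predictable attenuation law as the sample count grows – the same repeated-random-sampling idea that eTLE applies to far richer transport problems.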

    Since the production of neutrons is a key element of nuclear fission reactions, this enhanced precision could ultimately help to improve the safety of nuclear reactors. The success of eTLE hinges on the principle that the transport and attenuation of neutrons through a medium is mathematically predictable. So far, the use of NEEs to predict this transport has been hindered by their treatment of neutrons as simple gases of interacting particles. In crystalline media, the angles at which neutrons scatter take on discrete values – forbidding certain angles that may be necessary for understanding the neutrons’ overall behaviour.

    In their study, Hutinet’s team examined the outcomes of eTLE’s Monte Carlo-based approach to estimating neutron behaviours. To validate their findings, they used a classical, unbiased NEE as a benchmark for studying neutrons scattering inside crystalline media – including graphite and beryllium. Their results revealed a strong agreement between the classical estimators and eTLE: a huge improvement compared with previous NEE approaches for Tripoli-4®. By removing the need for discrete scattering angles, the team’s work could now pave the way for nuclear reactor operators to predict neutron behaviours far more accurately in the future.

    Reference: Hutinet, H., Le Loirec, C., Mancusi, D. et al. Neutron elastic scattering kernel for Monte Carlo next-event estimators in Tripoli-4®. Eur. Phys. J. Plus 138, 189 (2023). https://doi.org/10.1140/epjp/s13360-023-03787-8

    Springer

  • Sutharshan named ORNL deputy for operations

    Newswise — Balendra Sutharshan has been named chief operating officer for Oak Ridge National Laboratory. He will begin serving as ORNL’s deputy for operations and as executive vice president, operations, for UT-Battelle effective April 1. He will succeed Alan Icenhour, who is retiring this spring after serving in the role since 2021. UT-Battelle operates ORNL for the Department of Energy.

    Sutharshan joined ORNL in February 2021 as the associate laboratory director for the Isotope Science and Engineering Directorate. Under his leadership, ISED has achieved remarkable growth in isotope research and development, as well as production to meet the increased demand for isotopes used in medicine, research and security.

    “Balendra brings comprehensive experience to the position, including an extensive knowledge of ORNL’s nuclear capabilities, strong relationships across the national lab and Battelle systems, and a history of driving operational performance improvements and organizational strategy,” interim ORNL Director Jeff Smith said. “I am excited for Balendra to serve in this important role for ORNL.”

    During Sutharshan’s tenure as ALD, ISED has deployed new enrichment technology capabilities and stewarded new projects that will help to secure the domestic isotope supply chain, including the Stable Isotope and Production Research Center, the Stable Isotope Production Facility and the Radioisotope Processing Facility. He established the Isotope Processing and Manufacturing Division in 2022 to further improve production performance and introduced predictive maintenance into the lab’s hot cell facilities to reduce downtime.

    He has also been active in developing new partnerships to grow and train the pipeline of future talent needed to conduct isotope science and production, and he has placed a significant emphasis on improving ISED’s culture.

    As the chief operating officer of UT-Battelle, Sutharshan will lead the formulation and implementation of cross-cutting operation plans and integrated facility strategies to enable ORNL’s missions. He also will play a lead role in the lab’s commitment to community engagement.

    “It’s an honor to be part of an organization that empowers leaders and teams to pursue breakthrough science and technology and has roots back to the Manhattan Project,” Sutharshan said. “I look forward to strengthening ORNL’s operations and facilities strategies and continuing to support the lab’s engagement with communities where we work and live.”

    Prior to joining ORNL, Sutharshan served as COO for the Operational Systems Directorate at Pacific Northwest National Laboratory. In this position, he provided leadership of the directorate responsible for all of PNNL’s infrastructure and facilities as well as its environmental, health, safety, security, project management and nuclear operations programs. Before joining PNNL, Sutharshan served as COO for the Energy and Global Security Directorate at Argonne National Laboratory and served on the DOE review team that analyzed the 2018 High Flux Isotope Reactor fuel event. In addition, he spent nearly 20 years in a series of leadership roles with Westinghouse Electric Company.

    Sutharshan holds a doctorate in nuclear engineering from the Massachusetts Institute of Technology; a master’s in chemical and nuclear engineering and a bachelor’s in chemical engineering from the University of Toronto; and an MBA from Rensselaer Polytechnic Institute.

    UT-Battelle manages ORNL for the Department of Energy’s Office of Science, the single largest supporter of basic research in the physical sciences in the United States. The Office of Science is working to address some of the most pressing challenges of our time. For more information, please visit energy.gov/science.

    Oak Ridge National Laboratory

  • Labs Director to make historic visit to Navajo Nation
Building research, recruitment partnership with Navajo Technical University

    Newswise — ALBUQUERQUE, N.M. — On March 17, Sandia National Laboratories Director Dr. James Peery will make an historic visit to Navajo Technical University in Crownpoint, New Mexico, marking the first time a sitting national lab director has visited a tribal college or university. The event is designed to build on the growing partnership Sandia has started with NTU.

    What: Labs Director visits Navajo Technical University
    When: 
    Friday, March 17, 8:45 a.m. – 1:30 p.m.; visual highlights 10:30 a.m. – noon
    Where: 
    Navajo Technical University, Lowerpoint Rd. State Hwy 371, Crownpoint, NM, 87313
    RSVP: 
    Contact  to confirm attendance.

    The partnership is part of the National Nuclear Security Administration’s Minority Serving Institution Partnership Plan, which helps national labs partner with tribal colleges and universities that prepare students for technical careers in NNSA’s laboratories and production plants.

    Navajo Technical University is a tribally controlled postsecondary career and technical institution with a main campus in Crownpoint and two smaller campuses in Chinle and Teec Nos Pos, Arizona. NTU offers programs focusing on advanced manufacturing of metal parts, certification of 3D-printed metal parts, inspection methodologies and techniques (including equipment operation), and optical metrology (including testing and characterization of materials) – all skills that can benefit Sandia’s mission.

    In August 2018, with the help of Sandia, NTU obtained accreditation from the Accreditation Board for Engineering and Technology (ABET) for its industrial engineering and electrical engineering programs. ABET accreditation allows Sandia to hire NTU graduates. Prior to this, area students who wanted to pursue a career at the national labs first had to attend another ABET-accredited university, such as the University of New Mexico.

    Over the next five years, Sandia will be working with NTU on an initiative to dramatically increase the number of Native American researchers in advanced manufacturing, power and energy engineering and other technology disciplines. The partnership will work to build the first master’s and doctoral programs in electrical engineering at NTU. Sandia will provide internship opportunities to Native American engineering students at NTU, subject matter experts in power and energy electrical engineering disciplines, and technical assistance in power system dynamics and optimization.

    Sandia is also working with NTU to expand programs in other disciplines, including chemical and mechanical engineering.

    During the visit, Peery will meet with NTU leadership and get a firsthand look at its programs and how the partnership has helped NTU grow. He will also tour the advanced manufacturing and energy systems labs.

    Media are invited to attend the visit, see NTU facilities and speak with NTU and Sandia leadership.

      • 9:00 a.m.: Welcome by NTU President Dr. Elmer Guy
      • 9:15 a.m.: Welcome by NTU provost Dr. Colleen Bowman
      • 9:45 a.m.: Sandia National Laboratories Director address – Dr. James Peery
      • 10:50 a.m.: Advanced Manufacturing tour
      • 11:30 a.m.: Energy Systems Lab tour
      • 12:00 p.m.: Tour ends/Lunch
      • 1:30 p.m.: Visit ends

    The tour of the facilities will be the most visual portion of the visit.


    Sandia National Laboratories is a multimission laboratory operated by National Technology and Engineering Solutions of Sandia LLC, a wholly owned subsidiary of Honeywell International Inc., for the U.S. Department of Energy’s National Nuclear Security Administration. Sandia Labs has major research and development responsibilities in nuclear deterrence, global security, defense, energy technologies and economic competitiveness, with main facilities in Albuquerque, New Mexico, and Livermore, California.

    Sandia National Laboratories

  • From Atoms to Earthquakes to Mars: High Performance Computing a Swiss Army Knife for Modeling and Simulation

    BYLINE: Idaho National Laboratory (INL)

    Newswise — Researchers solving today’s most important and complex energy challenges can’t always conduct real-world experiments.    

    This is especially true for nuclear energy research. Considerations such as cost, safety and limited resources can often make laboratory tests impractical. In some cases, the facility or capability necessary to conduct a proper experiment doesn’t exist.  

    At Idaho National Laboratory, computational scientists use INL’s supercomputers to perform “virtual experiments” to accomplish research that couldn’t be done by conventional means. While supercomputing can’t replace traditional experiments, it is an essential component of modern scientific discovery and advancement.

    “Science is like a three-legged stool,” said Eric Whiting, director of Advanced Scientific Computing at INL. “One leg is theory, one is experiment, and the third is modeling and simulation. You cannot have modern scientific achievements without modeling and simulation.” 

    HIGH-DEMAND RESOURCES 

    INL’s High Performance Computing program has been in high demand for years. From INL’s first supercomputer in 1993 to the addition of the Sawtooth supercomputer in 2020, the demand for high-performance computing has only increased.   

    Sawtooth and INL’s other supercomputers are flexible enough to tackle a wide range of modeling and simulation challenges and are especially suitable for dynamic and adaptive applications, like those used in nuclear energy research. INL’s supercomputers constitute one of the Nuclear Science User Facilities’ 50 partner facilities – and the only one to offer supercomputing.

    Whether it’s exploring the effects of radiation on nuclear fuel or designing nuclear-powered rockets for a trip to Mars, INL’s High Performance Computing center is the Swiss Army knife of advanced computing.  

    THE POWER OF 100,000 LAPTOPS 

    On a recent tour of the Collaborative Computing Center, Whiting led the way through the rows of Sawtooth processors. Each row looked like dozens of tall black refrigerators standing side by side. The room hummed with the pumping of thousands of gallons of water needed to keep Sawtooth cool.  

    Sawtooth contains the computing power of about 100,000 processors, all dedicated to very large, high-fidelity problems – orders of magnitude more processing power and memory than a traditional laptop computer.

    All that computing power allows researchers from around the world to run dozens of complex simulations at the same time. “If your program is designed right, it runs thousands of times faster than the best-case scenario on your desktop,” Whiting said.  

    Some of these simulations — modeling the performance of fuel inside an advanced reactor core, for instance — require the computer to solve millions or billions of unknowns repeatedly.  

    “If you have a multidimensional problem in space, and then you add time to it, it greatly adds to the size of the problem,” said Cody Permann, a computer scientist who oversees one of the laboratory’s modeling and simulation capabilities. Modeling and simulation started decades ago with simplified problems in one or two dimensions. Modern supercomputers like INL’s Sawtooth have significantly increased the accuracy of these simulations, bringing them closer to reality.

    To solve these complicated problems, researchers break down each simulation into thousands upon thousands of smaller units, each impacting the units surrounding it. The more units, the more detailed the simulation, and the more powerful the computer needed to run it.     
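    The decomposition described above can be sketched with a deliberately tiny example – a hypothetical 1D heat-diffusion grid, not an INL code. Each cell is updated from its neighbours every time step; fidelity grows with the number of cells, and so does the computing power required:

```python
def diffuse_1d(temps, alpha=0.1, steps=100):
    """Explicit finite-difference update of a 1D temperature profile.

    Each interior cell is nudged toward its two neighbours every time step
    (boundary cells are held fixed). Refining the grid raises fidelity but
    multiplies the work - which is why large 3D, time-dependent versions
    of this kind of calculation need a supercomputer.
    """
    t = list(temps)
    for _ in range(steps):
        t = [t[0]] + [
            t[i] + alpha * (t[i - 1] - 2 * t[i] + t[i + 1])
            for i in range(1, len(t) - 1)
        ] + [t[-1]]
    return t

# A hot spot in the middle of a cold rod gradually spreads outward.
profile = diffuse_1d([0.0] * 10 + [100.0] + [0.0] * 10)
```

    The same update pattern – every unit influenced by the units around it, repeated over millions of cells and time steps – is what drives the problem sizes described above.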

    THE ATOMIC EFFECTS OF RADIATION ON MATERIALS 

    For Chao Jiang, a distinguished staff scientist at INL, a highly detailed simulation means peering down to the level of individual atoms.  

    Jiang’s simulations, funded by the Department of Energy Nuclear Energy Advanced Modeling and Simulation program and the Basic Energy Sciences program, help nuclear scientists understand the behavior of materials when their atoms are constantly knocked around by neutrons in a reactor core. These displaced atoms will create defects, changing the microstructure of the material, and therefore its physical and mechanical characteristics. These changes in microstructure can damage the materials and reduce the lifetime of the reactor. Understanding these changes helps scientists design better and safer reactors. 

    “The work we are doing is extremely challenging,” Jiang said. “They are computer-hungry projects. We are big users of the high-performance computers.” 

    Understanding radiation damage in materials is difficult: it involves physical processes that occur across vastly different time and length scales. “When the high energy neutrons hit the material,” Jiang said, “it will locally melt the material.” 

    Heating and cooling inside an operating reactor takes place in picoseconds – trillionths of a second. During this heating and cooling, the material will re-solidify but will leave defects behind, Jiang said. “These residual defects will migrate and accumulate to form large-scale defects in the long run.” 

    While large defects, such as dislocation loops and voids, can be directly seen using advanced microscopy techniques, many small-scale defects remain invisible under a microscope. These small defects can significantly impact the materials, making computer simulations critical to filling this knowledge gap. INL computational scientists combine their simulations with the advanced characterization techniques performed by material scientists at INL’s Materials and Fuels Complex to advance the understanding of material behavior in a nuclear reactor. 

    SIMULATING THE IMPACTS OF EARTHQUAKES ON REACTOR MATERIALS  

    Another INL scientist, Chandu Bolisetti, also simulates the damage to materials, but on a much different scale.  

    Bolisetti, who leads the lab’s Facility Risk Group, uses high-performance computing to simulate the effects of seismic waves — the shaking that results from an earthquake — on energy infrastructure such as nuclear power plants or dams.  

    In early 2021, funded by the DOE Office of Technology Transitions, Bolisetti and his colleagues performed a particularly complex type of simulation — they simulated the impacts of seismic waves on a nuclear power plant building that houses a molten salt reactor.  

    A molten salt reactor is a particularly difficult physics problem because the coolant/fuel circulates through the reactor in liquid form. The team also placed their hypothetical reactor on seismic isolators, giant shock absorbers that help reduce the impacts of earthquakes on buildings. 

    Bolisetti’s team ran the simulation using MOOSE, which stands for Multiphysics Object Oriented Simulation Environment, a software framework that allows researchers to develop modeling and simulation tools for solving multiphysics problems. For these earthquake simulation problems, Bolisetti’s team uses MASTODON, which they developed using MOOSE specifically for seismic analysis.    

    Another project funded by INL’s Laboratory Directed Research and Development program looks at how a molten salt reactor behaves in an earthquake in much more detail. It extends the analysis to include neutronics and thermal hydraulics — in other words, how the shaking impacts nuclear fission and the distribution of heat in the reactor core. 

    “All three of these physics — earthquake response, thermal hydraulics and neutronics — are pretty complicated,” Bolisetti said. “No one has ever combined these into one simulation. How the power in the reactor fluctuates during an earthquake is important for safety protocols. It affects what the operators would do during an earthquake and helps us understand the core physics and design safer reactors.” 

    “Real-world experiments to simulate this are close to impossible, especially when you add neutronics,” Bolisetti said. “That’s where these kinds of multi-physics simulations really shine.”   

    SIMULATING NUCLEAR ROCKETS FOR A TRIP TO MARS 

    Mark DeHart, a senior reactor physicist at INL, uses MOOSE to simulate an entirely different kind of complex machine: a nuclear thermal rocket that could someday take humans to Mars.  

    The rocket would use hydrogen as both a propellant and a coolant. When the rocket is in use, hydrogen would run from storage tanks through the reactor core. The reactor would rapidly heat the hydrogen before it exits the rocket nozzles.  

    “The hydrogen that comes out is pure thrust,” DeHart said.  

    Compared with chemical rockets, nuclear thermal rockets are faster and twice as efficient. The rockets could cut travel time to Mars in half. 

    One big challenge is rapidly heating the reactor core from about 26 degrees Celsius (80 degrees Fahrenheit) to nearly 2,760 Celsius (5,000 Fahrenheit) without damaging the reactor or the fuel.  

    DeHart and his colleagues are using Griffin, a MOOSE-based advanced reactor physics tool, for multiphysics modeling of two aspects of the NASA mission.  

    The first project tests the fuel’s performance as it experiences rapid heating in the reactor core. The real-world fuel samples are placed in INL’s Transient Reactor Test Facility (TREAT), where they are rapidly brought up to temperature.  

    The data from those experiments are used to create and validate models of the fuel’s neutronics and heat transfer characteristics using Griffin. 

    “If we can show that Griffin can model this real-world sample correctly, we can have confidence that Griffin can calculate correctly something that doesn’t exist yet,” DeHart said.   

    The second project is designing the rocket engines themselves. Automated controllers rotate drums in the reactor core to bring the temperature up and down. “We’ve developed a simulation that will show how you can use the control drums to bring the reactor from cold to nearly 5,000 F within 30 seconds,” DeHart said.  

    Without high-performance computing and MOOSE, developing a nuclear thermal rocket would take dozens of small experiments costing hundreds of millions of dollars.  

    AN OPPORTUNITY FOR COLLABORATION 

    In the end, high-performance computing makes INL a gathering place for researchers with a wide range of expertise, from rocket design to artificial intelligence. About half the system’s users are from national labs, with a quarter coming from universities and a quarter from industry. The resulting collaborations are especially important for nuclear energy research.  

    “INL cannot attract all the experts in our field, but by sharing a computer, INL’s team can work with 1,200 experts across the United States,” Whiting said. “INL’s supercomputers are helping build the expertise and develop the tools so they can deploy next-generation reactors.” 

    And the demand for these modeling and simulation resources is only growing. Sawtooth more than quadrupled INL’s high-performance computing capacity, and already the queue of waiting projects can reach into the thousands.  

    “We need years of research with the High Performance Computing facility,” said Jiang. “We need to understand the high energy state of nuclear materials as accurately as possible, so we need to explore a huge space. Without high-performance computing, basic energy research would suffer. It’s critical.”  

    If you are interested in accessing INL’s supercomputers for your work, visit inl.gov/ncrc or nsuf.inl.gov. 

    About Idaho National Laboratory
    Battelle Energy Alliance manages INL for the U.S. Department of Energy’s Office of Nuclear Energy. INL is the nation’s center for nuclear energy research and development, and also performs research in each of DOE’s strategic goal areas: energy, national security, science and the environment. For more information, visit www.inl.gov. Follow us on social media: Twitter, Facebook, Instagram and LinkedIn. 

     

    Idaho National Laboratory (INL)

  • Hitting Nuclei with Light May Create Fluid Primordial Matter

    The Science

    A new analysis supports the idea that particles of light (photons) colliding with heavy ions create a fluid of “strongly interacting” particles. The calculations are based on the hydrodynamic particle flow seen in collisions of various types of ions at both the Large Hadron Collider (LHC) and the Relativistic Heavy Ion Collider (RHIC). With only modest changes, these calculations also describe flow patterns seen in near-miss collisions at the LHC. In these collisions, photons that form a cloud around the speeding ions collide with the ions in the opposite beam.

    The Impact

    The results indicate that photon-heavy ion collisions can create a strongly interacting fluid that responds to the initial collision geometry, exhibiting hydrodynamic behavior. This further means that these collisions can form a quark-gluon plasma, a system of quarks and gluons liberated from the protons and neutrons that make up the ions. These findings will help guide future experiments at the Electron-Ion Collider (EIC), a facility planned to be built at Brookhaven National Laboratory over the next decade.

    Summary

    It may seem surprising that photon-heavy ion collisions can produce a hot and dense fluid. But it’s possible because a photon can undergo quantum fluctuations to become another particle with the same quantum numbers. A likely example is a rho meson, made of a quark and antiquark held together by gluons. When a rho meson collides with a nucleus, it forms a collision system very similar to a proton-nucleus collision, which also exhibits flow-like signals.

    The current analysis by theorists at Brookhaven National Laboratory and Wayne State University sought to explain data from the ATLAS experiment at the LHC. The theorists found that accounting for the energy difference between the rho meson and the much higher energy of the incoming nucleus was the most important ingredient for their calculations’ ability to reproduce the experimental results. In the most energetic heavy ion collisions, the pattern of particles emerging transverse to the colliding beams generally persists no matter how far you look from the collision point along the beamline. But in lower-energy photon-lead collisions, the model showed that the geometry of the particle distributions changes rapidly with increasing longitudinal distance. This decorrelation had a large effect on the observed flow pattern, showing that 3D hydrodynamic modeling is essential for simulating these low energy photon-lead collisions.

     

    Funding

    This research was funded by Department of Energy Office of Science, Office of Nuclear Physics and the National Science Foundation. The research used computational resources of the Open Science Grid, supported by the National Science Foundation.


    Journal Link: Physical Review Letters

    Department of Energy, Office of Science

  • Shape-Shifting Experiment Challenges Interpretation of How Cadmium Nuclei Move

    The Science

    Atomic nuclei take a range of shapes, from spherical (like a basketball) to deformed (like an American football). Spherical nuclei are often described by the motion of a small fraction of the protons and neutrons, while deformed nuclei tend to rotate as a collective whole. A third kind of motion has been proposed since the 1950s. In this motion, known as nuclear vibration, atomic nuclei fluctuate about an average shape. Scientists recently investigated cadmium-106 using a technique called Coulomb excitation to probe its nuclear shape. They found clear experimental evidence that the vibrational description fails for this isotope’s nucleus. This finding is counter to the expected results.

    The Impact

    This research builds on a long quest to understand the transition between spherical and deformed nuclei. This transition often includes vibrational motion as an intermediate step. The new result suggests that nuclear physicists may need to revise the long-standing paradigm describing how this transition occurs. Scientists have not yet answered the question of what behavior takes place during this transition, but new evidence points to a description based on rotational motion of a nucleus together with a reorganization of its outermost protons and neutrons. The results make clear that scientists need more data to shed light on nuclei they have traditionally thought to be vibrational.

    Summary

    A multinational team of nuclear physicists used the Argonne Tandem Linac Accelerator System (ATLAS), a DOE Office of Science user facility at Argonne National Laboratory, to accelerate a beam of cadmium-106 nuclei to nine percent of the speed of light and direct it onto a 1-micron thick lead-208 target foil. During the collision, gamma rays from the cadmium-106 nuclei were emitted and detected by the Gamma-Ray Energy Tracking In-beam Nuclear Array (GRETINA), and the recoiling lead and cadmium nuclei were detected by the Compact Heavy Ion Counter 2 (CHICO2). The intensities of the gamma rays provided a measure of the probability of exciting cadmium-106 nuclei via the electromagnetic interaction, from which the electromagnetic properties of cadmium-106 were established.

    The researchers integrated these properties into a model-independent measure of the nuclear shape and compared the result to expectations from several leading nuclear theories. The results indicate that at low energies, cadmium-106 is not vibrational but instead more in line with the rotation of a slightly deformed triaxial rotor – a shape akin to a deflated American football.

     

    Funding

    This research was supported by the Department of Energy Office of Science, Office of Nuclear Physics, and used resources of Argonne Tandem Linac Accelerator System (ATLAS), a DOE Office of Science user facility at Argonne National Laboratory.


    Journal Link: Physics Letters B

    Department of Energy, Office of Science

  • A Trial Run for Smart Streaming Readouts

    The Science

    Nuclear physics experiments are data intensive. Particle accelerators probe collisions of subatomic particles such as protons, neutrons, and quarks to reveal details of the bits that make up matter. Instruments that measure the particles in these experiments generate torrents of raw data. To get a better handle on the data, nuclear physicists are turning to artificial intelligence and machine learning methods. Recent tests of two streaming readout systems that use such methods found that the systems were able to perform real-time processing of raw experimental data. The tests also demonstrated that each system performed well in comparison with traditional systems.

    The Impact

    Streaming readout systems use advanced computer software to collect and analyze data generated by a device in real time. They feature a less complex physical infrastructure than traditional systems. In addition, they can be far more powerful, efficient, fast and flexible. A streaming readout system can maximize the information extracted from an experiment, from initial decisions about which data to save to flagging unexpected physics captured in very complex detector systems. These systems also store more of the original data for analysis, allowing a more holistic picture of an event by recording the whole event instead of just the small part that fired a trigger.

    Summary

    Nuclear physics is demanding and getting more so every year. Advances in experiments require powerful software and computing resources to make sense of the extreme amounts of raw data that experiments produce. For instance, the powerful Continuous Electron Beam Accelerator Facility (CEBAF) is a Department of Energy (DOE) Office of Science user facility at Thomas Jefferson National Accelerator Facility (Jefferson Lab) that initiates cascades of subatomic particles thousands of times per second. These experiments generate enormous amounts of raw data every day. To harness the data, nuclear physicists have relied on hardware-based “triggered” systems to help them pre-sort data based on timed events. These systems only record data for a short period once a particular event is detected.

    Now, nuclear physicists are replacing triggered systems with software-based streaming readout systems. These systems harness artificial intelligence and machine learning tools to process — in real time — the vast amounts of data that nuclear physics experiments produce. In this way, all data are streamed to a data center to be analyzed, tagged, and filtered. The system automatically sifts through the enormous amount of data to filter out unnecessary background and record the interesting bits. With this work done by a streaming readout system, the actual data analysis can take a fraction of the time.
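    The tag-and-filter flow described above can be sketched in miniature. This is a toy software trigger with a hypothetical per-event score and threshold – not the actual Jefferson Lab systems, which use trained machine learning models:

```python
def stream_filter(events, score, threshold=0.8):
    """Score each event in flight and keep only the interesting ones.

    Stands in for the software-based selection in a streaming readout:
    rather than a hardware trigger recording fixed time windows, every
    event flows through, is tagged with a score, and is kept or dropped
    immediately - the background never reaches long-term storage.
    """
    for event in events:
        s = score(event)
        if s >= threshold:
            yield {**event, "score": s}

# Toy "detector" events scored by a hypothetical energy field.
raw = [{"energy": e} for e in (0.2, 0.95, 0.5, 0.99)]
kept = list(stream_filter(raw, score=lambda ev: ev["energy"]))
```

    Here only the two high-energy events survive, each tagged with the score that justified keeping it – the streaming analogue of "analyzed, tagged, and filtered" described above.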

     

    Funding

    This material is based on work supported by the Department of Energy Office of Science, Office of Nuclear Physics, by the Italian Ministry of Foreign Affairs as Projects of Great Relevance within Italy/U.S. Scientific and Technological Cooperation, and by the Thomas Jefferson National Accelerator Facility Laboratory Directed Research and Development program.


    Journal Link: European Physical Journal Plus

Department of Energy, Office of Science

  • Celebrating the Upcoming sPHENIX Detector

    Celebrating the Upcoming sPHENIX Detector


    Newswise — UPTON, NY— Asmeret Asefaw Berhe, Director of the U.S. Department of Energy’s (DOE) Office of Science, visited DOE’s Brookhaven National Laboratory on Jan. 27 to celebrate the fast-approaching debut of a state-of-the-art particle detector known as sPHENIX. The house-sized, 1000-ton detector is slated to begin collecting data at Brookhaven Lab’s Relativistic Heavy Ion Collider (RHIC), a DOE Office of Science User Facility for nuclear physics research, this spring.

    Like a massive, 3D digital camera, sPHENIX will capture snapshots of 15,000 particle collisions per second to provide scientists with data to better understand the properties of quark-gluon plasma (QGP)—an ultra-hot and ultra-dense soup of subatomic particles that are the building blocks of nearly all visible matter. RHIC collisions briefly recreate the conditions of the universe a fraction of a second after the Big Bang, some 14 billion years ago. Studying QGP can help physicists learn about the origins of matter as we know it and how nature’s strongest force binds quarks and gluons into protons and neutrons, the particles that make up ordinary atomic nuclei.  

    “Brookhaven National Laboratory continues to be a central hub of nuclear physics expertise, making it the world’s premier facility for studying the quark-gluon plasma,” said Asmeret Asefaw Berhe, DOE’s Director of the Office of Science. “The sPHENIX detector, and the talented collaboration that will operate it, will strive to give us that answer and the final piece of the quark-gluon puzzle.”

    Brookhaven Lab Director Doon Gibbs said, “sPHENIX marks a key milestone in the RHIC science program. It will allow us to explore many questions raised by incredible discoveries already made at RHIC, especially the surprising liquid nature of the quark-gluon plasma, and lay the foundation for future discoveries at the Electron-Ion Collider. I congratulate and thank all the scientists, engineers, technicians, and support staff at Brookhaven—and sPHENIX collaborators around the world—who have worked together to make this detector possible.”

    At the core of sPHENIX is a 20-ton cylindrical superconducting magnet that will bend the trajectories of charged particles produced in RHIC collisions. The magnet is surrounded and filled with subsystems that include complex silicon detectors, a Time Projection Chamber, and calorimeters that will capture details of particle jets, heavy quarks, and rare, high-momentum particles quickly and accurately. These advanced particle tracking systems will allow nuclear physicists to probe properties of the quark-gluon plasma with higher precision than ever before to understand how the interactions between quarks and gluons give rise to the unique, liquid-like behavior of QGP.

    “Our detector employs 100,000 silicon photomultipliers, calorimeter elements built using 3-D printing techniques and a 300 million channel radiation-hard silicon detector that has its sensor and electronics integrated into a monolithic device,” said sPHENIX project director Ed O’Brien.

    Many sPHENIX detector components build on experience gained throughout RHIC operations and draw on expertise throughout the nuclear and particle physics communities, including running experiments at Europe’s Large Hadron Collider.

    “These technologies were barely on the drawing board when RHIC began operations over 20 years ago,” O’Brien said. “Now they are a reality in sPHENIX.”

    “We’ve pulled together the field’s most sophisticated technologies and pushed them to new limits to design a detector unlike any that have come before,” said Brookhaven Lab physicist and sPHENIX co-spokesperson David Morrison. “It’s really a technological marvel.”

    sPHENIX will generate an enormous amount of data to realize its science goals. Developing the capabilities to collect, store, share, and analyze that data will help push the limits of data handling in ways that could benefit other fields including climate modeling, public health, and any fields that require the analysis of huge datasets.


    sPHENIX was built by an international collaboration of physicists, engineers, and technicians from 80 universities and labs from 14 countries—close to 400 collaborators overall, including students. Students, for example, joined efforts to assemble and test complex detector subsystems, studied cost-effective materials for high-speed electronics, and contributed to accelerator improvements that will increase collision rates at RHIC.

    “These hands-on educational experiences are providing valuable training for our nation’s future scientists, technicians, and engineers,” said sPHENIX co-spokesperson Gunther Roland, a physicist at the Massachusetts Institute of Technology. “Their expertise and future work may impact fields well beyond fundamental physics that rely on similar sophisticated electronics and cutting-edge technologies—including medical imaging and national security.”

    sPHENIX and operations at RHIC are funded by the DOE Office of Science (NP). 

    Brookhaven National Laboratory is supported by the Office of Science of the U.S. Department of Energy. The Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time. For more information, visit science.energy.gov

    Follow @BrookhavenLab on Twitter or find us on Facebook

Brookhaven National Laboratory

  • Department of Energy Announces $9.1 Million for Research on Quantum Information Science and Nuclear Physics

    Department of Energy Announces $9.1 Million for Research on Quantum Information Science and Nuclear Physics


    Newswise — WASHINGTON, D.C. – Today, the U.S. Department of Energy (DOE) announced $9.1 million in funding for 13 projects in Quantum Information Science (QIS) with relevance to nuclear physics. Nuclear physics research seeks to discover, explore, and understand all forms of nuclear matter that can exist in the universe – from the subatomic structure of nucleons, to exploding stars, to the emergence of the quark-gluon plasma seconds after the Big Bang.

    Quantum computers have the potential for computational breakthroughs in classically unsolvable nuclear physics problems. Quantum sensors exploit distinct quantum phenomena that do not have classical counterparts to acquire, process, and transmit information in ways that greatly exceed existing capabilities or sensitivities.

    “Although we are just beginning to develop the knowledge and technology needed to power a revolutionary paradigm shift to quantum computing, there is a clear line of sight on how to proceed,” said Tim Hallman, DOE Associate Director of Science for Nuclear Physics. “These awards will contribute to advancing nuclear physics research and to pressing future quantum computing developments forward.”

    The selected projects are at the forefront of interdisciplinary research in both fundamental research and use-inspired challenges at the interface of nuclear physics and QIS technologies. Projects include advancing the development of next generation materials and architectures for high coherence superconducting quantum bits, or “qubits,” and a solid-state quantum simulator for applications in nuclear theory. Projects will also develop quantum sensors to enhance sensitivity to new physics beyond the Standard Model and improve precision measurements of nuclear decays. The quantum computing projects explore difficult nuclear physics problems using hardware advantages offered by different near-term quantum platforms.

    The projects were selected by competitive peer review under the DOE Funding Opportunity Announcement for Quantum Horizons: QIS Research and Innovation for Nuclear Science.

    Total funding is $9.1 million for projects lasting up to three years. The list of projects and more information can be found here.

Department of Energy, Office of Science

  • Cooling 100 million degree plasma with a hydrogen-neon mixture ice pellet

    Cooling 100 million degree plasma with a hydrogen-neon mixture ice pellet


    Newswise — At ITER – the world’s largest experimental fusion reactor, currently under construction in France through international cooperation – the abrupt termination of magnetic confinement of a high-temperature plasma through a so-called “disruption” poses a major open issue. As a countermeasure, disruption mitigation techniques, which allow researchers to forcibly cool the plasma when signs of plasma instabilities are detected, are a subject of intensive research worldwide. Now, a team of Japanese researchers from the National Institutes for Quantum Science and Technology (QST) and the National Institute for Fusion Science (NIFS) of the National Institutes of Natural Sciences (NINS) found that by adding approximately 5% neon to a hydrogen ice pellet, it is possible to cool the plasma more deeply below its surface and hence more effectively than when pure hydrogen ice pellets are injected. Using theoretical models and experimental measurements with advanced diagnostics at the Large Helical Device owned by NIFS, the researchers clarified the dynamics of the dense plasmoid that forms around the ice pellet and identified the physical mechanisms responsible for the successful enhancement of the performance of the forced cooling system, which is indispensable for carrying out the experiments at ITER. These results will contribute to the establishment of plasma control technologies for future fusion reactors. The team’s report was made available online in Physical Review Letters.

    The construction of the world’s largest experimental fusion reactor, ITER, is underway in France through international cooperation. At ITER, experiments will be conducted to generate 500 MW of fusion energy by maintaining the ‘burning state’ of the hydrogen isotope plasma at more than 100 million degrees. One of the major obstacles to the success of those experiments is a phenomenon called “disruption,” during which the magnetic field configuration used to confine the plasma collapses due to magnetohydrodynamic instabilities. Disruption causes the high-temperature plasma to flow into the inner surface of the containing vessel, resulting in structural damage that, in turn, may cause delays in the experimental schedule and higher cost. Although the machine and the operating conditions of ITER have been carefully designed to avoid disruption, uncertainties remain for a number of experiments, so a dedicated machine protection strategy is required as a safeguard.

    A promising solution to this problem is a technique called “disruption mitigation,” which forcibly cools the plasma at the stage where the first signs of instabilities that may cause a disruption are detected, thereby preventing damage to plasma-facing material components. As a baseline strategy, researchers are developing a method using ice pellets of hydrogen frozen at temperatures below 10 Kelvin and injecting them into a high-temperature plasma. The injected ice melts from the surface, evaporating and ionizing owing to heating by the ambient high-temperature plasma and forming a layer of low-temperature, high-density plasma (hereafter referred to as a “plasmoid”) around the ice. Such a low-temperature, high-density plasmoid mixes with the main plasma, whose temperature is reduced in the process. However, in recent experiments, it has become clear that when pure hydrogen ice is used, the plasmoid is ejected before it can mix with the target plasma, making it ineffective for cooling the high-temperature plasma deeper below the surface.

    This ejection was attributed to the high pressure of the plasmoid. Qualitatively, a plasma confined in a donut-shaped magnetic field tends to expand outward in proportion to the pressure. Plasmoids, which are formed by the melting and the ionization of hydrogen ice, are cold but very dense. Because temperature equilibration is much faster than density equilibration, the plasmoid pressure rises above that of the hot target plasma. The consequence is that the plasmoid becomes polarized and experiences drift motion across the magnetic field, so that it propagates outward before being able to fully mix with the hot target plasma. 

    A solution to this problem was proposed from theoretical analysis: model calculations predicted that by mixing a small amount of neon into hydrogen, the pressure of the plasmoid could be reduced. Neon freezes at a temperature of approximately 20 Kelvin and produces strong line radiation in the plasmoid. Therefore, if the neon is mixed with hydrogen ice before injection, part of the heating energy can be emitted as photon energy. 

    To demonstrate such a beneficial effect of using a hydrogen-neon mixture, a series of experiments was conducted in the Large Helical Device (LHD) located in Toki, Japan. For many years, the LHD has operated a device called the “solid hydrogen pellet injector,” which injects ice pellets with a diameter of approximately 3 mm at a speed of 1,100 m/s. Owing to the system’s high reliability, it is possible to inject hydrogen ice into the plasma with a temporal precision of 1 ms, which allows measurement of the plasma temperature and density just after the injected ice melts. Recently, the world’s highest time resolution for Thomson Scattering (TS) of 20 kHz was achieved in the LHD system using new laser technology. Using this system, the research team captured the evolution of plasmoids. They found that, as predicted by theoretical calculations, plasmoid ejection was suppressed when hydrogen ice was doped with approximately 5% neon, in stark contrast to the case where pure hydrogen ice was injected. In addition, the experiments confirmed that the neon plays a useful role in the effective cooling of the plasma.

    The results of this study show for the first time that the injection of hydrogen ice pellets doped with a small amount of neon into a high-temperature plasma is useful to effectively cool the deep core region of the plasma by suppressing plasmoid ejection. This effect of neon doping is not only interesting as a new experimental phenomenon, but also supports the development of the baseline strategy of disruption mitigation in ITER. The design review of the ITER disruption mitigation system is scheduled for 2023, and the present results will help improve the performance of the system.

National Institutes of Natural Sciences (NINS)

  • sPHENIX Assembly Update: Magnet Mapped, Detectors Prepared

    sPHENIX Assembly Update: Magnet Mapped, Detectors Prepared


    Newswise — Physicists, engineers, and technicians at the U.S. Department of Energy’s (DOE) Brookhaven National Laboratory are rounding out the year with key developments to a house-sized particle detector that will begin capturing collision snapshots for the first time next spring.

    The state-of-the-art, three-story, 1,000-ton detector—known as sPHENIX—will precisely track particles streaming from collisions at the Relativistic Heavy Ion Collider (RHIC), a DOE Office of Science user facility for nuclear physics research. It’s an ongoing makeover of the PHENIX experiment, which took data at RHIC from 2000 until 2016. The upgraded sPHENIX will enable scientists to better understand the properties of quark-gluon plasma (QGP)—a soup of subatomic particles that are the inner building blocks of protons and neutrons. Scientists want to measure those particles to learn more about how those building blocks interact to form the visible matter that makes up our world.

    With the recent completion of essential particle-tracking components and a project to map the magnetic field of a superconducting electromagnet at the detector’s core, sPHENIX crews are gearing up for final installations.

    “There’s this whole choreography of a very intricate process of how these remaining pieces go together that’s going to play out in the next months and have us in shape to take data in the spring,” said Brookhaven Lab nuclear physicist and sPHENIX co-spokesperson David Morrison.

    CERN crew maps magnetic field

    A central component of sPHENIX is a 20-ton cylindrical superconducting solenoid magnet. It was once the centerpiece of an experiment called BaBar at SLAC National Accelerator Laboratory in California. Crews transported it across the country in 2015, tested it at low-field in 2016 and high-field in 2018, and carefully installed it at sPHENIX last year.

    The magnet generates a precise and uniform magnetic field—1.4 Tesla, or about as strong as the magnet used for magnetic resonance imaging (MRI) scans. The powerful field will bend the trajectories of charged particles that are among the “debris” produced when nuclei collide at RHIC.
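As a back-of-the-envelope illustration of how a known field translates track curvature into momentum (the 1.4 Tesla value comes from the text; the 1 GeV/c momentum and the use of the standard tracking rule of thumb here are my own illustrative additions):

```python
def bending_radius_m(p_gev_per_c, b_tesla, charge_e=1):
    """Radius of curvature from the standard tracking rule of thumb
    p [GeV/c] = 0.3 * q [e] * B [T] * r [m], valid for a charged
    particle moving perpendicular to a uniform magnetic field."""
    return p_gev_per_c / (0.3 * charge_e * b_tesla)

# A 1 GeV/c singly charged particle in the 1.4 T sPHENIX field:
radius = bending_radius_m(1.0, 1.4)  # about 2.4 metres
```

Run in reverse, the same relation is how tracking works: measure the curvature radius of a track, multiply by 0.3·B, and you have the particle's momentum—which is why an accurate field map feeds directly into momentum resolution.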

    Remaining detectors soon to be layered inside the magnet’s drum will very accurately measure the positions of the particles that stream out of these nuclear smashups, from which other properties can be obtained. Scientists seek to “connect the dots” of those measurements to discern very small differences among three kinds of “parent” particles called upsilons. The upsilon measurements are only one of numerous studies with sPHENIX at RHIC which will reveal clues about how QGP transitions from a hot soup of quarks and gluons to matter as we know it.

    But before these final tracking components can be installed, the sPHENIX crew sought to map the solenoid’s magnetic field.

    “Once you fill up the middle of the magnet, you can’t place a mapping machine inside,” said Brookhaven physicist Kin Yip.

    A team from CERN, Europe’s particle physics laboratory, came to Brookhaven in November to tackle the precision task.

    “CERN’s detector technologies group are the world experts in magnet mapping,” Yip said.

    The CERN team used the same mapping machine they’d previously used to map the magnet that forms the backbone of the ATLAS experiment at CERN’s Large Hadron Collider.

    The mapping machine, shipped from Geneva, Switzerland, fit into precision rails inside the magnet’s drum, where some panels of the sPHENIX electromagnetic calorimeter (EMCal)—which will measure different types of charged and uncharged particles in RHIC collisions—had not yet been installed. The cryogenic group from Brookhaven’s Collider-Accelerator Department used liquid helium to cool the solenoid’s superconducting cables to 4.6 Kelvin (-451.4 degrees Fahrenheit)—the temperature needed to generate the magnetic field. Two arms run by air-powered motors rotated like propellers to measure the magnetic field as crews stepped the machine along points from one end of the cylindrical magnet to the other. (Technicians installed the final EMCal segments soon after the mapping project ended.)

    “We thank Brookhaven Lab and in particular the people at sPHENIX for tasking us with the mapping of the sPHENIX solenoid,” said Nicola Pacifico of CERN’s mapping group, which included Francois Garnier, Raphael Dumps, and Pritindra Bhowmick. “Every mapping campaign is an R&D exercise on its own, presenting its specific challenges. We enjoyed the support of a very competent team on site, which allowed us to complete the mapping in a timely manner. We wish sPHENIX and its team full success in its physics programme, and au revoir until the next mapping at Brookhaven Lab!”

    sPHENIX scientists had been using a calculated map of the solenoid’s magnetic field to run RHIC collision simulations. The new precision measurements will increase the accuracy of deciphering data from the complex experiment once it’s up and running.

    “In general, in experimental physics, more information is better than less information,” said John Haggerty, a Brookhaven physicist who led the acquisition of the magnet in the early days of sPHENIX. “We can only calculate what we think we built, not what we may have inadvertently built. Now, we have the best possible map.”

    Key sub-detector arrives at Brookhaven

    The massive magnet isn’t the only major detector component that made a cross-country trek to sPHENIX. Pieces of a pixel-based vertex detector known as MVTX were built at CERN, then shipped to DOE’s Lawrence Berkeley National Laboratory (LBNL) in California for expert assembly, before arriving safely at Brookhaven in October. The detector was shipped in two halves for the 3,000-mile cross-country road trip. Crews used a truck with special suspension and took care to consider a safe route and weather conditions.

    The MVTX is one of three components that will work together to measure the position, and thereby determine the momentum, of all charged particles emerging from RHIC’s collisions. (The other two are an Intermediate Silicon Strip Tracker (INTT, see below) and a Time Projection Chamber (TPC) being built at Stony Brook University.)

    The MVTX, which will sit within the sPHENIX magnet’s central core, offers a very precise answer to the question: did a particle come exactly from the collision or even a fraction of a hair’s width away? It turns out that differences of such tiny distances can make a big difference.

    “Thousands of particles come out of our collisions,” Morrison explained. “Some of those particles decay, turning into other types of particles almost right away—making it maybe 50 microns, about the thickness of a strand of hair. MVTX tells us extremely precisely where particles came from, with a precision of about five microns, so we know if the particle was created in the collision itself or is a product of such a decay.”

    The part of MVTX that actually makes measurements is compact—about a foot long, 3.5 inches in diameter, and weighing in at about 3 ounces. All together, MVTX is made up of three overlapping layers of silicon sensors, which line two halves of a carbon fiber tube. At one end, the tube widens like the bell of a trumpet to fit lots of cables and fibers that power and read out the detector.

     “In this compact package there are 300 million channels, elements that can say ‘I saw something,’” said Edward O’Brien, the sPHENIX project director. “If we think of those channels as pixels, MVTX has a factor of 40 more pixels than your high-definition TV crammed into a space that’s over 20 times smaller.”

    Before installing the pixel-based detector early next year, sPHENIX engineers and technicians will practice placing a mockup of this delicate component around the experiment’s beam pipe. They’ll have only a tiny amount of clearance—about a millimeter—to slide the device into its final position after the other detector components are installed. “It’s like playing the game ‘Operation’ in reverse,” Morrison said. When it comes time to put that final piece in place, he says, the sPHENIX crew will be ready.

    Tracking super-fast, overlapping events

    Meanwhile, the team is making progress on those other particle-tracking components.

    With a response time of 60 nanoseconds—60 billionths of a second—the INTT will be key in capturing continuous snapshots of 15,000 particle collisions per second, more than three times faster than the former PHENIX detector.

    The INTT takes measurements in the space where MVTX and TPC do not, allowing physicists to reconstruct a complete particle track. Its super-fast response time enables it to distinguish which tracks come from overlapping events when collisions are piling up.
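The idea of using a fast response time to separate piled-up events can be sketched as time-window clustering of hit timestamps. This is a hypothetical illustration, not the INTT's actual readout logic: only the 60 ns window comes from the text, and the helper name and sample data are invented.

```python
def group_hits(hit_times_ns, window_ns=60):
    """Cluster time-ordered hit timestamps into events: consecutive hits
    closer together than the detector response window are assumed to
    belong to the same collision. Assumes a non-empty, sorted input."""
    events, current = [], [hit_times_ns[0]]
    for t in hit_times_ns[1:]:
        if t - current[-1] <= window_ns:
            current.append(t)    # still within the response window
        else:
            events.append(current)  # gap too large: start a new event
            current = [t]
    events.append(current)
    return events

# Two collisions ~250 ns apart, each producing a burst of hits:
hits = [0, 10, 35, 250, 265, 290]
print(group_hits(hits))  # two separate events
```

The faster the response window, the closer together two collisions can occur while still being resolved as separate events—which is why a 60 ns detector can cope with 15,000 collisions per second piling up.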

    The sub-detector was completed in mid-September by an international collaboration that included technicians, engineers, postdocs, and scientists from Japan, Taiwan, and the U.S. The project is funded primarily through the RIKEN BNL Research Center (RBRC) with additional U.S. and international contributions.

    The INTT consists of four layers of overlapping silicon strips that form a semiconductor particle detector based on ionizing radiation detection. The layers sit in two halves of a 10-foot-long cylinder. Bringing the two halves of the detector together for testing, and soon installation, was a tricky task with many moving parts.

    “It’s like flying a 747 airplane,” said Rachid Nouicer, a Brookhaven Lab nuclear physicist, RBRC senior visiting scientist, Stony Brook University adjunct professor, and co-manager of the INTT detector construction.

    To ensure a “safe landing,” the INTT assembly team used a machine with two “claws” that picked up each half and pressed them together while technicians tightened screws and knobs around the device. They had to be careful to prevent any cracks in the silicon strips. They also needed to ensure there were no gaps between overlapping silicon layers so the detector can receive all particle signals when it’s operational.

    “Physics is always moving towards precision and detector technology has to keep up with it—we want detectors to be faster, more precise,” Nouicer said. “It’s a great accomplishment to see all the INTT detector’s channels working. Now, we want to do physics with it.”

    As work progresses on the TPC, a gas tracking detector, at Stony Brook, the time for physics is fast approaching. Stay tuned for another update on that detector component.

    “We’re right at the end of detector component construction,” O’Brien said. “We’re done within errors. The challenge ahead is completing installation in the next few months.”

    “As you can see, the construction and assembly of these complex detector components is a major international effort,” said sPHENIX co-spokesperson Gunther Roland, a physicist at the Massachusetts Institute of Technology. “This work brings together so many great physicists from all over the world—80 universities and labs from 14 countries and close to 400 collaborators —to make the vision for this detector and the science it will enable a reality.”

    The upgrade and operations at RHIC are funded by the DOE Office of Science (NP).


     

Brookhaven National Laboratory

  • Deblurring Can Reveal 3D Features of Heavy-Ion Collisions

    Deblurring Can Reveal 3D Features of Heavy-Ion Collisions


    The Science

    When the nuclei of atoms are about to collide in an experiment, their centers never perfectly align along the direction of relative motion. This leads to collisions with complex three-dimensional geometry. Emissions from the dense hot region of nuclear matter form patterns during a collision. In relation to the geometry of the collisions, the patterns of emissions offer insights into characteristics of the compressed matter. The proposed deblurring strategy can reveal the emission patterns as if the initial nuclear centers were under tight control in an experiment.

    The Impact

    The proposed strategy offers a new way to analyze and present data from the collisions of atomic nuclei. It may make it easier for physicists to arrive at qualitative conclusions from collision data, because the experimental results can now refer directly to the geometry of a collision. Until now, this sort of direct reference to collision geometry was only possible in theoretical simulations. This means experiments can now address what researchers had believed was beyond their reach, helping scientists better understand compressed matter. The optics-inspired strategy may also help in other nuclear experiments where the methodology makes it hard to obtain the desired information.

    Summary

    The deblurring strategy was inspired by a deblurring algorithm used in optics experiments to sharpen images. Outside of nuclear science, deblurring is used to decipher speed-camera photos. The strategy was suggested by a research collaboration between the Facility for Rare Isotope Beams, a Department of Energy (DOE) Office of Science user facility at Michigan State University, and the RIKEN Nishina Center in Japan. The strategy is an effective means of finding triple-differential distributions of products from heavy-ion collisions for a fixed direction of the reaction plane. The reaction plane is defined by the direction of relative velocity and the centers of nuclei entering a collision. At intermediate collision energies, products emerge from a collision exhibiting correlations with the plane. Those correlations help to coarsely identify the orientation of that plane in an experiment. The proposed strategy can benefit the analysis of data from experiments focusing on properties of the compressed nuclear matter at facilities worldwide.
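The highlight does not name the specific algorithm, but a classic optics deblurring method of the kind described is Richardson-Lucy deconvolution. The sketch below is purely illustrative (the one-dimensional "emission pattern," kernel, and iteration count are invented): it blurs a sharp two-peak signal with a known kernel, then iteratively recovers it.

```python
import numpy as np

def richardson_lucy(blurred, psf, iterations=50):
    """1D Richardson-Lucy deconvolution: iteratively estimate the sharp
    signal given a blurred observation and the blurring kernel (PSF)."""
    estimate = np.full_like(blurred, 0.5)  # flat initial guess
    psf_flipped = psf[::-1]
    for _ in range(iterations):
        conv = np.convolve(estimate, psf, mode="same")
        # Ratio of observed to predicted data (guard against divide-by-zero)
        ratio = blurred / np.maximum(conv, 1e-12)
        # Multiplicative update pushes the estimate toward the sharp signal
        estimate *= np.convolve(ratio, psf_flipped, mode="same")
    return estimate

# Blur a sharp two-peak "emission pattern" and recover it:
truth = np.zeros(64)
truth[20] = 1.0
truth[40] = 0.5
psf = np.array([0.25, 0.5, 0.25])  # simple smearing kernel
blurred = np.convolve(truth, psf, mode="same")
recovered = richardson_lucy(blurred, psf)
```

In the nuclear-physics setting, the imperfectly known reaction-plane orientation plays the role of the blurring kernel, and deblurring recovers the emission distributions as if the collision geometry were fixed.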

     

    Funding

    This research was supported by the Department of Energy Office of Science, Office of Nuclear Physics.


    Journal Link: Physical Review C

Department of Energy, Office of Science

  • For Protons and Neutrons, Things Aren’t the Same Inside Nuclei

    For Protons and Neutrons, Things Aren’t the Same Inside Nuclei


    The Science

    Newswise — The building blocks of protons and neutrons—quarks—are distributed differently in free protons and neutrons versus inside nuclei. Nuclear physicists call this difference “the EMC effect.” Each proton is made of three quarks, with two called up quarks and one called a down quark. Neutrons have two down quarks and one up quark. Scientists previously thought that the EMC effect treated the up and down quarks equally. New high-precision data from the MARATHON experiment made possible a new global analysis of experimental data on this phenomenon. The complex analysis indicates that the EMC effect may exert more influence on the distribution of down quarks compared to up quarks inside nuclei.

    The Impact

    Prior to this result, nuclear physicists thought they could treat protons and neutrons, and their quarks, similarly in certain cases. This allowed a simpler understanding of how up and down quarks arrange themselves inside protons and neutrons, without the need to account for confounding effects of the environment inside nuclei. The new results from MARATHON appear to contradict this simple picture. Nuclear physicists need to conduct further investigations of this phenomenon to better characterize this effect. If confirmed, the result could affect experiments in neutrino physics, heavy-ion physics, astrophysics, and other fields.

    Summary

    When protons and neutrons live inside an atom’s nucleus, their internal quarks are distributed differently versus those inside protons or neutrons that roam free. This effect was first observed by the European Muon Collaboration at CERN in the 1980s, and it has remained a mystery for decades. The MARATHON collaboration has now collected new data on this phenomenon in an experiment carried out at Thomas Jefferson National Accelerator Facility’s Continuous Electron Beam Accelerator Facility particle accelerator, a Department of Energy (DOE) user facility. The data came from helium-3 and tritium nuclei. Helium-3 has two protons (each with two up quarks and one down quark) and one neutron (with two down quarks and one up quark). Tritium has one proton and two neutrons. The number of up quarks in helium-3 thus equals the number of down quarks in tritium, and vice versa. This new data enabled a sophisticated global analysis by the Jefferson Lab Angular Momentum (JAM) collaboration. The JAM analysis revealed that the distributions of down quarks may be more modified by the environment inside nuclei compared to the up quark distributions. This means that experiments seeking to reveal new information about the quark structure of nucleons will need to account for the nuclear environment.

     

    Funding

    This research was supported by the Department of Energy Office of Science, Office of Nuclear Physics, the National Science Foundation, the University of Adelaide, and the Australian Research Council.


    Journal Link: Physical Review Letters


    Department of Energy, Office of Science


  • Jefferson Lab Welcomes a ‘New’ Hall Group Leader

    Jefferson Lab Welcomes a ‘New’ Hall Group Leader


    Newswise — NEWPORT NEWS, VA – After an extensive international search, the U.S. Department of Energy’s Thomas Jefferson National Accelerator Facility has appointed Mark Jones as the new group leader of the lab’s Experimental Halls A and C. He began his tenure Nov. 1.

    Jones already has deep experience with nuclear physics, equipment and analysis. He began working at Jefferson Lab in 1992 as a postdoctoral researcher at William & Mary. He was hired at the lab as a staff scientist in 2001 and was recently promoted to the level of senior staff scientist. For most of the past year, he has also served as acting hall leader.

    “I’m grateful and honored to be chosen,” Jones said. “There has been outstanding leadership in the past, so I just hope I can keep up the good work. We have a world-renowned staff of physicists, engineers, designers and technicians, so it’s great to have this wonderful team. It makes it easier.”

    As hall leader

    Around 1,600 users from around the world conduct cutting-edge nuclear physics research at Jefferson Lab, using its powerful Continuous Electron Beam Accelerator Facility, or CEBAF, to probe the smallest subatomic particles that are the building blocks of the universe. CEBAF is a DOE Office of Science user facility pursuing nuclear physics research.

    For each experiment, a particle beam is shot around a nearly mile-long oval underground accelerator at nearly the speed of light, gaining energy with each lap. When the right energy is reached, the beam is directed into one of four experimental halls — A, B, C or D — where it collides with a chosen target. Highly sensitive detector systems observe and register the subatomic particles that cascade downstream of the collision. The results augment — or sometimes challenge — current understanding of the workings of the universe.
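    The pass-by-pass energy buildup described above can be sketched with a back-of-the-envelope model. The numbers below are approximate, publicly quoted figures for the 12 GeV CEBAF, used here only as an illustration, not exact machine parameters.

    ```python
    # Simple model of a recirculating linac: the beam enters at the injector
    # energy and gains a roughly fixed amount on each full pass through the
    # two linacs. All values here are approximate illustrations.
    def beam_energy_gev(injector_gev: float, gain_per_pass_gev: float,
                        passes: float) -> float:
        return injector_gev + gain_per_pass_gev * passes

    # Roughly 0.12 GeV from the injector plus about 2.2 GeV per pass;
    # 5.5 passes (the extra half-pass reaches the newest hall) gives
    # roughly 12 GeV.
    print(round(beam_energy_gev(0.12, 2.2, 5.5), 1))
    ```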

    As hall leader, Jones is responsible for managing the physicists, administrators, engineers and technicians who support, develop, maintain and engage in experiments as well as the vast number of precision instruments required to conduct them in Halls A and C.

    Halls A and C have had joint leadership in the last decade or so, largely because there’s some overlap between the halls and the types of experiments they’re able to support. Jones worked on the first Hall A experiments at the lab while still a postdoc, when he supervised the construction, installation and operation of the front chambers of the focal plane polarimeter in one of Hall A’s High Resolution Spectrometers.

    Jones said his goal is to continue the productive leadership of his predecessors, advancing experiments that have been vetted and approved by the lab’s Program Advisory Committee, sometimes years in advance.

    “I’m just hoping to successfully run the experiments that have been proposed and approved by the PAC and then support new ideas that come forward,” he said. “We’re in early planning for a potential energy upgrade for the CEBAF, so I’ll try to generate new ideas for experiments that can take advantage of that upgrade and improve our understanding of the fundamental forces between quarks and gluons, so that we can push these limits and improve our understanding.”

    Jones was a staff scientist in Hall C during the last CEBAF upgrade, when its energy was doubled to 12 GeV, or 12 billion electron-volts, to enable even more informative studies in nuclear physics. During that upgrade, he managed the update of the data analysis software from the aging Fortran to a C++ code based on the framework of the Hall A analyzer. He also served as a co-spokesperson for several of the upgraded CEBAF’s first-run, high-profile experiments.

    ‘New possibilities’

    The drive to expand our understanding of the universe is what initially drew him to physics.

    “The sense of discovery is the main thing,” Jones said. “It crosses all science. Even with the best predictions, you’re never sure what you’re going to find in nature. There are always surprises.”

    This is true for what he considers his most notable accomplishment: a late-1990s experiment to measure the electric form factor of the proton, which produced highly unexpected results that have since been verified multiple times by subsequent measurements. The form factor encodes information about the internal structure of a particle, which can be used to test theories of the strong force between quarks and gluons.

    “People didn’t think that the measurement was going to be that exciting,” Jones said. “I do remember when we were getting the first online results, and they were totally different than what people were expecting.

    “That’s what’s exciting about discovery. If you find the unexpected, it usually opens up new avenues for a theory to explain the data, and new possibilities.”

    Those results are the most-cited Jefferson Lab publication and led to Jones’ becoming project manager for the successful construction of the Super BigBite Spectrometer equipment and comprehensive nucleon form factor program now running in Hall A.

    For now, Hall A is in the midst of experiments to measure the electric and magnetic form factors of the proton and neutron using the Super BigBite Spectrometer and BigBite Spectrometer. After these wrap up, the Measurement of a Lepton-Lepton Electroweak Reaction (MOLLER) experiment will measure the weak charge of the electron. This measurement is sensitive to new physics beyond the Standard Model and is complementary to direct searches for new physics at high-energy colliders such as the Large Hadron Collider at CERN. MOLLER is a ~$60 million project supported primarily by DOE, with contributions to the detector and data collection systems from the National Science Foundation and the Canada Foundation for Innovation.

    Next spring, Jones will oversee installing the Neutral Particle Spectrometer (NPS) in Hall C. With the NPS, scientific users will measure deeply virtual Compton scattering on protons and neutrons, while simultaneous experiments will measure neutral pion production in semi-inclusive deep inelastic scattering. The results will guide theorists in developing models of the 3D map of the quark’s momentum and position inside the proton and neutron.

    Jones, 61, is originally from Pennsylvania. He earned his bachelor’s degree in physics from Oberlin College and Conservatory and his Ph.D. from the University of Minnesota. In addition to William & Mary, he conducted postdoctoral work at Old Dominion University, the University of Maryland and Rutgers University before joining Jefferson Lab as a staff scientist in 2001.

    By Tamara Dietrich

    -end-

    Jefferson Science Associates, LLC, manages and operates the Thomas Jefferson National Accelerator Facility, or Jefferson Lab, for the U.S. Department of Energy’s Office of Science.

    DOE’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time. For more information, visit https://energy.gov/science.


    Thomas Jefferson National Accelerator Facility


  • FRIB Experiment Pushes Elements to the Limit

    FRIB Experiment Pushes Elements to the Limit


    Newswise — A new study led by the Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab) has measured how long it takes for several kinds of exotic nuclei to decay. The paper, published today in Physical Review Letters, marks the first experimental result from the Facility for Rare Isotope Beams (FRIB), a DOE Office of Science user facility operated by Michigan State University.

    Scientists used the one-of-a-kind facility to better understand nuclei, the collection of protons and neutrons found at the heart of atoms. Understanding these basic building blocks allows scientists to refine their best models and has applications in medicine, national security, and industry.

    “The breadth of the facility and the programs that are being pursued are really exciting to watch,” said Heather Crawford, a physicist at Berkeley Lab and lead spokesperson for the first FRIB experiment. “Research is going to be coming out in different areas that will impact things we haven’t even thought of yet. There’s so much discovery potential.”

    The first experiment is just a small taste of what’s to come at the facility, which will become 400 times more powerful over the coming years. “It’s going to be really exciting – mind-blowing, honestly,” Crawford said.

    More than 50 participants from ten universities and national laboratories were involved in the first experiment. The study looked at isotopes of several elements. Isotopes are variations of a particular element; they have the same number of protons but can have different numbers of neutrons.

    Researchers focused on unstable isotopes near the “drip-line,” the spot where neutrons can no longer bind to a nucleus. Instead, any additional neutrons drip off, like water from a saturated kitchen sponge.

    Researchers smashed a beam of stable calcium-48 nuclei traveling at about 60% of the speed of light into a beryllium target. The calcium fragmented, producing a slew of isotopes that were separated, individually identified, and delivered to a sensitive detector that measured how long they took to decay. The result? The first reported measurements of half-lives for five exotic, neutron-laden isotopes of phosphorus, silicon, aluminum, and magnesium.

    Half-life measurements (perhaps best known from applications in carbon dating) are one of the first things researchers can observe about these short-lived particles. The fundamental information about nuclei at the limits of their existence provides a useful test for different models of the atomic world.
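    A half-life can be extracted from a set of observed decay times with a simple maximum-likelihood estimate. The sketch below simulates a made-up isotope rather than using any of the experiment's data; real measurements must also handle backgrounds, daughter decays, and detector effects, so this shows only the statistical core of the idea.

    ```python
    import math
    import random

    def estimate_half_life(decay_times):
        """For exponential decay, the maximum-likelihood half-life is
        ln(2) times the mean observed decay time."""
        mean_lifetime = sum(decay_times) / len(decay_times)
        return math.log(2) * mean_lifetime

    # Simulate 100,000 decays of a hypothetical isotope with a 5 ms half-life.
    random.seed(0)
    true_half_life = 5.0e-3  # seconds
    decay_rate = math.log(2) / true_half_life  # expovariate takes the rate
    samples = [random.expovariate(decay_rate) for _ in range(100_000)]

    estimate = estimate_half_life(samples)
    print(f"estimated half-life: {estimate:.2e} s")  # close to 5.00e-03
    ```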

    “This is a basic science question, but it links to the bigger picture for the field,” Crawford said. “Our aim is to describe not only these nuclei, but all kinds of nuclei. These models help us fill in the gaps, which helps us more reliably predict things we haven’t been able to measure yet.”

    More complete theories help advance research in areas such as astrophysics and nuclear physics – for example, understanding how elements form in exploding stars or how processes unfold in nuclear reactors.

    Crawford and the team plan to repeat the half-life experiment next year, taking advantage of additional beam intensity that will increase the number of isotopes produced, including rare isotopes near the neutron drip-line. In the meantime, other groups will take advantage of the facility’s many beamlines and instruments.

    “Bringing the facility online was a big effort by a lot of people, and something the community has been looking forward to for a long time,” Crawford said. “I’m excited I am young enough to keep taking advantage of it for the next several decades.”

    Multiple institutions collaborated on the first experiment, with researchers from Argonne National Laboratory (ANL), Berkeley Lab, Brookhaven National Laboratory, Florida State University, FRIB, Lawrence Livermore National Laboratory, Louisiana State University, Los Alamos National Laboratory, Mississippi State University, Oak Ridge National Laboratory (ORNL), and the University of Tennessee Knoxville (UTK).

    Scientists from ORNL, UTK, ANL and FRIB led the collaboration to provide the instruments used in the FRIB Decay Station initiator, the sensitive detector system that measured the isotopes.

    Michigan State University (MSU) operates the Facility for Rare Isotope Beams (FRIB) as a user facility for the U.S. Department of Energy Office of Science (DOE-SC), supporting the mission of the DOE-SC Office of Nuclear Physics. Hosting what is designed to be the most powerful heavy-ion accelerator, FRIB enables scientists to make discoveries about the properties of rare isotopes in order to better understand the physics of nuclei, nuclear astrophysics, fundamental interactions, and applications for society, including in medicine, homeland security, and industry.

    ###

    Founded in 1931 on the belief that the biggest scientific challenges are best addressed by teams, Lawrence Berkeley National Laboratory and its scientists have been recognized with 16 Nobel Prizes. Today, Berkeley Lab researchers develop sustainable energy and environmental solutions, create useful new materials, advance the frontiers of computing, and probe the mysteries of life, matter, and the universe. Scientists from around the world rely on the Lab’s facilities for their own discovery science. Berkeley Lab is a multiprogram national laboratory, managed by the University of California for the U.S. Department of Energy’s Office of Science.

    DOE’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time. For more information, please visit energy.gov/science.


    Lawrence Berkeley National Laboratory


  • Observation of a self-generated current to self-confine fusion plasmas

    Observation of a self-generated current to self-confine fusion plasmas


    Newswise — Nuclear fusion has drawn growing attention in the era of carbon neutrality because fusion power generation produces no carbon dioxide and no high-level radioactive waste.

    A tokamak, a torus-shaped nuclear fusion device, needs an electric current in the plasma to produce a magnetic field around the torus for confining the fusion plasma. This plasma current is conventionally generated by electromagnetic induction.

    However, for a steady-state fusion reactor, minimizing the inductive current is essential to extend the tokamak operating duration. Several non-inductive current drive schemes have been developed for steady-state operations such as radio-frequency waves and neutral beams. However, commercial reactors require minimal use of these external sources to maximize the fusion gain, Q, the ratio of the fusion power to the external power. Apart from these external current drives, a self-generated current, so-called bootstrap current, was predicted theoretically and demonstrated experimentally.
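    The fusion gain defined above is simply a power ratio. A minimal sketch follows; the numerical values are illustrative targets often quoted for ITER, not figures from this study.

    ```python
    def fusion_gain(p_fusion_mw: float, p_external_mw: float) -> float:
        """Fusion gain Q: fusion power divided by externally supplied
        heating power."""
        return p_fusion_mw / p_external_mw

    # Illustrative: ITER's headline target is Q = 10, e.g. 500 MW of fusion
    # power from 50 MW of external heating. Q = 1 is scientific breakeven.
    print(fusion_gain(500.0, 50.0))  # → 10.0
    ```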

    The research team led by Prof. Yong-Su Na in the Department of Nuclear Engineering at Seoul National University and Dr. Jaemin Seo at Princeton University has revealed that another type of self-generated current, one that cannot yet be explained by present theories, can exist in a tokamak. They discovered it in experiments on the KSTAR tokamak, in collaboration with the Korea Institute of Fusion Energy, Princeton Plasma Physics Laboratory, and General Atomics.

    While conducting an experiment on plasma turbulence, the team discovered by chance an unidentified plasma current that could not be explained by existing theories and simulations. Analysis showed that it accounts for a significant share of the total plasma current, up to 30%, and that it appears when the turbulence is relatively low.

    The discovery of a new plasma current, generated by the plasma itself without magnetic induction, opens up the possibility that the plasma could confine itself and sustain the fusion reaction during the long-pulse operation of a fusion reactor.

    In this experiment, the new current was observed only when fuel was injected into the plasma, and its exact cause is still unknown, so follow-up studies are planned.

    Prof. Yong-Su Na, co-first author and corresponding author of the study, said, “This result came from an experiment so unfamiliar that the proposal was not initially selected at KSTAR. If we had tried to look at it from a conventional point of view, we would not have found it. We were able to discover new things by approaching with an open perspective rather than being confined to what we wanted to see or get.” The other co-first author, Dr. Jaemin Seo, said, “Big science such as nuclear fusion research advances through small steps that put an apple on the shoulders of giants. I hope that future scientists who can step forward together will be interested in and support nuclear fusion research.”

    Once the physical mechanism is identified, this new discovery is expected to contribute significantly to the long continuous operation of ITER and commercial reactors, which are exploring current-drive methods that do not rely on inductive current.

     

    This work was supported by National R&D Program through the National Research Foundation of Korea (NRF) funded by the Korean government (Ministry of Science and ICT) (NRF-2021M1A7A4091135 and 2021M3F7A1084419). This work was also supported by the Ministry of Science and ICT under the KFE R&D Program of “KSTAR Experimental Collaboration and Fusion Plasma Research (KFE-EN2201-12)”.


    Seoul National University


  • Machine Learning Takes Hold in Nuclear Physics

    Machine Learning Takes Hold in Nuclear Physics


    Newswise — Scientists have begun turning to new tools offered by machine learning to help save time and money. In the past several years, nuclear physics has seen a flurry of machine learning projects come online, with many papers published on the subject. Now, 18 authors from 11 institutions summarize this explosion of artificial intelligence-aided work in “Machine Learning in Nuclear Physics,” a paper recently published in Reviews of Modern Physics. The paper is also available on arXiv.

    “It was important to document the work that has been done. We really do want to raise the profile of the use of machine learning in nuclear physics to help people see the breadth of the activities,” said Amber Boehnlein, lead author of the paper and the associate director for computational science and technology at the U.S. Department of Energy’s Thomas Jefferson National Accelerator Facility. 

    Because the paper gathers and summarizes major work in the field thus far, Boehnlein hopes it can act as an educational resource for interested readers, as well as a roadmap for future endeavors. 

    “It provides a benchmark that people can use as they go forward into the next phase,” she said.

    A machine learning revolution

    After attending a workshop exploring artificial intelligence at Jefferson Lab in March 2020 and publishing a follow-up report, Boehnlein and two of her co-authors, Witold Nazarewicz and Michelle Kuchera, were inspired to go a step further. Together with 15 colleagues representing all subfields of nuclear physics, they decided to conduct a survey of the state of machine learning projects in nuclear physics. 

    They started at the beginning.  As the authors describe, the first significant work employing machine learning in nuclear physics used computer experiments to study nuclear properties, such as atomic masses, in 1992. Although this work hinted at machine learning’s potential, its use in the field remained minimal for more than two decades. In the last several years, that changed.
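    As a flavor of that early line of work on learning nuclear properties from data, here is a toy fit of nuclear binding energies. It uses a classic least-squares fit of the liquid-drop (semi-empirical mass formula) terms rather than the neural-network approach of the 1992 paper, and the binding energies are approximate textbook values, so treat it as an illustrative sketch only.

    ```python
    # Fit the four liquid-drop coefficients (volume, surface, Coulomb,
    # asymmetry) to a handful of measured binding energies by least squares.
    import numpy as np

    # (Z, N, total binding energy in MeV) -- approximate values
    data = [
        (8, 8, 127.6),      # oxygen-16
        (20, 20, 342.1),    # calcium-40
        (26, 30, 492.3),    # iron-56
        (50, 70, 1020.5),   # tin-120
        (82, 126, 1636.4),  # lead-208
    ]

    def features(Z, N):
        A = Z + N
        return [A,                            # volume term
                -A ** (2 / 3),                # surface term
                -Z * (Z - 1) / A ** (1 / 3),  # Coulomb term
                -(N - Z) ** 2 / A]            # asymmetry term

    X = np.array([features(Z, N) for Z, N, _ in data])
    y = np.array([b for _, _, b in data])
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    predictions = X @ coeffs
    print(dict(zip(["a_v", "a_s", "a_c", "a_a"], coeffs.round(2))))
    ```

    Even this simple fit reproduces the five binding energies to within a few percent, which hints at why learned models of nuclear properties are attractive.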

    Machine learning, which involves building models that can perform tasks without explicit instruction, requires computers to carry out demanding work, including complicated calculations. With recent advances, computers can better meet these demands, which has allowed physicists to more readily incorporate machine learning into their work. 

    “This would have been a less interesting paper in 2019, because there wouldn’t have been enough work to catalog. But now, there is significant work to cite due to the increased use of the techniques,” Boehnlein said.

    Today, machine learning spans all scales and energy ranges of research, from investigations of matter’s building blocks to inquiries into the life cycles of stars. It is also found across the four subfields of nuclear physics: theory, experiment, accelerator science and operations, and data science.

    “We made an effort to compile a comprehensive, collective resource that bridges the efforts in our subfields, which will hopefully spark rich discussions and innovation across nuclear physics,” said co-author Kuchera, who is an associate professor of physics and computer science at Davidson College.

    Machine learning models can be used to help both the design and execution of experiments in nuclear physics. They can also aid in the analysis of those experiments’ data, which can run to petabytes.

    “I expect machine learning to become embedded into our data collection and analysis,” Kuchera said.

    Machine learning will speed up these processes, which could mean less time and money is needed for beamtime, computer usage, and other experimental costs.

    Connecting theory and experiment

    So far, however, machine learning has developed the strongest foothold in nuclear theory. Nazarewicz, who is a nuclear theorist and chief scientist at the Facility for Rare Isotope Beams at Michigan State University, is especially interested in this subject. He says that machine learning can help theorists do advanced calculations faster, improve and simplify models, make predictions, and help theorists understand the uncertainties of their predictions. It can also be used to study phenomena that researchers cannot conduct experiments on, such as supernova explosions or neutron stars.

    “Neutron stars are not very user friendly,” said Nazarewicz.

    He uses machine learning to study hyperheavy nuclei and elements, which have so many protons and neutrons in their nuclei that they can’t be observed experimentally. 

    “I find the results to be the most impressive in the theory community, particularly the low-energy theory community that Witold is associated with,” Boehnlein said. “They seem to be really embracing these techniques.”

    Boehnlein said theorists have also started to embrace these techniques at Jefferson Lab in their study of proton and neutron structures. Specifically, machine learning can help extract information from complicated theories, such as quantum chromodynamics, the theory that describes the interactions between the quarks and gluons that make up protons and neutrons. 

    The authors predict that machine learning’s involvement in both theory and experiment will speed up these subfields independently, and it will also better interconnect them to speed up the entire loop of the scientific process.

    “Nuclear physics helps us make discoveries to better understand the nature of our universe, and it’s also used for societal applications,” said Nazarewicz. “The faster we can do the cycle between experiment and theory, the faster we will arrive at discoveries and applications.”

    As machine learning continues to grow in this field, the authors expect to see more developments and broader applications incorporating this tool.

    “I think we’re only in the infancy of the application of machine learning to nuclear physics,” Boehnlein said.  

    And, along the way, this paper will act as a reference, even for its own authors. 

    “I hope the paper is used as a resource to understand the current state of machine learning research, allowing us to build from these efforts,” Kuchera said. “My research is centered on machine learning methods, so I absolutely will utilize this paper as a window into the state of machine learning across nuclear physics right now.”

    Further Reading
    Journal Article: Machine Learning in Nuclear Physics

    By Chris Patrick

    -xxx-

    Jefferson Science Associates, LLC, manages and operates the Thomas Jefferson National Accelerator Facility, or Jefferson Lab, for the U.S. Department of Energy’s Office of Science.

    Michigan State University operates the Facility for Rare Isotope Beams as a user facility for the U.S. Department of Energy Office of Science, supporting the mission of the Office of Nuclear Physics. 

    DOE’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time. For more information, visit https://energy.gov/science.


    Thomas Jefferson National Accelerator Facility


  • JLab Welcomes New Experimental Hall Leader

    JLab Welcomes New Experimental Hall Leader


    Newswise — NEWPORT NEWS, VA – The U.S. Department of Energy’s Thomas Jefferson National Accelerator Facility has appointed Patrick Carsten Achenbach as the new leader of Jefferson Lab’s Experimental Hall B. The appointment comes after an international search. 

    Long before he was chosen for this position that leads studies of the tiniest particles in nature, Achenbach was fascinated by the biggest. As a schoolboy in his native Germany, he was intrigued by astronomy and the workings of the universe. 

    “But then I learned very quickly that this also relates to some fundamental research in physics – in nuclear and particle physics, where we study the Big Bang and the particles created 14 billion years ago, which are now making up the matter in the universe,” said Achenbach. 

    “It’s not a single topic. It’s all interconnected. Physics really describes the universe on many scales. It describes it on the largest scales of millions and billions of light-years and it can also describe it on the tiniest scales inside of the nucleus,” he said. 

    The star-struck student went on to become an experimental physicist investigating the fundamental makeup of the universe by using powerful particle accelerators to delve deep inside atomic nuclei.

    Now in his new position leading one of four experimental halls at Jefferson Lab, he will promote cutting-edge nuclear physics using the most powerful accelerator of its kind in the world: the Continuous Electron Beam Accelerator Facility, or CEBAF. More than 1,600 nuclear physicists worldwide come to CEBAF, a DOE Office of Science user facility, to conduct their research.

    Leading an experimental hall

    Achenbach began his tenure Sept. 1.

    “I’m very happy to be here,” he said. “It’s a great lab, a world-leading lab in this type of accelerator-based nuclear physics. I’m proud to be part of the group here, and of the team.”

    An experimental hall relies on a vast network of moving parts and precision instruments, including an injector to produce the particle beam; cryogenics systems to supercool components that accelerate the beam; electromagnets to steer it around the accelerator; detectors that can run as big as a house; complex electronics and computing systems; and a small army of highly skilled technicians, engineers and physicists to keep it all humming.

    For each experiment, the particle beam shoots around the nearly mile-long underground racetrack-shaped accelerator at nearly the speed of light. With each lap, the beam gains energy. Once it gains the right amount of energy, it’s directed into an experimental hall, where it smashes into a chosen target. There, detector systems with more than 100,000 electronic channels – or electronic “eyes” – can see and register the fleeting and often rare subatomic particles created in the collision.

    “And all of this needs to be coordinated, and all of these great people need to work together,” Achenbach explained. “So that, in the end, we get results out or we get data that can be analyzed and we can do our research, and maybe we have discovered something new, or we understand something new, or we expand our knowledge.”

    As hall leader, Achenbach will coordinate staff, instruments and experiments, as well as help choose future experiments from among the recommendations of an international advisory committee and the priorities or restrictions of the hall. As he settles into his new position, he plans to look for ways to develop the hall further. 

    Discussions are underway, he said, to potentially upgrade CEBAF and increase its energy. Greater energy means even more compelling experiments and the potential for even greater discoveries. The lab is also considering producing a different type of beam – a positron beam – for new kinds of experiments, he said. A positron is the antimatter counterpart of an electron.

    Such upgrades and enhancements would require adapting the experimental halls to accommodate them. 

    A background in physics

    Achenbach most recently served as a professor of experimental physics at the Johannes Gutenberg University in Mainz, Germany. He has a strong background in the operation of experiments and experimental equipment, with leadership roles at electron accelerator and spectrometer facilities. In 2009, he also engaged in research at Jefferson Lab.

    He studied physics and mathematics at Justus Liebig University in Giessen and earned a doctorate at Johannes Gutenberg University before conducting postdoctoral research at the University of Oxford.

    He has served on the Japan Proton Accelerator Research Complex (J-PARC) program advisory committee, as well as on various executive and collaboration boards and steering and collaboration management committees. 

    He worked on the H1 inclusive deep inelastic scattering experiments at the German laboratory DESY; in the A2 and TAPS collaborations at the Mainz Microtron (MAMI) to study nucleon resonances and excitations and pion/eta photoproduction; and in A4 collaborations at MAMI to carry out elastic electron scattering, parity violation and strangeness form factor experiments. He was also involved in cosmic ray and atmospheric neutrino science.

    He was a member of the A1 Collaboration at Mainz and the PANDA Collaboration at the Facility for Antiproton and Ion Research (FAIR) in Darmstadt. He has many years of experience within the A1 Collaboration on strangeness production, hadron spectroscopy and hypernuclei. He is also involved in light dark matter searches and the beam-dump experiment at MESA. 

    By Tamara Dietrich

    -end-

    Jefferson Science Associates, LLC, manages and operates the Thomas Jefferson National Accelerator Facility, or Jefferson Lab, for the U.S. Department of Energy’s Office of Science.

    DOE’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time. For more information, visit https://energy.gov/science.


    Thomas Jefferson National Accelerator Facility
