ReportWire

Tag: Lawrence Berkeley National Laboratory

  • Here’s How the Plastic Industry Thinks We Can Solve the Waste Crisis


    This story was originally published by Grist. Sign up for Grist’s weekly newsletter here.

    In the time it takes you to read this sentence — say, four seconds — the world produces nearly 60 metric tons of plastic, almost entirely out of fossil fuels. That’s about 53,000 metric tons an hour, 1.3 million metric tons a day, or 460 million metric tons a year. Those numbers are fueling widespread and growing contamination of Earth’s oceans, rivers, and the terrestrial environment with plastic trash.
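    Those figures are internally consistent. As a back-of-envelope check (ours, not Grist's), the hourly, daily, and "per four seconds" rates can all be derived from the 460-million-metric-ton annual total:

```python
# Back-of-envelope check: derive the article's hourly, daily, and
# per-four-second production rates from the annual total alone.
ANNUAL_TONS = 460e6  # global plastic production, metric tons per year

per_day = ANNUAL_TONS / 365             # ~1.26 million metric tons a day
per_hour = per_day / 24                 # ~52,500 metric tons an hour
per_four_seconds = per_hour / 3600 * 4  # ~58 metric tons every four seconds

print(f"{per_day:,.0f} t/day, {per_hour:,.0f} t/hour, {per_four_seconds:.0f} t per 4 s")
```

    The rounded outputs match the article's "about 53,000 metric tons an hour," "1.3 million metric tons a day," and "nearly 60 metric tons" per four-second sentence.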

    In March 2022, the United Nations’ 193 member states got together in Nairobi, Kenya, and agreed to do something about it. They pledged to negotiate a treaty to “end plastic pollution,” with the goal of delivering a final draft by 2025. The most ambitious vision espoused by member states in the negotiating sessions that have taken place so far would require petrochemical companies to stop making so much of the darn stuff by putting a cap on global plastic production.

    Given the existential threat this would pose to fossil fuel and chemical companies, you might expect them to be vociferously opposed to the treaty. Yet they claim to support the agreement. They’re even “championing” it, according to statements from a handful of industry groups. The American Chemistry Council has repeatedly “welcome[d]” progress on the treaty negotiations, while an executive from the International Council of Chemical Associations told Plastics Today in April that the industry is “fully committed” to supporting an agreement.

    So what exactly do plastic-producing companies want from the treaty? To answer this question, Grist sifted through dozens of public statements and policy documents from five of the world’s largest petrochemical industry trade organizations, as well as two product-specific industry groups. These documents included press releases reacting to treaty negotiating sessions and longer position statements detailing the industry’s desired pathway to a “world without waste.”

    Much of what these groups have published is vague — many documents call for “targets,” for example, without saying what they should be. Grist reached out to all of the groups for clarification, but only two agreed to answer questions about the policies they support.

    What we found is that, although they fall far short of what so-called “high-ambition” countries and advocacy groups would like to get out of the treaty, industry groups’ proposals to bolster recycling and waste collection could cause a significant reduction in mismanaged plastic waste — even in the absence of a cap on plastic production. According to a policy analysis tool developed by researchers at the University of California, the elements of the treaty that industry groups support, cobbled together, could cut global plastic pollution by 43 million metric tons annually by 2050 — a 36 percent reduction below business-as-usual estimates.

    Meanwhile, a realistic production cap could cut annual pollution by 48 million metric tons all by itself. Excluding a production cap from the treaty will make it much harder to rein in plastic pollution, said Douglas McCauley, an associate professor of biology at the University of California, Santa Barbara, and one of the creators of the policy tool. “It means you really have to ramp up your ambition on what some of the other policies would need to do,” he told Grist.

    These numbers matter, because the plastic industry’s influence over the treaty negotiations seems to be growing stronger. At the most recent round of talks — held in Ottawa, Canada, at the end of April — nearly 200 petrochemical and fossil fuel lobbyists registered to attend. That’s 37 more than were registered for the previous session, and more than the number of representatives from European Union member states.

    At the same time, several delegations promoted solutions on the industry’s terms. Malaysia warned about the unintended economic consequences of limiting plastic production, and India said the treaty should focus on pollution while considering plastics’ utility to modern society. Given the power of the plastics industry and the tendency of international negotiations to cater to the lowest common denominator, it’s possible that the treaty will strongly reflect these industry priorities.

    How the industry sees the problem

    To understand the industry position on the plastics treaty, it’s important to understand how plastic makers conceptualize the plastics crisis. While they agree that pollution is a scourge, they don’t think the solution is to reduce society’s production and use of plastic. After all, plastics come with myriad benefits. They’re inexpensive, lightweight, and widely used in important sectors like clean energy and medicine — their “unmatched properties and versatility have allowed for incredible innovations that conserve resources and make more things in life possible,” as the Plastics Industry Association has put it. America’s Plastic Makers, an arm of the American Chemistry Council, says policymakers should ensure that the material stays “in our economy and out of our environment.”

    The way to do this, according to industry groups, is through “plastics circularity,” a concept that seeks to keep the material in use for as long as possible before it’s thrown away. Generally, this means more recycling. But circularity can also refer to scaled-up systems allowing plastic to be reused, or better infrastructure for waste collection. As plastic makers see it, the plastic treaty’s function should be to increase circularity while retaining the social and economic benefits derived from plastic products.

    Perhaps the biggest problem faced by circularity proponents is plastic’s abysmal recycling rate. At present, the world only recycles about 9 percent of all plastic it produces; the rest gets sent to landfills or incinerators, or winds up as litter. What’s more, in most cases the material can only be reprocessed once or twice — if at all — before it has to be “downcycled” into lower-quality products like carpeting. Although some experts believe it’s impossible to recycle much more plastic due to technological and economic constraints, plastic makers say otherwise. Indeed, plastics circularity hinges on the possibility of a better recycling rate.

    The industry’s first solution: Recycling targets

    To that end, several industry groups — including the World Plastics Council, the self-described “global voice of the plastics industry” — are advocating for “mandatory minimum recycling rates” as part of the treaty, as well as higher targets for recycled content used in new products.

    This could mean that countries, regions, or other jurisdictions would set legally binding quotas for the amount of plastic recycled within their borders and then converted into new items. Plastic makers typically favor targets that are set at the local or national level and that differentiate based on the type of plastic, since some types are harder to recycle than others.

    Industry groups also want recycling targets to be “technology-neutral,” meaning they should count plastics processed through controversial “chemical recycling” techniques. Although these techniques do not yet work at scale, the industry says they will one day be able to break down mixed post-consumer plastics into their constituent polymers using high heat and pressure, and then turn those polymers back into new plastic products. Environmental experts oppose chemical recycling, pointing to evidence that it is primarily used to burn plastics or turn them into fuel.

    The two policies — on plastics recycling and recycled content — could be mutually reinforcing, with the latter creating a more reliable market for the recycled material generated by the former. Ross Eisenberg, president of America’s Plastic Makers, told Grist via email that recycling and recycled content targets would “create demand signals and provide added certainty for companies to make additional investments for a circular economy, so more plastic products are reused or remade into new plastic products.”

    According to Plastics Europe, the continent’s main plastic trade group, boosting the recycling rate would decrease countries’ dependence on fossil fuels used to make virgin plastics.

    Plastics Europe and the World Plastics Council declined to be interviewed for this article. They did not respond to questions about their support for specific recycling and recycled content targets, although Plastics Europe has voiced support for “mandatory data and reporting objectives for all stages of the life cycle of the plastics system.” For the U.S., America’s Plastic Makers supports a 30 percent recycled content requirement in plastic packaging by 2030, and for 100 percent of plastic packaging to be “reused, recycled, or recovered by 2040.”

    The industry’s second solution: Infrastructure and design changes

    Additional policies supported by industry groups could indirectly facilitate an increase in the plastics recycling rate by raising money for recycling infrastructure. These policies typically involve systems for “extended producer responsibility,” or EPR, requiring companies that make and sell plastics to help pay for the collection and recycling of the waste they generate, as well as the cleanup of existing plastic pollution. Every industry group Grist reached out to says it supports EPR as a part of the treaty, although some specifically note in their policy documents that such policies should be adopted at the local or national level, rather than globally. Some groups, including the American Chemistry Council and Global Partners for Plastics Circularity — an umbrella group supported by a dozen plastics associations and companies — also call more vaguely for additional financing through “public-private partnerships and blended finance.”

    For plastic packaging — which accounts for about 36 percent of global plastic production — a European industry consortium called the Circular Economy for Flexible Packaging supports “mandatory legislation on product design” to make products easier to recycle. It doesn’t back any specific design elements, but points to ideas laid out by the Consumer Goods Forum, an industry-led network of consumer product retailers and manufacturers. These ideas include using clear instead of colored plastics, limiting the use of unnecessary plastic wrap, and ensuring that any adhesives or inks applied to plastic packaging don’t render it nonrecyclable. Plastics Europe additionally supports technical and design standards for biodegradable and compostable plastics intended to replace those made from fossil fuels.

    Many groups also say they support targets for “pellet containment,” referring to the tiny plastic pieces that are melted down and shaped into larger items. These pellets are notorious for spilling out of manufacturing facilities or off of cargo ships and into waterways; in Europe, 20 truckloads of them escape into the environment every day. Several trade groups say in their public statements that they support an industry-led program called Operation Clean Sweep to help companies achieve “zero resin loss” by “fostering a venue for precompetitive collaboration and peer-learning opportunities.”

    However, Operation Clean Sweep has been around since 1991 and has not yet achieved its goal; some policymakers have recently called for stricter regulations on plastic pellet loss.

    The industry’s third solution: Application-based regulations

    In addition to capping plastic production, many countries’ delegates — along with scientists and environmental groups — would like the treaty to ban or restrict some of the most problematic plastic polymers, as well as certain chemicals used in plastics. They call these “chemicals and polymers of concern,” meaning those least likely to be recycled, or most likely to damage people’s health and the environment. Potential candidates include polyvinyl chloride, widely used in water pipes, upholstery, toys, and other applications; expanded polystyrene, or EPS, the foamy plastic that’s often used in takeout food containers; and endocrine-disrupting chemicals such as phthalates, bisphenols, and per- and polyfluoroalkyl substances.

    The general idea of identifying problematic chemicals and polymers in the plastics treaty is very popular; observers of the negotiations say it’s been one of the areas of greatest convergence among delegates. Industry groups are also supportive — but only of a very specific approach. According to the World Plastics Council, the treaty shouldn’t include “arbitrary bans or restrictions on substances or materials,” but rather regulations based on the “essential use and societal value” of particular types of plastic.

    For instance, polystyrene used in packing peanuts and takeout containers is virtually never recycled and might be a good candidate for restriction. But the Global Expanded Polystyrene Sustainability Alliance — a trade group for EPS makers — points to evidence that, in Europe and Japan, the material can be recycled at least 30 percent of the time when it’s in a different format — namely, insulation for products like coolers, as well as big pieces used to protect fragile shipments.

    In a press release, the group said this distinction in polystyrene formatting demonstrates the need to assess plastics’ “individual material applications and uses independently.”

    “We’ve got five major types” of polystyrene, said Betsy Bowers, executive director of the Expanded Polystyrene Industry Alliance, a trade group representing the U.S. EPS market. “Some of them can be recycled, and some of them can’t.”

    Plastics Europe has said an application-based approach could also consider plastic products on the basis of “leakage,” how easily the products become litter; the feasibility of redesigning them; or “effects on human or animal health.” That said, the organization does not support restricting plastic-related chemicals as part of the treaty, beyond what is already spelled out in existing international agreements like the Stockholm Convention. The International Council of Chemical Associations, whose members include individual chemical manufacturers and regional trade groups, does not support any chemical regulation as part of the treaty.

    In an email to Grist, the American Chemistry Council said it supports a “decision-tree approach” to prevent specific plastic products from leaking into the environment. The organization said in a letter sent to President Joe Biden last May that it opposes “restrictions of trade in chemicals or polymers” because they would “make U.S. manufacturers less competitive and/or jeopardize the many benefits plastics provide to the economy and the environment.”

    The International Council of Chemical Associations, the Plastics Industry Association, and the Circular Economy for Flexible Packaging initiative did not respond to Grist’s request to be interviewed for this story, or to questions about the policies they support.

    The impact of the plastic industry’s favorite policies

    While it’s clear that self-preservation is at the heart of the petrochemical industry’s agenda for the plastics treaty, the policies it supports could have a positive impact on plastic pollution. According to the policy analysis tool created by researchers at the University of California, Berkeley and the University of California, Santa Barbara, a suite of ambitious policies to hit recycling and recycled content rates of 20 percent, reuse 60 percent of plastic packaging (where applicable), and dedicate $35 billion to plastics recycling and waste infrastructure could prevent 43 million metric tons of plastic pollution annually by midcentury. Most of this reduction would come from the infrastructure funding.

    McCauley, one of the creators of the tool, said these policies are certainly better than nothing. They can bring the world “closer to a future without plastic pollution,” he told Grist, although he emphasized that recycling is not a silver bullet.

    The policy tool takes for granted that higher recycling and recycled content rates are achievable, but this might not be the case. Bjorn Beeler, executive director and international coordinator for the nonprofit International Pollutants Elimination Network, said a 20 percent recycling rate would be “nearly impossible” to reach, given the relatively low cost of virgin plastic and the petrochemical industry’s projected expansion over the coming decades. Jan Dell, an independent chemical engineer and founder of the nonprofit The Last Beach Cleanup, estimated the maximum possible recycled content rate for consumer product packaging would be about 5 percent, due to insurmountable technological constraints related to plastics’ toxicity.

    Experts tend to favor plastic production caps as a much faster, more reliable, and more straightforward way to reduce plastic pollution than relying on recycling. According to McCauley’s policy tool, capping plastic production at the level reached in 2019 would prevent 48 million metric tons of annual plastic pollution by 2050 — even in the absence of any efforts to boost recycling or fund waste management. “It’s possible to be effective without the cap,” said Sam Pottinger, a senior research data scientist at the University of California, Berkeley, and a contributor to the policy tool. “But it requires a huge amount of effort elsewhere.”

    There’s no reason the plastics treaty couldn’t incorporate a production cap in addition to the industry’s preferred recycling interventions. Some experts say this would form the most effective agreement; according to the policy tool, a production cap at 2019 levels plus the suite of recycling targets and funding for waste infrastructure could prevent nearly 78 million metric tons of annual plastic pollution by 2050. Bumping up the funding for recycling and waste infrastructure to an aggressive $200 billion, in combination with the production cap and other policies, would avert nearly 109 million metric tons of pollution each year.

    “We need to use all of the tools in our toolbox,” said Zoie Diana, a postdoctoral plastics researcher at the University of Toronto who was not involved in creating the policy tool. She too emphasized, however, that governments should prioritize reducing plastic production.

    What the industry doesn’t like to talk about

    The case for a production cap goes beyond plastic litter concerns. It would also address the inequitable impact of toxic pollution from plastic manufacturing facilities, as well as the industry’s contribution to climate change. In April, a study from the Lawrence Berkeley National Laboratory found that plastic production already accounts for 5 percent of global climate pollution, and that by 2050 — given the petrochemical industry’s plans to dramatically ramp up plastic production — it could eat up one-fifth of the world’s remaining carbon budget, the amount of emissions the world can release while still limiting global warming to 1.5 degrees Celsius (2.7 degrees Fahrenheit). To achieve international climate goals, some environmental groups have estimated that the world must reduce plastic production by 12 to 17 percent every year starting in 2024.

    “Whether the treaty includes plastic production cuts is not just a policy debate,” said Jorge Emmanuel, an adjunct professor at Silliman University in the Philippines, in a statement describing the mountains of plastic trash that are harming Filipino communities. “It’s a matter of survival.”

    Petrochemical companies, for their part, do not deeply engage with these arguments — at least not in their public policy documents. They claim that plastics actually help mitigate climate change, since the lightweight material takes less fuel to transport than alternatives made of metal and glass. And industry groups’ public statements mostly do not address environmental justice concerns related to plastic use, production, and disposal, except to vaguely say that the treaty shouldn’t harm waste pickers — the millions of workers, most of them in the developing world, who make a living collecting plastic trash and selling it to recyclers.

    The fifth and final round of negotiations for the plastics treaty is scheduled to take place in Busan, South Korea, this November. Although many observers, including a group of U.S. Congressional representatives and the U.N. High Commissioner for Human Rights, have called for conflict-of-interest policies to limit trade groups’ influence over the talks, these requests face long odds. The dozens of countries advocating for production limits may have to defend their proposals against an even larger industry presence than they did at the last session in Ottawa.

    This article originally appeared in Grist at https://grist.org/accountability/petrochemical-industry-global-plastics-treaty-production-cap-recycling-policies/. Grist is a nonprofit, independent media organization dedicated to telling stories of climate solutions and a just future. Learn more at Grist.org

    Joseph Winters, Grist

  • ‘Category 5’ was considered the worst hurricane. There’s something scarier, study says.


    As fearsome as Category 5 hurricanes can be for people living in harm’s way, a new study reports global warming is supercharging some of the most intense cyclones with winds high enough to merit a hypothetical Category 6.

    The world’s most intense hurricanes are growing even more intense, fueled by rising temperatures in the ocean and atmosphere, according to the study published Monday in the Proceedings of the National Academy of Sciences. And, the authors say, a Category 5 on the traditional wind scale underestimates their dangers.

    “As a cautious scientist, you never want to cry wolf,” said Michael Wehner, co-author and climate scientist at the Lawrence Berkeley National Laboratory. But after searching for the signature of climate change in the world’s most intense cyclones, Wehner said he and co-author Jim Kossin found “the wolf is here.”

    “Significantly increasing” temperatures, fueled by greenhouse gas emissions, up the energy available to the most intense tropical cyclones, reported Wehner and Kossin, a retired federal scientist and science advisor at the nonprofit First Street Foundation.

    More cyclones are making the most of it, gaining higher wind speeds and more intensity, the authors said, and their evidence shows that will occur even more often as the world grows warmer.

    They used a hypothetical Category 6, with a minimum threshold of 192 mph, to study hurricanes that have occurred in the modern satellite era, since around 1980. They found five hurricanes and typhoons that would have met the criteria, all of which occurred within the last decade.

    To be clear, they aren’t proposing adding that category to the National Hurricane Center’s wind scale, which experts say would require a lengthy process and many partners. But they are hoping to “inform broader discussions about how to better communicate risk in a warming world,” Kossin told USA TODAY.

    Their findings emphasize that the dangers associated with a Category 5 cyclone are increasing as storms intensify well beyond the category’s 157-mph threshold, resulting in an underestimation of risk, he said.

    An enhanced satellite image released by the National Oceanic and Atmospheric Administration on Oct. 23, 2015, shows Hurricane Patricia as it approaches the coastline of Mexico from the Eastern Pacific.

    They found the chances of that potential intensity occurring in such storms have more than doubled since 1979. They say the areas where the growing risks of these storms are of greatest concern are the Gulf of Mexico, the Philippines, parts of Southeast Asia and Australia.

    Their peer-reviewed, scientific research provides the evidence pointing to climate change that some scientists have been waiting for.

    For more than 35 years, the scientific community has expected to see thermodynamic wind speeds increase in hurricanes, said Kerry Emanuel, the climate scientist who edited the paper for the journal. “And now we are seeing this increase in both climate analyses and models.”

    What is the Saffir-Simpson Hurricane Wind Scale?

    The hurricane center has used the well-known scale – with wind speed ranges for each of five categories – since the 1970s. The minimum threshold for Category 5 winds is 157 mph.
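    As an illustration only (this is not a National Hurricane Center tool), the scale's published wind thresholds — plus the study's hypothetical 192-mph Category 6 — can be encoded as a simple lookup:

```python
# Illustrative sketch: classify a sustained wind speed (mph) on the
# Saffir-Simpson scale, optionally adding the study's hypothetical
# Category 6 starting at 192 mph. Returns 0 below hurricane strength.
def saffir_simpson_category(wind_mph: float, hypothetical_cat6: bool = False) -> int:
    thresholds = [(74, 1), (96, 2), (111, 3), (130, 4), (157, 5)]
    if hypothetical_cat6:
        thresholds.append((192, 6))
    category = 0
    for minimum_mph, cat in thresholds:
        if wind_mph >= minimum_mph:
            category = cat
    return category

# Hurricane Patricia's 216-mph peak is a Category 5 on today's scale...
print(saffir_simpson_category(216))                          # 5
# ...but would fall in the study's hypothetical Category 6.
print(saffir_simpson_category(216, hypothetical_cat6=True))  # 6
```

    Because the top category is open-ended, a 157-mph storm and a 216-mph storm land in the same bucket — precisely the communication gap the study's authors highlight.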

    Designed by engineer Herbert Saffir and adapted by former center director Robert Simpson, the scale stops at Category 5 since winds that high would “cause rupturing damages that are serious no matter how well it’s engineered,” Simpson said during a 1999 interview.

    The Saffir-Simpson scale categorizes hurricanes.

    The open-ended Category 5 describes anything from “a nominal Category 5 to infinity,” Kossin said. “That’s becoming more and more inadequate with time because climate change is creating more and more of these unprecedented intensities.”

    A Category 6?

    Scientists, including Kossin, have occasionally brought up adding another category to the scale for more than 20 years.

    Climate scientist Michael Mann, director of the Penn Center for Science, Sustainability & the Media at the University of Pennsylvania, has argued for years that the Earth is “experiencing a new class of monster storms – ‘Category 6’ – hurricanes,” thanks to the effects of human-caused warming.

    Mann wrote a commentary to the Wehner and Kossin study, published in the same journal Monday, saying their work lays out an objective case for expanding the scale to include the “climate change-fueled stronger and more destructive storms.”

    “We are witnessing hurricanes that – by any logical extension of the existing Saffir-Simpson scale – deserve to be placed in a whole separate, more destructive category from the traditionally defined (category 5) ‘strongest’ storms,” Mann wrote.

    The research adds to a growing discussion about how the center, emergency managers and others could better convey the full range of hazards from a major hurricane.

    Hurricane scale doesn’t measure other, greater risks

    The Saffir-Simpson scale only describes the wind risk and does not account for coastal storm surge and rainfall-driven flooding, the two biggest killers in hurricanes.

    Adding a sixth category to the wind scale wouldn’t help address those concerns, Kossin said.

    The hurricane center has tried to steer the focus toward the individual hazards, including storm surge, wind, rainfall, tornadoes and rip currents, Jamie Rhome, the center’s deputy executive director, said last week. “So, we don’t want to over-emphasize the wind hazard by placing too much emphasis on the category.”

    Despite the center’s efforts, the storm’s wind category always gets the most attention from the public when a storm approaches.

    “That focus on category over the years has detracted from effective communication of the other hazards,” said James Franklin, a retired branch chief for hurricane specialists at the hurricane center. “The emphasis at the NHC, rightly, has been to focus on the hazards,” he said.

    Ultimately, the decision would likely rest with the center, but Kossin said the conversation would “have to happen over time with a lot of input” from the Federal Emergency Management Agency, social scientists and others.

    It’s likely the World Meteorological Organization would be asked to weigh in because of the international scope involved in hurricane and typhoon forecasting, Franklin said. That’s the same group that sets the list of hurricane names for each season.

    To Franklin, the question is what would a sixth category accomplish?

    “If there are things that emergency managers would do differently, or the public might do differently because a storm has 195 mph winds versus 160 mph winds, then maybe the categories should be changed,” he said. “Personally, I’m getting out of the way if it’s 165 mph winds or 195 mph winds.”

    This infrared satellite image shows Hurricane Patricia over the Pacific Ocean on Oct. 23, 2015.

    Which storms fit the study’s hypothetical Category 6 description?

    One hurricane in the eastern Pacific, Patricia, and four typhoons in the western Pacific:

    Haiyan, November 2013: Struck the southern Philippines with 196-mph winds and a storm surge of almost 25 feet, killing 6,300 people and leaving 4 million homeless.

    Patricia, October 2015: Reached winds of 216 mph at sea, then weakened before making landfall in Jalisco, Mexico, as a Category 4 storm.

    Meranti, September 2016: Moved between the Philippines and Taiwan before making landfall in eastern China. Its winds reached 196 mph.

    Goni, November 2020: Made landfall in the Philippines with winds estimated at 196 mph.

    Surigae, April 2021: Reached wind speeds of 196 mph over the ocean, tracking east of the Philippines. Its max winds were the highest ever recorded for a storm from January to April anywhere in the world.

    Dinah Voyles Pulver covers climate and environmental issues for USA TODAY. Reach her at dpulver@gannett.com or @dinahvp.

    This article originally appeared on USA TODAY: Category 6 hurricane? That’s what a new study suggests. Here’s why.

  • Climate Change Likely to Uproot More Amazon Trees


    Newswise — Tropical forests are crucial for sucking up carbon dioxide from the atmosphere. But they’re also subject to intense storms that can cause “windthrow” – the uprooting or breaking of trees. These downed trees decompose, potentially turning a forest from a carbon sink into a carbon source.

    A new study finds that more extreme thunderstorms from climate change will likely cause a greater number of large windthrow events in the Amazon rainforest. This is one of the few ways that researchers have developed a link between storm conditions in the atmosphere and forest mortality on land, helping fill a major gap in models.

    “Building this link between atmospheric dynamics and damage at the surface is very important across the board,” said Jeff Chambers, a senior faculty scientist at the Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab), and director of the Next Generation Ecosystem Experiments (NGEE)-Tropics project, which performed the research. “It’s not just for the tropics. It’s high-latitude, low-latitude, temperate-latitude, here in the U.S.”

    Researchers found that the Amazon will likely experience 43% more large blowdown events (of 25,000 square meters or more) by the end of the century. The area of the Amazon likely to see extreme storms that trigger large windthrows will also increase by about 50%. The study was published in the journal Nature Communications on Jan. 6.

    “We want to know what these extreme storms and windthrows mean in terms of the carbon budget and carbon dynamics, and for carbon sinks in the forests,” Chambers said. While downed trees slowly release carbon as they decompose, the open forest becomes host to new plants that pull carbon dioxide from the air. “It’s a complicated system, and there are still a lot of pieces of the puzzle that we’re working on. In order to answer the question more quantitatively, we need to build out the land-atmosphere links in Earth system models.”  

    To find the link between air and land, researchers compared a map of more than 1,000 large windthrows with atmospheric data. They found that a measurement known as CAPE, the “convective available potential energy,” was a good predictor of major blowdowns. CAPE measures the amount of energy available to move parcels of air vertically, and a high value of CAPE often leads to thunderstorms. More extreme storms can come with intense vertical winds, heavy rains or hail, and lightning, which interact with trees from the canopy down to the soil.
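For reference, the textbook definition of CAPE integrates a parcel's buoyancy from the level of free convection (LFC) to the equilibrium level (EL). This standard form is included for context; it is not reproduced from the paper itself:

```latex
% CAPE: energy per unit mass (J/kg) available to accelerate a parcel upward.
% T_{v,p} and T_{v,e} are the virtual temperatures of the rising parcel
% and of the surrounding environment; g is gravitational acceleration.
\mathrm{CAPE} \;=\; g \int_{z_{\mathrm{LFC}}}^{z_{\mathrm{EL}}}
  \frac{T_{v,p}(z) - T_{v,e}(z)}{T_{v,e}(z)} \, dz
```

Higher CAPE means more energy available for vertical motion, which is why it serves as a useful predictor of the violent updrafts and downdrafts behind large blowdowns.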

    “Storms account for over half of the forest mortality in the Amazon,” said Yanlei Feng, first author on the paper. “Climate change has a lot of impact on Amazon forests, but so far, a large fraction of the research focus has been on drought and fire. We hope our research brings more attention to extreme storms and improves our models to work under a changing environment from climate change.”

    While this study looked at a future with high carbon emissions (a scenario known as SSP5-8.5), scientists could use projected CAPE data to explore windthrow impacts in different emissions scenarios. Researchers are now working to integrate the new forest-storm relationship into Earth system models. Better models will help scientists explore how forests will respond to a warmer future – and whether they can continue to siphon carbon out of the atmosphere or will instead become a contributor.

    “This was a very impactful climate change study for me,” said Feng, who completed the research as a graduate student researcher in the NGEE-Tropics project at Berkeley Lab. She now studies carbon capture and storage at the Carnegie Institution for Science at Stanford University. “I’m worried about the projected increase in forest disturbances in our study and I hope I can help limit climate change. So now I’m working on climate change solutions.” 

    NGEE-Tropics is a ten-year, multi-institutional project funded by the U.S. Department of Energy’s Office of Science, Office of Biological and Environmental Research.

    ###

    Founded in 1931 on the belief that the biggest scientific challenges are best addressed by teams, Lawrence Berkeley National Laboratory and its scientists have been recognized with 16 Nobel Prizes. Today, Berkeley Lab researchers develop sustainable energy and environmental solutions, create useful new materials, advance the frontiers of computing, and probe the mysteries of life, matter, and the universe. Scientists from around the world rely on the Lab’s facilities for their own discovery science. Berkeley Lab is a multiprogram national laboratory, managed by the University of California for the U.S. Department of Energy’s Office of Science.

    DOE’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time. For more information, please visit energy.gov/science.

  • Electronic bridge allows rapid energy sharing between semiconductors

    Newswise — As semiconductor devices become ever smaller, researchers are exploring two-dimensional (2D) materials for potential applications in transistors and optoelectronics. Controlling the flow of electricity and heat through these materials is key to their functionality, but first we need to understand the details of those behaviors at atomic scales.

    Now, researchers have discovered that electrons play a surprising role in how energy is transferred between layers of 2D semiconductor materials tungsten diselenide (WSe2) and tungsten disulfide (WS2). Although the layers aren’t tightly bonded to one another, electrons provide a bridge between them that facilitates rapid heat transfer, the researchers found.

    “Our work shows that we need to go beyond the analogy of Lego blocks to understand stacks of disparate 2D materials, even though the layers aren’t strongly bonded to one another,” said Archana Raja, a scientist at the Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab), who led the study. “The seemingly distinct layers, in fact, communicate through shared electronic pathways, allowing us to access and eventually design properties that are greater than the sum of the parts.”

    The study appeared recently in Nature Nanotechnology and combines insights from ultrafast, atomic-scale temperature measurements and extensive theoretical calculations.

    “This experiment was motivated by fundamental questions about atomic motions in nanoscale junctions, but the findings have implications for energy dissipation in futuristic electronic devices,” said Aditya Sood, co-first author of the study and currently a research scientist at Stanford University. “We were curious about how electrons and atomic vibrations couple to one another when heat flows between two materials. By zooming into the interface with atomic precision, we uncovered a surprisingly efficient mechanism for this coupling.”

    An ultrafast thermometer with atomic precision

    The researchers studied devices consisting of stacked monolayers of WSe2 and WS2. The devices were fabricated by Raja’s group at Berkeley Lab’s Molecular Foundry, who perfected the art of using Scotch tape to lift off crystalline monolayers of the semiconductors, each less than a nanometer in thickness. Using polymer stamps aligned under a home-built stacking microscope, these layers were deposited on top of each other and precisely placed over a microscopic window to enable the transmission of electrons through the sample.

    In experiments conducted at the Department of Energy’s SLAC National Accelerator Laboratory, the team used a technique known as ultrafast electron diffraction (UED) to measure the temperatures of the individual layers while optically exciting electrons in just the WSe2 layer. The UED served as an “electron camera”, capturing the atom positions within each layer. By varying the time interval between the excitation and probing pulses by trillionths of a second, they could track the changing temperature of each layer independently, using theoretical simulations to convert the observed atomic movements into temperatures.

    “What this UED approach enables is a new way of directly measuring temperature within this complex heterostructure,” said Aaron Lindenberg, a co-author on the study at Stanford University. “These layers are only a few angstroms apart, and yet we can selectively probe their response and, as a result of the time resolution, can probe at fundamental time scales how energy is shared between these structures in a new way.”

    They found that the WSe2 layer heated up, as expected, but to their surprise, the WS2 layer also heated up in tandem, suggesting a rapid transfer of heat between layers. By contrast, when they didn’t excite electrons in the WSe2 and heated the heterostructure using a metal contact layer instead, the interface between WSe2 and WS2 transmitted heat very poorly, confirming previous reports.

    “It was very surprising to see the two layers heat up almost simultaneously after photoexcitation and it motivated us to zero in on a deeper understanding of what was going on,” said Raja.

    An electronic “glue state” creates a bridge

    To understand their observations, the team employed theoretical calculations, using methods based on density functional theory to model how atoms and electrons behave in these systems with support from the Center for Computational Study of Excited-State Phenomena in Energy Materials (C2SEPEM), a DOE-funded Computational Materials Science Center at Berkeley Lab.

    The researchers conducted extensive calculations of the electronic structure of layered 2D WSe2/WS2, as well as the behavior of lattice vibrations within the layers. Like squirrels traversing a forest canopy, which can run along paths defined by branches and occasionally jump between them, electrons in a material are limited to specific states and transitions (known as scattering), and knowledge of that electronic structure provides a guide to interpreting the experimental results.

    “Using computer simulations, we explored where the electron in one layer initially wanted to scatter to, due to lattice vibrations,” said Jonah Haber, co-first author on the study and now a postdoctoral researcher in the Materials Sciences Division at Berkeley Lab. “We found that it wanted to scatter to this hybrid state – a kind of ‘glue state’ where the electron is hanging out in both layers at the same time. We have a good idea of what these glue states look like now and what their signatures are and that lets us say relatively confidently that other, 2D semiconductor heterostructures will behave the same way.”

    Large-scale molecular dynamics simulations confirmed that, in the absence of the shared electron “glue state”, heat took far longer to move from one layer to another. These simulations were conducted primarily at the National Energy Research Scientific Computing Center (NERSC).

    “The electrons here are doing something important: they are serving as bridges to heat dissipation,” said Felipe de Jornada, a co-author from Stanford University. “If we can understand and control that, it offers a unique approach to thermal management in semiconductor devices.”

    NERSC and the Molecular Foundry are DOE Office of Science user facilities at Berkeley Lab.

    This research was funded primarily by the Department of Energy’s Office of Science.  

    ### 

  • Berkeley Lab Scientists Develop a Cool New Method of Refrigeration

    Newswise — Adding salt to a road before a winter storm changes when ice will form. Researchers at the Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab) have applied this basic concept to develop a new method of heating and cooling. The technique, which they have named “ionocaloric cooling,” is described in a paper published Dec. 23 in the journal Science.

    Ionocaloric cooling takes advantage of how energy, or heat, is stored or released when a material changes phase – such as changing from solid ice to liquid water. Melting a material absorbs heat from the surroundings, while solidifying it releases heat. The ionocaloric cycle causes this phase and temperature change through the flow of ions (electrically charged atoms or molecules) that come from a salt.

    Researchers hope that the method could one day provide efficient heating and cooling, which accounts for more than half of the energy used in homes, and help phase out current “vapor compression” systems, which use gases with high global warming potential as refrigerants. Ionocaloric refrigeration would eliminate the risk of such gases escaping into the atmosphere by replacing them with solid and liquid components.

    “The landscape of refrigerants is an unsolved problem: No one has successfully developed an alternative solution that makes stuff cold, works efficiently, is safe, and doesn’t hurt the environment,” said Drew Lilley, a graduate research assistant at Berkeley Lab and PhD candidate at UC Berkeley who led the study. “We think the ionocaloric cycle has the potential to meet all those goals if realized appropriately.”

    Finding a solution that replaces current refrigerants is essential for countries to meet climate change goals, such as those in the Kigali Amendment (accepted by 145 parties, including the United States in October 2022). The agreement commits signatories to reduce production and consumption of hydrofluorocarbons (HFCs) by at least 80% over the next 25 years. HFCs are powerful greenhouse gases commonly found in refrigerators and air conditioning systems, and can trap heat thousands of times as effectively as carbon dioxide.

    The new ionocaloric cycle joins several other kinds of “caloric” cooling in development. Those techniques use different methods – including magnetism, pressure, stretching, and electric fields – to manipulate solid materials so that they absorb or release heat. Ionocaloric cooling differs by using ions to drive solid-to-liquid phase changes. Using a liquid has the added benefit of making the material pumpable, making it easier to get heat in or out of the system – something solid-state cooling has struggled with.

    Lilley and corresponding author Ravi Prasher, a research affiliate in Berkeley Lab’s Energy Technologies Area and adjunct professor in mechanical engineering at UC Berkeley, laid out the theory underlying the ionocaloric cycle. They calculated that it has the potential to compete with or even exceed the efficiency of gaseous refrigerants found in the majority of systems today.

    They also demonstrated the technique experimentally. Lilley used a salt made with iodine and sodium, alongside ethylene carbonate, a common organic solvent used in lithium-ion batteries. 

    “There’s potential to have refrigerants that are not just GWP [global warming potential]-zero, but GWP-negative,” Lilley said. “Using a material like ethylene carbonate could actually be carbon-negative, because you produce it by using carbon dioxide as an input. This could give us a place to use CO2 from carbon capture.”

    Running current through the system moves the ions, changing the material’s melting point. When it melts, the material absorbs heat from the surroundings, and when the ions are removed and the material solidifies, it gives heat back. The first experiment showed a temperature change of 25 degrees Celsius using less than one volt, a greater temperature lift than demonstrated by other caloric technologies.
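For intuition only: the melting-point shift that drives the cycle is a colligative effect, so it scales with how many dissolved ions are present. The sketch below uses the standard freezing-point-depression relation with hypothetical round numbers for the cryoscopic constant and salt molality (the study's actual materials data are not reproduced here); the van 't Hoff factor of 2 reflects NaI dissociating into Na+ and I- ions.

```python
def melting_point_depression(k_f, molality, van_t_hoff=2):
    """Colligative estimate (in kelvin) of how far dissolved ions
    lower a solvent's melting point: dT = i * Kf * m."""
    return van_t_hoff * k_f * molality

# Hypothetical inputs, not measured values from the study.
delta_t = melting_point_depression(k_f=6.0, molality=2.0)
print(delta_t)  # 24.0 K of depression under these assumed inputs
```

Adding or removing ions electrically, rather than by mixing, is what lets the cycle shift the melting point back and forth on demand.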

    “There are three things we’re trying to balance: the GWP of the refrigerant, energy efficiency, and the cost of the equipment itself,” Prasher said. “From the first try, our data looks very promising on all three of these aspects.”

    While caloric methods are often discussed in terms of their cooling power, the cycles can also be harnessed for applications such as water heating or industrial heating. The ionocaloric team is continuing work on prototypes to determine how the technique might scale to support large amounts of cooling, improve the amount of temperature change the system can support, and improve the efficiency. 

    “We have this brand-new thermodynamic cycle and framework that brings together elements from different fields, and we’ve shown that it can work,” Prasher said. “Now, it’s time for experimentation to test different combinations of materials and techniques to meet the engineering challenges.”

    This work was supported by the DOE’s Energy Efficiency and Renewable Energy Building Technologies Program.

    ###

  • FRIB Experiment Pushes Elements to the Limit

    Newswise — A new study led by the Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab) has measured how long it takes for several kinds of exotic nuclei to decay. The paper, published today in Physical Review Letters, marks the first experimental result from the Facility for Rare Isotope Beams (FRIB), a DOE Office of Science user facility operated by Michigan State University.

    Scientists used the one-of-a-kind facility to better understand nuclei, the collection of protons and neutrons found at the heart of atoms. Understanding these basic building blocks allows scientists to refine their best models and has applications in medicine, national security, and industry.

    “The breadth of the facility and the programs that are being pursued are really exciting to watch,” said Heather Crawford, a physicist at Berkeley Lab and lead spokesperson for the first FRIB experiment. “Research is going to be coming out in different areas that will impact things we haven’t even thought of yet. There’s so much discovery potential.”

    The first experiment is just a small taste of what’s to come at the facility, which will become 400 times more powerful over the coming years. “It’s going to be really exciting – mind-blowing, honestly,” Crawford said.

    More than 50 participants from ten universities and national laboratories were involved in the first experiment. The study looked at isotopes of several elements. Isotopes are variations of a particular element; they have the same number of protons but can have different numbers of neutrons.

    Researchers focused on unstable isotopes near the “drip-line,” the spot where neutrons can no longer bind to a nucleus. Instead, any additional neutrons drip off, like water from a saturated kitchen sponge.

    Researchers smashed a beam of stable calcium-48 nuclei traveling at about 60% of the speed of light into a beryllium target. The calcium fragmented, producing a slew of isotopes that were separated, individually identified, and delivered to a sensitive detector that measured how long they took to decay. The result? The first reported measurements of half-lives for five exotic, neutron-laden isotopes of phosphorus, silicon, aluminum, and magnesium.

    Half-life measurements (perhaps best known from applications in carbon dating) are one of the first things researchers can observe about these short-lived particles. The fundamental information about nuclei at the limits of their existence provides a useful test for different models of the atomic world.
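The half-life itself fixes the entire survival curve. A minimal sketch of that relationship, using illustrative numbers rather than the measured values reported in the paper:

```python
def remaining_fraction(t, half_life):
    """Fraction of an isotope's nuclei that survive after time t,
    given its half-life (both in the same time units)."""
    return 0.5 ** (t / half_life)

# After one half-life, half remain; after three, one-eighth.
print(remaining_fraction(1.0, 1.0))  # 0.5
print(remaining_fraction(3.0, 1.0))  # 0.125
```

In practice the detector records many individual decays and the half-life is fit from that aggregate curve, which is why higher beam intensity (more isotopes produced) directly improves the measurement.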

    “This is a basic science question, but it links to the bigger picture for the field,” Crawford said. “Our aim is to describe not only these nuclei, but all kinds of nuclei. These models help us fill in the gaps, which helps us more reliably predict things we haven’t been able to measure yet.”

    More complete theories help advance research in areas such as astrophysics and nuclear physics – for example, understanding how elements form in exploding stars or how processes unfold in nuclear reactors.

    Crawford and the team plan to repeat the half-life experiment next year, taking advantage of additional beam intensity that will increase the number of isotopes produced, including rare isotopes near the neutron drip-line. In the meantime, other groups will take advantage of the facility’s many beamlines and instruments.

    “Bringing the facility online was a big effort by a lot of people, and something the community has been looking forward to for a long time,” Crawford said. “I’m excited I am young enough to keep taking advantage of it for the next several decades.”

    Multiple institutions collaborated on the first experiment, with researchers from Argonne National Laboratory (ANL), Berkeley Lab, Brookhaven National Laboratory, Florida State University, FRIB, Lawrence Livermore National Laboratory, Louisiana State University, Los Alamos National Laboratory, Mississippi State University, Oak Ridge National Laboratory (ORNL), and the University of Tennessee Knoxville (UTK).

    Scientists from ORNL, UTK, ANL and FRIB led the collaboration to provide the instruments used in the FRIB Decay Station initiator, the sensitive detector system that measured the isotopes.

    Michigan State University (MSU) operates the Facility for Rare Isotope Beams (FRIB) as a user facility for the U.S. Department of Energy Office of Science (DOE-SC), supporting the mission of the DOE-SC Office of Nuclear Physics. Hosting what is designed to be the most powerful heavy-ion accelerator, FRIB enables scientists to make discoveries about the properties of rare isotopes in order to better understand the physics of nuclei, nuclear astrophysics, fundamental interactions, and applications for society, including in medicine, homeland security, and industry.

    ###

  • After Fire and Monsoons, DESI Resumes Cataloguing the Cosmos

    Newswise — On June 11, lightning struck a remote ridge in the Baboquivari Mountain range outside of Tucson, Arizona. Within days, the Contreras Fire had traveled eight miles and climbed Kitt Peak, a 6,800-foot mountain dotted with white telescope domes. Within one was the Dark Energy Spectroscopic Instrument (DESI), the heart of a next-generation sky survey that is creating the largest 3D map of the universe.

    Researchers use DESI to study dark energy, the mysterious force accelerating the expansion of our universe. It’s a clue into the fundamental workings of nature, how the universe evolved, and how it may end.

    Collaborators who had spent years designing, building, and running the instrument watched the flames sweep over the observatory’s southern ridge on webcams – until the power went out. They switched to watching the curlicue paths of planes dropping fire retardant. 

    When the smoke had cleared, teams returned to find something astounding: All of the scientific equipment was intact. For several weeks, they carefully cleaned components and turned DESI’s systems on one by one. On Sept. 10, DESI began imaging the night sky once again.

    “We’re relieved to return to our science with equipment that is performing almost as well as it was before the fire,” said Michael Levi, director of the international DESI collaboration and a scientist at the Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab), which manages the project. “I’m eternally grateful to the firefighters and the crews who secured the site, and their patience and ingenuity getting things running again.”  

    It’s not entirely business as usual yet, since the fire knocked out power lines and the high-speed network normally used to transmit data. The telescope is temporarily powered by a generator, and the information recorded each night has to take a more circuitous route to researchers around the globe. Each day, the data (roughly 80 gigabytes’ worth on a clear night, capturing about 150,000 celestial objects) is loaded onto an external hard drive and driven down the winding mountain road, past the recently charred mesquite and rebounding wild grasses, for processing in Tucson.
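A quick back-of-envelope check on those figures (80 GB and roughly 150,000 objects on a clear night) gives the average data footprint per observed object:

```python
nightly_bytes = 80e9        # ~80 GB recorded on a clear night
objects_per_night = 150_000  # celestial objects captured per night
per_object_mb = nightly_bytes / objects_per_night / 1e6
print(round(per_object_mb, 2))  # roughly 0.53 MB per celestial object
```

About half a megabyte per galaxy or quasar, mostly spectra from the instrument's image sensors.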

    DESI owes much of the successful restart to quick actions by crews on the mountain who secured the precious equipment.

    “We’ve performed tests during the restart and found little loss in performance despite the terrible conditions that the mountain experienced,” said Claire Poppett, one of DESI’s lead observers and a physicist at UC Berkeley’s Space Sciences Laboratory. “The work that the crew did to protect the instrument was phenomenal, and we wouldn’t be in the good shape we are in without it.”

    Fire on the mountain

    DESI is housed in the Nicholas U. Mayall 4-meter Telescope at Kitt Peak National Observatory. As the fire approached, non-essential staff evacuated. A small team stayed behind to secure the site as best they could. They rotated the telescope to face away from the oncoming smoke, powered off electronics, and covered the mirror and lenses that image galaxies billions of light-years away.

    “The most important thing was the optics,” said Matthew Evatt, the mechanical engineering manager at NSF’s NOIRLab, which operates the Mayall Telescope with funding provided by DOE. “We scrounged around and found tarps and plastic left over from way earlier in DESI.”

    Evatt and Bob Stupak, the electronics maintenance supervisor at NOIRLab, climbed ladders and secured the plastic sheets over the 4-meter-diameter mirror using bungee cords, ratchet straps, and electrical tape. They maneuvered the telescope and a scissor lift to access and cover DESI’s corrector barrel, which holds six glass lenses in alignment. Soon, they too evacuated, leaving behind only firefighters and two NOIRLab employees familiar with the site: Fred Wortman and Zade Arnold.

    “This place is a second home to me,” said Arnold, the site’s environmental health and safety technician who grew up close to the observatory, which sits on Tohono O’odham land. Rising above the Sonoran Desert below, Kitt Peak (or Iolkam Du’ag) is considered a “Sky Island”: a remote mountaintop with a unique ecosystem, including some unexpected inhabitants, such as bears. 

    “It’s a little piece of paradise I go to every day, and I wanted to keep it safe,” Arnold said.

    As hotshot crews cleared brush, controlled backburns, and put out spot fires, Wortman and Arnold supported their efforts, providing information on the site’s hydrant and water systems. When teams later cut power to avoid potential flare-ups, the hydrant system shut off, so the two rigged a gravity-fed water system for responders to drink and fill their trucks. “Fred and Zade’s efforts were vital,” Levi said.

    In the early hours of June 17, the fire swept over the main observatory site, causing the white domes to glow red with reflected firelight.

    The fire and smoke wrapped around the peak and continued north, burning a total of around 30,000 acres before being contained. At the observatory, four support buildings burned, but all the scientific equipment and telescopes survived.   

    Road to recovery

    It took several weeks to secure the site and restore basic functions like power and water. The fire had damaged the observatory access road, burning away all of the guardrails and miles of power poles. It was followed closely by monsoons, causing mudslides. With the charred vegetation unable to stabilize the soil, a boulder the size of a car fell onto the road. Crews accessing the site to assess damage and begin cleanup traveled together in a daily caravan to minimize disruptions to road repair.

    “The amount of work it takes to recover from something like this is always surprising,” said Stupak. “This facility is pretty much a small town up here. We’re pretty isolated. Everything from potable water to data is a huge effort by a lot of people.”

    DESI collaborators took a methodical approach, starting up and quadruple-checking one system at a time. Experts looked for any smoke damage, changed out air filters, and cleaned the optical components with a special wash of carbon dioxide. They checked the 5,000 robotic positioners that rotate and lock onto galaxies, and placed the spectrographs (tools that measure the wavelength of light) under vacuum, removing all the air over several days. The last step was turning on sensitive image sensors known as CCDs, which turn light into data and operate in extreme cold. It all worked. When the monsoons finally cleared, DESI resumed cataloguing the cosmos. 

    The sky survey maps the distance and speed of far-off galaxies by collecting data known as “redshifts.” During the first year of observations leading up to the fire, researchers were already ahead of schedule, having collected 14 million galaxy and quasar redshifts – a whopping 30% of the total they plan to gather during the instrument’s five-year run. The collaboration doesn’t expect any long-term impact from the fire and is working towards a large data release in early 2023.
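The two figures above imply the size of the full survey: dividing the year-one haul by the fraction complete recovers the planned total.

```python
collected = 14_000_000   # redshifts gathered before the fire
fraction_done = 0.30     # "a whopping 30% of the total"
planned_total_millions = collected / fraction_done / 1e6
print(round(planned_total_millions, 1))  # ~46.7 million redshifts over five years
```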

    In the coming months, crews will continue to repair the larger site and improve the instrument, doing additional cleaning on the optics to return them to pre-fire condition.

    “It feels really great to be back on sky again,” said Poppett, who has worked on DESI for more than a decade. “The fact that the telescope and instrument is still there is all we need – and it just needs a small tune-up to be as good as before.”

    DESI, including operations of the Mayall telescope, is supported by the DOE Office of Science and by the National Energy Research Scientific Computing Center, a DOE Office of Science user facility. Additional support for DESI is provided by the U.S. National Science Foundation, the Science and Technologies Facilities Council of the United Kingdom, the Gordon and Betty Moore Foundation, the Heising-Simons Foundation, the French Alternative Energies and Atomic Energy Commission (CEA), the National Council of Science and Technology of Mexico, the Ministry of Economy of Spain, and by the DESI member institutions.

    Kitt Peak National Observatory (KPNO) is a program of the National Science Foundation’s NOIRLab.

    The DESI collaboration is honored to be permitted to conduct scientific research on Iolkam Du’ag (Kitt Peak), a mountain with particular significance to the Tohono O’odham Nation.

    ###
