ReportWire

Tag: Nature (journal)

  • Structure of Opioid Receptors May Reveal How to Better Design Pain Relievers, Addiction Therapies

    Newswise — Opioids remain the most potent and effective pain relievers in medicine, but they are also among the most addictive drugs, and an overdose can halt a person’s breathing — which can be deadly. Researchers have been racing to develop safer pain relievers that target a specific opioid receptor, the kappa opioid receptor, which, unlike other opioid receptors, is found only in the central nervous system and not elsewhere in the body. Previous research suggests that drugs targeting it may not lead to addiction or death by overdose, but the currently known kappa opioid receptor drugs have their own unacceptable side effects, including depression and psychosis.

    In one of the first steps toward eventually developing a new wave of kappa opioid receptor drugs without these side effects, researchers at the University of Maryland School of Medicine and Washington University have mapped the 3D structure of this central nervous system-specific kappa opioid receptor and worked out how it differs from the other opioid receptors. In this new study, they discovered what instructs the kappa opioid receptor to change its shape, the shape that determines which opioid drugs it binds, akin to a lock fitting a specific key.

    They published their results in the May issue of Nature.

    Aside from relieving pain, opioid receptors are involved in everything from sensing taste and smell to digestion and breathing, as well as responding to many of the body’s hormones. Opioid receptors can influence so many functions around the body by acting with one of seven cell-activity proteins, known as G-alpha proteins, each of which helps specialize the function the receptor suppresses in the cell.

    “Knowing how these drugs interact with opioid receptors and having a clear view of this molecular snapshot is critical for allowing researchers to develop more effective pain-relieving drugs. This requires a drug that binds to the right type of opioid receptor, such as one in the central nervous system to reduce pain versus the ones that interact in the gut, causing side effects like constipation,” said study corresponding author Jonathan Fay, PhD, Assistant Professor of Biochemistry and Molecular Biology at UMSOM. “Additionally, these next generation medications will need to be designed with the appropriate kind of G-alpha protein in mind, as this will help to precisely target location and cell function by determining the specific shape of the opioid receptor — so the drug only reduces pain without affecting other body functions.”

    The known kappa opioid receptor drugs do not produce the euphoria of traditional opioids, which makes them less likely to be addictive.

    For the current study, the researchers used cryogenic electron microscopy to visualize the structure of the kappa opioid receptor. They first needed to flash-freeze the receptors, which were bound to a hallucinogenic drug along with one of two traditional G-alpha proteins. They then used a different drug to see how the kappa opioid receptor interacted with two other types of G-alpha proteins: one found only in the central nervous system and another used to detect taste and smell.

    Dr. Fay described the G-protein as shaped like a chainsaw with a handle and a ripcord. Each G-protein held its chainsaw handle in a slightly different position when bound to the kappa opioid receptor. This change in position played an active role in determining the shape of the kappa opioid receptor, and thus which drug bound best to it. These findings ultimately could have implications for how new drugs are designed.

    UMSOM Dean Mark T. Gladwin, MD, Vice President for Medical Affairs, University of Maryland, Baltimore, and the John Z. and Akiko K. Bowers Distinguished Professor, said, “Researchers face an enormous challenge in developing safer pain-reliever drugs since they will need to target both the correct opioid receptor as well as the appropriate G-alpha protein. Studies like these reinforce the mission of our new Kahlert Institute for Addiction Medicine, which aims to help develop this next generation of engineered small molecule drugs that are less addictive.”

    The research was supported by National Institutes of Health grants from the National Institute of General Medical Sciences (R35GM143061) and the National Institute of Neurological Disorders and Stroke (R01NS099341). The Titan X Pascal graphics card used for this research was donated by NVIDIA.

    About the University of Maryland School of Medicine

    Now in its third century, the University of Maryland School of Medicine was chartered in 1807 as the first public medical school in the United States. It continues today as one of the fastest growing, top-tier biomedical research enterprises in the world — with 46 academic departments, centers, institutes, and programs, and a faculty of more than 3,000 physicians, scientists, and allied health professionals, including members of the National Academy of Medicine and the National Academy of Sciences, and a distinguished two-time winner of the Albert E. Lasker Award in Medical Research. With an operating budget of more than $1.3 billion, the School of Medicine works closely in partnership with the University of Maryland Medical Center and Medical System to provide research-intensive, academic, and clinically based care for nearly 2 million patients each year. The School of Medicine has nearly $600 million in extramural funding, with most of its academic departments highly ranked among all medical schools in the nation in research funding. As one of the seven professional schools that make up the University of Maryland, Baltimore campus, the School of Medicine has a total population of nearly 9,000 faculty and staff, including 2,500 students, trainees, residents, and fellows. The combined School of Medicine and Medical System (“University of Maryland Medicine”) has an annual budget of over $6 billion and an economic impact of nearly $20 billion on the state and local community. The School of Medicine, which ranks as the 8th highest among public medical schools in research productivity (according to the Association of American Medical Colleges profile), is an innovator in translational medicine, with 606 active patents and 52 start-up companies. In the latest U.S. News & World Report ranking of the Best Medical Schools, published in 2021, the UM School of Medicine is ranked #9 among the 92 public medical schools in the U.S., and in the top 15 percent (#27) of all 192 public and private U.S. medical schools. The School of Medicine works locally, nationally, and globally, with research and treatment facilities in 36 countries around the world. Visit medschool.umaryland.edu

    University of Maryland School of Medicine

  • Webb Detects Water Vapor in Rocky Planet-Forming Zone

    Newswise — Water is essential for life as we know it. However, scientists debate how it reached the Earth and whether the same processes could seed rocky exoplanets orbiting distant stars. New insights may come from the planetary system PDS 70, located 370 light-years away. The star hosts both an inner disk and an outer disk of gas and dust, separated by a 5-billion-mile-wide (8-billion-kilometer) gap, and within that gap are two known gas-giant planets.

    New measurements by NASA’s James Webb Space Telescope’s MIRI (Mid-Infrared Instrument) have detected water vapor in the system’s inner disk, at distances of less than 100 million miles (160 million kilometers) from the star – the region where rocky, terrestrial planets may be forming. (The Earth orbits 93 million miles from our Sun.) This is the first detection of water in the terrestrial region of a disk already known to host two or more protoplanets.
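    As a quick aside (not part of the release), the mile and kilometer figures quoted above are mutually consistent; a short script can verify the conversions, with Earth's orbital distance of roughly 149.6 million kilometers (1 AU) supplied as a known reference value rather than taken from the article:

```python
# Sanity-check the mile-to-kilometer conversions quoted in the article.
MILES_TO_KM = 1.609344  # exact definition of the international mile

# (miles quoted, kilometers quoted or known) for each distance in the article
figures = {
    "disk gap": (5e9, 8e9),                # 5 billion mi ~ 8 billion km
    "inner water region": (100e6, 160e6),  # 100 million mi ~ 160 million km
    "Earth's orbit": (93e6, 149.6e6),      # 93 million mi ~ 1 AU (known value)
}

for label, (miles, km_expected) in figures.items():
    km = miles * MILES_TO_KM
    # Each converted value matches the expected kilometer figure to within ~1%.
    print(f"{label}: {miles:.3g} mi = {km:.3g} km (expected ~{km_expected:.3g} km)")
```

Each quoted pair agrees to well within rounding, so the release's figures are internally consistent.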

    “We’ve seen water in other disks, but not so close in and in a system where planets are currently assembling. We couldn’t make this type of measurement before Webb,” said lead author Giulia Perotti of the Max Planck Institute for Astronomy (MPIA) in Heidelberg, Germany.

    “This discovery is extremely exciting, as it probes the region where rocky planets similar to Earth typically form,” added MPIA director Thomas Henning, a co-author on the paper. Henning is co-principal investigator of Webb’s MIRI (Mid-Infrared Instrument), which made the detection, and the principal investigator of the MINDS (MIRI Mid-Infrared Disk Survey) program that took the data.

    A Wet Environment for Forming Planets

    PDS 70 is a K-type star, cooler than our Sun, and is estimated to be 5.4 million years old. This is relatively old in terms of stars with planet-forming disks, which made the discovery of water vapor surprising.

    Over time, the gas and dust content of planet-forming disks declines. Either the central star’s radiation and winds blow out such material, or the dust grows into larger objects that eventually form planets. As previous studies failed to detect water in the central regions of similarly aged disks, astronomers suspected it might not survive the harsh stellar radiation, leading to a dry environment for the formation of any rocky planets.

    Astronomers haven’t yet detected any planets forming within the inner disk of PDS 70. However, they do see the raw materials for building rocky worlds in the form of silicates. The detection of water vapor implies that if rocky planets are forming there, they will have water available to them from the beginning.

    “We find a relatively high amount of small dust grains. Combined with our detection of water vapor, the inner disk is a very exciting place,” said co-author Rens Waters of Radboud University in The Netherlands.

    What is the Water’s Origin?

    The discovery raises the question of where the water came from. The MINDS team considered two different scenarios to explain their finding.

    One possibility is that water molecules are forming in place, where we detect them, as hydrogen and oxygen atoms combine. A second possibility is that ice-coated dust particles are being transported from the cool outer disk to the hot inner disk, where the water ice sublimates and turns into vapor. Such a transport system would be surprising, since the dust would have to cross the large gap carved out by the two giant planets.

    Another question raised by the discovery is how water could survive so close to the star, when the star’s ultraviolet light should break apart any water molecules. Most likely, surrounding material such as dust and other water molecules serves as a protective shield. As a result, the water detected in the inner disk of PDS 70 could survive destruction.

    Ultimately, the team will use two more of Webb’s instruments, NIRCam (Near-Infrared Camera) and NIRSpec (Near-Infrared Spectrograph) to study the PDS 70 system in an effort to glean an even greater understanding.

    These observations were taken as part of Guaranteed Time Observation program 1282. This finding has been published in the journal Nature.

    For more information or to download full-resolution images, visit https://webbtelescope.org/contents/news-releases/2023/news-2023-130

    The James Webb Space Telescope is the world’s premier space science observatory. Webb is solving mysteries in our solar system, looking beyond to distant worlds around other stars, and probing the mysterious structures and origins of our universe and our place in it. Webb is an international program led by NASA with its partners, ESA (European Space Agency) and the Canadian Space Agency.

    Media Contacts:

    Christine Pulliam
    Space Telescope Science Institute, Baltimore, Md.
    [email protected]

    Markus Nielbock
    Max Planck Institute for Astronomy, Heidelberg, Germany
    [email protected]

    Space Telescope Science Institute (STScI)

  • Ancient Impacts May Have Fueled Venus Volcanism

    Newswise — SAN ANTONIO — July 20, 2023 — A Southwest Research Institute-led team has modeled the early impact history of Venus to explain how Earth’s sister planet has maintained a youthful surface despite lacking plate tectonics. The team compared the early collision histories of the two bodies and determined that Venus likely experienced higher-speed, higher-energy impacts, creating a superheated core that promoted extended volcanism and resurfaced the planet.

    “One of the mysteries of the inner solar system is that, despite their similar size and bulk density, Earth and Venus operate in strikingly distinct ways, particularly affecting the processes that move materials through a planet,” said Dr. Simone Marchi, lead author of a new paper about these findings in Nature Astronomy.

    The Earth’s shifting plates continuously reshape its surface as chunks of crust collide to form mountain ranges and, in places, promote volcanism. Venus has more volcanoes than any other planet in the solar system, yet its surface is a single continuous plate. More than 80,000 volcanoes — 60 times more than Earth has — have played a major role in renewing the planet’s surface through floods of lava, which may continue to this day. Previous simulations struggled to produce scenarios that support this level of volcanism.

    “Our latest models show that long-lived volcanism driven by early, energetic collisions on Venus offers a compelling explanation for its young surface age,” said Professor Jun Korenaga, a co-author from Yale University. “This massive volcanic activity is fueled by a superheated core, resulting in vigorous internal melting.”

    Earth and Venus formed in the same neighborhood of the solar system as solid materials collided with each other and gradually combined to form the two rocky planets. The slight differences in the planets’ distances from the Sun changed their impact histories, particularly the number and outcome of these events. These differences arise because Venus is closer to the Sun and moves faster around it, energizing impact conditions. In addition, the tail of collisional growth is typically dominated by impactors originating from beyond Earth’s orbit that require higher orbital eccentricities to collide with Venus rather than Earth, resulting in more powerful impacts.

    “Higher impact velocities melt more silicate, melting as much as 82% of Venus’ mantle,” said Dr. Raluca Rufu, a Sagan Fellow and SwRI co-author. “This produces a mixed mantle of molten materials redistributed globally and a superheated core.”

    If impacts on Venus had significantly higher velocity than on Earth, a few large impacts could have had drastically different outcomes, with important implications for the subsequent geophysical evolution. The multidisciplinary team combined expertise in large-scale collision modeling and geodynamic processes to assess the consequences of those collisions for the long-term evolution of Venus.

    “Venus’ internal conditions are not well known, and before considering the role of energetic impacts, geodynamical models required special conditions to achieve the massive volcanism we see at Venus,” Korenaga said. “Once you input energetic impact scenarios into the model, it easily comes up with the extensive and extended volcanism without really tweaking the parameters.”

    And the timing of this new explanation is serendipitous. In 2021, NASA committed to two new Venus missions, VERITAS and DAVINCI, while the European Space Agency is planning one called EnVision.

    “Interest in Venus is high right now,” Marchi said. “These findings will have synergy with the upcoming missions, and the mission data could help confirm the findings.”  

    The paper “Long-lived volcanic resurfacing of Venus driven by early collisions” appears in Nature Astronomy and can be accessed at https://doi.org/10.1038/s41550-023-02037-2.

    Southwest Research Institute

  • ‘Stunning’ discovery: Metals can heal themselves

    Newswise — ALBUQUERQUE, N.M. — Scientists for the first time have witnessed pieces of metal crack, then fuse back together without any human intervention, overturning fundamental scientific theories in the process. If the newly discovered phenomenon can be harnessed, it could usher in an engineering revolution — one in which self-healing engines, bridges and airplanes could reverse damage caused by wear and tear, making them safer and longer-lasting.

    The research team from Sandia National Laboratories and Texas A&M University described their findings today in the journal Nature.

    “This was absolutely stunning to watch first-hand,” said Sandia materials scientist Brad Boyce.

    “What we have confirmed is that metals have their own intrinsic, natural ability to heal themselves, at least in the case of fatigue damage at the nanoscale,” Boyce said.

    Fatigue damage is one way machines wear out and eventually break. Repeated stress or motion causes microscopic cracks to form. Over time, these cracks grow and spread until — snap! The whole device breaks, or in the scientific lingo, it fails.

    The fissure Boyce and his team saw disappear was one of these tiny but consequential fractures — measured in nanometers.

    “From solder joints in our electronic devices to our vehicle’s engines to the bridges that we drive over, these structures often fail unpredictably due to cyclic loading that leads to crack initiation and eventual fracture,” Boyce said. “When they do fail, we have to contend with replacement costs, lost time and, in some cases, even injuries or loss of life. The economic impact of these failures is measured in hundreds of billions of dollars every year for the U.S.”

    Although scientists have created some self-healing materials, mostly plastics, the notion of a self-healing metal has largely been the domain of science fiction.

    “Cracks in metals were only ever expected to get bigger, not smaller. Even some of the basic equations we use to describe crack growth preclude the possibility of such healing processes,” Boyce said.

    Unexpected discovery confirmed by theory’s originator

    In 2013, Michael Demkowicz — then an assistant professor at the Massachusetts Institute of Technology’s department of materials science and engineering, now a full professor at Texas A&M — began chipping away at conventional materials theory. He published a new theory, based on findings in computer simulations, that under certain conditions metal should be able to weld shut cracks formed by wear and tear.

    The discovery that his theory was true came inadvertently at the Center for Integrated Nanotechnologies, a Department of Energy user facility jointly operated by Sandia and Los Alamos national laboratories.

    “We certainly weren’t looking for it,” Boyce said.

    Khalid Hattar, now an associate professor at the University of Tennessee, Knoxville, and Chris Barr, who now works for the Department of Energy’s Office of Nuclear Energy, were running the experiment at Sandia when the discovery was made. They only meant to evaluate how cracks formed and spread through a nanoscale piece of platinum using a specialized electron microscope technique they had developed to repeatedly pull on the ends of the metal 200 times per second.

    Surprisingly, about 40 minutes into the experiment, the damage reversed course. One end of the crack fused back together as if it was retracing its steps, leaving no trace of the former injury. Over time, the crack regrew along a different direction.

    Hattar called it an “unprecedented insight.”

    Boyce, who was aware of the theory, shared his findings with Demkowicz.

    “I was very glad to hear it, of course,” Demkowicz said. The professor then recreated the experiment on a computer model, substantiating that the phenomenon witnessed at Sandia was the same one he had theorized years earlier.

    Their work was supported by the Department of Energy’s Office of Science, Basic Energy Sciences; the National Nuclear Security Administration and the National Science Foundation.

    A lot remains unknown about the self-healing process, including whether it will become a practical tool in a manufacturing setting.

    “The extent to which these findings are generalizable will likely become a subject of extensive research,” Boyce said. “We show this happening in nanocrystalline metals in vacuum. But we don’t know if this can also be induced in conventional metals in air.”

    Yet for all the unknowns, the discovery remains a leap forward at the frontier of materials science.

    “My hope is that this finding will encourage materials researchers to consider that, under the right circumstances, materials can do things we never expected,” Demkowicz said.

    Sandia National Laboratories is a multimission laboratory operated by National Technology and Engineering Solutions of Sandia LLC, a wholly owned subsidiary of Honeywell International Inc., for the U.S. Department of Energy’s National Nuclear Security Administration. Sandia Labs has major research and development responsibilities in nuclear deterrence, global security, defense, energy technologies and economic competitiveness, with main facilities in Albuquerque, New Mexico, and Livermore, California.

    Sandia National Laboratories

  • Memory and Learning Genes Date Back 650 Million Years: Study

    Newswise — A team of scientists led by researchers from the University of Leicester has discovered that the genes required for learning, memory, aggression and other complex behaviours originated around 650 million years ago.

    The findings, led by Dr Roberto Feuda of the Neurogenetic group in the Department of Genetics and Genome Biology, together with colleagues from the University of Leicester and the University of Fribourg (Switzerland), have now been published in Nature Communications.

    Dr Feuda said: “We’ve known for a long time that monoamines like serotonin, dopamine and adrenaline act as neuromodulators in the nervous system, playing a role in complex behaviour and functions like learning and memory, as well as processes such as sleep and feeding.

    “However, less certain was the origin of the genes required for the production, detection, and degradation of these monoamines. Using computational methods, we reconstructed the evolutionary history of these genes and show that most of the genes involved in monoamine production, modulation, and reception originated in the bilaterian stem group.

    “This finding has profound implications on the evolutionary origin of complex behaviours such as those modulated by monoamines we observe in humans and other animals.”

    The authors suggest that this new way of modulating neuronal circuits might have played a role in the Cambrian Explosion – often called the ‘Big Bang’ of evolution – which gave rise to the largest diversification of life, encompassing most major animal groups alive today, by giving neural circuits the flexibility needed to facilitate interaction with the environment.

    Dr Feuda added: “This discovery will open important new research avenues that will clarify the origin of complex behaviours and whether the same neurons modulate reward, addiction, aggression, feeding, and sleep.”

    University of Leicester

  • Exotic Remote Flora

    Newswise — Oceanic islands provide useful models for ecology, biogeography and evolutionary research. Many ground-breaking findings – including Darwin’s theory of evolution – have emerged from the study of island species and their interplay with their living and non-living environment. Now, an international research team led by the University of Göttingen has investigated the flora of the Canary Island of Tenerife. The results were surprising: the island’s plant life exhibits a remarkable diversity of forms, yet differs little from mainland plants in functional terms. Unlike the mainland flora, however, the flora of Tenerife is dominated by slow-growing, woody shrubs with a “low-risk” life strategy. The results were published in Nature.

    The researchers investigated how the plants of Tenerife differ in functional terms from plants from other parts of the world. They conducted extensive field research and measurements at over 500 sites using the most up-to-date methods of functional ecology. The sites were scattered all over the island at altitudes ranging from sea level to mountainous regions above 3,300 metres. The scientists recorded about 80% of Tenerife’s native seed plants, and surveyed eight plant characteristics: plant size, specific wood density, leaf thickness, absolute and specific leaf area, leaf dry matter, nitrogen concentration in leaf tissue, and seed weight. They compared their data with data on more than 2,000 plant species found on the mainland.

    “Our study shows, for the first time and contrary to all expectations, that species groups that evolved on the Canary Islands do not contribute to expanding the breadth of different traits. This means they do not lead to more functional diversity,” explains the lead of the study, Professor Holger Kreft, head of Göttingen University’s Biodiversity, Macroecology and Biogeography research group. Previous comparisons show that species occurring on islands can differ significantly from their relatives on the mainland. A well-known example is the Galapagos giant tortoise: the species is found only on the Galapagos Islands and, as a result of adaptation to its environmental conditions, is much larger than tortoises from the mainland. The research team expected similar differences between island and mainland plants, but this was not the case. “Rather, we see that most species follow the constraints of the island climate. Thus, medium-sized, woody species develop. These cope with the island’s limited resources and high extinction risks; that is, they grow slowly. The high functional diversity is mainly due to the species that are widespread on the island and the nearby mainland,” explains Kreft.

    “At the beginning of our research, we assumed that island plants would show fundamental differences and would be characterised by rather limited diversity in terms of function due to their geographical isolation,” explains first author Dr Paola Barajas Barbosa. The results are part of her doctoral thesis, which she did at the University of Göttingen. She now does research at the German Centre for Integrative Biodiversity Research in Leipzig (iDiv). “We were all the more surprised to find that the plants of Tenerife have a comparatively high functional diversity.”

    Original publication: Martha Paola Barajas Barbosa et al. Assembly of functional diversity in an oceanic island flora. Nature (2023). DOI: 10.1038/s41586-023-06305-z

    University of Göttingen

  • AI offers hope to patients with lysosomal storage disease

    Newswise — Artificial intelligence is becoming increasingly important in drug discovery. Advances in the use of Big Data, learning algorithms and powerful computers have now enabled researchers at the University of Zurich (UZH) to better understand a serious metabolic disease. 

    Cystinosis is a rare lysosomal storage disorder affecting around 1 in 100,000 to 200,000 newborns worldwide. Nephropathic (kidney-affecting) cystinosis, the most common and severe form of the disease, manifests with kidney disease symptoms during the first months of life, often leading to kidney failure before the age of 10. “Children with cystinosis suffer from a devastating, multisystemic disease, and there are currently no available curative treatments,” says Olivier Devuyst, head of the Mechanisms of Inherited Kidney Disorders (MIKADO) group and co-director of the ITINERARE University Research Priority Program at UZH.

    The UZH researchers worked with Insilico Medicine, a company that uses AI for drug discovery, to uncover the underlying cellular mechanism behind kidney disease in cystinosis. Leveraging model systems and Insilico’s PandaOmics platform, they identified the disease-causing pathways and prioritized therapeutic targets within cystinosis cells. Their findings revealed a causal association between the regulation of a protein called mTORC1 and the disease. Alessandro Luciani, one of the research group leaders, explains: “Our research showed that cystine storage stimulates the activation of the mTORC1 protein, leading to the impairment of kidney tubular cell differentiation and function.”

    Promising drug identified for treatment

    As patients with cystinosis often require a kidney transplant to restore kidney function, there is an urgent need for more effective treatments. Utilizing the PandaOmics platform, the UZH research team therefore embarked on a search for existing drugs that could be repurposed for cystinosis. This involved an analysis of the drugs’ structure, target enzymes, potential side effects and efficacy in the affected tissues. The already-licensed drug rapamycin was identified as a promising candidate for treating cystinosis. Studies in cell systems and model organisms confirmed that treatment with rapamycin restored the activity of lysosomes and rescued the cellular functions.

    Olivier Devuyst and Alessandro Luciani are optimistic about future developments: “Although the therapeutic benefits of this approach will require further clinical investigations, we believe that these results, obtained through unique interdisciplinary collaboration, bring us closer to a feasible therapy for cystinosis patients.”

    Study participants

    Scientists from the University of Zurich (UZH), the Faculty of Medicine at UCLouvain in Brussels, the Microsoft Research-University of Trento Centre for Computational and Systems Biology, and the company Insilico Medicine were involved in the study. The USA’s Cystinosis Research Foundation and the Swiss National Science Foundation (SNSF) provided funding for the study.

    University of Zurich

  • Butterfly species’ big brains adapted, giving them a survival edge, study finds

    BYLINE: Laura Thomas

    Newswise — Heliconius butterflies’ brains grew as they adopted a novel foraging behaviour, scientists at the University of Bristol have found.

    A region of their brain, known as the mushroom body due to its shape, is two to four times larger than that of their close relatives.

    The findings, published today in Nature Communications, suggest that the structure and function of the nervous system are closely linked to an organism’s ecological niche and behaviour.

    Dr Stephen Montgomery of Bristol’s School of Biological Sciences explained: “Heliconius are the only butterflies known to collect and digest pollen, which gives them an adult source of protein, when most other butterflies exclusively obtain protein as caterpillars.

    “This shift in diet allows Heliconius to live much longer lives, but they seemingly only collect pollen from specific plant species that occur at low densities.

    “Learning the location of these plants is therefore a critical behaviour for them, but to do so they must presumably invest more in the neural structures and cells that support spatial memory.”

    The team focused on the relationship between mushroom body expansion, sensory specialization and the evolutionary innovation of pollen feeding.

    The study involved a unique synthesis of comparative data on large-scale brain structure, cellular composition and connectivity in the brain, and studies of behaviour across species.

    They built 3D models of the brain in 30 pollen-feeding species of Heliconius, and 11 species from closely related genera, collected from across Central and South America.

    The volume of different brain areas was measured and mapped over phylogenetic (family) trees to estimate where major evolutionary changes in brain composition occurred.

    They then investigated changes in neural circuitry by quantifying the number of neurons in the mushroom bodies and the density of their connections, as well as sensory specialisation by tracing neural inputs from brain areas that process visual information and smell before sending it to the central brain.

    Finally, in partnership with the Smithsonian Tropical Research Institute in Panama, they conducted behavioural experiments in key species to assess whether the observed expansion of the mushroom body correlated with improved visual learning and memory.

    One striking result is the remarkable range of variation in mushroom body size observed among these closely-related species within a relatively short evolutionary timeframe. Across the whole dataset mushroom body size varies by 25-fold.

    This provides a compelling example of how specific brain structures can vary independently over evolutionary time, known as mosaic evolution, when under strong selective constraints for behavioural adaptation.

    Dr Montgomery added: “We identified that changes in mushroom body size are due to an increased number of ‘Kenyon cells’, the neurons that form the majority of the mushroom body and whose interactions are thought to be the basis of memory storage, as well as increased inputs from the visual system.

    “This expansion and visual specialization of the mushroom bodies were accompanied by enhanced visual learning and memory abilities. Through this synthesis of data types, we provide a clear example of a novel foraging behaviour coinciding with adaptations in the brain and associated cognitive shifts.”

    Co-lead author, Bristol’s Dr Antoine Couto, said: “The study reveals how brain structure of Heliconius butterflies, specifically the mushroom bodies, has undergone remarkable changes that are tightly linked to their specialized foraging behaviours.

    “These butterflies have evolved larger mushroom bodies with enhanced visual processing abilities, allowing them to discriminate complex visual patterns and retain visual memories over extended periods. These findings highlight the fascinating connection between brain evolution and behavioural adaptations in the natural world.”

    Dr Fletcher Young, also co-lead author, added: “This study provides a rare combination of neurobiological and behavioural data across closely related species, revealing a clear example of marked evolutionary changes in the brain over a relatively short time scale coinciding with improved visual learning and memory abilities. Identifying such relationships between brain adaptations and behavioural shifts is crucial to our understanding of cognitive evolution.”

    Dr Montgomery concluded: “We provide evidence that brain structure can vary in striking ways between even closely related species that live in the same habitats.

    “In this example, the innovation of one suite of behaviours has led to a dramatic expansion of critical learning and memory centres in the brain, and we show these neural changes co-occur with substantial enhancements in cognitive ability.

    “We hypothesise that these behavioural differences reflect either a direct response to selection on foraging behaviour, or a change in the information the butterflies are extracting from the environment around them to guide their behaviour.”

    Understanding the relationship between brain anatomy, sensory processing, and foraging behaviour in Heliconius butterflies could also provide insights into the evolution of learning and memory mechanisms not only in insects but also in other animals, as the function and circuitry of mushroom bodies share some similarities with vertebrate brains. These butterflies therefore provide an excellent system in which to explore the neural basis of learning and memory, with widespread relevance.

     

    Paper:

    ‘Rapid expansion and visual specialisation of learning and memory centers in the brains of Heliconiini butterflies’ by Stephen Montgomery et al. in Nature Communications.

    University of Bristol

    Source link

  • Possible? Climate-neutral air travel

    Possible? Climate-neutral air travel

    Newswise — Researchers at the Paul Scherrer Institute PSI and ETH Zurich have performed calculations to work out how air traffic could become climate-neutral by 2050. They conclude that simply replacing fossil aviation fuel with sustainable synthetic fuels will not be enough. Air traffic would also have to be reduced. The researchers are publishing their results today in the journal Nature Communications.

    The European Union aims to be climate neutral by 2050, a target that was set by the European Parliament in 2021. Switzerland is pursuing the same goal. The aviation sector, which is responsible for 3.5 percent of global warming, is expected to contribute its fair share – especially since the greenhouse gas emissions of aircraft are two to three times higher per passenger or freight kilometre than in other transport sectors. The International Civil Aviation Organisation (ICAO) and many airlines have therefore announced their intention to reduce CO2 emissions to zero by 2050 or to become climate neutral.

    In a new study, researchers at PSI and ETH Zurich have now calculated whether this can be achieved, and how. “An important question is what exactly we mean by zero carbon or climate neutrality,” says Romain Sacchi of PSI’s Laboratory for Energy Systems Analysis, one of the study’s two lead authors. If this refers only to the CO2 emitted by aircraft actually in the air, adds his co-author Viola Becattini from ETH Zurich, it does not go nearly far enough. Assuming that air traffic continues to grow as it has in the past, the calculations predict that the CO2 emissions of aircraft will account for only about 20 percent of their total climate impact by 2050. To make aviation as a whole climate neutral, it is necessary to ensure that not only flying but also the production of fuel and the entire aviation infrastructure have no further impact on the climate.

    However, the study concludes that this cannot be achieved by 2050 using the climate measures that are currently being pursued in flight operations. “New engines, climate-friendly fuels and filtering CO2 out of the atmosphere in order to store it underground (carbon capture and storage, or CCS) will not get us there on their own,” says Marco Mazzotti, Professor of Process Engineering at ETH. “On top of this, we need to reduce air traffic.”

    Non-CO2 effects play a major role

    In their study, Sacchi and Becattini looked at various scenarios. These showed that while the climate impact of the infrastructure (manufacturing aircraft and building and operating airports) does need to be taken into account, it is comparatively small overall for the period up to 2050 and beyond. The impact of flying itself, and of the emissions from producing the fuel, is far greater. That in itself was nothing new.

    What had been less clear before was the importance of so-called non-CO2 effects, which occur in addition to CO2 emissions. The bulk of the greenhouse effect caused by aviation is not due to the carbon released into the atmosphere by burning aviation fuel, but to the particulate matter (soot) and nitrogen oxides that are also released and that react in the air to form methane and ozone, water vapour and the condensation trails that lead to the formation of cirrus clouds in the upper atmosphere. “Many analyses and ‘net zero’ pledges so far have ignored these factors,” says Romain Sacchi. “Or they have not been calculated correctly.”

    It is customary to express emissions and effects like these in terms of CO2 equivalents when calculating the overall balance. “But the methods and values used to date have proved to be inappropriate,” says Marco Mazzotti. “We therefore adopted a more precise approach.” The methods they used take into account one major difference between the various factors: non-CO2 effects are much more short-lived than CO2, which is why they are also called “short-lived climate forcers”, or SLCFs for short. While about half of the emitted carbon dioxide is absorbed by forests and oceans, the other half remains in the air for thousands of years, dispersing and acting as a greenhouse gas. Methane, on the other hand, has a much greater impact on the climate, but decomposes within a few years; contrails and the resulting clouds dissipate within hours. “The problem is that we are producing more and more SLCFs as air traffic increases, so these are adding up instead of disappearing quickly. As a result, they exert their enormous greenhouse impact over longer periods of time,” says Viola Becattini. It’s like a bathtub with both the drain and the tap open: as long as the tap lets in more water than can escape through the drain, the bathtub will keep getting fuller – until eventually it overflows.
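    The bathtub picture can be made concrete with a toy stock-and-flow calculation. The sketch below is purely illustrative: the 50 percent annual decay and the 3 percent traffic growth are assumed round numbers, not values from the study, and `slcf_stock` is a hypothetical helper, not code from the researchers.

    ```python
    # Toy "bathtub" model of short-lived climate forcers (SLCFs): the
    # atmospheric stock grows whenever the tap (emissions) outpaces the
    # drain (decay). The 50% annual decay and 3% traffic growth are
    # illustrative assumptions, not numbers from the study.

    def slcf_stock(emissions_per_year, decay_fraction=0.5):
        """Yearly stock of a forcer that loses a fixed fraction each year."""
        stock, history = 0.0, []
        for e in emissions_per_year:
            stock = stock * (1 - decay_fraction) + e  # drain, then tap
            history.append(stock)
        return history

    growing = slcf_stock([1.03 ** t for t in range(30)])  # traffic grows 3%/yr
    constant = slcf_stock([1.0] * 30)                     # traffic held constant

    print(round(constant[-1], 2))      # → 2.0 (the tub levels off: inflow = outflow)
    print(growing[-1] > constant[-1])  # → True (the tub keeps filling)
    ```

    With constant emissions the stock settles where inflow equals outflow; with growing traffic it never does, which is the point the researchers make about SLCFs adding up.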

    Climate-friendly fuel alone does not achieve the goal – but it helps

    “But this analogy also demonstrates that the crucial lever is under our control: the volume of air traffic,” Romain Sacchi points out. “By flying less instead of more often, in other words closing the tap instead of opening it, we can actually cool the atmosphere and push the greenhouse effect caused by aviation towards zero.” This is not to say that we must stop flying altogether. The calculations performed in the study show that for aviation to achieve climate neutrality by 2050, air traffic will need to be reduced by 0.8 percent every year – in conjunction with underground carbon dioxide storage – if we continue to use fossil fuels. This would bring it down to about 80 percent of today’s volume by 2050. If we manage to switch to more climate-friendly fuels based on electricity from renewables, 0.4 percent per year will be sufficient.
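    The stated figures are consistent with simple compounding; a quick check, assuming a 27-year span from roughly today to 2050:

    ```python
    # Cutting air traffic by 0.8% per year over an assumed 27 years
    # (roughly 2023-2050) compounds to about 80% of today's volume.
    volume = 1.0
    for _ in range(27):
        volume *= 1 - 0.008
    print(round(volume, 3))  # → 0.805, i.e. about 80% of today's traffic

    # The gentler 0.4% path with climate-friendly fuels:
    print(round((1 - 0.004) ** 27, 3))  # → 0.897
    ```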

    The study also took a closer look at these new fuels. Researchers around the world are working to replace conventional petroleum-based engines. As in road transport, this could be achieved by using electric batteries, fuel cells or the direct combustion of hydrogen. However, the available energy density is only sufficient for small aircraft on short routes, or in the case of hydrogen also for medium-size planes on medium-haul flights. Yet large aircraft on long-haul flights of more than 4000 kilometres account for the majority of global air traffic and greenhouse gas emissions from aviation.

    Synthetic aviation fuel has pros and cons

    In addition, propulsion technologies for the aviation industry based on electricity or hydrogen are far from being ready for a widespread roll-out. So-called Sustainable Aviation Fuel (SAF) is therefore viewed as the industry’s great hope. This man-made aviation fuel could replace petroleum-based aviation fuel more or less one-to-one, without the need to redesign turbines and aircraft.

    SAF can be produced from CO2 and water via a production cascade. The CO2 is extracted from the air using a process known as air capture, and hydrogen can be obtained from water by electrolysis. “If the necessary processes are carried out entirely using renewable energy, SAF is virtually climate-neutral,” says Christian Bauer from the PSI Laboratory for Energy Systems Analysis, who was involved in the study. “This makes us less dependent on fossil fuels.” Another advantage of SAF is that it produces fewer SLCFs, which would have to be offset by capturing equivalent amounts of CO2 from the air and storing them underground. This is significant because CO2 storage capacity is limited and not reserved exclusively for the aviation industry.

    Air tickets three times more expensive

    SAF also has its disadvantages, though: it takes far more energy to produce than conventional aviation fuel. This is mainly because producing hydrogen via electrolysis takes a lot of electricity. In addition, energy is lost at every step in the production process: air capture, electrolysis and fuel synthesis. Using large amounts of electrical power, in turn, means expending more resources such as water and land. SAF is also expensive: not just in terms of the electrical power required, but also the cost of carbon capture and electrolysis plants, which makes it four to seven times more expensive than conventional aviation fuel. In other words, the widespread use of SAF makes carbon-neutral aviation a possibility, but it also costs more resources and more money. This means that flying will have to become even more expensive than it already needs to be in order to meet the climate targets. “Anyone buying a ticket today can pay a few extra euros to make their flight supposedly carbon neutral, by investing this money in climate protection,” says Romain Sacchi. “But this is greenwashing, because many of these measures for offsetting carbon are ineffective. To fully offset the actual climate impact, tickets would have to cost about three times as much as they do today.”

    “Such a hefty price hike should significantly reduce the demand for flights and bring us closer to the goal of climate neutrality,” says Viola Becattini. In addition, SAF production is expected to become cheaper and more efficient over the years as production volumes increase, and this will have a positive effect on the carbon footprint. The study took such dynamics into account – including the fact that the electricity mix used to produce SAF is shifting. This distinguishes the analysis from most others.

    “The bottom line is that there is no magic bullet for achieving climate neutrality in aviation by 2050,” says Sacchi. “We cannot continue as before. But if we develop the infrastructure for storing CO2 underground and producing SAF quickly and efficiently, while also reducing our demand for air travel, we could succeed.”

    Paul Scherrer Institute

    Source link

  • Quantum physics secures digital payments

    Quantum physics secures digital payments

    Newswise — Have you ever been compelled to enter sensitive payment data on the website of an unknown merchant? Would you be willing to consign your credit card data or passwords to untrustworthy hands? Scientists from the University of Vienna have now designed an unconditionally secure system for shopping in such settings, combining modern cryptographic techniques with the fundamental properties of quantum light. The demonstration of such “quantum-digital payments” in a realistic environment has just been published in Nature Communications.

    Digital payments have replaced physical banknotes in many aspects of our daily lives. Similar to banknotes, they should be easy to use, unique, tamper-resistant and untraceable, but additionally withstand digital attackers and data breaches. In today’s payment ecosystem, customers’ sensitive data is substituted by sequences of random numbers, and the uniqueness of each transaction is secured by a classical cryptographic method or code. However, adversaries and merchants with powerful computational resources can crack these codes and recover the customers’ private data, and for example, make payments in their name.

    A research team led by Prof. Philip Walther from the University of Vienna has shown how the quantum properties of light particles, or photons, can ensure unconditional security for digital payments. In an experiment, the researchers demonstrated that a transaction can be neither duplicated nor diverted by malicious parties, and that the user’s sensitive data stays private. “I am really impressed by how the quantum properties of light can be used to protect new applications, such as digital payments, that are relevant in our everyday lives,” says Tobias Guggemos.

    To enable absolutely secure digital payments, the scientists replaced classical cryptographic techniques with a quantum protocol that exploits single photons. During a classical digital payment transaction, the client shares a classical code – called a cryptogram – with their payment provider (e.g. a bank or credit card company). This cryptogram is then passed on between customer, merchant and payment provider. In the demonstrated quantum protocol, the cryptogram is generated by having the payment provider send specially prepared single photons to the client. For the payment procedure, the client measures these photons, with the measurement settings depending on the transaction parameters. Since quantum states of light cannot be copied, the transaction can only be executed once. This, together with the fact that any deviation from the intended payment alters the measurement outcomes, which are verified by the payment provider, makes this digital payment unconditionally secure.
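    The core idea – measurement settings derived from the transaction, and wrong settings scrambling the outcomes – can be illustrated with a classical toy simulation. This is only a sketch of the principle, not the protocol published in the paper: the basis derivation, photon count and all function names are invented for illustration, and real security rests on quantum no-cloning, which a classical simulation cannot reproduce.

    ```python
    import hashlib
    import random

    # Toy simulation of a transaction-bound cryptogram (illustrative only):
    # the provider prepares photons in secret bases tied to the legitimate
    # transaction; a client measuring with bases derived from a *different*
    # transaction gets randomized outcomes, so verification fails.

    def bases_for(transaction, n):
        """Derive n measurement bases (0 or 1) from a hash of the transaction."""
        digest = hashlib.sha256(transaction.encode()).digest()
        bits = "".join(f"{byte:08b}" for byte in digest)
        return [int(b) for b in bits[:n]]

    def measure(state, basis, rng):
        prep_basis, bit = state
        return bit if basis == prep_basis else rng.randrange(2)  # wrong basis: random

    rng = random.Random(7)
    n = 256
    transaction = "pay 30 EUR to merchant A"

    # Provider prepares photons whose bases encode the legitimate transaction.
    prep = [(b, rng.randrange(2)) for b in bases_for(transaction, n)]

    # Honest client measures with bases for the same transaction: outcomes match.
    honest = [measure(s, b, rng) for s, b in zip(prep, bases_for(transaction, n))]
    print(honest == [bit for _, bit in prep])  # → True

    # Diverting the cryptogram to a different transaction: roughly half the
    # bases differ, those outcomes are random, and verification fails.
    forged = [measure(s, b, rng)
              for s, b in zip(prep, bases_for("pay 500 EUR to attacker", n))]
    errors = sum(f != bit for f, (_, bit) in zip(forged, prep))
    print(errors)  # dozens of mismatches out of 256 photons
    ```

    In the real protocol the "cannot copy" guarantee comes from quantum mechanics rather than from a secret hash, which is what makes the security unconditional.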

    The researchers successfully implemented quantum-digital payments over an urban optical fiber link of 641 m connecting two university buildings in downtown Vienna. Digital payments currently complete within a few seconds. “At present, our protocol takes a few minutes of quantum communication to complete a transaction. This is to guarantee security in the presence of noise and losses,” says Peter Schiansky, first author of the paper. “However, these time limitations are only of a technological nature,” adds Matthieu Bozzio, who is convinced that “we will witness quantum-digital payments reaching practical performance in the very near future”.

    University of Vienna

    Source link

  • Collagen’s Weak Bonds: A Sacrifice for Tissue Protection

    Collagen’s Weak Bonds: A Sacrifice for Tissue Protection

    Newswise — One of the more unusual ways objects can increase longevity is by sacrificing a part of themselves: From dummy burial chambers used to deceive tomb raiders, to a fuse melting in an electrical circuit to safeguard appliances, to a lizard’s tail breaking off to enable its escape. Sacrificial parts can also be found within collagen, the most abundant protein in our bodies. Scientists at the Heidelberg Institute for Theoretical Studies (HITS) have revealed how the rupture of weak sacrificial bonds within collagen tissue helps to localize damage caused by excessive force, minimize negative impacts on the wider tissue, and promote recovery. Published in Nature Communications, the work sheds light on collagen’s rupture mechanisms, knowledge that is crucial for understanding tissue degradation and material ageing, and potentially for advancing tissue engineering techniques.

    “Collagen’s remarkable crosslink chemistry appears to be perfectly adapted to handling mechanical stress,” says Frauke Gräter, who led the research at HITS. “By using complementary computational and experimental techniques to study collagen in rat tissue, our findings indicate that weak bonds within the crosslinks of collagen have a strong propensity to rupture before other bonds, such as those in the collagen’s backbone. This serves as a protective mechanism, localizes the detrimental chemical and physical effects of radicals caused by ruptures, and likely supports molecular recovery processes.”

    Collagen comprises roughly 30 percent of all proteins in the human body. It provides strength to bones, elasticity to skin, protection to organs, flexibility to tendons, aids in blood clotting, and supports the growth of new cells. Structurally, collagen resembles a triple-braided helix: Three chains of amino acids intertwine to form a strong and rigid backbone. Each collagen fibre contains thousands of individual molecules that are staggered and bound to each other by crosslinks, contributing to collagen’s mechanical stability. Collagen crosslinks were thought to be susceptible to rupture; however, little was known about the specific sites of bond ruptures or why ruptures occur where they do.

    Scientists from the Molecular Biomechanics Group at HITS aimed to unravel these puzzles using computer simulations of collagen across multiple biological scales and under different mechanical forces. They validated their findings via gel electrophoresis and mass spectrometry experiments conducted on rat tail, flexor, and Achilles tendons. By subjecting collagen to rigorous testing, the team was able to determine specific breakage points. They observed how force dissipates through the complex hierarchical structure of the tissue and how its chemical bonds bear the load.

    Mature crosslinks in collagen consist of two arms, one of which is weaker than other bonds in collagen tissue. When subjected to excessive force, the weaker arm is typically the first to rupture, dissipating the force and localizing detrimental effects. The scientists found that in regions of collagen tissue where weak bonds are present, other bonds – both in the crosslinks and the collagen backbone – are more likely to remain intact, thereby preserving the structural integrity of the collagen tissue.

    Previous work led by HITS scientists revealed that excessive mechanical stress on collagen leads to the generation of radicals, which in turn cause damage and oxidative stress in the body. “Our latest research shows that sacrificial bonds in collagen serve a vital role in maintaining the overall integrity of the material and help to localize the impacts of this mechanical stress, which could otherwise have catastrophic consequences for the tissue,” explains Benedikt Rennekamp, the study’s first author. “As collagen is a major constituent of tissues in our bodies, by uncovering and understanding these rupture sites, researchers can gain valuable insights into the mechanics of collagen and potentially develop strategies to enhance its resilience and mitigate damage.”

    Heidelberg Institute for Theoretical Studies (HITS)

    Source link

  • How the use of chemicals and biodiversity loss are connected

    How the use of chemicals and biodiversity loss are connected

    Newswise — Science does not take a deep enough look at chemicals in the environment as one of the causes of the decline in biodiversity. Forty scientists in the RobustNature research network of Goethe University Frankfurt and collaborating institutes have corroborated this in a study that has now been published in the journal “Nature Ecology and Evolution”. The researchers regard an interdisciplinary approach as a new opportunity to better understand biodiversity loss in order to be able to take more efficient countermeasures. To this end, they are studying the interactions between chemical pollution and biodiversity loss.

    Declining biodiversity threatens the very basis of human life. Science contends that there are many reasons for this decline. However, while much research is being conducted into the connection between species decline on the one hand and loss of habitats, invasion by non-native species or climate change on the other, science is giving less attention to the impact of chemicals on biodiversity. A recent study by a team of researchers led by Professor Henner Hollert, Dr. Francisco Sylvester and Fabian Weichert from Goethe University Frankfurt corroborates this.

    The team has analyzed in depth the scientific literature on this topic from 1990 to 2021. According to their analysis, the very many research papers on environmental pollution through chemicals were published in only a small number of highly specialized ecotoxicological journals, in which papers on biodiversity loss are only occasionally found. “This suggests that the field is highly encapsulated, which is in stark contrast to publication behavior in relation to other causes of global biodiversity loss,” says Henner Hollert. “Research on the environmental impact of chemicals is still mostly dissociated from the assessment of biodiversity loss.”

    The authors call for a stronger interdisciplinary focus in research so that the impacts of chemical substances on biodiversity can be better understood and mitigated. What makes the researchers optimistic here is the fact that there have been many methodological advances in ecotoxicology and ecology in recent years. For example, with the help of state-of-the-art chemical and effect-based analytics as well as big data science it is possible to detect thousands of known and unknown substances in environmental samples at the same time. In addition, there are technologies for remote environmental monitoring, for example with satellites, as well as computer models for predicting the ecological risks of chemicals and methods for determining biodiversity with the help of environmental DNA.

    However, the scientists also see quite considerable challenges despite the interdisciplinary approach. For example, basic data are often lacking; each area under study has specific characteristics; the processes at ecosystem scale are complex. To meet these challenges, the researchers have made 16 recommendations. They suggest, for example, obligating industry to make relevant data public. Or they propose developing ecological test models that cover not only individual organisms but also populations, communities or even entire ecosystems.

    The RobustNature research network is examining the robustness and resilience of nature-society systems in the developing Anthropocene and specifically the interaction of chemical pollution and biodiversity loss. To address important questions related to human-ecosystem dynamics, RobustNature has established interdisciplinary collaboration with partners from Germany and abroad. https://www.robustnature.de/en/

    Partners:

    • Goethe University Frankfurt (Coordination; Faculty of Biological Sciences (15) with the faculties of Law (1), Economics & Business (2), Social Sciences (3), Educational Sciences (4), Geosciences & Geography (11), Computer Science & Mathematics (12), Medicine (16) and the profile area Sustainability & Biodiversity)
    • Institute for Social-Ecological Research (ISOE)
    • Senckenberg – Leibniz Institution for Biodiversity and Earth System Research (SGN)
    • LOEWE Center for Translational Biodiversity Genomics (LOEWE TBG)
    • Helmholtz Center for Environmental Research (UFZ), Leipzig
    • Leibniz Institute for Financial Research SAFE, Frankfurt
    • Fraunhofer Institute for Molecular Biology and Applied Ecology (IME), Schmallenberg
    • RWTH Aachen University
    • University of Saskatchewan, Canada
    • ETH Zurich, Switzerland
    • Stockholm University, Sweden

    Goethe-Universität Frankfurt am Main

    Source link

  • A new model allows us to see and understand human embryonic development like never before.

    A new model allows us to see and understand human embryonic development like never before.

    Newswise — Two to three weeks after conception, an embryo faces a critical point in its development. In the stage known as gastrulation, the transformation of embryonic cells into specialized cells begins. This initiates an explosion of cellular diversity in which the embryonic cells later become the precursors of future blood, tissue, muscle, and more types of cells, and the primitive body axes start to form. Studying this process in the human-specific context has posed significant challenges to biologists, but new research offers an unprecedented window into this point in time in human development.

    A recent strategy to combat these challenges is to model embryo development using stem cell technologies, with many valuable approaches emerging from research groups across the globe. But embryos don’t grow in isolation and most previous developmental models have lacked crucial supporting tissues for embryonic growth. A groundbreaking model that includes both embryonic and extraembryonic components will allow researchers to study how these two parts interact around gastrulation stages—providing a unique look at the molecular and cellular processes that occur, and offering potential new insights into why pregnancies can fail as well as the origins of congenital disorders. The team, including Berna Sozen, PhD, and Zachary Smith, PhD, both assistant professors of genetics at Yale School of Medicine (YSM), published its findings in Nature on [tk].

    “This work is extremely important as it provides an ethical approach to understand the earliest stages of human growth,” says Valentina Greco, PhD, the Carolyn Walch Slayman Professor of Genetics at YSM and incoming president-elect of the International Society for Stem Cell Research (ISSCR), who was not involved in the study. “This stem cell model provides an excellent alternative to start to understand aspects of our own early development that is normally hidden within the mother’s body.”

    “The Sozen and Smith groups have achieved a milestone in developing in vitro models to study the earliest stages of human development that are unfeasible yet so important for understanding health and disease,” says Haifan Lin, PhD, the Eugene Higgins Professor of Cell Biology, director of the Yale Stem Cell Center, and president of ISSCR. “I commend their exceptional accomplishment as well as their sensitivity to ethical issues by limiting the model’s ability to develop further.”

    The ethical questions are profound, including whether these models have the potential to develop into human beings. Sozen, the principal investigator of the study, emphasizes that they do not. The published paper demonstrates that this model lacks trophectodermal cells, which are required for an embryo to implant in the uterus. Sozen says this model also represents a developmental stage beyond the time frame in which embryos can implant. “It is very important to focus on the fact that our model cannot grow further or implant and therefore is not considered a human embryo,” she says. But as a reductionist strategy to mimic and study aspects of natural development, its potential is immense, especially where universal guidelines severely limit scientists’ ability to study actual embryos.

    New Model Contains Embryonic and Extraembryonic Tissues

    All embryos have two components—embryonic and extraembryonic. The tissues we have now in our adult bodies grew from the embryonic component. The extraembryonic component includes the tissues that offer nutritional and other support, such as the placenta and yolk sac. The majority of previous embryo models of developmental stages around gastrulation were single-tissue models that only contained the embryonic component.

    In the new study, the Yale-led team grew embryonic stem cells in vitro in the lab to generate their new model. They transferred these cells into a 3D culture system and exposed them to conditions that stimulated the cells to spontaneously self-organize and differentiate. The cells diverged into two lineages — embryonic and extraembryonic precursors. The extraembryonic cells in this model were precursors for the yolk sac. The researchers grew these cellular lineages in the culture for approximately one week and analyzed how they guided each other as they developed. “We started looking into very mechanistic details, such as what signals they are giving each other and how specific genes are impacting one another,” says Sozen. “This has been limited in the literature previously.”

    The Need for Models of Human Development

    While researchers have learned a great deal from the embryos of other species, such as the mouse, the lack of access to human embryos has left significant knowledge gaps about our development. “If you want to understand human development, you need to look at the human system,” says Sozen. “This work is really important because it’s giving us direct information about our own species.” Not only does this model give access to the human gastrulation window, it will also allow for a greater quantity of research. The ability to generate as many as thousands of these models will allow for mass analysis that is not possible with human embryos. “I’m one scientist with one vision,” says Sozen. “But thinking about what other scientists are envisioning globally and what we can all accomplish is just really, really exciting to me.”

    The new model has over 70% efficiency—in other words, the stem cells aggregate correctly roughly 70% of the time. As noted by the authors, there are some limitations to the strategy, and it is challenging to benchmark some findings against the natural embryo itself. Sozen hopes to continue to work on the models so that they become more standardized in the future.

    The team believes the models will transform scientists’ knowledge of human developmental biology. In their latest publication, the team explored some of the molecular paths underlying the onset of human gastrulation. In future studies, they hope to delve even deeper into the developmental pathways, including whether pregnancy loss and congenital disorders may stem from failures during gastrulation stages. Sozen believes her model can be used to look at some of these disorders and learn more about what is going awry. “Previous model systems have been able to look at this, but our model is unique because it has this extra tissue that allows us to analyze a bit deeper,” she says.

    Yale University

    Source link

  • Soil microbes speed up CO2 emissions amid global warming

    Soil microbes speed up CO2 emissions amid global warming

    Newswise — The rise in atmospheric carbon dioxide (CO2) concentration is a primary catalyst for global warming, and an estimated one fifth of atmospheric CO2 originates from soil sources. This is partially attributed to microorganisms such as bacteria and fungi, which use oxygen to decompose organic matter in the soil, such as dead plant material. During this process, CO2 is released into the atmosphere. Scientists refer to this as heterotrophic soil respiration.

    Based on a recent study published in the scientific journal Nature Communications, a team of researchers from ETH Zurich, the Swiss Federal Institute for Forest, Snow and Landscape Research WSL, the Swiss Federal Institute of Aquatic Science and Technology Eawag, and the University of Lausanne has reached a significant conclusion. Their study indicates that emissions of CO2 by soil microbes into the Earth’s atmosphere are not only expected to increase but also accelerate on a global scale by the end of this century.

    Their projections indicate that by 2100, CO2 emissions from soil microbes will escalate, increasing by up to about forty percent globally compared to current levels under the worst-case climate scenario. “Thus, the projected rise in microbial CO2 emissions will further contribute to the aggravation of global warming, emphasising the urgent need to get more accurate estimates of the heterotrophic respiration rates,” says Alon Nissan, the main author of the study and an ETH Postdoctoral Fellow at the ETH Zurich Institute of Environmental Engineering.

    Soil moisture and temperature as key factors

    These findings not only confirm earlier studies but also provide more precise insights into the mechanisms and magnitude of heterotrophic soil respiration across different climatic zones. In contrast to other models that rely on numerous parameters, the novel mathematical model, developed by Alon Nissan, simplifies the estimation process by utilising only two crucial environmental factors: soil moisture and soil temperature.

    The model represents a significant advancement as it encompasses all biophysically relevant levels, ranging from the micro-scales of soil structure and soil water distribution to plant communities like forests, entire ecosystems, climatic zones, and even the global scale. Peter Molnar, a professor at the ETH Institute of Environmental Engineering, highlights the significance of this theoretical model which complements large Earth System models, stating, “The model allows for a more straightforward estimation of microbial respiration rates based on soil moisture and soil temperature. Moreover, it enhances our understanding of how heterotrophic respiration in diverse climate regions contributes to global warming.”

    Polar CO2 emissions likely to more than double

    A key finding of the research collaboration led by Peter Molnar and Alon Nissan is that the increase in microbial CO2 emissions varies across climate zones. In cold polar regions, the foremost contributor to the increase is the decline in soil moisture rather than a significant rise in temperature, unlike in hot and temperate zones. Alon Nissan highlights the sensitivity of cold zones, stating, “Even a slight change in water content can lead to a substantial alteration in the respiration rate in the polar regions.”

    Based on their calculations, under the worst-case climate scenario, microbial CO2 emissions in polar regions are projected to rise by ten percent per decade by 2100, twice the rate anticipated for the rest of the world. This disparity can be attributed to the optimal conditions for heterotrophic respiration, which occur when soils are in a semi-saturated state, i.e. neither too dry nor too wet. These conditions prevail during soil thawing in polar regions.
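    The release does not give the model's actual equations, but the two dependencies it describes can be sketched as a toy model: a standard Q10-style temperature response (respiration rises with warming) multiplied by a unimodal moisture response that peaks at semi-saturation, as in the paragraph above. All functional forms and parameter values here are illustrative assumptions, not the study's formulation.

```python
import numpy as np

# Toy heterotrophic respiration model driven only by soil temperature and
# soil moisture, mirroring the two-factor approach described in the article.
# Assumed forms: Q10 temperature response; parabolic moisture response
# peaking at semi-saturation (theta = 0.5), zero when fully dry or flooded.

def respiration(temp_c, theta, r_ref=1.0, q10=2.0, t_ref=10.0):
    """Relative respiration rate given soil temperature (deg C) and
    soil moisture saturation theta in [0, 1]."""
    f_temp = q10 ** ((temp_c - t_ref) / 10.0)   # warmer soil -> more CO2
    f_moist = 4.0 * theta * (1.0 - theta)       # peaks at theta = 0.5
    return r_ref * f_temp * np.clip(f_moist, 0.0, None)

# Warming raises respiration at any moisture level...
print(respiration(15.0, 0.5) > respiration(10.0, 0.5))
# ...and on the steep flanks of the moisture curve (e.g. thawing polar
# soils), a small change in water content shifts the rate substantially.
print(respiration(10.0, 0.2) - respiration(10.0, 0.1))
```

In such a multiplicative model, the same warming produces different absolute emission changes depending on where a region sits on the moisture curve, which is consistent with the zone-dependent behaviour the researchers report.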

    On the other hand, soils in other climate zones, which are already relatively drier and prone to further desiccation, exhibit a comparatively smaller increase in microbial CO2 emissions. However, irrespective of the climate zone, the influence of temperature remains consistent: as soil temperature rises, so does the emission of microbial CO2.

    How much CO2 emissions will increase in each climate zone

    As of 2021, most CO2 emissions from soil microbes originate in the warm regions of the Earth. Specifically, 67 percent of these emissions come from the tropics, 23 percent from the subtropics, 10 percent from the temperate zones, and a mere 0.1 percent from the Arctic and polar regions.

    Significantly, the researchers anticipate substantial growth in microbial CO2 emissions across all these regions compared to the levels observed in 2021. By the year 2100, their projections indicate an increase of 119 percent in the polar regions, 38 percent in the tropics, 40 percent in the subtropics, and 48 percent in the temperate zones.
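    A quick back-of-the-envelope check, using only the shares and percentage increases quoted in the two paragraphs above, shows these zone-by-zone projections are consistent with the roughly forty percent global increase mentioned earlier (a sketch for arithmetic only; the study's own accounting is more detailed):

```python
# 2021 share of microbial soil CO2 emissions by climate zone (percent),
# and the projected increase by 2100 relative to 2021 (percent),
# both taken from the figures reported in this article.
share_2021 = {"tropics": 67.0, "subtropics": 23.0, "temperate": 10.0, "polar": 0.1}
increase = {"tropics": 38.0, "subtropics": 40.0, "temperate": 48.0, "polar": 119.0}

total_2021 = sum(share_2021.values())
total_2100 = sum(share_2021[z] * (1 + increase[z] / 100) for z in share_2021)
global_increase = 100 * (total_2100 / total_2021 - 1)
print(f"{global_increase:.0f}%")  # approximately 40%, as quoted above
```

Note that despite the fastest relative growth, the polar zone's absolute contribution stays small because its 2021 share is only 0.1 percent.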

    Will soils be a CO2 sink or a CO2 source for the atmosphere?

    The carbon balance in soils, determining whether soils act as a carbon source or sink, hinges on the interplay between two crucial processes: photosynthesis, whereby plants assimilate CO2, and respiration, which releases CO2. Therefore, studying microbial CO2 emissions is essential for comprehending whether soils will store or release CO2 in the future.

    “Due to climate change, the magnitude of these carbon fluxes—both the inflow through photosynthesis and the outflow through respiration—remains uncertain. However, this magnitude will impact the current role of soils as carbon sinks,” explains Alon Nissan.

    In their ongoing study, the researchers have primarily focused on heterotrophic respiration; they have not yet investigated the CO2 emissions that plants release through autotrophic respiration. Further exploration of these factors will provide a more comprehensive understanding of the carbon dynamics within soil ecosystems.

    ETH Zurich

    Source link

  • New discovery set to boost disease-resistant rice

    New discovery set to boost disease-resistant rice

    Newswise — Rice that is resistant to some of the worst crop-destroying diseases but can still produce large yields could soon become a reality for farmers worldwide.

    A University of Adelaide researcher is part of an international team which has identified a new gene variant in a type of rice that can be modified to improve the performance of the crop.

    “Rice is the most widely grown crop in the world but serious bacterial and fungal diseases such as rice blast and bacterial blight are a major threat to the industry,” said co-author Associate Professor Jenny Mortimer from the University of Adelaide’s School of Agriculture, Food and Wine.

    “By identifying a specific gene called RBL1, we may have cracked the code for developing rice crops that are resistant to these destructive diseases without the yield penalties often associated with disease resistance.”

    In an international collaboration led by Huazhong Agricultural University in China and the University of California, Davis, in the USA, researchers identified a rice variety that already had strong resistance to fungal and bacterial diseases but produced poor grain yields. They showed that this plant carried a mutation in the gene RBL1.

    “Using existing genome-editing technology, the team then generated 57 gene variants from this type of rice and tested their immunity against several strains of rice blast and bacterial blight. We found that one variant of RBL1 had broad-spectrum disease resistance but unlike other varieties, it was still able to produce large yields in small-scale field trials,” said Associate Professor Mortimer, who is a researcher at the University’s Waite Research Institute.

    The research has been published in the journal Nature and also indicates the RBL1 gene may play a role in the plant’s defence system by interacting with the cells that stop fungal infections from spreading.


    In 2021/2022 about 520 million tonnes of rice were consumed worldwide.

    “This is an exciting development because rice is a staple food for more than a third of the world’s population and crop disease is a constant threat to this food source,” said Associate Professor Mortimer.

    Australians alone are estimated to consume around 300,000 tonnes of rice each year; half comes from imports, while the remainder is grown domestically. The Australian rice industry can produce up to one million tonnes of rice each year.

    While the new gene identified in this research has promising traits, more field trials are needed to test the immunity and yield of the RBL1 gene in other rice varieties.

    Initial work also indicates that this gene is important in disease resistance in other staple crops, and future research will explore this.

    “Rice crops with higher yields are needed to meet growing global demand and the results from this study could help shore up food supply in the future,” said Associate Professor Mortimer.

    University of Adelaide

    Source link

  • Evolutionary Origin of Cognitive Flexibility Traced

    Evolutionary Origin of Cognitive Flexibility Traced

    Newswise — Key factor in many neuropsychiatric diseases

    Cognitive flexibility is essential for the survival of all species on Earth. It is particularly based on functions of the so-called orbitofrontal cortex located in the frontal brain. “The loss of cognitive flexibility in everyday life is a key factor in many neuropsychiatric diseases,” Professor Burkhard Pleger and first author Dr. Bin Wang from the Berufsgenossenschaftliches Universitätsklinikum Bergmannsheil describe their motivation for the study. “Understanding the underlying network mechanisms is therefore essential for the development of new therapeutic methods.”

    Using functional magnetic resonance imaging (fMRI), the Bochum team and their cooperation partner Dr. Abhishek Banerjee from the Biosciences Institute at Newcastle University examined the brain functions of 40 participants while they were learning a sensorimotor task.

    While lying in the MRI, the volunteers had to learn to recognise the meaning of different touch signals – similar to those used in Braille – on the tip of the right index finger. One touch signal told the participants to press a button with their free hand, while another signal instructed them not to do so and to remain still. The connection between the two different touch signals and pressing the button or not pressing the button had to be learned from trial to trial. The challenge: after a certain time, the touch signals changed their meaning. What had previously meant “pressing the button” now meant “holding still” – an ideal experimental set-up to investigate the volunteers’ cognitive flexibility. The fMRI provided images of the corresponding brain activity.

    Similarities between humans and mice

    “Similar studies had already been done with mice in the past,” says Pleger. “The learning task we chose now allowed us to observe the brains of mice and humans under comparable cognitive demands.”

    A surprising finding is the comparability between the Bochum results in humans and the previously published data from mice, Wang points out. The similarity shows that cognitive functions that are important for survival, such as the flexibility to adapt quickly to suddenly changing conditions, follow comparable rules in different species.

    In addition, the Bochum scientists were able to determine a close involvement of sensory brain regions in the processing of the decisions made during tactile learning. Wang emphasises: “Besides the frontal brain, sensory regions are essential for decision-making in the brain.” “Similar mechanisms had also previously been observed in mice,” adds Pleger. “This now suggests that the interplay between the frontal brain and sensory brain regions for decision-making was formed early in the evolutionary development of the brain.”

    Funding

    The publication was funded by the Collaborative Research Centre 874 (SFB 874) and the project PL602/6-1 of the German Research Foundation. The SFB 874 “Integration and Representation of Sensory Processes” existed from 2010 to 2022 at Ruhr University Bochum.

    Ruhr-Universitat Bochum

    Source link

  • Brain activity organized by spiral signals found

    Brain activity organized by spiral signals found

    Newswise — University of Sydney and Fudan University scientists have discovered human brain signals travelling across the outer layer of neural tissue that naturally arrange themselves to resemble swirling spirals.

    The research, published today in Nature Human Behaviour, indicates these ubiquitous spirals, which are brain signals observed on the cortex during both resting and cognitive states, help organise brain activity and cognitive processing.

    Senior author Associate Professor Pulin Gong, from the School of Physics in the Faculty of Science, said the discovery could have the potential to advance powerful computing machines inspired by the intricate workings of the human brain.

    The discovery opens up new avenues for understanding how the brain works and provides valuable insights into the fundamental functions of the human brain. It could help medical researchers understand the effects of brain diseases, such as dementia, by examining the role these spirals play in them.

    “Our study suggests that gaining insights into how the spirals are related to cognitive processing could significantly enhance our understanding of the dynamics and functions of the brain,” said Associate Professor Gong, who is a member of the Complex Systems research group in Physics.

    “These spiral patterns exhibit intricate and complex dynamics, moving across the brain’s surface while rotating around central points known as phase singularities.

    “Much like vortices act in turbulence, the spirals engage in intricate interactions, playing a crucial role in organising the brain’s complex activities.

    “The intricate interactions among multiple co-existing spirals could allow neural computations to be conducted in a distributed and parallel manner, leading to remarkable computational efficiency.”

    PhD student Yiben Xu, the lead author of the research from the School of Physics, said the location of the spirals on the cortex could allow them to connect activity in different sections, or networks, of the brain – acting as a bridge of communication. Many of the spirals are large enough to cover multiple networks.

    The cortex of the brain, also known as the cerebral cortex, is the outermost layer of the brain that is responsible for many complex cognitive functions, including perception, memory, attention, language and consciousness.

    “One key characteristic of these brain spirals is that they often emerge at the boundaries that separate different functional networks in the brain,” Mr Xu said.

    “Through their rotational motion, they effectively coordinate the flow of activity between these networks.

    “In our research we observed that these interacting brain spirals allow for flexible reconfiguration of brain activity during various tasks involving natural language processing and working memory, which they achieve by changing their rotational directions.”

    The scientists gathered their findings from functional magnetic resonance imaging (fMRI) brain scans of 100 young adults, which they analysed by adapting methods used to understand complex wave patterns in turbulence.
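    The release does not describe the team's analysis pipeline in detail, but a standard way to locate spiral centres (the "phase singularities" mentioned above) in a phase map is a winding-number test: summing the wrapped phase differences along a closed loop yields a full turn if the loop encloses a singularity and zero otherwise. A minimal sketch on a synthetic spiral, offered purely as an illustration of the concept:

```python
import numpy as np

def winding_number(phase, loop):
    """Net phase rotation (in full turns) accumulated along a closed
    loop of (row, col) indices in a 2D phase map."""
    vals = np.array([phase[r, c] for r, c in loop] + [phase[loop[0][0], loop[0][1]]])
    diffs = np.diff(vals)
    diffs = (diffs + np.pi) % (2 * np.pi) - np.pi   # wrap each step to (-pi, pi]
    return int(round(diffs.sum() / (2 * np.pi)))

# Synthetic spiral: phase rotates once around the grid centre (32, 32).
rows, cols = np.mgrid[0:64, 0:64]
phase = np.arctan2(rows - 32.0, cols - 32.0)

def square_loop(r0, r1, c0, c1):
    """Indices tracing the boundary of a square once."""
    return ([(r0, c) for c in range(c0, c1)] + [(r, c1) for r in range(r0, r1)]
            + [(r1, c) for c in range(c1, c0, -1)] + [(r, c0) for r in range(r1, r0, -1)])

print(winding_number(phase, square_loop(28, 36, 28, 36)))  # nonzero: encloses the centre
print(winding_number(phase, square_loop(2, 10, 2, 10)))    # 0: no singularity inside
```

The sign of the winding number distinguishes the two rotational directions that, per the study, the spirals switch between during different tasks.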

    Neuroscience has traditionally focused on interactions between neurons to understand brain function. There is a growing area of science looking at larger processes within the brain to help us understand its mysteries.

    “By unravelling the mysteries of brain activity and uncovering the mechanisms governing its coordination, we are moving closer to unlocking the full potential of understanding cognition and brain function,” Associate Professor Gong said.

    University of Sydney

    Source link

  • Carbon Emissions: How Will a Warmer World Affect Us?

    Carbon Emissions: How Will a Warmer World Affect Us?

    Newswise — Washington, DC—As the world heats up due to climate change, how much can we continue to depend on plants and soils to help alleviate some of our self-inflicted damage by removing carbon pollution from the atmosphere?

    New work led by Carnegie’s Wu Sun and Anna Michalak tackles this key question by deploying a bold new approach for inferring the temperature sensitivity of ecosystem respiration—which represents one side of the equation balancing carbon dioxide uptake and carbon dioxide output in terrestrial environments. Their findings are published in Nature Ecology & Evolution.

    “Right now, plants in the terrestrial biosphere perform a ‘free service’ to us, by taking between a quarter and a third of humanity’s carbon emissions out of the atmosphere,” Michalak explained. “As the world warms, will they be able to keep up this rate of carbon dioxide removal? Answering this is critical for understanding the future of our climate and devising sound climate mitigation and adaptation strategies.”

    Photosynthesis, the process by which plants, algae, and some bacteria convert the Sun’s energy into sugars for food, requires the uptake of atmospheric carbon dioxide. This occurs during daylight hours. But through day and night, these same organisms also perform respiration, just like us, “breathing” out carbon dioxide.

    Being able to better quantify the balance of these two processes across all the components of land-based ecosystems—from soil microbes to trees and everything in between—and to understand their sensitivity to warming, will improve scientists’ models for climate change scenarios.

    In recent years, researchers—including Carnegie’s Joe Berry—have developed groundbreaking approaches for measuring the amount of carbon dioxide taken up by plants through photosynthesis, such as using satellites to monitor global photosynthetic activity and measuring the concentration of the atmospheric trace gas carbonyl sulfide.

    But, until now, developing similar tools to track respiration at the scale of entire biomes or continents has not been possible. As a result, respiration is often indirectly estimated as the difference between photosynthesis and the overall uptake of carbon dioxide.

    “We set out to develop a new way to infer how respiration is affected by changes in temperature over various ecosystems in North America,” said Sun. “This is absolutely crucial for refining our climate change projections and for informing mitigation strategies.”

    Michalak, Sun, and their colleagues developed a new way to infer at large scales how much respiration increases when temperatures warm using measurements of atmospheric carbon dioxide concentrations. These measurements were taken by a network of dozens of monitoring stations across North America.

    The team revealed that atmospheric observations suggest lower temperature sensitivities of respiration than represented in most state-of-the-art models. They also found that this sensitivity differs between forests and croplands. Temperature sensitivities of respiration have not been constrained using observational data at this scale until now, as previous work has focused on sensitivities for much smaller plots of land.
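    The study itself constrains these sensitivities through a continental-scale inversion of atmospheric CO2 data, which this release does not detail. As a toy illustration of what a respiration "temperature sensitivity" parameter means, the conventional Q10 value can be recovered from noisy flux-temperature pairs by a log-linear fit; all numbers below are synthetic and assumed:

```python
import numpy as np

# Toy example: estimate the Q10 temperature sensitivity of respiration
# from synthetic flux/temperature observations. Model: R = Q10**((T-Tref)/10),
# so ln R is linear in (T - Tref) with slope ln(Q10)/10.
rng = np.random.default_rng(0)
t_ref, true_q10 = 10.0, 1.6
temps = rng.uniform(0.0, 30.0, 200)                       # soil temperatures (deg C)
fluxes = true_q10 ** ((temps - t_ref) / 10.0) \
         * np.exp(rng.normal(0.0, 0.05, 200))             # multiplicative noise

slope, _ = np.polyfit(temps - t_ref, np.log(fluxes), 1)   # slope = ln(Q10)/10
q10_est = np.exp(10.0 * slope)
print(round(q10_est, 2))  # close to the assumed value of 1.6
```

A lower fitted Q10 corresponds to the study's headline finding: respiration responds less steeply to warming than most state-of-the-art models assume.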

    “The beauty of our approach is that measurements of atmospheric carbon dioxide concentrations from a few dozen well-placed stations can inform carbon fluxes at the scale of entire biomes over North America,” Sun explained. “This enables a more comprehensive understanding of respiration at the continental scale, which will help us assess how future warming affects the biosphere’s ability to retain carbon.”

    To their surprise, the researchers found that respiration is less sensitive to warming than previously thought, when viewed at the biome or continental scale. But they caution that this temperature sensitivity is just one piece of a complex puzzle.

    “Although our work indicates that North American ecosystems may be more resilient to warming than plot-scale studies had implied, hitting the brakes on climate change ultimately depends on us ceasing to inject more and more carbon into the atmosphere as quickly as possible. We cannot rely on the natural components of the global carbon cycle to do the heavy lifting for us,” Michalak cautioned. “It is up to us to stop the runaway train.”

    Other members of the research team include: Xiangzhong Luo, Yao Zhang, and Trevor Keenan of University of California Berkeley and Lawrence Berkeley National Laboratory; Yuanyuan Fang of the Bay Area Air Quality Management District; Yoichi P. Shiga of the Universities Space Research Association; and Joshua Fisher of Chapman University.

    Carnegie Institution for Science

    Source link

  • UC Irvine scientists create long-lasting, cobalt-free, lithium-ion batteries

    UC Irvine scientists create long-lasting, cobalt-free, lithium-ion batteries

    Newswise — Irvine, Calif., June 14, 2023 – In a discovery that could reduce or even eliminate the use of cobalt – which is often mined using child labor – in the batteries that power electric cars and other products, scientists at the University of California, Irvine have developed a long-lasting alternative made with nickel.

    “Nickel doesn’t have child labor issues,” said Huolin Xin, the UCI professor of physics & astronomy whose team devised the method, which could usher in a new, less controversial generation of lithium-ion batteries. Until now, nickel wasn’t a practical substitute because large amounts of it were required to create lithium batteries, he said. And the metal’s cost keeps climbing.

    To become an economically viable alternative to cobalt, nickel-based batteries needed to use as little nickel as possible.

    “We’re the first group to start going in a low-nickel direction,” said Xin, whose team published its findings in the journal Nature Energy. “In a previous study by my group, we came up with a novel solution to fully eliminate cobalt. But that formulation still relied on a lot of nickel.”

    To solve that problem, Xin’s team spent three years devising a process called “complex concentrated doping” that enabled the scientists to alter the key chemical formula in lithium-ion batteries as easily as one might adjust seasonings in a recipe.

    The doping process, Xin explained, eliminates the need for cobalt in commercial components critical for lithium-ion battery functioning and replaces it with nickel.

    “Doping also increases the efficiency of nickel,” Xin said, meaning EV batteries now require less nickel to work – something that will help make nickel-based batteries a more attractive alternative to cobalt-based ones.

    Xin said he thinks the new nickel chemistry will quickly start transforming the lithium-ion battery industry. Already, he said, electric vehicle companies are planning to take his team’s published results and replicate them.

    “EV makers are very excited about low-nickel batteries, and a lot of EV companies want to validate this technique,” Xin said. “They want to do safety tests.”

    About the University of California, Irvine: Founded in 1965, UCI is a member of the prestigious Association of American Universities and is ranked among the nation’s top 10 public universities by U.S. News & World Report. The campus has produced five Nobel laureates and is known for its academic achievement, premier research, innovation and anteater mascot. Led by Chancellor Howard Gillman, UCI has more than 36,000 students and offers 224 degree programs. It’s located in one of the world’s safest and most economically vibrant communities and is Orange County’s second-largest employer, contributing $7 billion annually to the local economy and $8 billion statewide. For more on UCI, visit www.uci.edu.


    University of California, Irvine

    Source link

  • Long COVID patients endure lasting inflammation: study

    Long COVID patients endure lasting inflammation: study

    Newswise — An overactive inflammatory response could be at the root of many long COVID cases, according to a new study from the Allen Institute and Fred Hutchinson Cancer Center.

    Looking at proteins circulating in the blood, the scientists found a set of molecules associated with inflammation that were present only in a subset of patients with long COVID and were not seen in those who recovered from their disease. The researchers published an article describing their findings in the journal Nature Communications today.

    Out of 55 patients with long COVID, about two-thirds had persistently high levels of certain signals of inflammation. The scientists also looked at blood samples from 25 people who had COVID but recovered, and from 25 volunteers who had never had COVID to their knowledge. Those without long COVID did not show the same signs of inflammation in their blood.

    The patient volunteers in the new analysis are part of a larger, ongoing study based at Fred Hutch, the Seattle COVID Cohort Study, which is led by Julie McElrath, M.D., Ph.D., Senior Vice President and Director of Fred Hutch’s Vaccine and Infectious Disease Division, and Julie Czartoski, ARNP, Research Clinician at the Hutch.

    Scientists have seen previous links between inflammation and long COVID, but the new study is the first to trace the persistence of these inflammatory markers over time in the same patients.

    There’s an obvious implication to these findings, said Troy Torgerson, M.D., Ph.D., Director of Experimental Immunology at the Allen Institute for Immunology, a division of the Allen Institute: Certain kinds of anti-inflammatory drugs might alleviate symptoms for some long COVID patients. But physicians need a way of telling which long COVID patients might benefit from which treatment — a form of precision medicine for a disease that so far remains maddeningly mysterious.

    “The big question was, can we define which long COVID patients have persistent inflammation versus those that don’t? That would be useful in terms of clinical trial planning and in terms of helping clinicians figure out targeted treatments for their patients,” said Torgerson, who led the Nature Communications publication along with McElrath, Aarthi Talla, Senior Bioinformatician at the Allen Institute for Immunology, Suhas Vasaikar, Ph.D., former Senior Bioinformatics Scientist (now a Principal Scientist at Seagen), and Tom Bumol, Ph.D., former Executive Vice President and Director.

    Specifically, the blood markers uncovered in this subset of patients with “inflammatory long COVID,” as the scientists call it, point to a flavor of inflammation similar to that seen in autoimmune diseases like rheumatoid arthritis. This kind of inflammation can be treated with an existing class of drugs called JAK inhibitors, at least in the case of rheumatoid arthritis (it has not yet been tested for long COVID).

    The scientists also hope to narrow down their molecular signature of “inflammatory long COVID” to a few markers that could be used in the clinic to sort this subset of long COVID patients out from the rest.


    Refining treatment options

    Launched in the spring of 2020, shortly after the COVID-19 pandemic shut down businesses and schools in the U.S., the Fred Hutch-led Seattle COVID Cohort Study was originally designed to follow immune responses over time in patients with mild or moderate COVID. The idea was to capture details of a “successful” immune response — one in which patients didn’t get too sick, didn’t need to be hospitalized, and recovered fully.

    But the team soon realized that even among those who didn’t get super sick, not everyone recovered. In their initial work in 2020 tracing the details of immune responses in 18 COVID patients, the scientists found a handful whose symptoms persisted, early examples of what would eventually be termed long-haul COVID, or just long COVID.

    In those early days of the study, the scientists saw that certain immune responses — namely inflammation — were consistently high in these few patients with long COVID. In the patients that got sick and then recovered fully, inflammation levels went up as their bodies fought off the illness, and then went back down as they got better. In those with long COVID, the levels never went back down.

    So the team decided to expand their study to look at more patients with long COVID, focusing on a panel of 1500 proteins circulating in the blood. These assays revealed different molecular “buckets” of long COVID, namely inflammatory and non-inflammatory long COVID. Understanding the molecular roots of the disease, or subsets of the disease, will help guide clinical trial design and ultimately treatment decisions, the scientists said.

    “The ultimate goal is to treat patients,” Talla said. “Although we call everything long COVID, what’s come out of this work shows us that we might not be able to give everyone the same kinds of therapies and we shouldn’t put everyone into one group for treatment purposes.”

    Those patients with non-inflammatory long COVID might be living with permanent organ or tissue damage from their disease, Torgerson said. That would require very different treatment from those with high levels of inflammation. The scientists also saw that these groups can’t be distinguished based on symptoms alone. If anti-inflammatory drugs prove effective in treating inflammatory long COVID, patients would first need to be screened to determine which form of long COVID they have.

    “We hope these findings provide features of long COVID that may guide potential future therapeutic approaches,” McElrath said.

    Allen Institute

    Source link