ReportWire

Tag: Nature (journal)

  • CRISPR Technology Improves Huntington’s Disease Symptoms in Models


    Newswise — Huntington’s disease (HD) is a neurological disorder that causes progressive loss of movement, coordination and cognitive function. It is caused by a mutation in a single gene called huntingtin or HTT. More than 200,000 people worldwide live with the genetic condition, approximately 30,000 in the United States. More than a quarter of a million Americans are at risk of inheriting HD from an affected parent. There is no cure.

    But in a new study, published December 12, 2022 in Nature Neuroscience, researchers at University of California San Diego School of Medicine, with colleagues elsewhere, describe using RNA-targeting CRISPR/Cas13d technology to develop a new therapeutic strategy that specifically eliminates toxic RNA that causes HD.

    CRISPR is known as a genome-editing tool that allows scientists to add, remove or alter genetic material at specific locations in the genome. It is based on a naturally occurring immune-defense system used by bacteria. However, current strategies run the risk of off-target edits at unintended sites that may cause permanent and inheritable chromosomal insertions or genome alterations. Because of this, significant efforts have focused on identifying CRISPR systems that target RNA directly without altering the genome.

    In the case of HD, the condition is caused by repetitive and damaging sequences in the HTT gene.

    “Our cells have a hard time copying repetitive DNA, and these copying errors can cause repetitive sequences to grow longer with each generation,” said senior study author Gene Yeo, PhD, professor of cellular and molecular medicine at UC San Diego School of Medicine.

    “In the Huntingtin gene, these repeats can sometimes grow to many times their normal length, with the resulting repeat-expanded protein tending to aggregate and form toxic clumps in a part of the brain called the striatum that is important for regulating movement. The loss of functional neurons in the striatum ultimately leads to HD symptoms.”

    With colleagues at UC Irvine and Johns Hopkins University, Yeo and his team investigated whether recently described RNA-targeting CRISPR technology could be used to affect RNA (a chemical intermediate between DNA instructions and protein production) accumulation associated with HD.

    They used viral vehicles to deliver the therapy to neuronal cultures, which were developed from stem cells derived from patients with HD, and found that the approach not only targeted and destroyed mutant RNA molecules, but also cleared out toxic protein buildup. They also demonstrated that expression of other human genes was generally not disrupted by the therapy.

    “Our goal was to engineer a type of therapy that would only target the toxic RNA that causes HD and could keep the rest of the human genome and transcriptome intact,” said co-first author Kathryn Morelli, PhD, a research fellow in Yeo’s lab. “We specifically screened our top therapeutic constructs in HD patient cell lines to make sure of it.”

    Development of effective therapies for HD has proven challenging. In 2021, for example, two clinical trials for promising gene therapies were halted following disappointing results. Both potential drugs had been touted as game-changers for HD. Currently, no treatments can alter the course of the disease, though medications can lessen some symptoms.

    “The Huntington’s community was devastated when the clinical trials failed, primarily due to target specificity and toxic effects,” said Yeo. “But their termination has only re-energized the scientific community to find alternative strategies.”

    Yeo’s lab collaborated with Wenzhen Duan, MD, PhD, professor of psychiatry and behavioral sciences at Johns Hopkins Medicine, to conduct preclinical testing in mice. Duan, with co-first author Qian Wu, PhD, found that the therapy improved motor coordination, attenuated striatal degeneration and reduced toxic protein levels in a mouse model of HD. The improvements lasted for at least eight months, with no adverse effects and minimal off-target effects on other RNA molecules.

    Co-authors include: Maya L. Gosztyla, Ryan J. Marina, Kari Lee, Krysten L. Jones, Megan Huang and Allison Li, all at UC San Diego; Hongshuai Liu, Minmin Yao and Chuangchuang Zhang, Johns Hopkins University; Jiaxu Chen, Beijing University of Chinese Medicine; and Charlene Smith-Geater and Leslie M. Thompson, UC Irvine.

    Funding for this research came, in part, from the National Institutes of Health (grants EY029166, NS103172, MH107367, AI132122, HG004659, HG009889, NS099397, NS124084 and T32GM008666), the Bev Hartig Huntington’s Disease Foundation, an NIH NS112654-03 postdoctoral fellowship, a University of California President’s Postdoctoral Fellowship, the Paul G. Allen Foundation, the China Scholarship Council, the National Natural Science Foundation of China (82174278 and 81973748), the Hereditary Disease Foundation, an NIH predoctoral fellowship (NS111859), a National Science Foundation Graduate Research Fellowship (DGE-2038238), a Myotonic Dystrophy Foundation Doctoral Research Fellowship, an Association for Women in Science Scholarship, a Triton Research and Experiential Learning Scholarship, and a Eureka! Research Scholarship.


    Disclosures: Gene Yeo is a scientific advisory board member of Jumpcode Genomics and a co-founder, member of the Board of Directors, scientific advisory board member, equity holder and paid consultant for Locanabio and Eclipse BioInnovations. He is also a Distinguished Visiting Professor at the National University of Singapore.

    Source: University of California San Diego

  • Scientists Map Genetic Evolution of Chronic Lymphocytic Leukemia to Richter’s Syndrome


    Newswise — Every year, up to 1% of patients with chronic lymphocytic leukemia (CLL), a slow-growing blood cancer, have their disease transform into a far more aggressive cancer, a form of lymphoma known as Richter’s Syndrome. For the most part, the genomic changes that underlie this metamorphosis and push it forward have been obscure, hindering advances in treatment. 

    In a new study, scientists at Dana-Farber and the Broad Institute of MIT and Harvard trace these changes in unprecedented detail, revealing for the first time the genomic differences between CLL and Richter’s, the molecular pathways by which Richter’s emerges, and the existence of multiple subtypes of the disease.

    The findings, presented today at the annual meeting of the American Society of Hematology (ASH) and published online in the journal Nature Medicine, not only break open what was once a “black box” of molecular change but point the way to an earlier diagnosis of the disease, when treatments may be more effective.

    “The treatments for CLL and Richter’s Syndrome are very different, so it’s critical that doctors be able to determine, as early as possible, when CLL has ‘crossed over’ to become Richter’s,” says study co-senior author Catherine Wu, MD, of Dana-Farber, the Broad Institute, and Brigham and Women’s Hospital. “The traditional method of diagnosing Richter’s has a number of shortcomings, which can lead to delays in patients’ receiving the appropriate treatment. Our findings in this study hold the promise of an earlier, more definitive diagnosis based on the molecular makeup of the tumor cells.”

    One of the biggest obstacles to diagnosing Richter’s Syndrome is that patients don’t have only CLL or only Richter’s cells, but a mixture of both. And unlike CLL, which is diagnosed from a blood sample, a formal diagnosis of Richter’s requires a biopsy, in which a small piece of tissue is removed and examined under a microscope for telltale changes in the structure and markings of the cells. But because a biopsy collects tissue from just one area, it may find CLL cells but miss Richter’s cells lurking right nearby. As a result, a patient may have classic symptoms of Richter’s, such as swollen lymph nodes, fever and night sweats, but the biopsy – which ultimately determines the diagnosis – indicates CLL.

    To understand Richter’s at the molecular level and track how it evolves from CLL, researchers began with tissue samples gathered from 52 patients over a period of years. Samples collected when the patients had CLL were paired with samples taken when they were diagnosed with Richter’s. The researchers then performed whole exome sequencing, reading the protein-coding sections of DNA in the samples.  Because the samples likely had a mix of cells, they used computational methods on these sequencing data to estimate the proportion of CLL and Richter’s cells in each one. Knowing the relative levels of different cell types within the samples, they were able to identify the genetic changes that drive the evolution from CLL to Richter’s.
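
    The study used purpose-built computational tools for this step; purely as an illustration of the underlying idea, here is a minimal Python sketch of estimating the mix of two cell populations in one sample from sequencing data. The variant allele fractions (VAFs) are invented, and the simplifying assumption (heterozygous mutations in diploid regions, so a VAF of f/2 implies a cell fraction of f) is ours, not the paper's.

    ```python
    import numpy as np

    # Hypothetical variant allele fractions (VAFs) from whole-exome sequencing
    # of a single biopsy that mixes CLL cells and Richter's cells. Mutations
    # private to each clone are assumed heterozygous in diploid regions, so an
    # expected VAF of f/2 implies a cell fraction of f. Numbers are made up.
    cll_private_vaf = np.array([0.18, 0.22, 0.20, 0.19])       # CLL-only mutations
    richter_private_vaf = np.array([0.08, 0.11, 0.09, 0.10])   # Richter's-only

    cll_fraction = 2 * cll_private_vaf.mean()
    richter_fraction = 2 * richter_private_vaf.mean()
    normal_fraction = max(0.0, 1.0 - cll_fraction - richter_fraction)

    print(f"Estimated CLL cell fraction:       {cll_fraction:.2f}")
    print(f"Estimated Richter's cell fraction: {richter_fraction:.2f}")
    print(f"Estimated normal cell fraction:    {normal_fraction:.2f}")
    ```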

    The researchers discovered a motley assortment of such changes, including mutations in multiple genes, missing or added copies of other genes, duplication of cell genomes, and ‘chromothripsis’, a splintering and haphazard reassembly of entire chromosomes.

    “We see myriad differences between CLL and Richter’s at the molecular level, with a much more complex genome in Richter’s, as well as additional driver events,” remarks study co-senior author Dr. Gad Getz of the Broad Institute and Massachusetts General Hospital. “In addition, we’ve found that Richter’s exists in a number of different subtypes.”

    The subtypes are distinguished by their molecular signatures, the specific pattern of genomic anomalies within their cells. These DNA-level differences suggest that the subtypes arose by taking different routes in evolving from CLL. “The formation of multiple subtypes can give us insights into the ‘archaeology’ of the disease: what was the molecular make-up of CLL before it transformed into one subtype or another?” Wu explains.

    Being able to identify different subtypes of Richter’s can be helpful in the clinic: patients with certain subtypes generally fare better than those with others, although the outlook at the present time is poor for all patients with Richter’s. Scientists hope that advances can improve those prospects.

    Plasma diagnosis

    Once researchers knew the genomic features of Richter’s, they explored whether the disease could be detected by analyzing DNA in patients’ plasma, the liquid portion of blood. They examined 46 plasma samples from 24 patients with Richter’s. The samples had been collected over a period of years, beginning within three years of a diagnosis of Richter’s and extending through treatment and relapse of the disease. The researchers then sequenced the cell-free DNA floating freely within the samples. “We found that genomic features of Richter’s were indeed detectable in the plasma,” Wu relates.

    “We then asked whether such changes could be detected before patients had been diagnosed with Richter’s based on a biopsy,” she continues. “For some patients, we clearly detected Richter’s-related DNA alterations in plasma that had been collected one to ten months prior to their Richter’s diagnosis – a time at which they had been undergoing treatment for what was presumed to be aggressive CLL.” The upshot is that it may become possible to diagnose Richter’s through a simple blood test, potentially earlier than it would show up on a biopsy and at a stage where it may be more treatable.

    “The current therapies for Richter’s are of very limited effectiveness, yet there is hope that patients may benefit from novel, more effective agents. Clinical trials of these agents and of stem cell transplant can explore that promise,” Wu notes. “By the time Richter’s is diagnosed, however, patients may be very sick, at which point transplant or other new therapies may not be an option. So detecting it early may make a critical difference.”

    Learning the molecular hallmarks of Richter’s led researchers to one further discovery. In a substantial portion of patients, their Richter’s cells didn’t share a genetic history with their CLL cells, meaning Richter’s arose independently, with no connection to the earlier disease.

    “Looking ahead to future work, we would like to analyze even larger cohorts of RS patients to obtain a comprehensive characterization of the genomic and microenvironmental landscape of RS; from this, we can discover new and robust therapeutic targets as well as refined molecular subtypes, getting us closer to applying precision medicine to this disease,” adds Dr. Getz.

    “Our findings suggest that in many patients, the genomic changes in CLL that lead to Richter’s occur before patients develop symptoms of lymphoma,” Wu says. “Being able to trace the transition from CLL to Richter’s at a molecular level impacts not only our understanding of the disease but, potentially, our ability to treat it and improve outcomes for patients.”

    The co-senior authors of the study are Gad Getz, PhD, of Massachusetts General Hospital and the Broad Institute, and Stephan Stilgenbauer, MD, of Ulm University, Ulm, Germany. The lead authors are Erin Parry, MD, PhD, of Dana-Farber and the Broad Institute; Ignaty Leshchiner, PhD, of the Broad Institute and Boston University School of Medicine; and Romain Guieze, MD, PhD, of Dana-Farber, the Broad Institute, CHU de Clermont-Ferrand and Université Clermont Auvergne, both in France. The co-authors are: Camilla Lemvigh, Shanye Yin, PhD, Teddy Huang, Shuqiang Li, PhD, Geoff Fell, Robert Redd, Neil Ruthen, Stacey Fernandes, Annabelle J. Anandappa, MD, Kenneth J. Livak, PhD, Donna Neuberg, ScD, Matthew S. Davids, MD, and Jennifer R. Brown, MD, PhD, of Dana-Farber; Noelia Purroy-Zuriguel, MD, PhD, of Dana-Farber and the Broad Institute; Connor Johnson, Conor Messer, Liang Li, Daniel Rosebrock, Kara Slowik, Raquel Jacobs, Ziao Lin, Binyamin A. Knisbacher, PhD, Dimitri Livitz, Liudmilla Elagina, Amaro Taylor-Weiner, PhD, Bria Persaud, Aina Martinez, Jialin Ma, Julian Hess, Brian P. Danysh, PhD, and Chip Stewart, PhD, of the Broad Institute; Eugen Tausch, MD, and Christof Schneider of Ulm University; Sameer A. Parikh, MD, and Neil E. Kay, MD, of the Mayo Clinic; Julien Broséus, MD, PhD, Sébastien Hergelant, and Pierre Feugier of Université de Lorraine, Nancy, France; Filippo Utro, PhD, Chaya Levovitz, MD, PhD, Kahn Rhrissorrakrai, PhD, and Laxmi Parida, PhD, of IBM Research, Yorktown Heights, N.Y.; Laura Z. Rassenti, PhD, and Thomas J. Kipps, MD, PhD, of the Moores Cancer Center, University of California San Diego; Nitin Jain, MD, and William Wierda, MD, PhD, of the University of Texas MD Anderson Cancer Center; and Florence Cymbalista, MD, PhD, of Université Sorbonne Paris Nord, Bobigny, France.

    The study was supported by the National Institutes of Health/National Cancer Institute (grants P01 CA206978, 1U10CA180861-01 and R01 CA213442); a Doris Duke Charitable Foundation (DDCF) Physician-Scientist Fellowship; a Dana-Farber Flames FLAIR fellowship; an ASCO Conquer Cancer Young Investigator Award; the Broad/IBM Cancer Resistance Research Project; the Fishman Family Fund; Force Hemato; a long-term EMBO fellowship (ALTF 14-2018); the Deutsche Forschungsgemeinschaft; an NCI Research Specialist Award; and the Melton Family Foundation.


    Source: Dana-Farber Cancer Institute

  • Aging is driven by unbalanced genes


    • New study finds that most molecular-level changes that occur during aging are associated with gene length
    • Organisms balance the activity of short and long genes
    • Aging is accompanied by a shift in gene activity toward short genes, which are associated with accelerated aging
    • Researcher: ‘Aging is a subtle imbalance, away from equilibrium’ that requires your cells to expend more effort to function properly
    • Findings could lead to medical interventions that slow or even reverse the biological hallmarks of aging

    Newswise — EVANSTON, Ill. — Northwestern University researchers have discovered a previously unknown mechanism that drives aging.

    In a new study, researchers used artificial intelligence to analyze data from a wide variety of tissues, collected from humans, mice, rats and killifish. They discovered that the length of genes can explain most molecular-level changes that occur during aging.

    All cells must balance the activity of long and short genes. The researchers found that longer genes are linked to longer lifespans, and shorter genes are linked to shorter lifespans. They also found that, with aging, genes change their activity according to their length. More specifically, aging is accompanied by a shift in activity toward short genes. This causes the gene activity in cells to become unbalanced.

    Surprisingly, this finding was nearly universal. The researchers uncovered this pattern across several animals, including humans, and across many tissues (blood, muscle, bone and organs, including liver, heart, intestines, brain and lungs) analyzed in the study.

    The new finding potentially could lead to interventions designed to slow the pace of — or even reverse — aging.

    The study will be published on Dec. 9 in the journal Nature Aging.

    “The changes in the activity of genes are very, very small, and these small changes involve thousands of genes,” said Northwestern’s Thomas Stoeger, who led the study. “We found this change was consistent across different tissues and in different animals. We found it almost everywhere. I find it very elegant that a single, relatively concise principle seems to account for nearly all of the changes in activity of genes that happen in animals as they age.”

    “The imbalance of genes causes aging because cells and organisms work to remain balanced — what physicians denote as homeostasis,” said Northwestern’s Luís A.N. Amaral, a senior author of the study. “Imagine a waiter carrying a big tray. That tray needs to have everything balanced. If the tray is not balanced, then the waiter needs to put in extra effort to fight the imbalance. If the balance in the activity of short and long genes shifts in an organism, the same thing happens. It’s like aging is this subtle imbalance, away from equilibrium. Small changes in genes do not seem like a big deal, but these subtle changes are bearing down on you, requiring more effort.”

    An expert in complex systems, Amaral is the Erastus Otis Haven Professor of Chemical and Biological Engineering in Northwestern’s McCormick School of Engineering. Stoeger is a postdoctoral scholar in Amaral’s laboratory.

    Looking across ages

    To conduct the study, the researchers used various large datasets, including the Genotype-Tissue Expression Project, a National Institutes of Health-funded tissue bank that archives samples from human donors for research purposes.

    The research team first analyzed tissue samples from mice — aged 4 months, 9 months, 12 months, 18 months and 24 months. They noticed the median length of genes shifted between the ages of 4 months and 9 months, a finding that hinted at a process with an early onset. Then, the team analyzed samples from rats, aged 6 months to 24 months, and killifish, aged 5 weeks to 39 weeks.

    “There already seems to be something happening early in life, but it becomes more pronounced with age,” Stoeger said. “It seems that, at a young age, our cells are able to counter perturbations that would lead to an imbalance in gene activity. Then, suddenly, our cells are no longer able to counter it.”

    After completing this research, the researchers turned their attention to humans. They looked at changes in human genes from ages 30 to 49, 50 to 69 and then 70 and older. Measurable changes in gene activity according to gene length already occurred by the time humans reached middle age.

    “The result for humans is very strong because we have more samples for humans than for other animals,” Amaral said. “It was also interesting because all the mice we studied are genetically identical, the same gender and raised in the same laboratory conditions, but the humans are all different. They all died from different causes and at different ages. We analyzed samples from men and women separately and found the same pattern.”

    ‘Systems-level’ changes

    In all animals, the researchers noticed subtle changes to thousands of different genes across samples. This means that it is not just a small subset of genes that contributes to aging. Aging, instead, is characterized by systems-level changes.

    This view differs from prevailing biological approaches that study the effects of single genes. Since the onset of modern genetics in the early 20th century, many researchers expected to be able to attribute many complex biological phenomena to single genes. And while some diseases, such as hemophilia, do result from single gene mutations, the narrow approach to studying single genes has yet to lead to explanations for the myriad changes that occur in neurodegenerative diseases and aging.

    “We have been primarily focusing on a small number of genes, thinking that a few genes would explain disease,” Amaral said. “So, maybe we were not focused on the right thing before. Now that we have this new understanding, it’s like having a new instrument. It’s like Galileo with a telescope, looking at space. Looking at gene activity through this new lens will enable us to see biological phenomena differently.”

    Lengthy insights

    After compiling the large datasets, many of which were used in other studies by researchers at Northwestern University Feinberg School of Medicine and in studies outside Northwestern, Stoeger had the idea to examine genes based on their length.

    The length of a gene is determined by the number of nucleotides within it. Each triplet of nucleotides (a codon) translates to an amino acid, and chains of amino acids form a protein. A very long gene, therefore, yields a large protein, and a short gene yields a small protein. According to Stoeger and Amaral, a cell needs to have a balanced number of small and large proteins to achieve homeostasis. Problems occur when that balance gets out of whack.
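
    To make the idea concrete, here is a minimal Python sketch of how a length-associated imbalance can be quantified. The expression values below are synthetic (generated to mimic the reported pattern), and the statistics are far simpler than the study's actual pipeline.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)

    # Synthetic stand-in data: log10 gene lengths and age-related expression
    # changes (log fold-change, old vs. young) for 5,000 genes. A real analysis
    # would use measured transcriptomes, e.g. from the Genotype-Tissue
    # Expression Project. We build in the reported trend: longer genes
    # slightly down with age.
    log_length = rng.normal(4.5, 0.5, size=5000)
    log_fold_change = (-0.1 * (log_length - log_length.mean())
                       + rng.normal(0, 0.2, 5000))

    # A length-associated imbalance shows up as a negative correlation
    # between gene length and age-related change in activity.
    rho, p = stats.spearmanr(log_length, log_fold_change)
    print(f"Spearman rho = {rho:.2f} (p = {p:.1e})")

    # Another simple summary: compare the median length of genes whose
    # activity rises vs. falls with age.
    up = log_length[log_fold_change > 0]
    down = log_length[log_fold_change < 0]
    print(f"Median log10 length, up with age:   {np.median(up):.2f}")
    print(f"Median log10 length, down with age: {np.median(down):.2f}")
    ```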

    Although the researchers did find that long genes are associated with increased lifespans, short genes also play important roles in the body. For example, short genes are called upon to help fight off pathogens.

    “Some short genes could have a short-term advantage on survival at the expense of ultimate lifespan,” Stoeger said. “Thus, outside of a research laboratory, these short genes might help survival under harsh conditions at the expense of shortening the animal’s ultimate lifespan.”

    Suspected ties to long COVID-19

    This finding also may help explain why bodies take longer to heal from illnesses as they age. Even with a simple injury like a paper cut, an older person’s skin takes a longer time to recover. Because of the imbalance, cells have fewer reserves to counteract the injury. 

    “Instead of just dealing with the cut, the body also has to deal with this activity imbalance,” Amaral hypothesized. “It could explain why, over time with aging, we don’t handle environmental challenges as well as when we were younger.”

    And because thousands of genes change at the system-level, it doesn’t matter where the illness starts. This could potentially explain illnesses like long COVID-19. Although a patient might recover from the initial virus, the body experiences damage elsewhere.

    “We know cases where infections — predominantly viral infections — lead to other problems later in life,” Amaral said. “Some viral infections can lead to cancer. Damage moves away from the infected site and affects other areas of our body, which then is less able to fight environmental challenges.”

    Hope for medical interventions

    The researchers believe their findings could open new avenues for the development of therapeutics designed to reverse or slow aging. Current therapeutics, the researchers argue, merely target the symptoms of aging rather than aging itself. Amaral and Stoeger compare it to using Tylenol to reduce a fever instead of treating the illness that caused the fever.

    “Fevers can occur for many, many reasons,” Amaral said. “It could be caused by an infection, which requires antibiotics to cure, or caused by appendicitis, which requires surgery. Here, it’s the same thing. The issue is the gene activity imbalance. If you can help correct the imbalance, then you can address the downstream consequences.”

    Other Northwestern co-senior authors include Richard Morimoto, a professor of molecular biosciences in the Weinberg College of Arts and Sciences; Dr. Alexander Misharin, an associate professor of medicine at Feinberg; and Dr. G.R. Scott Budinger, the Ernest S. Bazley Professor of Airway Diseases at Feinberg and chief of pulmonary and critical care at Northwestern Medicine. 

    The study, “Aging is associated with a systemic length-associated transcriptome imbalance,” was supported by the Office of the Assistant Secretary of Defense for Health Affairs, the U.S. Department of Defense, the National Institutes of Health (grant numbers AG068544, AG049665, AG054407, AG026647, AG057296, AG059579), the Veterans Administration, the National Science Foundation and a gift from John and Leslie McQuown.

    Source: Northwestern University

  • The messy death of a star


    Newswise — Around 2500 years ago, a star ejected most of its gas, forming the beautiful Southern Ring Nebula, NGC 3132, chosen as one of the first five image packages from the James Webb Space Telescope (JWST).

    A team of nearly 70 astronomers from 66 organisations across Europe, North, South and Central America, and Asia have used the JWST images to piece together the messy death of this star. 

    “It was nearly three times the size of our Sun, but much younger, about 500 million years old. It created shrouds of gas that have expanded out from the ejection site, and left a remnant dense white dwarf star, with about half the mass of the Sun, but approximately the size of the Earth,” says Professor Orsola De Marco, lead author on the paper, from Macquarie University’s Research Centre for Astronomy, Astrophysics and Astrophotonics. 

    “We were surprised to find evidence of two or three companion stars that probably hastened its death as well as one more ‘innocent bystander’ star that got caught up in the interaction,” she says. 

    The study was based on the JWST images supplemented by data from the ESO Very Large Telescope in Chile, the San Pedro de Mártir Telescope in Mexico, the Gaia Space Telescope, and the Hubble Space Telescope.  

    It paves the way for future JWST observations of nebulae, providing insight into fundamental astrophysical processes including colliding winds, and binary star interactions, with implications for supernovae and gravitational wave systems.

    The paper is published today in Nature Astronomy: https://www.nature.com/articles/s41550-022-01845-2

    “When we first saw the images, we knew we had to do something, we must investigate! The community came together and from this one image of a randomly chosen nebula we were able to discern much more precise structures than ever before. The promise of the James Webb Space Telescope is incredible,” says De Marco, who is also president of the International Astronomical Union Commission on Planetary Nebulae. 

    Astronomers gathered online and developed theories and models around the mid-infrared image to reconstruct just how the star had died. 

    Shining at the centre of the nebula is an ultra-hot central star, a white dwarf that has burned up its hydrogen. “This star is now small and hot, but is surrounded by cool dust,” said Joel Kastner, another team member, from the Rochester Institute of Technology, USA. “We think all that gas and dust we see thrown all over the place must have come from that one star, but it was tossed in very specific directions by the companion stars.”

    There are also a series of spiral structures moving out from the centre. These concentric arches would be created when a companion orbits the central star while it is losing mass. Another companion is further out and is also visible in the picture.

    Looking at a three-dimensional reconstruction of the data, the team also saw pairs of protuberances that may occur when astronomical objects eject matter in jet form. These are irregular and shoot out in different directions, possibly implying a triple star interaction at the centre. 

    De Marco says: “We first inferred the presence of a close companion because of the dusty disk around the central star, the further partner that created the arches and the super far companion that you can see in the image. Once we saw the jets, we knew there had to be another star or even two involved at the centre, so we believe there are one or two very close companions, an additional one at middle distance and one very far away. If this is the case, there are four or even five objects involved in this messy death.”

    NASA image and caption

    Catalogued as NGC 3132, the Southern Ring Nebula is the death shroud of a dying sun-like star some 2,500 light-years from Earth. Composed of gas and dust, the stunning cosmic landscape is nearly half a light-year in diameter, explored in unprecedented detail by the James Webb Space Telescope.

    In this NIRCam image the bright star near centre is a companion of the dying star. In mutual orbit, the star whose transformation has ejected the nebula’s gas and dust shells over thousands of years is the fainter stellar partner. Evolving to become a white dwarf, the faint star appears along the diffraction spike extending toward the 8 o’clock position. 

    Reported in Nature Astronomy today

    All images and detailed analysis are available from the Space Telescope Science Institute.

    Source: Macquarie University

  • Finding the right AI for you


    Newswise — The human genome is three billion letters of code, and each person has millions of variations. While no human can realistically sift through all that code, computers can. Artificial intelligence (AI) programs can find patterns in the genome related to disease much faster than humans can. They also spot things that humans miss. Someday, AI-powered genome readers may even be able to predict the incidence of diseases from cancer to the common cold. Unfortunately, AI’s recent popularity surge has led to a bottleneck in innovation.

    “It’s like the Wild West right now. Everyone’s just doing whatever the hell they want,” says Cold Spring Harbor Laboratory (CSHL) Assistant Professor Peter Koo. Just like Frankenstein’s monster was a mix of different parts, AI researchers are constantly building new algorithms from various sources. And it’s difficult to judge whether their creations will be good or bad. After all, how can scientists judge “good” and “bad” when dealing with computations that are beyond human capabilities?

    That’s where GOPHER, the Koo lab’s newest invention, comes in. GOPHER (short for GenOmic Profile-model compreHensive EvaluatoR) is a new method that helps researchers identify the most efficient AI programs to analyze the genome. “We created a framework where you can compare the algorithms more systematically,” explains Ziqi Tang, a graduate student in Koo’s laboratory.

    GOPHER judges AI programs on several criteria: how well they learn the biology of our genome, how accurately they predict important patterns and features, their ability to handle background noise, and how interpretable their decisions are. “AI are these powerful algorithms that are solving questions for us,” says Tang. But, she notes: “One of the major issues with them is that we don’t know how they came up with these answers.”
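
    GOPHER's actual implementation is not reproduced here; the Python sketch below only illustrates the general idea of scoring models on more than one criterion, in this case accuracy and robustness to input noise, with toy linear "models" standing in for trained deep networks. All names and numbers are invented.

    ```python
    import numpy as np

    def pearson(a, b):
        """Pearson correlation between two 1-D arrays."""
        a, b = a - a.mean(), b - b.mean()
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

    def evaluate(model, x, y, noise=0.1, seed=0):
        """Score a model on predictive accuracy and robustness to input noise."""
        rng = np.random.default_rng(seed)
        accuracy = pearson(model(x), y)                 # predicted vs. observed
        noisy = x + rng.normal(0, noise, x.shape)
        robustness = pearson(model(noisy), model(x))    # stability under noise
        return {"accuracy": round(accuracy, 3),
                "noise_robustness": round(robustness, 3)}

    # Toy stand-ins for genomics models: y is a synthetic "regulatory activity"
    # signal that depends linearly on eight input features.
    rng = np.random.default_rng(1)
    x = rng.normal(size=(500, 8))
    w_true = rng.normal(size=8)
    y = x @ w_true + rng.normal(0, 0.5, 500)

    w_bad = w_true + rng.normal(0, 1.0, 8)   # a mis-specified competitor
    models = {"model_A": lambda v: v @ w_true, "model_B": lambda v: v @ w_bad}

    for name, model in models.items():
        print(name, evaluate(model, x, y))
    ```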

    GOPHER helped Koo and his team dig up the parts of AI algorithms that drive reliability, performance, and accuracy. The findings help define the key building blocks for constructing the most efficient AI algorithms going forward. “We hope this will help people in the future who are new to the field,” says Shushan Toneyan, another graduate student at the Koo lab.

    Imagine feeling unwell and being able to determine exactly what’s wrong at the push of a button. AI could someday turn this science-fiction trope into a feature of every doctor’s office. Similar to video-streaming algorithms that learn users’ preferences based on their viewing history, AI programs may identify unique features of our genome that lead to individualized medicine and treatments. The Koo team hopes GOPHER will help optimize such AI algorithms so that we can trust they’re learning the right things for the right reasons. Toneyan says:  “If the algorithm is making predictions for the wrong reasons, they’re not going to be helpful.”

    Source: Cold Spring Harbor Laboratory

  • Reliable planning tool for the emissions path to achieving the Paris temperature goal


    Newswise — The central aim of the Paris climate agreement is clear: Limiting man-made global warming to well below 2°C. This limit requires a reduction in greenhouse gas emissions to net zero. But what do the intermediate stages look like? How big should the reduction in emissions be within the next five, ten, or fifteen years? And which emissions path is being followed? There is no consensus on these issues between countries, which complicates the active implementation of the Paris Agreement.

    Researchers at the University of Bern have now developed a new method to determine the necessary reduction in emissions on a continuous basis. The main idea: Instead of complex climate models and scenarios, the observed relationship between warming and emissions is applied, and the reduction path is adapted repeatedly according to the latest observations. This new approach has just been published in the journal Nature Climate Change.

    A new calculation method for the emission reduction path

    To date, climate models have been used to calculate possible emissions pathways to the net zero goal. These pathways are based on scenarios including economic and social developments. “These calculations for the emission paths are subject to large uncertainties. This makes the decision-making more difficult and might be one reason why the promised reductions made by the 194 signatory countries to the Paris Agreement remain insufficient,” says lead author Jens Terhaar, explaining the background to the study. Like most of the other authors, Terhaar is a member of the Oeschger Center for Climate Change Research at the University of Bern.

    “Since the climate agreement actually aims at regulating temperature, we thought to specify an optimal emissions reduction path for this purpose which is independent of model-based projections,” continues Terhaar. From this initial idea, a calculation method emerged that is based exclusively on observational data: on the one hand, past global surface temperatures, and on the other, CO2 emissions statistics.

    The Paris Agreement calls for a stocktake of the necessary reductions in global emissions every five years. “The new Bern calculation method is ideally suited to support the stocktake mechanism of the Paris Agreement, as it enables the emission reductions to be recalculated regularly on an adaptive basis,” explains co-author Fortunat Joos of the Oeschger Center. For this purpose, a new algorithm has been developed which is known as the AERA (adaptive emissions reduction approach). In simple terms, the algorithm correlates CO2 emissions with rising temperatures, and is adjusted using a control mechanism. In this way, the current uncertainties in the interaction between these variables can be put aside.

    “Our adaptive approach circumvents the uncertainties, so to speak,” explains Fortunat Joos. “In the same way that a thermostat continuously adjusts the heating to the required room temperature, our algorithm adjusts the emission reductions according to the latest temperature and emissions data. This will allow us to approach a temperature goal, such as the 2°C goal, step-by-step and with specific interim goals.”
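
    The published AERA algorithm involves more careful statistics than can be shown here, but the thermostat logic can be sketched in a few lines of Python. The sketch leans on the observed, roughly linear relationship between cumulative CO2 emissions and warming; all input numbers are illustrative assumptions, not the study's calibration, and non-CO2 effects are ignored.

    ```python
    import numpy as np

    def remaining_budget(t_now, t_target, cum_emissions, warming_obs):
        """Estimate the remaining CO2 budget (GtCO2) from the observed,
        roughly linear relationship between cumulative emissions and warming."""
        tcre = warming_obs / cum_emissions        # degC per GtCO2, observed so far
        return max(0.0, (t_target - t_now) / tcre)

    def emission_path(current_rate, budget, years=30):
        """Linear decline to net zero that spends exactly the given budget.
        A ramp from current_rate to zero over T years emits current_rate * T / 2,
        so T = 2 * budget / current_rate."""
        t_zero = 2 * budget / current_rate
        return np.clip(current_rate * (1 - np.arange(years) / t_zero), 0, None)

    # Illustrative numbers only: ~2460 GtCO2 emitted for ~1.2 degC of warming,
    # and ~40 GtCO2 emitted per year today.
    budget = remaining_budget(t_now=1.2, t_target=2.0,
                              cum_emissions=2460, warming_obs=1.2)
    path = emission_path(current_rate=40.0, budget=budget)
    print(f"Remaining budget: {budget:.0f} GtCO2")
    print("Next 5 years of the path (GtCO2/yr):", np.round(path[:5], 1))

    # At every five-year stocktake the same calculation would be redone with
    # updated observations, so the path self-corrects like a thermostat.
    ```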

    Stronger emissions goals and effective implementation

    “The AERA method already confirms that international climate policy must be far more ambitious,” says Terhaar. According to the Bern study, to achieve the 2°C goal, global CO2 emissions would have to fall by 7 percent between 2020 and 2025. Instead, they increased by approximately 1 percent in 2021 in comparison with 2020. According to the algorithm, limiting global warming to 1.5°C would require as much as a 27 percent reduction by 2025. “We need far stricter emissions goals than those to which nations have committed,” explains Thomas Frölicher, co-author of the study from the Oeschger Center, “and above all else, effective implementation of the goals.”

    The researchers in Bern hope that the new calculation method will find its way into international climate policy. “The AERA algorithm is already attracting a lot of interest in the climate research community, as it can also be applied to climate modelling,” explains Jens Terhaar. Until now, climate models have been run with prescribed greenhouse gas concentrations, which meant that the warming at the end of the 21st century for a specific greenhouse gas concentration was very uncertain. When the climate models are used with the AERA, however, emissions are continuously adjusted according to the calculated temperature and the intended temperature goal. On this basis, the model temperature is eventually stabilised at the intended level and all the models simulate the same warming, but with different emission pathways. “The AERA enables us to study impacts such as heat waves or ocean acidification for different temperature goals – such as 1.5°C versus 2°C versus 3°C – on a consistent basis and with state-of-the-art models,” explains Terhaar.

    Worldwide, 11 research groups have already started to apply the algorithm under the leadership of the University of Bern in order to study such impacts.

    Information about the publication:

    Jens Terhaar, Thomas L. Frölicher, Mathias T. Aschwanden, Pierre Friedlingstein, Fortunat Joos. Adaptive emission reduction approach to reach any global warming target, Nature Climate Change

    DOI: 10.1038/s41558-022-01537-9

    Oeschger Center for Climate Change Research

    The Oeschger Center for Climate Change Research (OCCR) is one of the strategic centers of the University of Bern. It brings together researchers from 14 institutes and four faculties. The OCCR conducts interdisciplinary research at the cutting edge of climate change research. The Oeschger Center was founded in 2007 and bears the name of Hans Oeschger (1927-1998), a pioneer of modern climate research, who worked in Bern.

    Further information: www.oeschger.unibe.ch

    Source: University of Bern

  • Researchers develop system for improved latent fingerprint recognition


    Newswise — Recently, a research group led by Prof. LONG Shibing from the University of Science and Technology of China (USTC) of the Chinese Academy of Sciences, collaborating with Prof. LIU Qi from Fudan University, developed an in-sensor reservoir computing (RC) system for latent fingerprint recognition with deep ultraviolet photo-synapses and a memristor array. This study was published in Nature Communications.

    Deep ultraviolet (DUV) photodetectors play a pivotal role in deep space exploration, environmental monitoring, and bio-information identification. However, conventional ex-situ DUV fingerprint recognition systems use separate sensor, memory, and processor units, which significantly increases decision-making latency and overall power consumption. Inspired by the human visual perception system, the research group constructed a DUV in-sensor RC system with optical synapses as the input layer of the reservoir and the memristor device array as the readout network, which can sense and process in parallel to ensure high efficiency and low power consumption.
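
    The hardware itself cannot be shown in code, but the computational pattern it implements (a fixed, nonlinear dynamical "reservoir" whose output weights are the only trained part) has a standard software analogue, the echo state network. The Python sketch below is that analogue on an invented toy classification task; it is a sketch of the technique, not the authors' system.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Minimal echo-state reservoir: only the linear readout is trained,
    # mirroring the split between fixed photo-synapse dynamics (the reservoir)
    # and a trainable memristor readout.
    n_in, n_res, n_out = 16, 200, 10
    W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
    W = rng.normal(0, 1, (n_res, n_res))
    W *= 0.9 / np.abs(np.linalg.eigvals(W)).max()   # spectral radius < 1

    def reservoir_state(seq):
        """Run an input sequence through the fixed reservoir; return final state."""
        x = np.zeros(n_res)
        for u in seq:
            x = np.tanh(W_in @ u + W @ x)
        return x

    # Toy stand-in for fingerprints: ten prototype pixel sequences plus noise.
    prototypes = rng.normal(size=(n_out, 8, n_in))
    X, y = [], []
    for label, proto in enumerate(prototypes):
        for _ in range(20):
            X.append(reservoir_state(proto + rng.normal(0, 0.3, proto.shape)))
            y.append(label)
    X, y = np.array(X), np.array(y)

    # Train the readout with ridge regression on one-hot targets.
    Y = np.eye(n_out)[y]
    W_out = np.linalg.solve(X.T @ X + 1e-3 * np.eye(n_res), X.T @ Y)
    acc = (np.argmax(X @ W_out, axis=1) == y).mean()
    print(f"Training accuracy of the readout: {acc:.2%}")
    ```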

    The research team used a Ga-rich compositional design to develop amorphous GaOx (a-GaOx) photo-synapses with enhanced persistent photoconductivity (PPC) effects. A non-linear mapping relationship for the DUV in-sensor RC system was constructed by inputting light pulses equivalent to 4-bit values in simulation, so that image pixel sequence information could be sampled into feature values.

    Ultimately, the training of the reservoir outputs was achieved through the stable polymorphic modulation properties of the memristor device array, enabling small-scale DUV fingerprint recognition. Using a dual-feature strategy, the hardware system’s recognition accuracy on DUV fingerprint images is almost identical to the simulated results: the system achieves 100% recognition accuracy after 100 training epochs and maintains 90% accuracy even in the presence of 15% background noise, in line with the anti-noise characteristics of DUV light.

    This fully-hardware DUV in-sensor RC system provides a good reference prototype for efficient recognition and secure applications of latent fingerprints. It is also a critical reference for developing intelligent optoelectronic devices in the DUV waveband.

    “This prototype system … will provide more insight into emerging in-sensor reservoir computing. Overall, the topic of this work is truly interesting.” said one referee for Nature Communications.

    Source: University of Science and Technology of China

  • Palm e-tattoo can tell when you’re stressed out


    Newswise — Our palms tell us a lot about our emotional state, tending to get sweaty when we are excited or nervous. This reaction is used to measure emotional stress and to help people with mental health issues, but today’s devices for doing so are bulky and unreliable, and they can perpetuate social stigma by placing very visible sensors on prominent parts of the body.

    Researchers at The University of Texas at Austin and Texas A&M University have applied emerging electronic tattoo (e-tattoo) technology to this type of monitoring, known as electrodermal activity or EDA sensing. In a new paper published recently in Nature Communications, the researchers created a graphene-based e-tattoo that attaches to the palm, is nearly invisible and connects to a smart watch.
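
    The paper's contribution is the sensor itself, but it may help to see what EDA data yields downstream. A common first step is to split the skin-conductance signal into a slow "tonic" level and faster "phasic" responses, counting phasic peaks as a rough arousal index. The Python sketch below does this on synthetic data; the moving-average detrending and fixed threshold are deliberately simple stand-ins, not anything from the study.

    ```python
    import numpy as np

    fs = 4.0                                  # sampling rate, Hz
    t = np.arange(0, 120, 1 / fs)             # two minutes of signal
    rng = np.random.default_rng(0)

    # Synthetic skin conductance (microsiemens): slow drift plus three
    # bell-shaped stress responses plus measurement noise.
    tonic_true = 2.0 + 0.005 * t
    phasic_true = sum(np.exp(-((t - t0) ** 2) / 8) for t0 in (20, 55, 90))
    eda = tonic_true + 0.6 * phasic_true + rng.normal(0, 0.02, t.size)

    # Estimate the tonic level with a wide (~20 s) moving average,
    # then take the residual as the phasic component.
    win = int(20 * fs)
    tonic_est = np.convolve(eda, np.ones(win) / win, mode="same")
    phasic_est = eda - tonic_est

    # Count skin-conductance responses as upward threshold crossings.
    above = phasic_est > 0.1
    peaks = np.sum(above[1:] & ~above[:-1])
    print(f"Detected skin-conductance responses: {peaks}")
    ```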

    “It’s so unobtrusive that people sometimes forget they have them on, and it also reduces the social stigma of wearing these devices in such prominent places on the body,” said Nanshu Lu, professor in the Department of Aerospace Engineering and Engineering Mechanics and leader of the project.

    Lu and her collaborators have been advancing wearable e-tattoo technology for many years. Graphene has been a favorite material because of how thin it is and how well it measures electrical potential from the human body, leading to very accurate readings.

    But such ultra-thin materials can’t handle much, if any, strain. That makes applying them to parts of the body that move a lot, such as the palm and wrist, challenging.

    The secret sauce of this discovery is how the e-tattoo on the palm successfully transfers data to a rigid circuit – in this case, a commercially available smart watch – in out-of-lab, ambulatory settings. The researchers used a serpentine ribbon in which two layers of graphene and gold partially overlap. Because the ribbon snakes back and forth, it can handle the strain that comes with hand movements during everyday activities such as holding a steering wheel, opening doors and running.

    Current palm-monitoring technology uses bulky electrodes that fall off and are very visible, or EDA sensors applied to other parts of the body, which give less accurate readings.

    Other researchers have tried similar methods using nanometer-thick straight-line ribbons to connect the tattoo to a reader, but those ribbons couldn’t handle the strain of constant movement.

    Lu said the researchers were inspired by virtual reality (VR), gaming and the incoming metaverse for this research. VR is used in some cases to treat mental illness; however, the human-aware capability in VR remains lacking in many ways.

    “You want to know whether people are responding to this treatment,” Lu said. “Is it helping them? Right now, that’s hard to tell.”

    Other members of the team include Hongwoo Jang and Eunbin Kim from the Texas Materials Institute; Sangjun Kim and Kyoung-Ho Ha from the Walker Department of Mechanical Engineering; Xiangxing Yang from the Chandra Family Department of Electrical and Computer Engineering; and Kaan Sel and Roozbeh Jafari from Texas A&M’s Department of Electrical and Computer Engineering.

    Source: University of Texas at Austin (UT Austin)

  • To save nature, focus on populations, not species


    Newswise — AMHERST, Mass. – Human-released greenhouse gasses are causing the world to warm, and with that warming comes increasing stress for many of the planet’s plants and animals. That stress is so great that many scientists believe we are currently in the midst of the “sixth extinction,” when entire species are disappearing up to 10,000 times faster than before the industrial era. However, scientists have been uncertain which ecosystems, and which species, are most at risk. New research, recently published in Nature Climate Change, is the first to show that the focus on species-level risk obscures a wide variability in temperature tolerance, even within the same species, and that this variability is greater for marine species than terrestrial ones. The findings have immediate implications for management and conservation practices and offer a window of hope in the effort to adapt to a rapidly warming world.

    “One of the most important biological discoveries in the last century is that evolution can happen much more quickly than previously thought,” says Brian Cheng, professor of marine ecology at the University of Massachusetts Amherst and the paper’s senior author. “One of the implications of this is that different populations of the exact same species can adapt to their local environments more readily than traditional biology would have thought possible.”

    It turns out that this rapid, localized adaptation may be able to help ensure survival in a warming world.

    By conducting a meta-analysis of 90 previously published studies, from which Cheng and his co-authors mined data on 61 species, the team was able to construct a set of “upper thermal limits”—specific temperatures above which each species could not survive. However, by zooming in further and looking at 305 distinct populations drawn from that pool of 61 species, they found that different populations of the same marine species often had widely different thermal limits. This suggests that some populations have evolved different abilities to tolerate high temperatures. The key, then, is to keep different populations of the same species connected so that the populations that have adapted to higher temperatures can pass this advantage on to the populations with lower thermal limits.
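
    To see why the level of analysis matters, here is a minimal Python sketch with invented thermal-limit numbers: a species-level mean hides how widely populations of the same species can differ. None of these values come from the study.

    ```python
    import pandas as pd

    # Hypothetical upper thermal limits (degC) for populations of three species.
    df = pd.DataFrame({
        "species":    ["A"] * 4 + ["B"] * 3 + ["C"] * 4,
        "population": list(range(4)) + list(range(3)) + list(range(4)),
        "limit_c":    [34.1, 35.8, 33.2, 36.5,
                       29.9, 30.2, 30.0,
                       38.0, 40.1, 37.2, 39.4],
    })

    # The usual species-level view: one number per species.
    print("Species means:")
    print(df.groupby("species")["limit_c"].mean().round(1))

    # The population-level view: spread of limits within each species.
    print("\nWithin-species range of limits:")
    print(df.groupby("species")["limit_c"].agg(lambda x: x.max() - x.min()).round(1))
    ```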

    In other words, imagine a wide-ranging marine species, such as the diminutive Atlantic killifish, which occurs from the warm Florida coast of the United States north to the frigid waters of Newfoundland, Canada. The northern killifish populations may be better able to withstand warming waters if some of their southern kin are able to naturally shift their range to the north.

    “Scale matters,” says Matthew Sasaki, a marine biologist and evolutionary ecologist who completed this research as part of his postdoctoral fellowship at the University of Connecticut and is the paper’s lead author. “The patterns you see across species aren’t the same you see within species, and the big-picture story doesn’t necessarily match what is happening on the local level.”

    In yet another twist, the team, which was funded by the National Science Foundation and was composed of biologists specializing in terrestrial as well as marine ecosystems, discovered that this intra-species variability was primarily a feature of animals living in the ocean and intertidal areas. Populations of widespread species that live on land or in freshwater exhibit far more homogeneity in their thermal limits, and thus could be more sensitive to rising temperatures. However, on land, plants and animals can take advantage of microclimates to cool down and avoid extreme temperatures, by moving into shady spots, for example.

    Taken together, the research suggests that a one-size-fits-all-species approach to conservation and management won’t work. Instead, write the authors, we need to understand how populations have adapted to their local conditions if we want to predict their vulnerability to changing conditions. A more effective approach would include ensuring that marine species can find wide swaths of undamaged habitat throughout their entire range, so that different populations of the same species can mix and pass on the adaptations that help them survive warmer waters. And on land, we need to maintain large patches of cool ecosystems—such as old-growth forests—that terrestrial species can use as refuges.

    “The glimmer of hope here,” says Cheng, “is that with conservation policies tailored to individual populations, we can buy them time to adapt to the warming world.”

     

    Source: University of Massachusetts Amherst

  • Archaeology: Owl-shaped plaques may have been on Copper Age children’s wish list


    Newswise — Ancient owl-shaped slate engraved plaques, dating from around 5,000 years ago in the Iberian Peninsula, may have been created by children as toys, suggests a paper published in Scientific Reports. These findings may provide insights into how children used artefacts in ancient European societies.

    Around 4,000 engraved slate plaques resembling owls – with two engraved circles for eyes and a body outlined below – and dating from the Copper Age between 5,500 and 4,750 years ago have been found in tombs and pits across the Iberian Peninsula. It has been speculated that these owl plaques may have had ritualistic significance and represented deities or the dead.

    Now, Juan Negro and colleagues have re-examined this interpretation and suggest instead that these owl plaques may have been crafted by young people, modelled on regional owl species, and may have been used as dolls, toys, or amulets. The authors assessed 100 plaques and rated them on a scale of one to six based on how many of six owl traits they displayed: two eyes, feathery tufts, patterned feathers, a flat facial disk, a beak, and wings. They compared these plaques to 100 modern images of owls drawn by children aged 4 to 13 years old and observed many similarities between the two sets of depictions. The drawings more closely resembled real owls as the children aged and became more skilful.

    The authors also observe the presence of two small holes at the top of many plaques. These holes appear impractical for passing a cord through to hang the plaque, and they lack the wear marks expected if that had been their use. Instead, the authors speculate that feathers could have been inserted through the holes to resemble the tufts on the heads of some regional owl species, such as the long-eared owl (Asio otus).

    The authors propose that, rather than being carved by skilled artisans for use in rituals, many of the owl plaques were created by children, and more closely resembled owls as the children’s carving skills increased. They may represent a glimpse into childhood behaviours in Copper Age societies.

    Source: Scientific Reports

  • Clouds less climate-sensitive than assumed


    Newswise — In a major field campaign in 2020, Dr. Raphaela Vogel, who is now at Universität Hamburg’s Center for Earth System Research and Sustainability (CEN), and an international team from the Laboratoire de Météorologie Dynamique in Paris and the Max Planck Institute for Meteorology in Hamburg analyzed observational data they and others had collected in fields of cumulus clouds near the Atlantic island of Barbados. Their analysis revealed that these clouds’ contribution to climate warming has to be reassessed.

    “Trade-wind clouds influence the climate system around the globe, but the data demonstrate that they behave differently than previously assumed. Consequently, an extreme rise in Earth’s temperatures is less likely than previously thought,” says Vogel, an atmospheric scientist. “Though this aspect is very important for more accurately projecting future climate scenarios, it definitely doesn’t mean we can back off on climate protection.”

    To date, many climate models have simulated a major reduction in trade-wind clouds, which would mean much of their cooling function would be lost and the atmosphere would consequently warm even more. The new observational data shows that this isn’t likely to occur.

    What is certain is that, as global warming progresses, more water on the ocean’s surface evaporates and the moisture near the base of trade-wind clouds increases. In contrast, the air masses in the upper part of the clouds are very dry and only become slightly moister. This produces a substantial difference in moisture above and below. In the atmosphere, this is dispelled when the air masses mix. The previous hypothesis: drier air is transported downward, causing the cloud droplets to evaporate more rapidly and making it more likely that the clouds will dissipate.

    The observational data from Barbados now offers the first robust quantification as to how pronounced the vertical mixing actually is, and how this affects moisture and cloud cover as a whole. As such, it is the first data to shed light on a process that is essential to understanding climate change. In brief: more intensive mixing does not make the lower layers drier or make the clouds dissipate. Rather, the data shows that the cloud cover actually increases with increasing vertical mixing.

    “That’s good news, because it means that trade-wind clouds are far less sensitive to global warming than has long been assumed,” says Vogel. “With our new observations and findings, we can now directly test how realistically climate models portray the occurrence of trade-wind clouds. In this regard, a new generation of high-resolution climate models that can simulate the dynamics of clouds around the globe down to scales of one kilometer are particularly promising. Thanks to them, future projections will be more accurate and reliable.”

     

    The month-long EUREC4A field campaign (2020) was designed by the team around extended flights with two research aircraft, which were equipped with different instruments and operated at different altitudes, and shipboard measurements from the R/V Meteor, a German research vessel managed by the University of Hamburg. One plane dropped hundreds of atmospheric probes from an altitude of nine kilometers; as they fell, the probes gathered data on temperature, moisture, pressure and wind. The other plane surveyed clouds at their base, at an altitude of 800 meters, while the ship performed surface-based measurements. The result: an unprecedented database that will help researchers understand the unclear role of clouds in the climate system – and more accurately predict their role in future climate change.

    Whether clouds have a cooling or warming effect depends on how high they are. With a maximum altitude of two to three kilometers, the trade-wind clouds examined here are comparatively low, reflect sunlight, and cool the atmosphere in the process. In contrast, higher clouds amplify the greenhouse effect, warming the climate.

    Publication: Vogel R, Albright AL, Vial J, George G, Stevens B, Bony S (2022): Strong cloud-circulation coupling explains weak trade cumulus feedback. Nature, DOI: 10.1038/s41586-022-05364-y, https://www.nature.com/articles/s41586-022-05364-y

    Nature Research Briefing: https://doi.org/10.1038/d41586-022-03640-5


    Universität Hamburg


  • ZTF makes first discovery of a rare cosmic “lunch”


    Newswise — The universe can be a violent place. Stars die or collide with each other, and black holes devour everything that gets too close. These and other events produce flashes of light in the night sky that astronomers call transients. The Zwicky Transient Facility (ZTF) is currently one of the largest transient surveys astronomers use to study the ever-changing universe. The survey is also a treasure trove of rare, strange, and unusual events that astronomers often discover by chance.

    “Our new search technique helps us to quickly identify rare cosmic events in the ZTF survey data. And since ZTF and upcoming larger surveys such as the Vera Rubin Observatory’s LSST scan the sky so frequently, we can now expect to uncover a wealth of rare or previously undiscovered cosmic events and study them in detail,” says Igor Andreoni, a postdoctoral associate in the Department of Astronomy at UMD and NASA Goddard Space Flight Center.

    AT2022cmc is a peculiar case of what is known as a tidal disruption event, or TDE. TDEs happen when a star approaching a black hole is violently ripped apart by the black hole’s gravitational tidal forces—similar to how the Moon raises tides on Earth, but with far greater strength. Pieces of the star are then captured into a swiftly spinning disk orbiting the black hole. Finally, the black hole consumes what remains of the doomed star in the disk.

    In some extremely rare cases, such as AT2022cmc, the supermassive black hole launches “relativistic jets”—beams of matter traveling close to the speed of light—after destroying a star. AT2022cmc was discovered in February 2022, and astronomers led by Andreoni followed it up with multiple facilities at multiple wavelengths. The analysis is now published in the journal Nature.

    “The last time scientists discovered one of these jets was well over a decade ago,” said Michael Coughlin, an assistant professor of astronomy at the University of Minnesota Twin Cities and co-lead on the paper. “From the data we have, we can estimate that relativistic jets are launched in only 1% of these destructive events, making AT2022cmc an extremely rare occurrence. In fact, the luminous flash from the event is among the brightest ever observed.”

    The novel data-crunching method – equivalent to searching through a million pages of information every night – allowed Andreoni and colleagues to rapidly analyze the ZTF data and identify AT2022cmc as a TDE with relativistic jets. They quickly began follow-up observations, which revealed an exceptionally bright event across the electromagnetic spectrum, from X-rays to millimeter and radio wavelengths.
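    The paper’s actual selection criteria are more involved, but programmatic alert filtering of this general kind can be sketched as follows; every field name and threshold below is invented purely for illustration:

    ```python
    # Toy filter over a nightly stream of survey alerts. Real ZTF filters work on
    # the official alert-packet schema with far more careful criteria; all names
    # and cuts here are made up to show the shape of the approach.
    from dataclasses import dataclass

    @dataclass
    class Alert:
        object_id: str
        mag_change_per_day: float  # how fast the source brightens or fades
        n_prior_detections: int    # archival detections at this sky position
        color_g_minus_r: float     # rough optical color

    def looks_like_fast_red_transient(a: Alert) -> bool:
        """Keep fast-evolving, red sources with no archival counterpart."""
        return (abs(a.mag_change_per_day) > 0.25
                and a.n_prior_detections == 0
                and a.color_g_minus_r > 0.5)

    nightly_alerts = [
        Alert("ZTF-candidate-1", 0.02, 14, 0.1),  # slow, known variable star
        Alert("ZTF-candidate-2", 0.60, 0, 0.9),   # fast, red, new: follow up
    ]
    candidates = [a for a in nightly_alerts if looks_like_fast_red_transient(a)]
    print([a.object_id for a in candidates])
    ```

    The value of such a filter is not any single cut but that it runs automatically over the whole nightly stream, so the handful of survivors can be handed to humans and telescopes within hours.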

    ESO’s Very Large Telescope revealed that AT2022cmc lies at a cosmological distance of 8.5 billion light years. Optical and infrared images from the Hubble Space Telescope and radio observations from the Very Large Array pinpointed the location of AT2022cmc with extreme precision.

    The researchers believe that AT2022cmc sits at the center of a galaxy that is not yet visible because the light from AT2022cmc outshines it; future observations with the Hubble or James Webb space telescopes may unveil the galaxy once the transient fades.

    It is still a mystery why some TDEs launch jets while others may not. From their observations, Andreoni and his team concluded that the black holes in AT2022cmc and other similarly jetted TDEs are likely spinning rapidly so as to power the extremely luminous jets. This suggests that a rapid black hole spin may be one necessary ingredient for jet launching—an idea that brings researchers closer to understanding the physics of supermassive black holes at the center of galaxies billions of light years away.

    Before AT2022cmc, only a couple of possible jetted TDEs were known, primarily discovered by gamma-ray space missions, which detect the highest-energy forms of radiation produced by these jets. With their new method, astronomers can now search for such rare events in ground-based optical surveys. 

    “Astronomy is changing rapidly,” Andreoni said. “More optical and infrared all-sky surveys are now active or will soon come online. Scientists can use AT2022cmc as a model for what to look for and find more disruptive events from distant black holes. This means that more than ever, big data mining is an important tool to advance our knowledge of the universe.”

    The paper, “A very luminous jet from the disruption of a star by a massive black hole,” is published in Nature on November 30, 2022.
    DOI: 10.1038/s41586-022-05465-8


    California Institute of Technology


  • Engineered proteins: A future treatment option for COVID-19


    Newswise — COVID-19 has had a lasting global health impact that continues to challenge the health care system. As the coronavirus continues to mutate, the current COVID-19 prevention strategies are plagued with supply chain disruptions, high vaccine manufacturing costs and inconvenient vaccine administration methods. In a study published in the journal Nature Chemical Biology, the lab of Zhilei Chen, PhD, at Texas A&M University School of Medicine engineered two small and specifically targeted proteins that could be administered as a nasal spray to protect against and treat COVID-19.

    The proteins were templated on the designed ankyrin repeat protein (DARPin), a synthetic scaffold inspired by a class of binding proteins commonly found in nature. Compared to conventional antibody-based drugs, DARPins are less prone to “go bad” during prolonged storage at moderate-to-high temperatures and can be made in large quantities at low cost, making DARPins potentially much more affordable. In addition, since DARPins are about one-eighth the size of an antibody, they have the capacity to access specific therapeutically important “hot spots” on a disease-related protein with greater precision.

    In this study, the researchers created two DARPin molecules that assemble in groups of three and block the interaction between the primary protein the COVID-19 virus uses to enter cells and its partner on host cells, thus stopping the virus in its tracks. When delivered into the noses of animal models infected with the COVID-causing virus, the DARPins reduced the amount of virus accumulating in the airways by up to 100-fold and significantly slowed disease progression. What’s more, the DARPins were effective not only against the original variant but also against all of the newer COVID-causing variants, including the omicron strain. The researchers attribute this broad effectiveness to their engineering design, which produced DARPins able to mimic a key interface on the cellular receptor the virus needs to enter cells.

    “This study offers the possibility of an on-demand nasal spray able to tackle COVID either before or after virus exposure,” Chen said. The team’s discovery provides another, potentially lower-cost therapeutic option for those who cannot receive traditional vaccines or are considered high risk.

    The DARPin molecules were engineered by Vikas Chonira, PhD, with assistance from Rudo Simeon, PhD, both postdoctoral fellows in the Chen lab. This research is part of a larger collaborative effort that included Michael S. Diamond, MD, PhD, from Washington University; Peter D. Kwong, PhD, from the National Institutes of Health; and Zhiqiang An, PhD, from University of Texas Health Houston. Funding for the Chen lab is provided by the NIH New Innovator Award.

    Karuppiah Chockalingam, PhD, a research assistant professor at the School of Medicine, contributed to this article.


    Texas A&M University


  • Biodiversity in Africa and Latin America at risk from oil palm expansion, new report warns


    Newswise — Zero deforestation commitments may inadvertently leave vital habitats in Latin America and Africa vulnerable to agricultural expansion, a new study has found. 

    The study highlights how sustainability commitments, which play an important role in preventing the destruction of tropical rainforest, fail to protect nature in tropical grassy and dry forest habitats such as the Llanos in Colombia, Beni savanna in northern Bolivia, and Guinean and Congolian savannas in West and Central Africa. 

    The research team, led by the University of York, calculated that if oil palm producers cleared these habitats to make way for new plantations, a third of vertebrates on the International Union for Conservation of Nature’s red list of threatened species could be affected, including the blue-throated macaw in Bolivia, the giant pangolin in Congo, and Hellmich’s rocket frog in Colombia.

    For the study, researchers mapped the areas around the globe that are at risk from new oil palm plantations. They identified 167 million hectares that are potentially suitable for the crop while still meeting the Roundtable on Sustainable Palm Oil’s (RSPO) definition of ‘zero deforestation’. Of those 167 million hectares, 95 million are in grasslands and dry forests, mostly in South America and Africa. 

    As global demand for agricultural land increases, researchers are calling for urgent protections for these habitats, which support a rich array of species and act as an important carbon store.

    Co-author of the study, Professor Jane Hill from the Department of Biology and the Leverhulme Centre for Anthropocene Biodiversity at the University of York, said: “Palm oil is at the sharp edge of debate on how we can balance the need to feed the world and sustain livelihoods, while protecting nature. 

    “With a yield estimated to be six times higher than that of many other vegetable oils, such as oilseed rape, palm oil is regarded as a miracle crop, and it supports the livelihoods of millions of people in tropical countries around the world. So rather than avoiding or banning palm oil, we need to ensure effective international policies and governance to protect not just tropical rainforest but tropical grasslands and dry forests too.

    “Our study highlights how current sustainability commitments could have the unintended consequence of putting areas of remarkable biodiversity at risk from the expansion of oil palm agriculture.” 

    Since 2018, many oil palm companies have signed up to the RSPO’s zero deforestation commitments, which means they cannot expand plantations into tropical rainforest or peatlands. 

    While concern from buyers and consumers about the environmental impact of palm oil has helped to drive membership of the scheme, many oil palm producers are yet to sign up to these commitments.

    First author, Dr Susannah Fleiss, who carried out the study while researching her PhD at the University of York, said: “Although we found that oil palm yield in areas currently covered by grassland and dry forest would be lower than in tropical rainforest, these sites would still be attractive for the expansion of oil palm agriculture. We also found that irrigation would improve yield in many of these locations, potentially making them more attractive for expansion. 

    “Clearing these areas for plantations would have a serious impact on biodiversity, potentially reducing the ranges of one quarter of vertebrate species that are currently threatened with extinction. Plantation development would replace the existing habitat in these areas, disrupting the ability of the species present to find food and water, and affecting their migration routes. 

    “Large numbers of people live in tropical grassy and dry forest regions, where they often play a critical role in ecological processes such as burning and grazing. The expansion of oil palm agriculture in these areas could lead to a number of interlinked issues for local people and biodiversity. 

    “Our study highlights the strong need for internationally-coordinated governance to protect these habitats, in addition to the existing global efforts to protect tropical rainforest.” 

    Co-author Dr Phil Platts, Honorary Fellow at the University of York and Director of Earth Observation at BeZero Carbon, said: “Sustainability guidelines for palm oil were developed in the context of Southeast Asia’s rainforests, and so reflect the structure and function of those habitats. Now expansion is shifting to different ecological contexts, the scope of sustainability commitments must similarly expand, in line with the distinct biodiversity and carbon stocks now under threat.” 

    The research, published in the journal Nature Ecology & Evolution, was funded by Unilever and carried out in collaboration with the University of Liverpool, the University of Oxford, the Potsdam Institute for Climate Impact Research, and BeZero Carbon.


    University of York


  • Spin correlation between paired electrons demonstrated


    Newswise — Physicists at the University of Basel have experimentally demonstrated for the first time that there is a negative correlation between the two spins of an entangled pair of electrons from a superconductor. For their study, the researchers used spin filters made of nanomagnets and quantum dots, as they report in the scientific journal Nature.

    The entanglement between two particles is among those phenomena in quantum physics that are hard to reconcile with everyday experiences. If entangled, certain properties of the two particles are closely linked, even when far apart. Albert Einstein described entanglement as a “spooky action at a distance”. Research on entanglement between light particles (photons) was awarded this year’s Nobel Prize in Physics.

    Two electrons can be entangled as well – for example, in their spins. In a superconductor, the electrons form so-called Cooper pairs, which are responsible for lossless electrical currents and in which the individual spins are entangled.

    For several years, researchers at the Swiss Nanoscience Institute and the Department of Physics at the University of Basel have been able to extract electron pairs from a superconductor and spatially separate the two electrons. This is achieved by means of two quantum dots – nanoelectronic structures connected in parallel, each of which allows only single electrons to pass.

    Opposite electron spins from Cooper pairs

    The team of Prof. Dr. Christian Schönenberger and Dr. Andreas Baumgartner, in collaboration with researchers led by Prof. Dr. Lucia Sorba from the Istituto Nanoscienze-CNR and the Scuola Normale Superiore in Pisa, has now been able to experimentally demonstrate what has long been expected theoretically: electrons from a superconductor always emerge in pairs with opposite spins.

    Using an innovative experimental setup, the physicists were able to measure that the spin of one electron points upwards when the other is pointing downwards, and vice versa. “We have thus experimentally proven a negative correlation between the spins of paired electrons,” explains project leader Andreas Baumgartner.
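    For readers who like the quantum-mechanical shorthand: in the simplest textbook picture (our gloss, not the paper’s notation), the spin part of a Cooper pair is the singlet state, and the negative correlation follows directly from it:

    ```latex
    % Spin-singlet state of a Cooper pair (textbook form):
    |\psi\rangle = \frac{1}{\sqrt{2}}
        \left( |\uparrow\downarrow\rangle - |\downarrow\uparrow\rangle \right)
    % Measuring both spins along the same axis then always yields opposite results:
    \langle \psi |\, \sigma_z \otimes \sigma_z \,| \psi \rangle = -1
    ```

    Detecting this anticorrelation in transport is a necessary step; as the researchers note below, a full proof of entanglement would additionally require measurements along freely chosen spin axes.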

    The researchers achieved this by using a spin filter they developed in their laboratory. Using tiny magnets, they generated individually adjustable magnetic fields in each of the two quantum dots that separate the Cooper pair electrons. Since the spin also determines the magnetic moment of an electron, only one particular type of spin is allowed through at a time.

    “We can adjust both quantum dots so that mainly electrons with a certain spin pass through them,” explains first author Dr. Arunav Bordoloi. “For example, an electron with spin up passes through one quantum dot and an electron with spin down passes through the other quantum dot, or vice versa. If both quantum dots are set to pass only the same spins, the electric currents in both quantum dots are reduced, even though an individual electron may well pass through a single quantum dot.”

    “With this method, we were able to detect such negative correlations between electron spins from a superconductor for the first time,” Andreas Baumgartner concludes. “Our experiments are a first step, but not yet a definitive proof of entangled electron spins, since we cannot set the orientation of the spin filters arbitrarily – but we are working on it.”

    The research, which was recently published in Nature, is considered an important step toward further experimental investigations of quantum mechanical phenomena, such as the entanglement of particles in solids, which is also a key component of quantum computers.


    University of Basel


  • Protein spheres protect the genome of cancer cells


    Newswise — MYC genes and their proteins play a central role in the emergence and development of almost all cancers. They drive the uncontrolled growth and altered metabolism of tumour cells. And they help tumours hide from the immune system.

    MYC proteins also display a previously unknown activity, one that is now opening new doors for cancer research: they form hollow spheres that protect particularly sensitive parts of the genome. If these MYC spheres are destroyed, the cancer cells die.

    This was reported by a research team led by Martin Eilers and Elmar Wolf from the Institute of Biochemistry and Molecular Biology at Julius-Maximilians-Universität Würzburg (JMU, Bavaria, Germany) in the journal “Nature”. The researchers are convinced that their discovery is a game changer for cancer research, an important breakthrough on the way to new therapeutic strategies.

    Hollow spheres protect sensitive DNA sites

    What the researchers discovered: When cells in the lab are kept under stress conditions similar to those found in fast-growing tumour cells, the MYC proteins in the cell nucleus rearrange themselves in a dramatic way. They join together to form hollow spheres consisting of thousands of MYC proteins.

    The hollow spheres surround and protect individual, particularly sensitive sites in the genome – precisely the sites where two types of enzymes can collide: enzymes that read DNA to synthesize RNA, and enzymes that duplicate DNA. The two can be thought of as trains travelling on a single track, the DNA.

    The hollow spheres thus prevent the two enzymes from colliding. The Würzburg team was able to confirm this observation in cancer cells: if the protective function of the protein spheres is switched off experimentally, the enzymes collide and, as a consequence, multiple breaks occur in the DNA – breaks that ultimately kill the cancer cells.

    Search for specifically effective drugs

    “These observations revolutionize our understanding of why MYC proteins are so crucial for the growth of tumor cells,” says Martin Eilers. The new findings also raise the question of whether drugs can be developed that specifically prevent the formation of the hollow spheres.

    To drive this development forward, Eilers and Wolf have started a company. Together with JMU and partners from the pharmaceutical industry, the search for drugs that interfere with the newly discovered functions of the MYC proteins has begun.

    “The fact that investors made it possible for us to set up so quickly is certainly not an everyday occurrence,” say the JMU professors. They also take this as a sign that they have made a very promising discovery.


    University of Würzburg


  • Tufts University Researchers Find Link Between Foods Scored Higher By New Nutrient Profiling System and Better Long-Term Health Outcomes


    Newswise — The idea that what we eat directly affects our health is ancient; Hippocrates recognized this as far back as 400 B.C. But identifying healthier foods in the supermarket aisle and on restaurant menus is increasingly challenging. Now, researchers at the Friedman School of Nutrition Science and Policy at Tufts have shown that a holistic food profiling system, Food Compass, identifies foods associated with better overall health and a lower risk of mortality.

    In a paper published in Nature Communications on November 22, researchers assessed whether adults who ate more foods with higher Food Compass scores had better long-term health outcomes and found that they did.

    Introduced in 2021, Food Compass provides a holistic measure of the overall nutritional value of a food, beverage, or mixed meal. It scores nine domains of each item, such as nutrient ratios, food-based ingredients, vitamins, minerals, extent of processing, and additives. Based on scores of 10,000 commonly consumed products in the U.S., researchers recommend foods with scores of 70 or above as foods to encourage; foods scoring 31-69 to be eaten in moderation; and anything scoring 30 or below to be consumed sparingly. For this new study, Food Compass was used to score a person’s entire diet, based on the Food Compass scores of all the foods and beverages they regularly consume.
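    As a concrete illustration of those published cut-offs, here is a minimal sketch (ours, not the study’s code) of the item-level tiers and a naive diet-level average. The item scores are invented, and we simply average them, whereas the study’s diet-level score may weight foods, for example by how much of each is consumed:

    ```python
    def tier(score: float) -> str:
        """Guidance tiers reported for the 1-100 Food Compass item scores."""
        if score >= 70:
            return "encourage"
        elif score >= 31:
            return "eat in moderation"
        else:
            return "consume sparingly"

    def diet_score(item_scores: list[float]) -> float:
        """Plain average of item scores; the study's diet-level score may be
        weighted (e.g., by intake amounts), which this sketch ignores."""
        return sum(item_scores) / len(item_scores)

    # Hypothetical item scores, chosen only to span the three tiers.
    daily_items = {"oatmeal": 82, "soda": 12, "salmon": 75, "white bread": 28}
    for food, s in daily_items.items():
        print(f"{food}: {s} -> {tier(s)}")
    print(f"diet score: {diet_score(list(daily_items.values())):.1f}")
    ```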

    “A nutrient profiling system is intended to be an objective measure of how healthy a food is. If it’s achieving its purpose, then individuals who eat more foods with higher scores should have better health,” said Meghan O’Hearn, a doctoral candidate at the Friedman School and the study’s lead author.

    For this validation study, researchers used nationally representative dietary records and health data from 47,999 U.S. adults aged 20-85 who were enrolled in the National Health and Nutrition Examination Survey (NHANES) between 1999 and 2018. Deaths were determined through linkage with the National Death Index (NDI).

    Overall, researchers found that the mean Food Compass score for the diets of the nearly 50,000 subjects was only 35.5 out of 100, well below ideal. “One of the most alarming discoveries was just how poor the national average diet is,” said O’Hearn. “This is a call for actions to improve diet quality in the United States.”

    When people’s Food Compass diet scores were assessed against health outcomes, multiple significant relationships emerged, even after adjusting for other risk factors such as age, sex, race, ethnicity, education, income, smoking, alcohol intake, physical activity, and diabetes status. A higher Food Compass diet score was associated with lower blood pressure, blood sugar, blood cholesterol, body mass index, and hemoglobin A1c levels, and with lower prevalence of metabolic syndrome and cancer. A higher Food Compass diet score was also associated with lower risk of mortality: for each 10-point increase, there was a 7 percent lower risk of death from all causes.
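    Assuming, as is standard in such survival analyses, that the association is log-linear in the score (our extrapolation, not a number reported by the authors), the 7 percent per 10 points compounds rather than adds:

    ```latex
    % Reported hazard ratio per 10-point increase in diet score:
    \mathrm{HR}_{10} = 0.93
    % Under a log-linear (proportional hazards) model, a 30-point increase gives
    \mathrm{HR}_{30} = \mathrm{HR}_{10}^{\,3} = 0.93^{3} \approx 0.80
    % i.e. roughly a 20% lower all-cause mortality risk,
    % slightly less than the naive 3 x 7% = 21%.
    ```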

    “When searching for healthy foods and drinks, it can be a bit of a wild west,” said Dariush Mozaffarian, Jean Mayer Professor of Nutrition and dean for policy at the Friedman School. “Our findings support the validity of Food Compass as a tool to guide consumer decisions, as well as industry reformulations and public health strategies to identify and encourage healthier foods and beverages.”

    Compared to existing nutrient profiling systems, Food Compass provides a more innovative and comprehensive assessment of nutritional quality, researchers say. For example, rather than measuring levels of dietary fats, sodium, or fiber in isolation, it takes a more nuanced and holistic view, evaluating the ratio of saturated to unsaturated fat; sodium to potassium; and carbohydrate to fiber. 

    Food Compass also boosts scores for ingredients shown to have protective effects on health, like fruits, non-starchy vegetables, beans and legumes, whole grains, nuts and seeds, seafood, yogurt, and plant oils; and lowers scores for less healthful ingredients like refined grains, red and processed meat, and ultra-processed foods and additives.

    Researchers designed Food Compass with the ever-evolving field of nutrition science in mind, and their multidisciplinary team—comprising researchers with expertise in epidemiology, medicine, economics, and biomolecular nutrition—will continue to evaluate and adapt the tool based on the most cutting-edge nutrition research.

    “We know Food Compass is not perfect,” said Mozaffarian. “But, it provides a more comprehensive, holistic rating of a food’s nutritional value than existing systems, and these new findings support its validity by showing it predicts better health.”

    These findings are timely given the release of the new U.S. National Strategy on Hunger, Nutrition and Health. One pillar of this strategy is to “empower all consumers to make and have access to healthy choices” through measures such as updating food labeling and making it easier to interpret, creating healthier food environments, and creating a healthier food supply.

    “This study further validates Food Compass as a useful tool for defining healthy foods. We hope the Food Compass algorithm—publicly available to all—can help guide front-of-pack labeling; procurement choices in workplace, hospital, and school cafeterias; incentive programs for healthier eating in healthcare and federal nutrition programs; industry reformulations; and government policies around food,” said O’Hearn.  

    Researchers plan to work on a simplified version that requires fewer nutrient inputs, as well as versions tailored to specific conditions such as diabetes and pregnancy or to other nations’ populations. The research team is also interested in adding Food Compass domains based on other aspects of foods, such as environmental sustainability, social justice, or animal welfare.

    “We look forward to continuing to find ways to improve the Food Compass system, and to get it to more users to help clear up confusion about healthier choices,” said Mozaffarian.

    Research reported in this article was supported by the National Institutes of Health’s National Heart, Lung, and Blood Institute under award number 2R01HL115189 and Vail Innovative Global Research. Complete information on authors, funders, and conflicts of interest is available in the published paper.

    The content is solely the responsibility of the authors and does not necessarily represent the official views of the funders.


    Tufts University


  • The body’s own cannabinoids widen the bronchial tubes


    Newswise — Bronchial constriction is what makes many lung diseases like asthma so dangerous. Researchers have discovered a new signalling pathway that causes the airways to widen.

    Inhalation therapy for asthma and other obstructive lung diseases often loses its effect after prolonged use. A research team led by Professor Daniela Wenzel from the Department of Systems Physiology at Ruhr University Bochum, Germany, has now identified an alternative signalling pathway through which the body’s own cannabinoids cause the bronchial tubes to dilate. This raises hope for alternative treatment options. Asthma also appears to be associated with a deficiency of these cannabinoids in the bronchial tubes, which could be one of the causes of the disease. The research team published its findings in the journal Nature Communications on 17 November 2022.

    Bronchial tubes dilated by the body’s own cannabinoids

    Obstructive lung diseases are the third most common cause of death worldwide. They include chronic obstructive pulmonary disease (COPD), which affects many smokers, as well as bronchial asthma. During an asthma attack, the bronchial tubes contract so violently that it is no longer possible to exhale – and this can be life-threatening. “Asthma is an inflammatory process, but what is fatal is the constriction of the bronchial tubes,” explains Annika Simon, lead author of the study. “This is why we are very much interested in the regulation of this constriction.”

    In a previous study, the researchers had likewise focused on the body’s own cannabinoid system, specifically on its effect in the blood vessels of the lungs. The best known endogenous cannabinoid is anandamide. “Since our results show that anandamide dilates the bronchial tubes, we wanted to understand the exact mechanism behind it,” explains Daniela Wenzel.

    Enzyme degrades cannabinoid

    It quickly emerged that the two best-known receptors for anandamide (CB1 and CB2) are irrelevant for this regulation. Therefore, there must be an alternative signalling pathway through which the messenger substance anandamide acts on the bronchial tubes.

    Daniela Wenzel and her team showed that this alternative pathway uses an enzyme called fatty acid amide hydrolase (FAAH). FAAH degrades anandamide, producing, among other things, arachidonic acid, which in turn is converted to prostaglandin E2. “We know that prostaglandin E2 can dilate the bronchial tubes,” points out Annika Simon. Prostaglandin E2 acts via certain receptors and leads to an increase in the messenger substance cAMP (cyclic adenosine monophosphate). “It is precisely this, the increase in cAMP, that is targeted by well-established inhalation medications against asthma,” says Daniela Wenzel. So the goal is the same, but the path is different.

    Anandamide deficiency in asthma

    Wenzel and her team gradually deciphered the signalling pathway. They showed that the enzyme FAAH is located both in the smooth muscle of the bronchial tubes and in the ciliated epithelium. The increase in cAMP after anandamide administration was detected both in the mouse model and in human bronchial cells. To find out whether anandamide could also work in asthma patients, the team used a mouse disease model in which an asthma-like condition can be induced with certain substances. In these animals, too, the administration of anandamide led to a widening of the bronchial tubes. “This means that asthma doesn’t result in resistance to anandamide,” explains Daniela Wenzel. Moreover, the researchers found that asthmatic animals have less anandamide and other endocannabinoids in their bronchial system than healthy animals. “Therefore, it’s possible that this anandamide deficiency is one of the causes of bronchial asthma,” concludes Daniela Wenzel.

    The discovery of the new signalling pathway could also open up new possibilities for intervening in the disease process. “But there’s still a long way to go, and it will certainly take several years,” stresses Daniela Wenzel. She expressly warns patients not to undertake experiments with cannabis plants. “We can’t draw any direct conclusions regarding plant cannabinoids from the findings on endogenous cannabinoids. Exactly which other ingredients are found in cannabis plants besides the known cannabinoids is entirely unclear. Plus, the plants sometimes contain harmful substances.” Nevertheless, the findings of this study are already pointing towards a better understanding of the body’s own cannabinoid system, which could lead to new treatment options for lung diseases in a few years’ time.

    Funding

    The study was funded by the German Research Foundation (funding code: WE4461/1-1).

    https://news.rub.de/english/press-releases/2022-11-17-medicine-bodys-own-cannabinoids-widen-bronchial-tubes


    Ruhr-Universität Bochum


  • Cultural heritage may influence choice of tools by capuchin monkeys, study suggests


    Newswise — Capuchin monkeys (Sapajus spp.) are among only a few primates that use tools in day-to-day activities. In the Cerrado and Caatinga, they use stones as hammers and anvils to crack open cashew nuts, seed pods of Hymenaea courbaril (West Indian locust; jatobá in Brazil) and other hard foods. 

    In an article published in Scientific Reports, Brazilian researchers show that food hardness and tool size do not always correlate as closely as has been thought. 

    In their study, the researchers observed three populations of bearded capuchin monkeys (Sapajus libidinosus), measuring food hardness, tool size and weight, and local availability of stones. They concluded that culture, defined as information passed on from one generation to the next by social learning, can also influence behavior in this regard. 

    “In one of the populations we analyzed, even when they have stones that are suitable for use on a particular food resource, they may use disproportionately heavy tools, possibly evidencing a cultural trait of that group,” said Tiago Falótico, a researcher at the University of São Paulo’s School of Arts, Sciences and Humanities (EACH-USP) supported by FAPESP.

    The population to which he referred lives in Chapada dos Veadeiros National Park in Goiás, a state in Brazil’s Center-West region. In the study, this population was compared with capuchins living in Serra das Confusões National Park, in Piauí, a state in the Northeast region, and another population that lives in Serra da Capivara National Park, about 100 km away in the same state. 

    The tools are pieces of quartzite and sandstone found in places referred to as processing sites. The animals frequent these sites solely to look for these stones for use as hammers and anvils. One stone is used to pound a nut or seed resting on another stone used as an anvil. 

    “In Serra das Confusões, they use smaller tools to open smaller and softer fruit but use large, heavy hammers to crack coconut shells, which are very hard. In Chapada dos Veadeiros, where there are stones of varying sizes to choose from, they use the heaviest ones even for fragile foods,” Falótico said.

    It was no coincidence that the heaviest stone the researchers recorded a capuchin lifting was in this latter park. An adult male weighs 3.5 kg on average, yet the team filmed an individual lifting a hammer stone that was later found to weigh 4.65 kg. “They’re champion weightlifters,” Falótico chuckled.

    Measurements

    The findings were the result of a great deal of hard work. The researchers documented the kinds of food most frequently found in the processing sites, such as babassu (Attalea speciosa), West Indian locust, cashew, and wild cassava (Manihot spp). They also documented the stones available, as well as the sizes and weights of the tools they found, measured the hardness of each type of food using a special device, and observed and filmed tool usage in each study area.

    “We expected to find a very close correlation between the type of food and the size and weight of the tool, but the population in Chapada dos Veadeiros mainly used the larger ones even though stones of all sizes are plentiful and they can choose a smaller size. They probably inherited this habit from their ancestors. It’s a cultural difference compared with the other populations,” Falótico said.
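    The mismatch Falótico describes can be put in statistical terms: within each site, one can correlate the hardness of the target food with the mass of the hammer stone the monkeys chose. Here is a minimal sketch with invented numbers (the study’s real data and tests are in the paper):

    ```python
    from statistics import correlation  # Pearson's r, Python 3.10+

    # Invented (food hardness in arbitrary units, chosen hammer mass in kg) pairs.
    serra_confusoes = [(1.0, 0.4), (2.5, 0.9), (4.0, 1.8), (6.0, 2.6)]    # tools track hardness
    chapada_veadeiros = [(1.0, 2.6), (2.5, 2.3), (4.0, 2.7), (6.0, 2.4)]  # heavy tools regardless

    for name, data in [("Serra das Confusoes", serra_confusoes),
                       ("Chapada dos Veadeiros", chapada_veadeiros)]:
        hardness = [h for h, _ in data]
        mass = [m for _, m in data]
        print(f"{name}: r = {correlation(hardness, mass):.2f}")
    ```

    In a pattern like the invented one above, a correlation near 1 indicates tools matched to food hardness, while a near-zero correlation with uniformly heavy stones mirrors the behavior reported for Chapada dos Veadeiros.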

    The cultural learning hypothesis is reinforced by the fact that studies in other areas, such as Serra de Itabaiana in Sergipe and Chapada Diamantina in Bahia (both states in the Northeast), involving Sapajus capuchins, stones and the same kinds of fruit and seed have not found processing sites or the use of stone tools for this purpose. In Serra das Confusões, the capuchins use tools to crack open several kinds of food except cashew nuts, which are nevertheless abundant.

    “Their behavior isn’t due to the availability of resources but to cultural heritage,” Falótico said.

    The researchers are now analyzing the genomes of all three populations to see if the cultural differences can be linked to genetic differences.

    The study was also supported by FAPESP via a scholarship awarded to Tatiane Valença, a PhD candidate at EACH-USP.

    Human evolution

    A paper by Falótico and a team of archeologists from Germany, Spain and the United Kingdom, published in the Journal of Human Evolution, reports the results of field experiments conducted to test the potential for accidental flake production during nut cracking by capuchins using various types of rock as anvils.

    Some capuchins ingest or anoint themselves with powder produced by pounding stones. They may also rub the powder on their teeth. Their reasons for doing so are unknown, but the researchers believe one aim may be to combat parasites. In the experiments, flakes were also produced by fragmentation of anvils comprising homogeneous material.

    The monkeys did not use the flakes, which closely resembled the lithic tools found by archeologists at digs around the world. The researchers believe the earliest hominins obtained flakes accidentally before their deliberate production for use as tools.

    “Capuchins may also use flakes as tools in future if an innovative individual starts doing so, and others learn by observing. These primates can therefore serve as a model to help us understand human evolution,” Falótico said.

    A previous study by the same group of researchers showed how lithic tools used by the capuchin population in Serra da Capivara displayed different patterns of wear marks depending on the activities involved (read more at: agencia.fapesp.br/35251). 

    Comparisons of the use-wear marks on tools used by monkeys and hominins could reveal how our earliest ancestors used lithic tools. It may therefore be possible to find out more about human evolution from the study of Brazilian capuchin monkeys.

    The article “Stone tools differences across three capuchin monkey populations: food’s physical properties, ecology, and culture” is at: www.nature.com/articles/s41598-022-18661-3

    About São Paulo Research Foundation (FAPESP)

    The São Paulo Research Foundation (FAPESP) is a public institution with the mission of supporting scientific research in all fields of knowledge by awarding scholarships, fellowships and grants to investigators linked with higher education and research institutions in the State of São Paulo, Brazil. FAPESP is aware that the very best research can only be done by working with the best researchers internationally. Therefore, it has established partnerships with funding agencies, higher education, private companies, and research organizations in other countries known for the quality of their research and has been encouraging scientists funded by its grants to further develop their international collaboration. You can learn more about FAPESP at www.fapesp.br/en and visit FAPESP news agency at www.agencia.fapesp.br/en to keep updated with the latest scientific breakthroughs FAPESP helps achieve through its many programs, awards and research centers. You may also subscribe to FAPESP news agency at http://agencia.fapesp.br/subscribe


    Fundação de Amparo à Pesquisa do Estado de São Paulo


  • Sea level rise to dramatically speed up erosion of rock coastlines by 2100


    Newswise — Rock coasts, which make up over half the world’s coastlines, could retreat more rapidly in the future due to accelerating sea level rise. 

    This is according to new Imperial College London research that modelled likely future cliff retreat rates of two rock coasts in the UK. The forecasts are based on predictions of sea level rise for various greenhouse gas emissions and climate change scenarios.  

    The study found that rock coasts, traditionally thought of as stable compared to sandy coasts and soft cliffs, are likely to retreat at a rate not seen for 3,000-5,000 years.  

    At the UK study sites in Yorkshire and Devon, this will cause rock coast cliffs to retreat by at least 10-22 metres inland. The rate of erosion is likely to be between three and seven times today’s rate, and potentially up to ten times.

    Senior author Dr Dylan Rood, of Imperial’s Department of Earth Science and Engineering, said: “Coastal erosion is one of the greatest financial risks to society of any natural hazard. Some rock cliffs are already crumbling, and within the next century, rock coast erosion rates could increase tenfold. Even rock coasts that have been stable in the last hundred years will likely respond to sea level rise by 2030.”

    Globally, coasts are home to hundreds of millions of people and hundreds of billions of dollars of infrastructure like homes, businesses, nuclear power stations, transport links, and agriculture.  

    The researchers are calling on policymakers, planners, and insurers to take action to classify rock coasts as high-risk areas in future planning for climate change response, as well as to limit climate change through achieving Net Zero as an immediate priority.  

    Dr Rood added: “Rock coast erosion is irreversible: now is the time to limit future sea level rise before it’s too late. Humanity can directly control the fate of our coastlines by reducing greenhouse gas emissions — the future of our coasts is in our hands.” 

    The research is published today in Nature Communications. 

    A rocky road 

    The new study is the first to validate models of the expected erosion of hard rock coasts from sea level rise using observational data over prehistoric timescales. Previous studies have mostly focused on theoretical models of soft, sandy coasts. The new results suggest that as sea levels continue to rise, the rate of rock coastal erosion will also accelerate. 

    To study the future rate of erosion, the researchers looked at past and present cliff retreat rates on the coastlines near Scalby in Yorkshire and Bideford in Devon, finding that by 2100 they will likely retreat by 13-22m and 10-14m, respectively.  

    They collected rock samples and analysed them for rare isotopes called cosmogenic radionuclides (CRNs) that build up in rocks exposed to cosmic rays. Concentrations of CRNs in rock reveal how quickly, and for how long, the rock has been exposed, reflecting the rate of erosion and retreat. 
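    To see how a concentration can encode a rate, it helps to recall the textbook steady-state balance between nuclide production, radioactive decay, and removal by erosion (the study’s coastal model is considerably more elaborate, so take this only as the underlying idea):

    ```latex
    % Steady-state CRN concentration at a surface eroding at constant rate epsilon:
    N = \frac{P_0}{\lambda + \rho \, \varepsilon / \Lambda}
    % N: nuclide concentration (atoms per gram of rock)
    % P_0: surface production rate, \lambda: radioactive decay constant
    % \rho: rock density, \Lambda: attenuation length of cosmic rays in rock
    % Faster erosion (larger epsilon) strips rock away before nuclides can
    % accumulate, so a lower measured N implies a higher erosion rate.
    ```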

    They combined these data with observed coastal topography to calibrate a model that tracks the evolution of these rock coasts over time, before comparing them with rates of past sea level change dating back 8000 years. They found that the rate of coastal erosion on these two sites has closely matched the rate of sea level rise.  

    The researchers say this is clear evidence of a causal relationship between cliff retreat and sea level from which future forecasts can be made, and that rock coasts are more sensitive to sea level rise than previously thought. The findings, they say, could be applied to rock coasts worldwide because the rock type is common globally, and similar hard rock coasts are likely to respond in a similar way to sea level rise. 

    Lead author Dr Jennifer Shadrick, who conducted the work in Imperial’s Department of Earth Science and Engineering as a member of the NERC Science & Solutions for a Changing Planet Doctoral Training Partnership, and now works in the marine and coastal risk management team at JBA Consulting, said: “Sea level rise is accelerating, and our results confirm that rock coast retreat will accelerate in line with this. It isn’t a matter of if, but when. 

    “The more positive news is that, now that we have a better idea of magnitudes and timescales, we can adapt accordingly. The more data we have on the effects of climate change on sea level rise and coastal erosion, the more we can prepare by championing urgent policies that protect coasts and their communities.” 

    Sea level rise 

    As the climate warms, sea levels are forecast to rise one metre by 2100 unless greenhouse gas emissions are reduced. 

    This study is the first to confirm with observational data that the rate of past coastal erosion followed the rate of sea level rise over prehistoric timescales. The researchers say this erosion was driven by waves, which will likely get larger and more forceful as future sea level rises, and more land is given over to the sea. 

    While this study looked at the effects of sea level rise, it did not account for the effects of stronger storms, which some studies forecast will happen more frequently due to climate change. Next, the researchers will adapt their model to also forecast the rate of cliff retreat for softer rock coasts, such as chalk. 

    Dr Rood said: “Our study did not account for the effect of increased storms, which may become stronger and more frequent in the future as the climate changes, on wave-driven cliff erosion. However, increased storms would only speed up the cliff retreat even more than our forecasts. This is another angle to the climate crisis we will account for in future studies to give a more complete picture of likely rates of rock coast erosion. We are also looking to improve our models for softer rock coasts where erosion other than by waves is more important.” 

    Dr Shadrick said: “The findings are a stark warning that we must better adapt to coastal retreat or face the loss of the people, homes, and infrastructure that call coastal areas home.” 

    Study co-author Dr Martin Hurst at the University of Glasgow said: “The implication is that rock coasts are more sensitive to sea level rise than previously thought. We need to pay more attention to how our rock coasts continue to erode as sea levels rise. 

    “Heightened erosion risks at our coasts will continue throughout this century. Even if we achieve Net Zero tomorrow, a substantive amount of sea level rise is already baked in as our climate, glaciers and oceans continue to respond to the emissions that have already taken place.”

    This study was funded by the Natural Environmental Research Council (NERC), the British Geological Survey (BGS), and the Australian Nuclear Science and Technology Organisation (ANSTO). 


    Imperial College London
