ReportWire

Tag: Nature (journal)

  • UT-Led Aerial Surveys Reveal Ancient Landscape Beneath East Antarctic Ice Sheet


    Newswise — AUSTIN, Texas — Long before Antarctica froze over, rivers carved valleys through mountains in the continent’s east. Millions of years later, researchers have discovered a remnant of this ancient highland landscape thanks to an aerial survey campaign led by the University of Texas Institute for Geophysics (UTIG).

    The findings were described by researchers at Durham University and Newcastle University, UK, and were published Oct. 24, 2023, in the journal Nature Communications.

    According to the research, the landscape of ancient valleys and ridges formed at least 14 million years ago. The find is unusual because the tremendous weight and motion of the overlying ice sheet should have ground it away.

    Finding such a well-preserved landscape from before the continent’s glaciation gives researchers a geologic reference point to measure how quickly the ice sheet grew and how rapidly it will melt, said coauthor Duncan Young, a UTIG research scientist. 

    “This landscape hanging out there in the middle of the basin is a little bit of an odd phenomenon,” he said. “We’re now working to answer why it was preserved and use that knowledge to find others.”

    Scientists are keen to learn about the land under Antarctica’s ice because it plays a vital role in the stability of the ice sheet. Some landscapes let ice flow rapidly to the ocean; others slow it down or buttress it against intruding seawater. The land also records the history of how the ice sheet grew and retreated.

    The basin where the ancient landscape was found contains enough ice to raise global sea level by more than 25 feet. But less is known about the land under the ice than the surface of Mars, said the paper’s lead author Stewart Jamieson, a professor in the department of geography at Durham University.

    “And that’s a problem because that landscape controls the way that ice in Antarctica flows, and it controls the way it might respond to past, present and future climate change,” he said.  

    The more evidence researchers can find about how the ice sheet grew and retreated in the past the better they will understand how Antarctic ice will respond to ongoing global warming, he said.

    The discovery of the landscape was made using satellite data and radio-echo sounding techniques to map a region of land underneath the ice sheet measuring 32,000 square kilometers (12,355 square miles), about the size of the state of Maryland. It builds on previous work by researchers to map out hidden mountain ranges, canyon systems and lakes beneath the ice in Antarctica.
     
    Although the landscape is not visible to the naked eye, satellite images captured over the region show small undulations of the ice sheet’s surface. The landscape’s existence was confirmed by UTIG-led aerial surveys that used ice-penetrating radar to see through the ice and map the shape of the land beneath the ice sheet.  
     
    The research team believes that there are other undiscovered, ancient landscapes hidden beneath the ice sheet. 

    More could soon be identified thanks to a long-term effort to map unexplored regions of East Antarctica by Young and his collaborators, who have flown hundreds of flights since 2008 using a modified, WWII-era DC-3, equipped with ice-penetrating radar and other instruments. 

    Those surveys helped in the latest discovery and could lead to many more as scientists continue to comb the data.

    “It’s a gift that keeps on giving,” said Young, who helped spearhead an initiative funded by the National Science Foundation to make the radar data available to the broader scientific community.
     
    The research behind the latest discovery was supported by the UK’s Natural Environment Research Council (NERC), the G. Unger Vetlesen Foundation, NSF and NASA.

    Other UTIG co-authors include Senior Research Scientist Don Blankenship, who led the first stage of aerial surveys, and Shuai Yan, a graduate student at The University of Texas at Austin’s Jackson School of Geosciences whose research turned up a lake hidden beneath the ice in another region of the basin. UTIG is a research unit of the Jackson School.

    Adapted from “Ancient landscape discovered beneath East Antarctic Ice Sheet” published Oct. 24, 2023, by Durham University.

    University of Texas at Austin, Jackson School of Geosciences

  • Specific gut bacteria increase risk of severe malaria


    BYLINE: Jackie Maupin

    Newswise — INDIANAPOLIS—Indiana University School of Medicine researchers have identified multiple species of bacteria that, when present in the gut, are linked to an increased risk of developing severe malaria in humans and mice. Their findings, recently published in Nature Communications, could lead to the development of new approaches targeting gut bacteria to prevent severe malaria and associated deaths.

    Malaria is a life-threatening infectious disease caused by parasites transmitted through the bite of infected mosquitoes. According to the World Health Organization’s latest World Malaria Report, an estimated 619,000 people died from malaria globally in 2021, with 76% of those deaths occurring in children age 5 or younger. 

    IU School of Medicine’s Nathan Schmidt, PhD, an associate professor of pediatrics with the Ryan White Center for Pediatric Infectious Disease and Global Health and the Herman B Wells Center for Pediatric Research, said previous efforts to combat the disease have led to several advancements in malaria treatment and prevention, including new vaccines and antimalarial drugs, insecticides to manage mosquito populations and improved health care processes. However, he said new developments are desperately needed because the gains made in decreasing malaria-related deaths between the early 2000s and late 2010s have plateaued over the last five years.

    “This plateau highlights the need for novel approaches to prevent malaria-related fatalities,” said Schmidt, whose research lab is focused on investigating this global health crisis and its critical impact on children. “Presently, there are no approaches that target gut microbiota. Therefore, we believe that our approach represents an exciting opportunity.”

    In a pivotal 2016 article published in PNAS, Schmidt and his colleagues made a groundbreaking discovery in their experimental models: the gut microbiota has the capability to influence the severity of malaria. This revelation ignited their determination to pinpoint the precise microorganisms, called “Bacteroides,” within the intestinal tract that orchestrate this effect. 

    In their latest study, the researchers found mice harboring particular species of Bacteroides were notably associated with an elevated risk of severe malaria. A similar correlation was also observed in the intestinal tracts of children afflicted with severe malaria.

    Most of the Schmidt lab’s research has been conducted using mouse models of malaria. Thanks to collaboration with several colleagues in the field, the research team was able to extend its observations by studying approximately 50 children with malaria in Uganda. They plan to continue their clinical observations by working with a cohort of over 500 children with malaria. 

    This collaboration was made possible by the joint efforts of Chandy John, MD, MS, of IU School of Medicine; Ruth Namazzi, MB ChB, MMEd, of Makerere University; and Robert Opoka, MD, MPH,  of Global Health Uganda. Together, they are evaluating how severe malaria may affect child neurodevelopment by studying children from households with a history of severe malaria. While these children may not display any symptoms of illness, some carry the malaria parasite in their blood, allowing researchers to explore risk factors associated with the development of severe malaria, including variations observed in the microbiome.

    “Dr. Namazzi, Dr. Opoka and I aren’t experts in the microbiome, so we collaborated with Nathan [Schmidt] on this part of the study since he is an expert,” said John, who is the Ryan White Professor of Pediatrics at IU School of Medicine. “I believe Nathan’s findings are important because they point to the possibility that certain bacteria or combinations of bacteria in the gut may predispose a child to severe malaria. This opens the way to thinking about how we might alter those combinations in the gut to try to protect children from severe malaria.”

    In addition to studying the expanded cohort in Uganda, Schmidt and his team will also collaborate with researchers in Malawi and Mali to get a broader sense of trends present between gut microbiota and malaria across Africa. 

    “Beyond our efforts to assess the contribution of gut bacteria towards severe malaria in diverse African populations, we have initiated pre-clinical efforts to target gut bacteria that cause susceptibility to severe malaria,” Schmidt said. “Our long-term aspiration is to move a treatment into the clinic.”

    About Indiana University School of Medicine

    IU School of Medicine is the largest medical school in the U.S. and is annually ranked among the top medical schools in the nation by U.S. News & World Report. The school offers high-quality medical education, access to leading medical research and rich campus life in nine Indiana cities, including rural and urban locations consistently recognized for livability.

    Indiana University

  • New study finds global climate change could impact the flavor and cost of American beer


    BYLINE: Alex Hood

    Newswise — There are few things tastier than the crisp bite of a cold IPA…for now.  

    A recent study published in the journal Nature Communications found the changing global climate may be affecting the flavor and cost of beer.  

    A warmer and drier climate is expected to lower the yield of hops — the aromatic flowers of the Humulus lupulus plant that give beer its signature bitter flavor — in Europe by up to 18 percent by 2050. The alpha acid content of hops is also expected to drop as crops begin to ripen earlier.

    “These climate variations may cause changes in the essential oils of particular varieties of hops,” said Herbert Bruce, assistant professor of practice for undergraduate education in Virginia Tech’s Department of Food Science and Technology and co-creator of the university’s official Fightin’ Hokies beers.  

    Bruce says temperature and rainfall are a big part of that, and both directly affect hop aroma and flavor. “It’s difficult to predict, but that could noticeably alter the aroma and flavor of beer. There’s already seasonal variation in the same variety of hops, but changes in the climate could exacerbate them.”

    According to Bruce, these changes might be more widespread in the brewing industry than consumers would think.

    “It’s important to remember that hops are a key ingredient in all beers, not just IPAs and other very bitter beers,” he said. “It’s also fairly common for American breweries to use European hops, especially noble or German hops in pilsners and other traditional lagers.”

    Bruce was quick to specify that though the exact outcome is uncertain, bitter beers likely aren’t going anywhere, as brewers can adjust the amount of hops they use to maintain bitterness. But that’s much more difficult to do with the unique aromas of different hop varieties.

    If warming temperatures cause decreased crop yields, Bruce said that price will likely be another factor affected.

    “In the U.S. most hops are grown in the northwest. If the study is correct and drier climates reduce hop yield there, it will likely cause prices to go up. This could have a disproportionate impact on smaller craft breweries, as they tend to use only one to three types of hops in their beer,” said Bruce.

    Bruce said it may take some time to see those costs impact the price of beer itself. 

    “Hops are only about four percent of the cost of a bottle of beer, so the price jump isn’t expected to be large initially. However, it’s really difficult to predict what other factors might come into play as the climate affects other areas of the economy.”
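    The cost-share arithmetic behind that estimate is easy to check. Below is a back-of-the-envelope sketch; the 30 percent hop-price increase is a hypothetical figure chosen for illustration, not a number from the study:

```python
# Pass-through of a hop price increase to the retail price of beer,
# assuming hops are ~4% of the cost of a bottle (figure from the article).
hop_cost_share = 0.04        # hops as a share of total beer cost
hop_price_increase = 0.30    # hypothetical 30% rise in hop prices

beer_price_increase = hop_cost_share * hop_price_increase
print(f"{beer_price_increase:.1%}")  # → 1.2%
```

    Even a steep jump in hop prices moves the bottle price by only about a percentage point, consistent with Bruce’s expectation that the initial impact will be small.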

    About Bruce

    Herbert Bruce is assistant professor of practice for undergraduate education in the Virginia Tech Department of Food Science and Technology. He graduated from the Master Brewers program at UC Davis, passed the brewer’s exam from the Institute of Brewing and Distilling, London, and served as head brewer and plant manager of two microbreweries and one brewpub. He now teaches Applied Malting and Brewing Science and co-develops all of the university’s Fightin’ Hokies beers.  

    Interview

    To schedule an interview with Herbert Bruce, contact Margaret Ashburn in the media relations office at [email protected] or 540-529-0814.

    Virginia Tech

  • Unavoidable rise in West Antarctic Ice Sheet melting.


    Newswise — Scientists ran simulations on the UK’s national supercomputer to investigate ocean-driven melting of the West Antarctic Ice Sheet: how much is unavoidable and must be adapted to, and how much melting the international community still has control over through reduction of greenhouse gas emissions.

    Taking into account climate variability like El Niño, they found no significant difference between mid-range emissions scenarios and the most ambitious targets of the 2015 Paris Agreement. Even under a best-case scenario of 1.5°C global temperature rise, melting will increase three times faster than during the 20th century.

    The West Antarctic Ice Sheet is losing ice and is Antarctica’s largest contributor to sea-level rise. Previous modelling suggests this loss could be driven by warming of the Southern Ocean, particularly the Amundsen Sea region. Collectively, the West Antarctic Ice Sheet contains enough ice to raise global mean sea level by up to five metres.

    Around the world, millions of people live near the coast, and these communities will be greatly impacted by sea level rise. A better understanding of the future changes will allow policymakers to plan ahead and adapt more readily.

    Lead author Dr Kaitlin Naughten, a researcher at the British Antarctic Survey, says:

    “It looks like we’ve lost control of melting of the West Antarctic Ice Sheet. If we wanted to preserve it in its historical state, we would have needed action on climate change decades ago. The bright side is that by recognising this situation in advance, the world will have more time to adapt to the sea level rise that’s coming. If you need to abandon or substantially re-engineer a coastal region, having 50 years lead time is going to make all the difference.”

    The team simulated four future scenarios of the 21st century, plus one historical scenario of the 20th century. The future scenarios either stabilised global temperature rise at the targets set out by the Paris Agreement, 1.5°C and 2°C, or followed standard scenarios for medium and high carbon emissions.

    All scenarios resulted in significant and widespread future warming of the Amundsen Sea and increased melting of its ice-shelves. The three lower-range scenarios followed nearly identical pathways over the 21st century. Even under the best-case scenario, warming of the Amundsen Sea sped up by about a factor of three, and melting of the floating ice shelves which stabilise the inland glaciers followed, though it did begin to flatten by the end of the century.

    The worst-case scenario had more ice shelf melting than the others, but only after 2045. The authors note that this high fossil fuel scenario, in which emissions increase rapidly, is considered unlikely to occur.

    This study presents sobering future projections of Amundsen Sea ice-shelf melting but does not undermine the importance of mitigation in limiting the impacts of climate change.

    Naughten cautions: “We must not stop working to reduce our dependence on fossil fuels. What we do now will help to slow the rate of sea level rise in the long term. The slower the sea level changes, the easier it will be for governments and society to adapt to, even if it can’t be stopped.”

    Unavoidable future increase in West Antarctic ice-shelf melting over the 21st century by Kaitlin Naughten (BAS), Paul Holland (BAS), Jan De Rydt (Northumbria) is published in the journal Nature Climate Change.

    British Antarctic Survey

  • Climate elevates toxin risk in Northern US lakes.


    Newswise — Washington, DC— As climate change warms the Earth, higher-latitude regions will be at greater risk for toxins produced by algal blooms, according to new research led by Carnegie’s Anna Michalak, Julian Merder, and Gang Zhao. Their findings, published in Nature Water, identify water temperatures of 20 to 25 degrees Celsius (68 to 77 degrees Fahrenheit) as being at the greatest risk for developing dangerous levels of a common algae-produced toxin called microcystin.  

    Harmful algal blooms result when bodies of water get overloaded with nitrogen and phosphorus runoff from agriculture and other human activities. These excess nutrients can allow blue-green algae populations to grow at an out-of-control rate.

    Some blue-green algal species produce a toxin called microcystin, which can pose a serious health hazard to people and the environment, as well as pose economic risks for fishing and tourism. Microcystin affects liver function and can cause death in wild and domestic animals, including humans in rare instances. It is also classified as a potential carcinogen in cases of chronic exposure.

    “In 2014 an algal bloom in Lake Erie led to high levels of microcystin in water intakes, and residents in Ohio and Ontario were instructed not to drink tap water due to risk of exposure,” Merder cautioned.

    Merder, Michalak, and their colleagues—Carnegie’s Gang Zhao, University of Kansas’s Ted Harris, and Dimitrios Stasinopoulos and Robert Rigby of the University of Greenwich—analyzed samples taken from 2,804 U.S. lakes between 2007 and 2017. They assessed how water temperature affects the occurrence and concentration of microcystin as part of an effort to better understand the risks to water quality posed by climate change.

    Michalak’s lab has taken a leading role in understanding the intersection of climate change and water quality impairments for more than a decade. Previous work has shown that lakes worldwide are already experiencing more severe algal blooms and that nutrient pollution is being exacerbated by changes in rainfall patterns.

    “Lakes are sentinels of climate change,” Michalak said. “They hold the vast majority, 87 percent, of the liquid freshwater on the Earth’s surface, and the warming and precipitation shifts associated with climate change pose some of the greatest threats to water quality around the world and to the health of aquatic ecosystems.”

    The surface temperatures of lakes have already been warming at 0.34 degrees Celsius (0.61 degrees Fahrenheit) per decade, and Merder and Michalak set out to determine what this, as well as future warming, would mean in terms of risk for elevated toxin concentrations.

    “The abundance of blue-green algae is predicted to increase due to climate change as they outcompete other species,” Merder explained. “But previous field studies came to various conclusions about what this means for microcystin concentrations.”

    To inform land and water management strategies, it was important to quantitatively tie toxin levels to water temperature. Merder and Michalak accomplished this through their extensive analysis of lake water samples, revealing that water temperatures in the 20 to 25 degrees Celsius (68 to 77 degrees Fahrenheit) range were the most dangerous in terms of elevated microcystin concentrations. They also found that the impact of temperature is amplified when nutrient concentrations are high.

    By incorporating information from climate models, they were able to demonstrate that areas most susceptible to high toxin concentrations will continue to move northward. In some areas, the relative risk of exceeding water quality guidelines will increase by up to 50 percent in the coming decades. Additionally, they showed that toxin hazards will decrease in a small number of regions further south, as water temperatures begin to exceed those associated with the highest risk.

    “These findings should help demonstrate the serious risk to safe water for drinking, fishing, recreation, and other societal needs in many parts of the United States and the urgency for developing management strategies to prepare,” Michalak concluded. “When we think about water sustainability in the context of global change, we need to focus on the quality of the water as much as we focus on the amount of water.”

    Carnegie Institution for Science

  • Potato starch supplements could be solution to bone marrow transplant complications


    BYLINE: Tessa Roy

    Newswise — Experts at the University of Michigan Health Rogel Cancer Center have found a potential solution for preventing a common and dangerous complication in patients who receive stem cell transplants from a donor’s blood or bone marrow.

    Approximately 18,000 people per year in the United States are diagnosed with life-threatening illnesses, including blood cancers, for which a blood or bone marrow stem cell transplant from a donor is the best treatment option.

    About 9,000 such transplants are performed yearly in the U.S. 

    When patients receive a stem cell transplant, they get a new immune system from the donor; its job is to attack cells that don’t belong there, including cancer cells.

    Sometimes, however, those donor immune cells (the graft) begin to see the patient’s own body (the host) as unfamiliar and foreign. As a result, the donor cells may attack the patient’s own organs and tissues, causing Graft versus Host Disease. 

    GVHD develops in up to half of patients who receive stem cell transplants from a donor’s blood or bone marrow. It can affect many parts of the body and can range from mild or moderate to severe and even life-threatening.

    GVHD is prevented and treated with strong medicines that suppress the immune system, which can leave patients vulnerable to infections that are themselves life-threatening. Therefore, while bone marrow and blood stem cell transplants from a donor are lifesaving for many patients with serious illnesses, GVHD can cause injury or even death, and the treatments available for it are risky.

    Previous research showed that the bacteria that normally live in the intestines and their products can affect whether or not GVHD happens after a transplant. 

    Researchers have found that a food supplement made from potato starch, when given to ten patients who received stem cell transplants from a donor, changed the products of intestinal bacteria in a way that could potentially prevent GVHD from happening.   

    “GVHD is a major limitation to the lifesaving capability of blood or marrow stem cell transplants. It is exciting to think of the prospect of potentially finding a simple, low-cost, and safe approach to mitigating this dangerous complication for patients who need a stem cell transplant, but researching this approach in more patients is still needed to confirm,” said Mary Riwes, D.O., assistant professor of internal medicine and medical director of the inpatient adult stem cell transplant unit of the Medical Directors Partnering to Lead Along with Nurse Managers program.   

    Investigators are currently enrolling more patients for a second phase of this study to determine whether taking potato starch will indeed result in less GVHD after stem cell transplant. Sixty patients age 10 or older who are undergoing a blood or bone marrow stem cell transplant from a donor will be randomized to take either potato starch or placebo starch, in addition to all the usual medications for preventing GVHD, with 80% receiving potato starch and 20% receiving placebo starch. This phase II clinical trial will help researchers learn whether taking potato starch is an effective intervention for preventing GVHD.
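    The 80/20 allocation described above can be sketched as a simple randomization routine. This is a hypothetical illustration (the function name and seed are invented), not the trial’s actual assignment software:

```python
import random

def assign_arms(n_patients, p_potato=0.8, seed=0):
    """Independently assign each patient to potato starch or placebo
    starch with the stated 80/20 split (illustrative only)."""
    rng = random.Random(seed)  # fixed seed for reproducibility
    return ["potato starch" if rng.random() < p_potato else "placebo starch"
            for _ in range(n_patients)]

arms = assign_arms(60)
print(arms.count("potato starch"), arms.count("placebo starch"))
```

    Simple randomization like this only hits the 80/20 ratio on average; a real trial would typically use blocked randomization to guarantee the split exactly.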

    More information about this phase II trial can be found on ClinicalTrials.gov under identifier NCT02763033.

    Additional authors include Jonathan L. Golob, John Magenau, Mengrou Shan, Gregory Dick, Thomas Braun, Thomas M. Schmidt, Attaphol Pawarode, Sarah Anand, Monalisa Ghosh, John Maciejewski, Darren King, Sung Choi, Gregory Yanik, Marcus Geer, Ethan Hillman, Costas A. Lyssiotis, Muneesh Tewari and Pavan Reddy.

    Funding/disclosures: Thanks to the volunteers who participated in the study and the clinical and research staff of the University of Michigan Bone Marrow Transplant program. This work was supported by the National Heart, Lung, and Blood Institute (grant no. P01 HL149633, P.R., M.T., M.M.R.) which facilitated all bio sample analyses. The funder had no role in the design and analysis of the study. Resistant starch was purchased using institutional startup funds (M.M.R). 

    Paper cited: “Feasibility of a dietary intervention to modify gut microbial metabolism in patients with hematopoietic stem cell transplantation,” Nature Medicine. DOI: 10.1038/s41591-023-02587-y

    Michigan Medicine – University of Michigan

  • Johns Hopkins Study Supports Potential for Injectable ‘Chemical Vaccine’ For Malaria Using Atovaquone


    Newswise — Johns Hopkins researchers looking to develop a long-acting, injectable malaria preventive using atovaquone have shown in a new study that drug resistance may not be the challenge scientists thought it was, particularly when atovaquone is used preventively. Malaria parasites in infected patients being treated with atovaquone tend to develop a resistance to the drug. Because of this, atovaquone by itself is not used as a malaria treatment, nor has it been seen as a strong candidate for use as a preventive.

    The study, led by a team of researchers at the Johns Hopkins Malaria Research Institute and the Johns Hopkins University School of Medicine, in conjunction with colleagues at the University of Liverpool, was published online October 12 in Nature Communications. The Malaria Research Institute is based at the Johns Hopkins Bloomberg School of Public Health.

    In their study, the researchers found that the same genetic mutation that renders malaria parasites resistant to atovaquone in patients also destroys the parasite’s ability to live within mosquito hosts—meaning atovaquone-resistant malaria parasites would not be transmissible. The researchers concluded that atovaquone, despite concerns over resistance, holds promise as a long-acting, injectable “chemical vaccine” that could prevent infection in malaria-endemic areas.

    “These findings should reduce concerns about the transmission of atovaquone resistance with atovaquone therapy, particularly when it is used as a chemical vaccine,” says study senior author Theresa Shapiro, MD, PhD, professor of Clinical Pharmacology in the Johns Hopkins University School of Medicine and professor in the W. Harry Feinstone Department of Molecular Microbiology and Immunology at the Bloomberg School.

    Malaria continues to be a major global health burden. According to the World Health Organization, the mosquito-borne parasitic disease afflicted nearly a quarter of a billion people in 2021, killing more than 600,000. Researchers generally agree that, despite the impact of insecticides and other malaria control measures, and the recent development of a malaria vaccine, new approaches against this deadly parasitic pathogen are needed.

    One new approach, described by Shapiro and colleagues at the University of Liverpool in a 2018 preclinical study, would use an injectable, slow-release formulation of atovaquone to provide vaccine-like protection for weeks at a time. Atovaquone is generally considered safe for long-term use even at higher doses, and has the further advantage that it interrupts the malaria life-cycle in human hosts even at the pre-symptomatic stage, when the parasite is developing in liver cells.

    However, when atovaquone is used not as a preventive but as a treatment for symptomatic malaria infection, it often fails due to the emergence of genetically acquired resistance. Shapiro notes that by the time an infection is symptomatic, it involves billions of individual malaria organisms, and in this vast population it is likely that a resistance mutation will appear, if only by random genetic variation. Under atovaquone treatment, parasites with this mutation will come to dominate the infection. Because of the resistance problem, atovaquone is used to treat malaria only in combination with another antimalarial called proguanil.

    Resistance should be much less likely when using atovaquone as a preventive in people who are malaria-free, Shapiro says. The drug in such cases would be acting against a far smaller number of individual parasites that are only in the early, liver-infection stage.

    “In fact, there are no reported cases of atovaquone resistance when the drug has been given prophylactically,” she says.

    Nevertheless, fear of resistance has left a cloud over the drug’s use even as a preventive. Indeed, there have been concerns that the mutation, once it emerged—for example, in a large population treated prophylactically with atovaquone—could spread via human-to-mosquito-to-human transmission.

    In the study, Shapiro’s team examined the resistance problem, focusing on a key resistance mutation, cytochrome-b Y268S, that has been found in clinical investigations involving the major malaria parasite of concern, Plasmodium falciparum. The researchers confirmed that P. falciparum parasites carrying this mutation are thousands of times less susceptible to atovaquone, compared to unmutated parasites.

    However, the scientists also found that the Y268S mutation, while it enables P. falciparum to survive in human hosts being treated with atovaquone, essentially destroys its ability to live within its Anopheles mosquito hosts. This means that atovaquone-resistant mutant parasites cannot spread via transmission from humans to mosquitoes and back again—as the researchers demonstrated using mosquitoes and a P. falciparum-infectable mouse model. For the study, the mice were engrafted with human liver cells and human red blood cells.

    “Testing the mutant parasites for their ability to infect humanized mice is the best in vivo assay we have short of using humans, and strongly supports the inability of drug-resistant parasites to be transmitted by mosquitoes,” says Photini Sinnis, MD, deputy director at the Johns Hopkins Malaria Research Institute and one of the paper’s senior authors.

    The findings suggest that a “chemical vaccine” strategy for protecting people from malaria with atovaquone remains viable and should continue to be investigated. Shapiro and colleagues are collaborating with Andrew Owen, PhD, a professor at the University of Liverpool, and his team to complete preclinical studies and launch a Phase I trial. Owen is principal investigator for LONGEVITY, an international project funded by Unitaid that aims to translate long-acting medicines for malaria and other diseases that disproportionately affect people in low- and middle-income countries.

    “Many advances in malaria medicines that have started at small scale for the protection of travelers later see wider use in endemic areas where they are most needed—and this may be the path atovaquone takes as a chemical vaccine,” Shapiro says.

    The study’s first author was Victoria Balta, PhD, a graduate student working with coauthor David Sullivan, MD, a professor in the Bloomberg School’s Department of Molecular Microbiology and Immunology.

    “Clinically relevant atovaquone-resistant human malaria parasites fail to transmit by mosquito” was co-authored by Victoria A. Balta, Deborah Stiffler, Abeer Sayeed, Abhai Tripathi, Rubayet Elahi, Godfree Mlambo, Rahul Bakshi, Amanda Dziedzic, Anne Jedlicka, Elizabeth Nenortas, Keyla Romero-Rodriguez, Matthew Canonizado, Alexis Mann, Andrew Owen, David Sullivan, Sean Prigge, Photini Sinnis and Theresa Shapiro.

    Funding was provided by Unitaid (2020-38-LONGEVITY); the Johns Hopkins Malaria Research Institute and Bloomberg Philanthropies; and the National Institutes of Health (R01AI132359, R01AI1095453, T32AI138953).

    Disclosures
    Johns Hopkins co-authors Rahul P. Bakshi, Godfree Mlambo, Theresa A. Shapiro, Abhai K. Tripathi, and co-author Andrew Owen of the University of Liverpool are co-inventors on PCT/GB2017/051746 (Atovaquone long-acting injectable formulation).

    Additional author disclosures appear in the Competing interests section at the end of the paper, which is open access.

    # # #


    Johns Hopkins Bloomberg School of Public Health


  • Whaling decimated more fin whales than previously estimated

    Whaling decimated more fin whales than previously estimated


    Key takeaways

    • Whaling in the 20th century destroyed 99% of the Eastern North Pacific fin whale breeding population.
    • Because there is enough genetic diversity, current conservation measures should help the population rebound without becoming inbred.
    • The future of fin whales in the Gulf of California depends on the recovery of the Eastern North Pacific population.

    Newswise — A new genomic study by UCLA biologists shows that whaling in the 20th century destroyed 99% of the Eastern North Pacific fin whale breeding, or “effective,” population — 29% more than previously thought.

    But there is also some good news: Genes among members of this endangered species are still diverse enough that current conservation measures should be enough to help the population rebound without becoming inbred. The study also found that the health of this group is essential for the survival of highly isolated, genetically distinct fin whales in the Gulf of California.

    The study, published in Nature Communications, is among the first to use whole genome information to get a picture of the size and genetic diversity of today’s population. Previous studies had to rely on whaling records or mitochondrial DNA, which is inherited only from the mother, providing limited genetic information.

    In the 19th century, whaling decimated most whale species around the world but left the largest ones — blue and fin whales — largely untouched. That changed with the advent of industrial whaling in the 20th century. By midcentury, close to a million fin whales worldwide had been slaughtered, at least 75,000 of these in the Eastern North Pacific.

    “When you look at whaling records, you can only tell how many were killed. You can’t tell how many there were to begin with,” said corresponding author Meixi Lin, who worked on the project as a UCLA doctoral student and is now a Carnegie Institution for Science postdoctoral fellow at Stanford University. “We know 20th century whaling was severe, but we didn’t know how severe it was for fin whales.”

    To find out, then-postdoctoral researcher and corresponding author Sergio Nigenda-Morales extracted DNA from tissue samples taken from wild fin whales in the Eastern North Pacific and the Gulf of California. He rounded this out with DNA provided by colleagues at the National Oceanic and Atmospheric Administration. In all, 50 whales were studied. Fin whales from the Gulf of California were included because the population there had been undisturbed by whaling, enabling researchers to assess their genetic diversity and learn how they were related to the Eastern North Pacific population.

    “Getting samples from live whales is hard, because you don’t know where they’re going to be — and when they come up, you only have a moment to take the sample before they go back underwater,” said Nigenda-Morales, now an assistant professor at Cal State San Marcos. “It is a humbling experience to conduct field research and interact with the second-largest animal on the planet.”

    The genome analyses revealed that the Gulf of California population diverged around 16,000 years ago, maintaining a population that has hovered around 114 adults of reproductive age. The population of breeding adults is a key indicator of a species’ ability to sustain itself. The Eastern North Pacific effective population remained at around 24,000 individuals for thousands of years, until a severe decline between 26 and 52 years ago — a period that coincides with 20th century whaling — reduced it to only about 305 individuals.
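    As a quick arithmetic check (using the press release's round figures, not the paper's exact estimates), the reported effective-population numbers do imply a decline of roughly 99%:

    ```python
    # Sanity check of the decline implied by the reported effective-population
    # estimates (figures as stated in this press release).
    pre_whaling = 24_000   # Eastern North Pacific effective population, pre-decline
    post_whaling = 305     # estimated effective population after the decline

    decline = 1 - post_whaling / pre_whaling
    print(f"Implied decline: {decline:.1%}")  # ~98.7%, i.e. roughly 99%
    ```
    
    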

    Past ecological studies had suggested a 70% reduction in fin whale populations, while earlier genetic studies estimated a much larger pre-whaling population.

    “It’s usually hard to detect such strong recent reductions in the genome. But in this case, fin whales were really abundant before, which made the sudden reduction very obvious in our data. If the reduction hadn’t been so strong, we wouldn’t have been able to detect it,” Nigenda-Morales said.

    When a population suffers such a drastic decline, harmful genes left in the remaining organisms can become amplified over time as the small population size inevitably forces individuals carrying those genes to breed together. These harmful genes can reduce the health of the overall population and cause it to die out. Genetic diversity is still high among Eastern North Pacific whales, meaning that multiple versions of many genes are still plentiful and harmful genes have not yet become common.

    “Most of this variation originated long, long ago, so genetic diversity in the small number of surviving individuals comes from their ancient history,” said co-author Kirk Lohmueller, a UCLA professor of ecology and evolutionary biology.

    Luckily, thanks to the slow pace of fin whale reproduction, the population reduction caused by whaling at its strongest point lasted for only two fin whale generations — a 50-year span — and ended with the implementation of the international whaling moratorium in 1985. Since then, the population has slowly recovered, and harmful genes have not had time to pile up.

    However, computer simulations show that if the population remains at its current size, the diversity will begin to vanish. The study’s authors write that the most important thing governments can do to aid fin whale recovery is to continue to enforce the whaling ban so that fin whales have time to increase their numbers.

    The future of fin whales in the Gulf of California also depends on the recovery of the Eastern North Pacific population. The genomic analysis showed that many harmful genes have become common in the Gulf of California group, and that its only source of new genetic variants is the occasional Eastern North Pacific whale that wanders into its territory about once every three generations. This infusion of new genetic material, however, has been enough to keep the population going.

    For now, current protections for both populations appear sufficient, though they will need to remain in place for a long time. But climate change, ship strikes and other human-caused disturbances could jeopardize the species’ recovery. The authors expect that ongoing research will help identify additional conservation measures.

    “With improvement in computational models, we can incorporate factors like climate change and relate the risk of extinction from human-mediated processes with what’s happening at the genomic level,” said Lohmueller. “Continuing to develop such models is as important as collecting more data.”

    Nigenda-Morales and Lin undertook the research as doctoral students of UCLA professor and senior author Robert Wayne, who continued working on the project until he passed away late last year. The authors have dedicated the paper to him.


    University of California Los Angeles (UCLA)


  • Superlensing without a super lens: physicists boost microscopes beyond limits

    Superlensing without a super lens: physicists boost microscopes beyond limits


    Newswise — Ever since Antonie van Leeuwenhoek discovered the world of bacteria through a microscope in the late seventeenth century, humans have tried to look deeper into the world of the infinitesimally small.

    There are, however, physical limits to how closely we can examine an object using traditional optical methods. This is known as the ‘diffraction limit’ and is determined by the fact that light manifests as a wave. It means a focused image can never be smaller than half the wavelength of light used to observe an object.
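    The half-wavelength rule can be sketched numerically (the wavelength values below are illustrative choices, not figures from the paper):

    ```python
    # Rule-of-thumb diffraction limit: conventional far-field optics cannot
    # resolve features smaller than about half the wavelength of the light used.
    def diffraction_limit(wavelength_m: float) -> float:
        """Smallest resolvable feature size, approximately lambda / 2."""
        return wavelength_m / 2

    green_light = 550e-9  # visible green light, in metres
    terahertz = 1e-3      # ~1 mm wavelength, the terahertz regime

    print(f"Visible-light limit: {diffraction_limit(green_light) * 1e9:.0f} nm")
    print(f"Terahertz limit:     {diffraction_limit(terahertz) * 1e3:.2f} mm")
    ```
    
    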

    Attempts to break this limit with “super lenses” have all hit the hurdle of severe optical losses, which render the lenses effectively opaque. Now physicists at the University of Sydney have shown a new pathway to achieve superlensing with minimal losses, breaking through the diffraction limit by a factor of nearly four. The key to their success was to remove the super lens altogether.

    The research is published today in Nature Communications.

    The work should allow scientists to further improve super-resolution microscopy, the researchers say. It could advance imaging in fields as varied as cancer diagnostics, medical imaging, or archaeology and forensics. 

    Lead author of the research, Dr Alessandro Tuniz from the School of Physics and University of Sydney Nano Institute, said: “We have now developed a practical way to implement superlensing, without a super lens. 

    “To do this, we placed our light probe far away from the object and collected both high- and low-resolution information. By measuring further away, the probe doesn’t interfere with the high-resolution data, a problem with previous methods.”

    Previous attempts have tried to make super lenses using novel materials. However, most materials absorb too much light to make the super lens useful.

    Dr Tuniz said: “We overcome this by performing the superlens operation as a post-processing step on a computer, after the measurement itself. This produces a ‘truthful’ image of the object through the selective amplification of evanescent, or vanishing, light waves.”

    Co-author, Associate Professor Boris Kuhlmey, also from the School of Physics and Sydney Nano, said: “Our method could be applied to determine moisture content in leaves with greater resolution, or be useful in advanced microfabrication techniques, such as non-destructive assessment of microchip integrity.

    “And the method could even be used to reveal hidden layers in artwork, perhaps proving useful in uncovering art forgery or hidden works.”

    Typically, superlensing attempts have tried to home in closely on the high-resolution information. That is because this useful data decays exponentially with distance and is quickly overwhelmed by low-resolution data, which doesn’t decay so quickly. However, moving the probe so close to an object distorts the image.

    “By moving our probe further away we can maintain the integrity of the high-resolution information and use a post-observation technique to filter out the low-resolution data,” Associate Professor Kuhlmey said.
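    The decay behaviour described above can be sketched with made-up numbers (the decay constant below is hypothetical, chosen only to illustrate the exponential falloff of the evanescent components):

    ```python
    import math

    # Illustrative contrast: evanescent (high-resolution) field components decay
    # exponentially with distance z from the object, while propagating
    # (low-resolution) components do not. kappa is an example value, not a
    # measured quantity.
    kappa = 5.0  # decay constant, per wavelength unit (hypothetical)

    for z in (0.1, 0.5, 1.0):  # probe distance from the object, in wavelengths
        evanescent = math.exp(-kappa * z)
        print(f"z = {z:.1f} wavelengths -> evanescent amplitude {evanescent:.3f}, propagating ~1.0")
    ```
    
    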

    The research was done using light at terahertz frequencies, at millimetre wavelengths, in the region of the spectrum between infrared and microwave.

    Associate Professor Kuhlmey said: “This is a very difficult frequency range to work with, but a very interesting one, because at this range we could obtain important information about biological samples, such as protein structure, hydration dynamics, or for use in cancer imaging.”

    Dr Tuniz said: “This technique is a first step in allowing high-resolution images while staying at a safe distance from the object without distorting what you see.

    “Our technique could be used at other frequency ranges. We expect anyone performing high-resolution optical microscopy will find this technique of interest.”


     Research paper: A Tuniz & B Kuhlmey, ‘Subwavelength terahertz imaging via virtual superlensing in the radiating near field’, Nature Communications (2023)

    DOI: 10.1038/s41467-023-41949-5


    DECLARATION

     

    The authors declare no competing financial interests. Research was in part funded by the Australian Research Council.

     

     


    University of Sydney


  • Research Reveals Deep Neural Networks’ Unique Perception of the World

    Research Reveals Deep Neural Networks’ Unique Perception of the World


    Newswise — CAMBRIDGE, MA — Human sensory systems are very good at recognizing objects that we see or words that we hear, even if the object is upside down or the word is spoken by a voice we’ve never heard.

    Computational models known as deep neural networks can be trained to do the same thing, correctly identifying an image of a dog regardless of what color its fur is, or a word regardless of the pitch of the speaker’s voice. However, a new study from MIT neuroscientists has found that these models often also respond the same way to images or words that have no resemblance to the target.

    When these neural networks were used to generate an image or a word that they responded to in the same way as a specific natural input, such as a picture of a bear, most of them generated images or sounds that were unrecognizable to human observers. This suggests that these models build up their own idiosyncratic “invariances” — meaning that they respond the same way to stimuli with very different features.

    The findings offer a new way for researchers to evaluate how well these models mimic the organization of human sensory perception, says Josh McDermott, an associate professor of brain and cognitive sciences at MIT and a member of MIT’s McGovern Institute for Brain Research and Center for Brains, Minds, and Machines.

    “This paper shows that you can use these models to derive unnatural signals that end up being very diagnostic of the representations in the model,” says McDermott, who is the senior author of the study. “This test should become part of a battery of tests that we as a field are using to evaluate models.”

    Jenelle Feather PhD ’22, who is now a research fellow at the Flatiron Institute Center for Computational Neuroscience, is the lead author of the open-access paper, which appears today in Nature Neuroscience. Guillaume Leclerc, an MIT graduate student, and Aleksander Mądry, the Cadence Design Systems Professor of Computing at MIT, are also authors of the paper.

    Different perceptions

    In recent years, researchers have trained deep neural networks that can analyze millions of inputs (sounds or images) and learn common features that allow them to classify a target word or object roughly as accurately as humans do. These models are currently regarded as the leading models of biological sensory systems.

    It is believed that when the human sensory system performs this kind of classification, it learns to disregard features that aren’t relevant to an object’s core identity, such as how much light is shining on it or what angle it’s being viewed from. This is known as invariance, meaning that objects are perceived to be the same even if they show differences in those less important features.

    “Classically, the way that we have thought about sensory systems is that they build up invariances to all those sources of variation that different examples of the same thing can have,” Feather says. “An organism has to recognize that they’re the same thing even though they show up as very different sensory signals.”

    The researchers wondered if deep neural networks that are trained to perform classification tasks might develop similar invariances. To try to answer that question, they used these models to generate stimuli that produce the same kind of response within the model as an example stimulus given to the model by the researchers.

    They term these stimuli “model metamers,” reviving an idea from classical perception research whereby stimuli that are indistinguishable to a system can be used to diagnose its invariances. The concept of metamers was originally developed in the study of human perception to describe colors that look identical even though they are made up of different wavelengths of light.
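    The idea can be sketched with a toy linear model (a hypothetical stand-in, not the deep networks or the stimulus-synthesis procedure used in the study): any input component the model is blind to can be added to a reference stimulus without changing the model's response, yielding a "metamer" that looks nothing like the original.

    ```python
    import numpy as np

    # Toy metamer sketch: a linear "model" with fewer response dimensions than
    # input dimensions maps many different inputs to the same response.
    rng = np.random.default_rng(0)
    W = rng.normal(size=(4, 16))   # toy model: 16-dim stimulus -> 4-dim response
    x_ref = rng.normal(size=16)    # reference stimulus

    # Perturb x_ref inside the null space of W: the model cannot "see" these
    # directions, so the response is unchanged.
    _, _, Vt = np.linalg.svd(W)
    null_basis = Vt[4:]            # 12 rows spanning the null space of W
    x_metamer = x_ref + null_basis.T @ rng.normal(size=12) * 3.0

    same_response = np.allclose(W @ x_ref, W @ x_metamer)
    different_input = not np.allclose(x_ref, x_metamer)
    print(same_response, different_input)  # True True
    ```

    Deep networks have nonlinear, learned invariances rather than a fixed null space, but the diagnostic logic is the same: stimuli the model cannot distinguish reveal what it ignores.
    
    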

    To their surprise, the researchers found that most of the images and sounds produced in this way looked and sounded nothing like the examples that the models were originally given. Most of the images were a jumble of random-looking pixels, and the sounds resembled unintelligible noise. When researchers showed the images to human observers, in most cases the humans did not classify the images synthesized by the models in the same category as the original target example.

    “They’re really not recognizable at all by humans. They don’t look or sound natural and they don’t have interpretable features that a person could use to classify an object or word,” Feather says.

    The findings suggest that the models have somehow developed their own invariances that are different from those found in human perceptual systems. This causes the models to perceive pairs of stimuli as being the same despite their being wildly different to a human.

    Idiosyncratic invariances

    The researchers found the same effect across many different vision and auditory models. However, each of these models appeared to develop their own unique invariances. When metamers from one model were shown to another model, the metamers were just as unrecognizable to the second model as they were to human observers.

    “The key inference from that is that these models seem to have what we call idiosyncratic invariances,” McDermott says. “They have learned to be invariant to these particular dimensions in the stimulus space, and it’s model-specific, so other models don’t have those same invariances.”

    The researchers also found that they could induce a model’s metamers to be more recognizable to humans by using an approach called adversarial training. This approach was originally developed to combat another limitation of object recognition models, which is that introducing tiny, almost imperceptible changes to an image can cause the model to misrecognize it.

    The researchers found that adversarial training, which involves including some of these slightly altered images in the training data, yielded models whose metamers were more recognizable to humans, though they were still not as recognizable as the original stimuli. This improvement appears to be independent of the training’s effect on the models’ ability to resist adversarial attacks, the researchers say.

    “This particular form of training has a big effect, but we don’t really know why it has that effect,” Feather says. “That’s an area for future research.”

    Analyzing the metamers produced by computational models could be a useful tool to help evaluate how closely a computational model mimics the underlying organization of human sensory perception systems, the researchers say.

    “This is a behavioral test that you can run on a given model to see whether the invariances are shared between the model and human observers,” Feather says. “It could also be used to evaluate how idiosyncratic the invariances are within a given model, which could help uncover potential ways to improve our models in the future.”

    ###


    Massachusetts Institute of Technology (MIT)


  • Newsmakers: Basic Research Findings by Johns Hopkins Scientists Focus on Gene Sequencing, Hearing Loss and a Brain Disorder

    Newsmakers: Basic Research Findings by Johns Hopkins Scientists Focus on Gene Sequencing, Hearing Loss and a Brain Disorder


    Newswise — Yes, Scientists Have Sequenced the Entire Human Genome, But They’re Not Done Yet

    The human genome, from end to end, has been sequenced, meaning scientists worldwide have identified most of the nearly 20,000 protein-coding genes. However, an international group of scientists notes there’s more work to be done. The scientists point out that even though we have nearly converged on the identities of the 20,000 genes, the genes can be cut and spliced to create approximately 100,000 proteins, and gene experts are far from agreement on what those 100,000 proteins are.

    The group, which convened last fall at Cold Spring Harbor Laboratory in New York, has now published a guide for prioritizing the next steps in the effort to complete the human gene “catalog.”

    “Many scientists have been working on efforts to fully understand the human genome, and it’s much more difficult and complex than we thought,” says Steven Salzberg, Ph.D., Bloomberg Distinguished Professor of Biomedical Engineering, Computer Science, and Biostatistics at The Johns Hopkins University. “We have provided a state of the human gene catalog and a guide on what’s needed to complete it.”

    Salzberg, along with Johns Hopkins biomedical engineer and associate professor Mihaela Pertea, Ph.D., M.S., M.S.E., postdoctoral researcher Ales Varabyou and 19 other scientists, offered perspectives on the human gene catalog Oct. 4 in the journal Nature.

    The scientists say that while the final list of protein-coding genes is nearly complete, scientists have not yet fully cataloged the variety of ways that a gene can be cut, or spliced, resulting in “isoforms” of proteins that are slightly different. Some isoforms will not change the protein’s function, but some may differ enough to result in increased risk for a particular trait, condition or illness.
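    A toy enumeration shows how splicing multiplies the catalog (the exon names and the always-kept/optional split below are hypothetical, not from the article):

    ```python
    from itertools import combinations

    # Toy illustration: alternative splicing lets one gene produce several
    # protein isoforms by including or skipping exons.
    exons = ["E1", "E2", "E3", "E4"]

    # Assume the first and last exons are always kept and the middle ones are
    # optional (a simplification for illustration).
    isoforms = []
    for r in range(len(exons) - 1):
        for middle in combinations(exons[1:-1], r):
            isoforms.append([exons[0], *middle, exons[-1]])

    print(len(isoforms))  # 4 isoforms from just two optional exons
    ```
    
    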

    To complete the catalog, the scientists propose a comprehensive look at how each gene is expressed into functional and nonfunctional proteins and the three-dimensional shape of those proteins.

    The scientists also propose a focus on cataloging non-coding RNA genes. RNA is the genetic material that is transcribed from DNA and, in protein-coding genes, follows a molecular path to making proteins. Non-coding RNA genes instead encode other types of molecular material that perform a cellular function.

    Finally, the international group emphasizes the importance of enhancing commonly used databases of gene variations that cause illness and disease, improving clinical laboratory standards for annotating DNA sequencing results, and developing new technology for more effective and precise methods of matching the wide array of proteins to the genes that encode them.

    When It Comes to Hearing, the Left and Right Sides of the Brain Work Together, Mouse Research Shows

    Johns Hopkins-led research has revealed an extensive network of connections between the right and left sides of the brain when mice are exposed to different sounds. The researchers also found that some areas of the brain are specialized to recognize certain sounds, such as “calls” from the animals. Further, deaf mice had far fewer right and left brain connections, suggesting that the brain needs to “hear” and process sound at early ages to spur development of left-right brain connections.

    The findings, say the researchers, may eventually help scientists pinpoint the time period when such brain connections and specialization form, and offer potential insights into how to restore hearing loss.

    “The auditory system is a collection of parts, which need to be connected properly,” says Johns Hopkins neuroengineer Patrick Kanold, Ph.D., a professor of biomedical engineering. “Using a novel microscope that enabled us to see both brain hemispheres at the same time, we found that some of those connections are between the right and left brain hemispheres, allowing functional specialization. When the brain does not get the right inputs, for example in hearing loss, these brain connections are missing. This obviously is an issue if we hope to restore hearing at a later age.”

    In efforts to find new ways to restore hearing, Kanold’s team will continue its work to identify the specific time period when brain connections form in response to sound and how to restore abnormal connections. The team is also continuing research to understand how the brain adapts to and modulates sound processing to filter out distracting signals, such as its recent work indicating that the brain’s frontal cortex provides specific signals to the auditory system during behaviors that might help in this filtering process.

    New Mouse Models May Help Scientists Find Therapies for Brain Development Disorder

    For more than 25 years, Richard Huganir, Ph.D., Bloomberg Distinguished Professor of Neuroscience and Psychological and Brain Sciences and director of the Solomon H. Snyder Department of Neuroscience, at the Johns Hopkins University School of Medicine, has studied the protein SYNGAP1 that is now known to be linked to a group of neurodevelopmental disorders that are usually diagnosed during early childhood. Working with biotechnology companies to find new therapies for the conditions, his team at Johns Hopkins Medicine reports it has developed new mouse models that more accurately represent genetic mutations in people who have SYNGAP1-related disorders.

    The new collection of mouse models, now available to scientists developing treatments, includes several variations in the SYNGAP1 gene that were discovered to cause conditions marked by seizures, cognitive impairment, social deficits and sleep disturbances.

    The SYNGAP1 gene, found also in humans, makes proteins that regulate synapses, the space between two neurons where they trade chemical and molecular messages. When SYNGAP1 is mutated, as in the case of SYNGAP1-related disorders in people, neurons make less of the protein in the synapse, and learning and memory are impaired.

    In other mouse models, called “knock-out” models, the SYNGAP1 gene is removed entirely. Huganir says both the knock-out models and the new versions — “knock-in” models, which carry a variety of SYNGAP1 mutations linked to the disorders — will be helpful in finding therapies that boost SYNGAP1 protein production.

    ###


    Johns Hopkins Medicine


  • Researchers Capture First-Ever Afterglow of Huge Planetary Collision in Outer Space

    Researchers Capture First-Ever Afterglow of Huge Planetary Collision in Outer Space


    Newswise — The study, published today in Nature, reports the sighting of two ice giant exoplanets colliding around a sun-like star, creating a blaze of light and plumes of dust. Its findings show the bright heat afterglow and resulting dust cloud, which moved in front of the parent star, dimming it over time.

    The international team of astronomers was formed after an enthusiast viewed the light curve of the star and noticed something strange. It showed the system doubled in brightness at infrared wavelengths some three years before the star started to fade in visible light.

    Co-lead author Dr Matthew Kenworthy, from Leiden University, said: “To be honest, this observation was a complete surprise to me. When we originally shared the visible light curve of this star with other astronomers, we started watching it with a network of other telescopes.

    “An astronomer on social media pointed out that the star brightened up in the infrared over a thousand days before the optical fading. I knew then this was an unusual event.”

    The network of professional and amateur astronomers studied the star intensively, including monitoring changes in its brightness over the next two years. The star was named ASASSN-21qj after the network of telescopes that first detected its fading at visible wavelengths.

    The researchers concluded the most likely explanation is that two ice giant exoplanets collided, producing the infrared glow detected by NASA’s NEOWISE mission, which uses a space telescope to hunt for asteroids and comets.

    Co-lead author Dr Simon Lock, Research Fellow in Earth Sciences at the University of Bristol, said: “Our calculations and computer models indicate that the temperature and size of the glowing material, as well as the amount of time the glow has lasted, are consistent with the collision of two ice giant exoplanets.”

    The resultant expanding debris cloud from the impact then travelled in front of the star some three years later, causing the star to dim in brightness at visible wavelengths.

    Over the next few years, the cloud of dust is expected to start smearing out along the orbit of the collision remnant, and a tell-tale scattering of light from this cloud could be detected with both ground-based telescopes and NASA’s largest telescope in space, known as JWST.

    The astronomers plan on watching closely what happens next in this system.

    Co-author Dr Zoe Leinhardt, Associate Professor of Astrophysics at the University of Bristol, added: “It will be fascinating to observe further developments. Ultimately, the mass of material around the remnant may condense to form a retinue of moons that will orbit around this new planet.”


    University of Bristol


  • Copycat nutrient leaves pancreatic tumors starving

    Copycat nutrient leaves pancreatic tumors starving


    Newswise — LA JOLLA, CALIF. – 2023 – A study led by scientists at Sanford Burnham Prebys suggests an entirely new approach to treat pancreatic cancer. The research shows that feeding tumors a copycat of an important nutrient starves them of the fuel they need to survive and grow. The method, described in the journal Nature Cancer, has been used in early clinical trials for lung cancer. However, the unique properties of pancreatic cancer may make the strategy an even stronger candidate in the pancreas.

    “Pancreatic cancer relies on the nutrient glutamine much more than other cancers, so therapies that can interfere with tumors’ ability to access glutamine could be highly effective,” says senior author Cosimo Commisso, Ph.D., director and associate professor of the Cancer Metabolism and Microenvironment Program at Sanford Burnham Prebys.

    Pancreatic cancer is relatively rare, accounting for only 3% of all cancers. However, it has one of the lowest survival rates among cancers: most people only live three to six months after being diagnosed with this disease.

    “Over the course of the past decade, there has been a notable improvement in survival rates for pancreatic cancer, but they still hover around just 10%,” says Commisso. “There is a dire need for new treatments for these cancers.”

    One of the challenges of treating pancreatic cancer has to do with the physical properties of the tumors themselves.

    “Pancreatic tumors tend to be packed in dense connective tissue that keeps them encapsulated from the rest of the body and cuts off their supply of oxygen,” says Commisso. “As a consequence, these cancers develop unique metabolic properties compared to other tumors, and this is something we may be able to exploit with new treatments.”

    One of the metabolic quirks of pancreatic cancer is that it relies heavily on glutamine to produce energy for growth and survival. In the past, scientists have tried to block access to glutamine to slow the growth of pancreatic tumors, but this is easier said than done.

    The new method relies on a molecule called DON that has structural similarities to glutamine but can’t actually be used as a nutrient source. By studying mice, the research team found that DON significantly slowed pancreatic tumor growth and stopped the tumors from spreading.

    Although DON was able to stop pancreatic tumors from using glutamine, pancreatic cancer cells can use other nutrients to grow in glutamine’s absence. To combat this effect, the researchers combined DON with an existing cancer treatment that blocks the metabolism of asparagine, another important nutrient. The combined treatment had a synergistic effect, helping prevent the spread of pancreatic tumors to other distant organs, such as the liver and lungs.

    “With DON, the cancer cells can’t use glutamine, but they can start to depend on other nutrients as a backup, including asparagine,” says Commisso. “We thought that if we could stop them from using glutamine and asparagine, the tumors would run out of options.”

    Although this is the first time this combination of treatments has been proposed for any cancer, the approach of using DON on its own has already advanced to early clinical trials in lung cancer.

    “This is particularly exciting, because exploring it further for pancreatic cancer patients could be relatively simple, since the study designs exist for other solid tumors,” adds Commisso. “This could be a game changer for pancreatic cancer, and a lot of the preclinical work needed to rationalize it is already happening.”

    ###

    Additional authors on the study include Maria Victoria Recouvreux, Shea Grenier, Yijuan Zhang, Guillem Lambies, Cheska Marie Galapate, Swetha Maganti, Karen Duong-Polk, Deepika Bhullar, David A. Scott and Razia Naeem, Sanford Burnham Prebys; Edgar Esparza, Andrew M. Lowy and Hervé Tiriac, University of California San Diego.

    This study was supported by an American Cancer Society Discovery Boost Grant (DBG-22-172-01-TBE) and grants from the NIH (R01CA254806, R01CA207189).

    The study’s DOI is 10.1038/s43018-023-00649-1.

    About Sanford Burnham Prebys

    Sanford Burnham Prebys is an independent biomedical research institute dedicated to understanding human biology and disease and advancing scientific discoveries to profoundly impact human health. For more than 45 years, our research has produced breakthroughs in cancer, neuroscience, immunology and children’s diseases, and is anchored by our NCI-designated Cancer Center and advanced drug discovery capabilities. For more information, visit us at SBPdiscovery.org or on Facebook facebook.com/SBPdiscovery and on Twitter @SBPdiscovery.

Sanford Burnham Prebys

  • Studying Grand Canyon’s Past for Climate Insights

    Studying Grand Canyon’s Past for Climate Insights


    Newswise — The Grand Canyon’s valleys and millions of years of rock layers spanning Earth’s history have earned it a designation as one of the Seven Natural Wonders of the World. But, according to a new UNLV and University of New Mexico study, its marvels extend to vast cave systems that lie beneath the surface, which just might hold clues to better understand the future of climate change — by studying nature’s past.

    A research team led by UNLV paleoclimatologist Professor Matthew Lachniet, which included University of New Mexico Department of Earth & Planetary Sciences Distinguished Professor Yemane Asmerom, Research Scientist Victor Polyak, and other collaborators, studied an ancient stalagmite from the floor of an undisturbed Grand Canyon cave. By studying the mineral deposit’s geochemistry, they were able to analyze precipitation patterns during the rapidly warming period that followed the last Ice Age, improving understanding of how future climate change could affect summer monsoon rains in the U.S. Southwest and northwestern Mexico.

    Their findings, “Elevated Grand Canyon groundwater recharge during the warm Early Holocene,” published Oct. 2 in Nature Geoscience, revealed that increasing levels of water seeped into the cave between 8,500 and 14,000 years ago, during a period known as the early Holocene when temperatures rose throughout the region. Using a paleoclimate model, the researchers determined that this was likely caused by intensified and expanded summer rainfall stemming from atmospheric impacts on air circulation patterns that more quickly melted the winter snowpacks and sped up the evaporation process that fuels monsoon rains. 

    This is significant, authors say, because most of the water currently infiltrating through the bedrock and into caves and aquifers — and contributing to groundwater recharge — comes from winter snowmelt. During the early Holocene, however, when peak temperatures were only slightly warmer than today, both summer and winter moisture contributed to groundwater recharge in the region.

    The authors suggest that future warming, which could cause temperatures to rise above those of the early Holocene, may also lead to greater rates of summer rainfall on the high-elevation Colorado Plateau and an intensifying North American monsoon, the pattern of pronounced and increased thunderstorms and precipitation that typically occur between June and mid-September.

    “What was surprising about our results is that during this past warm period, both the summer monsoon and infiltration into the cave increased, which suggests that summer was important for Grand Canyon groundwater recharge, even though today it is not an important season for recharge,” said Lachniet, who personally retrieved the stalagmite from a cave in the Redwall Formation on the South Rim of eastern Grand Canyon in 2017. “While we still expect the region to dry in the future, more intense summer rainfall may actually infiltrate into the subsurface more than it does today.”

    Stalagmites are common cave formations that act as ancient rain gauges, recording past climate change. They grow as mineral-rich waters seep through the ground above and drip from the tips of stalactites on cave ceilings. Calcite minerals from tiny drops of water accumulate over thousands of years and, much like tree rings, accurately record an area’s rainfall history. Three natural forms of oxygen are found in water, and the abundance of one form decreases as rainfall increases. This information is locked into the stalagmites over time. Because summer and winter precipitation differ distinctly in oxygen isotope composition, it is possible to estimate the relative contribution from each season. Variations in the uranium-234 isotope and changes in the stalagmite’s growth thickness indicate changes in the amount of precipitation.
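The seasonal partitioning described above can be illustrated with a standard two-end-member mixing calculation, which recovers the summer fraction from oxygen-isotope values. All numbers below are hypothetical, chosen only to show the arithmetic, and are not from the study:

```python
def summer_fraction(d18o_sample, d18o_summer, d18o_winter):
    """Two-end-member mixing: the fraction of infiltrating water sourced
    from summer rain, given the oxygen-isotope (delta-18-O, per mil)
    values of the sample and of the two seasonal end members."""
    return (d18o_sample - d18o_winter) / (d18o_summer - d18o_winter)

# Hypothetical per-mil values, chosen only to demonstrate the formula:
f = summer_fraction(d18o_sample=-9.0, d18o_summer=-6.0, d18o_winter=-12.0)
print(f)  # 0.5 -> half the recharge came from summer rain in this made-up case
```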

    “We were able to validate the oxygen record with the growth data and the uranium isotope data, confirming that we do in fact see significant increases in summer moisture during this warm period, which we attribute to the monsoon,” said Asmerom. “Obviously, we know things very precisely in terms of timing, because we know how to date things. This is something that we are known for around the world using these methods,” Polyak added.

    The research team used stalagmite samples to reconstruct groundwater recharge rates — or the amount of water that penetrates the aquifers — in the Grand Canyon area during the early years of the Holocene period. High groundwater recharge rates likely occurred on other high-elevation plateaus in the region, too, they said, though it’s unclear how the activity applies to hotter, low-elevation deserts.   

    What is clear is that ongoing human-caused climate change is leading to hotter temperatures throughout southwestern North America, including the Grand Canyon region. Alongside population growth and agricultural pressures, this warming can reduce the infiltration of surface water into groundwater aquifers. Groundwater recharge rates also depend on the frequency and intensity of summer rains associated with monsoon season.

    Though summer infiltration isn’t a significant contributor to groundwater recharge in the region today, these latest findings suggest that could change in the future as the climate warms and monsoonal moisture increases. What’s unknown is how a projected decrease in winter precipitation and snowpack could impact overall groundwater reserves.

    In a previous study led by UNM’s Asmerom and published in the Proceedings of the National Academy of Sciences, they found that the North American monsoon is likely to intensify with increased warming. But there were other, mostly model-based studies that suggested otherwise. The new study is consistent with Asmerom and colleagues’ previous study. 

    “Unfortunately, effective moisture is the balance between precipitation and evaporation. Unlike the more temperate Grand Canyon climate, the dry southern part of the region is likely to become drier as a result of increased temperatures,” said Asmerom.

University of New Mexico

  • A 130g soft robot gripper lifts 100kg?

    A 130g soft robot gripper lifts 100kg?


    Newswise — Made from soft, flexible materials such as cloth, paper, and silicone, a soft robotic gripper is a device that acts like a robot’s hand, safely grasping and releasing objects. Soft grippers are more flexible and safer than conventional rigid grippers, and are being researched for household robots that handle fragile objects such as eggs, and for logistics robots that must carry objects of many kinds. However, their low load capacity makes it difficult to lift heavy objects, and their poor grasping stability means an object can be lost even under mild external impact.

    Dr. Kahye Song of the Intelligent Robotics Research Center at the Korea Institute of Science and Technology (KIST) and Professor Dae-Young Lee of the Department of Aerospace Engineering at the Korea Advanced Institute of Science and Technology (KAIST) have jointly developed a soft gripper with a woven structure that can grip objects weighing more than 100 kg using just 130 grams of material.

    To increase the load capacity of the soft gripper, the research team applied a new structure inspired by textiles, rather than taking the conventional route of developing new materials or reinforcing existing structures. The weaving technique they drew on tightly intertwines individual threads to create a strong fabric that can reliably support heavy loads, and it has been used for centuries in clothing, bags, and industrial textiles. The team used thin strips of PET plastic, designing the gripper so that the strips intertwine into a woven structure and can unwind again.

    The resulting woven gripper weighs 130 grams and can grip an object weighing 100 kilograms. Conventional grippers of the same weight can lift no more than about 20 kilograms, while a conventional gripper able to lift 100 kilograms typically weighs on the order of 100 kilograms itself, so the team achieved a dramatic increase in load capacity relative to the gripper’s own weight.
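For scale, the payload-to-weight ratios implied by those figures can be worked out directly (a back-of-envelope calculation based on the numbers in this article, not on the paper itself):

```python
# Payload-to-weight ratios implied by the figures quoted in the article.
gripper_mass_kg = 0.130          # woven gripper: 130 g
woven_payload_kg = 100.0         # lifts a 100 kg object
conventional_payload_kg = 20.0   # a same-mass conventional gripper: ~20 kg

woven_ratio = woven_payload_kg / gripper_mass_kg
conventional_ratio = conventional_payload_kg / gripper_mass_kg
print(round(woven_ratio))         # ~769 times its own weight
print(round(conventional_ratio))  # ~154 times for the conventional design
```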

    Also, the soft robot gripper developed by the research team uses plastic, which costs only a few thousand won per unit of material, and can be used as a universal gripper that can grip objects of various shapes and weights, making it highly competitive in price. In addition, since the soft robot gripper can be manufactured by simply fastening a plastic strip, the manufacturing process can be completed in less than 10 minutes, and it is easy to replace and maintain, so the process efficiency is excellent.

    In addition to PET, which is the main material used by the research team, the gripper can also be made of various materials such as rubber and compounds that possess elasticity, allowing the team to customize and utilize grippers suitable for industrial and logistics sites that require strong gripping performance or various environments that need to withstand extreme conditions.

    “The woven structure gripper developed by KIST and KAIST has the strengths of a soft robot but can grasp heavy objects at the level of a rigid gripper,” said Dr. Song. “It can be manufactured in a variety of sizes, from coins to cars, and can grip objects of various shapes and weights, from thin cards to flowers, so it is expected to be used in fields such as industry, logistics, and housework that require soft grippers.”

    ###

    KIST was established in 1966 as the first government-funded research institute in Korea. KIST now strives to solve national and social challenges and secure growth engines through leading and innovative research. For more information, please visit KIST’s website at https://eng.kist.re.kr/

    KAIST is the first and top science and technology university in Korea. KAIST has been the gateway to advanced science and technology, innovation, and entrepreneurship, and our graduates have been key players behind Korea’s innovations. KAIST will continue to pursue advances in science and technology as well as the economic development of Korea and beyond. (https://www.kaist.ac.kr/en)

    The research was supported by the Ministry of Science and ICT (Minister Lee Jong-ho) through the KIST Major Project and the Korea Research Foundation Basic Research Program, the Overseas Advanced Scientist Invitation Program, and the Basic Research Laboratory Support Program. The results of the study were published on August 2 in the international journal Nature Communications (IF:16.6, top 8.2% in JCR) and were selected as Editors’ Highlights, which introduces the best 50 papers in each field.

National Research Council of Science and Technology

  • Sustainable futures beyond mining

    Sustainable futures beyond mining


    Newswise — Mining brings huge social and environmental change to communities: landscapes, livelihoods and the social fabric evolve alongside the industry. But what happens when the mines close? What problems face communities that lose their main employer and the very core of their identity and social networks? A research fellow at the University of Göttingen provides recommendations for governments to successfully navigate mining communities through their transition toward non-mining economies. Based on past experiences with industrial transitions, she suggests that a three-step approach centred around stakeholder collaboration could be the most effective way forward. This approach combines early planning, local-based solutions, and targeted investments aimed at fostering economic and workforce transformation. This comment article was published in Nature Energy.

    Dr Kamila Svobodova, Marie Skłodowska-Curie Research Fellow at the University of Göttingen, argues that, in practice, governments struggle to truly engage mining communities in both legislation and action. Even the more successful, often deemed exemplary, transitions failed to follow the principles of open and just participation or invest enough time in the process. Early discussions about how the future will look following closure help to build trust and relationships with communities. A combination of bottom-up and top-down approaches engages people at all levels. This ensures that the local context is understood and targeted specifically. It also establishes networks for collaboration during the transition. Effective coordination of investments toward mining communities, including funding to implement measures to support workers, seed new industries, support innovations, and enhance essential services in urban centres, proved to be successful in the past.

    “To ensure energy security, it’s essential for governments to recognize the profound transformation that residents of mining communities experience when they shift away from mining,” Svobodova explains. “Neglecting these communities, their inherent strength of mining identity and unity, could lead to social and economic instability, potentially affecting the overall national energy infrastructure.”

    Moving toward closure and consequently away from mining is not an easy or short journey. “It is essential that governments recognize that the transition takes time, and persistence is essential for success,” says Svobodova. “They should openly communicate their strategies, ensuring communities and other stakeholders are well-informed and engaged. Building trust and providing guidance helps residents navigate the uncertainties associated with transitions. By embracing the three-step approach that centers around stakeholder engagement, governments can prioritize equitable and just outcomes when navigating mining transitions as part of their energy security strategies.”

University of Göttingen

  • Predicting condensate formation by cancer-associated fusion oncoproteins

    Predicting condensate formation by cancer-associated fusion oncoproteins


    Newswise — (Memphis, Tenn – September 28, 2023) Many cancers are caused by fusion oncoproteins, molecules that aberrantly form when a rearrangement of DNA results in parts of two different proteins being expressed as one. Several fusion oncoproteins spontaneously form condensates inside cells that promote cancer development. New research by St. Jude Children’s Research Hospital established a method to study this biophysical process in cells, then used that information as a launchpad to predict the behavior of other fusion oncoproteins. The findings, which offer insight into fusion oncoprotein-driven cancers, were published today in Nature Communications. 

    While genes define everything about us, they are not immutable. Genes are made of DNA, which is constantly being read and replicated. Errors can occur, and sometimes a piece of DNA can break and reattach at a different location. This can lead to two previously independent genes being glued together, resulting in a fusion protein. These unnatural proteins retain properties of both original components, which can have disastrous consequences for cells.  

    “Fusion proteins have been shown to be oncogenic drivers in upwards of 15% of human cancers,” said Richard Kriwacki, Ph.D., St. Jude Department of Structural Biology. These fusion oncoproteins can interfere with cellular regulatory pathways involved in cell growth and differentiation, leading to uncontrolled cell division and cancer.  

     

    Secrets in the droplets 

    “We hypothesized that gaining the ability to form condensates could be linked with the oncogenic properties of fusion oncoproteins,” Kriwacki explained. Biomolecular condensates can form through a process called liquid-liquid phase separation, in which biomolecules separate from the surrounding local environment and form their own compartment, akin to oil droplets in water. Condensates have been shown to be very powerful tools for a cell to regulate many different processes. However, when a fusion oncoprotein has the ability to form a condensate, it can wreak havoc in our cells. 

    Kriwacki, along with collaborators, set out to uncover how interwoven fusion oncoproteins are with the process of phase separation.   

     

    The code of fusion oncoprotein condensate behavior 

    The researchers initially examined 166 fusion oncoproteins in cells to observe if they phase separate. Then they categorized them, which was no small feat, according to co-first author Hazheen Shirnekhi, Ph.D., St. Jude Department of Structural Biology. 

    “The condensates were all different sizes, different shapes, and located in different areas of the cell,” Shirnekhi said. “It was difficult for any computer program to recognize the condensates in an unbiased manner, so we had to do this manually. It took a lot of time.”  

    This effort revealed that 58% of the fusion oncoproteins examined formed condensates, opening the door to additional insights.  

    “We found that a large number of those fusion oncoproteins that form condensates, especially in the nucleus, had functional features associated with regulation of gene expression,” Kriwacki said. “The cytoplasmic fusion oncoproteins forming condensates had functional features associated with regulation of cell signaling.” These observations suggest that the fusion oncoproteins elicit their oncogenic properties by altering gene regulation or cell signaling pathways through formation of condensates. 

     

    Machine learning reveals scope of phenomenon  

    In addition to those links to cellular functions, patterns began to emerge within the protein sequences of the fusion oncoproteins that form condensates. These patterns involve so-called physicochemical features, such as number of polar amino acids, charged groups or disordered regions.  

    “When we looked at the sequences of the condensate-forming fusion oncoproteins, we noticed features that are distinct from the condensate-negative fusion oncoproteins,” explained co-first author Swarnendu Tripathi, Ph.D., St. Jude Department of Structural Biology. “That motivated us to select 25 non-redundant features and use data science to predict whether a fusion oncoprotein forms condensates or not.” 

    This data science aspect allowed the researchers to use their 166-sample groundwork to train a machine-learning algorithm using those 25 features. The computational model was then applied to predict the condensate-forming behavior of ~3,000 additional fusion oncoproteins associated with different cancer types.   

    The model predicted that upwards of 67% of those additional fusion oncoproteins likely form condensates. The condensate-forming predictions were tested for a subset of fusion oncoproteins. “The model was shown to be 80% accurate in independent testing with fusions not used in the training,” Tripathi noted. 
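The workflow described above, numeric sequence features in, a binary condensate/no-condensate label out, can be sketched with a generic classifier. The two features, the synthetic data, and the logistic-regression model below are illustrative stand-ins; the paper's actual 25 features and algorithm are not reproduced here:

```python
import math
import random

# Toy stand-in for the paper's workflow: numeric "physicochemical"
# features in, a binary condensate / no-condensate label out.
random.seed(0)

def make_example(label):
    # Condensate-formers (label 1) are drawn around higher feature values
    # (imagine a disorder score and a polar-residue fraction; both invented).
    base = (0.7, 0.6) if label else (0.3, 0.3)
    return ([base[0] + random.uniform(-0.1, 0.1),
             base[1] + random.uniform(-0.1, 0.1)], label)

data = [make_example(i % 2) for i in range(100)]

# Logistic regression fitted by full-batch gradient descent.
w, b, lr = [0.0, 0.0], 0.0, 0.5
for _ in range(2000):
    gw, gb = [0.0, 0.0], 0.0
    for x, y in data:
        p = 1.0 / (1.0 + math.exp(-(w[0]*x[0] + w[1]*x[1] + b)))
        err = p - y                      # gradient of the logistic loss
        gw[0] += err * x[0]
        gw[1] += err * x[1]
        gb += err
    n = len(data)
    w[0] -= lr * gw[0] / n
    w[1] -= lr * gw[1] / n
    b -= lr * gb / n

def predict(x):
    return 1 if w[0]*x[0] + w[1]*x[1] + b > 0 else 0

accuracy = sum(predict(x) == y for x, y in data) / len(data)
print(accuracy)  # the toy clusters are cleanly separable
```

In practice a held-out test set, as the researchers used, is what validates such a model; training accuracy alone overstates performance.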

    This research establishes the foundational framework for determining the mechanisms underlying the oncogenic properties of fusion oncoproteins to enable their targeted inhibition through pharmaceutical agents or alternative approaches. “We’re looking to address the relationship between condensate formation, alteration of gene expression and oncogenesis,” Kriwacki explained. “We’re working with collaborators so that we can address this causality question in as rigorous a way as possible.” As Kriwacki highlighted, “By obtaining a grasp of the underlying mechanisms, we are setting the stage for potential innovative therapeutic approaches against fusion oncoprotein-driven cancers.” 

     

    Authors and funding 

    The study’s other co-first author was Scott Gorman, formerly of St. Jude. The study’s other authors include Bappaditya Chandra, David Baggett, Cheon-Gil Park, Ramiz Somjee, Benjamin Lang, Seyed Mohammad Hadi Hosseini, Brittany Pioso, Ilaria Iacobucci, Qingsong Gao, Michael Edmonson, Stephen Rice, Xin Zhou, John Bollinger, Madan Babu, Charles Mullighan and Jinghui Zhang, of St. Jude; Diana Mitrea and Michael White, formerly of St. Jude; Yongsheng Li and Stephen Yi of the University of Texas at Austin; Daniel McGrail of Cleveland Clinic; Daniel Jarosz of Stanford University School of Medicine; and Nidhi Sahni of the University of Texas MD Anderson Cancer Center and Baylor College of Medicine.  

    The study was supported by grants from the National Institutes of Health (R35 GM137836, R35 GM133658), Komen Foundation grants (CCR19609287, PDF17483544), the National Cancer Institute (P30 CA021765, R35 CA197695, R01 CA246125, U54 CA243124, R01 CA216391, T32 CA236748, K99 CA240689), the National Institute of General Medical Sciences (F32 GM143847), a St. Jude Children’s Research Hospital Chromatin Collaborative award, a Neoma Boadway Fellowship from St. Jude Children’s Research Hospital, the Cancer Prevention and Research Institute of Texas (RR160021, RP220292), a SummerPlus Program Fellowship from Rhodes College and ALSAC, the fundraising and awareness organization of St. Jude. 

     

     

    St. Jude Children’s Research Hospital 

    St. Jude Children’s Research Hospital is leading the way the world understands, treats and cures childhood cancer, sickle cell disease and other life-threatening disorders. It is the only National Cancer Institute-designated Comprehensive Cancer Center devoted solely to children. Treatments developed at St. Jude have helped push the overall childhood cancer survival rate from 20% to 80% since the hospital opened more than 60 years ago. St. Jude shares the breakthroughs it makes to help doctors and researchers at local hospitals and cancer centers around the world improve the quality of treatment and care for even more children. To learn more, visit stjude.org, read St. Jude Progress blog, and follow St. Jude on social media at @stjuderesearch.   

St. Jude Children’s Research Hospital

  • Organic lasers: a bright future

    Organic lasers: a bright future


    Newswise — Scientists at St Andrews are leading a significant breakthrough in a decades-long challenge to develop compact laser technology.

    Lasers are used across the world for a huge range of applications in communications, medicine, surveying, manufacturing and measurement.  They are used to transmit information across the internet, for medical treatments, and even in the face scanner on phones.  Most of these lasers are made from rigid, brittle, semiconductor crystals such as gallium arsenide.

    Organic semiconductors are a newer class of electronic material. Flexible, based on carbon and emitting visible light, they enable the simple fabrication of electronic devices. They are now widely used for the OLED (organic light-emitting diode) screens found in most mobile phones. 

    A limitation of organic semiconductor lasers is that they typically need another laser to power them. Researchers have been working to overcome this limitation for 30 years, so it is particularly significant that scientists at the University of St Andrews have recently developed an electrically driven organic semiconductor laser.  The team made this breakthrough, reported in the journal Nature, by first making an OLED with world-record light output and then carefully combining it with a polymer laser structure. This new type of laser emits a green laser beam consisting of short light pulses.  

    For now, this is mainly a scientific breakthrough, but with future development the laser could potentially be integrated with OLED displays and allow communication between them, or be used for spectroscopy for the detection of disease and environmental pollutants.

    Prof Ifor Samuel commented: “Making an electrically driven laser from organic materials has been a huge challenge for researchers across the world. Now, after many years of hard work, we are delighted to have made this new type of laser.”

    Prof Graham Turnbull added: “We expect this new laser to use less energy in its manufacture and, in the future, to generate laser light across the visible spectrum.”

University of St. Andrews

  • Sperm swimming is caused by the same patterns that are believed to dictate zebra stripes

    Sperm swimming is caused by the same patterns that are believed to dictate zebra stripes


    BYLINE: Laura Thomas

    Newswise — Patterns of chemical interactions are thought to create patterns in nature such as stripes and spots. This new study shows that the mathematical basis of these patterns also governs how the sperm tail moves.

    The findings, published today in Nature Communications, reveal that the movement of flagella (for example, sperm tails) and cilia follows the same template for pattern formation that was discovered by the famous mathematician Alan Turing. 

    Flagellar undulations make stripe patterns in space-time, generating waves that travel along the tail to drive the sperm and microbes forward.

    Alan Turing is best known for helping to break the Enigma code during WWII. However, he also developed a theory of pattern formation, the so-called reaction-diffusion theory, which predicted that chemical patterns can appear spontaneously from only two ingredients: chemicals spreading out (diffusing) and reacting together.

    Turing helped to pave the way for a whole new type of enquiry using reaction-diffusion mathematics to understand natural patterns. Today, these chemical patterns first envisioned by Turing are called Turing patterns. Although not yet proven by experimental evidence, these patterns are thought to govern many patterns across nature, such as leopard spots, the whorl of seeds in the head of a sunflower, and patterns of sand on the beach. Turing’s theory can be applied to various fields, from biology and robotics to astrophysics. 

    Mathematician Dr Hermes Gadêlha, head of the Polymaths Lab, and his PhD student James Cass conducted this research in the School of Engineering Mathematics and Technology at the University of Bristol. Gadêlha explained: “Live spontaneous motion of flagella and cilia is observed everywhere in nature, but little is known about how they are orchestrated.

    “They are critical in health and disease, reproduction, evolution, and the survivorship of almost every aquatic microorganism on Earth.”

    The team was inspired by recent observations in low viscosity fluids that the surrounding environment plays a minor role on the flagellum. They used mathematical modelling, simulations, and data fitting to show that flagellar undulations can arise spontaneously without the influence of their fluid environment.

    Mathematically this is equivalent to Turing’s reaction-diffusion system that was first proposed for chemical patterns.

    In the case of sperm swimming, chemical reactions of molecular motors power the flagellum, and bending movement diffuses along the tail in waves. The level of generality between visual patterns and patterns of movement is striking and unexpected, and shows that only two simple ingredients are needed to achieve highly complex motion.
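Those two ingredients can be demonstrated in a few lines. The toy simulation below is not the authors' flagellum model; it couples diffusion with a simple local reaction (the classic Fisher-KPP equation) and produces a front that travels along a one-dimensional domain, loosely analogous to the waves travelling along a tail:

```python
# Minimal 1-D reaction-diffusion sketch (Fisher-KPP equation), pure Python.
# Diffusion plus a local reaction together produce a travelling wave.
N, D, r, dx, dt = 200, 1.0, 1.0, 1.0, 0.2   # dt <= dx^2/(2D) for stability
u = [1.0] * 10 + [0.0] * (N - 10)           # "excited" patch at the left end

def front_position(u):
    """Index of the first cell still below half activation."""
    return next(i for i, v in enumerate(u) if v < 0.5)

start = front_position(u)
for _ in range(200):                         # explicit Euler steps to t = 40
    # Discrete Laplacian with no-flux boundaries (edge cells mirror inward).
    lap = [u[max(i - 1, 0)] - 2 * u[i] + u[min(i + 1, N - 1)]
           for i in range(N)]
    # Reaction term r*u*(1-u): local logistic growth of the activation.
    u = [u[i] + dt * (D * lap[i] + r * u[i] * (1 - u[i])) for i in range(N)]

# The front advances at speed ~2*sqrt(r*D): a travelling wave has formed.
print(start, front_position(u))
```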

    Dr Gadêlha added: “We show that this mathematical ‘recipe’ is followed by two very distant species – bull sperm and Chlamydomonas (a green algae that is used as a model organism across science), suggesting that nature replicates similar solutions.

    “Travelling waves emerge spontaneously even when the flagellum is uninfluenced by the surrounding fluid. This means that the flagellum has a fool-proof mechanism to enable swimming in low viscosity environments, which would otherwise be impossible for aquatic species.

    “It is the first time that model simulations compare well with experimental data.

    “We are grateful to the researchers that made their data freely available, without which we would not have been able to proceed with this mathematical study.”

    These findings may be used in future to better understand fertility issues associated with abnormal flagellar motion and other ciliopathies; diseases caused by ineffective cilia in human bodies.

    This could also be further explored for robotic applications, artificial muscles, and animated materials, as the team discovered a simple “mathematical recipe” for making patterns of movement.

    Dr Gadêlha is also a member of the SoftLab at Bristol Robotics Laboratory (BRL), where he uses pattern formation mathematics to innovate the next generation of soft-robots.

    “In 1952, Turing unlocked the reaction-diffusion basis of chemical patterns,” said Dr Gadêlha. “We show that the ‘atom’ of motion in the cellular world, the flagellum, uses Turing’s template to shape, instead, patterns of movement driving tail motion that pushes sperm forwards.

    “Although this is a step closer to mathematically decoding spontaneous animation in nature, our reaction-diffusion model is far too simple to fully capture all the complexity. Other models may exist, in the space of models, with equal or even better fits with experiments, that we simply do not yet know of, and thus substantially more research is still needed!”

    The study was funded by the Engineering and Physical Sciences Research Council (EPSRC), including a DTP studentship for James Cass’s PhD.

    The numerical work was carried out using the computational and data storage facilities of the Advanced Computing Research Centre, at the University of Bristol.

     

    Paper:

    ‘The reaction-diffusion basis of animated patterns in eukaryotic flagella’ by James Cass and Dr Hermes Bloomfield-Gadêlha in Nature Communications.


    University of Bristol


  • Study Seeks to Explain How Smell is Impacted in Individuals with Autism

    Study Seeks to Explain How Smell is Impacted in Individuals with Autism


    Newswise — A new study by a researcher at New York Institute of Technology College of Osteopathic Medicine (NYITCOM) could help explain how the sense of smell is impacted in individuals with autism.

    Individuals with autism have an “insistence on sameness,” and often avoid unfamiliar elements, including new smells and foods, which can impact their quality of life. While many studies have focused on the behavioral features of autism, additional research is needed to help explain its sensory aspects.

    Now, research published in The Journal of Neuroscience, authored by NYITCOM Assistant Professor of Biomedical Sciences Gonzalo Otazu, Ph.D., analyzes a mouse model of autism and reports differences in the neurological processes responsible for smell.

    The findings reinforce one of Otazu’s earlier studies, which was published in the journal Nature Communications in February. In this earlier study, his team trained two groups of mice—one group with a mutation in a gene linked to autism (CNTNAP2 knockout mice) and one neurotypical group (wild-type mice)—to recognize familiar scents. Both groups succeeded in doing so. Then, they were tasked with identifying these scents when new, unfamiliar odors were introduced in the background. While the neurotypical mice were able to “filter out” new background odors and identify target scents, the CNTNAP2 knockout mice struggled to do so.

    In this latest study, Otazu repeated a similar process in mice with a mutation in the SHANK3 gene, a leading autism candidate gene, and found that the same deficit appeared in this mouse model.

    Otazu and his co-authors write: 

    “People and mice with mutations in a single copy in the synaptic gene SHANK3 show features seen in autism spectrum disorders, including social interaction deficits…Here we used a recently developed task to show that these mice could identify odors in the presence of known background odors as well as wild-type mice. However, their performance fell below wild-type mice when challenged with novel background odors. This deficit was also previously reported in the CNTNAP2 mouse model of autism suggesting that odor detection in novel backgrounds is a general deficit across mouse models of autism.”


    New York Institute of Technology, New York Tech
