ReportWire

Tag: Nature (journal)

  • Training Birds for Climate Adaptation

    Newswise — One result of climate change is that spring is arriving earlier. However, migratory birds are not keeping up with these changes and arrive too late for the peak in food availability at breeding time. By helping the birds travel a little further north, researchers in Lund, Sweden, and the Netherlands have shown that these birds can give their chicks a better start in life.

    Global warming is causing problems for birds in Sweden and elsewhere. Warmer springs mean that caterpillars hatch, grow and pupate earlier compared with just a few decades ago. This has consequences for birds that cannot eat caterpillars that have entered the pupal stage. Therefore, when the food supply runs out at an ever earlier time in the spring, more and more chicks starve during the breeding season. This is a big problem for migratory birds that spend winters in Africa, as they do not know how early spring arrives in Sweden. Could the problem be solved if the migratory birds simply came home and started breeding earlier?

    “It seems that our non-migratory birds are doing this to a certain extent. But, of course, they are present and can feel how early spring will come. We thought that perhaps the migratory birds could fly further north until they find a place with suitable well-developed caterpillars,” says Jan-Åke Nilsson, biology researcher at Lund University in Sweden.

    To test this in practice, the researchers decided to help some Pied Flycatchers along the way. The biologists caught Pied Flycatchers that had arrived prior to breeding in the Netherlands. The birds were then driven during the night to Vombs Fure, an area of pine forest outside Lund in Skåne, where they were released. The peak of caterpillar availability in Skåne is about two weeks later than in the Netherlands – a distance of around 600 kilometres that a Pied Flycatcher could cover in just two nights.

    “The birds that were given a lift from the Netherlands to Skåne synchronised very well with the food peak! Because they started to breed about 10 days earlier than the ‘Swedish’ Pied Flycatchers, they had dramatically better breeding success than the Swedish birds, as well as better success than the Pied Flycatchers that remained in the Netherlands,” says Jan-Åke Nilsson.

    In addition, it was shown that the chicks of the Dutch Pied Flycatchers that had received migration assistance did not stop in the Netherlands when they returned after their first spring migration. Instead, they continued on to the area of pine forest outside Lund where they were born. Furthermore, they arrived earlier than the Swedish Pied Flycatchers and thereby had more well-fed chicks at Vombs Fure the year after the researchers gave the Pied Flycatchers a helping hand to find Skåne.

    “The number of small birds, particularly migratory birds, has decreased drastically throughout Europe. By flying a little further north, these birds, at least in principle, could synchronise with their food resources and there is hope that robust populations of small birds can be maintained, even though springs are arriving ever earlier,” concludes Jan-Åke Nilsson.

    [ad_2]

    Lund University

    Source link

  • AI boosts plant observation precision

    Newswise — Artificial intelligence (AI) can help plant scientists collect and analyze unprecedented volumes of data, which would not be possible using conventional methods. Researchers at the University of Zurich (UZH) have now used big data, machine learning and field observations in the university’s experimental garden to show how plants respond to changes in the environment.

    Climate change is making it increasingly important to know how plants can survive and thrive in a changing environment. Conventional experiments in the lab have shown that plants accumulate pigments in response to environmental factors. To date, such measurements have been made by taking samples, which requires part of the plant to be removed and thus damages it. “This labor-intensive method isn’t viable when thousands or millions of samples are needed. Moreover, taking repeated samples damages the plants, which in turn affects observations of how plants respond to environmental factors. There hasn’t been a suitable method for the long-term observation of individual plants within an ecosystem,” says Reiko Akiyama, first author of the study.

    With the support of UZH’s University Research Priority Program (URPP) “Evolution in Action”, a team of researchers has now developed a method that enables scientists to observe plants in nature with great precision. PlantServation is a method that incorporates robust image-acquisition hardware and deep learning-based software to analyze field images, and it works in any kind of weather.

    Millions of images support evolutionary hypothesis of robustness

    Using PlantServation, the researchers collected (top-view) images of Arabidopsis plants on the experimental plots of UZH’s Irchel Campus across three field seasons (lasting five months from fall to spring) and then analyzed the more than four million images using machine learning. The data recorded the species-specific accumulation of a plant pigment called “anthocyanin” as a response to seasonal and annual fluctuations in temperature, light intensity and precipitation.

    PlantServation also enabled the scientists to experimentally replicate what happens after the natural speciation of a hybrid polyploid species. These species develop from a duplication of the entire genome of their ancestors, a common type of species diversification in plants. Many wild and cultivated plants such as wheat and coffee originated in this way.

    In the current study, the anthocyanin content of the hybrid polyploid species A. kamchatica resembled that of its two ancestors: from fall to winter its anthocyanin content was similar to that of the ancestor species originating from a warm region, and from winter to spring it resembled the other species from a colder region. “The results of the study thus confirm that these hybrid polyploids combine the environmental responses of their progenitors, which supports a long-standing hypothesis about the evolution of polyploids,” says Rie Shimizu-Inatsugi, one of the study’s two corresponding authors.

    From Irchel Campus to far-flung regions

    PlantServation was developed in the experimental garden at UZH’s Irchel Campus. “It was crucial for us to be able to use the garden on Irchel Campus to develop PlantServation’s hardware and software, but its application goes even further: when combined with solar power, its hardware can be used even in remote sites. With its economical and robust hardware and open-source software, PlantServation paves the way for many more future biodiversity studies that use AI to investigate plants other than Arabidopsis – from crops such as wheat to wild plants that play a key role for the environment,” says Kentaro Shimizu, corresponding author and co-director of the URPP Evolution in Action.

    The project is an interdisciplinary collaboration with LPIXEL, a company that specializes in AI image analysis, and Japanese research institutes at Kyoto University and the University of Tokyo, among others, under the Global Strategy and Partnerships Funding Scheme of UZH Global Affairs and the International Leading Research grant program of the Japan Society for the Promotion of Science (JSPS). The project also received funding from the Swiss National Science Foundation (SNSF).

    Strategic Partnership with Kyoto University

    Kyoto University is one of UZH’s strategic partner universities. The strategic partnership ensures that high-potential research collaborations receive the necessary support to thrive, for instance through the UZH Global Strategy and Partnership Funding Scheme. In recent years, several joint research projects between Kyoto University and UZH have received funding, among them “PlantServation”.

    University of Zurich

  • New method to study microRNA activity in single cells

    Newswise — MicroRNAs are small molecules that regulate gene activity by binding to and destroying the RNAs produced by genes. More than 60% of all human genes are estimated to be regulated by microRNAs, so it is not surprising that these small molecules are involved in many biological processes, including diseases such as cancer. To discover the function of a microRNA, it is necessary to find out exactly which RNAs it targets. Although such methods exist, they require large amounts of material – typically on the order of millions of cells – to work.

    Now researchers at Stockholm University and SciLifeLab have developed a new method to detect microRNA targets at the level of single cells. Such cells are each around one-hundredth of a millimeter in diameter, weigh less than a billionth of a gram, and comprise the basic building blocks of living organisms. With their new, sensitive method, the researchers can follow microRNA targeting of thousands of RNAs during biological processes such as the cell cycle or differentiation into red blood cells. In these processes, the researchers find that microRNAs – surprisingly – perform quite different tasks in each cell. In the future, it will be possible to apply this method to study microRNA targeting in whole tissues, to find out exactly what is happening in each of the many cell types that comprise complex organs such as the brain.

    Marc Friedländer, associate professor at Stockholm University, says: “In our research team, we want to understand and ultimately make mathematical models of gene regulation at the level of the single cell. Our new method is a huge leap towards making this possible”.

    The work was spearheaded by Dr. Inna Biryukova, who took a leading role in developing the laboratory method, and by PhD student Vaishnovi Sekar, who performed the bulk of the advanced computational analyses. Vaishnovi Sekar highlights the challenges of the project: “In terms of complexity of the computational work, this is uncharted territory, and we lacked reference points and thresholds. We had to explore a myriad of approaches to devise a methodology that not only works but also yields biologically meaningful observations.”

    The study was supported by ERC and Vetenskapsrådet and has been published in the journal Nature Biotechnology.

    Stockholm University

  • Enhancing genomics and bioinformatics knowledge sharing

    Newswise — The African BioGenome Project, a large-scale international research project involving Konstanz bioinformatician Abdoallah Sharaf, successfully launched its “Open Institute”. The institute’s mission: accelerating knowledge exchange in biodiversity genomics and bioinformatics.

    The “blueprints” of all living organisms are encoded in the sequences of their DNAs’ base pairs. Knowledge about these genome sequences and the identification of functional subunits, such as genes, are of great importance for biodiversity conservation efforts and for the life sciences in general. For this reason, the African BioGenome Project (AfricaBP), founded in 2021, has set itself an ambitious goal: sequencing the genomes of 100,000 animal and plant species that occur only in Africa – and to do so within 10 years. What is more, the sequencing is to be conducted exclusively in Africa.

    A second goal of the AfricaBP, therefore, is to empower African scientists and institutions to obtain the required skill sets, capacity, and infrastructure to generate, analyze, and utilize genome sequences in labs across the continent. With the successful launch of AfricaBP’s Open Institute, the project has made an essential step towards reaching this goal, as Abdoallah Sharaf and his colleagues now describe in an article in Nature Biotechnology. Sharaf is a bioinformatician in the Department of Biology at the University of Konstanz (Germany), associate professor at Ain Shams University (Egypt), and Co-Chair of the AfricaBP pilot committee.

    Central goals of the Open Institute
    “The Open Institute aims to lower some of the barriers that often prevent the advancement of biodiversity genomics and bioinformatics knowledge exchange in Africa,” says Sharaf. To do so, the AfricaBP Open Institute focuses on five key priority areas:

    • Curriculum development
    • Technology development and infrastructure
    • Promoting grassroots knowledge exchange and equitable partnerships
    • Maximizing data ownership and sovereignty
    • Scientific enterprise and industry

    In 2022, the Open Institute started hosting widely attended workshops in cooperation with African institutions and organizations as well as global partners. So far, over 700 participants from 29 countries have been trained in cutting-edge technologies in the fields of biodiversity and genomics. Because many of these participants came from African countries that already have active genomics research, the Open Institute will broaden its outreach in the future to increase the participation of scientists from regions that currently have minimal genomic activity. In line with this, five more workshops on various aspects of genomics and bioinformatics are planned by the end of 2023 – two of them online, three in a hybrid format.

    Tom Kariuki, Chief Executive Officer of the Science for Africa Foundation (SFA), applauds the project: “As the SFA Foundation, we are laser-focused on improving the quantity, quality, and productivity of science in Africa, which requires a skilled scientific workforce through the development of globally competitive science leaders in Africa. The Open Institute serves our objective of training future generations of scientists who will generate data to inform policy and Africa’s development agenda.”

    This text is an adapted version of the original press release of the African BioGenome Project.

    University of Konstanz

  • Scientists uncovered mystery of important material for semiconductors at the surface

    Newswise — A team of scientists with the Department of Energy’s Oak Ridge National Laboratory has investigated the behavior of hafnium oxide, or hafnia, because of its potential for use in novel semiconductor applications.

    Materials such as hafnia exhibit ferroelectricity, which means that they are capable of extended data storage even when power is disconnected and that they might be used in the development of new, so-called nonvolatile memory technologies. Innovative nonvolatile memory applications will pave the way for the creation of bigger and faster computer systems by alleviating the heat generated from the continual transfer of data to short-term memory.

    The scientists explored whether the atmosphere plays a role in hafnia’s ability to change its internal electric charge arrangement when an external electric field is applied. The goal was to explain the range of unusual phenomena that have been observed in hafnia research. The team’s findings were recently published in Nature Materials.

    “We have conclusively proven that the ferroelectric behavior in these systems is coupled to the surface and is tunable by changing the surrounding atmosphere. Previously, the workings of these systems were speculation, a hypothesis based on a large number of observations both by our group and by multiple groups worldwide,” said ORNL’s Kyle Kelley, a researcher with the Center for Nanophase Materials Sciences. CNMS is a DOE Office of Science user facility.

    Kelley performed the experiments and envisioned the project in collaboration with Sergei Kalinin of the University of Tennessee, Knoxville.

    Materials commonly used for memory applications have a surface, or dead, layer that interferes with the material’s ability to store information. As materials are scaled down to only several nanometers thick, the effect of the dead layer becomes extreme enough to completely stop the functional properties. By changing the atmosphere, the scientists were able to tune the surface layer’s behavior, which, in hafnia, transitioned the material from the antiferroelectric to the ferroelectric state. 

    “Ultimately, these findings provide a pathway for predictive modeling and device engineering of hafnia, which is urgently needed, given the importance of this material in the semiconductor industry,” Kelley said.

    Predictive modeling enables scientists to use previous research to estimate the properties and behavior of an unknown system. The study that Kelley and Kalinin led focused on hafnia alloyed, or blended, with zirconia, a ceramic material. But future research could apply the findings to anticipate how hafnia may behave when alloyed with other elements.

    The research relied on atomic force microscopy both inside a glovebox and in ambient conditions, as well as ultrahigh-vacuum atomic force microscopy, methods available at the CNMS.

    “Leveraging the unique CNMS capabilities enabled us to do this type of work,” Kelley said. “We basically changed the environment all the way from ambient atmosphere to ultrahigh vacuum. In other words, we removed all gases in the atmosphere to negligible levels and measured these responses, which is extremely hard to do.”

    Team members from the Materials Characterization Facility at Carnegie Mellon University played a key role in the research by providing electron microscopy characterization, and collaborators from the University of Virginia led the materials development and optimization.

    ORNL’s Yongtao Liu, a researcher with CNMS, performed ambient piezoresponse force microscopy measurements.

    The model theory that underpinned this research project was the result of a long research partnership between Kalinin and Anna Morozovska at the Institute of Physics, National Academy of Sciences of Ukraine.

    “I have worked with my colleagues in Kiev on physics and chemistry of ferroelectrics for almost 20 years now,” Kalinin said. “They did a lot for this paper while almost on the front line of the war in that country. These people keep doing science in conditions that most of us cannot imagine.”

    The team hopes that what they have discovered will stimulate new research specific to exploring the role of controlled surface and interface electrochemistries — the relationship between electricity and chemical reactions — in a computing device’s performance.

    “Future studies can extend this knowledge to other systems to help us understand how the interface affects the device properties, which, hopefully, will be in a good way,” Kelley said. “Typically, the interface kills your ferroelectric properties when scaled to these thicknesses. In this case, it showed us a transition from one material state to another.”  

    Kalinin added: “Traditionally, we explored surfaces at the atomic level to understand phenomena such as chemical reactivity and catalysis, or the modification of the rate of a chemical reaction. Simultaneously, in traditional semiconductor technology, our goal was only to keep surfaces clean from contaminants. Our studies show that, in fact, these two areas — the surface and the electrochemistry — are connected. We can use surfaces of these materials to tune their bulk functional properties.”

    The title of the paper is “Ferroelectricity in hafnia controlled via surface electrochemical state.”

    This research was supported as part of the Center for 3D Ferroelectric Microelectronics, an Energy Frontier Research Center funded by DOE’s Office of Science, Basic Energy Sciences program, and was partially performed as a user proposal at the CNMS.

    UT-Battelle manages ORNL for DOE’s Office of Science, the single largest supporter of basic research in the physical sciences in the United States. The Office of Science is working to address some of the most pressing challenges of our time. For more information, please visit energy.gov/science.

    Oak Ridge National Laboratory

  • Pioneering research sheds surprising new light on evolution of plant kingdom

    Newswise — A new study has uncovered intriguing insights into the evolution of plant biology, effectively rewriting the history of how they evolved over the past billion years.

    The research, published today in Nature Plants, shows plants have gradually developed their range of anatomical designs throughout the passage of time, punctuated by episodic bursts of innovation to overcome and adapt to environmental challenges.

    Such findings overturn the long-held belief that, much like animals, the fundamental range of plant types evolved in a big burst of sudden change early in their evolutionary history.

    Co-lead author Philip Donoghue, Professor of Palaeobiology at the University of Bristol, said: “Although plants are extraordinarily diverse in their design and organisation, they share a common ancestor which originated at sea more than a billion years ago.

    “We wanted to test whether they really evolved with a big bang early on in their history or whether their evolution was a slower and more continual process. Surprisingly, the results revealed plant evolution was a bit of a mix, with long periods of gradual change interrupted by short bursts of large-scale innovation, overcoming the challenges of living on dry land.”

    To test this theory, the team of scientists analysed the similarities and differences of 248 groups of plants, ranging from single-celled pond scum and seaweed to land plants, including everything from mosses and ferns to pines, conifers and flowering plants. They also looked at 160 extinct groups known only from the fossil record, including species from the Devonian Rhynie Chert, which lived more than 400 million years ago.

    More than 130,000 observations were generated by breaking down plant designs into their components and recording which were present or absent in each of the main groups, living and fossil. Computerised statistical techniques then measured the overall similarities and differences between groups and how they have varied over time.
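    The procedure described above can be sketched with a toy example. This is purely illustrative: the traits, groups and the choice of the Jaccard measure below are assumptions for the sketch, not the authors' published code or data. Each group becomes a binary vector of trait presence/absence, and a pairwise dissimilarity quantifies how different two anatomical designs are.

```python
# Illustrative sketch only: measuring anatomical "disparity" from a
# presence/absence character matrix. Traits and groups are hypothetical.

def jaccard_distance(a, b):
    """Dissimilarity between two binary trait vectors (0 = identical designs)."""
    both = sum(1 for x, y in zip(a, b) if x == 1 and y == 1)
    either = sum(1 for x, y in zip(a, b) if x == 1 or y == 1)
    return 0.0 if either == 0 else 1 - both / either

# Hypothetical characters: [spores, seeds, roots, leaves, flowers]
moss = [1, 0, 0, 1, 0]
fern = [1, 0, 1, 1, 0]
rose = [0, 1, 1, 1, 1]

print(round(jaccard_distance(moss, fern), 3))  # similar designs -> 0.333
print(round(jaccard_distance(moss, rose), 3))  # divergent designs -> 0.8
```

    In the study itself, the characters were scored across hundreds of living and fossil groups and analysed with multivariate statistics; the snippet only conveys the underlying idea of turning anatomy into comparable numbers.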

    The scientists also tried to work out what led to these evolutionary innovations, like the introduction of spores, seeds, roots, leaves, pollen and flowers.   

    Co-lead author Dr James Clark, Research Associate in Biological Sciences at the University of Bristol, said: “We found changes in plant anatomical design occur in association with events in which the entire cellular genetic make-up was doubled. This has happened many times in plant evolutionary history, as a result of errors in the genome-copying process, creating duplicate copies of genes that are free to mutate and evolve new functions.”

    But the major pulses of plant anatomical evolution were found to be associated with the challenge of living and reproducing in increasingly dry environments, connected to the progressive emergence of plants from sea on to land.

    Co-lead author Dr Sandy Hetherington’s fascination with the evolution of land plants began when he was a budding geologist at the University of Bristol, and it now continues in his work at the University of Edinburgh.

    He said: “Overall the pattern of episodic pulses in the evolution of plant anatomical designs matches that seen in other multi-cellular kingdoms of complex life, like animals and fungi. This suggests it is a general pattern and blueprint for complex multicellular life from its inception.”

    Paper

    ‘Evolution of phenotypic disparity in the plant kingdom’ by James W. Clark et al in Nature Plants

    Notes to editors

    Professor Philip Donoghue, Dr James Clark and Dr Sandy Hetherington are available for interview and advance copies of the embargoed paper can be requested. Please contact Victoria Tagg, Media & PR Manager (Research) at the University of Bristol: [email protected]

    Images

    https://fluff.bris.ac.uk/fluff/u2/oc20541/_mcB3ejZQJjOMTnnuN0oqgELk/

    Caption: The moss, Polytrichum commune, which is one of the closest living relatives of the ancestral land plant

    Credit: Silvia Pressel, The Natural History Museum

    https://fluff.bris.ac.uk/fluff/u3/oc20541/y_2cDGSW92fm1yF6LDdirgELg/

    Caption: The evolution of plant anatomical variety. Each dot represents a living or fossil species and the connecting lines reflect their evolutionary relationships, branching from a universal ancestor (bottom left) to the most recently evolved group, the flowering plants (bottom right).

    Credit: James Clark and colleagues, University of Bristol, UK

    https://fluff.bris.ac.uk/fluff/u3/oc20541/QllcweUjKzmlC4sFggkdVwELV/

    Caption: A diverse community of land plants, ranging from mosses to flowering species, growing together in a boggy stream in the Cairngorms National Park, Scotland.

    Credit: Sandy Hetherington, The University of Edinburgh, UK

    University of Bristol

  • Largest genetic study of epilepsy to date provides new insights on why epilepsy develops and potential treatments

    Newswise — The largest genetic study of its kind, coordinated by the International League Against Epilepsy (ILAE), including scientists from the ILAE Consortium on Complex Epilepsies of the Genetics Commission, has discovered specific changes in our DNA that increase the risk of developing epilepsy.

    The research, published today in Nature Genetics, greatly advances our knowledge of why epilepsy develops and may inform the development of new treatments for the condition.

    Epilepsy, a common brain disorder of which there are many different types, is known to have a genetic component and to sometimes run in families. Here, researchers compared the DNA from diverse groups of almost 30,000 people with epilepsy to the DNA of 52,500 people without epilepsy. The differences highlighted areas of our DNA that might be involved in the development of epilepsy.

    The researchers identified 26 distinct areas in our DNA that appear to be involved in epilepsy, including 19 that are specific to a particular form of the condition called genetic generalized epilepsy. They were also able to point to 29 genes within these DNA regions that probably contribute to epilepsy.
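    As a hedged illustration of the kind of case-control comparison described above (the counts below are invented for the example and are not the study's data), the core idea can be sketched as an odds-ratio calculation: for a given DNA variant, compare how common it is among people with epilepsy versus people without.

```python
# Illustrative only: a single-variant case-control comparison.
# All counts are hypothetical, not taken from the published study.

def odds_ratio(case_carriers, case_total, ctrl_carriers, ctrl_total):
    """Odds of carrying a variant among cases divided by the odds among controls."""
    case_odds = case_carriers / (case_total - case_carriers)
    ctrl_odds = ctrl_carriers / (ctrl_total - ctrl_carriers)
    return case_odds / ctrl_odds

# A hypothetical variant carried by 12% of 30,000 cases and 10% of 52,500 controls:
print(round(odds_ratio(3600, 30000, 5250, 52500), 3))  # -> 1.227
```

    An odds ratio above 1 suggests the variant is associated with increased risk; a real genome-wide study repeats this kind of test across millions of variants, with statistical corrections for multiple testing and population structure.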

    The scientists found that the genetic picture was quite different when comparing distinct types of epilepsy, in particular when ‘focal’ and ‘generalized’ epilepsies were compared. The results also suggested that proteins that carry electrical impulses across the gaps between neurons in our brain account for some of the risk for generalized forms of epilepsy.

    “Gaining a better understanding of the genetic underpinnings of epilepsy is key to developing new therapeutic options and consequently a better quality of life for the over 50 million people globally living with epilepsy,” said Professor Gianpiero Cavalleri, Professor of Human Genetics at RCSI School of Pharmacy and Biomolecular Science, Deputy Director of the SFI FutureNeuro Research Centre and member of the ILAE Consortium on Complex Epilepsies.

    “The discoveries we report on here could only be achieved through international collaboration on a global scale. We are proud of how the global community of scientists working to better understand the genetics of the epilepsies has pooled resources and collaborated effectively, for the benefit of people impacted by the condition,” commented Professor Cavalleri.

    The researchers also showed that many of the current medications for epilepsy work by targeting the same epilepsy risk genes that were highlighted in this study. However, based on their data, the researchers were able to propose some potentially effective alternative drugs. These will need to be clinically tested for use in epilepsy as they are normally used for other conditions, but they are known to target some of the other epilepsy risk genes uncovered.

    “This identification of epilepsy associated genetic changes will allow us to improve diagnosis and classification of different epilepsy subtypes. This in turn, will guide clinicians in selecting the most beneficial treatment strategies, minimising seizures,” said Professor Colin Doherty, Consultant Neurologist, St James’s Hospital, Co-author and Clinical Investigator at the SFI FutureNeuro Centre, and member of the Irish Epilepsy League.

    Over 150 researchers, based across Europe, Australia, Asia, South America and North America, carried out the research. They worked together as part of the ILAE Consortium on Complex Epilepsies. The ILAE Consortium was formed by researchers in 2010, recognising that the complexity of genetic and environmental factors underlying epilepsy would require research across massive datasets, and therefore unprecedented collaboration on an international scale.

    “Undertaking such a comprehensive study is a remarkable achievement… The challenge now is to translate the findings of this research to improve the lives of people with epilepsy,” concluded Professor Cavalleri.

    “With this study, we have bookmarked parts of our genome that should be the major focus of future epilepsy research. It will form the basis for further work looking at the molecular pathways involved in seizure generation, neuronal dysfunction and altered brain activity,” said Professor Samuel Berkovic, University of Melbourne.

    “This is a major milestone for the ILAE Consortium on Complex Epilepsies, demonstrating what can be achieved when scientists openly collaborate and share data from across the world. The outputs are wide-reaching and applicable to epilepsy patients globally,” said Professor Helen Cross, President of the International League Against Epilepsy.

    Guided by the ILAE vision of a world in which no person’s life is limited by epilepsy, the driving principle behind the ILAE Consortium is that through collaboration and synergy, researchers will make more progress towards fully understanding the inherited components of epilepsy than can be realized by individual groups.


    More information: GWAS meta-analysis of over 29,000 people with epilepsy identifies 26 risk loci and subtype-specific genetic architecture, Nature Genetics (2023). DOI: 10.1038/s41588-023-01485-w

    International League Against Epilepsy

  • Greening cities cuts carbon

    Newswise — Dozens of European cities could reach net zero carbon emissions over the next 10 years by incorporating nature into their infrastructure, according to a new study.

    Published recently in the journal Nature Climate Change, the analysis shows how cities can orchestrate a wide range of green solutions – like parks, streetscaping and roof gardens – to not only capture carbon emissions but also help reduce them.

    The study was undertaken by researchers from Sweden, the U.S. and China. It recommends the most effective approaches for natural carbon sequestration in 54 cities in the EU. And it shows how blending these steps with other climate actions can enable cities to reach net-zero carbon and actually reduce emissions by an average of 17.4 percent.

    Zahra Kalantari, an associate professor in Water and Environmental Engineering at KTH Royal Institute of Technology, says the researchers focused on the indirect ways that so-called “nature-based solutions” can contribute to carbon neutrality.

    “Nature-based solutions not only offset a proportion of a city’s emissions, but can contribute to reduction in emissions and resource consumption too,” Kalantari says.

    The results are based on integrating data from previous studies on the effects of nature-based solutions. These include urban farming, permeable pavements which enable rainwater absorption into the ground, narrower roads with more greenery and trees, wildlife habitat preservation, and creating more agreeable environments for walking and bicycling.

For example, urban parks, greenspace and trees promote more walking, bicycling and other environmentally positive habits that replace automobile driving. Combined with other solutions like green infrastructure, these measures can further improve urban microclimates by moderating temperature extremes, and as a result reduce energy use in buildings.

The study also provides guidance on which measures should be prioritized and where to locate them for the best effect, she says. For example, in Berlin the study recommends prioritizing green buildings and urban green spaces, which could result in an emissions reduction rate of 6 percent for residences, 13 percent in industry and 14 percent in transportation.

    “There are many studies that examine the effects of individual nature-based solutions, but this merges all of them and analyzes the potential systemic effect,” she says. “That’s new.”

    The study was a collaboration by researchers from KTH Royal Institute of Technology in Stockholm, MIT, Stockholm University, University of Gävle, Linköping University, Royal Swedish Academy of Sciences and Shanghai Jiao Tong University.


    Kungliga Tekniska Hogskolan (KTH) [Royal Institute of Technology]


  • LJI scientists harness ‘helper’ T cells to treat tumors


    Newswise — LA JOLLA, CA—Scientists are on the hunt for a unique set of mutations, called “neoantigens,” that let the immune system distinguish tumor cells from normal cells. Their goal is to help the immune system react to neoantigens and target tumor cells for destruction.

    This area of research has led to life-saving antibody therapeutics, such as immune checkpoint inhibitors, which rely on antibodies to help immune cells kill tumors. Unfortunately, antibody-based cancer immunotherapies don’t work for all patients.

    At La Jolla Institute for Immunology (LJI), Professor Stephen Schoenberger, Ph.D., and his colleagues are looking beyond antibodies. Schoenberger’s lab leads pioneering research into how the immune system’s CD4+ “helper” T cells detect neoantigens.

    Now Schoenberger and his colleagues have published a pair of studies that show how we might harness CD4+ T cells while boosting the cancer-fighting power of CD8+ “killer” T cells. In fact, the researchers demonstrate a new kind of vaccine design that recruits both types of T cells to destroy large tumors.

    “Therapeutic cancer vaccines can work,” says Schoenberger, who serves as a member of the LJI Center for Cancer Immunotherapy. “But they should leverage the natural synergy of CD4+ and CD8+ T cells.”

    Researchers help CD4+ T cells detect tumors

    As Schoenberger points out, CD4+ and CD8+ T cells already work together when fighting viruses and bacteria. “Why not learn from the immune system’s natural way of keeping us protected and turn that against cancer?” he says.

In a paper published recently in Nature Immunology, Schoenberger worked closely with LJI Professor Bjoern Peters, Ph.D., to demonstrate the essential role of CD4+ T cells in recognizing tumor cells. Their strategy depends on an innovative way to predict which tumor neoantigens will spark a strong CD4+ T cell response.

As Schoenberger explains, tumor cells arise from normal cells in the body. This means the body has a harder time recognizing tumor cells as dangerous. Other threats, such as viruses, tend to carry around very un-human-looking peptide sequences. With prompting from CD4+ T cells, immune cells called dendritic cells can capture these peptide sequences and show them to CD8+ T cells, sending the immune system into red alert. “CD8+ T cells execute the tumor,” says Schoenberger, “but they require the cooperation of CD4+ T cells to do so efficiently.”

    But tumor cells share most of their peptide sequences with normal cells, and are therefore harder for the immune system to “see.” To get around this problem, Schoenberger and Peters have devised computational tools to identify the genetic mutations and specific peptides that serve as neoantigens to distinguish tumor cells from their neighbors.

The Nature Immunology study shows that CD4+ T cells that recognize a single target mutation can drive a diverse CD8+ T cell response that eradicates large established tumors. The researchers tested T cells recognizing this target mutation for “avidity,” which is how strongly their antigen receptors bind to the neoantigen. Their surprising results showed that neoantigen-specific CD4+ T cells can mediate their effect across a range of affinities.

    “This is brand new because no one has ever studied the neoantigen-specific CD4+ repertoire at the level of T cell receptors,” says Schoenberger.

The researchers also found that the most effective responses happened when the transferred CD4+ T cells were induced to develop into stem cell memory-like CD4+ T cells. This type of T cell is endowed with special properties: longevity and the ability to generate powerful effector cells. As Schoenberger’s research spans the lab to the clinic, these findings will be translated to clinical trials in the near future.

    New vaccine brings T cells together

    In a second study, published recently in the Journal of Clinical Investigation, Schoenberger and his colleagues showed how a new vaccine strategy can induce CD4+ T cells and CD8+ T cells to work together to destroy large, aggressive tumors in a mouse model.

For the study, Schoenberger collaborated with Joseph Dolina, Ph.D., a senior scientist at Pfizer Inc. and former member of the Schoenberger Lab (Pfizer has no financial disclosures related to this specific study).

    The team began with an aggressive squamous cell tumor that contained a low number of mutations, as many human cancers do. The researchers identified 270 mutations that make this tumor stand out from normal cells, and they performed in-depth studies on 39 of these mutations. They narrowed that group down to five mutations that were recognized by the natural anti-tumor T cell response—with some mutations targeted by CD4+ T cells and others by CD8+ T cells. Remarkably, only mutations targeted by both CD4+ and CD8+ T cells were capable of triggering protective or therapeutic responses against the tumor.

    “These neoantigens had to be physically linked to mediate therapy,” says Schoenberger. “We could make large tumors go away so long as the vaccine activated both CD4+ and CD8+ T cells via the same antigen-presenting cell.”

    Going forward, Schoenberger plans to work with his clinical colleagues at the UC San Diego Moores Cancer Center to study whether this type of linked vaccine is effective in human patients. He hopes a future clinical trial can give hope to patients with especially aggressive tumors.

    “The other message here is that we think we can greatly increase the number of patients who could benefit from checkpoint blockade immunotherapy if we combine it with a personalized cancer vaccine,” says Schoenberger.

    Additional authors of the Nature Immunology study, “Neoantigen-specific stem cell memory-like CD4+ T cells mediate CD8+ T cell-dependent immunotherapy of MHC class II-negative solid tumors,” include Spencer E. Brightman (first author), Angelica Becker, Rukman R. Thota, Martin S. Naradikian, Leila Chihab, Karla Soria Zavala, Ryan Q. Griswold, Joseph S. Dolina, Ezra E. W. Cohen and Aaron M. Miller.

    This study was supported by the National Institutes of Health (grant UO1 DE028227), the San Diego Center for Precision Immunotherapy, and the Sandor and Rebecca Shapery Family.

    Nature Immunology DOI: https://doi.org/10.1038/s41590-023-01543-9

    Additional authors of the Journal of Clinical Investigation study, “Linked CD4+/CD8+ T cell neoantigen vaccination overcomes immune checkpoint blockade resistance and enables tumor regression,” include Joey Lee, Spencer E. Brightman, Sara McArdle, Samantha M. Hall, Rukman R. Thota, Karla S. Zavala, Manasa Lanka, Ashmitaa Logandha Ramamoorthy Premlal, Jason A. Greenbaum, Ezra E.W. Cohen and Bjoern Peters.

    This study was supported by the National Institutes of Health (grants U01 DE028227, P30CA23100, S10 RR027366 and S10 OD016262), the San Diego Center for Precision Immunotherapy, and the Sandor and Rebecca Shapery Family.


    La Jolla Institute for Immunology


  • Scientists Develop Efficient Spray Technique for Bioactive Materials


    BYLINE: Kitta MacPherson

    Newswise — Rutgers scientists have devised a highly accurate method for creating coatings of biologically active materials for a variety of medical products. Such a technique could pave the way for a new era of transdermal medication, including shot-free vaccinations, the researchers said.

    Writing in Nature Communications, researchers described a new approach to electrospray deposition, an industrial spray-coating process. Essentially, Rutgers scientists developed a way to better control the target region within a spray zone as well as the electrical properties of microscopic particles that are being deposited. The greater command of those two properties means that more of the spray is likely to hit its microscopic target.

    In electrospray deposition, manufacturers apply a high voltage to a flowing liquid, such as a biopharmaceutical, converting it into fine particles. Each of those droplets evaporates as it travels to a target area, depositing a solid precipitate from the original solution.

    “While many people think of electrospray deposition as an efficient method, applying it normally does not work for targets that are smaller than the spray, such as the microneedle arrays in transdermal patches,” said Jonathan Singer, an associate professor in the Department of Mechanical and Aerospace Engineering in the Rutgers School of Engineering and an author on the study. “Present methods only achieve about 40 percent efficiency. However, through advanced engineering techniques we’ve developed, we can achieve efficiencies statistically indistinguishable from 100 percent.”

    Coatings are increasingly critical for a variety of medical applications. They are used on medical devices implanted into the body, such as stents, defibrillators and pacemakers. And they are beginning to be used more frequently in new products employing biologicals, such as transdermal patches.

    Advanced biological or “bioactive” materials – such as drugs and vaccines – can be costly to produce, especially if any of the material is wasted, which can greatly limit whether a patient can receive a given treatment.

    “We were looking to evaluate if electrospray deposition, which is a well-established method for analytical chemistry, could be made into an efficient approach to create biomedically active coatings,” Singer said.

    Higher efficiencies could be the key to making electrospray deposition more appealing for the manufacture of medical devices using bioactive materials, researchers said.

“Being able to deposit with 100 percent efficiency means none of the material would be wasted, allowing devices or vaccines to be coated in this way,” said Sarah Park, a doctoral student in the Department of Materials Science and Engineering who is first author on the paper. “We anticipate that future work will expand the range of compatible materials and the material delivery rate of this high-efficiency approach.”

    In addition, unlike other coating techniques used in manufacturing, such as dip coating and inkjet printing, the new electrospray deposition technique is characterized as “far field,” meaning that it doesn’t need highly accurate positioning of the spray source, the researchers said. As a result, the equipment necessary to employ the technique for mass manufacturing would be more affordable and easier to design.

    Other Rutgers scientists on the study included professors Jerry Shan and Hao Lin, former doctoral students Lin Lei (now at Chongqing Jiaotong University) and Emran Lallow (now at GeneOne Life Science, Inc.), and former undergraduate student Darrel D’Souza, all of the Department of Mechanical and Aerospace Engineering; and professors David Shreiber and Jeffrey Zahn, doctoral student Maria Atzampou, and former doctoral student Emily DiMartini, all of the Department of Biomedical Engineering. This work was supported by GeneOne Life Science, Inc.


    Rutgers University-New Brunswick


  • New tool aligns data from tissue slices virtually


    Newswise — SAN FRANCISCO, CA—Imagine a few roughly cut slices of bread on a plate. With just those slices, could you picture, in fine detail, the loaf they came from?

    Now, imagine several thin slices of tissue from, say, a small tumor. You’ve tested which of several genes are active at every point across each slice’s length and width. With that two-dimensional data from just a few slices, could you predict which of the genes are active throughout the entire three-dimensional structure of the tumor? Not easy, right?

    Discerning the 3D makeup of a tumor—or other tissue—using data from just a few slices is a serious computational challenge. But a new method developed at Gladstone Institutes enables researchers to do just that. This approach, published in the journal Nature Methods, could allow for much deeper understanding of biological tissue samples.

    “Without that third dimension, you can miss a lot of what’s happening in tissue,” says Gladstone Senior Investigator Barbara Engelhardt, PhD, senior author of the study. “Putting together slices in 3D space should help us begin to answer questions for which 2D data falls short. For instance, what are the precise boundaries of a tumor? Where have immune cells infiltrated the tumor? Where in the tumor would be best to inject a treatment?”

    The new method, named Gaussian Process Spatial Alignment (GPSA), is not just for tumors. It can be applied to nearly any kind of tissue and any type of data obtained from tissue slices, such as the structure of cells or which genes or proteins are switched on within them—with broad implications for research and medicine.

    Filling in the Blanks

    One of the most widely used ways to understand biological tissue—whether from a patient with an illness or an animal in a lab—is to surgically remove some of the affected tissue and analyze it. In labs around the world, technicians may slice the removed tissue into thin pieces to view under a microscope or to test for the presence of specific molecules that could aid diagnosis, guide treatment, or hint at how well a drug is working.

    However, the time, budget, and computational power needed to analyze each slice means that researchers and doctors are often limited to just a few slices from different parts of the tissue. What’s more, tissue slices become physically warped when they are cut, processed, and analyzed in a lab, making it difficult to discern exactly how the slices line up and fit together within the overall 3D structure of the original tissue.

    “The first step in going from 2D slice data to a full, 3D picture of the tissue is to computationally reverse warping so that we can realign the slices in virtual space,” says Engelhardt, who is also a professor in the Department of Biomedical Data Science at Stanford University.

    To address this challenge, the GPSA method uses what Engelhardt and her team refer to as a two-layer Gaussian process. This statistical approach harnesses data from the 2D tissue slices and, in the first layer, fits the warped 2D slice onto a 3D model of the tissue. In the second layer, GPSA attributes to each point in the 3D model some data collected from the slice, such as what genes are turned on at that point. In this way, GPSA reverses warping virtually and enables a highly precise alignment of the slices.

    During this process, the GPSA model fills in the spaces between slices with predictions of gene or protein expression for every point throughout the tissue, ultimately generating a 3D “atlas” of the tissue.

    “Say you have four slices from different locations in a person’s breast cancer tumor, and for every point on each slice you know which of 20,000 genes are turned on or off,” Engelhardt says. “GPSA creates a fully query-able 3D atlas where, for any single ‘x, y, z’ coordinate, for any of the 20,000 genes, we can dive in and ask: What genes are on and off at this position in the tumor? And how certain are we in this estimate?”
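The interpolation step Engelhardt describes can be illustrated in miniature. The sketch below is a toy Gaussian-process regression over synthetic slice data, not the authors' GPSA implementation (which adds a first alignment layer and scales to real transcriptomic data): expression values measured on three 2D slices are used to predict expression, with an uncertainty estimate, at an unmeasured 3D coordinate between the slices.

```python
# Toy sketch of GP-based interpolation between tissue slices.
# Not the GPSA code: just an RBF-kernel Gaussian process over (x, y, z)
# fit to synthetic "expression" measurements on slices at z = 0, 1, 2.
import numpy as np

def rbf_kernel(A, B, length_scale=1.0):
    """Squared-exponential kernel between coordinate sets A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / length_scale**2)

def gp_predict(X_train, y_train, X_query, length_scale=1.0, noise=1e-2):
    """Posterior mean and variance of a zero-mean GP at X_query."""
    K = rbf_kernel(X_train, X_train, length_scale)
    K += noise * np.eye(len(X_train))          # observation noise / jitter
    Ks = rbf_kernel(X_query, X_train, length_scale)
    alpha = np.linalg.solve(K, y_train)
    mean = Ks @ alpha
    # k(x*, x*) = 1 for the RBF kernel, minus the explained variance
    var = 1.0 - (Ks * np.linalg.solve(K, Ks.T).T).sum(-1)
    return mean, var

# Synthetic measurements: the same 30 (x, y) spots on three slices,
# with a smooth expression signal that increases with depth z.
rng = np.random.default_rng(0)
xy = rng.uniform(0, 2, size=(30, 2))
slices = [np.column_stack([xy, np.full(len(xy), z)]) for z in (0.0, 1.0, 2.0)]
X_train = np.vstack(slices)
y_train = X_train[:, 2] + 0.1 * np.sin(X_train[:, 0])

# Query an unmeasured coordinate between slices (z = 0.5),
# as in the "x, y, z" atlas query described above.
mean, var = gp_predict(X_train, y_train, np.array([[1.0, 1.0, 0.5]]))
print(f"predicted expression {mean[0]:.2f}, variance {var[0]:.3f}")
```

Because the query point lies between two densely sampled slices, the posterior variance is small; a query far outside the tissue would return a variance near 1, the prior uncertainty, which is the "how certain are we" readout Engelhardt mentions.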

    A Highly Flexible Framework

    With GPSA, researchers can construct tissue atlases with data obtained from slices of inconsistent sizes, using different technologies, and at different scales and levels of resolution. While prior methods require the 3D scaffolds or “coordinate frameworks” to be pre-specified, GPSA estimates this 3D framework from the 2D slices alone when a coordinate framework for the tissue does not yet exist. The new method can also combine multiple types of tissue-slice data—say, both information about which genes are switched on and information about cellular structure—into a single atlas.

    In addition, when applied to slices taken from the same tissue at different points in time, GPSA can generate atlases that predict how every location within the tissue changes over time. In this way, the technique could help deepen understanding of aging, how illnesses progress, or how different tissues develop in a growing organism.

    “Flexibility is one of the main strengths of our new tool,” Engelhardt says.

    She and her team are now conducting analyses to further demonstrate that flexibility. For instance, they have developed a method that could be used by labs on a budget to determine the minimum number of tissue slices needed—and the precise locations where those slices should be cut—for GPSA to construct a tissue atlas with the desired information.

    “The goal is to maximize the insights we can gain from tissue slices, in order to allow researchers and clinicians to deeply query 3D tissues that are well-studied or tumors that are unique to a patient, and ultimately improve healthcare,” Engelhardt says.


    Gladstone Institutes


  • Crucial Role of Society in Advancing Green Energy Transition


    Newswise — As wind energy emerges as a linchpin in the global push towards a cleaner future, resistance to deploying renewable energy technologies has risen. This underscores the need for a collective socio-technical approach to designing and implementing renewable energy systems.

    A recent review paper in Nature Energy promotes an interdisciplinary research approach that bridges technical ‘grand challenges’ with societal dynamics, making renewable energy truly sustainable—technically and socially. Julia Kirch Kirkegaard, leading the study, emphasizes the risks societies face if they fail to consider local communities’ values and concerns:

    “Today, design decisions are often made without much debate. And when the public then raises concerns, the response is often not taken seriously, or it’s too late. Societies, therefore, risk losing public backing to the essential energy transition,” says Julia Kirch Kirkegaard, Associate Professor at DTU Wind and Energy Systems and lead author of the recent review article published in Nature Energy on socio-technical grand challenges in wind energy.

    Silo-mentality gets in the way

Addressing the grand challenge of climate change is often done from the perspective of individual technical disciplines. However, this risks ignoring how technologies – and their design, development, and deployment – are always social. They are set into specific places and contexts and create certain social responses.

With local opposition to renewables rising, the paper states there is an urgent need for interdisciplinary perspectives to better address the socio-technical nature of the energy transition. In other words, to meet global decarbonization goals, the technical sciences need to collaborate more with the social sciences and humanities to engage with – and create value for – local communities and broader society.

The need for increased public participation concerns not only the planning and development phases but also the design and end-of-life phases. In the design phase, in particular, important decisions are made about whose interests are considered – and whose aren’t. Recent research shows that these decisions even extend back to the algorithms found in digital design tools.

Case in point: wind energy. There is little doubt that wind power will play a massive role in the future energy system to meet worldwide decarbonization goals. The level of effort that made wind an initial success has brought it to roughly a 9% share of electricity usage. That will not be sufficient, however, to make the transformative changes required to reach the expected one-third to one-half of total electricity supply, according to Julia Kirch Kirkegaard.

    “Denmark, for instance, is normally seen as a pioneer in wind energy, but only a handful of wind turbines were installed onshore in 2022. With an ambition to produce four times as much solar and wind energy on land and five times as much offshore by 2030, we need to find radically new approaches so that we do not see the controversies simply multiplied,” she says.

    “While wind turbines are getting larger, and less land is becoming available, local, societal opposition to deployments of new wind energy infrastructure has been growing. We need to understand better and acknowledge why that is so—otherwise, there is a real risk that societies’ ability to meet climate ambitions is jeopardized.”

    A new approach to socio-technical grand challenges

Better recognition is needed of how the technical and natural sciences, on the one hand, and the state of the art in the social sciences, on the other, address the grand challenges facing wind power – since, according to Julia Kirch Kirkegaard, the two often do not even agree on which challenges are most significant.

    The authors warn that the socio-technical research gaps may become grand challenges in their own right if the wind energy sector cannot confront them in due time. Julia Kirch Kirkegaard explains that while it will be a challenge for research, industry and society as a whole to bridge these gaps, the timing for engaging the participants in the deployment of wind energy is obvious:

    “Major technological progress is facing growing resistance from the public. Since we’ll likely see similar conflicts in the future – as we address other aspects of the energy transition and climate mitigation technologies such as Power-to-X, energy islands and more – the time to explore how to bridge these manifold perspectives is now.”

    FACT BOX: Call to action:

    The Nature Energy paper Tackling grand challenges in wind energy through a socio-technical perspective promotes a lens founded in STS (Science & Technology Studies) to push the technical sciences and the state-of-the-art in social sciences and humanities on the issue (i.e., the social acceptance literature) forward and towards more interdisciplinary research:

    • Technical sciences need to move beyond their view of local opposition as a barrier to be tackled through technical or economic means, and instead better appreciate their role in society and how their design and deployment decisions shape societal dynamics. It might even become possible to see public opposition not as something to be done away with, but as a potential source of learning and value creation.
    • The state-of-the-art in the social sciences (the social acceptance literature) has tended to focus on the planning and development phases, largely overlooking the technologies themselves, their design, and their scientific rationales. It therefore lacks an appreciation of how decisions about whose concerns should count (or not) are already made in the design phase. As a result, solutions devised to tackle local opposition in the planning and development phases sometimes come too late and are in vain.

    The work on the Nature Energy paper is a collaborative effort between European and American scholars – at DTU Wind and Energy Systems (Technical University of Denmark), National Renewable Energy Laboratory (NREL), and Wageningen University & Research (WUR).

    The paper is part of ten papers on the grand challenges in wind energy science, published in Science and Wind Energy Science, encompassing topics like atmosphere, environmental concerns, digitalization, etc.

    The work on grand challenges in wind energy science is facilitated by the International Energy Agency (IEA) Wind Programme, which has recently determined that for wind power to fulfil its expected role as a major global supplier of carbon-free energy, critical challenges around the design, development, and deployment of wind energy must be addressed.


    Technical University of Denmark (DTU)


  • China’s oldest water pipes were a communal effort


    Newswise — A system of ancient ceramic water pipes, the oldest ever unearthed in China, shows that neolithic people were capable of complex engineering feats without the need for a centralised state authority, finds a new study by UCL researchers.

In a study published in Nature Water, the archaeological team describe a network of ceramic water pipes and drainage ditches at the Chinese walled site of Pingliangtai, dating back 4,000 years to a time known as the Longshan period. The network shows cooperation amongst the community to build and maintain the drainage system, but no evidence of a centralised power or authority.

    Dr Yijie Zhuang (UCL Institute of Archaeology), senior and corresponding author on the paper, said: “The discovery of this ceramic water pipe network is remarkable because the people of Pingliangtai were able to build and maintain this advanced water management system with stone age tools and without the organisation of a central power structure. This system would have required a significant level of community-wide planning and coordination, and it was all done communally.”

The ceramic water pipes make up the oldest complete drainage system ever discovered in China. Made by interconnecting individual segments, the water pipes run along roads and walls to divert rainwater, and show an advanced level of planning at the neolithic site.

    What’s surprising to researchers is that the settlement of Pingliangtai shows little evidence of social hierarchy. Its houses were uniformly small and show no signs of social stratification or significant inequality amongst the population. Excavations at the town’s cemetery likewise found no evidence of a social hierarchy in burials, a marked difference from excavations at other nearby towns of the same era.

    But, despite the apparent lack of a centralised authority, the town’s population came together and undertook the careful coordination needed to produce the ceramic pipes, plan their layout, install and maintain them, a project which likely took a great deal of effort from much of the community.

The level of complexity associated with these pipes refutes an earlier understanding in archaeology that only a centralised state power with governing elites could muster the organisation and resources to build a complex water management system. While other ancient societies with advanced water systems tended to have stronger, more centralised governance, or even despotism, Pingliangtai demonstrates that this was not always needed: more egalitarian, communal societies were capable of such engineering feats as well.

    Co-author Dr Hai Zhang of Peking University said: “Pingliangtai is an extraordinary site. The network of water pipes shows an advanced understanding of engineering and hydrology that was previously only thought possible in more hierarchical societies.”

Pingliangtai is located in what is now the Huaiyang District of Zhoukou City in central China. During neolithic times, the town was home to about 500 people, with protective earthen walls and a surrounding moat. Situated on the Upper Huai River Plain, part of the vast Huanghuaihai Plain, the area 4,000 years ago experienced large seasonal climate shifts, with summer monsoons commonly dumping half a metre of rain on the region each month.

    Managing these deluges was important to prevent floodwaters from overwhelming the region’s communities. To help mitigate the excessive rainwater during the rainy seasons, the people of Pingliangtai built and operated a two-tier drainage system that was unlike any other seen at the time. They built simple but coordinated lines of drainage ditches that ran parallel to their rows of houses in order to divert water from the residential area to a series of ceramic water pipes that carried the water into the surrounding moat, and away from the village.

    These ceramic water pipes represented an advanced level of technology for the time. While there was some variety in decoration and styles, each pipe segment was about 20 to 30 centimetres in diameter and about 30 to 40 centimetres long. Numerous segments were slotted into each other to transport water over long distances.

Researchers cannot say specifically how the people of Pingliangtai organised and divided the labour amongst themselves to build and maintain this type of infrastructure. Similar communal coordination would also have been necessary to build the earthen walls and moat surrounding the village.

The Pingliangtai drainage system is unlike water systems elsewhere in the world at the time. Its purpose of draining monsoon rain and flood water differs from that of other neolithic systems, many of which were used for sewage drainage or other purposes.

Funding was provided by the National Natural Science Foundation of China and the Newton Advanced Fellowship of the British Academy.


    University College London


  • A new look inside Ebola’s “viral factories”


    The research team, which included experts from Scripps Research and UC San Diego School of Medicine, found that Ebola virus’s replication machinery forms fascinating microscopic structures that become viral factories. By understanding the architecture and function of these microscopic manufacturing hubs, researchers may be closer to developing new therapies that interrupt the Ebola virus life cycle and prevent severe disease.

    “We are imaging these fluid and dynamic assembly centers for the first time. Understanding how they work and what they require gives us the information needed to defeat them,” says LJI President and CEO Erica Ollmann Saphire, Ph.D., senior author of the new study.

    What is a viral factory?

    Scientists first spotted what would turn out to be “virus factories” in virus-infected animal cells back in the 1960s, but they didn’t know what they were seeing. Within a sea of normal cellular proteins, these areas looked like fuzzy splotches.

    “People had already seen that Ebola-infected cells had these ‘inclusions,’” says LJI Postdoctoral Researcher Jingru Fang, Ph.D., first author of the new study. For a long time, scientists thought of these “inclusions” as helpful visual indicators of infection, without understanding their true purpose. “But in fact, these ‘inclusion bodies’ actively gather an enormous quantity of viral proteins and viral RNAs.”

    Many viral pathogens, including rabies virus and RSV (respiratory syncytial virus) form inclusions in host cells, Fang explains. “Recent studies suggest that these cellular inclusions are the site where viruses make their RNA genomes. They are ‘viral factories’ with actual functional purpose: to offer a secured space for viral RNA synthesis,” says Fang. “The process of viral RNA synthesis involves flux of viral building blocks. This means molecules gathered inside viral factories should be able to move freely rather than being static.”

    For the new study, Saphire, Fang and their colleagues wondered: Can we observe the movement of viral building blocks directly in living cells?

    Fang began by tagging a viral protein called VP35 with a fluorescent marker that makes the protein glow in the dark. VP35 is a critical component of the viral factory and is important for viral RNA synthesis (and the making of new copies of Ebola virus). Working with imaging experts in the LJI Microscopy and Histology Core, Fang followed the glowing proteins in live cells, which express a simplified and non-infectious version of Ebola viral factories.

    Under the microscope, Fang and colleagues could indeed see and even measure how molecules move inside the viral factories formed in host cells. This finding added evidence that viral proteins are clumping together like droplets so they can churn out the proteins needed to help the virus replicate. Those mysterious inclusions really are viral factories. The researchers dubbed these “droplet-like” viral factories.
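    A common way to quantify whether tagged molecules are mobile or static in live-cell imaging is the mean squared displacement (MSD), which grows with time lag for a freely diffusing particle and stays flat for a stationary one. The sketch below is a generic illustration of that idea using simulated tracks, not the study's actual analysis or data:

```python
import random

def msd(track, lag):
    """Mean squared displacement of a 1-D track at a given time lag."""
    disps = [(track[i + lag] - track[i]) ** 2 for i in range(len(track) - lag)]
    return sum(disps) / len(disps)

random.seed(1)

# Mobile particle: a 1-D random walk, standing in for a protein
# diffusing freely inside a droplet-like factory.
mobile = [0.0]
for _ in range(2000):
    mobile.append(mobile[-1] + random.gauss(0, 1))

# Static particle: stays put apart from tiny localization noise.
static = [random.gauss(0, 0.05) for _ in range(2001)]

# For a diffusing particle the MSD grows with lag; for a static
# one it stays near the noise floor.
print(msd(mobile, 10) > 5 * msd(mobile, 1))  # True: MSD grows with lag
print(msd(static, 10) < 0.1)                 # True: no net movement
```

    Comparing MSD curves like these is how imaging studies distinguish molecules that "move freely" from those that are static.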

    Then the scientists saw something odd. Some of the glowing proteins didn’t gather into clumps. Instead, they joined up with a smattering of other viral proteins, creating a fluorescent swirl that evoked van Gogh’s “Starry Night.” These trails of viral proteins still had the right ingredients to replicate Ebola virus, so the scientists dubbed them “network-like” viral factories.

    “These are two different flavors of the viral factory,” says Fang. “People have mostly focused on the droplet-like form, which is the majority, and not paid too much attention to this other form.”

    Besides their shapes, there was a key difference between the two factories. It appeared the network-like factories had the right ingredients for the incoming Ebola virus to express its genes, but they didn’t actually produce viral progeny.

    A multi-tasking machine

    Next, the researchers looked at a key player in infection: a protein called virus polymerase. Polymerase is a multifunctional nanomachine that comes with the virus. This machine not only copies the Ebola virus genomic material, it also transcribes the viral genome into messenger RNAs, which instruct infected cells to produce loads of viral proteins. The researchers wanted to understand how this viral machine functions inside viral factories.

    Ebola virus polymerase is already known as a hard-working protein—all Ebola viral proteins have to be. Ebola virus is a highly efficient pathogen because it gets by with just seven genes (humans have more than 20,000 genes). Saphire has led research showing that Ebola virus survives by making proteins that can transform and take on different jobs during the course of infection.

    Just last year, Saphire, Fang, and collaborators published a related discovery that viral polymerase actually harnesses a druggable human protein to help the virus replicate its genome. The team reported that while polymerase is essential for viral replication, the polymerase doesn’t actually jump into action until infection is well underway.

    This work was important for understanding how polymerase stepped into action, but scientists also needed to know where polymerase was active. Fang knew it would be important to look at what polymerase might be up to in viral factories.

    The researchers discovered that polymerase actually builds its own special structures inside viral factories. Many copies of polymerase gather in small bundles, called foci. The researchers found that these bundles spread out when a droplet-like viral factory starts replicating viral material.

    Scientists aren’t sure exactly why polymerase needs to form bundles before it can do its job, but the spatial arrangement of the bundles must be important. As Fang points out, the idea of many small components coming together to build a structure isn’t a new concept in nature. “You can use a beehive or coral reef as the analogy to help understand why a specific spatial arrangement is important for a biological system to function,” she says.

    With this finding, scientists now know how to find different kinds of viral factories and how polymerase organizes itself down on the factory floor.

    Fighting back

    More than 30 human pathogens are known to assemble viral factories inside host cells, including respiratory syncytial virus (RSV) and even rabies virus. With this new view of Ebola’s viral factories, the scientists are curious whether other viruses construct similar forms of viral factories—and whether other viruses use their own versions of polymerase in the same way.

    “If that’s true, maybe we can target the feature of viral factory formation that has been shared by multiple different viruses,” says Fang.

    Going forward, Fang would also like to study how Ebola virus forms viral factories in different kinds of host cells. Do these viral factories look different in cells from animals (such as the virus’s natural hosts, the fruit bats) that can carry the virus around without getting sick? “Can we find some explanation for host-specific viral pathogenesis?” she asks.

    The new study also demonstrates the importance of collaboration across San Diego’s Torrey Pines Mesa. The LJI team worked closely with Scripps Research Professor Ashok Deniz, Ph.D., and UC San Diego Professor Mark H. Ellisman, Ph.D., Director of the National Center for Microscopy and Imaging Research.

    “The combination of state-of-the-art tools available on the Torrey Pines Mesa allowed us to combine the biophysical characterization with the human health insight,” says Saphire.

    Additional authors of the study, “Spatial and functional arrangement of Ebola virus polymerase inside phase-separated viral factories,” include Guillaume Castillon, Sebastien Phan, Sara McArdle, Chitra Hariharan, and Aiyana Adams.

    This study was supported by the National Institutes of Health (grants NIH S10OD021831, R24GM137200, and S10OD021784), an Imaging Scientist grant (2019‐198153) from the Chan Zuckerberg Initiative, LJI institutional funds, and the Donald E. and Delia B. Baxter Foundation Fellowship.

    DOI: 10.1038/s41467-023-39821-7

    ###

    About La Jolla Institute

    The La Jolla Institute for Immunology is dedicated to understanding the intricacies and power of the immune system so that we may apply that knowledge to promote human health and prevent a wide range of diseases. Since its founding in 1988 as an independent, nonprofit research organization, the Institute has made numerous advances leading toward its goal: life without disease. Visit lji.org for more information.

    La Jolla Institute for Immunology

  • New research points to possible seasonal climate patterns on early Mars


    Newswise — LOS ALAMOS, N.M., Aug. 9, 2023—New observations of mud cracks made by the Curiosity Rover show that high-frequency, wet-dry cycling occurred in early Martian surface environments, indicating that the red planet may have once seen seasonal weather patterns or even flash floods. The research was published today in Nature.

    “These exciting observations of mature mud cracks are allowing us to fill in some of the missing history of water on Mars. How did Mars go from a warm, wet planet to the cold, dry place we know today? These mud cracks show us that transitional time, when liquid water was less abundant but still active on the Martian surface,” said Nina Lanza, principal investigator of the ChemCam instrument onboard the Curiosity Rover. “These features also point to the existence of wet-dry environments that on Earth are extremely conducive to the development of organic molecules and potentially life. Taken as a whole, these results are giving us a clearer picture of Mars as a habitable world.”

    The presence of long-term wet environments, such as evidence of ancient lakes on Mars, is well-documented, but far less is known about short-term climate fluctuations.

    After years of exploring terrain largely composed of silicates, the rover entered a new area filled with sulfates, marking a major environmental transition. In this new environment, the research team found a change in mud crack patterns, signifying a change in the way the surface would have dried. This indicates that water was still present on the surface of Mars episodically: water could have appeared, evaporated, and returned repeatedly until polygons, or mud cracks, formed.

    “A major focus of the Curiosity mission, and one of the main reasons for selecting Gale Crater, is to understand the transition of a ‘warm and wet’ ancient Mars to a ‘cold and dry’ Mars we see today,” said Patrick Gasda of the Laboratory’s Space Remote Sensing and Data Science group and coauthor of the paper. “The rover’s drive from clay lakebed sediments to drier non-lakebed and sulfate-rich sediments is part of this transition.”

    On Earth, initial mud cracks form a T-shaped pattern, but subsequent wetting and drying cycles cause the cracks to form more of a Y-shaped pattern, which is what Curiosity observed. Additionally, the rover found evidence that the mud cracks were only a few centimeters deep, which could mean that wet-dry cycles were seasonal, or may even have occurred more quickly, such as in a flash flood.
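    The T-versus-Y distinction comes down to the angles at which three cracks meet: a single drying event tends to produce roughly 90/90/180-degree junctions, while repeated wet-dry cycling relaxes them toward three roughly 120-degree angles. A minimal sketch of that geometric rule (the `classify_junction` helper and its tolerance are illustrative, not the team's actual method):

```python
def classify_junction(angles, tol=15):
    """Classify a three-way crack junction from its three angles (degrees).

    ~90/90/180 junctions ("T") form in a single drying event;
    ~120/120/120 junctions ("Y") indicate repeated wet-dry cycles.
    """
    if len(angles) != 3 or abs(sum(angles) - 360) > 1:
        raise ValueError("expected three angles summing to 360 degrees")
    if all(abs(a - 120) <= tol for a in angles):
        return "Y"
    if any(abs(a - 180) <= tol for a in angles):
        return "T"
    return "intermediate"

print(classify_junction([90, 90, 180]))    # T
print(classify_junction([118, 121, 121]))  # Y
```

    A field of mostly Y-shaped junctions, as Curiosity observed, is therefore a geometric fingerprint of many wetting and drying cycles rather than one drying event.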

    These findings could mean that Mars once had an Earth-like wet climate, with seasonal or short-term flooding, and that Mars may have been able to support life at some point.  

    “What’s important about this phenomenon is that it’s the perfect place for the formation of polymeric molecules required for life, including proteins and RNA, if the right organic molecules were present at this location,” Gasda said. “Wet periods bring molecules together while dry periods drive reactions to form polymers. When these processes occur repeatedly at the same location, the chance increases that more complex molecules formed there.”

    The paper: “Sustained wet-dry cycling on early Mars.” Nature. DOI: 10.1038/s41586-023-06220-3

    Funding: NASA’s Mars Exploration Program; the work in France is conducted under the authority of CNES. Mastcam mosaics were processed by the Mastcam team at Malin Space Science Systems. Edwin Kite is funded by NASA grant 80NSSC22K0731. Lucy Thompson’s funding as an MSL team member is provided by the CSA.

    Los Alamos National Laboratory

  • Unveiling the Deceptive Tactics of Herpes Virus HCMV in Host Cells


    Newswise — Herpes viruses are treacherous: once you are infected, you can never get rid of the virus. This is because herpes viruses lie dormant in certain host cells in the body for a lifetime. Almost every adult unknowingly carries at least one of the nine different human herpes viruses. The virus can be reactivated due to age, stress or a weakened immune system and lead to sometimes severe diseases.

    Herpes viruses are so successful because they have adapted well to humans and developed effective strategies to escape the immune system. Proteins that make the infected cell believe that it is not infected or threatened play a central role in this camouflage. It is known, for example, that every herpes virus has a powerful proteome, i.e. a large number of these proteins, which is highly adapted to the host and enables the virus to replicate efficiently immediately after infection. The complex proteome also ensures that multilayered particles are built up in the already infected cell. These newly formed viruses – also called virions – contain numerous viral proteins as well as host proteins. In the center of the particles is the viral DNA, which is enclosed by a nucleocapsid. A layer of numerous other proteins called the tegument forms around this capsid.

    Particles come into play in the reactivation of the virus

    The particles are crucial in enabling the virus to replicate again and spread systemically in the body after reactivation triggered by whatever means. They are therefore central to the outbreak of disease – after a long period of dormancy (latency).

    However, little is known about the internal organization of these particles, especially the protein-protein interactions within the tegument. Researchers from the Leibniz-Forschungsinstitut für Molekulare Pharmakologie (FMP) and the Charité – Universitätsmedizin Berlin have therefore taken a closer look at the particles, specifically in human cytomegalovirus (HCMV). HCMV occurs particularly frequently in the population and can be really dangerous, especially for transplant recipients and unborn children who become infected via the mother. Despite intensive research, there is currently no well-tolerated antiviral therapy that could effectively control or even eliminate the virus. There is also no vaccination against this type of virus.

    Map shows which proteins interact with each other

    In the current work, the team led by Fan Liu (FMP) and Lüder Wiebusch (Charité) has for the first time created a detailed map of the spatial interactions between viral and host cell proteins within HCMV particles. Among other things, this revealed that certain host cell proteins are recruited by viral proteins and play a role in viral replication. For example, a viral protein called UL32 recruits a cellular protein (protein phosphatase 1, PP1) into the particle to avoid binding of other, unwanted, host cell proteins.

    “HCMV itself does not have any phosphatases like PP1, so you can see that the virus takes advantage of certain host cell proteins to replicate efficiently,” says FMP virologist Boris Bogdanow, explaining a key strategy for how HCMV tricks its host.

    To study the interactions between the different proteins in intact HCMV particles layer by layer, the researchers used a technique called cross-linking mass spectrometry. “This method also allows us to draw conclusions about the identity of the proteins,” emphasized Fan Liu, an expert in mass spectrometry at the FMP. “But what is special and unique about cross-linking is that we can see which proteins interact with each other and where.”
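    Conceptually, cross-linking mass spectrometry reports pairs of proteins found in close proximity, and assembling those pairs into an interaction map amounts to building an undirected graph. A minimal sketch using placeholder cross-link pairs rather than the study's data (UL32 and PP1 are named above; the other protein names here are hypothetical stand-ins):

```python
from collections import defaultdict

# Placeholder cross-link pairs as a cross-linking MS experiment might
# report them; these are illustrative, not results from the paper.
crosslinks = [
    ("UL32", "PP1"), ("UL32", "TEG_A"), ("TEG_A", "TEG_B"), ("UL32", "PP1"),
]

def build_interaction_map(pairs):
    """Collapse cross-link pairs into an undirected interaction graph,
    deduplicating repeated observations of the same pair."""
    graph = defaultdict(set)
    for a, b in pairs:
        graph[a].add(b)
        graph[b].add(a)
    return dict(graph)

graph = build_interaction_map(crosslinks)
print(sorted(graph["UL32"]))  # ['PP1', 'TEG_A']
```

    The published map is of course far richer, resolving interactions at residue level and layer by layer, but the underlying data structure is this kind of proximity graph.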

    Never before has this innovative technology been used to map the spatial organization of interactions within herpesviral particles. With the data thus obtained, a computer model of the HCMV particle was subsequently created at FU Berlin by Mohsen Sadeghi. The virtual model allows simulation of each protein within the particle and visualizes the biophysical processes in a vivid way.

    “The identified protein-protein interaction is important to better understand the complex life cycle of HCMV,” Boris Bogdanow classifies the results. “And this, in turn, is important for finding candidate anti-viral drugs against HCMV.”

    Leibniz-Forschungsinstitut für Molekulare Pharmakologie

  • Free energy principle predicts self-organized learning in neurons


    Newswise — An international collaboration between researchers at the RIKEN Center for Brain Science (CBS) in Japan, the University of Tokyo, and University College London has demonstrated that self-organization of neurons as they “learn” follows a mathematical theory called the free energy principle. The principle accurately predicted how real neural networks spontaneously reorganize to distinguish incoming information, as well as how altering neural excitability can disrupt the process. The findings thus have implications for building animal-like artificial intelligences and for understanding cases of impaired learning. The study was published August 7 in Nature Communications.

    When we learn to tell the difference between voices, faces, or smells, networks of neurons in our brains automatically organize themselves so that they can distinguish between the different sources of incoming information. This process involves changing the strength of connections between neurons, and is the basis of all learning in the brain. Takuya Isomura from RIKEN CBS and his international colleagues recently predicted that this type of network self-organization follows the mathematical rules that define the free energy principle. In the new study, they put this hypothesis to the test in neurons taken from the brains of rat embryos and grown in a culture dish on top of a grid of tiny electrodes.

    Once you can distinguish two sensations, like voices, you will find that some of your neurons respond to one of the voices, while other neurons respond to the other voice. This is the result of neural network reorganization, which we call learning. In their culture experiment, the researchers mimicked this process by using the grid of electrodes beneath the neural network to stimulate the neurons in a specific pattern that mixed two separate hidden sources. After 100 training sessions, the neurons automatically became selective—some responding very strongly to source #1 and very weakly to source #2, and others responding in the reverse. Drugs that either raise or lower neuron excitability disrupted the learning process when added to the culture beforehand. This shows that the cultured neurons do just what neurons are thought to do in the working brain.

    The free energy principle states that this type of self-organization will follow a pattern that always minimizes the free energy in the system. To determine whether this principle is the guiding force behind neural network learning, the team used the real neural data to reverse engineer a predictive model based on it. Then, they fed the data from the first 10 electrode training sessions into the model and used it to make predictions about the next 90 sessions. At each step, the model accurately predicted the responses of neurons and the strength of connectivity between neurons. This means that simply knowing the initial state of the neurons is enough to determine how the network would change over time as learning occurred.
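    The validation protocol, fitting a model on the first 10 sessions and predicting the remaining 90, can be illustrated with a generic fit-early/extrapolate-late scheme. The sketch below uses a simple log-linear fit on made-up response values; it does not implement the free energy principle model itself:

```python
import math

# Hypothetical per-session response strength that decays as the network
# learns (illustrative numbers only; not data from the study).
sessions = list(range(1, 101))
responses = [10.0 * (0.97 ** s) for s in sessions]

def fit_line(xs, ys):
    """Ordinary least squares for y = slope * x + intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

# Fit in log-space on the first 10 sessions only, mirroring the protocol
# of predicting sessions 11-100 from the initial state alone.
slope, intercept = fit_line(sessions[:10],
                            [math.log(r) for r in responses[:10]])

predicted = [math.exp(intercept + slope * s) for s in sessions[10:]]
errors = [abs(p - r) for p, r in zip(predicted, responses[10:])]
print(max(errors) < 1e-3)  # True: the early trend extrapolates cleanly
```

    The study's claim is much stronger than this toy version: the free-energy-based model predicted both neuronal responses and connection strengths over 90 held-out sessions, not just a smooth decay curve.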

    “Our results suggest that the free-energy principle is the self-organizing principle of biological neural networks,” says Isomura. “It predicted how learning occurred upon receiving particular sensory inputs and how it was disrupted by alterations in network excitability induced by drugs.”

    “Although it will take some time, ultimately, our technique will allow modelling the circuit mechanisms of psychiatric disorders and the effects of drugs such as anxiolytics and psychedelics,” says Isomura. “Generic mechanisms for acquiring the predictive models can also be used to create next-generation artificial intelligences that learn as real neural networks do.”

    RIKEN

  • Understanding the Brain’s Circuit for Socially Subjective Reward Valuation


    Newswise — Okazaki, Japan – Although you might never have consciously considered it, it’s very likely that when you receive a reward, part of the value that you place on it depends on what other people have received as similar rewards. In a recent study published in Nature Communications, Japanese researchers have identified an important brain circuit for this specific process.

    Although researchers have identified the brain regions that are important for deciding the value of a reward in relation to those of others (a process the authors termed ‘socially subjective reward valuation’), the connections between these regions have never been tested experimentally. The research team from the National Institute for Physiological Sciences (NIPS) decided to create a temporary disconnect between the medial prefrontal cortex, which is part of the social brain network, and the lateral hypothalamus, which is involved in social reward valuation.

    “We used a relatively new technique that is commonly known as DREADD, or ‘designer receptor exclusively activated by designer drug’, in macaque monkeys,” says senior author of the study Masaki Isoda. “This method allowed us to temporarily block most of the connections from the brain’s medial prefrontal cortex to the lateral hypothalamus.”

    To test the effects of functionally disconnecting two regions of the monkeys’ brains responsible for socially subjective reward valuation, the researchers used an existing experimental setup. Two monkeys were seated together and shown pictures on a screen. After seeing each picture, only one of the monkeys (or sometimes neither of the monkeys) received water as a reward. By varying the probability of reward for each monkey over a series of tests, the researchers were able to see what happened when the monkeys expected a reward for themselves (they made many licking motions with their tongues) versus a reward for the other monkey (they made fewer licking motions).

    “Using this test, we were able to see the effects of disconnecting the medial prefrontal cortex from the lateral hypothalamus on the monkeys’ expectations of rewards,” says Isoda. “We were excited to see that, with this disconnect, the monkeys were much less susceptible to the prospect of others receiving rewards, but that their own expectations of a reward did not change, suggesting that this pathway is a key circuit in socially subjective reward valuation only.”

    Together with recent research suggesting that the medial prefrontal cortex/lateral hypothalamus circuit is crucial for social rank information in mice, these results indicate that this circuit underlies many important social behaviors. A better understanding of this pathway will aid in the clinical diagnosis and treatment of injuries or alterations to the medial prefrontal cortex and lateral hypothalamus.

    ###
    The article, “Chemogenetic dissection of a prefrontal-hypothalamic circuit for socially subjective reward valuation in macaques,” was published in Nature Communications at DOI: 10.1038/s41467-023-40143-x.

     

    National Institutes of Natural Sciences (NINS)

  • New platform empowers high-entropy alloy electrocatalysis study


    Newswise — Introduced in 2004, high-entropy alloys (HEAs) are alloys composed of multiple principal elements in nearly equiatomic proportions. Their unique chemical composition results in a high degree of chemical disorder, i.e. entropy, and produces remarkable properties such as high strength, ductility, and strong wear-and-tear resistance even at high temperatures. Scientists have dedicated a significant amount of attention to developing novel HEAs to help improve the performance of various electrocatalyst materials.
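    The "high entropy" in the name refers to the ideal configurational entropy of mixing, ΔS_mix = -R Σ x_i ln x_i, which is maximized at R ln N when N elements are present in equal proportions. For the five-element equiatomic case that is R ln 5 ≈ 13.4 J/(mol·K), a quick check one can script:

```python
import math

R = 8.314  # gas constant, J/(mol*K)

def mixing_entropy(fractions):
    """Ideal configurational entropy of mixing: -R * sum(x * ln x)."""
    assert abs(sum(fractions) - 1.0) < 1e-9, "fractions must sum to 1"
    return -R * sum(x * math.log(x) for x in fractions if x > 0)

# Equiatomic five-element alloy such as Cantor's Cr-Mn-Fe-Co-Ni: x_i = 1/5.
cantor = [0.2] * 5
print(round(mixing_entropy(cantor), 2))  # 13.38 J/(mol*K), i.e. R ln 5

# Skewing the composition away from equiatomic lowers the entropy.
print(mixing_entropy(cantor) > mixing_entropy([0.6, 0.1, 0.1, 0.1, 0.1]))  # True
```

    This is why near-equiatomic proportions are the defining feature of HEAs: equal fractions maximize the chemical disorder the article describes.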

    Because they are made up of differing constituent elements, HEAs’ atomic-level surface designs can be complex. But unravelling this complexity is crucial, since the surface properties of materials often dictate their catalytic activity. This is why researchers are seeking to understand the correlation between the atomic arrangement and the catalytic properties exhibited by HEAs.

    Now, a collaborative research team has created a new experimental platform that enables the control of the atomic-level structure of HEAs’ surfaces and the ability to test their catalytic properties. Their breakthrough was reported in the journal Nature Communications on July 26, 2023.

    “In our study we made thin layers of an alloy called a Cantor alloy, which contains a mix of elements (Cr-Mn-Fe-Co-Ni), on platinum (Pt) substrates,” explains Toshimasa Wadayama, co-author of the paper and a professor at Tohoku University’s Graduate School of Environmental Studies. “This produced a model surface for studying a specific reaction called the oxygen reduction reaction (ORR).”

    Using advanced imaging techniques, the group examined the atomic-level structure of the Pt-HEAs’ surfaces and studied their ORR properties. They discovered that the Pt-HEAs’ surfaces performed better in ORR compared to surfaces made of a platinum-cobalt alloy. This indicates that the atomic arrangement and distribution of elements near the surface, which creates a ‘pseudo-core-shell-like structure,’ contributes to the excellent catalytic properties of Pt-HEAs.

    Wadayama and his group stress the wide applicability of their findings, both to alloys of any constituent elements and to other nanomaterials.

    “Our newly constructed experimental study platform provides us with a powerful tool to elucidate the detailed relationship between multi-component alloy surface microstructures and their catalytic properties. It is valid for clarifying the precise correlations among the atomic-level, surface microstructure and electrocatalytic properties of HEAs of any constituent elements and ratios and, thus, would provide reliable training datasets for materials informatics. The platform is applicable not only to electrocatalysis but also in various fields of functional nanomaterials.”

    Looking ahead, the group hopes to extend this platform to practical electrocatalysis by using Pt-HEA nanoparticles designed to increase electrochemical surface area.

    Tohoku University

  • New Resource Harmonizes 16S and Shotgun Sequencing Data for Microbiome Research


    Newswise — Two leading sequencing techniques are no longer at odds, thanks to an international effort led by scientists at University of California San Diego. In a study published July 27, 2023 in Nature Biotechnology, the researchers debuted a new reference database called Greengenes2, which makes it possible to compare and combine microbiome data derived from either 16S ribosomal RNA gene amplicon (16S) or shotgun metagenomics sequencing techniques.

    “This is a significant moment in microbiome research, as we’ve effectively rescued over a decade’s worth of 16S data that might have otherwise become obsolete in the modern world of shotgun sequencing,” said senior author Rob Knight, PhD, professor in the departments of Pediatrics at UC San Diego School of Medicine and Bioengineering and Computer Science at UC San Diego Jacobs School of Engineering. “Standardizing results across these two methods will significantly improve our chances of discovering microbiome biomarkers for health and disease.”

    Microbiome studies depend on scientists’ ability to identify which microorganisms are present in a sample. To do this, they sequence the genetic information in the sample and compare it to reference databases that list which sequences belong to which organisms. 16S and shotgun sequencing are the two techniques most widely used in microbiome research, but they often yield different results.

    “Many researchers assumed that data from 16S and shotgun sequencing were simply too different to ever be integrated,” said first author of the study Daniel McDonald, PhD, scientific director of The Microsetta Initiative at UC San Diego School of Medicine. “Here we show that is not the case, and provide a reference database that researchers can now use to do just that.”

    The original Greengenes database had been widely used in the microbiome field for well over a decade. It was the reference database used by notable projects including the National Institutes of Health Human Microbiome Project, the American Gut Project, the Earth Microbiome Project and many others.

    However, one of its fundamental limitations was that it relied on the sequence of a single gene, 16S, to identify the organisms in a sample. This well-studied gene has long been used as a taxonomic marker, with each organism having its own 16S “barcode.” This method can describe the contents of a microbiome sample with genus-level resolution, but it cannot always identify specific species or strains of microbes, which is important for clinical work.

    Modern microbiome studies have since transitioned to using shotgun sequencing, which looks at DNA from all over the organisms’ genomes, rather than focusing on only one gene. This powerful approach gives researchers more species-level specificity and also provides insight into the microbes’ function.

    Scientists often attributed the discrepancies between the two techniques to differences in the way the samples are prepared in the lab. However, the new study demonstrates that incompatibilities between the two techniques arise from differences in computation, where a better reference database allows for the same conclusions to be drawn from both methods. This addresses an important issue in the reproducibility of microbiome research and allows the re-use of data from millions of samples in older studies.

    In trying to resolve these incompatibilities, the researchers first expanded the Web of Life whole genome database. They then used several new computational tools developed with co-author Siavash Mirarab, PhD, associate professor at UC San Diego Jacobs School of Engineering, to integrate existing high-quality full-length 16S sequences into the whole-genome phylogeny. With another machine learning tool developed by Mirarab’s group, they placed 16S fragments from over 300,000 microbiome samples. The result was an expansive reference database that both 16S and shotgun sequencing data could be mapped onto.

    To confirm whether Greengenes2 would help standardize findings from either sequencing technique, the researchers acquired both 16S and shotgun sequencing data from the same human microbiome samples and analyzed them both against the backdrop of the Greengenes2 phylogeny. The results from both techniques showed highly correlated diversity assessments, taxonomic profiles and effect sizes — something researchers had not seen before.
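    One standard diversity assessment is the Shannon index, H = -Σ p_i ln p_i, computed from a sample's taxonomic relative abundances. The toy comparison below shows how two near-identical profiles yield near-identical diversity estimates; the taxon counts are placeholders, not Greengenes2 data:

```python
import math

def shannon_index(counts):
    """Shannon diversity H = -sum(p * ln p) over taxon relative abundances."""
    total = sum(counts)
    return -sum((c / total) * math.log(c / total) for c in counts if c > 0)

# Hypothetical taxon counts for the same sample measured two ways
# (illustrative numbers only).
profile_16s     = [500, 300, 150, 50]
profile_shotgun = [480, 310, 160, 50]

h1 = shannon_index(profile_16s)
h2 = shannon_index(profile_shotgun)
print(abs(h1 - h2) < 0.05)  # True: near-identical diversity estimates
```

    Agreement at this level across many samples, along with matching taxonomic profiles and effect sizes, is what the study reports when both data types are mapped onto the Greengenes2 phylogeny.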

    “Through Greengenes2, a huge repository of 16S data can now be brought back into the fold and even combined with modern shotgun data in new meta-analyses,” said McDonald. “This is a major step forward in improving the reproducibility of microbiome studies and strengthening physicians’ ability to draw clinical conclusions from microbiome data.”

    Co-authors include: Yueyu Jiang, Metin Balaban, Kalen Cantrell, Antonio Gonzalez, Giorgia Nicolaou, Se Jin Song and Andrew Bartko, all at UC San Diego, as well as Qiyun Zhu at Arizona State University, James T. Morton at the National Institutes of Health, Donovan H. Parks and Philip Hugenholtz at The University of Queensland, Søren Karst at Columbia University, Mads Albertsen at Aalborg University, Todd DeSantis at Second Genome, Aki S. Havulinna, Pekka Jousilahti, Teemu Niiranen and Veikko Salomaa at the Finnish Institute for Health and Welfare, Susan Cheng at Brigham and Women’s Hospital and Cedars-Sinai Medical Center, Mike Inouye at University of Cambridge and Baker Heart and Diabetes Institute, Mohit Jain at Sapient Bioanalytics and Leo Lahti at University of Turku.

    This work was funded, in part, by the National Science Foundation (grants XSEDE BIO210103 and RAPID 20385.09), the National Institutes of Health (grants 1R35GM142725, U19AG063744, U24DK131617, DP1-AT010885), the Emerald Foundation 3022, Danone Nutricia Research, the Center for Microbiome Innovation and the intramural research program of the Eunice Kennedy Shriver National Institute of Child Health and Human Development.

    # # #

    Disclosures: Rob Knight is a consultant and advisory board member with equity and income in BiomeSense related to the proposed PHS-funded research.

    University of California San Diego