ReportWire

Tag: Nature (journal)

  • Quantum-Enhanced Microscope Doubles Resolution


    Newswise — Using a “spooky” phenomenon of quantum physics, Caltech researchers have discovered a way to double the resolution of light microscopes.

    In a paper appearing in the journal Nature Communications, a team led by Lihong Wang, Bren Professor of Medical Engineering and Electrical Engineering, demonstrates a leap forward in microscopy through what is known as quantum entanglement. Quantum entanglement is a phenomenon in which two particles are linked such that the state of one is tied to the state of the other, no matter how far apart they are. Albert Einstein famously called it “spooky action at a distance” because it seemed to imply an influence traveling faster than light, at odds with his theory of relativity.

    According to quantum theory, any type of particle can be entangled. In the case of Wang’s new microscopy technique, dubbed quantum microscopy by coincidence (QMC), the entangled particles are photons. Collectively, two entangled photons are known as a biphoton, and, importantly for Wang’s microscopy, they behave in some ways as a single particle that has double the momentum of a single photon.

    Since quantum mechanics says that all particles are also waves, and that the wavelength of a wave is inversely related to the momentum of the particle, particles with larger momenta have smaller wavelengths. So, because a biphoton has double the momentum of a photon, its wavelength is half that of the individual photons.

    This is key to how QMC works. A microscope can only resolve features down to about half the wavelength of the light it uses. Reducing that wavelength therefore lets the microscope see smaller features, which means higher resolution.
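    As a rough sketch of the arithmetic behind these statements (the de Broglie relation plus the half-wavelength resolution rule quoted above; the paper’s exact resolution criterion is not reproduced here):

    ```latex
    % De Broglie relation: wavelength is inversely proportional to momentum
    \lambda = \frac{h}{p}
    % A biphoton carries roughly twice the momentum of each constituent photon,
    % so its effective wavelength is halved:
    \lambda_{\mathrm{biphoton}} \approx \frac{h}{2p} = \frac{\lambda_{\mathrm{photon}}}{2}
    % Smallest resolvable feature is about half the (effective) wavelength:
    d_{\min} \approx \frac{\lambda_{\mathrm{biphoton}}}{2} = \frac{\lambda_{\mathrm{photon}}}{4}
    % e.g. 400-nm photons entangled into biphotons act like 200-nm light,
    % so features of roughly 100 nm become resolvable.
    ```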

    Quantum entanglement is not the only way to reduce the wavelength of light being used in a microscope. Green light has a shorter wavelength than red light, for example, and purple light has a shorter wavelength than green light. But due to another quirk of quantum physics, light with shorter wavelengths carries more energy. So, once you get down to light with a wavelength small enough to image tiny things, the light carries so much energy that it will damage the items being imaged, especially living things such as cells. This is why ultraviolet (UV) light, which has a very short wavelength, gives you a sunburn.

    QMC gets around this limit by using biphotons that carry the lower energy of longer-wavelength photons while having the shorter wavelength of higher-energy photons.

    “Cells don’t like UV light,” Wang says. “But if we can use 400-nanometer light to image the cell and achieve the effect of 200-nm light, which is UV, the cells will be happy, and we’re getting the resolution of UV.”

    To achieve that, Wang’s team built an optical apparatus that shines laser light into a special kind of crystal that converts some of the photons passing through it into biphotons. Even using this special crystal, the conversion is very rare and occurs in about one in a million photons. Using a series of mirrors, lenses, and prisms, each biphoton—which actually consists of two discrete photons—is split up and shuttled along two paths, so that one of the paired photons passes through the object being imaged and the other does not. The photon passing through the object is called the signal photon, and the one that does not is called the idler photon. These photons then continue along through more optics until they reach a detector connected to a computer that builds an image of the cell based on the information carried by the signal photon. Amazingly, the paired photons remain entangled as a biphoton behaving at half the wavelength despite the presence of the object and their separate pathways.

    Wang’s lab was not the first to work on this kind of biphoton imaging, but it was the first to create a viable system using the concept. “We developed what we believe is a rigorous theory as well as a faster and more accurate entanglement-measurement method. We reached microscopic resolution and imaged cells,” Wang says.

    While there is no theoretical limit to the number of photons that can be entangled with each other, each additional photon would further increase the momentum of the resulting multiphoton while further decreasing its wavelength.

    Wang says future research could enable entanglement of even more photons, although he notes that each extra photon further reduces the probability of a successful entanglement, which, as mentioned above, is already as low as a one-in-a-million chance.

    The paper describing the work, “Quantum Microscopy of Cells at the Heisenberg Limit,” appears in the April 28 issue of Nature Communications. Co-authors are Zhe He and Yide Zhang, both postdoctoral scholar research associates in medical engineering; medical engineering graduate student Xin Tong (MS ’21); and Lei Li (PhD ’19), formerly a medical engineering postdoctoral scholar and now an assistant professor of electrical and computer engineering at Rice University.

    Funding for the research was provided by the Chan Zuckerberg Initiative and the National Institutes of Health.

    California Institute of Technology

  • Camera array detects optical emission of gamma-ray burst


    Newswise — Researchers led by Dr. XIN Liping from the Space-based Multi-band Astronomical Variable Objects Monitor (SVOM) research team, National Astronomical Observatories of the Chinese Academy of Sciences (NAOC), have detected the prompt optical emission and its transition to the early afterglow of a gamma-ray burst (GRB 201223A), using the Ground Wide Angle Camera Array (GWAC) located at Xinglong Observatory of NAOC.

    The study was published in Nature Astronomy on April 10.

    Gamma-ray bursts (GRBs) are produced by the collapse of massive stars or the merger of binary neutron stars. They are accompanied by extreme relativistic jets emitting enormous amounts of energy within a few seconds of the bursts. This phenomenon includes the prompt emission caused by the shock in the jet and the afterglow produced by interaction between the jet and external medium.

    Typical high-energy emission lasts only a few milliseconds to tens of seconds, making it difficult for ground-based optical telescopes to follow up in real time once they receive alerts triggered by space-based high-energy instruments. Until now, optical emission has been detected before the end of the prompt high-energy emission in only a few cases, all of them GRBs with longer-lasting high-energy emission (>30 seconds). Furthermore, all of these measurements were contaminated by reverse-shock emission, making it difficult to clearly trace the transition from prompt emission to afterglow.

    GWAC, proposed and led by Prof. WEI Jianyan, principal investigator of the SVOM mission, is one of the key ground-based telescopes for the SVOM project. It can cover an ultra-large sky area with a temporal resolution of 15 seconds and a detection capability of magnitude 16. Its scientific purpose is to conduct systematic research on the prompt optical emission of GRBs discovered by the SVOM mission.

    In this study, GWAC recorded the entire process—before, during and after the trigger time of the burst. The duration of the high-energy emission was 29 seconds. The emergence of optical and gamma-ray emissions was detected simultaneously.

    “The prompt optical emission is brighter than expected by about four orders of magnitude if only the gamma-ray emission is analyzed, which requires a special physical interpretation for these measurements,” said Dr. XIN.

    A joint analysis with follow-up observations by F60A, an optical telescope jointly operated by NAOC and Guangxi University, clearly captured the complete transition from prompt optical emission to afterglow, without any contamination from reverse shock.

    The extremely early, unique data provided by GWAC place a tight constraint on the characteristics of the progenitor. Scientists expect strong stellar winds around a massive star, which is thought to be the ideal progenitor of a gamma-ray burst. For this event, however, the inferred stellar wind density is quite low even very close to the burst, suggesting the progenitor had a relatively small stellar mass.

    After the launch of SVOM, simultaneous observations by GWAC and the SVOM space-based instruments have the potential to provide essential data for GRB studies, and a large sample of GRBs with prompt optical observations should ultimately be built up during the SVOM mission.

    Chinese Academy of Sciences

  • Improved Gene Editing Method Could Power the Next Generation of Cell and Gene Therapies


    Newswise — PHILADELPHIA— A new approach to the genetic engineering of cells promises significant improvements in speed, efficiency, and reduction in cellular toxicity compared to current methods. The approach could also power the development of advanced cell therapies for cancers and other diseases, according to a study from researchers in the Perelman School of Medicine at the University of Pennsylvania.

    In the study, which appeared this week in Nature Biotechnology, researchers found that protein fragments used by some viruses to help them get into cells could also be used to get CRISPR-Cas gene editing molecules into cells and their DNA-containing nuclei with extraordinarily high efficiency and low cellular toxicity.

    The scientists expect the new technique to be particularly useful for modifying T cells and other cells from a patient’s own body to make cell therapies. One such application could be CAR T (chimeric antigen receptor T cell) therapy, which uses specially modified immune cells from a patient to treat cancer. The T cells—a type of white blood cell—are removed from the patient and reprogrammed to find and attack cancer cells when reintroduced to the bloodstream.  

    The first FDA-approved CAR T therapy was developed at Penn Medicine, and received Food & Drug Administration approval in 2017. There are now six FDA-approved CAR T cell therapies in the United States. The therapies have revolutionized the treatment of certain B cell leukemias, lymphomas, and other blood cancers, putting many patients who otherwise had little hope into long-term remission.

    “This new approach—building on Penn Medicine’s history of cell and gene therapy innovation—has the potential to be a major enabling technology for engineered cellular therapies,” said co-senior author E. John Wherry, PhD, Richard and Barbara Schiffrin President’s Distinguished Professor and chair of Systems Pharmacology & Translational Therapeutics at Penn Medicine.

    CRISPR-Cas molecules are derived from ancient bacterial antiviral defenses, and are designed to precisely remove DNA at desired locations in a cell’s genome. Some CRISPR-Cas-based systems combine the deletion of old DNA with the insertion of new DNA for versatile genome editing. This approach can be used to replace faulty genes with corrected ones or delete or modify genes to enhance cellular function. Some systems can also add genes that confer new properties to CAR T cells such as the ability to recognize tumors or withstand the harsh tumor microenvironment that normally exhausts T cells.

    Although CRISPR-Cas systems are already widely used as standard laboratory tools for molecular biology, their use in modifying patients’ cells to make cell-based therapies has been limited—in part because CRISPR-Cas molecules can be hard to get into cells and then into cells’ DNA-containing nuclei.

    “Current methods of getting CRISPR-Cas systems into cells, which include the use of carrier viruses and electric pulses, are inefficient for cells taken directly from patients (called primary cells). These methods also typically kill many of the cells they are used on, and can even cause broad unwanted changes in gene activity,” said co-senior author Shelley L. Berger, PhD, the Daniel S. Och University Professor in Cell and Developmental Biology and Genetics and director of the Penn Epigenetics Institute.

    In the study, researchers explored the use of small, virus-derived protein fragments, called peptides, to pilot CRISPR-Cas molecules more efficiently through the outer membranes of primary human cells and into their nuclei. Notably, researchers found that a fused combination of two modified peptides—one found in HIV and one in influenza viruses—could be mixed with CRISPR-Cas molecules to get them into primary human or mouse cells and their nuclei with efficiencies of up to nearly 100 percent, depending on the cell type—with almost no toxicity or gene-expression changes.

    The team demonstrated the approach, which they call PAGE (peptide-assisted genome editing), for several types of envisioned cell therapy including CAR T cell therapies.

    In addition to its potential use in cell and gene therapies, the authors note the PAGE approach could see wide application in basic scientific research. The inefficiency of standard CRISPR-Cas cell penetration methods has meant that gene-editing to create mouse models of diseases typically requires a multi-step, time-consuming process of generating transgenic mice—to introduce the gene-editing machinery into their DNA. By contrast, PAGE with its high efficiency and low toxicity might enable rapid, efficient, and straightforward gene editing in ordinary lab mice.

    “The simplicity and power of the peptide-assist concept suggests that it could potentially be adapted in the future for the delivery into primary cells of other genome-editing proteins, or even protein-based drugs,” said co-senior author Junwei Shi, PhD, an assistant professor of Cancer Biology and member of the Penn Epigenetics Institute and Abramson Family Cancer Research Institute.

    The study was a collaboration that included the laboratories of Penn co-author Rahul Kohli, MD, PhD, an associate professor of Infectious Diseases and Biochemistry and Biophysics, and co-author Gerd Blobel, MD, PhD, the Frank E. Weise III Professor of Pediatrics and co-director of the Epigenetics institute.

    This study was supported by the National Institutes of Health (R01-HL119479, R01-GM138908, AI105343, AI082630, AI108545, AI155577, AI149680, U19AI082630, R35-CA263922, R01-CA258904), the Parker Institute for Cancer Immunotherapy, and institutional funds from the University of Pennsylvania.

    Perelman School of Medicine at the University of Pennsylvania

  • New MRI method images brain glucose metabolism


    Newswise — Metabolic disorders are involved in many common health conditions such as Alzheimer’s, depression, diabetes, and cancer. Non-invasive diagnostic methods are needed to reliably detect these disorders. Until now, radioactive substances have been used to map glucose metabolism in the brain. However, a research team at MedUni Vienna has developed a new MRI approach that uses a harmless glucose solution to generate reliable results. This new method can be used with all common MRI scanners and has been published in the scientific journal Nature Biomedical Engineering.

    The research team conducted a study to improve current diagnostic methods for mapping brain glucose metabolism. They measured blood glucose levels and metabolic products in healthy subjects multiple times over 90 minutes, using a harmless glucose solution instead of radiolabeled glucose. The method measures the concentration and metabolism of glucose indirectly, from changes in the MRI signal intensity of its metabolic products. Unlike other approaches, it does not require additional hardware components, making it easy to use on standard MRI devices. Wolfgang Bogner from MedUni Vienna explained the significance of this finding for clinical practice.

    Broad range of potential applications

    The study was conducted by researchers from the Department of Psychiatry and Psychotherapy and the Department of Medicine III at MedUni Vienna, using the university’s 7-Tesla MRI scanner, which is the only ultra-high-field MR scanner in Austria. The researchers were able to demonstrate that their new approach also works on 3-Tesla MR scanners, which are commonly used in clinical applications. This finding was an important step in validating the practicality and widespread applicability of the new method. Fabian Niess, the lead author of the follow-up study, highlighted the significance of this development.

    Further studies needed to confirm results

    Many common diseases are characterized by abnormalities in glucose metabolism. Cancer cells, for example, consume more glucose than normal cells, a fact exploited to diagnose and locate tumors with PET-CT scans. However, this requires injecting patients with radioactive glucose. While the new, less invasive MRI method developed at MedUni Vienna shows promise, further studies are needed to validate its effectiveness before it can be used for patient benefit.

    Medical University of Vienna (MedUni Wien)

  • First-ever measurement of a quantum paradoxical phenomenon


    Newswise — Some things are related, others are not. If you randomly select a person from a crowd who is significantly taller than average, there is a good chance that they will also weigh more than average. Statistically, one quantity contains some information about the other.

    Quantum physics allows for even stronger links between different quantities: different particles or parts of an extended quantum system can “share” a certain amount of information. There are curious theoretical predictions about this: surprisingly, the measure of this “mutual information” does not depend on the size of the subsystem but only on its surface. This surprising result has now been confirmed experimentally at TU Wien and published in Nature Physics. Theoretical input to the experiment and its interpretation came from the Max-Planck-Institut für Quantenoptik in Garching, FU Berlin, ETH Zürich and New York University.

    Quantum information: More strongly connected than classical physics allows

    “Let’s imagine a gas container in which small particles fly around and behave in a very classical way like small spheres,” says Mohammadamin Tajik of the Vienna Center for Quantum Science and Technology (VCQ) – Atominstitut of TU Wien, first author of the current publication. “If the system is in equilibrium, then particles in different areas of the container know nothing about each other. One can consider them completely independent of each other. Therefore, one can say that the mutual information these two particles share is zero.”

    In the quantum world, however, things are different: If particles behave quantumly, then it may happen that you can no longer consider them independently of each other. They are mathematically connected – you can’t meaningfully describe one particle without saying something about the other.

    “For such cases, there has long been a prediction about the mutual information shared between different subsystems of a many-body quantum system,” explains Mohammadamin Tajik. “In such a quantum gas, the shared mutual information is larger than zero, and it does not depend on the size of the subsystems – but only on the outer bounding surface of the subsystem.”

    This prediction seems intuitively strange: In the classical world, it is different. For example, the information contained in a book depends on its volume – not merely on the area of the book’s cover. In the quantum world, however, information is often closely linked to surface area.
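    For reference, the quantity being discussed can be written down compactly; this is the standard quantum-information definition (not spelled out in the article), with the area-law behaviour shown schematically:

    ```latex
    % Von Neumann entropy of subsystem A with reduced density matrix rho_A
    S(A) = -\operatorname{Tr}\!\left(\rho_A \ln \rho_A\right)
    % Mutual information shared between subsystems A and B
    I(A\!:\!B) = S(A) + S(B) - S(A \cup B)
    % Area-law prediction (schematic): I scales with the boundary of the
    % subsystem, |\partial A|, not with its volume
    I(A\!:\!B) \;\propto\; |\partial A|
    ```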

    Measurements with ultracold atoms

    An international research team led by Prof. Jörg Schmiedmayer has now confirmed for the first time that the mutual information in a many-body quantum system scales with the surface area rather than with the volume. For this purpose, they studied a cloud of ultracold atoms. The particles were cooled to just above absolute zero and held in place by an atom chip. At extremely low temperatures, the quantum properties of the particles become increasingly important. The information spreads out more and more in the system, and the connection between the individual parts of the overall system becomes more and more significant. In this case, the system can be described with a quantum field theory.

    “The experiment is very challenging,” says Jörg Schmiedmayer. “We need complete information about our quantum system, as best as quantum physics allows. For this, we have developed a special tomography technique. We get the information we need by perturbing the atoms just a bit and then observing the resulting dynamics. It’s like throwing a rock into a pond and then getting information about the state of the liquid and the pond from the consequent waves.”

    As long as the system’s temperature does not reach absolute zero (which is impossible), this “shared information” has a limited range. In quantum physics, this is related to the “coherence length” – it indicates the distance over which particles behave quantum mechanically in a correlated way and thereby “know” about each other. “This also explains why shared information doesn’t matter in a classical gas,” says Mohammadamin Tajik. “In a classical many-body system, coherence disappears; you can say the particles no longer know anything about their neighboring particles.” The effect of temperature and coherence length on mutual information was also confirmed in the experiment.

    Quantum information plays an essential role in many technical applications of quantum physics today. Thus, the experiment results are relevant to various research areas – from solid-state physics to the quantum physical study of gravity.

    Vienna University of Technology

  • Can technology save mangrove forests with deep learning?


    Newswise — Mangrove forests are an essential component of the coastal zones in tropical and subtropical areas, providing a wide range of goods and ecosystem services that play a vital role in ecology. They are also threatened, disappearing, and degraded across the globe.

    One way to stimulate effective mangrove conservation and encourage policies for their protection is to carefully assess mangrove habitats and how they change, and identify fragmented areas. But obtaining this kind of information is not always an easy task.

    “Since mangrove forests are located in tidal zones and marshy areas, they are hardly accessible,” says Dr. Neda Bihamta Toosi, a postdoctoral researcher at Isfahan University of Technology in Iran who studies landscape pattern changes using remote sensing. In a recent study in the journal Nature Conservation, she and a team of co-authors explored ways to classify these fragile ecosystems using machine learning.

    Comparing the performance of different combinations of satellite images and classification techniques, the researchers looked at how good each method was at mapping mangrove ecosystems.

    “We developed a novel method with a focus on landscape ecology for mapping the spatial disturbance of mangrove ecosystems,” she explains. “The provided disturbance maps facilitate future management and planning activities for mangrove ecosystems in an efficient way, thus supporting the sustainable conservation of these coastal areas.”

    The results of the study showed that object-oriented classification of fused Sentinel images can significantly improve the accuracy of mangrove land use/land cover classification.

    “Assessing and monitoring the condition of such ecosystems using model-based landscape metrics and principal component analysis techniques is a time- and cost-effective approach. The use of multispectral remote sensing data to generate a detailed land cover map was essential, and freely available Sentinel-2 data will guarantee its continuity in future,” explains Dr. Bihamta Toosi.
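    The general idea can be sketched as supervised land-cover classification of satellite band values with a machine-learning classifier. The example below is purely illustrative (synthetic features and labels, a simple pixel-based random forest); it is not the study’s object-oriented workflow, data, or class scheme.

    ```python
    # Hypothetical illustration: pixel-based supervised classification of
    # Sentinel-2-style band values into land-cover classes such as "water",
    # "mangrove", and "other vegetation". All features and labels are synthetic.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import accuracy_score

    rng = np.random.default_rng(0)

    # Stand-in for per-pixel reflectance in four bands (e.g. B2, B3, B4, B8)
    X = rng.random((1000, 4))
    # Stand-in for reference labels: 0 = water, 1 = mangrove, 2 = other vegetation
    y = rng.integers(0, 3, size=1000)

    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    clf.fit(X_train, y_train)

    print("overall accuracy:", accuracy_score(y_test, clf.predict(X_test)))
    ```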

    The research team hopes this approach can be used to provide information on the trend of changes in land cover that affect the development and management of mangrove ecosystems, supporting better planning and decision-making.

    “Our results on the mapping of mangrove ecosystems can contribute to the improvement of management and conservation strategies for these ecosystems impacted by human activities,” they write in their study.

    Research article:

    Soffianian AR, Toosi NB, Asgarian A, Regnauld H, Fakheran S, Waser LT (2023) Evaluating resampled and fused Sentinel-2 data and machine-learning algorithms for mangrove mapping in the northern coast of Qeshm island, Iran. Nature Conservation 52: 1-22. https://doi.org/10.3897/natureconservation.52.89639

    Pensoft Publishers

  • Finnish study shows those at risk were less likely to get vaccinated.


    Newswise — A large-scale registry study in Finland has identified several factors associated with uptake of the first dose of COVID-19 vaccination. In particular, people with low or no labor income and people with mental health or substance abuse issues were less likely to get vaccinated.

    The study, carried out in collaboration between the University of Helsinki and the Finnish Institute of Health and Welfare, tested the association of nearly 3000 health, demographic and socio-economic variables with the uptake of the first COVID-19 vaccination dose across the entire Finnish population. 

    This work, just published in Nature Human Behaviour, is the largest study to date on this topic.

    The factors most strongly associated with a reduced likelihood of being vaccinated were a lack of labor income in the year preceding the pandemic, a mother tongue other than Finnish or Swedish, and having unvaccinated close relatives, especially an unvaccinated mother. Among health-related variables, factors related to mental health and substance abuse problems were associated with reduced vaccination uptake.

    “Lack of labor income can be due to unemployment, sickness or retirement. Furthermore, among individuals with labor income, we saw that low-income earners were the least likely to get vaccinated”, explains Tuomo Hartonen, Postdoctoral Researcher at the Institute for Molecular Medicine Finland FIMM, University of Helsinki.

    The study was based on the FinRegistry data. Researchers analysed population-wide national health and population register data from the pre-pandemic period and compared these with the vaccination status data. The analyses were limited to people aged 30-80 years.

    “A particular strength of our study is that it is based on registers covering the entire Finnish population. This way we can avoid all selection bias, which is a major challenge of survey studies”, Postdoctoral Researcher Bradley Jermy from FIMM says.

    The researchers stress that their results describe the association between the studied variables and vaccination uptake at the population level, but do not allow conclusions to be drawn about causal relationships. Furthermore, the generalizability of the findings outside Finland requires further studies. However, it is clear from the results that in Finland, vaccination uptake was lowest among those who are already in a vulnerable position.

    Researchers created a machine learning-based model to predict vaccination uptake

    In addition to studying single predictors, the research team constructed a machine learning-based model to predict vaccination uptake. This prediction model allowed the researchers to group individuals according to their likelihood of receiving the COVID-19 vaccine.

    Approximately 90% of the total study population received at least one dose of COVID-19 vaccination. In contrast, the group with the lowest probability of being vaccinated based on the model had a vaccination rate of less than 19%.
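    To make the idea concrete, here is a minimal, hypothetical sketch of predicting uptake from tabular registry-style features and then grouping people by predicted probability (synthetic data and scikit-learn; not the authors’ actual model, features, or results):

    ```python
    # Hypothetical illustration: fit a classifier on registry-style features,
    # then bin individuals into deciles of predicted uptake probability and
    # compare observed uptake across bins. All data below are synthetic.
    import numpy as np
    import pandas as pd
    from sklearn.ensemble import GradientBoostingClassifier
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(1)
    n = 5000

    # Synthetic stand-ins for registry variables (income, age, prior diagnoses, ...)
    X = pd.DataFrame({
        "labor_income": rng.gamma(2.0, 15000, n),
        "age": rng.integers(30, 81, n),
        "mental_health_dx": rng.integers(0, 2, n),
    })
    # Synthetic outcome: 1 = received at least one vaccine dose
    p = 0.6 + 0.3 * (X["labor_income"] > 20000) - 0.2 * X["mental_health_dx"]
    y = rng.binomial(1, p.clip(0.05, 0.95))

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=1)
    model = GradientBoostingClassifier(random_state=1).fit(X_tr, y_tr)

    # Group test individuals into deciles of predicted uptake probability
    scores = model.predict_proba(X_te)[:, 1]
    deciles = pd.qcut(scores, 10, labels=False, duplicates="drop")
    print(pd.Series(y_te).groupby(deciles).mean())  # observed uptake per decile
    ```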

    “Our research has created a framework for using machine learning and statistical approaches to identify those groups that are at higher risk of not vaccinating”, says the corresponding author of the study, Associate Professor Andrea Ganna from FIMM.

     “These results and the predictive model could be used in the future, for example in designing vaccination campaigns”, says the Principal Investigator of the FinRegistry study, Research Professor Markus Perola from THL.

    “This study is a great example of the possibilities that the FinRegistry study creates for investigating highly topical issues in a short timeframe. The collaboration between THL’s genetic and registry researchers and FIMM scientists will help to understand the many pathways that lead to susceptibility to different diseases,” Perola continues.

    The study is part of the FinRegistry project, a joint research project between the Finnish Institute for Health and Welfare (THL) and the Institute for Molecular Medicine Finland (FIMM) at the University of Helsinki.

    University of Helsinki

  • Gut germs use strong substances to avoid antibiotics.


    Newswise — The discovery shows why it can be so difficult to tackle drug-resistant bacteria, but does provide a possible avenue for tackling the problem. The super-polymer structures the bacteria use to transfer genes could also be exploited for precise drug delivery in future medicine.

    Gut bacteria form extracellular appendages called F-pili to connect to each other and transfer packets of DNA, called genes, that allow them to resist antibiotics. It was thought that the harsh conditions inside human and animal guts, including turbulence, heat, and acids, would break the F-pili, making transfer more difficult.

    However, new research by a team led by Imperial College London researchers has shown that the F-pili are actually stronger in these conditions, helping the bacteria transfer resistance genes more efficiently, and to clump into ‘biofilms’ – protective bacterial consortia – that help them fend off antibiotics.

    The results are published in Nature Communications.

    First author Jonasz Patkowski, from the Department of Life Sciences at Imperial, said: “The death toll from antimicrobial resistance is expected to match cancer by 2050, meaning we urgently need new strategies to combat this trend. Much of the spread of resistance is driven by bacteria swapping genes, so detailed understanding of this process could lead to new ways to interrupt it.”

    Not so fragile

    Different classes of bacteria use different types of pili to transfer genes in a process called conjugation. A classic experiment seemed to show that this process was fragile and could be interrupted by agitation, but this left a mystery: why do so many bacteria living in harsh conditions like guts use these systems if they are so fragile?

    The team therefore set out to test this assumption. By shaking E. coli bacteria while they used F-pili during conjugation, they discovered that agitation actually increased the efficiency of gene transfer between bacteria. They also observed that after transferring genes, the conjugated bacteria in shaken conditions clumped together more easily to form biofilms, which protect inner bacteria from the surrounding antibiotic molecules.

    To determine how the F-pili are able to do this, the team subjected them to a strength test by mounting a bacterium on a stage, connecting a glass bead using ‘molecular tweezers’ to the end of one of its F-pili, and pulling. The F-pili proved highly elastic, with spring-like properties that prevented them from breaking.

    They also tested the F-pili’s ability to withstand other common gut conditions, subjecting them to sodium hydroxide, urea, and excessively high temperatures of 100°C – all of which the F-pili survived.

    Molecular properties

    The team then went a step further, looking at the F-pili on a molecular level to see what gives them these incredible properties. They are primarily made up of F-pilin ‘subunits’ with interlinked phospholipid molecules.

    By modelling the F-pili without the phospholipids, the team showed how important these molecules are for the structure’s springiness and elastic strength. Repeating the pulling experiment revealed that the subunits quickly disassemble without the phospholipids supporting them, proving their novel role as a ‘molecular glue’ in long biopolymers.

    Lead researcher Dr Tiago Costa, from the Department of Life Sciences at Imperial, said: “Making F-pili is very costly to the bacteria in terms of resources and energy, so it’s no surprise they are worth the effort. We have shown how F-pili accelerate the spread of antibiotic resistance and biofilm formation in turbulent environments, but the challenge now is to find ways to combat this very efficient process.”

    While it would be advantageous to break F-pili in pathogenic bacteria, their properties could be helpful if we can engineer them for use in, for example, drug delivery. Patkowski explained: “It’s hard to find a tubular appendage with such strong properties. Bacteria use it to transfer genes, but if we could mimic these properties, we could use similar structures to precisely deliver drugs where they are needed in the body.”

    https://www.imperial.ac.uk/news/244513/gut-bacteria-superpolymers-dodge-antibiotics/

    Imperial College London

  • Genetic code of hornets sequenced to understand their successful invasion


    Newswise — The genomes of two hornet species, the European hornet and the Asian hornet (or yellow-legged hornet) have been sequenced for the first time by a team led by UCL (University College London) scientists.

    By comparing these decoded genomes with that of the giant northern hornet, which has recently been sequenced by another team, the researchers have revealed clues suggesting why hornets have been so successful as invasive species across the globe.

    Hornets are the largest of the social wasps; they play important ecological roles as top predators of other insects. In their native regions, they are natural pest controllers, helping regulate the populations of insects such as flies, beetles, caterpillars and other types of wasps. These services are critical for healthy, functional ecosystems, as well as for agriculture.

    But hornets also tend to be very successful as invasive species. They can become established in areas they are not native to and cause potentially huge ecological and economic damage by hunting important pollinators, such as honeybees, wild bees and hoverflies.

    To better understand how these species have so successfully expanded their ranges, the international team of scientists investigated the genomes of three types of hornets.

    A genome sequence is the set of instructions – a genetic code – that makes a species. Comparing the genomes of different species can give insights into their biology – their behaviour, evolution, and how they interact with the environment.

    The researchers have newly sequenced the genomes of the native European hornet, Vespa crabro – an important top predator, which is protected in parts of Europe – and the invasive yellow-legged Asian hornet, Vespa velutina, which has become established across much of Europe over the last 20 years, threatening native ecosystems, and has occasionally been sighted in the UK. They compared these with the genome of the giant northern hornet, Vespa mandarinia – a species known for its role as pest controller, pollinator and food provider in its native Asian range, but which is a recent arrival in North America, where it may threaten native fauna.

    By analysing differences between the three related species, the researchers were able to identify genes that have been evolving rapidly since the hornets diverged from other wasps and from one another; notably, many of these fast-evolving genes relate to communication and olfaction (smell).

    The study’s first author, Dr Emeline Favreau (UCL Centre for Biodiversity & Environment), said: “We were excited to find evidence of rapid genome evolution in these hornet genomes, compared to other social insects. Lots of genes have been duplicated or mutated; these included genes that are likely to be involved in communication and in sensing the environment.”

    Genome evolution allows organisms to adapt to their environment and make the most of their surroundings by developing new behaviours and physiology.

    Co-author Dr Alessandro Cini, who began the work at UCL before moving to the University of Pisa, said: “These findings are exciting, as they may help explain why hornets have been so successful in establishing new populations in non-native regions.

    “Hornets are carried to different parts of the world accidentally by humans. All that is needed is a small number of mated queens to be transported, hidden in cargo perhaps. The genomes suggest that hornets have lots of genes involved in detecting and responding to chemical cues – these may make them especially good at adapting to hunt different types of prey in non-native regions.”

    Senior author Professor Seirian Sumner (UCL Centre for Biodiversity & Environment) said: “These hornet genomes are just the beginning. The genomes of more than 3,000 insect species have now been sequenced by efforts around the world, but wasps are under-represented among these.

    “Genomes tell us about aspects of the ecology and evolution that other methods cannot. Evolution has equipped these insects with an incredible genetic toolbox with which to exploit their environment and hunt their prey.”

    Armed with these new genomes, the scientists hope to help improve the management of hornet populations, both for their ecosystem services as pest controllers in native zones, and as ecological threats in regions where they are invasive.

    The study involved researchers in the UK, Italy, Spain, Israel, France, New Zealand, and Austria, and was primarily funded by the Natural Environment Research Council.

    University College London

  • Unveiling the secrets of sealed animal coffins in ancient Egypt


    Newswise — The contents of six sealed ancient Egyptian animal coffins — which were imaged using a non-invasive technique — are described in a study published in Scientific Reports.

    The mummification of animals was a widespread practice in ancient Egypt and previous research has suggested that some mummified animals were believed to be physical incarnations of deities, while others may have represented offerings to deities or have been used in ritual performances.

    Daniel O’Flynn and colleagues imaged the contents of six sealed animal coffins using neutron tomography — a technique that creates images of objects based on the extent to which neutrons emitted by a source can pass through them — after previous attempts to study the coffins with x-rays were unsuccessful. All six of the coffins are made of copper compounds. The authors note that it is rare for such coffins to still be sealed. Three of the coffins, topped with lizard and eel figures as well as loops, have been dated to between 500 and 300 BCE and were discovered in the ancient city of Naukratis. A fourth coffin, topped by a lizard figure, has been dated to between 664 and 332 BCE and was discovered in the ancient city of Tell el-Yehudiyeh. The two other coffins, topped with part-eel, part-cobra figures with human heads, have been dated to between approximately 650 and 250 BCE and are of unknown origin.

    The authors identified bones in three of the coffins, including an intact skull with dimensions similar to those of a group of wall lizards containing species that are endemic to North Africa, as well as evidence of broken-down bones in a further two coffins. They also identified textile fragments within three coffins that were possibly made from linen, which was commonly used in Ancient Egyptian mummification. They propose that linen may have been wrapped around the animals before they were placed in the coffins. The authors found lead within the three coffins without loops, which they suggest may have been used to aid weight distribution within two of them and to repair a hole found in the other. They speculate that lead may have been selected due to its status in ancient Egypt as a magical material, as previous research has proposed that lead was used in love charms and curses. The authors did not identify additional lead within the three coffins topped with loops. They suggest that the loops may have been used to suspend these lighter coffins from shrine or temple walls or from statues or boats used during religious processions, while the heavier lead-containing coffins without loops may have been used for different purposes.

    The findings provide further insight into the manufacture and use of animal coffins in ancient Egypt.

    Scientific Reports

  • Life-Friendly Environments Found in Metal-Poor Stars


    Newswise — Researchers from the Max Planck Institutes for Solar System Research and for Chemistry, together with the University of Göttingen, have discovered that stars rich in heavy elements are less conducive to the development of complex life than stars with low metal content. The team demonstrated the correlation between a star’s metallicity and the ability of its planets to create an ozone layer for protection against harmful ultraviolet light emitted by the star. This finding provides valuable information for scientists searching for habitable star systems using space telescopes. Additionally, the study suggests a surprising conclusion: the universe becomes progressively less hospitable to the emergence of complex life on newly formed planets as it ages.

    Over the past few years, researchers have increasingly concentrated on the gas envelopes of distant planets in their search for habitable or inhabited worlds. They examine observational data to determine whether these planets possess an atmosphere, and whether it includes gases like oxygen or methane, which on Earth are primarily produced by lifeforms. In the coming years, NASA’s James Webb Space Telescope will expand these observations to unprecedented levels. It will allow researchers not only to characterize the atmospheres of large gas giants, such as super-Neptunes, but also to scrutinize the much fainter spectroscopic signals emanating from rocky planet atmospheres for the first time.

    The study, which was recently published in Nature Communications, employed numerical simulations to examine the ozone content of exoplanet atmospheres. Like on Earth, this molecule, composed of three oxygen atoms, can safeguard the planet’s surface and its resident life forms against harmful ultraviolet (UV) radiation. Therefore, an ozone layer is a critical prerequisite for the emergence of complex life. “Our aim was to determine the characteristics of a star that must exist for its planets to generate a protective ozone layer,” Anna Shapiro, the first author of the study and a researcher at the Max Planck Institute for Solar System Research, stated in outlining the study’s fundamental concept.

    As is often the case in scientific research, the concept of the current study was prompted by a previous discovery. Three years ago, a team of scientists from the Max Planck Institute for Solar System Research examined the variations in the Sun’s brightness in comparison to those of hundreds of similar stars. The outcome revealed that the visible light intensity of many of these stars fluctuated significantly more than that of the Sun. Alexander Shapiro, who participated in both studies, remarked, “We observed enormous intensity spikes,” and suggested that the Sun might be capable of producing similar fluctuations. “In such cases, the ultraviolet light intensity would also increase significantly,” adds Sami Solanki, co-author of both studies and director of the Max Planck Institute for Solar System Research.

    Dual role of UV radiation

    The researchers focused their calculations on the subgroup of stars, approximately half of all stars, around which exoplanets have been observed to orbit, and whose surface temperatures range from approximately 5,000 to 6,000 degrees Celsius. The Sun, with a surface temperature of around 5500 degrees Celsius, is also a member of this subgroup. “Ultraviolet radiation from the Sun plays a dual role in the atmospheric chemistry of Earth,” explains Anna Shapiro, whose previous research has concentrated on the effects of solar radiation on the Earth’s atmosphere. Ozone can be created and destroyed through reactions with individual oxygen atoms and oxygen molecules. Long-wave UV-B radiation destroys ozone, while short-wave UV-C radiation generates protective ozone in the middle atmosphere. “It was therefore plausible to assume that ultraviolet light might have a similarly intricate impact on exoplanet atmospheres,” the astronomer notes. The precise wavelengths of radiation are critical.

    To determine the impact of ultraviolet light on exoplanet atmospheres, the researchers conducted calculations that precisely identified the wavelengths of the ultraviolet light emitted by stars. They also took into account the effect of metallicity, a property that characterizes the ratio of hydrogen to heavier elements, which astrophysicists often refer to collectively as “metals”. The Sun, for example, has a ratio of more than 31,000 hydrogen atoms to one iron atom. The study also considered stars with lower and higher iron content. This is the first time that metallicity has been factored into such calculations.
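    For context, astronomers usually express metallicity on a logarithmic scale relative to the Sun; the sketch below shows that standard convention (the study’s own parameterization is not reproduced here):

    ```latex
    % Iron-to-hydrogen abundance of a star, relative to the solar value
    [\mathrm{Fe}/\mathrm{H}] \;=\;
      \log_{10}\!\left(\frac{N_{\mathrm{Fe}}}{N_{\mathrm{H}}}\right)_{\!\star}
      \;-\;
      \log_{10}\!\left(\frac{N_{\mathrm{Fe}}}{N_{\mathrm{H}}}\right)_{\!\odot}
    % [Fe/H] = 0 for solar metallicity; negative values denote metal-poor stars
    % (the Sun's ratio of more than 31,000 hydrogen atoms per iron atom
    %  corresponds to [Fe/H] = 0 by definition)
    ```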

    Simulated interactions of UV radiation with gases

    After identifying the ultraviolet light wavelengths emitted by stars and considering the effect of metallicity, the researchers went on to investigate how this calculated UV radiation would impact the atmospheres of planets orbiting at a life-friendly distance around these stars. Life-friendly distances refer to those orbits where the temperature is moderate enough to support liquid water on the planet’s surface. Using computer simulations, the team investigated the processes triggered in the planet’s atmosphere by the parent star’s characteristic UV light.

    To compute the composition of planetary atmospheres the researchers used a chemistry-climate model that simulates the processes that control oxygen, ozone, and many other gases, and their interactions with ultraviolet light from stars, at very high spectral resolution. This model allowed the investigation of a wide variety of conditions on exoplanets and comparison with the history of the Earth’s atmosphere in the last half billion years. During this period the high atmospheric oxygen content and the ozone layer were established that allowed the evolution of life on land on our planet. “It is feasible that the history of the Earth and its atmosphere holds clues about the evolution of life that may also apply to exoplanets” says Jos Lelieveld, Managing Director of the Max Planck Institute for Chemistry, who was involved in the study.

    Promising candidates

    The simulations yielded unexpected results for the researchers. It was found that in general, stars with lower metallicity emit more ultraviolet (UV) radiation than their higher metallicity counterparts. However, the proportion of UV radiation that produces ozone (UV-C) compared to that which destroys it (UV-B) is critically dependent on the metallicity. In stars with lower metallicity, UV-C radiation dominates, resulting in the formation of a dense ozone layer. In contrast, in stars with higher metallicity, UV-B radiation predominates, resulting in a much sparser protective envelope. “These findings suggest that, contrary to expectations, stars with lower metallicity may offer more conducive conditions for the emergence of life,” concludes Anna Shapiro.

    Paradoxical conclusion

    In addition, the study draws an almost paradoxical conclusion: as the universe evolves, it may become less hospitable to life. Heavy elements and metals are synthesized in stars towards the end of their multi-billion-year lifetimes and are then released into space either via stellar wind or a supernova explosion. This material becomes the building blocks for the formation of the next generation of stars. “Thus, each new star has more metal-rich material available than its predecessors, and stars in the universe become more metal-rich with each generation,” explains Anna Shapiro. The new study indicates that the likelihood of star systems producing life decreases as the universe ages. However, there is still hope in the search for life, as many host stars of exoplanets have similar ages to our Sun, which is known to support complex and diverse lifeforms on at least one of its planets.

    Max Planck Society (Max-Planck-Gesellschaft)

  • New research may hold key to better treatments for aggressive brain cancer


    BYLINE: Sarina Gleason

    Newswise — Southfield, Mich., April 18, 2023 – For decades, researchers have marveled at the ability of glioblastoma, a particularly aggressive brain cancer, to turn off a patient’s cancer-fighting immune cells, thereby allowing tumors to grow freely. This remains a primary reason why there are very few effective therapies available for this mostly fatal disease.

    In a new study using more than 100 patient-derived glioblastoma tumors, Prakash Chinnaiyan, M.D., a physician-scientist in the Department of Radiation Oncology at Corewell Health in Southeast Michigan, along with colleague and lead author Pravin Kesarwani, Ph.D., has discovered that a naturally produced chemical in the body is helping glioblastoma cells go unrecognized by a patient’s own immune cells, whose job it is to stop them.

    “Specifically, we found that glioblastoma cells work with a patient’s immune system to generate a chemical called quinolinate,” Dr. Chinnaiyan said. “In turn, this chemical ‘puts the brakes’ on the surrounding immune cells and prevents them from attacking the tumor.”

    The findings, published in Nature Communications, could be the key to unlocking new and more successful treatments down the road. While there have been some immunotherapy drugs that have been successful in reactivating a patient’s immune cells against certain cancers, none have worked on glioblastoma tumors and there are currently no available drugs to stop the production of quinolinate.

    As a result, the team took their research one step further and created a genetically engineered mouse model that could no longer produce quinolinate.

    “Tumors implanted in these mice grew significantly slower, and when analyzed in the lab, demonstrated robust immune activation, suggesting this pathway may serve as a new therapeutic target for glioblastoma,” Dr. Chinnaiyan said. “This provides a framework to design new immunotherapies that can target quinolinate accumulation associated with this disease and possibly others.”

    As for next steps, Dr. Chinnaiyan said the team will continue its research to help identify a compound that may be able to target quinolinate. The team also will look at extending the findings to other diseases such as Alzheimer’s and Parkinson’s, where quinolinate accumulation has been found to be a potential factor as well.

    “Our research may provide the key for better treatments and help save lives in the future, removing the mystery around glioblastoma and possibly other diseases,” Dr. Chinnaiyan said.

    Additional study authors from the Chinnaiyan Lab at Corewell Health included research scientists Shiva Kant, Ph.D., Yi Zhao, Ph.D., Antony Prabhu, Ph.D., and Katie Buelow as well as C. Ryan Miller, M.D., Ph.D., with the Heersink School of Medicine at the University of Alabama at Birmingham.

     

    About Corewell Health™

    People are at the heart of everything we do, and the inspiration for our legacy of outstanding outcomes, innovation, strong community partnerships, philanthropy and transparency. Corewell Health is a not-for-profit health system that provides health care and coverage with an exceptional team of 60,000+ dedicated people—including more than 11,500 physicians and advanced practice providers and more than 15,000 nurses providing care and services in 22 hospitals, 300+ outpatient locations and several post-acute facilities—and Priority Health, a provider-sponsored health plan serving more than 1.2 million members. Through experience and collaboration, we are reimagining a better, more equitable model of health and wellness. For more information, visit corewellhealth.org.

     

    Beaumont Health

  • Study Links Poor Diet to 14 Million Cases of Type 2 Diabetes Globally


    Newswise — A research model of dietary intake in 184 countries, developed by researchers at the Friedman School of Nutrition Science and Policy at Tufts University, estimates that poor diet contributed to over 14.1 million cases of type 2 diabetes in 2018, representing over 70% of new diagnoses globally. The analysis, which looked at data from 1990 and 2018, provides valuable insight into which dietary factors are driving type 2 diabetes burden by world region. The study was published April 17 in the journal Nature Medicine.

    Of the 11 dietary factors considered, three had an outsized contribution to the rising global incidence of type 2 diabetes: Insufficient intake of whole grains, excesses of refined rice and wheat, and the overconsumption of processed meat. Factors such as drinking too much fruit juice and not eating enough non-starchy vegetables, nuts, or seeds, had less of an impact on new cases of the disease.

    “Our study suggests poor carbohydrate quality is a leading driver of diet-attributable type 2 diabetes globally, and with important variation by nation and over time,” says senior author Dariush Mozaffarian, Jean Mayer Professor of Nutrition and dean for policy at the Friedman School. “These new findings reveal critical areas for national and global focus to improve nutrition and reduce devastating burdens of diabetes.”

    Type 2 diabetes is characterized by the resistance of the body’s cells to insulin. Of the 184 countries included in the Nature Medicine study, all saw an increase in type 2 diabetes cases between 1990 and 2018, representing a growing burden on individuals, families, and healthcare systems.

    The research team based their model on information from the Global Dietary Database, along with population demographics from multiple sources, global type 2 diabetes incidence estimates, and data on how food choices impact people living with obesity and type 2 diabetes from multiple published papers.  
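    Comparative-risk models of this kind are generally built around population attributable fractions; a hedged sketch of the standard formula (the paper’s exact estimation framework may differ) is:

    ```latex
    % Population attributable fraction for a single dietary factor
    % P  = proportion of the population with suboptimal intake of the factor
    % RR = relative risk of type 2 diabetes given suboptimal intake
    \mathrm{PAF} \;=\; \frac{P\,(RR - 1)}{P\,(RR - 1) + 1}
    ```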

    The analysis revealed that poor diet is causing a larger proportion of total type 2 diabetes incidence in men versus women, in younger versus older adults, and in urban versus rural residents at the global level.

    Regionally, Central and Eastern Europe and Central Asia (particularly Poland and Russia, where diets tend to be rich in red meat, processed meat, and potatoes) had the greatest number of type 2 diabetes cases linked to diet. Incidence was also high in Latin America and the Caribbean, especially in Colombia and Mexico, attributed to high consumption of sugary drinks and processed meat and low intake of whole grains.

    Regions where diet had less of an impact on type 2 diabetes cases included South Asia and Sub-Saharan Africa, though the largest increases in type 2 diabetes due to poor diet between 1990 and 2018 were observed in Sub-Saharan Africa. Of the 30 most populated countries studied, India, Nigeria, and Ethiopia had the fewest cases of type 2 diabetes related to unhealthy eating.

    “Left unchecked and with incidence only projected to rise, type 2 diabetes will continue to impact population health, economic productivity, health care system capacity, and drive health inequities worldwide,” says first author Meghan O’Hearn. She conducted this research while a PhD candidate at the Friedman School and currently works as Impact Director for Food Systems for the Future, a non-profit institute and for-profit fund that enables innovative food and agriculture enterprises to measurably improve nutrition outcomes for underserved and low-income communities. “These findings can help inform nutritional priorities for clinicians, policymakers, and private sector actors as they encourage healthier dietary choices that address this global epidemic.”

    Other recent studies have estimated that 40% of type 2 diabetes cases globally are attributable to suboptimal diet, lower than the 70% reported in the Nature Medicine paper. The research team attributes the difference to new information in their analysis, such as the first-ever inclusion of refined grains (one of the top contributors to the diabetes burden) and updated data on dietary habits based on national individual-level dietary surveys rather than agricultural estimates. The investigators also note that they presented the uncertainty of these new estimates, which can continue to be refined as new data emerge.

     

    Research reported in this article was supported by the Bill and Melinda Gates Foundation. Complete information on authors, funders, methodology, and conflicts of interest is available in the published paper. The content is solely the responsibility of the authors and does not necessarily represent the official views of the funders.

    Tufts University

  • New test could help identify type 2 diabetes risk

    Newswise — Analysing changes to DNA in the blood can improve the ability to predict a person’s risk of developing type 2 diabetes within a decade.

    Scientists looked at the influence of these changes – known as DNA methylation – alongside other risk factors in almost 15,000 people to predict the likelihood of developing the condition years before any symptoms appear.

    The findings could lead to preventative measures being put in place earlier, reducing the economic and health burden caused by type 2 diabetes.

    Methylation is a chemical process in the body in which a small molecule called a methyl group is added to DNA.

    Current risk prediction tools for type 2 diabetes use information such as age, sex, BMI and family history of the disease.

    Researchers from the University of Edinburgh found that the inclusion of DNA methylation data alongside these risk factors provided a more accurate prediction.

    The scientists used their results to estimate the predictive performance using a hypothetical screening scenario of 10,000 people, where one in three individuals develop type 2 diabetes over a 10-year period.

    The model that used DNA methylation correctly classed an extra 449 individuals compared with traditional risk factors alone.
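    To put that improvement in context, here is a small back-of-the-envelope script. The cohort size, the one-in-three incidence, and the 449 figure come from the article; the sensitivity and specificity values are purely hypothetical operating points chosen for illustration and do not reproduce the study’s models.

    ```python
    # Back-of-the-envelope view of the hypothetical screening scenario.
    cohort = 10_000
    cases = round(cohort / 3)        # one in three develop type 2 diabetes (~3,333)
    non_cases = cohort - cases
    extra_correct = 449              # reported gain when DNA methylation is added

    print(f"{extra_correct / cohort:.1%} of the screened cohort gains a correct label")

    def correctly_classified(sensitivity: float, specificity: float) -> float:
        """Correctly labelled people = true positives + true negatives."""
        return sensitivity * cases + specificity * non_cases

    # Purely hypothetical operating points, to show the kind of shift a gain of
    # this size implies (the printed difference lands in the ballpark of 449,
    # not an exact reproduction of the study's result).
    baseline = correctly_classified(0.60, 0.75)
    with_methylation = correctly_classified(0.66, 0.79)
    print(round(baseline), round(with_methylation), round(with_methylation - baseline))
    ```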

    The addition or removal of these methyl groups can affect how some molecules act in the body. These methylation patterns can help to track ageing processes and development of disease.

    Data came from 14,613 volunteers in the Generation Scotland study – a large study designed to help scientists investigate the causes of disease, understand the country’s healthcare priorities, and inform future medical treatments and health policies.

    The team also repeated the analyses in 1,451 individuals from a study based in Germany to ensure their findings could be replicated in people from different backgrounds.

    Type 2 diabetes is a serious condition in which the insulin the pancreas makes cannot work properly, or the pancreas cannot make enough insulin. This can lead to high blood sugar levels and, in turn, a range of health issues such as heart disease and stroke, nerve damage and foot problems.

    More than 4.9 million people live with diabetes in the UK, and around 90 per cent of them have type 2.

    The study is published in the journal Nature Aging: https://www.nature.com/articles/s43587-023-00391-4. Researchers from the University of Edinburgh were supported by experts at the University of Helsinki, the German Research Center for Environmental Health (GmbH) and the German Center for Diabetes Research (DZD).

    Yipeng Cheng, a PhD student from the University of Edinburgh’s Centre for Genomic and Experimental Medicine, said: “It is promising that our findings were observed in the Scottish and German studies with both showing an improvement in prediction above and beyond commonly used risk factors. Delaying onset is important as diabetes is a risk factor for other common diseases, including dementias.”

    The study’s principal investigator, Professor Riccardo Marioni, also from the University of Edinburgh’s Centre for Genomic and Experimental Medicine, said: “Similar approaches could be taken for other common diseases to generate broad health predictors from a single blood or saliva sample. We are incredibly grateful for our study volunteers who make this research possible – the more people that join our study, the more precisely we can identify signals that will help delay or reduce the onset of diseases as we age.”

    Generation Scotland is currently recruiting volunteers and has recently opened to young people aged between 12 and 15 for the first time. Anyone who lives in Scotland can sign up online at www.generationscotland.org

    University of Edinburgh

  • Blind dating in bacteria evolution

    Newswise — Proteins are the key players for virtually all molecular processes within the cell. To fulfil their diverse functions, they have to interact with other proteins. Such protein-protein interactions are mediated by highly complementary surfaces, which typically involve many amino acids that are positioned precisely to produce a tight, specific fit between two proteins. However, comparatively little is known about how such interactions are created during evolution.

    Classical evolutionary theory suggests that any new biological feature involving many components (like the amino acids that enable an interaction between proteins) evolves in a stepwise manner. According to this concept, each tiny functional improvement is driven by the power of natural selection because there is some benefit associated with the feature. However, whether protein-protein interactions also always follow this trajectory was not entirely known.

    Using a highly interdisciplinary approach, an international team led by Max Planck researcher Georg Hochberg at the Max Planck Institute for Terrestrial Microbiology in Marburg has now shed new light on this question. Their study provides definitive evidence that highly complementary and biologically relevant protein-protein interactions can evolve entirely by chance.

    Proteins cooperate in a photoprotection system

    The research team made their discovery in a biochemical system that microbes use to adapt to stressful light conditions. Cyanobacteria use sunlight to produce their own food through photosynthesis. Since too much light damages the cell, cyanobacteria have evolved a mechanism known as photoprotection: if light intensities become dangerously high, a light-intensity sensor named Orange Carotenoid Protein (OCP) changes its shape. In this activated form, OCP protects the cell by converting excess light energy into harmless heat. In order to return to its original state, some OCPs depend on a second protein: the Fluorescence Recovery Protein (FRP) binds to activated OCP1 and strongly accelerates its recovery.

    ‘Our question was: Is it possible that the surfaces that allow these two proteins to form a complex evolved entirely by accident, rather than through direct natural selection?’ says Georg Hochberg. ‘The difficulty is that the end result of both processes looks the same, so we usually cannot tell why the amino acids required for some interaction evolved – through natural selection for the interaction or by chance. To tell them apart, we would need a time machine to witness the exact moment in history these mutations occurred,’ Georg Hochberg explains.

    Luckily, recent breakthroughs in molecular and computational biology have equipped Georg Hochberg and his team with a laboratory version of a time machine: ancestral sequence reconstruction. In addition, the light-protection system of cyanobacteria, which has been studied for many years in the group of Thomas Friedrich at Technische Universität Berlin, is ideal for studying the evolutionary encounter of two protein components. Early cyanobacteria acquired the FRP proteins from a proteobacterium by horizontal gene transfer. The latter had no photosynthetic capacity itself and did not possess the OCP protein.

    To work out how the interaction between OCP1 and FRP evolved, graduate student Niklas Steube inferred the sequences of ancient OCPs and FRPs that existed billions of years ago and then resurrected them in the laboratory. After converting the inferred amino acid sequences into DNA, he produced the proteins in E. coli cells so that their molecular properties could be studied.
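    The release does not describe the gene-synthesis workflow beyond converting the inferred protein sequences into DNA. As a rough illustration of that single step, reverse translation simply maps each amino acid back to a codon; the sketch below uses one commonly used E. coli codon per residue and an invented example sequence (not an ancestral OCP or FRP), whereas real gene synthesis would optimise codon usage and add cloning elements.

    ```python
    # Minimal reverse-translation sketch: map an amino-acid sequence back to a DNA
    # coding sequence using one frequently used E. coli codon per residue.
    # Illustrative only; the example sequence is invented.

    PREFERRED_CODON = {
        "A": "GCG", "R": "CGT", "N": "AAC", "D": "GAT", "C": "TGC",
        "Q": "CAG", "E": "GAA", "G": "GGC", "H": "CAT", "I": "ATT",
        "L": "CTG", "K": "AAA", "M": "ATG", "F": "TTT", "P": "CCG",
        "S": "AGC", "T": "ACC", "W": "TGG", "Y": "TAT", "V": "GTG",
    }

    def reverse_translate(protein: str) -> str:
        """Return a DNA coding sequence (plus stop codon) for a protein sequence."""
        return "".join(PREFERRED_CODON[aa] for aa in protein.upper()) + "TAA"

    ancestral_fragment = "MKTAYIAKQR"   # made-up example, not a reconstructed protein
    print(reverse_translate(ancestral_fragment))
    # -> ATGAAAACCGCGTATATTGCGAAACAGCGTTAA
    ```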

    A fortunate coincidence

    The Berlin team then tested whether the ancient molecules could form an interaction. This way the scientists could retrace how both protein partners got to know each other. ‘Surprisingly, the FRP from the proteobacteria already matched the ancestral OCP of the cyanobacteria before gene transfer had even taken place. The mutual compatibility of FRP and OCP thus evolved completely independently in different species,’ says Thomas Friedrich. This allowed the team to show that the proteins’ ability to interact must have been a happy accident: selection could not plausibly have shaped the two proteins’ surfaces to enable an interaction if they had never met each other. This finally proved that such interactions can evolve entirely without direct selective pressure.

    ‘This may seem like an extraordinary coincidence,’ Niklas Steube says. ‘Imagine an alien spaceship landed on Earth and we found that it contained plug-shaped objects that perfectly fit into human-made sockets. But despite the perceived improbability, such coincidences could be relatively common. Proteins often encounter a large number of new potential interaction partners when localisation or expression patterns change within the cell, or when new proteins enter the cell through horizontal gene transfer.’ Georg Hochberg adds, ‘Even if only a small fraction of such encounters ends up being productive, fortuitous compatibility may be the basis of a significant fraction of all interactions we see inside cells today. Thus, as in human partnerships, a good evolutionary match could be the result of a chance meeting of two already compatible partners.’

    Max Planck Society (Max-Planck-Gesellschaft)

  • tRNA biomarkers for cancer diagnosis and prognosis enabled by new method

    Newswise — Ribonucleic acid (RNA) molecules are present in all living cells, with different types of RNA having different jobs. For example, messenger RNA is copied from DNA and carries instructions on how to make a protein. Transfer RNA (tRNA) links the mRNA sequence with its corresponding amino acid, ensuring that proteins are stitched together correctly as instructed by DNA. 

    Cells naturally modify RNA molecules in order to enhance their stability, structure and function. When this modification process goes wrong, it can have important consequences for human health and disease. In the case of tRNA, incorrect or missing modifications produce faulty or incomplete proteins, with the dysregulation of tRNA modifications being linked to various human diseases, including neurodegenerative diseases, metabolic diseases, and cancer. 

    tRNAs are “information-rich” molecules with huge potential for the diagnosis and prognosis of diseases, but they have not yet been exploited for this purpose because of a lack of methods that can capture this information in a quantitative and cost-efficient manner. For example, some types of cancers are difficult to diagnose because their symptoms are non-specific and can be confused with other conditions. At the same time, certain tRNA modification profiles are known to exist only in specific cancer types and can serve as highly specific biomarkers. 

    Being able to isolate tRNA molecules from blood samples and quantify their modifications can help diagnose cancers without the use of imaging tests or invasive biopsies. Furthermore, the type of tRNA modifications can change depending on the state of the disease, providing valuable information about the prognosis of the condition. 

    Current methods for measuring tRNA molecules typically involve techniques such as next-generation sequencing or mass spectrometry; however, these methods have limited use for diagnostic purposes because they are either unable to detect modifications or cannot identify where in the tRNA the modifications occur. 

    Researchers at the Centre for Genomic Regulation (CRG) in Barcelona have addressed this challenge by developing a new method that can measure both the abundance and modification of tRNA molecules in a single step. The method is called Nano-tRNAseq and is first described today in the journal Nature Biotechnology. 

    Nano-tRNAseq is based on nanopore sequencing, a technology that can directly sequence individual RNA molecules by passing them through a small pore. Each of the nucleotides that compose an RNA molecule has a slightly different size and shape, with a corresponding change in the electrical current that occurs as each nucleotide passes through the pore. Computer programs detect changes in the current to identify the sequence of the RNA molecules, including any modifications. As a proof of concept, the researchers used Nano-tRNAseq to accurately measure tRNA abundances and modifications in samples taken from yeast cells exposed to different environmental conditions. 
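    The article does not detail how modifications are called from the raw current signal, and the sketch below is not the Nano-tRNAseq pipeline. It only illustrates the generic idea behind many nanopore modification-detection approaches: compare the per-position current in a sample against an unmodified control and flag positions where the distributions diverge. All signals here are simulated.

    ```python
    # Toy sketch of current-level comparison for modification detection.
    # Not the Nano-tRNAseq method; simulated data, generic statistical test.

    import numpy as np
    from scipy.stats import ks_2samp

    rng = np.random.default_rng(0)
    n_positions, n_reads = 76, 200          # tRNA-like length, simulated read depth

    # Simulated per-read, per-position mean current (in pA) for control and sample.
    control = rng.normal(loc=90.0, scale=4.0, size=(n_reads, n_positions))
    sample = rng.normal(loc=90.0, scale=4.0, size=(n_reads, n_positions))
    sample[:, 34] += 6.0                    # pretend position 34 carries a modification

    # Flag positions where the sample's current distribution differs from the control.
    flagged = [
        pos for pos in range(n_positions)
        if ks_2samp(control[:, pos], sample[:, pos]).pvalue < 1e-6
    ]
    print("putative modified positions:", flagged)   # expected: [34]
    ```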

    The method has significant advantages over conventional techniques. “For the first time, we can study both tRNA abundance and tRNA modification profiles simultaneously. As a bonus, the method is rapid, cost-effective, high-throughput, and has single-molecule resolution. Previously, we relied on two separate methods that, together, are less informative, and it would take weeks and cost thousands of euros to obtain results. Nano-tRNAseq is a fraction of the cost, and we can have results within a couple of days, and in the near future, within a few hours” says Morghan Lucas, PhD candidate at the Centre for Genomic Regulation and first author of the study.  

    The rapid data analysis enabled by the method is critical for clinical decision-making. Another advantage is that the nanopore sequencing machines required for the technique are small, lightweight and can be powered by a laptop or portable battery, making them easy to transport to remote locations and enabling use in the field or the clinic. 

    The researchers note there are still some limitations to the new method, such as the inability to predict which tRNA modification is dysregulated in a given sample unless the precise modifications found in that tRNA have been previously identified using other experimental methods. “While tRNA modification profiles of lower eukaryotic species, such as yeast, are well characterized, this is not the case for humans. By using Nano-tRNAseq in parallel with other methods, we can describe the modification profiles of the complete set of human tRNAs and, in the future, use Nano-tRNAseq to identify which changes in tRNAs are associated with a given human disease,” adds Morghan Lucas. 

    The method was developed by Dr. Eva Novoa’s research group at the Centre for Genomic Regulation (CRG). Dr. Novoa plans on using the technology to further her research efforts funded by the Spanish Association Against Cancer (AECC). 

    “tRNA molecules can be cleaved into small but stable RNA fragments which circulate in blood plasma. These molecules are typically altered in cancer patients, and are hugely information-rich for diagnostic and prognostic purposes. Nano-tRNAseq is a proof-of-concept technology that paves the way for the development of a simple, cost-effective and highly-precise method that can quantify these molecules in a non-invasive manner. Our aim is to further develop this technology and combine it with artificial intelligence tools to determine the malignancy of a biological sample in less than 3 hours, and at a cost of no more than 50 euros per sample” says Dr. Eva Novoa, senior author of the study and researcher at the Centre for Genomic Regulation. 

    The study was funded by the Spanish Ministry of Economy, Industry, and Competitiveness and a European Research Council Starting Grant. Collaborators include the Institute for Research in Biomedicine (IRB) in Barcelona and the CNRS-Université de Lorraine in Nancy, France. 

    Center for Genomic Regulation

  • Hidden ice melt in Himalaya: Study

    Newswise — A new study reveals that the mass loss of lake-terminating glaciers in the greater Himalaya has been significantly underestimated, due to the inability of satellites to see glacier changes occurring underwater, with critical implications for the region’s future projections of glacier disappearance and water resources.

    Published in Nature Geoscience on April 3, the study was conducted by an international team including researchers from the Chinese Academy of Sciences (CAS), Graz University of Technology (Austria), the University of St. Andrews (UK), and Carnegie Mellon University (USA).

    The researchers found that a previous assessment underestimated the total mass loss of lake-terminating glaciers in the greater Himalaya by 6.5%. The most significant underestimation of 10% occurred in the central Himalaya, where glacial lake growth was the most rapid. A particularly interesting case is Galong Co in this region, with a high underestimation of 65%.

    This oversight was largely due to the limitations of satellite imaging in detecting underwater changes, which has led to a knowledge gap in our understanding of the full extent of glacier loss. From 2000 to 2020, proglacial lakes in the region increased by 47% in number, 33% in area, and 42% in volume. This expansion resulted in an estimated glacier mass loss of around 2.7 Gt, equivalent to the mass of 570 million elephants, or over 1,000 times the total number of elephants in the world. This loss was not captured by previous studies because the satellite data used can measure only the lake surface, not the underwater ice that has been replaced by water.
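    The arithmetic behind the “hidden” loss is straightforward: ice that has been replaced by lake water is invisible to surface-based surveys, so the missing mass is roughly the subaqueous ice volume multiplied by the density of ice. The quick sanity check below back-calculates an ice volume from the quoted 2.7 Gt (this volume is an illustration, not a figure from the paper) and unpacks the elephant comparison.

    ```python
    # Quick sanity check of the quoted figures (illustrative arithmetic only).

    ICE_DENSITY = 917.0          # kg per cubic metre
    hidden_mass_gt = 2.7         # reported hidden mass loss, gigatonnes

    # Volume of ice corresponding to 2.7 Gt (1 Gt = 1e12 kg; 1 km^3 = 1e9 m^3).
    ice_volume_km3 = hidden_mass_gt * 1e12 / ICE_DENSITY / 1e9
    print(f"~{ice_volume_km3:.1f} km^3 of subaqueous ice")    # ~2.9 km^3

    # The elephant comparison: 2.7 Gt spread over 570 million elephants.
    per_elephant_tonnes = hidden_mass_gt * 1e9 / 570e6
    print(f"~{per_elephant_tonnes:.1f} tonnes per elephant")  # ~4.7 t, a large elephant
    ```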

    “These findings have important implications for understanding the impact of regional water resources and glacial lake outburst floods,” said lead author ZHANG Guoqing from the Institute of Tibetan Plateau Research, CAS.

    By accounting for the mass loss from lake-terminating glaciers, the researchers can more accurately assess the annual mass balance of these glaciers compared to land-terminating ones, thus further highlighting the accelerated glacier mass loss across the greater Himalaya.

    The study also highlights the need to understand the mechanisms driving glacier mass loss and the underestimated mass loss of lake-terminating glaciers globally, which is estimated to be around 211.5 Gt, or roughly 12%, between 2000 and 2020.

    “This emphasizes the importance of incorporating subaqueous mass loss from lake-terminating glaciers in future mass-change estimates and glacier evolution models, regardless of the study region,” said co-corresponding author Tobias Bolch from Graz University of Technology.

    David Rounce, a co-author from Carnegie Mellon University, noted that in the long run, mass loss from lake-terminating glaciers may remain a major contributor to total mass loss throughout the 21st century, as glaciers with significant mass loss may disappear more rapidly than existing projections suggest.

    “By more accurately accounting for glacier mass loss, researchers can better predict future water resource availability in the sensitive mountain region,” said co-author YAO Tandong, who also co-chairs Third Pole Environment (TPE), an international science program for interdisciplinary study of the relationships among water, ice, climate, and humankind in the region and beyond.

    Chinese Academy of Sciences

  • Illegal trade and poor regulation threaten pangolins in China

    Newswise — Pangolins, unique scale-covered mammals, are drastically declining in numbers across Asia and Africa, largely due to illegal trade. Part of the trade, both legal and illegal, supports the traditional Chinese medicine market, which has attracted conservation attention. The level of demand for pangolins and other animals in traditional Chinese medicine, however, hasn’t been thoroughly studied.

    In a new study published in the journal Nature Conservation, Dr Yifu Wang, currently a postdoctoral researcher at the University of Hong Kong, investigated the pangolin scale trade in China, interviewing staff in hospitals and pharmaceutical shops in two provinces (Henan and Hainan). Between October 2016 and April 2017, she and her team talked to doctors from 41 hospitals and to shop owners and assistants from 134 pharmaceutical shops.

    The research found pangolin scales and their derivatives were widely available in hospitals and pharmaceutical shops across Henan and Hainan Provinces. The legislation in place, however, has not been able to prevent ongoing illegal trade in pangolin products. Her team found that 46% of surveyed hospitals and 34% of surveyed pharmaceutical shops were selling pangolin scale products illegally.

    “Existing legal trade allows 711 hospitals to sell pangolin products as medicine with regulations on manufacturer, package, and national annual sale quantity,” explains Dr Yifu Wang. “However, we show that pangolin scales are under heavy demand and unpermitted sellers are commonly found illegally selling pangolin products.”

    “Quantities of products traded by permitted legal sellers are estimated to greatly exceed the supply capacity of legal sources,” she continues.

    This widespread illegal trade, coupled with the very limited legal supply capacity compared with market demand, is concerning. The researchers point to the urgent need to reduce demand for pangolin scales from traditional Chinese medicine and to revise the current legal pangolin scale trade system.

    “We also highlight the importance of incorporating the traditional Chinese medicine sector into combating illegal wildlife trade and species conservation beyond pangolins,” they conclude.

    The researchers plan to continue investigating the pangolin scale market in China to understand the trade after COVID-19.

     

    Original source:

    Wang Y, Turvey ST, Leader-Williams N (2023) The scale of the problem: understanding the demand for medicinal pangolin products in China. Nature Conservation 52: 47-61. https://doi.org/10.3897/natureconservation.52.95916

    Pensoft Publishers

  • A healthy microbiome may prevent deadly infections in critically ill people

    Newswise — Twenty to 50 per cent of all critically ill patients contract potentially deadly infections during their stay in the intensive care unit or in hospital after being in the ICU – markedly increasing the risk of death.

    “Despite the use of antibiotics, hospital-acquired infections are a major clinical problem that persists to be a huge issue for which we don’t have good solutions,” says Dr. Braedon McDonald, MD, PhD, an intensive care physician at the Foothills Medical Centre (FMC) and assistant professor at the Cumming School of Medicine (CSM). “We tackled this issue from a different angle. We looked at the body’s natural defense to infection to better understand why some people are more susceptible to these deadly infections.”

    The study involved 51 patients newly admitted to the intensive care unit (ICU) at FMC. Patients were studied over the first week of acute critical illness. The research showed that the gut microbiota and systemic immunity work together as a dynamic “metasystem,” in which problems with gut microbes and immune system dysfunction are associated with significantly increased rates of hospital-acquired infections.

    “The signal that we’ve seen in our research is that a family of bacteria that naturally live in the gut seems to be important for directing the immune system,” says Jared Schlechte, PhD candidate in McDonald’s lab and first author of the study. “However, during critical illness the microbiome becomes injured, allowing these bacteria to start taking over.”

    The study, published in Nature Medicine, found that patients who experienced an abnormal increase in the growth of these common bacteria, called a bloom, were at the highest risk of severe infections.

    “This information is important because it gives us a whole new avenue to start thinking about not just ways to treat infections, but a potential treatment to prevent them,” says McDonald. “The findings suggest that if we want to fight infection, we can’t just target these bad bacteria in isolation and the immune system in isolation. We really need to have a more holistic view of how things are functioning.”

    As a next step, McDonald and the team plan to launch a randomized, controlled clinical trial based on a precision-medicine approach that borrows from probiotic therapy and uses multiple different bacteria engineered to specifically target the bacteria identified in the study. People who agree to participate will be given engineered microbiomes.

    “What we’re trying to do is restore the normal mechanisms that work when we’re healthy, and take advantage of that to help protect people from infections,” McDonald says.

    UCalgary faculty co-authors included Drs. Christopher Doig, MD, Kathy McCoy, PhD, and Mary Dunbar, MD. PhD candidate Amanda Zucoloto, along with research technician and laboratory manager Ian-Ling Yu, also co-authored the study. The study was supported by the Canadian Institutes of Health Research and the Alberta Health Services Critical Care Strategic Clinical Network.

    Braedon McDonald is an assistant professor in the Department of Critical Care Medicine at the Cumming School of Medicine (CSM), an intensive care physician at the Foothills Medical Centre, and a member of the Snyder Institute for Chronic Diseases at CSM.

    The Snyder Institute for Chronic Diseases is a team of more than 480 clinician-scientists and basic scientists dedicated to uncovering new knowledge leading to disease prevention, tailored medical applications and ultimately cures for those with chronic and infectious disease. Visit snyder.ucalgary.ca and follow @SnyderInstitute to learn more.

    University of Calgary

  • Stones for the climate

    Newswise — If a cook varies the amounts of ingredients when preparing a dish, a completely new taste is created. It is much the same with the binding of CO2 in the sea – a change in the substances in the water changes everything. Alkalinity, i.e. the water’s acid-binding capacity, is created by the weathering of rocks and the entry of the weathering products into the ocean. Increased erosion on land causes an increase in the weathering of silicates and carbonates. Using their model, the researchers identified the factors that drive higher alkalinity: degree of erosion, area fraction of carbonate, temperature, catchment size, and soil thickness.

    Method and influencing factors

    “The model we used is a statistical, not a mechanistic model. We applied it to identify the factors influencing alkalinity based on our compiled data set and to describe their interdependencies,” says Nele Lehmann of the Hereon Institute for Carbon Cycles, lead author of the study, which was an international collaboration with the Alfred Wegener Institute Helmholtz-Zentrum für Polar- und Meeresforschung (AWI) and funding from the Deutscher Akademischer Austauschdienst (DAAD).
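    The release names the predictors but not the exact form of the statistical model. As a hedged illustration only, a relationship of this kind can be sketched as a multiple regression of catchment alkalinity on the listed factors; everything below (catchment count, units, coefficients, data) is simulated and stands in for the study’s actual data set and method.

    ```python
    # Rough sketch of a statistical (regression-style) model relating alkalinity to
    # the factors named in the article. All data are simulated; this is not the
    # study's model or data set.

    import numpy as np

    rng = np.random.default_rng(1)
    n = 150  # hypothetical number of catchments

    # Simulated predictors: erosion rate, carbonate area fraction, mean temperature,
    # catchment size, and soil thickness (arbitrary units).
    X = np.column_stack([
        rng.lognormal(mean=0.0, sigma=1.0, size=n),   # erosion
        rng.uniform(0.0, 1.0, size=n),                # carbonate fraction
        rng.normal(10.0, 5.0, size=n),                # temperature
        rng.lognormal(mean=5.0, sigma=1.5, size=n),   # catchment size
        rng.uniform(0.1, 2.0, size=n),                # soil thickness
    ])

    # Simulated "true" relationship plus noise, then an ordinary least-squares fit.
    true_coef = np.array([0.8, 2.5, 0.05, 1e-4, 0.3])
    alkalinity = X @ true_coef + rng.normal(0.0, 0.5, size=n)

    design = np.column_stack([np.ones(n), X])              # add intercept column
    coef, *_ = np.linalg.lstsq(design, alkalinity, rcond=None)
    print("fitted intercept and coefficients:", np.round(coef, 3))
    ```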

    If warming progresses slowly, alkalinity would drop by up to 68 percent by 2100, depending on the watershed. That means the ocean’s ability to sequester CO2 would decrease significantly. Rapidly progressing warming, on the other hand, would lead to higher temperatures and thus more precipitation in temperate climate zones; this would increase alkalinity by up to 33 percent. “But that doesn’t mean that more emissions are good for the climate. The impact of alkalinity is small compared to the amounts of man-made CO2 emitted around the world. The process of weathering unfolds its effects over much longer periods of time,” Lehmann said.

    Climate change is greatly accelerating the interplay of carbon cycling and weathering that is fundamental to the development of life. The team first looked for existing data, with the goal of finding as many alkalinity measurements as possible in the immediate vicinity of erosion measurement sites. To do this, the researchers searched databases and publications, and also took samples themselves. They then investigated the alkalinity factors using their new model. The biggest limitation: the erosion-rate measurements the researchers relied on are complex and expensive, and often cover only around 20 years. This made it difficult to assemble the data set. In the higher latitudes in particular there are hardly any measurements, so the study is limited to the mid-latitudes.

    New questions in the Arctic

    Next, Lehmann would like to investigate alkalinity and the erosion rate in the Arctic. There, the data situation is patchy, and climate change is clearly noticeable, so the biggest change in alkalinity flux could potentially occur there. Of particular importance is whether erosion itself is changing as a result of climate change.

    Helmholtz-Zentrum Hereon