ReportWire

Tag: All Journal News

  • Vaccination status, mortality among intubated patients with COVID-19–related acute respiratory distress syndrome

    About The Study: Full vaccination status compared with controls was associated with lower mortality among critically ill patients who required invasive mechanical ventilation owing to COVID-19–related acute respiratory distress syndrome in this study including 265 patients. These results may inform discussions with families about prognosis.

    Authors: Ilias I. Siempos, M.D., D.Sc., of the National and Kapodistrian University of Athens Medical School in Athens, Greece, is the corresponding author.

    To access the embargoed study: Visit our For The Media website at this link https://media.jamanetwork.com/  

    (doi:10.1001/jamanetworkopen.2022.35219)

    Editor’s Note: Please see the article for additional information, including other authors, author contributions and affiliations, conflict of interest and financial disclosures, and funding and support.

    #  #  #

    Embed this link to provide your readers free access to the full-text article. This link will be live at the embargo time: http://jamanetwork.com/journals/jamanetworkopen/fullarticle/10.1001/jamanetworkopen.2022.35219?utm_source=For_The_Media&utm_medium=referral&utm_campaign=ftm_links&utm_term=100722

    About JAMA Network Open: JAMA Network Open is the new online-only open access general medical journal from the JAMA Network. On weekdays, the journal publishes peer-reviewed clinical research and commentary in more than 40 medical and health subject areas. Every article is free online from the day of publication.

    JAMA – Journal of the American Medical Association

  • Widespread metabolic dysregulation in different organs in type 2 diabetes

    Newswise — The most typical alterations in people with type 2 diabetes are insufficient secretion of insulin and reduced sensitivity to insulin in different organs. To examine what happens in these organs as type 2 diabetes develops, the researchers in the current study looked at proteins both in the pancreatic islets, where insulin is produced, and in the main tissues that insulin acts on, namely the liver, skeletal muscle, fat and blood.

    The researchers compared proteins in samples from people with type 2 diabetes, people with prediabetes (a stage before fully developed type 2 diabetes) and people without diabetes. The results showed far more disturbances in metabolic pathways than previously known, and the alterations correlated with the different stages of the disease.

    “We detected many protein levels that were either higher or lower than normal in tissues from people at different stages of disease. People with prediabetes displayed major alterations that are associated with inflammation, coagulation and the immune system in the pancreatic islets. In fully developed type 2 diabetes there were more widespread abnormalities, for example in lipid and glucose metabolism and in energy production in the liver, muscle and fat,” says Professor Claes Wadelius, who coordinated the study.

    The study builds on tissue samples collected from donors at different stages of disease and healthy individuals. The samples have been collected in the strategic initiative EXODIAB, which is led in Uppsala by Professor Olle Korsgren.

    Using novel techniques, the researchers could quantify thousands of proteins from each organ and therefore obtain a view of the metabolism that has not been possible before.
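
The comparison described above can be illustrated with a short sketch. This is not the study's statistical pipeline; it is a generic differential-abundance calculation on invented numbers (the protein names, sample values and 1.5-fold threshold are all illustrative), showing how protein levels "higher or lower than normal" can be flagged between groups.

```python
def flag_dysregulated(protein_levels_control, protein_levels_disease, fold_threshold=1.5):
    """Flag proteins whose mean abundance differs between two groups.

    Inputs map protein name -> list of abundances, one value per sample.
    Returns {protein: "up" | "down"} for proteins beyond the threshold.
    """
    flags = {}
    for name, control_vals in protein_levels_control.items():
        disease_vals = protein_levels_disease[name]
        control_mean = sum(control_vals) / len(control_vals)
        disease_mean = sum(disease_vals) / len(disease_vals)
        ratio = disease_mean / control_mean
        if ratio >= fold_threshold:
            flags[name] = "up"          # higher than normal in disease
        elif ratio <= 1 / fold_threshold:
            flags[name] = "down"        # lower than normal in disease
    return flags

# Hypothetical abundances for three proteins, three samples per group.
control  = {"INS": [10, 11, 9], "ALB": [5, 5, 5], "CRP": [1, 1, 1]}
diabetic = {"INS": [4, 5, 3],   "ALB": [5, 6, 4], "CRP": [3, 2, 4]}
flags = flag_dysregulated(control, diabetic)   # {'INS': 'down', 'CRP': 'up'}
```

A real analysis would add statistical testing and multiple-comparison correction; the sketch only shows the fold-change logic.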

    “The techniques for measuring proteins have evolved rapidly in recent years, and our colleagues at the University of Copenhagen who participated in the study are world leaders in the field,” says Dr Klev Diamanti, who performed the analyses in Uppsala together with Associate Professor Marco Cavalli and Professor Jan Eriksson.

    In summary, the findings show a highly disturbed metabolism in different pathways in examined organs and at different stages of disease. The data points to new potentially causal mechanisms of the disease, which can be further investigated in the search for new ways of preventing or treating type 2 diabetes.

    “Our results may also support the development of simple tests that can identify people at high risk of diabetes and its complications, and also guide which type of intervention is best for the individual,” says clinical diabetologist Jan Eriksson.

    Uppsala University

  • How the mother’s mood influences her baby’s ability to speak

    Newswise — Up to 70 percent of mothers develop postnatal depressive mood, also known as baby blues, after their baby is born. Analyses show that this can also affect the development of the children themselves and their speech. Until now, however, it was unclear exactly how this impairment manifests itself in early language development in infants.

    In a study, scientists at the Max Planck Institute for Human Cognitive and Brain Sciences in Leipzig have now investigated how well babies can distinguish speech sounds from one another depending on their mother’s mood. This ability is considered an important prerequisite for the further steps towards well-developed language: if sounds can be distinguished from one another, individual words can also be distinguished from one another. It became clear that if mothers indicate a more negative mood two months after birth, their children show, on average, less mature processing of speech sounds at the age of six months. The infants found it particularly difficult to distinguish between syllable pitches. Specifically, the development of their so-called Mismatch Response was more delayed than in infants whose mothers were in a more positive mood. The Mismatch Response serves as a measure of how well someone can separate sounds from one another; a delay in developing a pronounced Mismatch Response is considered an indication of an increased risk of suffering from a speech disorder later in life.

    “We suspect that the affected mothers use less infant-directed speech,” explains Gesa Schaadt, postdoc at MPI CBS, professor of development in childhood and adolescence at FU Berlin and first author of the study, which has now appeared in the journal JAMA Network Open. “They probably use less pitch variation when directing speech to their infants.” This also leads to a more limited perception of different pitches in the children, she said. This perception, in turn, is considered a prerequisite for further language development.

    The results show how important infant-directed speech is for children’s further language development. Speech that varies greatly in pitch and emphasizes certain parts of words more clearly, thereby focusing the little ones’ attention on what is being said, is considered appropriate for infants. Mothers who suffer from depressive mood, in turn, often use more monotonous, less infant-directed speech. “To ensure the proper development of young children, appropriate support is also needed for mothers who suffer from mild upsets that often do not yet require treatment,” Schaadt says. That does not necessarily mean organized intervention measures. “Sometimes it just takes the fathers to be more involved.”

    The researchers investigated these relationships with the help of 46 mothers who reported different moods after giving birth. Their moods were measured using a standardized questionnaire typically used to diagnose postnatal upset. They also used electroencephalography (EEG), which helps to measure how well babies can distinguish speech sounds from one another. The so-called Mismatch Response is used for this purpose, in which a specific EEG signal shows how well the brain processes and distinguishes between different speech sounds. The researchers recorded this reaction in the babies at the ages of two and six months while they were presented with various syllables such as “ba,” “ga” and “bu.”
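
The Mismatch Response computation itself is conceptually simple. The sketch below is a minimal, generic version (not the MPI CBS analysis pipeline, and the voltage values are invented): EEG epochs time-locked to the frequent "standard" syllable and the rare "deviant" syllable are averaged separately, and the difference wave between the two averages is the mismatch response.

```python
def mismatch_response(standard_epochs, deviant_epochs):
    """Compute a difference wave: average(deviant) - average(standard).

    Each epoch is a list of voltage samples time-locked to syllable onset.
    A clearer difference wave indicates better discrimination of the rare
    syllable from the frequent one.
    """
    def average(epochs):
        return [sum(samples) / len(epochs) for samples in zip(*epochs)]

    standard_erp = average(standard_epochs)
    deviant_erp = average(deviant_epochs)
    return [d - s for d, s in zip(deviant_erp, standard_erp)]

# Invented toy data: three epochs per condition, four samples per epoch
# (e.g. responses to a frequent "ba" versus a rare "ga").
standard = [[0.0, 1.0, 1.0, 0.0],
            [0.0, 1.2, 0.8, 0.0],
            [0.0, 0.8, 1.2, 0.0]]
deviant  = [[0.0, 2.0, 2.2, 0.1],
            [0.0, 2.2, 1.8, 0.0],
            [0.0, 1.8, 2.0, -0.1]]
mmr = mismatch_response(standard, deviant)   # difference wave per time point
```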

    Max Planck Institute for Human Cognitive and Brain Sciences

  • Study: Novel Imaging Technique Reveals Excellent Biologic Fixation in Cementless Knee Replacement

    Newswise — Cementless knee replacement, an alternative approach to the traditional surgery in which bone cement is used, is gaining interest among orthopedic surgeons. Using a novel MRI technique, researchers at Hospital for Special Surgery (HSS) found that a cementless implant demonstrated excellent biologic fixation, and even improved fixation of implant components in some areas in the joint, compared to the standard cemented implant.

    HSS hip and knee surgeon Geoffrey Westrich, MD, and colleagues in the HSS Radiology Department used an advanced imaging technique known as “multi-acquisition variable-resonance image combination selective MRI” to assess fixation in patients who had a cementless knee replacement compared to those whose implant was affixed with bone cement.

    “The purpose of our study was to quantify and compare the fixation of uncemented versus cemented knee replacement components,” said Dr. Westrich, lead investigator. “At an average patient follow-up of 16 months, our study demonstrated robust fixation of the cementless knee replacement components, with results comparable to the cemented total knee replacements. And while there was no clinically significant difference regarding overall fixation in the knee, there were some component areas in which cementless fixation appeared to be superior.” The study was published in the October edition of the journal Arthroplasty Today.  

    The HSS researchers performed MRIs in 20 patients who had a cementless knee replacement. A matched control group of 20 patients with a cemented knee replacement was also evaluated. The images were reviewed by a fellowship-trained musculoskeletal radiologist specializing in the interpretation of joint replacement MRI, with more than 20 years of experience in assessing bony fixation of knee replacement components.

    In a traditional knee replacement, implant components are secured in the joint using bone cement. It’s a tried-and-true technique that has worked well for decades. But eventually, over time, the cement may start to loosen from the bone and/or the implant. This loosening is the leading cause of revision surgery, in which a patient needs a second knee replacement.

    “With the cementless prosthesis, the components are press fit into place for biologic fixation, which basically means that the bone will grow into the implant,” explains Dr. Westrich, who believes a well-designed cementless implant will make loosening over time less likely. This could enable a total knee replacement to last much longer, a particular concern for younger patients.

    “Overall, traditional knee replacement offers excellent outcomes and longevity,” he says. “However, younger patients generally put more demands on their joint, causing more wear and tear and potential loosening. The cemented knee implant used in a traditional joint replacement usually lasts 15 to 20 years.”

    Cementless implants have been used successfully in total hip replacement surgery for many years. It has been much more challenging to develop a cementless prosthesis that would work well in the knee because of its particular anatomy, Dr. Westrich explains.

    “Early generation cementless implants had numerous design flaws resulting in loosening and poor survivorship compared to cemented knee replacements,” he says. “More contemporary cementless knee components such as those used in our study utilize highly porous surfaces to promote biologic fixation of the prosthesis. This should improve outcomes.”

    Candidates for the cementless procedure are generally patients under age 70 with good bone quality to promote biologic fixation. In addition to younger patients, Dr. Westrich notes that the cementless implant may prove to be a good option for very overweight patients who tend to put more stress on their joint replacement.

    “While our study found that early fixation of cementless total knee components is comparable, if not superior, to that of cemented total knee replacements, further study with a larger number of patients over a lengthier time period is needed to assess long-term durability and fixation.”

    Disclosure: Research support received from Stryker Corporation. 

     

    Geoffrey Westrich, MD

  • Detecting Alzheimer’s disease in the blood

    Newswise — Researchers from Hokkaido University and Toppan have developed a method to detect build-up of amyloid β in the brain, a characteristic of Alzheimer’s disease, from biomarkers in blood samples.

    Alzheimer’s disease is a neurodegenerative disease, characterised by a gradual loss of neurons and synapses in the brain. One of the primary causes of Alzheimer’s disease is the accumulation of amyloid β (Aβ) in the brain, where it forms plaques. Alzheimer’s disease is mostly seen in individuals over 65 years of age, and cannot currently be stopped or reversed. Thus, Alzheimer’s disease is a major concern for nations with ageing populations, such as Japan.

    A team of scientists from Hokkaido University and Toppan, led by Specially Appointed Associate Professor Kohei Yuyama at the Faculty of Advanced Life Science, Hokkaido University, has developed a biosensing technology that can detect Aβ-binding exosomes in the blood of mice, which increase in number as Aβ accumulates in the brain. Their research was published in the journal Alzheimer’s Research & Therapy.

    When tested on mouse models, the Aβ-binding exosome Digital ICA™ (idICA) showed that the concentration of Aβ-binding exosomes increased with the age of the mice. This is significant as the mice used were Alzheimer’s disease model mice, in which Aβ builds up in the brain with age.

    In addition to the lack of effective treatments for Alzheimer’s, there are few methods to diagnose it. Alzheimer’s can only be definitively diagnosed by direct examination of the brain, which can only be done after death. Aβ accumulation in the brain can be measured by cerebrospinal fluid testing or by positron emission tomography; however, the former is an extremely invasive test that cannot be repeated, and the latter is quite expensive. Thus, there is a need for a diagnostic test that is economical, accurate and widely available.

    Previous work by Yuyama’s group has shown that Aβ build-up in the brain is associated with Aβ-binding exosomes secreted from neurons, which degrade and transport Aβ to the microglial cells of the brain. Exosomes are membrane-enclosed sacs secreted by cells that possess cell markers on their surface. The team adapted Toppan’s proprietary Digital Invasive Cleavage Assay (Digital ICA™) to quantify the concentration of Aβ-binding exosomes in as little as 100 µL of blood. The device they developed traps molecules and particles in a sample one by one in one million micrometer-sized wells on a measurement chip and detects the presence or absence of fluorescent signals emitted by the cleaving of the Aβ-binding exosomes.
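
The "digital" part of such an assay, in which each well is read only as fluorescent or not, lends itself to a simple back-of-the-envelope calculation. The sketch below is not Toppan's implementation; it is the standard digital-assay estimate (the well count, positive count and sample volume are invented), showing how a fraction of positive wells maps to a particle concentration via Poisson statistics.

```python
import math

def digital_assay_concentration(positive_wells, total_wells, sample_volume_ul):
    """Estimate particle concentration from a yes/no well readout.

    If particles land in wells at random, occupancy is Poisson-distributed:
    P(well empty) = exp(-lam), where lam is the mean number of particles
    per well. Inverting that gives lam from the observed negative fraction.
    """
    if positive_wells >= total_wells:
        raise ValueError("saturated chip: every well is positive")
    fraction_negative = 1 - positive_wells / total_wells
    lam = -math.log(fraction_negative)          # mean particles per well
    total_particles = lam * total_wells
    return total_particles / sample_volume_ul   # particles per microliter

# Invented example: 50,000 of 1,000,000 wells fluoresce for a 100 µL sample.
conc = digital_assay_concentration(50_000, 1_000_000, 100)
```

The Poisson correction matters at high occupancy, where one well may hold more than one particle; at low occupancy the estimate is close to simply counting positive wells.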

    Clinical trials of the technology are currently underway in humans. This highly sensitive idICA technology is the first application of ICA that enables highly sensitive detection of exosomes that retain specific surface molecules from a small amount of blood without the need to learn special techniques; as it is applicable to exosome biomarkers in general, it can also be adapted for use in the diagnosis of other diseases.

    Hokkaido University

  • Opening the eye of the storm

    Newswise — For the first time, high-energy muon particles created in the atmosphere have allowed researchers to explore the structures of storms in a way that traditional visualization techniques, such as satellite imaging, cannot. The detail offered by this new technique could aid researchers modeling storms and related weather effects. This could also lead to more accurate early warning systems.

    It’s hard not to notice the number of stories in the news about heavy storms in different parts of the world, often attributed to climate change. Weather prediction and early warning systems have always been important, but with increased storm activity it seems especially so these days. A team of researchers, led by Professor Hiroyuki Tanaka from Muographix at the University of Tokyo, offer the world of meteorology a novel way of detecting and exploring tropical cyclones using a quirk of particle physics that takes place above our heads all the time.

    “You’ve probably seen photographs of cyclones taken from above, showing swirling vortices of clouds. But I doubt you’ve ever seen a cyclone from the side, perhaps as a computer graphic, but never as actual captured sensor data,” said Tanaka. “What we offer the world is the ability to do just this, visualize large-scale weather phenomena like cyclones from a 3D perspective, and in real time too. We do this using a technique called muography, which you can think of like an X-ray, but for seeing inside truly enormous things.”

    Muography creates X-ray-like images of large objects, including volcanoes, the pyramids, bodies of water, and now, for the first time, atmospheric weather systems. Special sensors called scintillators are joined together to make a grid, a little like the pixels on your smartphone’s camera sensor. However, these scintillators don’t see optical light, but instead see particles called muons which are created in the atmosphere when cosmic rays from deep space collide with the atoms in the air. Muons are special because they pass through matter easily without scattering as much as other types of particles. But the small amount they do deviate by as they pass through solid, liquid, or even gaseous matter, can reveal details of their journey between the atmosphere and the sensors. By capturing a large number of muons passing through something, an image of it can be reconstructed.
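
As a rough illustration of the imaging idea (not the team's actual processing chain, and with invented counts): comparing the muon count each detector pixel records through the object with the count it would record through open sky gives an attenuation value per line of sight, much like X-ray radiography.

```python
import math

def muography_image(open_sky_counts, measured_counts):
    """Convert per-pixel muon counts into a relative column-density map.

    Muon flux drops as more matter is traversed, so log(open_sky / measured)
    grows with the amount of material along each line of sight.
    """
    return [[math.log(sky / seen) for sky, seen in zip(sky_row, seen_row)]
            for sky_row, seen_row in zip(open_sky_counts, measured_counts)]

# 2x3 toy grid: fewer muons arrive where denser material blocks the path.
open_sky = [[100, 100, 100],
            [100, 100, 100]]
measured = [[90, 50, 90],    # the middle column crosses a denser region
            [95, 60, 95]]
density_map = muography_image(open_sky, measured)
```

In the toy grid the middle column, where only half the muons get through, stands out with the highest attenuation value.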

    “We successfully imaged the vertical profile of a cyclone, and this revealed density variations essential to understanding how cyclones work,” said Tanaka. “The images show cross sections of the cyclone which passed through Kagoshima Prefecture in western Japan. I was surprised to see so clearly that it had a low-density warm core that contrasted dramatically with the high-pressure cold exterior. There is absolutely no way to capture such data with traditional pressure sensors and photography.”

    The detector the researchers used has a viewing angle of 90 degrees, but Tanaka envisages combining similar sensors to create hemispherical and therefore omnidirectional observation stations which could be placed along the length of a coastline. These could potentially see cyclones as far away as 300 kilometers. Although satellites already track these storms, the extra detail offered by muography could improve predictions about approaching storms.

    “One of the next steps for us now will be to refine this technique in order to detect and visualize storms at different scales,” said Tanaka. “This could mean better modeling and prediction not only for larger storm systems, but more local weather conditions as well.”

    ###

    Journal article: Hiroyuki K.M. Tanaka, Jon Gluyas, Marko Holma, Jari Joutsenvaara, Pasi Kuusiniemi, Giovanni Leone, Domenico Lo Presti, Jun Matsushima, László Oláh, Sara Steigerwald, Lee F. Thompson, Ilya Usoskin, Stepan Poluianov, Dezső Varga and Yusuke Yokota. “Atmospheric Muography for Imaging and Monitoring Tropic Cyclones,” Scientific Reports.

     

    About The University of Tokyo
    The University of Tokyo is Japan’s leading university and one of the world’s top research universities. The vast research output of some 6,000 researchers is published in the world’s top journals across the arts and sciences. Our vibrant student body of around 15,000 undergraduate and 15,000 graduate students includes over 4,000 international students. Find out more at www.u-tokyo.ac.jp/en/ or follow us on Twitter at @UTokyo_News_en.

    University of Tokyo

  • Scientists peel back ancient layers of banana DNA to reveal ‘mystery ancestors’

    Newswise — Bananas are thought to have been first domesticated by people 7,000 years ago on the island of New Guinea. But the domestication history of bananas is complicated, while their classification is hotly debated, as boundaries between species and subspecies are often unclear.

    Now, a study in Frontiers in Plant Science shows that this history is even more complex than previously thought. The results confirm that the genome of today’s domesticated varieties contains traces of three extra, as yet unknown, ancestors.

    “Here we show that most of today’s diploid cultivated bananas that descend from the wild banana M. acuminata are hybrids between different subspecies. At least three extra wild ‘mystery ancestors’ must have contributed to this mixed genome thousands of years ago, but haven’t been identified yet,” said Dr Julie Sardos, a scientist at The Alliance of Bioversity International and CIAT in Montpellier, France, and the study’s first author. 

    Complex domestication history

    Domesticated bananas (except for the Fei bananas in the Pacific) are thought to be descended from a cluster of four ancestors: either subspecies of the wild banana Musa acuminata, or distinct but closely related species. M. acuminata seems to have evolved in the northern borderlands between India and Myanmar, and to have existed across Australasia approximately 10 million years before it was first domesticated. A further complication is that domesticated varieties can have two (‘diploid’), three (‘triploid’), or four (‘tetraploid’) copies of every chromosome, and that many are also descended from the wild species M. balbisiana.

    Recent smaller-scale studies suggested that even this already complex scenario might not be the whole story, and that further ancestors related to M. acuminata could have been involved in the domestication. The new results not only confirm that this is indeed the case, they also show for the first time that these gene pools are common in domesticated banana genomes.

    Banana collecting missions

    The authors sequenced the DNA in 226 leaf extracts from the world’s largest collection of banana samples, at The Alliance of Bioversity International and CIAT’s ‘Musa Germplasm Transit Centre’ in Belgium. Among these samples, 68 belonged to nine wild subspecies of M. acuminata and 154 to diploid domesticated varieties descended from M. acuminata; four more distantly related wild species and hybrids were included as comparisons. Many had previously been gathered in dedicated ‘banana collecting missions’ to Indonesia, the island of New Guinea, and the autonomous region of Bougainville.

    The researchers first measured the levels of relatedness between cultivars and wild bananas and made ‘family trees’ based on the diversity at 39,031 Single Nucleotide Polymorphisms (SNPs). They used a subset of these – evenly spread across the genome, with each pair demarcating a block of approximately 100,000 ‘DNA letters’ – to statistically analyze the ancestry of each block. For the first time they detected traces of three further ancestors in the genome of all domesticated samples, for which no matches are yet known from the wild.
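
A toy version of that per-block ancestry assignment makes the logic concrete. The SNP data below is invented and the real analysis is statistical and vastly larger, but the shape is the same: each block of the cultivar genome is compared against every known ancestor panel, and blocks that match none well enough are the kind of signal that points to unsampled "mystery ancestors".

```python
def paint_blocks(sample_blocks, ancestor_panels, min_similarity=0.9):
    """Assign each genomic block to the closest known ancestor, or 'unknown'.

    sample_blocks: list of blocks, each a list of SNP alleles (0/1).
    ancestor_panels: ancestor name -> reference blocks with the same indexing.
    """
    assignments = []
    for i, block in enumerate(sample_blocks):
        scores = {name: sum(a == b for a, b in zip(block, panel[i])) / len(block)
                  for name, panel in ancestor_panels.items()}
        best = max(scores, key=scores.get)
        assignments.append(best if scores[best] >= min_similarity else "unknown")
    return assignments

# Invented data: three blocks of ten SNPs each, two known ancestor panels.
sample = [[0, 0, 0, 0, 0, 1, 1, 1, 1, 1],
          [1, 1, 1, 1, 1, 1, 1, 1, 1, 1],
          [0, 1, 0, 1, 0, 1, 0, 1, 0, 1]]
panels = {
    "banksii":     [[0, 0, 0, 0, 0, 1, 1, 1, 1, 1],   # matches block 0
                    [0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
                    [1, 0, 1, 0, 1, 0, 1, 0, 1, 0]],
    "malaccensis": [[1, 1, 1, 1, 1, 0, 0, 0, 0, 0],
                    [1, 1, 1, 1, 1, 1, 1, 1, 1, 0],   # 9/10 match for block 1
                    [1, 0, 1, 0, 1, 0, 1, 0, 1, 0]],
}
ancestry = paint_blocks(sample, panels)   # ['banksii', 'malaccensis', 'unknown']
```

Block 2 matches neither panel, so it is labelled 'unknown'; at genome scale, consistent runs of such blocks are what the study interprets as unidentified ancestors.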

    Mystery ancestors might survive somewhere

    The mystery ancestors might be long since extinct. “But our personal conviction is that they are still living somewhere in the wild, either poorly described by science or not described at all, in which case they are probably threatened,” said Sardos.

    Sardos et al. have a good idea where to look for them: “Our genetic comparisons show that the first of these mystery ancestors must have come from the region between the Gulf of Thailand and west of the South China Sea. The second, from the region between north Borneo and the Philippines. The third, from the island of New Guinea.”

    Could help breed better bananas

    Which useful traits these mystery ancestors might have contributed to domesticated bananas is not yet known. For example, the crucial trait of parthenocarpy, fruit setting without the need for pollination, is thought to have been inherited from M. acuminata, while cooking bananas owe a large chunk of their DNA to the subspecies (or perhaps separate species) M. acuminata banksii.

    Second corresponding author Dr Mathieu Rouard, likewise at Bioversity International, said: “Identifying the ancestors of cultivated bananas is important, as it will help us understand the processes and the paths that shaped the banana diversity observed today, a crucial step to breed bananas of the future.”

    “Breeders need to understand the genetic make-up of today’s domesticated diploid bananas for their crosses between cultivars, and this study is a major first step toward the characterization in great detail of many of these cultivars.”

    Sardos said: “Based on these results, we will work with partners to explore and genotype wild banana diversity in the three geographic regions that our study pinpointed, with the hope to identify these unidentified contributors to cultivated bananas. It will also be important to investigate the different advantages and traits that each of these contributors provided to cultivated bananas.”

    Frontiers

  • Sleep mode makes Energy Internet more energy efficient

    Newswise — A group of scientists at Nagoya University, Japan, has developed a possible solution to one of the biggest problems of the Internet of Energy: energy efficiency. They did so by creating a controller that has a sleep mode and procures energy only when needed.

    Widespread generation of electricity from renewable energy has become necessary to combat the climate crisis. One solution to meet society’s electrification needs is the Internet of Energy, which would operate like the information Internet, except that it would distribute energy through smart power generation, smart power consumption, smart interconnection, and cloud sharing.

    When information is sent over the Internet, it is divided into transmittable units called ‘packets’, which are tagged with their destination.  The energy Internet is based on a similar concept. Information tags are added to power pulses to create units called ‘power packets’.  On the basis of requests from terminals, these are then distributed over networks to where they are needed. However, one problem is that since the packets are sent sporadically, the energy supply is intermittent. Current solutions, such as storage batteries or capacitors, complicate the system and reduce its efficiency.  

    An alternative solution is what is known as ‘sparse control’, where the terminal’s actuators are active part of the time and are in sleep mode for the rest of the time. In sleep mode, they do not consume fuel or electricity, leading to efficient energy saving and reducing environmental and noise pollution.  Although sparse control has been used with a single actuator, it does not necessarily provide good performance when multiple actuators are used. The problem of determining how to do this for multiple actuators is called the ‘maximum turn-off control problem’. 

    Now, a Nagoya University research group, led by Professor Shun-ichi Azuma and doctoral student Takumi Iwata of the Graduate School of Engineering, has developed a model control scheme for multiple actuators. The model has an awake mode, during which it procures and controls the necessary power packets for when they are needed, and a sleep mode. The research was published in the International Journal of Robust and Nonlinear Control.

    “We can see our research being useful in the motor control of production equipment,” explains Professor Azuma. “This research provides a control system configuration method based on the assumption that the energy supply is intermittent. It has the advantage of eliminating the need for storage batteries and capacitors. It is expected to accelerate the practical application of the power packet type energy Internet.” 
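
The idea of tagged power packets flowing only to awake terminals can be sketched in a few lines. This is a deliberately simplified illustration, not the Nagoya control scheme (the class names, the request/dispatch model and the numbers are all invented): terminals in sleep mode issue no requests, so no energy is routed to them at all.

```python
from collections import deque

class PacketRouter:
    """Toy energy-packet dispatcher: terminals request tagged units of energy."""

    def __init__(self):
        self.requests = deque()
        self.delivered = {}   # terminal -> joules received so far

    def request(self, terminal, joules, awake=True):
        # A terminal in sleep mode issues no request, so it draws no energy.
        if awake:
            self.requests.append((terminal, joules))

    def dispatch(self, source_budget):
        """Serve queued requests in order until the energy budget runs out."""
        while self.requests and source_budget > 0:
            terminal, joules = self.requests.popleft()
            sent = min(joules, source_budget)   # one tagged 'power packet'
            self.delivered[terminal] = self.delivered.get(terminal, 0) + sent
            source_budget -= sent
        return source_budget                    # unspent energy stays at the source

router = PacketRouter()
router.request("motor_A", 30, awake=True)
router.request("motor_B", 50, awake=False)   # asleep: consumes nothing
router.request("motor_C", 40, awake=True)
leftover = router.dispatch(source_budget=100)
```

Because the sleeping terminal never enters the queue, the source keeps its unspent energy rather than buffering it in batteries or capacitors, which is the efficiency argument made above.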

    This research was supported by Japan Science and Technology Agency Emergent Research Support Program and Grant-in-Aid for Scientific Research from the Ministry of Education, Culture, Sports, Science and Technology of Japan. 

    Nagoya University

  • ‘Warm Blob’ marine heatwave helps invasive algae take over Baja Californian waters

    An unusually long period of warm waters caused invasive species of algae to completely replace a community of native kelp surrounding a Mexican island, according to results published in De Gruyter’s international journal Botanica Marina.

    The waters of the Todos Santos Islands, around 120km off the northwest coast of Baja California in Mexico, are usually dominated by the giant kelp species Macrocystis pyrifera.

    But an unusually long period of warm waters affecting the Pacific coast of North America from 2013 to 2016 (known as the ‘Warm Blob’) seems to have shifted the location and population of several marine species.

    “Those of us who have dove there before noticed the dramatic community change,” said Dr Luis Malpica-Cruz, one of the authors of the paper. Between 2018 and 2019, the authors assessed the density of the invasive species at a rocky reef roughly ten meters below the surface to see how the community had changed.

    During that time they found that native kelps suffered greatly: M. pyrifera went from an average of 0.7 individuals per square meter in 2018 to zero in 2019. During that same time, the researchers say, there was a threefold increase of the population of the invasive macroalgae species Sargassum horneri and kelp species Undaria pinnatifida.

    “We were shocked,” said Malpica-Cruz. “We looked at other local sites that witnessed M. pyrifera loss and saw other native species had taken its place. But the Todos Santos site had a completely different kelp ecosystem.”

    The authors reason that the ‘Warm Blob’ marine heatwave both held back the native M. pyrifera and allowed invasive species such as S. horneri and U. pinnatifida to thrive.

    This alteration in the ecosystem could be the first sign of wider changes, with the dramatic loss of M. pyrifera affecting other algae, invertebrates and fish further up the food chain. 

    While the kelp slowly regained its territory from 2017 onwards at other locations, it continued to lose ground to invasive species at the Todos Santos Islands.

    Malpica-Cruz says he doubts that native species will return to the islands’ waters in the near future, since the dramatic shift happened within a year: “There is hope that not all M. pyrifera kelp forest will be lost. However, in those forests that do change, it is uncertain how the community will ultimately be impacted.”

    The authors want to continue studying the invasive species at these islands as it could help others to design strategies to manage invasive kelp when a future marine heatwave arrives.

    De Gruyter

    Source link

  • Borderline personality disorder-related stigma undermines patient care and efforts to reduce suicide

    Newswise — People with a diagnosis of borderline personality disorder and their carers report experiencing discrimination and stigma when presenting to health services following self-harm or a suicide attempt, leading to inadequate treatment and care for suicide prevention, say authors of a new large-scale review.

    Researchers at Flinders University are calling for better use of existing resources to improve health and community-based services and staff training, which would not only boost the health and wellbeing of all Australians but significantly contribute to a reduction in emergency department presentations and hospital admissions.

    Led by Pauline Klein, a Casual Academic and PhD Candidate in Flinders University’s College of Medicine and Public Health, the research team reviewed the international literature on the experiences that people with a diagnosis of borderline personality disorder, their carers, and health practitioners have of health services.

    “Our aim was to identify any challenges, gaps, and barriers in health services and supports, as well as recommendations for addressing these issues,” says Ms Klein.

    Borderline personality disorder affects one to two percent of the global population and is associated with high rates of self-harm and suicide, leading to frequent presentations to emergency departments and mental health services, the review found.

    “Unlike schizophrenia, borderline personality disorder is much less likely to respond to medications, with previous research finding longer-term solutions, such as face-to-face therapy and ongoing support, better suited to managing the underlying trauma that is thought to have led to the disorder for many of the people who experience it,” says co-author Dr Kate Fairweather, a Mental Health Epidemiologist and Public Health/Health Equity Lecturer at Flinders University.

    The review identified significant structural problems in the health system for people with a diagnosis of borderline personality disorder and their carers, including the limited public health services and community group programs available to meet the urgent demand for support.

    “We found that the available public health services and programs have long wait lists, and specialist services are not an affordable option for many people with a diagnosis of borderline personality disorder and their families,” says Ms Klein.

    “Similarly, health practitioners reported experiencing challenges navigating health services and referral pathways, due to the limited services and supports available.”

    The research further suggests that there is a dominant stigmatising culture, particularly in emergency and acute mental health services, that perpetuates misconceptions regarding the legitimacy of the diagnosis of borderline personality disorder as well as its treatability and recovery prospects, leading to reluctance among some health practitioners to diagnose or treat people with this mental health condition.

    “Alarmingly, there are consistent reports in the literature indicating that when experiencing a suicidal crisis, people with a borderline personality disorder diagnosis and their carers are treated disrespectfully and denied treatment when presenting to some health services, leading to a lack of support being offered to these patients at a pivotal time when crisis intervention is needed,” says Ms Klein.

    “The Clinical Practice Guidelines for the Management of Borderline Personality Disorder, developed in 2012, state that treatment for this disorder is a legitimate use of healthcare resources and that having a diagnosis of borderline personality disorder is never a reason for withholding healthcare to a person.

    “These stigmatising experiences lead to patients and their carers facing discrimination and high levels of anxiety when seeking treatment because the presenting condition is not taken seriously, undermining patient care and potentially retraumatising and exacerbating patients’ self-harming behaviour.”

    The authors say the results of the review echo existing structural problems impacting other areas of the health system and provide further evidence of a critical need for health reform.

    “This should serve as a call to action for governments to prioritise and address these important public health concerns,” says Ms Klein.

    “We need a system-wide approach including providing health practitioners who work with people with borderline personality disorder ongoing access to education, training, and supervision to better support them in their role.”

    The paper ‘Structural stigma and its impact on healthcare for borderline personality disorder: a scoping review’ by Pauline Klein, Kate Fairweather, and Sharon Lawn is published in the International Journal of Mental Health Systems. DOI: 10.1186/s13033-022-00558-3.

    The research was funded by the Suicide Prevention Research Fund, established by the Federal Government to support research into suicide prevention. The aim of the fund is to support world-class Australian research and facilitate the rapid translation of knowledge into more effective services for individuals, families, and communities. Suicide Prevention Australia manages the fund on behalf of the Federal Government. We also acknowledge our partner organisation, Lived Experience Australia.

    Flinders University

    Source link

  • Mapping human brain development

    Newswise — The human brain is probably the most complex organ in the entire living world and has long been an object of fascination for researchers. However, studying the brain, and especially the genes and molecular switches that regulate and direct its development, is no easy task.

    To date, scientists have proceeded using animal models, primarily mice, but their findings cannot be transferred directly to humans. A mouse’s brain is structured differently and lacks the furrowed surface typical of the human brain. Cell cultures have thus far been of limited value in this field, as cells tend to spread over a large area when grown on a culture dish; this does not correspond to the natural three-dimensional structure of the brain.

    Mapping molecular fingerprints

    A group of researchers led by Barbara Treutlein, ETH Professor at the Department of Biosystems Science and Engineering in Basel, has now taken a new approach to studying the development of the human brain: they are growing and using organoids – millimetre-sized three-dimensional tissues that can be grown from what are known as pluripotent stem cells.

    Provided these stem cells receive the right stimulus, researchers can program them to become any kind of cell present in the body, including neurons. When the stem cells are aggregated into a small ball of tissue and then exposed to the appropriate stimulus, they can even self-organise and form a three-dimensional brain organoid with a complex tissue architecture.

    In a new study just published in Nature, Treutlein and her colleagues have now studied thousands of individual cells within a brain organoid at various points in time and in great detail. Their goal was to characterise the cells in molecular-genetic terms: in other words, the totality of all gene transcripts (transcriptome) as a measure of gene expression, but also the accessibility of the genome as a measure of regulatory activity. They have managed to represent this data as a kind of map showing the molecular fingerprint of each cell within the organoid.

    However, this procedure generates immense data sets: each cell in the organoid has 20,000 genes, and each organoid in turn consists of many thousands of cells. “This results in a gigantic matrix, and the only way we can solve it is with the help of suitable programs and machine learning,” explains Jonas Fleck, a doctoral student in Treutlein’s group and one of the study’s co-lead authors. To analyse all this data and predict gene regulation mechanisms, the researchers developed their own program. “We can use it to generate an entire interaction network for each individual gene and predict what will happen in real cells when that gene fails,” Fleck says.
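    A toy sketch can make the matrix-to-network idea concrete. The snippet below builds a small random cell-by-gene count matrix and links gene pairs whose expression correlates across cells. All numbers and the correlation threshold are invented stand-ins, and simple correlation is a deliberately naive substitute for the far more sophisticated regulatory-inference program the authors describe.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy scale: the real data span ~20,000 genes and many thousands of cells;
# a small dense matrix stands in for the sparse single-cell count matrix.
n_cells, n_genes = 500, 50
counts = rng.poisson(lam=0.5, size=(n_cells, n_genes))

# Per-cell normalization and log transform, a common preprocessing step.
libsize = counts.sum(axis=1, keepdims=True)
expr = np.log1p(counts / np.maximum(libsize, 1) * 1e4)

# Naive "interaction network": connect gene pairs whose expression
# correlates strongly across cells; this only conveys matrix -> network.
r = np.corrcoef(expr.T)                   # gene-by-gene correlation matrix
np.fill_diagonal(r, 0.0)
edges = np.argwhere(np.abs(r) > 0.3)
edges = edges[edges[:, 0] < edges[:, 1]]  # keep each unordered pair once
print(f"{n_genes} genes, {len(edges)} candidate edges")
```

    Even at this toy scale the matrix has 25,000 entries; at the study's scale the same pipeline must cope with tens of millions, which is why purpose-built software and machine learning are needed.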

    Identifying genetic switches

    The aim of this study was to systematically identify those genetic switches that have a significant impact on the development of neurons in the different regions of brain organoids.

    With the help of a CRISPR-Cas9 system, the ETH researchers selectively switched off one gene in each cell, altogether about two dozen genes simultaneously in the entire organoid. This enabled them to find out what role the respective genes played in the development of the brain organoid.

    “This technique can be used to screen genes involved in disease. In addition, we can look at the effect these genes have on how different cells within the organoid develop,” explains Sophie Jansen, also a doctoral student in Treutlein’s group and the second co-lead author of the study.

    Checking pattern formation in the forebrain

    To test their theory, the researchers chose the GLI3 gene as an example. This gene is the blueprint for the transcription factor of the same name, a protein that docks onto certain sites on DNA in order to regulate another gene. When GLI3 is switched off, the cellular machinery is prevented from reading this gene and transcribing it into an RNA molecule.

    In mice, mutations in the GLI3 gene can lead to malformations in the central nervous system. Its role in human neuronal development was previously unexplored, but it is known that mutations in the gene lead to diseases such as Greig cephalopolysyndactyly syndrome and Pallister-Hall syndrome.

    Silencing this GLI3 gene enabled the researchers both to verify their theoretical predictions and to determine directly in the cell culture how the loss of this gene affected the brain organoid’s further development. “We have shown for the first time that the GLI3 gene is involved in the formation of forebrain patterns in humans. This had previously been shown only in mice,” Treutlein says.

    Model systems reflect developmental biology

    “The exciting thing about this research is that it lets you use genome-wide data from so many individual cells to postulate what roles individual genes play,” she explains. “What’s equally exciting in my opinion is that these model systems made in a Petri dish really do reflect developmental biology as we know it from mice.”

    Treutlein also finds it fascinating how the culture medium can give rise to self-organised tissue with structures comparable to those of the human brain – not only at the morphological level but also (as the researchers have shown in their latest study) at the level of gene regulation and pattern formation. “Organoids like this are truly an excellent way to study human developmental biology,” she points out.

    Versatile brain organoids

    Research on organoids made up of human cell material has the advantage that the findings are transferable to humans. They can be used to study not only basic developmental biology but also the role of genes in diseases or developmental brain disorders. For example, Treutlein and her colleagues are working with organoids of this type to investigate the genetic cause of autism and of heterotopia; in the latter, neurons appear outside their usual anatomical location in the cerebral cortex. 

    Organoids may also be used for testing drugs, and possibly for culturing transplantable organs or organ parts. Treutlein confirms that the pharmaceutical industry is very interested in these cell cultures.

    However, growing organoids takes both time and effort. Moreover, each clump of cells develops individually rather than in a standardised way. That is why Treutlein and her team are working to improve the organoids and automate their manufacturing process.

    ETH Zurich

    Source link

  • Can cats and coyote co-exist?

    Newswise — As urban environments continue to encroach on natural habitats, instances of human-wildlife conflict tend to increase. While some animals avoid human contact at all costs, other species thrive in urban habitats. Coyotes, in particular, have become frequent visitors near human settlements, and are generally regarded as a significant source of human-wildlife conflict. These urban predators have adapted to consume a range of human food sources, such as garbage, ornamental fruits, and domestic pets. As a result, city residents often worry about the safety of their pets, especially outdoor cats. Is it possible to minimize conflict between these two species in an urban setting?

    Numerous studies throughout the United States from Seattle to New York have demonstrated that cats comprise less than 5% of coyote diet. Why then do diet studies in Los Angeles reveal that cats make up nearly 20% of coyote diet? Residents in Culver City, a suburb of Los Angeles, reported that 72 cats were killed in 18 months, allegedly the victims of coyote attacks. A recent study conducted by Rebecca Davenport and colleagues from the Center for Urban Resilience (CURes) at Loyola Marymount University may offer the first glimpse into this anomaly. The study, “Spatiotemporal relationships of coyotes and free-ranging domestic cats as indicators of conflict in Culver City, California,” was published in the peer-reviewed, scientific journal PeerJ – Life and Environment this month.

    Davenport et al. (2022) installed 20 motion-sensor cameras in Culver City parks, neighborhoods, and green spaces to monitor the presence of cats and coyotes for six months. Similar to other studies, Davenport et al. found that coyotes prefer green spaces to urbanized and/or residential areas. However, cats did not display a preference for a particular habitat type. This result is quite surprising, as studies in Chicago and North Carolina found that cats prefer urban areas and directly avoid areas where coyotes are prevalent. Instead, cats in Culver City were present in the same green space fragments as coyotes. Additionally, cats in this Los Angeles suburb displayed more nocturnal behavior than is typical for urban cats. These unexpected results may explain why there have been such frequent cases of cat mortality in Culver City. 
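    The nocturnality and overlap comparisons come down to when each species triggers the cameras. As a rough illustration, the sketch below computes a discrete activity-overlap coefficient and a nocturnality index from hypothetical detection times; the detection hours are made up, and published camera-trap studies typically use more formal kernel-density overlap estimators rather than this hourly binning.

```python
import numpy as np

# Hypothetical hour-of-day detections (0-23) for each species; the study's
# real records come from 20 motion-sensor cameras over six months.
cat_hours = np.array([0, 1, 2, 3, 3, 4, 13, 22, 23, 23])
coyote_hours = np.array([0, 1, 2, 2, 3, 4, 21, 22, 23, 23])

def activity_profile(hours, n_bins=24):
    """Fraction of detections falling in each hour-of-day bin."""
    counts = np.bincount(hours, minlength=n_bins)
    return counts / counts.sum()

def overlap(p, q):
    """Discrete overlap coefficient: sum of bin-wise minima (1 = identical)."""
    return float(np.minimum(p, q).sum())

def nocturnality(hours):
    """Share of detections between 18:00 and 06:00."""
    return float(((hours >= 18) | (hours < 6)).mean())

p_cat = activity_profile(cat_hours)
p_coyote = activity_profile(coyote_hours)
print(f"activity overlap = {overlap(p_cat, p_coyote):.2f}")
print(f"cat nocturnality = {nocturnality(cat_hours):.2f}")
```

    A high overlap coefficient combined with high cat nocturnality is exactly the risky pattern the Culver City data suggest: cats active in the same places and at the same hours as coyotes.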

    Residents have a common perception that coyotes intentionally hunt down pets within their neighborhoods. On the contrary, Davenport et al. (2022) suggest that coyotes tend to stick to natural areas around the city. Urban green spaces contain plenty of alternative prey sources for coyotes, such as cottontail rabbits. Therefore, it is unlikely that coyotes choose to leave their preferred green space habitat in order to seek out domestic pets. Instead, high rates of cat mortality in Culver City may be a result of cats roaming freely through urban green spaces and displaying increased nocturnality compared to cats in other cities. 

    Given that coyotes are perceived as a source of conflict in urban areas, countless management efforts focus on the control or eradication of “problem” coyotes. However, Davenport et al. (2022) recognize that coyotes are native to these environments, while domestic cats have been widely introduced to urban and rural areas across the United States. Unfortunately, cats have been shown to devastate populations of native species, such as songbirds and small mammals. Given these ecological consequences, Davenport et al. (2022) recommend that management efforts consider restrictions or control measures for outdoor cats, rather than solely focusing on the role of coyotes in urban human-wildlife conflict. 

    The CURes team has been studying urban coyotes in Culver City for three years and is currently preparing further analyses. For more information, you can read the article in PeerJ – Life and Environment https://peerj.com/articles/14169/

    PeerJ

    Source link

  • Scientists design electrolyte for lithium metal anodes for use in lithium metal batteries

    Newswise — With the growing demand for electric vehicles, the need for high-safety, long-life batteries also rises. Yet electric vehicles’ demand for high-energy-density batteries outpaces the capabilities of current lithium-ion batteries. Scientists are looking to develop batteries with lithium metal as the anode, because these batteries have a much higher charging capacity. However, there are safety issues with lithium metal batteries because dendrites—spiky, metallic microstructures—form during the charging process.

    A team of Chinese researchers set out to solve the problem of the lithium dendrite formation and to build high-safety, long-life lithium metal batteries. The team has successfully designed an electrolyte that suppresses the formation of the dendrites. This electrolyte delivers excellent performance in lithium metal batteries and offers solutions in the research toward building high-safety, long-life lithium metal batteries.

    The team’s findings are published in the journal Nano Research on October 3, 2022.

    While lithium metal anodes hold great potential for high-energy storage batteries, uncontrollable lithium dendrite growth raises significant concerns. Dendrites grow when lithium ions migrate and deposit preferentially at specific locations on the lithium metal surface. The dendrites cause poor cycling efficiency in the battery and pose a severe safety risk.

    The team tackled the dendrite problem by combining the advantages of conventional electrolytes and high-concentration electrolytes. The high-concentration electrolytes overcome some of the shortcomings of conventional electrolytes, and hold strong promise for use in next-generation batteries. The electrolyte the team created delivers excellent electrochemical performance in lithium metal batteries and suppresses formation of dendrites. “Its unique structure not only promotes the uniform conversion of ions on the electrode surface but also ensures the rapid movement of ions in the electrolyte,” said Chunpeng Yang, a professor at Tianjin University.

    The researchers began their work by running numerical simulations to explore whether a negatively charged coating could induce a high-concentration electrolyte at the interface. Then, as a proof of concept, they coated nickel foam with nitrogen- and oxygen-doped carbon nanosheets, which carry negative surface charges, to create the electrode. Positively charged lithium ions concentrate near this carbon-coated electrode, which promotes charge-transfer reactions at the electrode surface and contributes to outstanding electrochemical cycling performance. The researchers conducted half-cell and full-cell tests on the electrode with excellent results. Their electrode performs much better than other electrodes based on pure nickel foam.

    “This provides a simple principle for suppressing the lithium dendrites by simultaneously taking into account the advantages of conventional electrolyte and high-concentration electrolyte for stable Li metal anode, which may be applied to other substrates for practical metal batteries,” said Yang.

    Beyond coating the electrode with negatively surface-charged materials to guide the formation of interfacial high-concentration electrolytes, the team plans to look for other ways to obtain this unique electrolyte structure as a means of achieving high-performance batteries. The researchers hope to achieve the commercial application of Li metal batteries with high energy density, high safety and long life by systematically optimizing the battery components. “Our study results could be extended to more metal-battery systems, such as sodium, zinc and magnesium metal batteries, which will contribute to the realization of large-scale energy storage for sustainable energy supply,” said Yang.

    The research team includes Haotian Lu, Feifei Wang, and Lu Wang from Tianjin University, the Haihe Laboratory of Sustainable Chemical Transformations, and the National University of Singapore; Chunpeng Yang, from Tianjin University and the Haihe Laboratory of Sustainable Chemical Transformations; Jinghong Zhou, from East China University of Science and Technology; Wei Chen, from  National University of Singapore; and Quan-Hong Yang from Tianjin University and the Haihe Laboratory of Sustainable Chemical Transformations.

    This research is funded by the National Key Research and Development Program of China, the Haihe Laboratory of Sustainable Chemical Transformations, and the Fundamental Research Funds for the Central Universities.

    ##

    About Nano Research 

    Nano Research is a peer-reviewed, international, and interdisciplinary research journal sponsored by Tsinghua University and the Chinese Chemical Society. It publishes research on all aspects of nano science and technology, features rapid review and fast publication, and offers readers an attractive mix of authoritative, comprehensive reviews and original cutting-edge research papers. After 15 years of development, it has become one of the most influential academic journals in the nano field. In the 2022 InCites Journal Citation Reports, Nano Research has an Impact Factor of 10.269 (9.136 over 5 years) and 29,620 total citations, ranking first among China’s international academic journals, and its 120 highly cited papers place it in the top 2.8% of over 9,000 academic journals.

     

    About Tsinghua University Press

    Established in 1980 and belonging to Tsinghua University, Tsinghua University Press (TUP) is a leading comprehensive higher education and professional publisher in China. Committed to building a top-level global cultural brand, TUP has, after 41 years of development, established an outstanding managerial system and enterprise structure and delivered multimedia, multi-dimensional publications covering books, audio, video, electronic products, journals, and digital publications. In addition, TUP is actively carrying out its strategic transformation from educational publishing to content development and services for teaching and learning, and it was named a First-class National Publisher for achieving remarkable results.

    Tsinghua University Press

    Source link

  • Global warming at least doubled the probability of extreme ocean warming around Japan

    Newswise — In the past decade, the marginal seas of Japan frequently experienced extremely high sea surface temperatures (SSTs). A new study led by National Institute for Environmental Studies (NIES) researchers revealed that the increased occurrence frequency of extreme ocean warming events since the 2000s is attributable to global warming due to industrialization.

    In August 2020, the southern area of Japan and the northwestern Pacific Ocean experienced unprecedentedly high SSTs, according to the Japan Meteorological Agency (JMA). A recent study published in January 2021 revealed that the record-high northwestern Pacific SST observed in August 2020 could not be expected to occur without human-induced climate changes. Since then, the JMA again announced that the record high SSTs were observed near Japan in July and October 2021 and from June to August 2022, but it remains unclear to what extent climate change has altered the occurrence likelihood of these regional extreme warming events.

    “The impacts of global warming are not uniform; rather, they show regional and seasonal differences,” said co-author Hideo Shiogama, head of the Earth System Risk Assessment Section at the Earth System Division, NIES. “A comprehensive analysis of regional SSTs over a long period may provide a quantitative understanding of how much ocean conditions near Japan have been and will be affected by global warming. This better informs policymakers planning climate change mitigation and adaptation strategies.”

    The paper, published in Geophysical Research Letters today, quantifies the contribution of global warming to discrete monthly extreme ocean warming events in Japan’s marginal seas, defined as events that would occur less than once per 20 years in the preindustrial era. A climate research group at NIES focused on ten monitoring areas used operationally by the JMA, including the Japan Sea, East China Sea, Okinawa Islands, the area east of Taiwan, and the Pacific coasts of Japan. The scientists confirmed that observed SST changes from 1982 to 2021 were well reproduced by 24 climate models participating in the sixth phase of the Coupled Model Intercomparison Project (CMIP6), except for the region east of Hokkaido. Extreme ocean warming events were then identified in nine monitoring areas to reveal the contribution of climate change.

    Extreme ocean warming and climate change

    “In the present climate, every extreme ocean warming event is linked to global warming,” said corresponding lead author Michiya Hayashi, a research associate at NIES. The scientists estimated the occurrence frequencies of each event under present and preindustrial climate conditions from January 1982 through July 2022 based on the CMIP6 climate models. “We found that the occurrence probability of almost all the extreme ocean warming events since the 2000s has already at least doubled relative to the preindustrial era. It has increased more than tenfold in a sizeable number of cases since the mid-2010s, especially in southern Japan.”
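    The attribution statements boil down to a probability ratio: how often the event occurs in simulations of the present climate versus simulations without industrial-era forcing. A minimal sketch with synthetic Gaussian anomalies follows; the means, spreads, and sample sizes are invented, whereas the study itself draws the two distributions from 24 CMIP6 model simulations.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical monthly SST anomalies (deg C) from a large ensemble under
# two climates; a warm shift of the mean stands in for industrial forcing.
preindustrial = rng.normal(loc=0.0, scale=0.5, size=10_000)
present = rng.normal(loc=0.6, scale=0.5, size=10_000)

# "Extreme" = rarer than once per 20 years in the preindustrial climate,
# i.e. exceeding its 95th percentile.
threshold = np.quantile(preindustrial, 0.95)

p0 = float((preindustrial > threshold).mean())  # ~0.05 by construction
p1 = float((present > threshold).mean())
print(f"P(preindustrial) = {p0:.3f}, P(present) = {p1:.3f}, PR = {p1 / p0:.1f}")
```

    A probability ratio PR = P_present / P_preindustrial of 2 corresponds to the paper's "at least doubled"; the tenfold statements correspond to PR > 10.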

    For instance, in July 2022, anomalously high SSTs observed in five monitoring areas, including the Japan Sea (Areas 1, 3), the East China Sea (Areas 5, 8), and south of Okinawa near Taiwan (Area 10), were identified as extreme ocean warming events. Updated results based on preliminary data retrieved from the NEAR-GOOS RRTDB website on 15 September 2022 (not included in the published paper) show that, in August 2022, events were also identified in six monitoring areas south of 35°N: the East China Sea (Areas 5, 8), south and east of Okinawa (Areas 10, 9), southeastern Kanto (Area 7), and the seas off Shikoku and Tokai (Area 6). “We estimate that the occurrence frequencies of all of these events identified in July and August 2022 have at least doubled due to climate change, and have increased more than tenfold for those south of 35°N, except in the northern East China Sea,” stated Hayashi.

    “Climate change impacts on extreme ocean warming events in northern Japan began to emerge relatively late compared to southern Japan,” noted Shiogama. Increased global aerosol emissions until the 1980s tended to cool the Earth’s surface, an effect that was more substantial in the North Pacific, especially near northern Japan, via changes in large-scale atmospheric circulation. In addition, the year-to-year natural variability of SST is large in northern Japan, so the global warming signal was less detectable there than in southern Japan. Because global aerosol emissions have been reduced in recent decades, the cooling effect has become less dominant relative to human-induced greenhouse gas warming. “Our study indicates,” continued Shiogama, “that the contribution of climate change to SST extremes is already discernible beyond natural variability even in northern Japan under the present climate condition.”

    What ocean conditions can be expected in the future? The researchers further compared the probabilities of exceeding the monthly record-high SSTs around Japan at global warming levels from 0°C to 2°C using the 24 CMIP6 climate model outputs from 1901 to 2100. “Once global warming reaches 2°C, all nine monitoring areas are expected to experience SSTs warmer than the past highest levels at least every two years,” said Tomoo Ogura, a co-author and head of the Climate Modeling and Analysis Section at the Earth System Division, NIES. He added, “Limiting global warming to below 1.5°C is necessary to keep record warm conditions in Japan’s marginal seas from becoming the new normal climate.”

    The quantitative analysis of SSTs around Japan implies that climate change has already become the major factor for most of the record high SSTs in recent years. “In the future, dynamics of each extreme warming event need to be examined by taking the long-term climate change and year-to-year natural variability into account,” noted Hayashi. “Nevertheless, we expect that our statistical results based on the latest climate models will help to implement adaptation and mitigation measures for climate change.”

    National Institute for Environmental Studies

    Source link

  • Watching Plants Switch on Genes

    The Science

    Biologists often use green fluorescent protein (GFP) to see what happens inside cells. GFP, which scientists first isolated in jellyfish, is a protein that changes light from one color into another. Attaching it to other proteins allows researchers to find out if cells produce those proteins and where within cells to find them. This in turn shows how cells deliver and use genes. The problem is that this usually requires expensive equipment, such as fluorescence microscopes, and it can be time consuming. In this study, researchers describe how a special type of GFP can be used to ‘see’ protein production with the unaided eye. Modifying the genes of plants allowed the team to see GFP production using a simple black light to provide long-wave ultraviolet (UV) light.

    The Impact

    The research demonstrates real-time imaging of cellular and molecular events in a wide range of plants with the unaided eye and a black-light flashlight. This will enable quick and affordable screening for research and development, or real-time monitoring of molecular events in mature plants.

    Summary

    Reporter genes are attached to other genes of interest to provide an inexpensive, rapid, and sensitive assay for studying gene delivery and gene expression. These reporters have long been an essential tool for live-cell imaging. Today, imaging and analysis are becoming more accessible through the development of UV-visible fluorescent reporters. This research from scientists at Oak Ridge National Laboratory aimed to advance the use and efficiency of these reporters in two herbaceous plant species (Arabidopsis and tobacco) and two woody plant species (poplar and citrus).

    After designing and building a GFP UV reporter protein (eYGFPuv) that provides enhanced signals for all tested plant species, the researchers demonstrated that strong fluorescence could be captured using either a fluorescence microscope or UV light. Moreover, this UV‐excitable reporter can be observed across a wide range of scales from sub‐meter level seedlings to whole plants without need for special emission filters. For instance, by using a simple UV flashlight, the scientists demonstrated how this new reporter can facilitate rapid quantification of transformation efficiency in plant systems. These improved features will make this newly developed GFP-UV reporter a valuable tool for a wide range of applications in plant science research.

    Funding

    The research was supported by the Center for Bioenergy Innovation (CBI), a Department of Energy (DOE) Research Center and the Secure Ecosystem Engineering and Design (SEED) project funded by the Genomic Science Program of the DOE Office of Science, Office of Biological and Environmental Research (BER) as part of the Secure Biosystems Design Science Focus Area (SFA).

    SEE ORIGINAL STUDY

    [ad_2]

    Department of Energy, Office of Science

    Source link

  • 3D map reveals DNA organization within human retina cells

    3D map reveals DNA organization within human retina cells

    [ad_1]

    Newswise — National Eye Institute researchers mapped the organization of human retinal cell chromatin, the fibers that package each cell’s 3 billion-nucleotide-long DNA into compact structures that fit into chromosomes within the nucleus. The resulting comprehensive gene regulatory network provides insights into the regulation of gene expression in general and in retinal function in particular, in both rare and common eye diseases. The study was published in Nature Communications.

     “This is the first detailed integration of retinal regulatory genome topology with genetic variants associated with age-related macular degeneration (AMD) and glaucoma, two leading causes of vision loss and blindness,” said the study’s lead investigator, Anand Swaroop, Ph.D., senior investigator and chief of the Neurobiology Neurodegeneration and Repair Laboratory at the NEI, part of the National Institutes of Health.

    Adult human retinal cells are highly specialized sensory neurons that do not divide, and are therefore relatively stable for exploring how the chromatin’s three-dimensional structure contributes to the expression of genetic information.

    Chromatin fibers package long strands of DNA, which are spooled around histone proteins and then repeatedly looped to form highly compact structures. All those loops create multiple contact points where protein-coding sequences interact with gene regulatory sequences, such as super-enhancers and promoters, and with the transcription factors that bind them.

    Such non-coding sequences were long considered “junk DNA.” But more advanced studies demonstrate ways these sequences control which genes get transcribed and when, shedding light on the specific mechanisms by which non-coding regulatory elements exert control even when their location on a DNA strand is remote from the genes they regulate.

    Using deep Hi-C sequencing, a tool used for studying 3D genome organization, the researchers created a high-resolution map that included 704 million contact points within retinal cell chromatin. Maps were constructed using post-mortem retinal samples from four human donors.
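    Conceptually, a Hi-C contact map is a symmetric matrix of interaction counts between genomic bins, and loop-like contacts stand out as bin pairs whose counts exceed the distance-decay background. A toy sketch of that idea (illustrative only: the bin count, decay model, planted contact, and threshold are all made up, and real pipelines such as the one described here operate on billions of read pairs):

```python
import numpy as np

# Toy Hi-C-style contact matrix: 20 genomic bins, counts decaying with
# genomic distance, plus one planted long-range "loop" contact.
n = 20
dist = np.abs(np.arange(n)[:, None] - np.arange(n)[None, :])
expected = 100.0 / (1.0 + dist)          # distance-decay background
counts = expected.copy()
counts[3, 15] = counts[15, 3] = 60.0     # enriched long-range contact

# Call "loops": long-range bin pairs whose observed/expected ratio is high.
ratio = counts / expected
loops = [(i, j) for i in range(n) for j in range(i + 5, n) if ratio[i, j] > 5.0]
print(loops)  # → [(3, 15)]
```

    The observed-over-expected normalization is what lets a modest absolute count far from the diagonal register as a significant contact.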

    The researchers then integrated that chromatin topology map with datasets on retinal genes and regulatory elements. What emerged was a dynamic picture of interactions within chromatin over time, including gene activity hot spots and areas with varying degrees of insulation from other regions of DNA.

    They found distinct patterns of interaction at retinal genes suggesting how chromatin’s 3D organization plays an important role in tissue-specific gene regulation.

    “Having such a high-resolution picture of genomic architecture will continue to provide insights into the genetic control of tissue-specific functions,” Swaroop said. 

    Furthermore, similarities between mouse and human chromatin organization suggest conservation across species, underscoring the relevance of chromatin organizational patterns for retinal gene regulation. More than a third (35.7%) of gene pairs interacting through a chromatin loop in mice also did so in the human retina.

    The researchers integrated the chromatin topology map with data on genetic variants identified from genome-wide association studies for their involvement in AMD and glaucoma, two leading causes of vision loss and blindness. The findings point to specific candidate causal genes involved in those diseases.

    The integrated genome regulatory map will also assist in evaluating genes associated with other common retina-associated diseases such as diabetic retinopathy, determining missing heritability and understanding genotype-phenotype correlations in inherited retinal and macular diseases. 

    The study was supported by the NEI Intramural Research Program, grants ZIAEY000450 and ZIAEY000546. 

    Reference: Marchal C, Singh N, Batz Z, Advani J, Jaeger C, Corso-Diaz X, and Swaroop A. “High-resolution genome topology of human retina uncovers super enhancer-promoter interactions at tissue-specific and multifactorial disease loci.” Nature Communications. Published October 7, 2022. DOI: 10.1038/s41467-022-33427-1

     

    ##

    This press release describes a basic research finding. Basic research increases our understanding of human behavior and biology, which is foundational to advancing new and better ways to prevent, diagnose, and treat disease. Science is an unpredictable and incremental process; each research advance builds on past discoveries, often in unexpected ways. Most clinical advances would not be possible without the knowledge of fundamental basic research. To learn more about basic research, visit https://www.nih.gov/news-events/basic-research-digital-media-kit.

    NEI leads the federal government’s efforts to eliminate vision loss and improve quality of life through vision research…driving innovation, fostering collaboration, expanding the vision workforce, and educating the public and key stakeholders. NEI supports basic and clinical science programs to develop sight-saving treatments and to broaden opportunities for people with vision impairment. For more information, visit https://www.nei.nih.gov.

    About the National Institutes of Health (NIH): NIH, the nation’s medical research agency, includes 27 Institutes and Centers and is a component of the U.S. Department of Health and Human Services. NIH is the primary federal agency conducting and supporting basic, clinical, and translational medical research, and is investigating the causes, treatments, and cures for both common and rare diseases. For more information about NIH and its programs, visit https://www.nih.gov/.

    NIH…Turning Discovery Into Health®

     

    [ad_2]

    NIH, National Eye Institute (NEI)

    Source link

  • Scientists Discover Protein Partners that Could Heal Heart Muscle

    Scientists Discover Protein Partners that Could Heal Heart Muscle

    [ad_1]

    Newswise — CHAPEL HILL, N.C. – Scientists at the UNC School of Medicine have made a significant advance in the promising field of cellular reprogramming and organ regeneration, and the discovery could play a major role in future medicines to heal damaged hearts.

    In a study published in the journal Cell Stem Cell, scientists at the University of North Carolina at Chapel Hill discovered a more streamlined and efficient method for reprogramming scar tissue cells (fibroblasts) to become healthy heart muscle cells (cardiomyocytes). Fibroblasts produce the fibrous, stiff tissue that contributes to heart failure after a heart attack or because of heart disease. Turning fibroblasts into cardiomyocytes is being investigated as a potential future strategy for treating or even someday curing this common and deadly condition.

    Surprisingly, the key to the new cardiomyocyte-making technique turned out to be a gene activity-controlling protein called Ascl1, which is known to be a crucial protein involved in turning fibroblasts into neurons. Researchers had thought Ascl1 was neuron-specific.

    “It’s an outside-the-box finding, and we expect it to be useful in developing future cardiac therapies and potentially other kinds of therapeutic cellular reprogramming,” said study senior author Li Qian, PhD, associate professor in the UNC Department of Pathology and Lab Medicine and associate director of the McAllister Heart Institute at UNC School of Medicine.

    Scientists over the last 15 years have developed various techniques to reprogram adult cells to become stem cells, then to induce those stem cells to become adult cells of some other type. More recently, scientists have been finding ways to do this reprogramming more directly – straight from one mature cell type to another. The hope has been that when these methods are made maximally safe, effective, and efficient, doctors will be able to use a simple injection into patients to reprogram harm-causing cells into beneficial ones.

    “Reprogramming fibroblasts has long been one of the important goals in the field,” Qian said. “Fibroblast over-activity underlies many major diseases and conditions including heart failure, chronic obstructive pulmonary disease, liver disease, kidney disease, and the scar-like brain damage that occurs after strokes.”

    In the new study, Qian’s team, including co-first-authors Haofei Wang, PhD, a postdoctoral researcher, and MD/PhD student Benjamin Keepers, used three existing techniques to reprogram mouse fibroblasts into cardiomyocytes, liver cells, and neurons. Their aim was to catalogue and compare the changes in cells’ gene activity patterns and gene-activity regulation factors during these three distinct reprogrammings.

    Unexpectedly, the researchers found that the reprogramming of fibroblasts into neurons activated a set of cardiomyocyte genes. Soon they determined that this activation was due to Ascl1, one of the master-programmer “transcription factor” proteins that had been used to make the neurons.

    Since Ascl1 activated cardiomyocyte genes, the researchers added it to the three-transcription-factor cocktail they had been using for making cardiomyocytes, to see what would happen. They were astonished to find that it dramatically increased the efficiency of reprogramming – the proportion of successfully reprogrammed cells – by more than ten times. In fact, they found that they could now dispense with two of the three factors from their original cocktail, retaining only Ascl1 and another transcription factor called Mef2c.

    In further experiments they found evidence that Ascl1 on its own activates both neuron and cardiomyocyte genes, but it shifts away from the pro-neuron role when accompanied by Mef2c. In synergy with Mef2c, Ascl1 switches on a broad set of cardiomyocyte genes.

    “Ascl1 and Mef2c work together to exert pro-cardiomyocyte effects that neither factor alone exerts, making for a potent reprogramming cocktail,” Qian said.

    The results show that the major transcription factors used in direct cellular reprogramming aren’t necessarily exclusive to one targeted cell type.

    Perhaps more importantly, they represent another step on the path toward future cell-reprogramming therapies for major disorders. Qian says that she and her team hope to make a two-in-one synthetic protein containing the effective portions of both Ascl1 and Mef2c, which could be injected into failing hearts to mend them.

    “Cross-lineage Potential of Ascl1 Uncovered by Comparing Diverse Reprogramming Regulatomes” was co-authored by Haofei Wang, Benjamin Keepers, Yunzhe Qian, Yifang Xie, Marazzano Colon, Jiandong Liu, and Li Qian.

    Funding was provided by the American Heart Association and the National Institutes of Health (T32HL069768, F30HL154659, R35HL155656, R01HL139976, R01HL139880).

    [ad_2]

    University of North Carolina School of Medicine

    Source link

  • On-site reactors could affordably turn CO2 into valuable chemicals

    On-site reactors could affordably turn CO2 into valuable chemicals

    [ad_1]

    Newswise — New technology developed at the University of Waterloo could make a significant difference in the fight against climate change by affordably converting harmful carbon dioxide (CO2) into fuels and other valuable chemicals on an industrial scale.

    Outlined in a study published today in the journal Nature Energy, the system yields 10 times more carbon monoxide (CO) – which can be used to make ethanol, methane and other desirable substances – than existing, small-scale technologies now limited to testing in laboratories.

    Its individual cells can also be stacked to form reactors of any size, making the technology a customizable, economically viable solution that could be installed right on site, for example, at factories with CO2 emissions.

    “This is a critical bridge to connect CO2 lab technology to industrial applications,” said Dr. Zhongwei Chen, a chemical engineering professor at Waterloo. “Without it, it is very difficult for materials-based technologies to be used commercially because they are just too expensive.”

    The system features devices known as electrolyzers that convert CO2, a major greenhouse gas produced by burning fossil fuels, into CO using water and electricity.

    Electrolyzers developed by the researchers have new electrodes and a new kind of liquid-based electrolyte, which is saturated with CO2 and flowed through the devices for conversion into CO via an electrochemical reaction.

    Their electrolyzers are essentially 10-centimetre by 10-centimetre cells, many times larger than existing devices, that can be stacked and configured in reactors of any size.

    “This is a completely new model for a CO2 reactor,” said Chen, the Canada Research Chair in Advanced Materials for Clean Energy. “It makes the whole process economically viable for industrialization and can be customized to meet specific requirements.”
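    The scaling logic of stackable cells can be sketched with standard electrochemistry: CO2 reduction to CO transfers two electrons per molecule, so total current, Faradaic efficiency, and Faraday’s constant fix the output rate. The current density, efficiency, and stack size below are illustrative assumptions, not figures reported for this system:

```python
# Rough CO output estimate for a stack of 10 cm x 10 cm electrolyzer cells.
# Stoichiometry: CO2 + 2e- + H2O -> CO + 2OH- (two electrons per CO).
F = 96485.0          # Faraday constant, C/mol
area_cm2 = 10 * 10   # one cell, as described in the release
j_a_cm2 = 0.2        # ASSUMED current density, A/cm^2
fe_co = 0.9          # ASSUMED Faradaic efficiency toward CO
n_cells = 50         # ASSUMED number of cells in the stack

current = j_a_cm2 * area_cm2 * n_cells   # total stack current, A
co_mol_s = current * fe_co / (2 * F)     # CO production, mol/s
co_g_h = co_mol_s * 3600 * 28.01         # CO molar mass ~28.01 g/mol
print(f"{co_g_h:.0f} g CO per hour")     # → 470 g CO per hour
```

    Because output scales linearly with cell count, sizing a reactor for a given emissions stream is mainly a matter of choosing how many cells to stack.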

    The researchers envision on-site reactors at coal-fired power plants and factories, perhaps the size of a house or more, that would be directly fed CO2 emissions, further reducing costs by eliminating the need to capture and collect CO2 first.

    They are also developing plans to power the reactors with on-site renewable energy sources such as solar panels, contributing to the environmental benefits.

    “I’m excited by the potential of this technology,” Chen said. “If we really want to make a difference by reducing emissions, we have to concentrate on reducing costs to make it affordable.”

    Chen’s collaborators at Waterloo included postdoctoral fellow Dr. Guobin Wen and chemical engineering professors Dr. Aiping Yu and Dr. Jeff Gostick. Several researchers at the South China Normal University also contributed.

    [ad_2]

    University of Waterloo

    Source link

  • More accurate assessments of hurricane damage for responders

    More accurate assessments of hurricane damage for responders

    [ad_1]

    Newswise — COLUMBUS, Ohio – Emergency crews responding to hurricane-damaged areas may soon get an assist from a machine learning model that can better predict the extent of building damage soon after the storm passes.

    The model uses satellite remote sensing: it generates building footprints from pre-hurricane images and then compares them with images taken after the storm.

    While some previous models could only tell if a building was damaged or not damaged, this deep learning model can accurately classify how much damage buildings sustained – key information for emergency responders, said Desheng Liu, co-author of the study and professor of geography at The Ohio State University.

    “Often it is difficult or impossible to rapidly assess the impact of a hurricane or other natural disaster from the ground,” Liu said.

    “Our goal is to be able to provide near real-time information about building damage that can help emergency crews respond to disasters.”

    Liu conducted the study with Polina Berezina, a graduate student in geography at Ohio State. Their results were published earlier this year in the journal Geomatics, Natural Hazards and Risk.

    The researchers tested their new model on data from Hurricane Michael in 2018 and found that its overall damage assessment was 86.3% accurate in one region of Florida, an improvement of 11 percentage points over a current state-of-the-art model.

    The research study area included Bay County and parts of neighboring Calhoun, Gulf, Washington, Leon and Holmes counties on the panhandle of Florida. Panama City is the major metropolitan area included in the study.

    The National Oceanic and Atmospheric Administration estimated the total damage to the U.S. economy from Hurricane Michael to equal $25 billion – of that, $18.4 billion occurred in Florida.

    The researchers obtained commercial satellite images for the study area. Pre-hurricane images were from October or November 2017. Post-event imagery was obtained on cloud-free days directly after the hurricane’s impact, mostly on Oct. 13, 2018. The hurricane had made landfall on Oct. 10.

    Within the dataset the researchers used, the study area included 22,686 buildings.

    Berezina and Liu used a type of machine learning called convolutional neural networks (or CNN) to first generate building footprints from the pre-hurricane satellite imagery and then classify the amount of damage after the storm.

    Their model classified buildings as undamaged, minor damage, major damage or destroyed.

    The new model achieved an overall accuracy of 86.3%, improving upon the 75.3% accuracy of the support vector machine (SVM) model to which it was compared.

    “The SVM struggled to distinguish between minor and major damage, which can be a major issue for teams responding after a hurricane,” Liu said.

    “Overall, our results for Hurricane Michael are promising.”

    In live hurricane situations, Liu said the model could be used to rate the probability that individual buildings are in a certain damage class – such as minor damage or major damage – to help direct emergency management and first responders to where they should check first.
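    The probability-based triage Liu describes amounts to taking each building’s predicted class probabilities and ranking buildings by their chance of severe damage. A minimal sketch of that post-processing step (the probabilities below are invented; the four class labels are the ones used in the study):

```python
import numpy as np

# Hypothetical per-building class probabilities, as a classifier's final
# softmax layer might produce them, over the study's four damage classes.
classes = ["undamaged", "minor", "major", "destroyed"]
probs = np.array([
    [0.70, 0.20, 0.08, 0.02],   # building 0
    [0.05, 0.15, 0.55, 0.25],   # building 1
    [0.01, 0.04, 0.25, 0.70],   # building 2
])

# Hard classification: most probable class per building.
pred = [classes[i] for i in probs.argmax(axis=1)]

# Triage ordering: visit buildings with the highest combined probability
# of severe damage (major + destroyed) first.
severity = probs[:, 2] + probs[:, 3]
order = [int(i) for i in np.argsort(-severity)]
print(pred)    # → ['undamaged', 'major', 'destroyed']
print(order)   # → [2, 1, 0]
```

    Keeping the full probability vector, rather than only the hard label, is what lets responders prioritize borderline cases instead of treating every “major damage” call as equally certain.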

    [ad_2]

    Ohio State University

    Source link

  • Survival Is a Mixed Matter for Deadliest of Pancreatic Cancers

    Survival Is a Mixed Matter for Deadliest of Pancreatic Cancers

    [ad_1]

    Newswise — Pancreatic ductal adenocarcinoma (PDAC) is the most common and most lethal form of pancreatic cancer. The overall 5-year survival for patients with PDAC is just 7.1 percent.

    All cancers are different. A unique feature of PDAC is extensive tumor desmoplasia, fibrous connective tissue within the tumor caused by infiltration of the tumor mass by fibroblasts and the extracellular matrix they secrete. The main component of that matrix is type I collagen, or Col 1, a protein used broadly in the body to form the basic structure of bone, skin, blood vessels, and connective tissues.

    The effect of Col 1 on PDAC development and its response to therapy has been a matter of intense debate among researchers, with some arguing that Col 1 promotes tumor growth and spread and others contending that it restricts tumor growth and protects the cancer cells from immune attack.

    In a new study published October 5, 2022, in Nature, researchers settle the debate by showing that it is not the amount of Col 1 present in the tumor that matters, but its quality and nature. The co-first authors are Hua Su, PhD, a postdoctoral fellow in the lab of senior author Michael Karin, PhD, Distinguished Professor of Pharmacology and Pathology at University of California San Diego School of Medicine, and Fei Yang, PhD, a scientist working with Beicheng Sun, MD, PhD, at Nanjing University School of Medicine.

    Specifically, they report that Col 1 that has been cleaved by matrix metalloproteases (enzymes that break down matrix proteins, such as collagen) stimulates tumor growth while intact and non-cleaved Col 1 inhibits tumor growth.

    “Moreover,” said Su, “cleaved Col 1 activates a signaling pathway that stimulates energy production in pancreatic cancer cells by binding to a receptor protein called DDR1. Non-cleaved Col 1 inhibits this pathway by inducing the degradation of DDR1.”

    The research was conducted using mouse models and a novel culture system in which PDAC cells were plated on extracellular matrix containing either cleaved or non-cleaved Col 1.

    The authors said the findings have important clinical implications.

    The relative amounts of cleaved versus non-cleaved Col 1 in the human PDAC stroma or connective tissue strongly affect patient survival after surgical resection. Patients whose tumors were enriched in cleaved Col 1 and whose cancerous cells expressed high levels of DDR1 fared poorly, with most succumbing to their disease within two years of surgery.

    This patient group represented 75 percent of the 106 patients analyzed as part of the study, using cancer specimens provided by Beicheng Sun, MD, PhD, and colleagues at the Affiliated Drum Tower Hospital of Nanjing University Medical School in China.

    In contrast, the 25 percent of patients whose tumors mainly contained non-cleaved Col 1 with low levels of DDR1 expression experienced much better survival prospects.

    “This work is important because it provides a simple way for patient stratification and suggests that patients with high levels of cleaved Col 1 and DDR1 expression need more aggressive post-surgery treatments,” said Karin.

    “It also provides evidence that the most effective therapy for this group of patients should include inhibitors of DDR1 or key components of its signaling pathway whose activation results in increased number of mitochondria, the cellular power plants, in PDAC cells.”

    In addition to DDR1 inhibitors, which are not yet in clinical practice, the authors suggested another treatment option shown to be effective in PDAC-bearing mice: the U.S. Food and Drug Administration-approved antibiotic tigecycline, which can inhibit mitochondrial protein synthesis and decrease the number of energy-producing mitochondria in PDAC cells.

    Co-authors include: Rao Fu, Nanjing University Medical School; Brittney Trinh, Nina Sun, Junlai Liu, Jacopo Baglieri, Nachanok Sinchai, Jeremy Siruno, Stephen Dozier, Ajay Nair, Aveline Filliol, Sara Brin Rosenthal, Jennifer Santini, Anthony Molina, Robert F. Schwabe, Andrew M. Lowy and David Brenner, all at UC San Diego; and Avi Kumar and Christian M. Metallo, Salk Institute.

    Funding for this research came, in part, from the National Institutes of Health (grants R01CA211794, R37AI043477, P01DK098108, U01AA027681, U01CA274295), the Padres Pedal the Cause/C3, the Cancer Center Support Grant and UC San Diego School of Medicine Microscopy Core.

    # # #

    [ad_2]

    University of California San Diego

    Source link