ReportWire

Tag: Imperial College London

  • Boosting engagement in heterosexual men may reduce HIV in Uganda

    Newswise — A study looking at 15 years of HIV transmission and suppression in Uganda reveals how closing gender gaps in treatment could slash infection rates.

    Providing more heterosexual men with easy access to HIV treatment and care could help to suppress the virus and rapidly cut transmission to their female partners, shows a new study published in Nature Microbiology.

    The research, led by scientists from Imperial College London and the Rakai Health Sciences Program in Uganda, analysed 15 years of data from 2003-2018, during which the US President’s Emergency Plan For AIDS Relief (PEPFAR) has delivered an extensive programme of HIV/AIDS testing, prevention, and treatment.

    This included distributing Antiretroviral Therapy (ART) drugs, which suppress the virus so that a person is no longer infectious. The analysis shows that the PEPFAR programme and other services have greatly reduced new infections among young women and heterosexual men, but that reductions were less substantial in women aged 25 and above.

    This is thought to be because women are more likely to reach viral suppression through uptake and effective use of HIV treatment, preventing them from passing HIV to their male partners, while the same is not true the other way around.

    Gender disparity

    The analysis showed that by 2018, the proportion of women reaching and maintaining undetectable (non-transmissible) levels of HIV was 1.5 to 2 times higher than that of men across all ages. It also shows that had men reached the same levels of viral suppression as women, around half of the new infections that occurred between 2016 and 2018 could have been avoided.

    The team also reconstructed transmission networks based on the genetic code of the virus from thousands of participants, which confirmed that overall, the proportion of transmissions from men is increasing and is now at 63% of all transmissions in the area – even though a greater number of women are living with HIV than men.

    The team say the disparity could be because men need to travel for work, that clinics are closed when they are back home, or for other reasons, including social stigma.

    Dr Oliver Ratmann, senior author of the study from the Department of Mathematics at Imperial, said: “In this evolving battle against HIV, it is critical we adapt our strategies, bridge gaps in care, and ensure that individuals, regardless of their gender, have access to the lifesaving benefits of ART.

    “It is important to design services in a way that everybody who would like to use them is able and feels empowered to do so. By routinely monitoring the changing dynamics of the epidemic and striving for equity in HIV care, we can move closer to the ultimate goal of controlling and, one day, eliminating HIV transmission.”

    Dr Kate Grabowski, a co-author of the study from the Johns Hopkins School of Medicine, added: “The continued success of the President’s Emergency Plan for AIDS Relief (PEPFAR) in reducing infections and saving lives is crucial for ending HIV transmission. With the United States Congress currently evaluating PEPFAR funding, our evidence strongly supports the program’s efficacy and provides a clear roadmap to ending the pandemic through enhanced HIV treatment coverage, particularly among men.”

    Closing the gap in transmission

    The team used data from the Rakai Community Cohort Study (RCCS) in southern Uganda, a region where more than 9% of adults are living with HIV – approximately 20 times higher than in the US. Since 2003, a period predating the widespread availability of ART in Africa, RCCS has enrolled nearly 37,000 individuals, tracking changes in HIV infection as new interventions came on board.

    The analysis tracked evolving heterosexual HIV epidemic dynamics in 36 communities over a 15-year span of RCCS surveillance data, including records of new infections, deep sequence HIV genomic data, HIV treatment uptake, viral suppression, and behavioural information.

    Analyses in earlier years showed that the highest number of new HIV cases in southern Uganda was among adolescent girls and young women aged 15-24 years. In more recent years tracked in the new study, women 25-34 years old have become a new focal group, experiencing a slower decline in new infections than other age groups. This is alongside a significant difference in the declines in new infections between men and women, with those among boys and men declining much faster.

    To estimate the likely impact of getting men to the same level of viral suppression, the team applied statistical models based on the data about transmission dynamics. The resulting projections indicate that closing the viral suppression gap in men could have effectively halved rates of new infections among women and eliminated gender disparities in acquiring HIV.

    Dr Joseph Kagaayi, previous director of the Rakai Health Sciences Program and senior co-author of the study, said: “Our study findings emphasise the importance of addressing disparities in ART uptake and viral suppression between men and women. By doing so, we can not only reduce HIV infections among women but also work towards closing the gender gap in HIV transmission. Achieving these goals will require concerted efforts, informed policies, and strengthened healthcare services.”

    Imperial College London

  • Hidden way for us to feel touch uncovered by Imperial researchers

    Newswise — Previously, touch was thought to be detected only by nerve endings present within the skin and surrounding hair follicles. This new research from Imperial College London has found that cells within hair follicles – the structures that surround the hair fibre – are also able to detect the sensation in cell cultures.

    The researchers also found that these hair follicle cells release the neurotransmitters histamine and serotonin in response to touch – findings that might help us in future to understand histamine’s role in inflammatory skin diseases like eczema.

    Lead author of the paper Dr Claire Higgins, from Imperial’s Department of Bioengineering, said: “This is a surprising finding as we don’t yet know why hair follicle cells have this role in processing light touch. Since the follicle contains many sensory nerve endings, we now want to determine if the hair follicle is activating specific types of sensory nerves through an unknown but unique mechanism.”

    A touchy subject

    We feel touch using several mechanisms: sensory nerve endings in the skin detect touch and send signals to the brain; richly innervated hair follicles detect the movement of hair fibres; and sensory nerves known as C-LTMRs, which are found only in hairy skin, process emotional, or ‘feel-good’ touch.

    Now, researchers may have uncovered a new process in hair follicles. To carry out the study, the researchers analysed single cell RNA sequencing data of human skin and hair follicles and found that hair follicle cells contained a higher percentage of touch-sensitive receptors than equivalent cells in the skin. 

    They established co-cultures of human hair follicle cells and sensory nerves, then mechanically stimulated the hair follicle cells, finding that this led to activation of the adjacent sensory nerves.

    They then decided to investigate how the hair follicle cells signalled to the sensory nerves. They adapted a technique known as fast scan cyclic voltammetry to analyse cells in culture and found that the hair follicle cells were releasing the neurotransmitters serotonin and histamine in response to touch.

    When they blocked the receptor for these neurotransmitters on the sensory neurons, the neurons no longer responded to the hair follicle cell stimulation. Similarly, when they blocked synaptic vesicle production by hair follicle cells, they were no longer able to signal to the sensory nerves.

    They therefore concluded that, in response to touch, hair follicle cells release neurotransmitters that activate nearby sensory neurons.

    The researchers also conducted the same experiments with cells from the skin instead of the hair follicle. The cells responded to light touch by releasing histamine, but they didn’t release serotonin.

    Dr Higgins said: “This is interesting as histamine in the skin contributes to inflammatory skin conditions such as eczema, and it has always been presumed that immune cells release all the histamine. Our work uncovers a new role for skin cells in the release of histamine, with potential applications for eczema research.”

    The researchers note that the research was performed in cell cultures, and will need to be replicated in living organisms to confirm the findings. The researchers also want to determine if the hair follicle is activating specific types of sensory nerves. Since C-LTMRs are only present within hairy skin, they are interested to see if the hair follicle has a unique mechanism to signal to these nerves that we have yet to uncover.

    This work was funded by the Engineering and Physical Sciences Research Council (EPSRC, part of UKRI), Procter & Gamble, Wellcome Trust, and the Biotechnology and Biological Sciences Research Council (BBSRC, part of UKRI).

    Imperial College London

  • Anti-Allergy Formula Is on the Rise. Milk Allergies Might Not Be.

    This article was originally published by Undark Magazine.

    For Taylor Arnold, a registered dietitian nutritionist, feeding her second baby was not easy. At eight weeks old, he screamed when he ate and wouldn’t gain much weight. Arnold brought him to a gastroenterologist, who diagnosed him with allergic proctocolitis—an immune response to the proteins found in certain foods, which she narrowed down to cow’s milk.

    Cow’s-milk-protein allergies, or CMPA, might be on the rise—following a similar trend in other children’s food allergies—and they can upend a caregiver’s feeding plans: In many cases, a breastfeeding parent is told to eliminate dairy from their diet, or switch to a specialized hypoallergenic formula, which can be expensive.

    But although some evidence suggests that CMPA rates are climbing, the source and extent of that increase remain unclear. Some experts say that the uptick is partly because doctors are getting better at recognizing symptoms. Others claim that the condition is overdiagnosed. And among those who believe that milk-allergy rates are inflated, some suspect that the global formula industry, valued at $55 billion according to a 2022 report from the World Health Organization and UNICEF, may have an undue influence.

    Meanwhile, “no one has ever studied these kids in a systematic way,” Victoria Martin, a pediatric gastroenterologist and allergy researcher at Massachusetts General Hospital, told me. “It’s pretty unusual in disease that is this common, that has been going on for this long, that there hasn’t been more careful, controlled study.”

    This lack of clarity can leave doctors in the dark about how to diagnose the condition and leave parents with more questions than answers about how best to treat it.

    When Arnold’s son became sick with CMPA symptoms, it was “really, really stressful,” she told me. Plus, “I didn’t get a lot of support from the doctors, and that was frustrating.”

    Though the gastroenterologist recommended that she switch to formula, Arnold ultimately used a lactation consultant and gave up dairy so she could continue breastfeeding. But she said she can understand why others might not make the same choice: “A lot of moms go to formula because there’s not a lot of support for how to manage the diet.”


    Food allergies primarily come in two forms: One, called an IgE-mediated allergy, has symptoms that appear soon after ingesting a food—such as swelling, hives, or difficulty breathing—and may be confirmed by a skin-prick test. The second, which Arnold’s son was diagnosed with, is a non-IgE-mediated allergy, or food-protein-induced allergic proctocolitis, and is harder to diagnose.

    With non-IgE allergies, symptom onset doesn’t tend to happen immediately after a person eats a triggering food, and there is no definitive test to confirm a diagnosis. (Some specialists don’t like to call the condition an allergy, because it doesn’t present with classic allergy symptoms.) Instead, physicians often rely on past training, online resources, or published guidelines written by experts in the field, which list symptoms and help doctors make a treatment plan.

    Numerous such guidelines exist to help providers diagnose milk allergies, but the process is not always straightforward. “It’s a perfect storm” of vague and common symptoms and no diagnostic test, Adam Fox, a pediatric allergist and a professor at King’s College London, told me, noting that commercial interests such as formula-company marketing can also be misleading. “It’s not really a surprise that you’ve got confused patients and, frankly, a lot of very confused doctors.”

    Fox is the lead author of the International Milk Allergy in Primary Care, or iMAP, guidelines, one of many similar documents intended to help physicians diagnose CMPA. But some guidelines—including iMAP, which was known as the Milk Allergy in Primary Care Guideline until 2017—have been criticized for listing a broad range of symptoms, like colic, nonspecific rashes, and constipation, which can be common in healthy infants during the first year of their life.

    “Lots of babies cry, or they [regurgitate milk], or they get a little minor rash or something,” Michael Perkin, a pediatric allergist based in the U.K., told me. “But that doesn’t mean they’ve got a pathological process going on.”

    In a paper published online in December 2021, Perkin and colleagues found that in a food-allergy trial, nearly three-quarters of the infants’ parents reported at least two symptoms that matched the iMAP guidelines’ “mild-moderate” non-IgE-mediated cow’s-milk-allergy symptoms, such as vomiting. But another study, whose authors included Perkin and Robert Boyle, a children’s-allergy specialist at Imperial College London, reviewed the available evidence and estimated that only about 1 percent of babies have a milk allergy that has been proved by what’s called a “food challenge,” in which a person is exposed to the allergen and their reactions are monitored.

    That same study reported that as many as 14 percent of families believe their baby has a milk allergy. Another study by Boyle and colleagues showed that milk-allergy formula prescriptions increased 2.8-fold in England from 2007 to 2018. Researchers at the University of Rochester found similar trends stateside: Hypoallergenic-formula sales rose from 4.9 percent of formula sold in the U.S. in 2017 to 7.6 percent in 2019.
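    The reported trends are easy to sanity-check with basic arithmetic (the percentages and fold-increase are from the studies cited above; the derived rates are an illustrative addition, not figures from the articles):

```python
# Figures reported above; the derived rates below are illustrative.
uk_fold_increase = 2.8                     # formula prescriptions, England, 2007-2018
us_share_2017, us_share_2019 = 4.9, 7.6    # % of US formula sales that were hypoallergenic

# Relative rise in the US market share over two years:
relative_rise = us_share_2019 / us_share_2017
print(f"US hypoallergenic share rose {relative_rise:.2f}x")   # 1.55x

# Average annual growth implied by the 2.8-fold UK increase over 11 years:
annual_growth = uk_fold_increase ** (1 / 11) - 1
print(f"~{annual_growth:.1%} per year")                       # ~9.8% per year
```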

    Perkin and Boyle suspect that the formula industry has influenced diagnosis guidelines. In their 2020 report, published in JAMA Pediatrics, they found that 81 percent of authors who had worked on various physicians’ guidelines for the condition—including several for iMAP’s 2013 guidance—reported a financial conflict of interest with formula manufacturers.

    The formula industry also sends representatives and promotional materials to some pediatric clinics. One recent study found that about 85 percent of U.S. pediatricians surveyed reported a visit by a representative, some of whom sponsored meals with them.

    Formula companies “like people getting the idea that whenever a baby cries, or does a runny poo, or anything,” it might be a milk allergy, Boyle told me.

    In response to criticism that the guidelines have influenced the increase in specialized-formula sales, Fox, the lead author of the iMAP guidelines, noted that the rise began in the early 2000s. One of the first diagnosis guidelines, meanwhile, was published in 2007. He also said that the symptoms listed in the iMAP guidelines are those outlined by the U.K.’s National Institute for Health and Care Excellence and the U.S.’s National Institute of Allergy and Infectious Diseases.

    As for the conflicts of interest, Fox said: “We never made any money from this; there was never any money for the development of it. We’ve done this with best intentions. We absolutely recognize where that may not have turned out the way that we intended it; we have tried our best to address that.”

    Following backlash over close ties between the formula industry and health-care professionals, including author conflicts of interest, iMAP updated its guidelines in 2019. The new version responded directly to criticism and said the guidelines received no direct industry funding, but it acknowledged “a potential risk of unconscious bias” related to research funding, educational grants, and consultant fees. The authors noted that the new guidelines had tried to mitigate such influence through independent patient input.

    Fox also said he cut all formula ties in 2018, and led the British Society for Allergy & Clinical Immunology to do the same when he was president.

    I reached out to the Infant Nutrition Council of America, an association of some of the largest U.S. manufacturers of infant formula, multiple times but did not receive any comment in response.


    Though the guidelines have issues, Nigel Rollins, a pediatrician and researcher at the World Health Organization, told me, he sees the rise in diagnoses as driven by formula-industry marketing to parents, which can fuel the idea that fussiness or colic might be signs of a milk allergy. Parents then go to their pediatrician to talk about milk allergy, Rollins said, and “the family doctor isn’t actually well positioned to argue otherwise.”

    Rollins led much of the research in the 2022 report from the WHO and UNICEF, which surveyed more than 8,500 pregnant and postpartum people in eight countries (not including the U.S.). Of those participants, 51 percent were exposed to aggressive formula-milk marketing, which the report states “represents one of the most underappreciated risks to infants and young children’s health.”

    Amy Burris, a pediatric allergist and immunologist at the University of Rochester Medical Center, told me that there are many likely causes of overdiagnosis: “I don’t know that there’s one particular thing that stands out in my head as the reason it’s overdiagnosed.”

    Some physicians rely on their own criteria, rather than the guidelines, to diagnose non-IgE milk allergy—for instance, conducting a test that detects microscopic blood in stool. But Burris and Rollins both pointed out that healthy infants, or infants who have recently had a virus or stomach bug, can have traces of blood in their stool too.

    Martin, the allergy researcher at Massachusetts General Hospital, said the better way to confirm an infant dairy allergy is to reintroduce milk about a month after it has been eliminated: If the symptoms reappear, then the baby most likely has the allergy. The guidelines say to do this, but both Martin and Perkin told me that this almost never happens; parents can be reluctant to reintroduce a food if their baby seems better without it.

    “I wish every physician followed the guidelines right now, until we write better guidelines, because, unequivocally, what folks are doing not following the guidelines is worse,” Martin said, adding that kids are on a restricted diet for a longer time than they should be.


    Giving up potentially allergenic foods, including dairy, isn’t without consequences. “I think there’s a lot of potential risk in having moms unnecessarily avoid cow’s milk or other foods,” Burris said. “Also, you’re putting the breastfeeding relationship at risk.”

    By the time Burris sees a baby, she said, the mother has in many cases already given up breastfeeding after a primary-care provider suggested a food allergy, and “at that point, it’s too late to restimulate the supply.” It also remains an open question whether allergens in breast milk actually trigger infant allergies. According to Perkin, the amount of cow’s-milk protein that enters breast milk is “tiny.”

    For babies, Martin said, dietary elimination may affect sensitivity to other foods. She pointed to research indicating that early introduction of food allergens such as peanuts can reduce the likelihood of developing allergies.

    Martin also said that some babies with a CMPA diagnosis may not have to give up milk entirely. She led a 2020 study suggesting that even when parents don’t elect to make any dietary changes for babies with a non-IgE-mediated food-allergy diagnosis, they later report an improvement in their baby’s symptoms by taking other steps, such as acid suppression. But when parents do make changes to their baby’s diet, in Martin’s experience, if they later reintroduce milk, “the vast majority of them do fine,” she said. “I think some people would argue that maybe you had the wrong diagnosis initially. But I think the other possibility is that it’s the right diagnosis; it just turns around pretty fast.”

    Still, many parents who give up dairy or switch to a hypoallergenic formula report an improvement in their baby’s symptoms. Arnold said her son’s symptoms improved when she eliminated dairy. But when he was about eight months old, they reintroduced the food group to his diet, and he had no issues.

    Whether that’s because the cow’s-milk-protein allergy was short-lived or because his symptoms were due to something else is unclear. But Arnold sees moms self-diagnosing their baby with food allergies on social media, and believes that many are experiencing a placebo effect when they say their baby improves. “Nobody’s immune to that. Even me,” she said. “There’s absolutely a chance that that was the case with my baby.”

    Christina Szalinski

  • Gut bacteria use super-polymers to dodge antibiotics

    Newswise — The discovery shows why it can be so difficult to tackle drug-resistant bacteria, but does provide a possible avenue for tackling the problem. The super-polymer structures the bacteria use to transfer genes could also be exploited for precise drug delivery in future medicine.

    Gut bacteria form extracellular appendages called F-pili to connect to each other and transfer packets of DNA carrying genes that allow them to resist antibiotics. It was thought that the harsh conditions inside human and animal guts, including turbulence, heat, and acids, would break the F-pili, making transfer more difficult.

    However, new research by a team led by Imperial College London researchers has shown that the F-pili are actually stronger in these conditions, helping the bacteria transfer resistance genes more efficiently, and to clump into ‘biofilms’ – protective bacterial consortia – that help them fend off antibiotics.

    The results are published in Nature Communications.

    First author Jonasz Patkowski, from the Department of Life Sciences at Imperial, said: “The death toll from antimicrobial resistance is expected to match cancer by 2050, meaning we urgently need new strategies to combat this trend. Much of the spread of resistance is driven by bacteria swapping genes, so detailed understanding of this process could lead to new ways to interrupt it.”

    Not so fragile

    Different classes of bacteria use different types of pili to transfer genes in a process called conjugation. A classic experiment seemed to show that this process was fragile and could be interrupted by agitation, but this left a mystery: why do so many bacteria living in harsh conditions like guts use these systems if they are so fragile?

    The team therefore set out to test this assumption. By shaking E. coli bacteria while they used F-pili during conjugation, they discovered that agitation actually increased the efficiency of gene transfer between bacteria. They also observed that after transferring genes, the conjugated bacteria in shaken conditions clumped together more easily to form biofilms, which protect inner bacteria from the surrounding antibiotic molecules.

    To determine how the F-pili are able to do this, the team subjected them to a strength test by mounting a bacterium on a stage, connecting a glass bead using ‘molecular tweezers’ to the end of one of its F-pili, and pulling. The F-pili proved highly elastic, with spring-like properties that prevented them from breaking.
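    The "spring-like" behaviour can be pictured with a minimal Hooke's-law sketch (the linear-spring model and the stiffness value here are assumptions for illustration; the measured force response of F-pili is more complex than a simple linear spring):

```python
# Minimal linear-spring sketch of a filament under a tweezer pull.
# The stiffness and extensions are hypothetical illustration values.

def restoring_force(stiffness_pn_per_nm: float, extension_nm: float) -> float:
    """Hooke's law F = k * x, returning force in piconewtons."""
    return stiffness_pn_per_nm * extension_nm

# An elastic filament stores a pull as extension rather than snapping:
for extension in (100, 500, 1000):            # nm
    force = restoring_force(0.01, extension)  # k = 0.01 pN/nm (assumed)
    print(f"{extension} nm -> {force:.1f} pN")
```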

    They also tested the F-pili’s ability to withstand other harsh conditions, subjecting them to sodium hydroxide, urea, and temperatures as high as 100°C – all of which the F-pili survived.

    Molecular properties

    The team then went a step further, looking at the F-pili on a molecular level to see what gives them these incredible properties. They are primarily made up of F-pilin ‘subunits’ with interlinked phospholipid molecules.

    By modelling the F-pili without the phospholipids, the team showed how important these molecules are for the structure’s springiness and elastic strength. Repeating the pulling experiment revealed that the subunits quickly disassemble without the phospholipids supporting them, proving their novel role as a ‘molecular glue’ in long biopolymers.

    Lead researcher Dr Tiago Costa, from the Department of Life Sciences at Imperial, said: “Making F-pili is very costly to the bacteria in terms of resources and energy, so it’s no surprise they are worth the effort. We have shown how F-pili accelerate the spread of antibiotic resistance and biofilm formation in turbulent environments, but the challenge now is to find ways to combat this very efficient process.”

    While it would be advantageous to break F-pili in pathogenic bacteria, their properties could be helpful if we can engineer them for use in, for example, drug delivery. Patkowski explained: “It’s hard to find a tubular appendage with such strong properties. Bacteria use it to transfer genes, but if we could mimic these properties, we could use similar structures to precisely deliver drugs where they are needed in the body.”

    https://www.imperial.ac.uk/news/244513/gut-bacteria-superpolymers-dodge-antibiotics/

    Imperial College London

  • When ‘good genes’ go bad: how sexual conflict can cause population collapse

    Newswise — Males of a species evolving traits for sexual conflict can cause problems for females, and, ultimately, the whole population.

    A new model by Imperial College London and University of Lausanne researchers, published in Proceedings of the National Academy of Sciences, shows how so-called ‘good genes’ can sometimes cause a population to collapse.

    Males of any species may compete for females, either by fighting other males for access or impressing females to win their approval. In both cases, males expressing the most competitive traits – such as the best ornaments, like peacock feathers, or the best weapons, like big body size – access more females.

    To have the best traits, males must be in good condition – for example, in better shape or carrying less disease. Over time, as better-condition males mate with more females, the prevalence of ‘good genes’ increases throughout the population, leading the population as a whole to improve in condition.

    However, it can also backfire. Traits that improve a male’s competitive prowess can also damage females. For example, some insect males have evolved penises that tear the females’ insides, and in many species, including mammals, males have evolved to harass females to induce mating. These behaviours reduce female fecundity or may even kill them.

    The team’s model tested theories of sexual competition where males harm females, and compared the results with data from various population experiments. Previous experiments have given conflicting accounts as to whether sexual selection is positive or negative for the population as a whole. The new model provides an explanation for why some experiments show male condition improving without female fitness or population viability improving alongside.

    First author Dr Ewan Flintham, from Imperial College London and the University of Lausanne, said: “Where males evolve selfish traits that help them individually win, they can actually end up causing the population to crash – it’s a form of evolutionary suicide. Even when females evolve to counter male harm and prevent population collapse, the population still decreases significantly, reducing its viability.”

    Sexual interactions like these are an important component of understanding population demographics and conservation. For example, where there are more males, sexual competition intensifies, meaning harm towards females is more likely. This is also true in human-managed populations, for example domestic carp, where males and females must be isolated during spawning season.

    Dr Flintham completed the research as part of the Centre for Doctoral Training in Quantitative and Modelling Skills in Ecology and Evolution at Imperial. His project supervisor and study co-author Professor Vincent Savolainen, Director of the Georgina Mace Centre for the Living Planet at Imperial, said: “Male harm evolved in nature as something that was supposed to be good, but is detrimental to females and the whole population. Questions like how and why this happens can only be answered with quantitative methods – data and mathematical models – which can be just as important as field studies.”

    Imperial College London

  • We Have a Mink Problem

    Bird flu, at this point, is something of a misnomer. The virus, which primarily infects birds, is circulating uncontrolled around much of the world, devastating not just birds but wide swaths of the animal kingdom. Foxes, bobcats, and pigs have fallen ill. Grizzly bears have gone blind. Sea creatures, including seals and sea lions, have died in great numbers.

    But none of the sickened animals has raised as much concern as mink. In October, a bird-flu outbreak erupted at a Spanish mink farm, killing thousands of the animals before the rest were culled. It later became clear that the virus had spread between the animals, picking up a mutation that helped it thrive in mammals. It was likely the first time that mammal-to-mammal spread drove a huge outbreak of bird flu. Because mink are known to spread certain viruses to humans, the fear was that the disease could jump from mink to people. No humans got sick from the outbreak in Spain, but other infections have spread from mink to humans before: In 2020, COVID outbreaks on Danish mink farms led to new mink-related variants that spread to a small number of humans.

    As mammals ourselves, we have good reason to be concerned. Outbreaks on crowded mink farms are an ideal scenario for bird flu to mutate. If, in doing so, it picks up the ability to spread between humans, it could potentially start another global pandemic. “There are many reasons to be concerned about mink,” Tom Peacock, a flu researcher at Imperial College London, told me. Right now, mink are a problem we can’t afford to ignore.

    For two animals with very different body types, mink and humans have some unusual similarities. Research suggests that we share similar receptors for COVID, bird flu, and human flu, through which these viruses can gain entry into our bodies. The numerous COVID outbreaks on mink farms during the early pandemic, and the bird-flu outbreak in Spain, gravely illustrate this point. It’s “not surprising” that mink can get these respiratory diseases, James Lowe, a veterinary-medicine professor at the University of Illinois at Urbana-Champaign, told me. Mink are closely related to ferrets, which are so well known for their susceptibility to human flu that they’re the go-to model for flu research.

    Mink wouldn’t get sick as often, and wouldn’t be as big an issue for humans, if we didn’t keep farming them for fur in the perfect conditions for outbreaks. Many barns used to raise mink are partially open-air, making it easy for infected wild birds to come in contact with the animals, sharing not only air but potentially food. Mink farms are also notoriously cramped: The Spanish farm, for example, kept tens of thousands of mink in about 30 barns. Viral transmission would be all but guaranteed in those conditions alone, and the animals are also especially vulnerable. Because mink are normally solitary creatures, they face significant stress in packed barns, which may further predispose them to disease, Angela Bosco-Lauth, a biomedical-sciences professor at Colorado State University, told me. And because they’re often inbred so their coats look alike, an entire population may share a similar genetic susceptibility to disease. The frequency of outbreaks among mink, Bosco-Lauth said, “may actually have less to do with the animals and more to do with the fact that we raise them in the same way … we would an intensive cattle farm or chickens.”

    So far, there’s no evidence that mink from the Spanish farm spread bird flu to humans: None of the workers tested positive for the virus, and since then, no other mink farms have reported outbreaks. “We’re just not very susceptible” to bird flu, Lowe said. Our bird-flu receptors are tucked deep in our lungs, so when we’re exposed, most of the virus gets caught in the nose, throat, and other parts of the upper respiratory tract. This is why bird-flu infection is rare in people but is often pneumonia-level severe when it does happen. Indeed, a few humans have gotten sick and died in the 27 years that the current strain of bird flu, known as H5N1, has circulated. This month, a girl in Cambodia died from the virus after potentially encountering a sick bird. The more virus circulating in an environment, the higher the chances a person will get infected. “It’s a dose thing,” Lowe said.

    But our susceptibility to bird flu could change. Another mink outbreak would give the virus more opportunities to keep mutating. The worry is that this could create a new variant that’s better at binding to the human flu receptors in our upper respiratory tract, Stephanie Seifert, a professor at Washington State University who studies zoonotic pathogens, told me. If the virus gains the ability to infect the nose and throat, Peacock, at Imperial College London, said, it would be better at spreading. Those mutations “would worry us the most.” Fortunately, the mutations that arose on the Spanish mink farm “were not as bad as many of us worried about,” he added, “but that doesn’t mean that the next time this happens, this will also be the case.”

    Because mink carry the receptors for both bird flu and human flu, they could serve as “mixing vessels” for the viruses to combine, researchers wrote in 2021. (Ferrets, pigs, and humans share this quality too.) Through a process called reassortment, flu viruses can swap segments of their genome, resulting in a kind of Frankenstein pathogen. Although viruses remixed in this way aren’t necessarily more dangerous, they could be, and that’s not a risk worth taking. “The previous three influenza pandemics all arose due to mixing between avian and human influenza viruses,” Peacock said.

    While there are good reasons to be concerned about mink, it is hard to gauge just how concerned we should be—especially given what we still don’t know about this changing virus. After the death of the young girl in Cambodia, the World Health Organization called the global bird flu situation “worrying,” while the CDC maintains that the risk to the public is low. Lowe said it’s “certainly not very risky” that bird flu will spill over into humans, but that it is worth keeping an eye on. H5N1 bird flu is not new, he added, and it hasn’t affected people en masse yet. But the virus has already changed in ways that make it better at infecting wild birds, and as it spreads in the wild, it may continue to change to better infect mammals, including humans. “We don’t understand enough to make strong predictions of public-health risk,” Jonathan Runstadler, an infectious-diseases professor at Tufts University, told me.

    As bird flu continues to spread among birds and in domestic and wild animal populations, it will only become harder to control. The virus, formerly seasonal, is already present year-round in parts of Europe and Asia, and it is poised to become so in the Americas. Breaking the chain of transmission is vital to preventing another pandemic. An important step is to avoid situations where humans, mink, or any other animal could be infected with both human and bird flu at the same time.

    Since the COVID outbreaks, mink farms have generally beefed up their biosecurity: Farm workers are often required to wear masks and protective gear, such as disposable overalls. To limit the risk to mink—and other susceptible hosts—farms need to reduce their size and density, reduce contact between mink and wild birds, and monitor the virus, Runstadler said. Some nations, including Mexico and Ecuador, have recently embraced bird-flu vaccines for poultry in light of the outbreaks. H5N1 vaccines exist for humans too, though they aren’t readily available. Still, one of the most obvious options is to shut mink farms down. “We probably should have done that after SARS-CoV-2,” Bosco-Lauth, at Colorado State, said. Doing so is controversial, however, because the global mink industry is valuable, with a huge market in China. Denmark, which produces up to 40 percent of the world’s mink pelts, temporarily banned mink breeding in 2020 after a spate of COVID outbreaks, but the ban expired last month, and farms are returning, albeit in a limited capacity.

    But mink are far from the only animal that poses a bird-flu risk to humans. “Frankly, with what we’re seeing with other wildlife species, there really aren’t any mammals that I would discount at this point in time,” Bosco-Lauth said. Any mammal species repeatedly infected by the virus is a potential risk, including marine mammals, such as seals. But we should be most concerned about the ones humans frequently come into close contact with, especially animals that are raised in high density, such as pigs, Runstadler said. This doesn’t pose just a human public-health concern, he said, but the potential for “ecological disruption.” Bird flu can be a devastating disease for wildlife, killing animals swiftly and without mercy.

    Whether or not bird flu makes the jump into humans, it won’t be the last virus that threatens us—or mink. The era we live in has become known as the “Pandemicene,” as my colleague Ed Yong has called it, one defined by the regular spillover of viruses into humans, caused by our disruption of the normal trajectories of viral movement in nature. Mink may never pass bird flu to us. But that doesn’t mean they won’t be a risk the next time a novel influenza or coronavirus comes around. Doing nothing about mink essentially means choosing luck as a public-health strategy. Sooner or later, it will run out.

    Yasmin Tayag

  • Meteorites reveal likely origin of Earth’s volatile chemicals

    Newswise — By analysing meteorites, Imperial researchers have uncovered the likely far-flung origin of Earth’s volatile chemicals, some of which form the building blocks of life. 

    They found that around half the Earth’s inventory of the volatile element zinc came from asteroids originating in the outer Solar System – the part beyond the asteroid belt that includes the planets Jupiter, Saturn, and Uranus. This material is also expected to have supplied other important volatiles such as water. 

    Volatiles are elements or compounds that change from solid or liquid state into vapour at relatively low temperatures. They include the six most common elements found in living organisms, as well as water. As such, the addition of this material will have been important for the emergence of life on Earth. 

    Prior to this, researchers thought that most of Earth’s volatiles came from asteroids that formed closer to the Earth. The findings reveal important clues about how Earth came to harbour the special conditions needed to sustain life. 

    Senior author Professor Mark Rehkämper, of Imperial College London’s Department of Earth Science and Engineering, said: “Our data show that about half of Earth’s zinc inventory was delivered by material from the outer Solar System, beyond the orbit of Jupiter. Based on current models of early Solar System development, this was completely unexpected.” 

    Previous research suggested that the Earth formed almost exclusively from inner Solar System material, which researchers inferred was the predominant source of Earth’s volatile chemicals. In contrast, the new findings suggest the outer Solar System played a bigger role than previously thought. 

    Professor Rehkämper added: “This contribution of outer Solar System material played a vital role in establishing the Earth’s inventory of volatile chemicals. It looks as though without the contribution of outer Solar System material, the Earth would have far fewer volatiles than it does today – making it drier and potentially unable to nourish and sustain life.”

    The findings are published today in Science.

    To carry out the study, the researchers examined 18 meteorites of varying origins – eleven from the inner Solar System, known as non-carbonaceous meteorites, and seven from the outer Solar System, known as carbonaceous meteorites.  

    For each meteorite they measured the relative abundances of the five different forms – or isotopes – of zinc. They then compared each isotopic fingerprint with Earth samples to estimate how much each of these materials contributed to the Earth’s zinc inventory. The results suggest that while the Earth only incorporated about ten per cent of its mass from carbonaceous bodies, this material supplied about half of Earth’s zinc. 
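    The logic behind that estimate can be sketched as a simple two-endmember mixing calculation. The sketch below is illustrative only – the δ66Zn values are invented placeholders, not the study’s measurements, and the real analysis uses the abundances of five isotopes with full error treatment:

```python
def mixing_fraction(delta_earth, delta_nc, delta_cc):
    """Fraction of Earth's zinc supplied by the carbonaceous (outer
    Solar System) endmember, assuming simple linear mixing of isotope
    compositions weighted by zinc mass."""
    return (delta_earth - delta_nc) / (delta_cc - delta_nc)

# Illustrative (invented) delta-66Zn values, in per mil:
f_cc = mixing_fraction(delta_earth=0.28, delta_nc=0.15, delta_cc=0.41)
print(f"{f_cc:.0%} of Earth's zinc from carbonaceous material")
```

    Folding in the zinc concentrations of each meteorite class is what lets a roughly ten per cent contribution by mass translate into about half of the zinc budget, because carbonaceous bodies are richer in volatile elements.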

    The researchers say that material with a high concentration of zinc and other volatile constituents is also likely to be relatively abundant in water, giving clues about the origin of Earth’s water. 

    First author on the paper Rayssa Martins, PhD candidate at the Department of Earth Science and Engineering, said: “We’ve long known that some carbonaceous material was added to the Earth, but our findings suggest that this material played a key role in establishing our budget of volatile elements, some of which are essential for life to flourish.” 

    Next the researchers will analyse rocks from Mars, which harboured water 4.1 to 3 billion years ago before drying up, and the Moon. Professor Rehkämper said: “The widely held theory is that the Moon formed when a huge asteroid smashed into an embryonic Earth about 4.5 billion years ago. Analysing zinc isotopes in moon rocks will help us to test this hypothesis and determine whether the colliding asteroid played an important part in delivering volatiles, including water, to the Earth.” 

    This work was funded by the Science and Technology Facilities Council (STFC – part of UKRI) and Rayssa Martins is funded by an Imperial College London President’s PhD Scholarship.

    Imperial College London

  • Scars mended using transplanted hair follicles in Imperial College London study

    Newswise — In a new study involving three volunteers, skin scars began to behave more like uninjured skin after they were treated with hair follicle transplants. The scarred skin harboured new cells and blood vessels, remodelled collagen to restore healthy patterns, and even expressed genes found in healthy unscarred skin.  

    The findings could lead to better treatments for scarring both on the skin and inside the body, leading to hope for patients with extensive scarring, which can impair organ function and cause disability.  

    Lead author Dr Claire Higgins, of Imperial’s Department of Bioengineering, said: “After scarring, the skin never truly regains its pre-wound functions, and until now all efforts to remodel scars have yielded poor results. Our findings lay the foundation for exciting new therapies that can rejuvenate even mature scars and restore the function of healthy skin.” 

    The research is published today in npj Regenerative Medicine.

    Hope in hair 

    Scar tissue in the skin lacks hair, sweat glands, blood vessels and nerves, which are vital for regulating body temperature and detecting pain and other sensations. Scarring can also impair movement as well as potentially causing discomfort and emotional distress. 

    Compared to scar tissue, healthy skin undergoes constant remodelling by the hair follicle. Hairy skin heals faster and scars less than non-hairy skin – and hair transplants had previously been shown to aid wound healing. Inspired by this, the researchers hypothesised that transplanting growing hair follicles into scar tissue might induce scars to remodel themselves.

    To test their hypothesis, Imperial researchers worked with Dr Francisco Jiménez, lead hair transplant surgeon at the Mediteknia Clinic and Associate Research Professor at University Fernando Pessoa Canarias, in Gran Canaria, Spain. They transplanted hair follicles into the mature scars on the scalp of three participants in 2017. The researchers selected the most common type of scar, called normotrophic scars, which usually form after surgery. 

    They took 3mm-thick biopsies of the scars and imaged them under the microscope just before transplantation, and then again at two, four, and six months afterwards.

    The researchers found that the follicles inspired profound architectural and genetic shifts in the scars towards a profile of healthy, uninjured skin. 

    Dr Jiménez said: “Around 100 million people per year acquire scars in high-income countries alone, primarily as a result of surgeries. The global incidence of scars is much higher and includes extensive scarring formed after burn and traumatic injuries. Our work opens new avenues for treating scars and could even change our approach to preventing them.” 

    Architects of skin 

    After transplantation, the follicles continued to produce hair and induced restoration across skin layers. 

    Scarring causes the outermost layer of skin – the epidermis – to thin out, leaving it vulnerable to tears. At six months post-transplant, the epidermis had doubled in thickness alongside increased cell growth, bringing it to around the same thickness as uninjured skin.  

    The next skin layer down, the dermis, is populated with connective tissue, blood vessels, sweat glands, nerves, and hair follicles. Scar maturation leaves the dermis with fewer cells and blood vessels, but after transplantation the number of cells had doubled at six months, and the number of vessels had reached nearly healthy-skin levels by four months. This demonstrated that the follicles inspired the growth of new cells and blood vessels in the scars, which are unable to do this unaided. 

    Scarring also increases the density of collagen fibres – a major structural protein in skin – which causes them to align such that scar tissue is stiffer than healthy tissue. The hair transplants reduced the density of the fibres, allowing them to form a healthier ‘basket weave’ pattern that reduced stiffness – a key factor in tears and discomfort.

    The authors also found that after transplantation, the scars expressed 719 genes differently to before. Genes that promote cell and blood vessel growth were expressed more, while genes that promote scar-forming processes were expressed less. 

    Multi-pronged approach 

    The researchers are unsure precisely how the transplants facilitated such a change. In their study, the presence of a hair follicle in the scar was cosmetically acceptable as the scars were on the scalp. They are now working to uncover the underlying mechanisms so they can develop therapies that remodel scar tissue towards healthy skin, without requiring transplantation of a hair follicle and growth of a hair fibre. They can then test their findings on non-hairy skin, or on organs like the heart, which can suffer scarring after heart attacks, and the liver, which can suffer scarring through fatty liver disease and cirrhosis. 

    Dr Higgins said: “This work has obvious applications in restoring people’s confidence, but our approach goes beyond the cosmetic as scar tissue can cause problems in all our organs. 

    “While current treatments for scars like growth factors focus on single contributors to scarring, our new approach tackles multiple aspects, as the hair follicle likely delivers multiple growth factors all at once that remodel scar tissue. This lends further support to the use of treatments like hair transplantation that alter the very architecture and genetic expression of scars to restore function.” 

    This work was funded by the Medical Research Council and Engineering and Physical Sciences Research Council (part of UKRI). 

    Imperial College London

  • Recycled gold from SIM cards could help make drugs more sustainable

    Newswise — Researchers have used gold extracted from electronic waste as catalysts for reactions that could be applied to making medicines.

    Re-using gold from electronic waste prevents it from being lost to landfill, and using this reclaimed gold for drug manufacture reduces the need to mine new materials. Current catalysts are often made of rare metals, which are extracted using expensive, energy-intensive and damaging mining processes.

    The method for extracting gold was developed by researchers at the University of Cagliari in Italy and the process for using the recovered gold was developed by researchers at Imperial College London. The study is published in ACS Sustainable Chemistry & Engineering.

    Waste electrical and electronic equipment (WEEE) is typically sent to landfill, as separating and extracting the components requires a lot of energy and harsh chemicals, undermining its economic viability. However, WEEE contains a wealth of metals that could be used in a range of new products.

    Finding ways to recover and use these metals in a low-cost, low-energy and non-toxic way is therefore crucial for making our use of electronic goods more sustainable.

    Lead researcher Professor James Wilton-Ely, from the Department of Chemistry at Imperial, said: “It is shocking that most of our electronic waste goes to landfill and this is the opposite of what we should be doing to curate our precious elemental resources. Our approach aims to reduce the waste already within our communities and make it a valuable resource for new catalysts, thereby also reducing our dependence on environmentally damaging mining practices.”

    “We are currently paying to get rid of electronic waste, but processes like ours can help reframe this ‘waste’ as a resource. Even SIM cards, which we routinely discard, have a value and can be used to reduce reliance on mining and this approach has the potential to improve the sustainability of processes such as drug manufacture.”

    Professors Angela Serpe and Paola Deplano, from the University of Cagliari, developed a low-cost way to extract gold and other valued metals from electronic waste such as printed circuit boards (PCBs), SIM cards and printer cartridges under mild conditions. This patented process involves selective steps for the sustainable leaching and recovery of base metals like nickel, then copper, silver and, finally, gold, using green and safe reagents.

    However, the gold produced from this process is part of a molecular compound and so cannot be re-used for electronics without investing a lot more energy to obtain the gold metal. Seeking a use for this compound of recovered gold, the team of Professor Wilton-Ely and his colleague, Professor Chris Braddock, investigated whether it could be applied as a catalyst in the manufacture of useful compounds, including pharmaceutical intermediates.

    Catalysts are used to increase the rate of a chemical reaction while remaining unchanged and are used in most processes to produce materials. The team tested the gold compound in a number of reactions commonly used in pharmaceutical manufacture, for example for making anti-inflammatory and pain-relief drugs.

    They found that the gold compound performed as well, or better, than the currently used catalysts, and is also reusable, further improving its sustainability.

    The researchers suggest that making it economically viable to recover gold from electronic waste could create spin-off uses for other components recovered in the process. For example, in the process, copper and nickel are also separated out, as is the plastic itself, with all these components potentially being used in new products.

    Sean McCarthy, the PhD student leading the research in the lab at Imperial, said: “By weight, a computer contains far more precious metals than mined ore, providing a concentrated source of these metals in an ‘urban mine’.”

    Professor Serpe said: “Research like ours aims to contribute to the cost-effective and sustainable recovery of metals by building a bridge between the supply of precious metals from scrap and industrial demand, bypassing the use of virgin raw materials.”

    The teams are working to extend this approach to the recovery and re-use of the palladium content of end-of-life automotive catalytic converters. This is particularly pressing as palladium is widely used in catalysis and is even more expensive than gold.

    Imperial College London

  • Sea level rise to dramatically speed up erosion of rock coastlines by 2100

    Newswise — Rock coasts, which make up over half the world’s coastlines, could retreat more rapidly in the future due to accelerating sea level rise. 

    This is according to new Imperial College London research that modelled likely future cliff retreat rates of two rock coasts in the UK. The forecasts are based on predictions of sea level rise for various greenhouse gas emissions and climate change scenarios.  

    The study found that rock coasts, traditionally thought of as stable compared to sandy coasts and soft cliffs, are likely to retreat at a rate not seen for 3,000-5,000 years.  

    At the UK study sites in Yorkshire and Devon, this will cause rock coast cliffs to retreat by at least 10-22 metres inland. The rate of erosion is likely between three and seven times today’s rate and potentially up to tenfold. 

    Senior author Dr Dylan Rood, of Imperial’s Department of Earth Science and Engineering, said: “Coastal erosion is one of the greatest financial risks to society of any natural hazard. Some rock cliffs are already crumbling, and within the next century, rock coast erosion rates could increase tenfold. Even rock coasts that have been stable in the last hundred years will likely respond to sea level rise by 2030.”

    Globally, coasts are home to hundreds of millions of people and hundreds of billions of dollars of infrastructure like homes, businesses, nuclear power stations, transport links, and agriculture.  

    The researchers are calling on policymakers, planners, and insurers to take action to classify rock coasts as high-risk areas in future planning for climate change response, as well as to limit climate change through achieving Net Zero as an immediate priority.  

    Dr Rood added: “Rock coast erosion is irreversible: now is the time to limit future sea level rise before it’s too late. Humanity can directly control the fate of our coastlines by reducing greenhouse gas emissions — the future of our coasts is in our hands.” 

    The research is published today in Nature Communications. 

    A rocky road 

    The new study is the first to validate models of the expected erosion of hard rock coasts from sea level rise using observational data over prehistoric timescales. Previous studies have mostly focused on theoretical models of soft, sandy coasts. The new results suggest that as sea levels continue to rise, the rate of rock coastal erosion will also accelerate. 

    To study the future rate of erosion, the researchers looked at past and present cliff retreat rates on the coastlines near Scalby in Yorkshire and Bideford in Devon, finding that by 2100 they will likely retreat by 13-22m and 10-14m, respectively.  

    They collected rock samples and analysed them for rare isotopes called cosmogenic radionuclides (CRNs) that build up in rocks exposed to cosmic rays. Concentrations of CRNs in rock reveal how quickly, and for how long, the rock has been exposed, reflecting the rate of erosion and retreat. 
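    The standard first-order relationship behind this technique balances nuclide production against radioactive decay and removal by erosion, N = P0 / (λ + ρε/Λ). The sketch below inverts that textbook balance for the erosion rate; it is a generic illustration, not the study’s calibrated coastal model, and the values (10Be half-life, a typical attenuation length and rock density, placeholder sample numbers) are assumptions:

```python
import math

def erosion_rate_cm_per_yr(N, P0, half_life_yr=1.387e6,
                           attenuation_g_cm2=160.0, rho_g_cm3=2.6):
    """Invert the steady-state CRN balance N = P0 / (lam + rho*eps/Lam)
    for the surface erosion rate eps (cm/yr).
    N  : measured nuclide concentration (atoms/g)
    P0 : local surface production rate (atoms/g/yr)
    Defaults: 10Be half-life, typical attenuation length, rock density.
    """
    lam = math.log(2) / half_life_yr          # decay constant, 1/yr
    return (P0 / N - lam) * attenuation_g_cm2 / rho_g_cm3

# Placeholder values for a slowly eroding shore platform surface:
print(erosion_rate_cm_per_yr(N=1e5, P0=5.0))
```

    Lower measured concentrations imply faster erosion: the surface is stripped away before many nuclides can accumulate, which is how CRN data constrain long-term retreat rates.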

    They combined these data with observed coastal topography to calibrate a model that tracks the evolution of these rock coasts over time, before comparing them with rates of past sea level change dating back 8,000 years. They found that the rate of coastal erosion on these two sites has closely matched the rate of sea level rise.

    The researchers say this is clear evidence of a causal relationship between cliff retreat and sea level from which future forecasts can be made, and that rock coasts are more sensitive to sea level rise than previously thought. The findings, they say, could be applied to rock coasts worldwide because the rock type is common globally, and similar hard rock coasts are likely to respond in a similar way to sea level rise. 

    Lead author Dr Jennifer Shadrick, who conducted the work in Imperial’s Department of Earth Science and Engineering as a member of the NERC Science & Solutions for a Changing Planet Doctoral Training Partnership, and now works in the marine and coastal risk management team at JBA Consulting, said: “Sea level rise is accelerating, and our results confirm that rock coast retreat will accelerate in line with this. It isn’t a matter of if, but when. 

    “The more positive news is that, now that we have a better idea of magnitudes and timescales, we can adapt accordingly. The more data we have on the effects of climate change on sea level rise and coastal erosion, the more we can prepare by championing urgent policies that protect coasts and their communities.” 

    Sea level rise 

    As the climate warms, sea levels are forecast to rise one metre by 2100 unless greenhouse gas emissions are reduced. 

    This study is the first to confirm with observational data that the rate of past coastal erosion followed the rate of sea level rise over prehistoric timescales. The researchers say this erosion was driven by waves, which will likely get larger and more forceful as future sea level rises, and more land is given over to the sea. 

    While this study looked at the effects of sea level rise, it did not account for the effects of stronger storms, which some studies forecast will happen more frequently due to climate change. Next, the researchers will adapt their model to also forecast the rate of cliff retreat for softer rock coasts, such as chalk. 

    Dr Rood said: “Our study did not account for the effect of increased storms, which may become stronger and more frequent in the future as the climate changes, on wave-driven cliff erosion. However, increased storms would only speed up the cliff retreat even more than our forecasts. This is another angle to the climate crisis we will account for in future studies to give a more complete picture of likely rates of rock coast erosion. We are also looking to improve our models for softer rock coasts where erosion other than by waves is more important.” 

    Dr Shadrick said: “The findings are a stark warning that we must better adapt to coastal retreat or face the loss of the people, homes, and infrastructure that call coastal areas home.” 

    Study co-author Dr Martin Hurst at the University of Glasgow said: “The implication is that rock coasts are more sensitive to sea level rise than previously thought. We need to pay more attention to how our rock coasts continue to erode as sea levels rise. 

    “Heightened erosion risks at our coasts will continue throughout this century. Even if we achieve Net Zero tomorrow, a substantive amount of sea level rise is already baked in as our climate, glaciers and oceans continue to respond to the emissions that have already taken place.”

    This study was funded by the Natural Environmental Research Council (NERC), the British Geological Survey (BGS), and the Australian Nuclear Science and Technology Organisation (ANSTO). 

    Imperial College London

  • Scientists uncover potential ‘electrical language’ of breast cancer cells

    Newswise — New research has found variable voltages in the membranes of breast cancer cells, revealing clues about how they grow and spread.

    The research, led by Imperial College London and The Institute of Cancer Research, London, could help us better understand how cancer cells ‘decide’ when to multiply and where to spread to.

    When cells become cancerous, they undergo a series of bioelectric changes. For example, the layer surrounding cells, called the cell membrane, becomes more positively charged than healthy cell membranes.

    This new research, published today in Communications Biology, found that as well as the membrane voltage being higher than in healthy cells, it also fluctuates over time – with breast cancer cells behaving much like neurons. The researchers believe this could indicate an electrical communications network between cancer cells that could in future be a target for disruption, creating possible new treatments.

    Co-lead author Dr Amanda Foust, from Imperial’s Department of Bioengineering, said: “When healthy cells become cancerous, the changes they undergo can help them to grow and spread. We know, for example, that certain genes that control cell multiplication can switch off, causing uncontrolled cell growth.

    “We don’t yet know why the voltage of membranes fluctuates in cancer cells – but our discovery and technology, enabled by the exciting collaboration of engineers and biologists, opens doors to further work that could help us better understand cancer signalling networks and growth.”

    Testing the network

    To test the voltages, the researchers grew cells from eight breast cancer cell lines and one healthy breast cell line. They then recorded the voltages of their cell membranes with a microscope originally engineered to film electrical activity in brain cells, before using machine learning to categorise and characterise the signals.
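    As a loose illustration of how such recordings might be summarised before classification – this is a toy sketch with invented features, not the authors’ machine-learning pipeline – each voltage trace could be reduced to a fluctuation amplitude and a crossing rate:

```python
import math

def characterise(trace, dt=0.1):
    """Toy features for a membrane-voltage trace: standard deviation
    (fluctuation amplitude) and mean-level crossing rate (a crude
    proxy for how often the signal 'blinks')."""
    n = len(trace)
    mean = sum(trace) / n
    std = math.sqrt(sum((v - mean) ** 2 for v in trace) / n)
    crossings = sum(1 for a, b in zip(trace, trace[1:])
                    if (a - mean) * (b - mean) < 0)
    return std, crossings / ((n - 1) * dt)   # amplitude, crossings/sec

# A steady "healthy-like" trace versus an oscillating "cancer-like" one:
flat = [-70.0] * 100                         # mV, no fluctuation
wave = [-40.0 + 10 * math.sin(math.pi * i / 10) for i in range(100)]
print(characterise(flat))                    # zero amplitude and rate
print(characterise(wave))
```

    Feature vectors like these (the real study would use far richer descriptors) are what a classifier can then group into the distinct signal categories the team observed.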

    Unexpectedly, they found fluctuations in the voltage of the cancer cell membranes. Though more research is needed, the researchers suspect the ‘blinking’ and ‘waving’ electrical signals might be a form of communication between cells.

    They then added tetrodotoxin, a potent neurotoxin that blocks the sodium channels that generate electrical signals in nerve cells. Previous studies had shown that cancer cells rely on these sodium channels to become more invasive.

    They found that, similarly to its effect on nerve cells, tetrodotoxin suppressed the voltage fluctuations in cancer cells. The researchers say this could potentially indicate new treatment avenues for blocking cancer cell communication and behaviour.

    Co-lead author Professor Chris Bakal, Professor of Cancer Morphodynamics at The Institute of Cancer Research, London, said: “This is the first time we’ve observed such rapid fluctuations in electrical activity within breast cancer cells. It looks like breast cancer cells have established a type of electrical language. We still don’t know how complex the language is, but it could allow cancer cells to relay information about nearby nutrients or hostile environments across large distances, and ultimately promote tumour survival.”

    To further test their findings, they induced cancer in the healthy cell line and then recorded the cells again. Once the cells had become cancerous, the voltage of their membranes also began to fluctuate.

    The level of electrical signals varied across cancer types. The more aggressive and untreatable cancer cell lines featured more frequent fluctuations, with signals sometimes appearing as a wave traveling from cell to cell.

    Co-author Emeritus Professor Mustafa Djamgoz at Imperial’s Department of Life Sciences said: “Of all the cells in the body, we usually associate ‘excitable’ brain or heart cells with electrical activity. Our research suggests a hidden electrical signalling network among cancer cells that might play a key role in cancer cell behaviour including communication with each other and other cells within the tumour. We know already that the spreading of cancer, the main cause of death from cancer, is facilitated by electrical activity.”

    Professor Bakal added: “We think these networks may even allow cancer cells to form brain-like structures that allow cancer cells to act together as a single machine, rather than as individual units.”

    Connecting the clues

    The researchers are now working to identify and unpick the potential links between cell membrane voltage and the behaviour of cancer cells, to see whether those links can be severed. Professor Bakal said: “If you can stop cancer cells communicating with one another, they could become easier to treat. It’s not so dissimilar from a war. If you can stop a commander from relaying information to soldiers at the front, the battle becomes easier to win.”

    Dr Foust said: “We are now investigating the role of voltage in cancer cell behaviour. Do cancer cells clone themselves and multiply as their voltage fluctuates in a certain pattern, or break off to invade other body parts? Can we use this knowledge to intervene at a particular stage of fluctuation and prevent cancer spread? These are key questions we hope to answer with our ongoing work.”

    This study was funded by the Integrated Biological Imaging Network, the Royal Academy of Engineering, the Biotechnology and Biological Sciences Research Council (BBSRC, part of UKRI), the Wellcome Trust, the Engineering and Physical Sciences Research Council (EPSRC, part of UKRI), Cancer Research UK and Stand Up To Cancer UK.

    Imperial College London

  • New Flexible, Steerable Device Placed in Live Brains by Minimally Invasive Robot


    Newswise — The early-stage research tested the delivery and safety of the new implantable catheter design in two sheep to determine its potential for use in diagnosing and treating diseases in the brain.  

    If proven effective and safe for use in people, the platform could simplify and reduce the risks associated with diagnosing and treating disease in the deep, delicate recesses of the brain.   

    It could help surgeons to see deeper into the brain to diagnose disease, deliver treatments such as drugs and laser ablation more precisely to tumours, and better deploy electrodes for deep brain stimulation in conditions such as Parkinson’s and epilepsy.

    Senior author Professor Ferdinando Rodriguez y Baena, of Imperial’s Department of Mechanical Engineering, led the European effort and said: “The brain is a fragile, complex web of tightly packed nerve cells that each have their part to play. When disease arises, we want to be able to navigate this delicate environment to precisely target those areas without harming healthy cells.  

    “Our new precise, minimally invasive platform improves on currently available technology and could enhance our ability to safely and effectively diagnose and treat diseases in people, if proven to be safe and effective.” 

    Developed as part of the Enhanced Delivery Ecosystem for Neurosurgery in 2020 (EDEN2020) project, the findings are published in PLOS ONE. 

    Stealth Surgery  

    The platform improves on existing minimally invasive, or ‘keyhole’, surgery, where surgeons deploy tiny cameras and catheters through small incisions in the body.   

    It includes a soft, flexible catheter to avoid damaging brain tissue while delivering treatment, and an artificial intelligence (AI)-enabled robotic arm to help surgeons navigate the catheter through brain tissue.   

    Inspired by the organs used by parasitic wasps to stealthily lay eggs in tree bark, the catheter consists of four interlocking segments that slide over one another to allow for flexible navigation. 

    It connects to a robotic platform that combines human input and machine learning to carefully steer the catheter to the disease site. Surgeons then feed optical fibres through the catheter so they can see the tip and steer it through brain tissue via joystick control.

    The AI platform learns from the surgeon’s input and contact forces within brain tissues to guide the catheter with pinpoint accuracy. 

    Compared to traditional ‘open’ surgical techniques, the new approach could eventually help to reduce tissue damage during surgery, and improve patient recovery times and length of post-operative hospital stays. 

    During minimally invasive brain surgery, surgeons use deeply penetrating catheters to diagnose and treat disease. Currently available catheters, however, are rigid, and that inflexibility, combined with the intricate, delicate structure of the brain, makes them difficult to place precisely without robotic navigational aids, which adds risk to this type of surgery.

    To test their platform, the researchers deployed the catheter in the brains of two live sheep at the University of Milan’s Veterinary Medicine Campus. The sheep were given pain relief and monitored around the clock for a week for signs of pain or distress before being euthanised so that the researchers could examine the structural impact of the catheter on brain tissue.

    They found no signs of suffering, tissue damage, or infection following catheter implantation.   

    Lead author Dr Riccardo Secoli, also from Imperial’s Department of Mechanical Engineering, said: “Our analysis showed that we implanted these new catheters safely, without damage, infection, or suffering. If we achieve equally promising results in humans, we hope we may be able to see this platform in the clinic within four years.   

    “Our findings could have major implications for minimally invasive, robotically delivered brain surgery. We hope it will help to improve the safety and effectiveness of current neurosurgical procedures where precise deployment of treatment and diagnostic systems is required, for instance in the context of localised gene therapy.”  

    Professor Lorenzo Bello, study co-author from the University of Milan, said: “One of the key limitations of current MIS is that if you want to get to a deep-seated site through a burr hole in the skull, you are constrained to a straight-line trajectory. The limitation of the rigid catheter is its accuracy within the shifting tissues of the brain, and the tissue deformation it can cause. We have now found that our steerable catheter can overcome most of these limitations.” 

    This study was funded by the EU Horizon 2020 programme.  

    The paper, “Modular robotic platform for precision neurosurgery with a bio-inspired needle: system overview and first in-vivo deployment”, by Riccardo Secoli, Eloise Matheson, Marlene Pinzi, Stefano Galvan, Abdulhamit Donder, Thomas Watts, Marco Riva, Davide Zani, Lorenzo Bello, and Ferdinando Rodriguez y Baena, was published on 19 October 2022 in PLOS ONE.

    Imperial College London