ReportWire

Tag: University of Cambridge

  • Should You Blame Cannabis For Feeling Blah



Some people get a case of the blues or the blahs. But blaming anhedonia on cannabis isn’t supported by science.

The blues, the blahs, depression, or just feeling sad happens to most people. Relationship issues, bad news, stress and a lack of social outlets emerge over and over in data as major causes of feeling blah. Almost everyone has felt “down in the dumps” at times or had a case of “the blues,” and in this state you may have described yourself as feeling depressed. More than 5% of people, however, suffer from some form of clinical depression. So should you blame cannabis for feeling blah? Marijuana has components which, used in moderation, make you feel happy, but it has gotten a bad reputation.

There is a myth that cannabis use causes anhedonia, the inability to experience joy or pleasure. People with anhedonia may feel numb or less interested in things they once enjoyed; it is a common symptom of many mental health conditions, including depression. Marijuana use has been accused of triggering this ailment, but the facts are not there to back up the claim.

    RELATED: 5 Morning Activities To Help You Feel Happier

No one is advocating that those under 21 use marijuana or alcohol, since the brain is still developing. But marijuana has not been proven to cause depression, laziness, or an inability to feel pleasure. Research has shown that the THC in cannabis increases levels of dopamine, the pleasure chemical, in the brain. Used in moderation, it can have a positive effect, and in the right dose it can also relieve anxiety, which tends to help people approach life more positively.


The University of Cambridge has published a paper showing adolescent cannabis users are no more likely to “lack motivation and the ability to enjoy life’s pleasures.” This suggests that the stereotypical cannabis user often portrayed by the media is not grounded in science. The study was carried out by researchers from UCL, the King’s College London Institute of Psychiatry, Psychology & Neuroscience, and the University of Cambridge, and the results were published in the International Journal of Neuropsychopharmacology. In fact, regular cannabis users had slightly lower scores for anhedonia.

    RELATED: Science Says Medical Marijuana Improves Quality Of Life

Another study, published in the International Journal of Neuropsychopharmacology, found no relationship between anhedonia and regular cannabis use. The researchers used data from an earlier study of cannabis use in teens, the “CannTeen” study.

    Researchers examined 274 participants including adults (26-29 years) and adolescents (16-17 years). The participants were regular cannabis users who had used cannabis in the last three months, with an average use of four times per week. The Snaith Hamilton Pleasure Scale was used to measure anhedonia while the Apathy Evaluation Scale was used to measure apathy.

    The results showed that the control group (those who didn’t use cannabis or didn’t use it regularly) had higher levels of anhedonia. This was quite surprising and contrary to the widely held belief that regular cannabis use diminishes one’s enthusiasm for life.


    Amy Hansen


  • One of Tuberculosis’s Biggest, Scariest Numbers Is Probably Wrong



    Growing up in India, which for decades has clocked millions of tuberculosis cases each year, Lalita Ramakrishnan was intimately familiar with how devastating the disease can be. The world’s greatest infectious killer, rivaled only by SARS-CoV-2, Mycobacterium tuberculosis spreads through the air and infiltrates the airways, in many cases destroying the lungs. It can trigger inflammation in other tissues too, wearing away bones and joints; Ramakrishnan watched her own mother’s body erode in this way. The sole available vaccine was lackluster; the microbe had rapidly evolved resistance to the drugs used to fight it. And the disease had a particularly insidious trait: After entering the body, the bacterium could stow away for years or decades, before erupting without warning into full-blown disease.

This state, referred to as latency, supposedly afflicted roughly 2 billion people—a quarter of the world’s population. Ramakrishnan, now a TB researcher at the University of Cambridge, heard that fact over and over, and passed it down to her own students, as every expert did; it was the dogma at the time. That pool of 2 billion people was understood to account for a large majority of infections worldwide, and it represented one of the most intimidating obstacles to eradicating the disease. To end TB for good, the thinking went, the world would need to catch and cure every latent case.

    In the years since, Ramakrishnan’s stance on latent TB has shifted quite a bit. Its extent, she argues, has been exaggerated for a good three decades, by at least an order of magnitude—to the point where it has scrambled priorities, led scientists on wild-goose chases, and unnecessarily saddled people with months of burdensome treatment. In her view, the term latency is so useless, so riddled with misinformation, that it should disappear. “I taught that nonsense forever,” she told me; now she’s spreading the word that TB’s largest, flashiest number may instead be its greatest, most persistent myth.

    Ramakrishnan isn’t the only one who thinks so. Together with her colleagues Marcel Behr, of Quebec’s McGill University, and Paul Edelstein, of the University of Pennsylvania (“we call ourselves the three BERs,” Ramakrishnan told me), she’s been on a years-long crusade to set the record straight. Their push has attracted its fair share of followers—and objectors. “I don’t think they’re wrong,” Carl Nathan, a TB researcher at Cornell, told me. “But I’m not confident they’re right.”

    Several researchers told me they’re largely fine with the basic premise of the BERs’ argument: Fewer than 2 billion isn’t that hard to get behind. But how many fewer matters. If current latency estimates overshoot by just a smidge, maybe no practical changes are necessary. The greater the overestimate, though, the more treatment recommendations might need to change; the more research and funding priorities might need to shift; the more plans to control, eliminate, and eventually eradicate disease might need to be wholly and permanently rethought.

    The muddled numbers on latency seem to be based largely on flawed assumptions about certain TB tests. One of the primary ways to screen people for the disease involves pricking harmless derivatives of the bacterium into skin, then waiting for an inflamed lump to appear—a sign that the immune system is familiar with the microbe (or a TB vaccine), but not direct proof that the bacterium itself is present. That means that positive results can guarantee only that the immune system encountered something resembling MTB at some point—perhaps even in the distant past, Rein Houben, an epidemiologist at the London School of Hygiene & Tropical Medicine, told me.

    But for a long time, a prevailing assumption among researchers was that all TB infections had the potential to be lifelong, Behr told me. The thought wasn’t entirely far-fetched: Other microbial infections can last a lifetime, and there are historical accounts of lasting MTB infections, including a case in which a man developed tuberculosis more than 30 years after his father passed the bacterium to him. Following that logic—that anyone once infected had a good enough chance of being infected now—researchers added everyone still reacting to the bug to the pool of people actively battling it. By the end of the 1990s, Behr and Houben told me, prominent epidemiologists had used this premise to produce the big 2 billion number, estimating that roughly a third of the population had MTB lurking within.

    That eye-catching figure, once rooted, rapidly spread. It was repeated in textbooks, academic papers and lectures, news articles, press releases, government websites, even official treatment guidelines. The World Health Organization parroted it too, repeatedly calling for research into vaccines and treatments that could shrink the world’s massive latent-TB cohort. “We were all taught this dogma when we were young researchers,” Soumya Swaminathan, the WHO’s former chief scientist, told me. “Each generation passed it on to the next.”

But, the BERs argue, the idea of TB as a lifelong sentence makes very little sense. Decades of epidemiological data show that the overwhelming majority of disease arises within the first two years after infection, most commonly within months. Beyond that, progression to symptomatic, contagious illness becomes vanishingly rare.

    The trio is convinced that a huge majority of people are clearing the bug from their body rather than letting it lie indefinitely in wait—a notion that recent modeling studies support. If the bacteria were lingering, researchers would expect to see a big spike in disease late in life among people with positive skin tests, as their immune system naturally weakens. They would also expect to see a high rate of progression to full-blown TB among people who start taking immunosuppressive drugs or catch HIV. And yet, neither of those trends pans out: At most, some 5 to 10 percent of people who have tested positive by skin test and later sustain a blow to their immune system develop TB disease within about three to five years—a hint that, for almost everyone else, there may not be any MTB left. “If there were a slam-dunk experiment, that’s it,” William Bishai, a TB researcher at Johns Hopkins, told me.

    Nathan, of Cornell, was less sold. Immunosuppressive drugs and HIV flip very specific switches in the immune system; if MTB is being held in check by multiple branches, losing some immune defenses may not be enough to set the bacteria loose. But most of the experts I spoke with are convinced that lasting cases are quite uncommon. “Some people will get into trouble in old age,” Bouke de Jong, a TB researcher at the Institute of Tropical Medicine, in Antwerp, told me. “But is that how MTB hangs out in everybody? I don’t think so.”

    If anything, people with positive skin tests might be less likely to eventually develop disease, Ramakrishnan told me, whether because they harbor defenses against MTB or because they are genetically predisposed to clear the microbe from their airway. In either case, that could radically change the upshot of a positive test, especially in countries such as the U.S. and Canada, where MTB transmission rarely occurs and most TB cases can be traced from abroad. Traditionally, people in these places with positive skin tests and no overt symptoms have been told, “‘This means you’ve got sleeping bacteria in you,’” Behr said. “‘Any day now, it may pop out and cause harm.’” Instead, he told me, health-care workers should be communicating widely that there could be up to a 95 percent chance that these patients have already cleared the infection, especially if they’re far out from their last exposure and might not need a drug regimen. TB drugs, although safe, are not completely benign: Standard regimens last for months, interact with other meds, and can have serious side effects.

    At the same time, researchers disagree on just how much risk remains once people are a couple of years past an MTB exposure. “We’ve known for decades that we are overtreating people,” says Madhu Pai, a TB researcher at McGill who works with Behr but was not directly involved in his research. But treating a lot of people with positive skin tests has been the only way to ensure that the people who are carrying viable bacteria get the drugs they need, Robert Horsburgh, an epidemiologist at Boston University, told me. That strategy squares, too, with the goal of elimination in places where spread is rare. To purge as much of the bug as possible, “clinicians will err on the side of caution,” says JoAnne Flynn, a TB researcher at the University of Pittsburgh.

    Elsewhere in the world, where MTB transmission is rampant and repeat infections are common, “to be honest, nobody cares if there’s latent TB,” Flynn told me. Many people with very symptomatic, very contagious cases still aren’t getting diagnosed or treated; in too many places, the availability of drugs and vaccines is spotty at best. Elimination remains a long-term goal, but active outbreaks demand attention first. Arguably, quibbling about latency now is like trying to snuff stray sparks next to an untended conflagration.

    One of the BERs’ main goals could help address TB’s larger issues. Despite decades of research, the best detection tools for the disease remain “fundamentally flawed,” says Keertan Dheda, a TB researcher at the London School of Hygiene & Tropical Medicine and the University of Cape Town. A test that could directly detect viable microbes in tissues, rather than an immune proxy, could definitively diagnose ongoing infections and prioritize people across the disease spectrum for treatment. Such a diagnostic would also be the only way to finally end the fuss over latent TB’s prevalence. Without it, researchers are still sifting through only indirect evidence to get at the global TB burden—which is probably still “in the hundreds of millions” of cases, Houben told me, though the numbers will remain squishy until the data improve.

    That 2 billion number is still around—though not everywhere, thanks in part to the BERs’ efforts. The WHO’s most recent annual TB reports now note that a quarter of the world’s population has been infected with MTB, rather than is infected with MTB; the organization has also officially discarded the term latent from its guidance on the disease, Dennis Falzon, of the WHO Global TB Programme, told me in an email. However subtle, these shifts signal that even the world’s biggest authorities on TB are dispensing with what was once conventional wisdom.

    Losing that big number does technically shrink TB’s reach—which might seem to minimize the disease’s impact. Behr argues the opposite. With a huge denominator, TB’s mortality rate ends up minuscule—suggesting that most infections are benign. Deflating the 2 billion statistic, then, reinforces that “this is one of the world’s nastiest pathogens, not some symbiont that we live with in peace,” Behr told me. Fewer people may be at risk than was once thought. But for those who are harboring the microbe, the dangers are that much more real.


    Katherine J. Wu


  • AI system self-organises to develop features of brains of complex organisms



Newswise — Cambridge scientists have shown that placing physical constraints on an artificially intelligent system – in much the same way that the human brain has to develop and operate within physical and biological constraints – allows it to develop features of the brains of complex organisms in order to solve tasks.

    As neural systems such as the brain organise themselves and make connections, they have to balance competing demands. For example, energy and resources are needed to grow and sustain the network in physical space, while at the same time optimising the network for information processing. This trade-off shapes all brains within and across species, which may help explain why many brains converge on similar organisational solutions.

    Jascha Achterberg, a Gates Scholar from the Medical Research Council Cognition and Brain Sciences Unit (MRC CBSU) at the University of Cambridge said: “Not only is the brain great at solving complex problems, it does so while using very little energy. In our new work we show that considering the brain’s problem solving abilities alongside its goal of spending as few resources as possible can help us understand why brains look like they do.”

    Co-lead author Dr Danyal Akarca, also from the MRC CBSU, added: “This stems from a broad principle, which is that biological systems commonly evolve to make the most of what energetic resources they have available to them. The solutions they come to are often very elegant and reflect the trade-offs between various forces imposed on them.”

    In a study published today in Nature Machine Intelligence, Achterberg, Akarca and colleagues created an artificial system intended to model a very simplified version of the brain and applied physical constraints. They found that their system went on to develop certain key characteristics and tactics similar to those found in human brains.

    Instead of real neurons, the system used computational nodes. Neurons and nodes are similar in function, in that each takes an input, transforms it, and produces an output, and a single node or neuron might connect to multiple others, all inputting information to be computed.

    In their system, however, the researchers applied a ‘physical’ constraint on the system. Each node was given a specific location in a virtual space, and the further away two nodes were, the more difficult it was for them to communicate. This is similar to how neurons in the human brain are organised.
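This kind of spatial constraint is often summarised as a “wiring cost”: the strength of each connection multiplied by the distance it spans. The sketch below is purely illustrative (the function and variable names are assumptions, not the study’s actual code), but it shows why, for the same total connection strength, local wiring is cheaper than long-range wiring:

```python
import math

# Illustrative sketch (not the authors' code): each node gets a
# coordinate in a virtual space, and a connection's cost is its
# strength times the Euclidean distance it spans.

def wiring_cost(weights, positions):
    """Sum of |weight| * distance over every directed connection."""
    n = len(positions)
    return sum(
        abs(weights[i][j]) * math.dist(positions[i], positions[j])
        for i in range(n) for j in range(n) if i != j
    )

# Four nodes at the corners of a unit square.
positions = [(0, 0), (0, 1), (1, 0), (1, 1)]

# Two networks with the same total connection strength (four unit
# connections each): one wires nearby pairs, the other diagonal pairs.
local = [
    [0, 1, 0, 0],
    [1, 0, 0, 0],
    [0, 0, 0, 1],
    [0, 0, 1, 0],
]
distant = [
    [0, 0, 0, 1],
    [0, 0, 1, 0],
    [0, 1, 0, 0],
    [1, 0, 0, 0],
]

print(wiring_cost(local, positions))    # 4 edges of length 1 -> 4.0
print(wiring_cost(distant, positions))  # 4 edges of length sqrt(2), about 5.66
```

Under a constraint like this, a network that solves the task with mostly short-range connections is strictly cheaper than one relying on long-range ones, which is the pressure the researchers applied during learning.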

The researchers gave the system a simple task to complete: a simplified version of a maze navigation task of the kind typically given to animals such as rats and macaques in brain studies, in which the system has to combine multiple pieces of information to decide on the shortest route to the end point.

One of the reasons the team chose this particular task is that, to complete it, the system needs to maintain a number of elements: the start location, the end location and the intermediate steps. Once the system has learned to do the task reliably, it is possible to observe which nodes are important at different moments in a trial. For example, one particular cluster of nodes may encode the finish locations while others encode the available routes, and it is possible to track which nodes are active at different stages of the task.

    Initially, the system does not know how to complete the task and makes mistakes. But when it is given feedback it gradually learns to get better at the task. It learns by changing the strength of the connections between its nodes, similar to how the strength of connections between brain cells changes as we learn. The system then repeats the task over and over again, until eventually it learns to perform it correctly.

    With their system, however, the physical constraint meant that the further away two nodes were, the more difficult it was to build a connection between the two nodes in response to the feedback. In the human brain, connections that span a large physical distance are expensive to form and maintain.

    When the system was asked to perform the task under these constraints, it used some of the same tricks used by real human brains to solve the task. For example, to get around the constraints, the artificial systems started to develop hubs – highly connected nodes that act as conduits for passing information across the network.
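Hubs of this kind can be spotted from the connection matrix alone, by counting how many strong connections each node carries. A minimal, hypothetical sketch (the names and threshold are illustrative assumptions, not taken from the study):

```python
# Illustrative sketch: identifying a "hub" node by degree, i.e. the
# number of above-threshold connections it carries. A hub's degree
# sits well above the network average.

def degrees(weights, threshold=0.5):
    """For each node, count its connections stronger than threshold."""
    n = len(weights)
    return [
        sum(1 for j in range(n) if i != j and abs(weights[i][j]) > threshold)
        for i in range(n)
    ]

# A toy 5-node network in which node 0 connects to every other node,
# while the remaining nodes connect only through node 0.
w = [
    [0, 1, 1, 1, 1],
    [1, 0, 0, 0, 0],
    [1, 0, 0, 0, 0],
    [1, 0, 0, 0, 0],
    [1, 0, 0, 0, 0],
]

deg = degrees(w)
hub = max(range(len(deg)), key=lambda i: deg[i])
print(deg)  # [4, 1, 1, 1, 1]
print(hub)  # 0
```

A hub-and-spoke layout like this is attractive under a distance penalty because most nodes need only one short connection to reach the rest of the network via the hub.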

    More surprising, however, was that the response profiles of individual nodes themselves began to change: in other words, rather than having a system where each node codes for one particular property of the maze task, like the goal location or the next choice, nodes developed a flexible coding scheme. This means that at different moments in time nodes might be firing for a mix of the properties of the maze. For instance, the same node might be able to encode multiple locations of a maze, rather than needing specialised nodes for encoding specific locations. This is another feature seen in the brains of complex organisms.

    Co-author Professor Duncan Astle, from Cambridge’s Department of Psychiatry, said: “This simple constraint – it’s harder to wire nodes that are far apart – forces artificial systems to produce some quite complicated characteristics. Interestingly, they are characteristics shared by biological systems like the human brain. I think that tells us something fundamental about why our brains are organised the way they are.”

     

    Understanding the human brain

The team are hopeful that their AI system could begin to shed light on how these constraints shape differences between people’s brains, and contribute to the differences seen in people who experience cognitive or mental health difficulties.

    Co-author Professor John Duncan from the MRC CBSU said: “These artificial brains give us a way to understand the rich and bewildering data we see when the activity of real neurons is recorded in real brains.”

    Achterberg added: “Artificial ‘brains’ allow us to ask questions that it would be impossible to look at in an actual biological system. We can train the system to perform tasks and then play around experimentally with the constraints we impose, to see if it begins to look more like the brains of particular individuals.”

     

    Implications for designing future AI systems

    The findings are likely to be of interest to the AI community, too, where they could allow for the development of more efficient systems, particularly in situations where there are likely to be physical constraints.

    Dr Akarca said: “AI researchers are constantly trying to work out how to make complex, neural systems that can encode and perform in a flexible way that is efficient. To achieve this, we think that neurobiology will give us a lot of inspiration. For example, the overall wiring cost of the system we’ve created is much lower than you would find in a typical AI system.”

Many modern AI solutions use architectures that only superficially resemble a brain. The researchers say their work shows that the type of problem the AI is solving will influence which architecture is the most powerful to use.

Achterberg said: “If you want to build an artificially intelligent system that solves similar problems to humans, then ultimately the system will end up looking much closer to an actual brain than systems running on large compute clusters that specialise in very different tasks to those carried out by humans. The architecture and structure we see in our artificial ‘brain’ is there because it is beneficial for handling the specific brain-like challenges it faces.”

    This means that robots that have to process a large amount of constantly changing information with finite energetic resources could benefit from having brain structures not dissimilar to ours.

Achterberg added: “Brains of robots that are deployed in the real physical world are probably going to look more like our brains because they might face the same challenges as us. They need to constantly process new information coming in through their sensors while controlling their bodies to move through space towards a goal. Many systems will need to run all their computations with a limited supply of electric energy, and so, to balance these energetic constraints with the amount of information they need to process, they will probably need a brain structure similar to ours.”

    The research was funded by the Medical Research Council, Gates Cambridge, the James S McDonnell Foundation, Templeton World Charity Foundation and Google DeepMind.

    Reference

    Achterberg, J & Akarca, D et al. Spatially embedded recurrent neural networks reveal widespread links between structural and functional neuroscience findings. Nature Machine Intelligence; 20 Nov 2023; DOI: 10.1038/s42256-023-00748-9


    University of Cambridge


  • Cancer drug repurposed to treat inflammatory diseases



    Newswise — A cancer drug currently in the final stages of clinical trials could offer hope for the treatment of a wide range of inflammatory diseases, including gout, heart failure, cardiomyopathy, and atrial fibrillation, say scientists at the University of Cambridge.

    In a study published today in the Journal of Clinical Investigation, the researchers have identified a molecule that plays a key role in triggering inflammation in response to materials in the body seen as potentially harmful.

    We are born with a defence system known as innate immunity, which acts as the first line of defence against harmful materials in the body. Some of these materials will come from outside, such as bacterial or viral infections, while others can be produced within the body.

    Innate immunity triggers an inflammatory response, which aims to attack and destroy the perceived threat. But sometimes, this response can become overactive and can itself cause harm to the body.

One example is gout, which occurs when urate crystals build up in joints, causing excessive inflammation and intense pain. Another is heart attack, where dead cells build up in the damaged heart – the body sees itself as being under attack, and an overly aggressive immune system fights back, causing collateral damage to the heart.

    Several of these conditions are characterised by overactivation of a component of the innate immune response known as an inflammasome – specifically, the inflammasome NLRP3. Scientists at the Victor Phillip Dahdaleh Heart and Lung Research Institute at Cambridge have found a molecule that helps NLRP3 respond.

This molecule is known as PLK1. It is involved in a number of processes within the body, including helping organise tiny components of our cells known as the microtubule cytoskeleton. These structures behave like train tracks inside the cell, allowing important materials to be transported from one part of the cell to another.

    Dr Xuan Li from the Department of Medicine at the University of Cambridge, the study’s senior author, said: “If we can get in the way of the microtubules as they try to organise themselves, then we can in effect slow down the inflammatory response, preventing it from causing collateral damage to the body. We believe this could be important in preventing a number of common diseases that can cause pain and disability and in some cases can lead to life-threatening complications.”

    But PLK1 also plays another important role in the body – and this may hold the key to developing new treatments for inflammatory diseases.

    For some time now, scientists have known that PLK1 is involved in cell division, or mitosis, a process which, when it goes awry, can lead to runaway cell division and the development of tumours. This has led pharmaceutical companies to test drugs that inhibit its activity as potential treatments for cancer. At least one of these drugs is in phase three clinical trials – the final stages of testing how effective a drug is before it can be granted approval.

    When the Cambridge scientists treated mice that had developed inflammatory diseases with a PLK1 inhibitor, they showed that it prevented the runaway inflammatory response – and at a much lower dose than would be required for cancer treatment. In other words, inhibiting the molecule ‘calmed down’ NLRP3 in non-dividing cells, preventing the overly aggressive inflammatory response seen in these conditions.

    The researchers are currently planning to test its use against inflammatory diseases in clinical trials.

    “These drugs have already been through safety trials for cancer – and at higher doses than we think we would need – so we’re optimistic that we can minimise delays in meeting clinical and regulatory milestones,” added Dr Li.

    “If we find that the drug is effective for these conditions, we could potentially see new treatments for gout and inflammatory heart diseases – as well as a number of other inflammatory conditions – in the not-too-distant future.”

    The research was funded by the British Heart Foundation. Professor James Leiper, Associate Medical Director at the British Heart Foundation said: “This innovative research has uncovered a potential new treatment approach for inflammatory heart diseases such as heart failure and cardiomyopathy. It’s promising that drugs targeting PLK1 – that work by dampening down the inflammatory response – have already been proven safe and effective in cancer trials, potentially helping accelerate the drug discovery process.

“We hope that this research will open the door for new ways to treat people with heart diseases caused by overactive and aggressive immune responses, and look forward to more research to uncover how this drug could be repurposed.”


    University of Cambridge


  • Bumblebees prioritize maximizing calorie intake in minimal time.



    Newswise — Research has found that bumblebees make foraging choices to collect the most sugar from flowers in the shortest time – even if that means using more energy in the process – to provide an immediate energy boost for the colony.

A new study investigating nectar drinking in one of the most common bumblebees in the UK, Bombus terrestris, has found that foraging bees maximise the amount of nectar sugar they take back to the colony each minute.

    To make their choices, the bumblebees trade off the time they spend collecting nectar with the energy content of that nectar. This means they will forage to collect nectar that’s hard to access – but only if the sugar content of that nectar makes it worth doing so.

    This big-and-fast approach contrasts with honeybee foraging: honeybees make their decisions by optimising their individual energy expenditure for any nectar they collect. This more measured approach should prolong the honeybee’s working life.

    “As they forage, bumblebees are making decisions about which nectar sources will give the greatest immediate energetic return, rather than optimising the energy efficiency of their foraging,” said Dr Jonathan Pattrick, joint first author of the report, who started the research while in the University of Cambridge’s Department of Plant Sciences.

    Pattrick, now based at the University of Oxford, added: “Our results allow us to make predictions about the sorts of flowers the bumblebees are likely to visit, which could inform the choice of which flowers to plant in field margins to support these important pollinators. It’s also relevant to crop breeders who want to make varieties that are ‘better’ for bumblebees.”

    The results are published today in the journal iScience.

    Over six months the researchers made 60,000 behavioural observations of the bumblebees, allowing them to precisely estimate bumblebee foraging energetics. It was painstaking work: each bumblebee in the study was watched for up to eight hours a day without a break.

    The team used vertically and horizontally oriented artificial flowers, with surfaces that were slippery and difficult for the bumblebees to grip.

    A custom computer program allowed the team to measure the split-second timing as the bumblebees flew between the artificial flowers and foraged from them. This meant the team could track how much energy the bumblebees spent flying as well as how much they collected when drinking, and identify how the bumblebees decided whether to spend extra time and energy collecting high-sugar nectar from slippery flowers, or take the easier option of collecting lower-sugar nectar from flowers they could land on.

    “It’s amazing that even with a brain smaller than a sesame seed, bumblebees can make such complex decisions,” said Dr Hamish Symington in the University of Cambridge’s Department of Plant Sciences and joint first author of the report.

    He added: “It’s clear that bumblebee foraging isn’t based on a simple idea that ‘the more sugar there is in nectar, the better’ – it’s much more subtle than that. And it highlights that there’s still so much to learn about insect behaviour.”

    Individual bumblebees were given one of three tests. In the first test, the nectar on both vertical and horizontal artificial flowers had the same amount of sugar, and the bumblebees made the obvious choice to forage from the horizontal flowers, rather than spend extra time and energy hovering at the vertical ones. In the second test, the nectar on the vertical flowers was much more sugary than the nectar on the horizontal flowers, and the bumblebees chose to drink almost exclusively from the vertical flowers.

    In the third test, the vertical flowers offered nectar which was only slightly more sugary than the horizontal flowers. This created a situation in which the bumblebees had to make a trade-off between the time and energy they spent foraging and the energy in the nectar they were drinking – and they switched to feeding from the horizontal flowers.

    The results show that bumblebees can choose to spend additional time and energy foraging from hard-to-access nectar sources – but only if the reward is worth it.
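    The decision rule the bees appear to follow can be illustrated with a toy net-rate calculation. All numbers below are invented for illustration; they are not the study’s measured energetics.

```python
# Toy model: a forager choosing between two nectar sources by immediate
# net energetic return rate. All values are invented for illustration,
# not measurements from the study.

def net_rate(sugar_joules, handling_s, flight_s, power_w):
    """Net energy gained per second over one flower visit."""
    total_time = handling_s + flight_s
    cost = power_w * total_time          # energy spent flying/hovering
    return (sugar_joules - cost) / total_time

# Easy horizontal flower: lower-sugar nectar, cheap to land on.
easy = net_rate(sugar_joules=2.0, handling_s=3.0, flight_s=2.0, power_w=0.1)

# Slippery vertical flower: slightly more sugar, but the bee must hover
# (longer handling time, higher power output).
hard = net_rate(sugar_joules=2.4, handling_s=5.0, flight_s=2.0, power_w=0.25)

# A slightly sweeter reward does not justify the extra cost of hovering.
best = "vertical" if hard > easy else "horizontal"
```

    With these numbers the easy flower wins, mirroring the third test; raising the vertical flower’s sugar content enough flips the choice, as in the second test.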

    Bumblebees drink nectar from flowers, then offload it in their nest – by regurgitation – for use by other bumblebees in the nest. Unlike honeybees, bumblebees only store a small amount of nectar in the nest, so they need to make the most of every opportunity to forage.

    University of Cambridge

  • Prior infections and vaccinations affect COVID-19 mutation vulnerability, per study.

    Prior infections and vaccinations affect COVID-19 mutation vulnerability, per study.


    Newswise — A person’s immune response to variants of SARS-CoV-2, the virus that causes COVID-19, depends on their previous exposure – and differences in the focus of immune responses will help scientists understand how to optimise vaccines in the future to provide broad protection.

    A new study has found that people differ in how vulnerable they are to different mutations in emerging variants of SARS-CoV-2.

    This is because the variant of SARS-CoV-2 a person was first exposed to determines how well their immune system responds to different parts of the virus, and how protected they are against other variants.

    It also means that the same COVID-19 vaccine might work differently for different people, depending on which variants of SARS-CoV-2 they have previously been exposed to and where their immune response has focused.

    The discovery underlines the importance of continuing surveillance programmes to detect the emergence of new variants and to understand differences in immunity to SARS-CoV-2 across the population.

    It will also be important for future vaccination strategies, which must consider both the virus variant a vaccine contains and how immune responses to it may differ across the population.

    “It was a surprise how much of a difference we saw in the focus of immune responses of different people to SARS-CoV-2. Their immune responses appear to target different specific regions of the virus, depending on which variant their body had encountered first,” said Dr Samuel Wilks at the University of Cambridge’s Centre for Pathogen Evolution in the Department of Zoology, first author of the report.

    He added: “Our results mean that if the virus mutates in a specific region, some people’s immune system will not recognize the virus as well – so it could make them ill, while others may still have good protection against it.”

    The research, published today in the journal Science, involved a large-scale collaboration across ten research institutes including the University of Cambridge and produced a comprehensive snapshot of early global population immunity to COVID-19.

    Researchers collected 207 serum samples – extracted from blood samples – from people who had either been infected naturally with one of the many previously circulating SARS-CoV-2 variants, or who had been vaccinated against SARS-CoV-2 with different numbers of doses of the Moderna vaccine.

    They then analysed the immunity these people had developed, and found significant differences between immune responses depending on which variant a person had been infected with first.

    “These results give us a deep understanding of how we might optimise the design of COVID-19 booster vaccines in the future,” said Professor Derek Smith, Director of the University of Cambridge’s Centre for Pathogen Evolution in the Department of Zoology, senior author of the report.

    He added: “We want to know the key virus variants to use in vaccines to best protect people in the future.”

    The research used a technique called ‘antigenic cartography’ to compare the similarity of different variants of the SARS-CoV-2 virus. This measures how well human antibodies, formed in response to infection with one virus, respond to infection with a variant of that virus. It shows whether the virus has changed enough to escape the human immune response and cause disease.

    The resulting ‘antigenic map’ shows the relationship between a wide selection of SARS-CoV-2 variants that have previously circulated. Omicron variants are noticeably different from the others – which helps to explain why many people still succumbed to infection with Omicron despite vaccination or previous infection with a different variant.
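    The distance measure underlying such a map can be sketched with a minimal example. The titre values below are invented for illustration; the real analysis positions many sera and variants simultaneously.

```python
import math

# Antigenic distance is commonly expressed in two-fold dilutions of
# antibody titre: each halving of the titre against a variant, relative
# to the homologous titre, counts as one antigenic unit. The titres
# below are invented for illustration.

titres = {          # serum raised against B.1, titrated against variants
    "B.1":   1280,  # homologous titre
    "Delta":  320,
    "BA.1":    40,  # an Omicron variant
}

homologous = titres["B.1"]
antigenic_distance = {
    variant: math.log2(homologous / titre)
    for variant, titre in titres.items()
}
# BA.1 sits five two-fold drops from B.1 - far away on the map, which
# illustrates why Omicron escaped much pre-existing immunity.
```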

    Immunity to COVID-19 can be acquired by having been infected with SARS-CoV-2 or by vaccination. Vaccines provide immunity without the risk from the disease or its complications. They work by activating the immune system so it will recognise and respond rapidly to exposure to SARS-CoV-2 and prevent it causing illness. But, like other viruses, the SARS-CoV-2 virus keeps mutating to try to escape human immunity.

    During the first year of the pandemic, the main SARS-CoV-2 virus in circulation was the B.1 variant. Since then, multiple variants have emerged that escape pre-existing immunity, causing reinfections in people who had already had COVID-19.

    “The study was an opportunity to really see – from the first exposure to SARS-CoV-2 onwards – what the basis of people’s immunity is, and how this differs across the population,” said Wilks.

    University of Cambridge

  • Experts call for urgent mental health support for people living with long term autoimmune diseases

    Experts call for urgent mental health support for people living with long term autoimmune diseases


    Newswise — More than half of patients with autoimmune conditions experience mental health problems such as depression or anxiety, yet the majority are rarely or never asked about mental health symptoms in clinic, according to new research from the University of Cambridge and King’s College London.

    In a study published today in Rheumatology, researchers found that over half of the patients had rarely or never reported their mental health symptoms to a clinician, and that the range of possible mental health and neurological symptoms is much wider than has been previously reported.

    The team surveyed neurological and psychiatric symptoms among 1,853 patients with systemic autoimmune rheumatic diseases (SARDs) such as lupus and rheumatoid arthritis. The researchers also surveyed 289 clinicians, mostly rheumatologists, psychiatrists and neurologists, and conducted 113 interviews with patients and clinicians.

    The 30 symptoms that the team asked about included fatigue, hallucinations, anxiety and depression. Among the patients in the study, experience of most of these symptoms was very widespread. 

    For example, 55% of SARD patients were experiencing depression and 57% anxiety, while 89% had experienced severe fatigue and 70% cognitive dysfunction. The overall prevalence of symptoms was significantly higher than previously thought, and much higher than in a control group of healthy volunteers.

    The mental health symptoms described by patients contrasted strongly with clinician estimates. For example, three times as many lupus patients reported experiencing suicidal thoughts compared to the estimate by clinicians (47% versus 15%). Clinicians were often surprised and concerned by the frequency and wide range of symptoms that patients reported to the researchers. 

    Some clinicians were much more focused on joint symptoms over mental health symptoms as they held the opinion that SARDs do not commonly affect the brain.

    However, other clinicians felt that these symptoms were under-estimated because patients were rarely asked about them in clinic. One rheumatology nurse interviewed said: “Doctors don’t go looking for it [hallucinations], so if we don’t ask we don’t think it exists much.”

    The study found disagreements between clinicians specialising in different aspects of care, but very few hospitals had effective systems where rheumatologists, neurologists and psychiatrists worked together.

    Dr Tom Pollak from the Institute of Psychiatry, Psychology & Neuroscience at King’s College London, said the study highlights the importance of all clinicians asking their patients about mental health: “We have known for some time that having a systemic autoimmune disease can negatively affect one’s mental health, but this study paints a startling picture of the breadth and impact of these symptoms. Everyone working in healthcare with these patients should routinely ask about mental wellbeing, and patients should be supported to speak up without fear of judgement. No patient should suffer in silence.” 

    The study showed that patients were often reluctant to report mental health problems to clinicians, sometimes fearing they would be stigmatised. Patients frequently said that even when they did share their mental health symptoms, these were often not commented on, or not documented accurately or at all.

    One patient expressed how this felt: “Feel guilty and useless as well as depressed and very unwell. I don’t really feel supported, understood, listened to, hopeful at all. It is awful living like this…. All just feels hopeless.” 

    Dr Melanie Sloan from the Department of Public Health and Primary Care at the University of Cambridge said: “The low level of reporting we identified is a major concern as problems with mental health, fatigue and cognition can be life-changing, and sometimes life-threatening. It’s only by fully engaging patients in their healthcare and by asking them for their views that we will be able to determine the extent of these often hidden symptoms, and help patients get the understanding, support and treatment they need.”

    Although the research team found neurological and psychiatric symptoms to be under-elicited in clinic, under-identified in research and under-represented in clinical guidelines, they described almost all clinicians as highly motivated to improve care. Rapidly evolving knowledge – including the behavioural and cognitive impacts of chronic inflammation and a widening range of potential biomarkers – means that there are grounds for optimism.

    Sarah Campbell, Chief Executive of the British Society for Rheumatology, commented: “This study highlights the urgent need for improvements in the access patients have to integrated mental health support. Given what the study finds on the prevalence of this issue and the deep impact neurological and psychiatric symptoms have on patients, it should be of grave concern to policymakers that only 8% of rheumatology departments in England and Wales have a psychologist embedded in their team. We fully support the study team’s conclusion that more inter-disciplinary and patient-clinician collaboration is needed to ensure equity in the care of patients’ mental and physical health.”

    The Rt Hon the Lord Blunkett said: “It’s both surprising and deeply concerning that almost half of lupus patients have experienced suicidal thoughts, and that clinicians greatly under-estimate the mental health burden of these chronic diseases. This highlights the importance of extra funding for the NHS and the holistic care that is urgently needed for these patients. I echo the British Society of Rheumatologists’ concerns about the poor current provision of mental health support. Now is the time for the Government to act to give them the support they desperately need.”

    The research was funded by The Lupus Trust and LUPUS UK.

    Reference

    Sloan, M et al. Prevalence and identification of neuropsychiatric symptoms in systemic autoimmune rheumatic diseases: an international mixed methods study. Rheumatology; 26 Jul 2023; DOI: 10.1093/rhe/kead369


    Once the embargo has lifted, the study will be live at: https://academic.oup.com/rheumatology/article-lookup/doi/10.1093/rhe/kead369

    University of Cambridge

  • Troubling Trend: Greening of Peru’s Pacific Slope Raises Concerns

    Troubling Trend: Greening of Peru’s Pacific Slope Raises Concerns


    Newswise — Analysing satellite data spanning the past 20 years, the research team based at the Cavendish Laboratory in Cambridge examined how vegetation has been changing along the Pacific coast of Peru and northern Chile. This area is known for its unique and delicate arid and semi-arid environments.

    The analysis revealed that certain areas experienced positive vegetation growth, known as greening, while others displayed negative trends, referred to as browning. Unsurprisingly, the changes in vegetation are influenced by factors such as farming, urban development and changes in land-use practices.

    More interestingly, this study, published in MDPI Remote Sensing, revealed that a huge section of the western slope of the Andes has undergone significant greening over the past 20 years. This section, which extends about 2,000 km from northern Peru to northern Chile, has seen its vegetation grow significantly over time. This greening trend varies with altitude, with different vegetation types at different elevations.

    The research team, consisting of mathematicians, geographers, biologists, and earth scientists, used satellite images from 2000 to 2020 to observe changes in vegetation over time in this area. They plotted 450 data points and developed a mathematical model to remove artificial variations (such as cloudy days) and seasonality, and used statistical analysis to ensure that they were only analysing areas with a significant trend.
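    The core of such a trend analysis – removing the seasonal cycle, then testing for a residual trend – can be sketched as follows. The data are synthetic; the study’s actual model and statistical tests are more elaborate.

```python
import math

# Synthetic 10-year monthly vegetation-index series: a slow upward
# trend plus a seasonal cycle (values invented for illustration).
months = list(range(120))
ndvi = [0.30 + 0.001 * m + 0.05 * math.sin(2 * math.pi * m / 12)
        for m in months]

# Deseasonalise: subtract each calendar month's mean across the years.
monthly_mean = [sum(ndvi[m] for m in months if m % 12 == k) / 10
                for k in range(12)]
anomaly = [v - monthly_mean[m % 12] for m, v in zip(months, ndvi)]

# Least-squares slope of the anomalies: positive = greening,
# negative = browning.
n = len(months)
mx = sum(months) / n
my = sum(anomaly) / n
slope = (sum((m - mx) * (a - my) for m, a in zip(months, anomaly))
         / sum((m - mx) ** 2 for m in months))
# slope recovers roughly the 0.001-per-month trend hidden under seasonality
```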

    “It took three years to sort the methodology and the statistical model,” said Hugo Lepage, mathematician at the Cavendish laboratory and first author of the study. “We really needed to bulletproof it to make sure that something was really happening on a massive scale, and it was not just a fluke.”

    To verify what they were seeing in the data, the researchers conducted numerous field trips, making observations on the ground to corroborate their numerical findings.

    “We started with a very local area to study the impact of mining on local vegetation,” explained Eustace Barnes, a geographer in the Cavendish Laboratory’s Environmental Physics Group, which ran the research. “To our surprise, the data suggested that the area was greening instead of browning. So, we zoomed out and realised other areas were also greening on a large scale. When we went to check on the ground, we observed a similar trend.”

    Beyond the empirical observation of the greening strip itself, the researchers were struck by its surprising features.

    “First, the strip ascends as we look southward, going from 170-780 m in northern Peru to 2600-4300 m in the south of Peru”, explained Barnes. “This is counterintuitive, as we would expect the surface temperatures to drop both when moving south and ascending in altitude.”

    Even more surprisingly, this huge greening strip does not align with the climate zones established by the Köppen-Geiger classification – the widely used, vegetation-based, empirical climate classification system – whereas the greening and browning trends in the coastal deserts and high Andes do match well.

    “Indeed, in northern Peru, the greening strip mostly lies in the climate zone corresponding to the hot arid desert,” said Lepage. “As we scan the strip going south, it ascends to lie mostly in the hot arid steppe and finally traverses to lie in the cold arid steppe. This did not match what we expected based on the climate in those regions.”

    The results of this study have far-reaching implications for environmental management and policymaking in the region. Although the exact causes and consequences of this greening are not yet known, any large change (a 30–60% increase in vegetation index) will necessarily have an impact on ecosystems and the environment.

    “The Pacific slope provides water for two-thirds of the country, and this is where most of the food for Peru is coming from too,” said Barnes. “This rapid change in vegetation, and to water levels and ecosystems, will inevitably have an impact on water and agricultural planning management.”

    The researchers believe their findings will contribute significantly to the scientific community’s understanding of the complex interactions between climate change and delicate ecosystems in arid and semi-arid environments.

    “This is a warning sign, like the canary in the mine. There is nothing we can do to stop changes at such a large scale. But knowing about it will help to plan better for the future,” concluded Lepage.

    This research was carried out by the Environmental Physics Group led by Prof. Crispin Barnes and funded by Universidad Nacional de Cañete (UNDC), dpto Lima, Peru.

    University of Cambridge

  • Climate change threatens small, light-colored butterflies most

    Climate change threatens small, light-colored butterflies most


    Newswise — Butterflies with smaller or lighter coloured wings are likely to be ‘losers’ when it comes to climate change. The Lycaenidae family – which contains over 6,000 species of butterflies, the majority of which live in the tropics – was found to be particularly vulnerable.

    Butterflies with larger or darker coloured wings are likely to fare better under increasing temperatures, but only to a point. Researchers say these butterflies could still experience dramatic declines if there were sudden heatwaves or if cool microclimates were lost through deforestation.

    The results are published today in the Journal of Animal Ecology.

    Butterflies rely on the sun’s warmth to give them the energy they need to function. They use ‘thermoregulation’ strategies to maintain a balanced body temperature against changing air temperatures.

    Generally, strategies to keep cool involve adaptive behaviours like flying to a shady spot or angling wings away from the sun (thermal buffering). But when this is not possible or temperatures become too hot, species have to rely on physiological mechanisms such as the production of heat shock proteins to withstand high temperatures (thermal tolerance). Both of these strategies are needed to cope with climate change.

    Researchers collaborated with the Smithsonian Tropical Research Institute (STRI) to study the thermal buffering and thermal tolerance strategies of tropical butterflies. They collected data from multiple habitats in Panama.  

    Equipped with hand-held nets, ecologists took the temperature of over 1,000 butterflies using a tiny thermometer-like probe. They compared each butterfly’s temperature to that of the surrounding air or the vegetation it was perched on. This gave a measurement of thermal buffering – the ability to maintain a steady body temperature against fluctuating air temperatures.
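    One common way to quantify thermal buffering from such paired measurements is the slope of body temperature against air temperature: a slope near 0 means strong buffering, while a slope near 1 means body temperature simply tracks the air. A minimal sketch with invented readings (not the study’s data):

```python
# Paired field readings (invented for illustration): air temperature and
# the butterfly's body temperature at the moment of capture, in degrees C.
air  = [24.0, 27.0, 30.0, 33.0, 36.0]
body = [31.0, 31.8, 32.5, 33.1, 34.0]   # a well-buffered individual

# Ordinary least-squares slope of body temperature on air temperature.
n = len(air)
mean_air = sum(air) / n
mean_body = sum(body) / n
slope = (sum((a - mean_air) * (b - mean_body) for a, b in zip(air, body))
         / sum((a - mean_air) ** 2 for a in air))
# slope well below 1: body temperature changes far less than the air,
# indicating strong thermal buffering
```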

    A second experiment was conducted at STRI Gamboa facilities and involved assessing butterflies’ thermal tolerance – their ability to withstand extreme temperatures, such as those they may experience during a heatwave. This was assessed by capturing a subset of butterflies and placing them in glass jars within a water bath – the temperature of which was steadily increased. Thermal tolerance was assessed as the temperature at which butterflies could no longer function.

    Butterflies that had large wings tended to have greater thermal buffering ability but less thermal tolerance than smaller butterflies. Indeed, in a further study conducted by the same research team, butterflies with larger, longer and narrower wings were found to be better at thermal buffering.

    Thermal buffering abilities were found to be stronger in darker-winged butterflies, which could also tolerate higher temperatures than paler-winged butterflies.

    Butterflies from the Lycaenidae family, which have small, bright and often iridescent wings, had the poorest thermal buffering and low thermal tolerance. If temperatures continue to rise at the current rate, forests continue to be cut down, and cool microclimates are lost, there is a very real threat that we could lose many species in this family in the future, say the researchers.

    A trade-off in terms of butterflies’ cooling strategies was observed: those that were good at thermal buffering were less good at thermal tolerance and vice versa.

    Scientists say this suggests that tropical butterflies have evolved to cope with temperature changes using one of these strategies at the expense of the other, and that this is likely to be due to selective pressures.

    Lead author Esme Ashe-Jepson, a PhD student at Cambridge’s Department of Zoology, said: “Butterflies with physical characteristics that may help them to avoid the sun’s heat, like having large wings that enable them to fly quickly into shade, rarely experience high temperatures, and so have not evolved to cope with them. On the other hand, species which can cope with higher temperatures physiologically have experienced less selective pressure to evolve heat-avoiding behaviours.

    “As temperatures continue to rise, and forest fragments get smaller and further apart because of deforestation, butterflies which rely on their surroundings to avoid high temperatures may not be able to travel between forest fragments, or cope with increasingly common heatwaves.”

    The researchers say this means that species with large dark wings that are good at thermal buffering may initially be unaffected by warming temperatures, as they can continue to thermoregulate effectively using behaviour and microclimates, but their survival could be at risk if there are sudden heatwaves or if they can no longer escape to cool vegetation.

    “Ultimately all insects, including butterflies, the world over are likely to be affected by climate change,” said Ashe-Jepson. “Adaptation to climate change is complex and can be impacted by other factors such as habitat destruction. We need to address these two global challenges together.”

    Further research is needed to investigate the effect a warming climate may have on other life stages of butterflies, such as caterpillars and eggs, and other insect groups.

    Senior author Greg Lamarre, of the Czech Academy of Sciences and a Research Associate at STRI, said: “Worldwide, most entomologists are observing drastic declines in insect biodiversity. Understanding the causes and consequences of insect decline has become an important goal in ecology, particularly in the tropics, where most terrestrial diversity occurs.”

    The research was funded by the GACR Czech Science Foundation, an ERC Starting Grant, a Smithsonian Tropical Research Institute short-term fellowship, and the Sistema Nacional de Investigación (SENACYT), Panama.

    University of Cambridge

  • Innovative memory tech cuts energy use, boosts performance

    Innovative memory tech cuts energy use, boosts performance


    Newswise — Researchers have developed a new design for computer memory that could both greatly improve performance and reduce the energy demands of internet and communications technologies, which are predicted to consume nearly a third of global electricity within the next ten years.

    The researchers, led by the University of Cambridge, developed a device that processes data in a similar way to the synapses in the human brain. The devices are based on hafnium oxide, a material already used in the semiconductor industry, and tiny self-assembled barriers, which can be raised or lowered to allow electrons to pass.

    This method of changing the electrical resistance in computer memory devices, and allowing information processing and memory to exist in the same place, could lead to the development of computer memory devices with far greater density, higher performance and lower energy consumption. The results are reported in the journal Science Advances.

    Our data-hungry world has led to a ballooning of energy demands, making it ever-more difficult to reduce carbon emissions. Within the next few years, artificial intelligence, internet usage, algorithms and other data-driven technologies are expected to consume more than 30% of global electricity.  

    “To a large extent, this explosion in energy demands is due to shortcomings of current computer memory technologies,” said first author Dr Markus Hellenbrand, from Cambridge’s Department of Materials Science and Metallurgy. “In conventional computing, there’s memory on one side and processing on the other, and data is shuffled back and forth between the two, which takes both energy and time.”

    One potential solution to the problem of inefficient computer memory is a new type of technology known as resistive switching memory. Conventional memory devices are capable of two states: one or zero. A functioning resistive switching memory device, however, would be capable of a continuous range of states – computer memory devices based on this principle would be capable of far greater density and speed.

    “A typical USB stick based on continuous range would be able to hold between ten and 100 times more information, for example,” said Hellenbrand.
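    The density claim follows from simple information arithmetic: a cell that can be set to N distinguishable resistance levels stores log2(N) bits, versus one bit for a binary cell. A sketch of that arithmetic (the 16-level count is illustrative, not a measured device figure):

```python
import math

def bits_per_cell(levels: int) -> float:
    """Information capacity of a memory cell with `levels` distinguishable states."""
    return math.log2(levels)

binary = bits_per_cell(2)    # conventional memory: one bit per cell
multi  = bits_per_cell(16)   # hypothetical 16-level resistive cell

density_gain = multi / binary   # 4x the data in the same number of cells
```

    The more resistance levels a cell can reliably distinguish, the larger this gain, which is where the ten-to-100-fold figures come from.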

    Hellenbrand and his colleagues developed a prototype device based on hafnium oxide, an insulating material that is already used in the semiconductor industry. The issue with using this material for resistive switching memory applications is known as the uniformity problem. At the atomic level, hafnium oxide has no structure, with the hafnium and oxygen atoms randomly mixed, making it challenging to use for memory applications.

    However, the researchers found that by adding barium to thin films of hafnium oxide, unusual structures started to form in the composite material, perpendicular to the plane of the hafnium oxide film.

    These vertical barium-rich ‘bridges’ are highly structured, and allow electrons to pass through, while the surrounding hafnium oxide remains unstructured. At the point where these bridges meet the device contacts, an energy barrier is created which electrons can cross. The researchers were able to control the height of this barrier, which in turn changes the electrical resistance of the composite material.

    “This allows multiple states to exist in the material, unlike conventional memory which has only two states,” said Hellenbrand.

    Unlike other composite materials, which require expensive high-temperature manufacturing methods, these hafnium oxide composites self-assemble at low temperatures. The composite material showed high levels of performance and uniformity, making it highly promising for next-generation memory applications.

    A patent on the technology has been filed by Cambridge Enterprise, the University’s commercialisation arm.

    “What’s really exciting about these materials is they can work like a synapse in the brain: they can store and process information in the same place, like our brains can, making them highly promising for the rapidly growing AI and machine learning fields,” said Hellenbrand.

    The researchers are now working with industry to carry out larger feasibility studies on the materials, in order to understand more clearly how the high-performance structures form. Since hafnium oxide is a material already used in the semiconductor industry, the researchers say it would not be difficult to integrate into existing manufacturing processes.

    The research was supported in part by the U.S. National Science Foundation and the Engineering and Physical Sciences Research Council (EPSRC), part of UK Research and Innovation (UKRI).

    University of Cambridge

  • Volcanic Eruptions’ Climate Impact Underestimated

    Volcanic Eruptions’ Climate Impact Underestimated


    Newswise — Researchers have found that the cooling effect that volcanic eruptions have on Earth’s surface temperature is likely underestimated by a factor of two, and potentially as much as a factor of four, in standard climate projections.

    While this effect is far from enough to offset the effects of global temperature rise caused by human activity, the researchers, led by the University of Cambridge, say that small-magnitude eruptions are responsible for as much as half of all the sulphur gases emitted into the upper atmosphere by volcanoes.

    The results, reported in the journal Geophysical Research Letters, suggest that improving the representation of volcanic eruptions of all magnitudes will in turn make climate projections more robust.

    Where and when a volcano erupts is not something that humans can control, but volcanoes do play an important role in the global climate system. When volcanoes erupt, they can spew sulphur gases into the upper atmosphere, which form tiny particles called aerosols that reflect sunlight back into space. For very large eruptions, such as Mount Pinatubo in 1991, the volume of volcanic aerosols is so large that it single-handedly causes global temperatures to drop.

    However, these large eruptions only happen a handful of times per century, whereas small-magnitude eruptions happen every year or two.

    “Compared with the greenhouse gases emitted by human activity, the effect that volcanoes have on the global climate is relatively minor, but it’s important that we include them in climate models, in order to accurately assess temperature changes in future,” said first author May Chim, a PhD candidate in the Yusuf Hamied Department of Chemistry.

    Standard climate projections, such as the Intergovernmental Panel on Climate Change (IPCC) Sixth Assessment Report, assume that explosive volcanic activity over 2015–2100 will be at the same level as the 1850–2014 period, and overlook the effects of small-magnitude eruptions.

    “These projections mostly rely on ice cores to estimate how volcanoes might affect the climate, but smaller eruptions are too small to be detected in ice-core records,” said Chim. “We wanted to make a better use of satellite data to fill the gap and account for eruptions of all magnitudes.”

    Using the latest ice-core and satellite records, Chim and her colleagues from the University of Exeter, the German Aerospace Center (DLR), the Ludwig-Maximilians University of Munich, Durham University, and the UK Met Office generated 1,000 different scenarios of future volcanic activity. They selected scenarios representing lower, median and high levels of volcanic activity, and then performed climate simulations using the UK Earth System Model.
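    The scenario-generation idea can be sketched as a toy Monte Carlo. The eruption-frequency and emission-size distributions below are invented for illustration; the study built its scenarios from ice-core and satellite records.

```python
import random

random.seed(42)  # reproducible toy run

def one_scenario(years=85):
    """Total sulphur emitted (arbitrary units) in one simulated future."""
    total = 0.0
    for _ in range(years):
        # a few eruptions per year at most; small eruptions dominate
        n_eruptions = random.choices([0, 1, 2, 3], weights=[30, 45, 20, 5])[0]
        for _ in range(n_eruptions):
            total += random.lognormvariate(-1.0, 1.5)  # heavy-tailed sizes
    return total

# 1,000 alternative futures, ranked by total sulphur emission.
scenarios = sorted(one_scenario() for _ in range(1000))
low, median, high = scenarios[24], scenarios[499], scenarios[974]
```

    Picking the 2.5th, 50th and 97.5th percentiles mirrors selecting lower, median and high activity levels to feed into the climate runs.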

    Their simulations show that the impacts of volcanic eruptions on climate, including global surface temperature, sea level and sea ice extent, are underestimated because current climate projections largely underestimate the plausible future level of volcanic activity.

    For the median future scenario, they found that the effect of volcanoes on the atmosphere, known as volcanic forcing, is being underestimated in climate projections by as much as 50%, due in large part to the effect of small-magnitude eruptions.

    “We found that not only is volcanic forcing being underestimated, but small-magnitude eruptions are actually responsible for as much as half of the volcanic forcing,” said Chim. “These small-magnitude eruptions may not have a measurable effect individually, but collectively, their effect is significant.

    “I was surprised to see just how important these small-magnitude eruptions are – we knew they had an effect, but we didn’t know it was so large.”
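    The point that small eruptions are individually negligible but collectively significant is easy to see with expected-value arithmetic. The numbers below are illustrative assumptions, not the study's figures:

```python
# Expected annual stratospheric sulphur injection (Tg SO2/yr) -- toy numbers
small_rate, small_mean = 1.5, 0.55   # ~1-2 small eruptions/yr, ~0.55 Tg each
large_rate, large_mean = 0.03, 15.0  # ~3 large eruptions/century, ~15 Tg each

small_total = small_rate * small_mean    # collective small-eruption injection
large_total = large_rate * large_mean    # rare large-eruption injection
fraction_small = small_total / (small_total + large_total)
print(f"small eruptions contribute {fraction_small:.0%} of expected injection")
```

    Even with each small eruption injecting a tiny fraction of a Pinatubo-scale event, their much higher frequency means they can account for a comparable share of the long-run total — the mechanism behind the study's roughly fifty-fifty split.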

    Although the cooling effect of volcanoes is being underestimated in climate projections, the researchers stress that it does not compare with human-generated carbon emissions.

    “Volcanic aerosols in the upper atmosphere typically stay in the atmosphere for a year or two, whereas carbon dioxide stays in the atmosphere for much, much longer,” said Chim. “Even if we had a period of extraordinarily high volcanic activity, our simulations show that it wouldn’t be enough to stop global warming. It’s like a passing cloud on a hot, sunny day: the cooling effect is only temporary.”

    The researchers say that fully accounting for the effect of volcanoes can help make climate projections more robust. They are now using their simulations to investigate whether future volcanic activity could threaten the recovery of the Antarctic ozone hole, and in turn maintain relatively high levels of harmful ultraviolet radiation at the Earth’s surface.

    The research was supported in part by the Croucher Foundation and The Cambridge Commonwealth, European & International Trust, the European Union, and the Natural Environment Research Council (NERC), part of UK Research and Innovation (UKRI).

    University of Cambridge

  • Weak policies and political ideologies risk jeopardising plans to tackle health and climate change, says Cambridge expert


    Newswise — Efforts to tackle major issues facing the UK, including the nation’s health and climate change, are being hampered because politicians often ignore the existing evidence when setting policies, according to Dame Theresa Marteau, a public health expert at the University of Cambridge.

    Writing in the journal Science and Public Policy, Professor Marteau argues that this ‘evidence-neglect’ is a result of incentive structures that encourage politicians to set ambitious policy goals while simultaneously disincentivising them from implementing the policies needed to achieve them, and of political ideologies and interests that conflict with effective policies.

    Two changes could mitigate these factors, she writes: engaging citizens more in policy-making so their interests dominate; and increasing the accountability of politicians through legally binding systems for all stages of policy-making. 

    Recent UK governments have set ambitious goals to improve the nation’s health and tackle climate change. These include halving childhood obesity by 2030, eradicating smoking by 2030, narrowing the gap in healthy life expectancy by 2030, and achieving net zero carbon emissions by 2050.

    But, says Professor Marteau, Director of the Behaviour and Health Research Unit at Cambridge, “None of these ambitions is on course. Of course, scientific evidence is just one of many sources of information for policymakers to consider, but neglecting evidence is a sure-fire route to unsuccessful policymaking.”

    According to predictions, childhood obesity is on track to double, not halve, by 2030. Smoking eradication is on track for sometime after 2050, not 2030. The gap in healthy life expectancy between the local areas where it is highest and lowest will have narrowed by 2030, but by 2035 it is set to rise by five years. And the UK Sixth Carbon Budget – a key target towards achieving net zero carbon emissions – is likely to be missed by “a huge margin”. 

    Achieving each of these ambitions requires sustained changes in several sets of behaviour across all socio-economic groups, including what we eat and drink, whether we smoke, and how we travel. A wealth of research demonstrates that achieving such change is difficult, requiring many interventions that change the environments or systems that too readily cue, reinforce and maintain unhealthier and unsustainable behaviours.

    “There are many possible reasons why these policy ambitions are so far off-track, but chief among them is the neglect of evidence, particularly around achieving sustained changes in behaviour across populations,” said Professor Marteau. 

    “Put simply, these failures are baked-in, given the policies designed to achieve these ambitions are based on interventions that cannot achieve the change required.” 

    Part of the problem, she says, lies in the incentive structures for politicians, which favour setting ambitious policy goals – whether to fulfil election promises, attract positive publicity, or both – while discouraging the policies needed to achieve them.

    “Fear of electoral damage plays a role here. Take taxes on tobacco, alcohol, junk food and carbon emissions: these are among the most effective interventions for improving health and the climate, but they are unpopular with the public and so politicians are unwilling to adopt them.” 

    Such policies may not just be unpopular with the public – they may also run counter to political interests and ideologies. Neoliberalism, for example, emphasises a small role for governments in the economy and public policy more generally, and a larger role for individuals to be personally responsible for behaving in ways to achieve health, wealth and happiness. Such ideologies often portray attempts by the government to intervene as ‘Nanny Statism’. 

    Certain industries, too, focus on personal responsibility to discourage politicians from adopting effective policies that conflict with their interests, such as those aimed at reducing consumption of fossil fuels, tobacco, alcohol, meat and junk food. These industries may cast doubt on the effectiveness of policies that would reduce their sales, as well as lobbying governments to persuade them of the business case for the status quo.

    Professor Marteau added: “There are no quick or single fixes to overcoming these problems, but there are two changes which could help: engaging citizens more in priority setting and policy design, and increasing the accountability of politicians through introducing legally-binding systems for reporting progress on policy ambitions.”

    There are a number of options available to policymakers when it comes to engaging citizens, including surveys, focus groups, town hall meetings and citizen assemblies, as well as working with civil society organisations. This approach has the potential to reduce the political costs of unpopular policies by exposing citizens to evidence for the effectiveness of policies, which – across many studies – has been shown to increase policy support. Policies designed with citizen engagement also attract more public support, since they are seen as fairer and, as a result, more likely to succeed.

    Introducing legally binding systems for reporting policies and progress on policy ambitions, with plans to get back on track if progress is off course, could be a powerful way to decrease the neglect of evidence which is central to policy success. 

    An example of this is the UK government’s recent Levelling Up strategy paper, which included plans to introduce a statutory obligation for government to report annually on progress towards meeting the Levelling Up missions.  Alongside these plans, it published a set of metrics against which to measure progress against the missions and evaluate the success of the strategy.

    “Although these requirements are by no means perfect, the legislation as drafted will at least allow parliament significantly more scrutiny of progress towards a government ambition than is often the case.”

    According to Professor Marteau, failure to consider the evidence risks undermining the government’s attempts to take action.

    “Laudable policy ambitions to improve a nation’s health and protect life on the planet will remain unfulfilled ambitions unless and until evidence is given a more central role in the policy-making process.”

    Reference

    Marteau, TM. Evidence-neglect: addressing a barrier to UK health and climate policy ambitions. Science and Public Policy; 20 June 2023; DOI: 10.1093/scipol/scad021 

    ENDS

     

    Once published, the full article will be available at: https://academic.oup.com/spp/article-lookup/doi/10.1093/scipol/scad021


  • Almost half of people with concussion still show symptoms of brain injury six months later


    Newswise — Even mild concussion can cause long-lasting effects to the brain, according to researchers at the University of Cambridge. Using data from a Europe-wide study, the team has shown that for almost half of all people who receive a knock to the head, there are changes in how regions of the brain communicate with each other, potentially causing long-term symptoms such as fatigue and cognitive impairment.

    Mild traumatic brain injury – concussion – results from a blow or jolt to the head. It can occur as a result of a fall, a sports injury or from a cycling accident or car crash, for example. But despite being labelled ‘mild’, it is commonly linked with persistent symptoms and incomplete recovery. Such symptoms include depression, cognitive impairment, headaches, and fatigue.

    While clinicians have often predicted that nine out of ten individuals who experience concussion will make a full recovery within six months, evidence is emerging that only half achieve a full recovery. This means that a significant proportion of patients may not receive adequate post-injury care.

    Predicting which patients will have a fast recovery and who will take longer to recover is challenging, however. At present, patients with suspected concussion will typically receive a brain scan – either a CT scan or an MRI scan, both of which look for structural problems, such as inflammation or bruising – yet even if these scans show no obvious structural damage, a patient’s symptoms may still persist.

    Dr Emmanuel Stamatakis from the Department of Clinical Neurosciences and Division of Anaesthesia at the University of Cambridge said: “Worldwide, we’re seeing an increase in the number of cases of mild traumatic brain injury, particularly from falls in our ageing population and rising numbers of road traffic collisions in low- and middle-income countries.

    “At present, we have no clear way of working out which of these patients will have a speedy recovery and which will take longer, and the combination of over-optimistic and imprecise prognoses means that some patients risk not receiving adequate care for their symptoms.”

    Dr Stamatakis and colleagues studied fMRI brain scans – that is, functional MRI scans, which look at how different areas of the brain coordinate with each other – taken from 108 patients with mild traumatic brain injury and compared them with scans from 76 healthy volunteers. Patients were also assessed for ongoing symptoms.

    The patients and volunteers had been recruited to CENTER-TBI, a large European research project which aims to improve the care for patients with traumatic brain injury, co-chaired by Professor David Menon (head of the division of Anaesthesia) and funded by the European Union.

    In results published today in Brain, the team found that just under half (45%) were still showing symptoms resulting from their brain injury, with the most common being fatigue, poor concentration and headaches.

    The researchers found that these patients had abnormalities in a region of the brain known as the thalamus, which integrates all sensory information and relays this information around the brain. Counter-intuitively, concussion was associated with increased connectivity between the thalamus and the rest of the brain – in other words, the thalamus was trying to communicate more as a result of the injury – and the greater this connectivity, the poorer the prognosis for the patient.

    Rebecca Woodrow, a PhD student in the Department of Clinical Neuroscience and Hughes Hall, Cambridge, said: “Despite there being no obvious structural damage to the brain in routine scans, we saw clear evidence that the thalamus – the brain’s relay system – was hyperconnected. We might interpret this as the thalamus trying to over-compensate for any anticipated damage, and this appears to be at the root of some of the long-lasting symptoms that patients experience.”

    By studying additional data from positron emission tomography (PET) scans, which can measure regional chemical composition of body tissues, the researchers were able to make associations with key neurotransmitters depending on which long-term symptoms a patient displayed. For example, patients experiencing cognitive problems such as memory difficulties showed increased connectivity between the thalamus and areas of the brain rich in the neurotransmitter noradrenaline; patients experiencing emotional symptoms, such as depression or irritability, showed greater connectivity with areas of the brain rich in serotonin.

    Dr Stamatakis, who is also Stephen Erskine Fellow at Queens’ College, Cambridge, added: “We know that there are already drugs that target these brain chemicals, so our findings offer hope that in future, not only might we be able to predict a patient’s prognosis, but we may also be able to offer a treatment targeting their particular symptoms.”

    Reference

    Woodrow, RE et al. Acute thalamic connectivity precedes chronic postconcussive symptoms in mild traumatic brain injury. Brain; 26 April 2023; DOI: 10.1093/brain/awad056

     

    ENDS

     

    About the University of Cambridge

    The University of Cambridge is one of the world’s leading universities, with a rich history of radical thinking dating back to 1209. Its mission is to contribute to society through the pursuit of education, learning and research at the highest international levels of excellence.

    Cambridge was second in the influential 2023 QS World University Rankings, the highest rated institution in the UK.

    The University comprises 31 autonomous Colleges and over 100 departments, faculties and institutions. Its 20,000 students include around 9,000 international students from 147 countries. In 2022, 72.5% of its new undergraduate students were from state schools and more than 25% from economically disadvantaged backgrounds.

    Cambridge research spans almost every discipline, from science, technology, engineering and medicine through to the arts, humanities and social sciences, with multi-disciplinary teams working to address major global challenges. In the Times Higher Education’s rankings based on the UK Research Excellence Framework, the University was rated as the highest scoring institution covering all the major disciplines.

    The University sits at the heart of the ‘Cambridge cluster’, in which more than 5,200 knowledge-intensive firms employ more than 71,000 people and generate £19 billion in turnover. Cambridge has the highest number of patent applications per 100,000 residents in the UK.

    www.cam.ac.uk


  • New research advances stable and affordable organic solar cells for renewable electricity


    Newswise — Due to the recent improvements in the efficiency with which solar cells made from organic (carbon-based) semiconductors can convert sunlight into electricity, improving the long-term stability of these photovoltaic devices is becoming an increasingly important topic.  Real-world applications of the technology demand that the efficiency of the photovoltaic device be maintained for many years. To address this key problem, researchers have studied the degradation mechanisms for the two components used in the light-absorbing layer of organic solar cells: the ‘electron donor’ and ‘electron acceptor’ materials. These two components are needed to split the bound electron-hole pair formed after the absorption of a photon into the free electrons and holes that constitute electrical current.

    In this new study reported in Joule, an international team of researchers led by the Cavendish Laboratory, University of Cambridge, have for the first time considered the degradation pathways of both the electron donor and electron acceptor materials. The detailed investigation of the electron donor material sets the current research work apart from the previous studies and provides important new insights for the field. Specifically, the identification of an ultrafast deactivation process unique to the electron donor material has not been observed before and provides a new angle to consider material degradation in organic solar cells.

    To understand how these materials degraded, the Cavendish researchers worked as part of an international team with scientists in the UK, Belgium, and Italy. Together, they combined photovoltaic device stability studies, where the operational solar cell is subjected to intense light that closely matches sunlight, with ultrafast laser spectroscopy performed in Cambridge. Through this laser technique, they were able to identify a new degradation mechanism in the electron donor material involving twisting of the polymer chain. As a result, when the twisted polymer absorbs a photon, it undergoes an extremely rapid deactivation pathway on femtosecond timescales (a millionth of a billionth of a second). This undesirable process is fast enough to outcompete the generation of free electrons and holes from a photon, which the scientists were able to correlate with the reduced efficiency of the organic solar cell after it had been exposed to simulated sunlight.

    “It was interesting to find that something as seemingly minor as the twisting of a polymer chain could have such a large effect on the solar cell efficiency,” said Dr. Alex Gillett, the lead author of the paper. “In the future, we plan to build on our findings by collaborating with chemistry groups to design new electron donor materials with more rigid polymer backbones. We hope that this will reduce the propensity of the polymer to twist and thus improve the stability of the organic solar cell device.”

    Due to their unique properties, organic solar cells can be used in a wide range of applications for which traditional silicon photovoltaics aren’t suitable. This could include electricity generating windows for greenhouses that transmit the colours of light required for photosynthesis, or even photovoltaics that could be rolled up for easy transportation and mobile electricity generation. Thus, by identifying the degradation mechanism that needs to be solved, the current research directly brings the next generation of photovoltaic materials and applications closer to reality.


  • Increasing availability of non-alcoholic drinks may reduce amount of alcohol purchased online


    Newswise — Increasing the proportion of non-alcoholic drinks on sale in online supermarkets could reduce the amount of alcohol people purchase, suggests a study published today led by researchers at the University of Cambridge.

    The team used a simulated supermarket that presented shoppers with varying proportions of alcoholic and non-alcoholic drinks and asked them to select drinks to purchase for their next online shop. They found that shoppers who were exposed to more non-alcoholic drinks selected and purchased fewer units of alcohol. The findings are published in PLOS Medicine.

    Excessive alcohol consumption is a major risk factor for a number of diseases, including cancer, heart disease and stroke. Encouraging people to change their behaviour could therefore have significant health benefits at both an individual and population level.

    There is increasing evidence that people can be ‘nudged’ towards reducing their alcohol consumption by making small adjustments to their environment. For example, scientists at Cambridge’s Behaviour and Health Research Unit have previously shown that serving wine in smaller glasses – even while keeping the amount of wine in the glasses the same – led to people consuming less alcohol.

    A recent analysis found that reducing the proportion of unhealthy snacks available can reduce how much of these food products people consume, though the evidence included was limited in both quality and quantity. The Cambridge team wanted to see if a similar approach might work to nudge people towards consuming fewer alcoholic drinks.

    The researchers recruited 737 adults living in England and Wales, all of whom regularly purchased alcohol online, to take part in the study. Of these, just over 600 completed the study and were included in the final analysis – 60% were female and the average (mean) age was 38.

    Participants selected drinks from 64 options in a simulated online supermarket designed to look and function like a real online supermarket. Options included a range of beers, ciders, alcohol-free beer and cider alternatives, and soft drinks.

    Participants were randomly assigned to one of three groups, each of which was presented with a different proportion of alcoholic and non-alcoholic drinks. 25% of the drinks seen by Group 1 were non-alcoholic. For Group 2, this increased to 50%, and for Group 3 the proportion of non-alcoholic drinks seen rose to 75%.

    Those exposed to the highest proportion of non-alcoholic drinks (Group 3) selected fewer units of alcohol – 17.5 units on average, compared with 29.4 units among those exposed to the lowest proportion of non-alcoholic drinks (Group 1) – a reduction of about 41%.
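    The quoted figure follows directly from the two group means reported above — a simple check:

```python
# Mean units of alcohol selected per group, as reported in the study
group1_units = 29.4  # lowest proportion (25%) of non-alcoholic drinks shown
group3_units = 17.5  # highest proportion (75%) of non-alcoholic drinks shown

reduction = (group1_units - group3_units) / group1_units
print(f"{reduction:.1%}")  # relative reduction in units selected
```

    This gives roughly 40.5%, matching the “about 41%” reported.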

    Participants were then asked to actually purchase the same drinks in an online supermarket, Tesco, the largest national supermarket in the UK. Around two-thirds of participants completed this second stage, with 422 participants going on to purchase drinks. The researchers point out that ‘cart abandonment’ – where people do not purchase items they put in their shopping cart – is common in online shopping contexts.

    The researchers found that amongst participants exposed to the highest proportion of non-alcoholic drinks, 52% of the drinks purchased were alcoholic, compared to 70% of drinks that were purchased by those exposed to the lowest proportion of non-alcoholic drinks.

    Lead author Dr Natasha Clarke said: “We created our simulated supermarket to be as close as possible to an actual online supermarket and found that increasing the proportion of non-alcoholic drinks that shoppers were exposed to made a meaningful difference to their alcohol selection. Though we’d need to confirm these findings using only a real online supermarket, they are very promising.”

    While the current market for alcohol-free beer, wine and spirits represents only a small share of the global alcohol industry, it is rapidly growing. For example, low and no-alcohol beer currently accounts for 3% of the total beer market, but this is forecast to increase by nearly 13% per year over the next 3 years and is the fastest growing drinks segment in the UK.
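    To put that forecast in perspective, compounding 13% annual growth over three years from a 3% share gives the following back-of-the-envelope projection. Note the simplification: this expresses the segment's future volume relative to today's total market, ignoring any growth in the overall beer market, which would dilute the share:

```python
current_share = 3.0   # low/no-alcohol beer's share of the total beer market (%)
annual_growth = 0.13  # forecast yearly growth of the segment
years = 3

# Compound growth: multiply by (1 + rate) once per year
projected = current_share * (1 + annual_growth) ** years
print(f"{projected:.1f}%")  # segment volume relative to today's total market
```

    That works out to roughly 4.3% of today's market volume.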

    Senior author Dr Gareth Hollands said: “Supermarkets typically stock a wider range of alcoholic drinks than non-alcoholic alternatives aimed at adults, but this is slowly changing. Our results suggest that if non-alcoholic options were to become the majority instead, we might expect to see substantial reductions in alcohol purchasing.”

    Importantly, the overall number of drinks that participants selected and purchased remained similar between groups, suggesting that effects were a result of shifting people’s choices. This implies overall drink sales and potentially revenues may be relatively unchanged, dependent on the pricing of non-alcoholic drinks.

    Professor Dame Theresa Marteau, Director of the Behaviour and Health Research Unit, said: “We all know that drinking too much alcohol is bad for us, but we’re often unaware of how much we are influenced by the environment around us. Making changes to this environment – from exposing people to a greater proportion of healthier options through to changing the sizes of the utensils we eat and drink from – can help us cut down on potentially unhealthy habits. Even relatively small changes can make a difference both to individuals and at a population level.”

    Although some of the non-alcoholic drink options in the current study contained no sugar and were generally lower in calories than the alcoholic options – an average of 64 calories per non-alcoholic drink versus 233 calories per alcoholic drink – many soft drinks and alcohol-free alternatives still contain large amounts of sugar and calories. The researchers argue that, given the health risks associated with sugary drink consumption, continued regulation and policies to reduce sugar content and consumption from both alcoholic and non-alcoholic drinks is needed to mitigate these risks.

    The research was funded by Wellcome and carried out at the Behaviour and Health Research Unit, University of Cambridge. Dr Clarke is now a Lecturer in Psychology at Bath Spa University. Dr Hollands is a Principal Research Fellow at UCL.

    Reference

    Clarke, N et al. Impact on alcohol selection and online purchasing of changing the proportion of available non-alcoholic versus alcoholic drinks: A randomised controlled trial. PLOS Med; 30 Mar 2023; DOI: 10.1371/journal.pmed.1004193


  • Men may not ‘perceive’ domestic tasks as needing doing in the same way as women, philosophers argue


    Newswise — Philosophers seeking to answer questions around inequality in household labour and the invisibility of women’s work in the home have proposed a new theory – that men and women are trained by society to see different possibilities for action in the same domestic environment. 

    They say a view called “affordance theory” – that we experience objects and situations as having actions implicitly attached – underwrites the age-old gender disparity when it comes to the myriad mundane tasks of daily home maintenance.

    For example, women may look at a surface and see an implied action – ‘to be wiped’ – whereas men may just observe a crumb-covered countertop.    

    The philosophers believe these deep-seated gender divides in domestic perception can be altered through societal interventions such as extended paternal leave, which will encourage men to build up mental associations for household tasks.

    Writing in the journal Philosophy and Phenomenological Research, they argue that available data – particularly data gathered during the pandemic – suggest two questions require explanation. 

    One is “disparity”: despite economic and cultural gains, why do women continue to shoulder the vast majority of housework and childcare? The other is “invisibility”: why do so many men believe domestic work to be more equally distributed than in fact it is?

    “Many point to the performance of traditional gender roles, along with various economic factors such as women taking flexible work for childcare reasons,” said Dr Tom McClelland, from Cambridge University’s Department of History and Philosophy of Science.

    “Yet the fact that stark inequalities in domestic tasks persisted during the pandemic, when most couples were trapped inside, and that many men continued to be oblivious of this imbalance, means this is not the full story.”

    McClelland and co-author Prof Paulina Sliwa argue that unequal divisions of labour in the home – and men’s inability to identify said labour – are best explained through the psychological notion of “affordances”: the idea that we perceive things as inviting or “affording” particular actions.

    “This is not just looking at the shape and size of a tree and then surmising you can climb it, but actually seeing a particular tree as climbable, or seeing a cup as drink-from-able,” said Sliwa, recently of Cambridge’s philosophy faculty and now at the University of Vienna. 

    “Neuroscience has shown that perceiving an affordance can trigger neural processes preparing you for physical action. This can range from a slight urge to overwhelming compulsion, but it often takes mental effort not to act on an affordance.”

    There are dramatic differences in “affordance perception” between individuals: one person sees a tree as climbable where another does not. Objects offer a vast array of affordances – one could see a spatula as an egg-frying tool or as a rhythmic instrument – and individuals vary in their sensitivity to them. 

    “If we apply affordance perception to the domestic environment and assume it is gendered, it goes a long way to answering both questions of disparity and invisibility,” said McClelland.

    According to the philosophers, when a woman enters a kitchen she is more likely to perceive the “affordances” for particular domestic tasks – she sees the dishes as ‘to be washed’ or a fridge as ‘to be stocked’.

    A man may simply observe dishes in a sink, or a half-empty fridge, but without perceiving the affordance or experiencing the corresponding mental “tug”. Over time, these little differences add up to significant disparities in who does what.  

    “Affordances pull on your attention,” said Sliwa. “Tasks may irritate the perceiver until done, or distract them from other plans. If resisted, it can create a felt tension.”

    “This puts women in a catch-22 situation: either inequality of labour or inequality of cognitive load.”

    This gender-based split in affordance perception could have a number of root causes, say the philosophers. Social cues, often given by adults when we are very young children, encourage particular actions in particular environments, and our visual systems update based on what we encounter most frequently.

    “Social norms shape the affordances we perceive, so it would be surprising if gender norms do not do the same,” said McClelland.

    “Some skills are explicitly gendered, such as cleaning or grooming, and girls are expected to do more domestic chores than boys. This trains their ways of seeing the domestic environment – to see a counter as ‘to be wiped’.”

    The “gendered affordance perception hypothesis” is not about absolving men, say Sliwa and McClelland. Despite a deficit in affordance perception in the home, a man can easily notice what needs doing by thinking rather than seeing. Nor should women’s sensitivity to domestic affordances be equated with a natural affinity for housework.

    “We can change how we perceive the world through continued conscious effort and habit cultivation,” said McClelland. “Men should be encouraged to resist gendered norms by improving their sensitivity to domestic task affordances. 

    “A man might adopt a resolution to sweep for crumbs every time he waits for the kettle to boil, for example. Not only would this help him do the tasks he doesn’t see, it would gradually retrain his perception so he starts to see the affordance in the future.”

    Collective efforts to change social norms require policy-level interventions, argue the philosophers. For example, shared parental leave gives fathers the opportunity to become more sensitive to caring-task affordances.

    Added Sliwa: “Our focus has been on physical actions such as sweeping or wiping, but gendered affordance perceptions could also apply to mental actions such as scheduling and remembering.”


    University of Cambridge

    Source link

  • Dynamical fractal discovered in clean magnetic crystal

    Dynamical fractal discovered in clean magnetic crystal


    Newswise — The nature and properties of materials depend strongly on dimension. Imagine how different life in a one-dimensional or two-dimensional world would be from the three dimensions we’re commonly accustomed to. With this in mind, it is perhaps not surprising that fractals – objects with fractional dimension – have garnered significant attention since their discovery. Despite their apparent strangeness, fractals arise in surprising places – from snowflakes and lightning strikes to natural coastlines.

    Researchers at the University of Cambridge, the Max Planck Institute for the Physics of Complex Systems in Dresden, the University of Tennessee, and the Universidad Nacional de La Plata have uncovered an altogether new type of fractal appearing in a class of magnets called spin ices. The discovery was surprising because the fractals were seen in a clean three-dimensional crystal, where they conventionally would not be expected. Even more remarkably, the fractals are visible in dynamical properties of the crystal, and hidden in static ones. These features motivated the appellation of “emergent dynamical fractal”.

    The fractals were discovered in crystals of the material dysprosium titanate, where the electron spins behave like tiny bar magnets. These spins cooperate through ice rules that mimic the constraints that protons experience in water ice. For dysprosium titanate, this leads to very special properties.

    Jonathan Hallén of the University of Cambridge is a PhD student and the lead author on the study. He explains that “at temperatures just slightly above absolute zero the crystal spins form a magnetic fluid.” This is no ordinary fluid, however.

    “With tiny amounts of heat, the ice rules get broken at a small number of sites, and the north and south poles making up each flipped spin separate from each other, traveling as independent magnetic monopoles.”

    The motion of these magnetic monopoles led to the discovery here. As Professor Claudio Castelnovo, also from the University of Cambridge, points out: “We knew there was something really strange going on. Results from 30 years of experiments didn’t add up.”

    Referring to a new study on the magnetic noise from the monopoles published earlier this year, Castelnovo continued, “After several failed attempts to explain the noise results, we finally had a eureka moment, realizing that the monopoles must be living in a fractal world and not moving freely in three dimensions, as had always been assumed.”

    In fact, this latest analysis of the magnetic noise showed the monopoles’ world needed to look less than three-dimensional: 2.53-dimensional, to be precise! Professor Roderich Moessner, Director of the Max Planck Institute for the Physics of Complex Systems in Germany, and Castelnovo proposed that the quantum tunneling of the spins themselves could depend on what the neighboring spins were doing.

    As Hallén explained, “When we fed this into our models, fractals immediately emerged. The configurations of the spins were creating a network that the monopoles had to move on. The network was branching as a fractal with exactly the right dimension.”
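What a non-integer dimension means can be made concrete with a box-counting estimate: cover a pattern with boxes of shrinking size and fit the slope of log(occupied boxes) against log(1/box size). The sketch below is purely illustrative (it uses the Sierpinski triangle, not anything from the spin-ice study) and recovers that pattern’s dimension of log 3 / log 2 ≈ 1.585, a fractional value of the same kind as the monopoles’ 2.53.

```python
import numpy as np

def sierpinski(n):
    """n x n boolean mask of the Sierpinski triangle (n a power of two).

    Pascal's triangle taken mod 2 reproduces the pattern exactly.
    """
    row = np.zeros(n, dtype=np.int64)
    row[0] = 1
    mask = np.zeros((n, n), dtype=bool)
    for i in range(n):
        mask[i] = row == 1
        row[1:] = (row[1:] + row[:-1]) % 2  # next Pascal row, mod 2
    return mask

def box_count_dimension(mask):
    """Estimate fractal dimension from the slope of log(count) vs log(1/size)."""
    n = mask.shape[0]
    sizes, counts = [], []
    size = n // 2
    while size >= 1:
        # partition into (size x size) boxes, count those containing any filled cell
        boxes = mask.reshape(n // size, size, n // size, size)
        counts.append(boxes.any(axis=(1, 3)).sum())
        sizes.append(size)
        size //= 2
    slope, _ = np.polyfit(np.log(1.0 / np.asarray(sizes)), np.log(counts), 1)
    return slope

print(box_count_dimension(sierpinski(512)))  # ~1.585 (= log 3 / log 2)
```

The Science study did something analogous in spirit: the scaling of the measured magnetic noise was only consistent with monopoles moving on a structure of dimension roughly 2.53, rather than freely in three dimensions.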

    But why had this been missed for so long?

    Hallén elaborated that, “this wasn’t the kind of static fractal we normally think of. Instead, at longer times the motion of the monopoles would actually erase and rewrite the fractal.”

    This made the fractal invisible to many conventional experimental techniques.

    Working closely with Professors Santiago Grigera of the Universidad Nacional de La Plata, and Alan Tennant of the University of Tennessee, the researchers succeeded in unravelling the meaning of the previous experimental works.

    “The fact that the fractals are dynamical meant they did not show up in standard thermal and neutron scattering measurements,” said Grigera and Tennant. “It was only because the noise was measuring the monopoles’ motion that it was finally spotted.”

    As regards the significance of the results, which appear in Science this week, Moessner explains: “Besides explaining several puzzling experimental results that have been challenging us for a long time, the discovery of a mechanism for the emergence of a new type of fractal has led to an entirely unexpected route for unconventional motion to take place in three dimensions.”

    Overall, the researchers are interested to see what other properties of these materials may be predicted or explained in light of the new understanding provided by their work, including ties to intriguing properties like topology. With spin ice being one of the most accessible instances of a topological magnet, Moessner said, “the capacity of spin ice to exhibit such striking phenomena makes us hopeful that it holds promise of further surprising discoveries in the cooperative dynamics of even simple topological many-body systems.”


    University of Cambridge

    Source link

  • London Underground polluted with metallic particles small enough to enter human bloodstream

    London Underground polluted with metallic particles small enough to enter human bloodstream


    Newswise — The London Underground is polluted with ultrafine metallic particles small enough to end up in the human bloodstream, according to University of Cambridge researchers. These particles are so small that they are likely being underestimated in surveys of pollution in the world’s oldest metro system.

    The researchers carried out a new type of pollution analysis, using magnetism to study dust samples from Underground ticket halls, platforms and operator cabins.

    The team found that the samples contained high levels of a type of iron oxide called maghemite. Since it takes time for iron to oxidise into maghemite, the results suggest that pollution particles are suspended for long periods, due to poor ventilation throughout the Underground, particularly on station platforms.

    Some of the particles are as small as five nanometres in diameter: small enough to be inhaled and end up in the bloodstream, but too small to be captured by typical methods of pollution monitoring. However, it is not clear whether these particles pose a health risk.

    Other studies have looked at overall pollution levels on the Underground and the associated health risks, but this is the first time that the size and type of particles has been analysed in detail. The researchers suggest that periodic removal of dust from Underground tunnels, as well as magnetic monitoring of pollution levels, could improve air quality throughout the network. Their results are reported in the journal Scientific Reports.

    The London Underground carries five million passengers per day. Multiple studies have shown that air pollution levels on the Underground are higher than those in London more broadly, and beyond the World Health Organization’s (WHO) defined limits. Earlier studies have also suggested that most of the particulate matter on the Underground is generated as the wheels, tracks and brakes grind against one another, throwing up tiny, iron-rich particles.

    “Since most of these air pollution particles are metallic, the Underground is an ideal place to test whether magnetism can be an effective way to monitor pollution,” said Professor Richard Harrison from Cambridge’s Department of Earth Sciences, the paper’s senior author. “Normally, we study magnetism as it relates to planets, but we decided to explore how those techniques could be applied to different areas, including air pollution.”

    Pollution levels are normally monitored using standard air filters, but these cannot capture ultrafine particles, and they do not detect what kinds of particles are contained within the particulate matter.

    “I started studying environmental magnetism as part of my PhD, looking at whether low-cost monitoring techniques could be used to characterise pollution levels and sources,” said lead author Hassan Sheikh from Cambridge’s Department of Earth Sciences. “The Underground is a well-defined micro-environment, so it’s an ideal place to do this type of study.”

    Working with colleagues from Cambridge’s Department of Materials Science and Metallurgy, Sheikh and Harrison analysed 39 dust samples from the London Underground, provided by Transport for London (TfL). The samples were collected in 2019 and 2021 from platforms, ticket halls, and train operator cabins on the Piccadilly, Northern, Central, Bakerloo, Victoria, District and Jubilee lines. The sampling included major stations such as King’s Cross St Pancras, Paddington, and Oxford Circus.

    The researchers used magnetic fingerprinting, 3D imaging and nanoscale microscopy to characterise the structure, size, shape, composition and magnetic properties of particles contained in the samples. Earlier studies have shown that 50% of the pollution particles in the Underground are iron-rich, but the Cambridge team were able to look in much closer detail. They found a high abundance of maghemite particles, ranging in diameter from five to 500 nanometres, and with an average diameter of 10 nanometres. Some particles formed larger clusters with diameters between 100 and 2,000 nanometres.

    “The abundance of these very fine particles was surprising,” said Sheikh. “The magnetic properties of iron oxides fundamentally change as the particle size changes. In addition, the size range where those changes happen is the same as where air pollution becomes a health risk.”

    While the researchers did not look at whether these maghemite particles pose a direct health risk, they say that their characterisation methods could be useful in future studies.

    “If you’re going to answer the question of whether these particles are bad for your health, you first need to know what the particles are made of and what their properties are,” said Sheikh.

    “Our techniques give a much more refined picture of pollution in the Underground,” said Harrison. “We can measure particles that are small enough to be inhaled and enter the bloodstream. Typical pollution monitoring doesn’t give you a good picture of the very small stuff.”

    The researchers say that due to poor ventilation in the Underground, iron-rich dust can be resuspended in the air when trains arrive at platforms, making the air quality on platforms worse than in ticket halls or in operator cabins.

    Given the magnetic nature of the resuspended dust, the researchers suggest that an efficient removal system might be magnetic filters in ventilation, cleaning of the tracks and tunnel walls, or placing screen doors between platforms and trains.

    The research was supported in part by the European Union, the Cambridge Trust and Selwyn College, Cambridge.


    University of Cambridge

    Source link

  • Ethiopian schools study suggests COVID has “ruptured” social skills of the world’s poorest children

    Ethiopian schools study suggests COVID has “ruptured” social skills of the world’s poorest children


    Newswise — School closures during the COVID-19 pandemic have “severely ruptured” the social and emotional development of some of the world’s poorest children, as well as their academic progress, new evidence shows.

    In a study of over 2,000 primary school pupils in Ethiopia, researchers found that key aspects of children’s social and emotional development, such as their ability to make friends, not only stalled during the school closures, but probably deteriorated.

    Children who, prior to the pandemic, felt confident talking to others or got on well with peers were less likely to do so by 2021. Those who were already disadvantaged educationally – girls, the very poorest, and those from rural areas – seem to have been particularly badly affected.

    Both this research and a second, linked study of around 6,000 grade 1 and 4 primary school children, also found evidence of slowed academic progress. Children lost the equivalent of at least one third of an academic year in learning during lockdown – an estimate researchers describe as “conservative”. This appears to have widened an already significant attainment gap between disadvantaged pupils and the rest, and there is some evidence that this may be linked to the drop in social skills.

    Both studies were by academics from the University of Cambridge, UK and Addis Ababa University, Ethiopia.

    Professor Pauline Rose, Director of the Research in Equitable Access and Learning (REAL) Centre at the Faculty of Education, University of Cambridge, said: “COVID is having a long-term impact on children everywhere, but especially in lower-income countries. Education aid and government funding must focus on supporting both the academic and socio-emotional recovery of the most disadvantaged children first.”

    Professor Tassew Woldehanna, President of Addis Ababa University, said: “These severe ruptures to children’s developmental and learning trajectories underline how much we need to think about the impact on social, and not just academic, skills. Catch-up education must address the two together.”

    Both studies used data from the Research on Improving Systems of Education (RISE) programme in Ethiopia to compare primary education before the pandemic, in the academic year 2018/19, with the situation in 2020/21.

    In the first study, researchers compared the numeracy test scores of 2,700 Grade 4 pupils in June 2019 with their scores shortly after they returned to school, in January 2021. They also measured dropout rates. In addition, pupils completed the Children’s Self Report Social Skills scale, which asked how much they agreed or disagreed with statements such as “I feel confident talking to others”, “I make friends easily”, and “If I hurt someone, I say sorry”.

    The second study measured relative progress during the pandemic using the numeracy scores of two separate cohorts of Grade 1 and Grade 4 pupils. The first of these cohorts was from the pre-pandemic year; the other from 2020/21.

    The results suggest pupils made some academic progress during the closures, but at a slower than expected rate. The average foundational numeracy score of Grade 1 pupils in 2020/21 was 15 points behind the 2018/19 cohort; by the end of the year that gap had widened to 19 points. Similarly, Grade 4 students started 2020/21 10 points behind their predecessor cohort, and were 12 points adrift by the end. That difference amounted to roughly one third of a year’s progress. Similar patterns emerged from the study of children’s numeracy scores before and after the closures.

    Poorer children, and those from rural backgrounds, consistently performed worse academically. Dropout rates revealed similar issues: of the 2,700 children assessed in 2019 and 2021, more than one in 10 (11.3%) dropped out of school during the closures. These were disproportionately girls, or lower-achieving pupils, who tended to be from less wealthy or rural families.

    All pupils’ social skills declined during the closure period, regardless of gender or location. Fewer children agreed in 2021 with statements such as “Other people like me” or “I make friends easily”. The decline in positive responses differed by demographic, and was sharpest among those from rural settings. This may be because children from remote parts of the country experienced greater isolation during lockdown.

    The most striking evidence of a rupture in socio-emotional development was the lack of a predictive association between the 2019 and 2021 results. Pupils who felt confident talking to others before the pandemic, for example, had often changed their minds two years later.

    Researchers suggest that the negative impact on social and emotional development may be linked to the slowdown in academic attainment. Children who did better academically in 2021 tended to report stronger social skills. This association is not necessarily causal, but there is evidence that academic attainment improves children’s self-confidence and esteem, and that prosocial behaviours positively influence academic outcomes. It is therefore possible that during the school closures this potential reinforcement was reversed.

    Both reports echo previous research which suggests that lower-income countries such as Ethiopia need to invest in targeted programmes for girls, those from rural backgrounds, and the very poorest, if they are to prevent these children from being left behind. Alongside in-school catch-up programmes, action may be required to support those who are out of school. Ghana’s successful Complementary Basic Education initiative provides one model.

    In addition, the researchers urge education policy actors to integrate support for social skills into both catch-up education and planning for future closures. “Social and emotional skills should be an explicit goal of the curriculum and other guidance,” Rose said. “Schools may also want to think about after-school clubs, safe spaces for girls, and ensuring that primary-age children stay with the same group of friends during the day. Initiatives like these will go some way towards rebuilding the prosocial skills the pandemic has eroded.”

    Ruptured School Trajectories is published in the journal Longitudinal and Life Course Studies. Learning Losses during the COVID-19 Pandemic in Ethiopia is available on the REAL Centre website.


    University of Cambridge

    Source link

  • Plants employ chemical engineering to manufacture bee-luring optical devices

    Plants employ chemical engineering to manufacture bee-luring optical devices


    Newswise — While most flowers produce pigments that appear colourful and act as a visual cue to pollinators, some flowers also create microscopic three-dimensional patterns on their petal surfaces. These parallel striations reflect particular wavelengths of light to produce an iridescent optical effect that is not always visible to human eyes, yet visible to bees.

    There is a lot of competition for the attention of pollinators and – given that 35 percent of the world’s crops rely on animal pollinators – understanding how plants make petal patterns that please pollinators could be significant for directing future research and policies in agriculture, biodiversity and conservation.

    Research led by Professor Beverley Glover’s team at Cambridge’s Department of Plant Sciences revealed there is more to petal patterning than meets the eye. Previous results indicated that mechanical buckling of the thin, protective cuticle layer on the surface of the young growing petals could trigger the formation of microscopic ridges. These semi-ordered ridges act as diffraction gratings that reflect different wavelengths of light to create a weak iridescent blue-halo effect in the blue-UV spectrum that bumblebees can see. However, why those striations only form in certain flowers or even only on certain parts of the petals was not understood.
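The diffraction behind the blue halo follows the classical grating equation d·sin θ = mλ: the ridge spacing d sets the angle at which each wavelength λ is reflected. As a minimal sketch, assuming an illustrative ridge spacing of about 1 µm (a round number typical of this kind of petal striation, not a measurement reported in this article):

```python
import math

def diffraction_angle_deg(spacing_nm, wavelength_nm, order=1):
    """Angle of the m-th diffracted order from the grating equation d*sin(theta) = m*lambda."""
    s = order * wavelength_nm / spacing_nm
    if abs(s) > 1:
        return None  # this wavelength/order is not diffracted at all
    return math.degrees(math.asin(s))

# Assumed 1000 nm striation spacing; UV (~350 nm) and blue (~450 nm) light
for wl in (350, 450):
    print(wl, "nm ->", round(diffraction_angle_deg(1000, wl), 1), "degrees")
```

Shorter (blue-UV) wavelengths leave at shallower angles than longer ones (roughly 20° versus 27° here), which is why semi-ordered ridges produce a wavelength-dependent halo that bee vision, sensitive in the blue-UV range, can pick up.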

    Edwige Moyroud, who started this research in Professor Glover’s lab and is now leading her own research group at the Sainsbury Laboratory, has developed the Australian native hibiscus, Venice mallow (Hibiscus trionum), as a new model species to try to understand how and when these nanostructures develop.

    “Our initial model predicted that how much cells grow and how much cuticle those cells make were key factors controlling the formation of striations,” said Dr Moyroud, “but when we started to test the model using experimental work in Venice mallow we found out that their formation is also highly dependent on cuticle chemistry, which affects how the cuticle responds to the forces that cause buckling. The next question we want to explore is how different chemistries can change the mechanical properties of the cuticle, as a nanostructure-building material. It may be that different chemical compositions result in a cuticle with differing architecture or with different stiffness and hence different ways of reacting to the forces experienced by cells as the petal grows.” 

    This project revealed that there is a combination of processes working together and allowing plants to shape their surfaces. Dr Moyroud added: “Plants are formidable chemists and these results illustrate how they can precisely tune the chemistry of their cuticle to produce different textures across their petals. Patterns formed at the microscopic scale can fulfil a range of functions, from communication with pollinators to defence against herbivores or pathogens. They are striking examples of evolutionary diversification and by combining experiments and computational modelling we are starting to understand a little bit better how plants can fabricate them.”

    The findings will be published on 23 November 2022 in the journal Current Biology.

    “These insights are also useful for biodiversity and conservation work because they help to explain how plants interact with their environment,” said Professor Glover, who is also Director of the Cambridge University Botanic Garden, in which the researchers first noticed the iridescent flowers of Venice mallow. “For example, species that are closely related but that grow in different geographic regions can have very different petal patterns. Understanding why petal patterning varies and how this might affect the relationship between the plants and their pollinators could help to better inform policies in future management of environmental systems and conservation of biodiversity.”

     

    Investigating what drives 3D petal patterning

    The researchers took a stepwise approach to the investigations. They first observed petal development and noticed that the cuticle patterns appear when cells elongate, suggesting that growth was important. They then determined whether measuring physical parameters related to growth, such as cell expansion and cuticle thickness, could adequately predict the patterns observed, and found that they couldn’t. They then took a step backwards to try to identify what was missing.

    The properties of a material, whether inorganic or produced by living cells like the cuticle, are likely to depend on its chemical nature. With this in mind, the researchers decided to look at cuticle chemistry, and found that, indeed, this is a controlling factor. They first used a new method from the chemistry field to analyse the composition of the cuticle at very specific points across the petal. This showed that petal regions with contrasting textures (smooth or striated) also differ in their surface chemistry: compared with the smooth cuticle, the striated cuticle has high levels of dihydroxy-palmitic acid and waxes and low levels of phenolic compounds.

    To test whether cuticle chemistry was indeed important, they then pioneered a transgenic approach in Hibiscus to alter cuticle chemistry directly in the plants, using genes similar to those known to control the production of cuticle molecules in a different model plant, Arabidopsis. This showed that cuticle texture can be modified, without changing cell growth, simply by modifying cuticle composition.

    How can cuticle chemistry control its 3D folding? The researchers think that a change in cuticle chemistry affects the mechanical properties of the cuticle: even when stretched using a special device, transgenic petals with smooth cuticle remained smooth, unlike those from wild-type plants.

     

    Funding 

    This work was funded by EU Marie Curie Actions, BBSRC, the European Research Council, Herchel Smith Fund and the Gatsby Charitable Foundation. 

     

    Reference

    Edwige Moyroud, Chiara A. Airoldi, Jordan Ferria, Chiara Giorio, Sarah S. Steimer, Paula J. Rudall, Christina J. Prychid, Shannon Halliwell, Joseph F. Walker, Sarah Robinson, Markus Kalberer, and Beverley J. Glover (2022) Cuticle chemistry drives the development of diffraction gratings on the surface of Hibiscus trionum petals. Current Biology

    DOI: 10.1016/j.cub.2022.10.065

    Public URL: https://www.cell.com/current-biology/fulltext/S0960-9822(22)01713-4


    University of Cambridge

    Source link