ReportWire

Tag: brain implants

  • WTF Fun Fact 13633 – Communication via Brain Implants

    Imagine a world where thoughts translate into words without uttering a single sound via brain implants.

At Duke University, a groundbreaking project involving neuroscientists, neurosurgeons, and engineers has produced a speech prosthetic capable of converting brain signals into spoken words. This innovation, detailed in the journal Nature Communications, could redefine communication for those with speech-impairing neurological disorders.

    Currently, people with conditions like ALS or locked-in syndrome rely on slow and cumbersome communication methods. Typically, speech decoding rates hover around 78 words per minute, while natural speech flows at about 150 words per minute. This gap in communication speed underscores the need for more advanced solutions.

To bridge this gap, Duke’s team, including neurologist Gregory Cogan and biomedical engineer Jonathan Viventi, has introduced a high-tech approach. They created an implant with 256 tiny sensors on a flexible, medical-grade material. By capturing the nuanced brain activity essential for speech, the device marks a significant leap over previous models with fewer sensors.

    The Test Drive: From Lab to Real Life

    The real challenge was testing the implant in a real-world setting. Patients undergoing unrelated brain surgeries, like Parkinson’s disease treatment or tumor removal, volunteered to test the implant. The Duke team, likened to a NASCAR pit crew by Dr. Cogan, had a narrow window of 15 minutes during these surgeries to conduct their tests.

Patients participated in a simple task: listening to and repeating nonsensical words. The implant recorded activity in the speech motor cortex, the brain region that coordinates the muscles involved in speech. This data was then fed into a machine learning algorithm, managed by Suseendrakumar Duraivel, to predict the intended sounds based on brain activity.

    While accuracy varied, some sounds and words were correctly identified up to 84% of the time. Despite the challenges, such as distinguishing between similar sounds, the results were promising, especially considering the brevity of the data collection period.

    The Road Ahead for Brain Implants

    The team’s next steps involve creating a wireless version of the device, funded by a $2.4M grant from the National Institutes of Health. This advancement would allow users greater mobility and freedom, unencumbered by wires and electrical outlets. However, reaching a point where this technology matches the speed of natural speech remains a challenge, as noted by Viventi.

    The Duke team’s work represents a significant stride in neurotechnology, potentially transforming the lives of those who have lost their ability to speak. While the current version may still lag behind natural speech rates, the trajectory is clear and promising. The dream of translating thoughts directly into words is becoming more tangible, opening new horizons in medical science and communication technology. This endeavor, supported by extensive research and development, signals a future where barriers to communication are continually diminished, offering hope and empowerment to those who need it most.

    Source: “Brain implant may enable communication from thoughts alone” — ScienceDaily

  • Elon Musk’s Brain Chip Company Approved for Human Testing | Entrepreneur

Elon Musk’s company Neuralink announced late today that the U.S. Food and Drug Administration (FDA) gave it the green light to experiment with implanting brain chips in humans.

    The company, founded in 2016 and primarily funded by billionaire Musk, develops electronic implants that decode brain activity and communicate it to computers. While other companies have used brain implants to assist people with debilitating medical conditions like paralysis and ALS, Neuralink’s brain chips have only been used in monkeys.

    But that will soon change.

    Musk has said, “I think we have a chance with Neuralink to restore full-body functionality to someone who has a spinal cord injury.” But he and the company also want to take the technology a step further — maximizing the potential of healthy people to keep up with artificial intelligence.

    “We want to surpass able-bodied human performance with our technology,” Neuralink tweeted last month.

    Related: Elon Musk’s Neuralink Is Under Investigation for Allegedly Jeopardizing Human Safety

    ‘The future is going to be weird.’

    Musk said he envisions a world where patients can drop by clinics to have a chip surgically implanted into their brains by a robot. “You’ll be able to save and replay memories,” he said at a show-and-tell presentation last year. “The future is going to be weird.”

    He also predicts customers will want to upgrade their brain chips to the latest models regularly.

    “I’m pretty sure you would not want the iPhone 1 stuck in your head if the iPhone 14 is available,” Musk said.

    Musk is so confident that the devices are safe that he would be willing to implant them in his children.

    It is still unclear if the brain implants will pass the rigorous FDA trial stage. Still, the announcement is a significant step forward for Musk’s business empire and brain-computer interface technology.

    Jonathan Small

  • What If Big Tech Could Read Your Mind?

Oct. 12, 2022 – Since his mid-30s, Greg had lived in a nursing home. An assault 6 years earlier left him barely conscious, unable to talk or eat. Two years of rehab did little to help him. Most people in Greg’s condition would have remained nonverbal and separated from the world for the rest of their lives. But at age 38, Greg received a brain implant through a clinical trial.

    Surgeons installed an electrode on either side of his thalamus, the main relay station of the brain. 

    “People who are in the minimally conscious state have intact brain circuitry, but those circuits are under-activated,” explains Joseph Fins, MD, chief of the Division of Medical Ethics at Weill Cornell Medicine in New York City. Delivering electrical impulses to affected regions can revive those circuits, restoring lost or weakened function. 

“These devices are like pacemakers for the brain,” says Fins, who co-authored a study in Nature about Greg’s surgery.

    The researchers switched Greg’s device off and on every 30 days for 6 months, observing how the electrical stimulation (or lack thereof) altered his abilities. They saw remarkable things. 

“With the deep brain stimulator, he was able to say six- or seven-word sentences, the first 16 words of the Pledge of Allegiance. Tell his mother he loved her. Go shopping at Old Navy and voice a preference for the kind of clothing his mother was buying,” recalls Fins, who shared Greg’s journey in his book, Rights Come to Mind: Brain Injury, Ethics and the Struggle for Consciousness.

    After 6 years of silence, Greg regained his voice.

    Yet success stories like his aren’t without controversy, as the technology has raised many ethical questions: Can a minimally conscious person consent to brain surgery?  What happens to the people being studied when clinical trials are over? How can people’s neural data be responsibly used – and protected? 

    “I think that motto, ‘Move fast and break things,’ is a really bad approach,” says Veljko Dubljevic, PhD, an associate professor of science, technology, and society at North Carolina State University. He’s referring to the unofficial tagline of Silicon Valley, the headquarters for Elon Musk’s neurotechnology company, Neuralink. 

    Neuralink was founded in 2016, nearly a decade after the study about Greg’s brain implant was published. Yet it has been Musk’s company that has most visibly thrust neurotechnology into public consciousness, owing somewhat to its founder’s often overstated promises. (In 2019, Musk claimed his brain-computer interface would be implanted in humans in 2020. He has since moved that target to 2022.) Musk has called his device “a Fitbit in your skull,” though it’s officially named the “Link.” 

Brain-computer interfaces, or BCIs, are already implanted in 36 people around the world, according to Blackrock, a leading maker of these devices. What makes Neuralink different is its ambitious goal to implant over 1,000 thinner-than-hair electrodes. If the Link works as intended – by monitoring a person’s brain activity and commanding a computer to do what they want – people with paralysis, like quadriplegia, could regain a lot of independence.

    The History Behind Brain Implants

    BCIs – brain implants that communicate with an external device, typically a computer – are often framed as a science-fiction dream that geniuses like Musk are making a reality. But they’re deeply indebted to a technology that’s been used for decades: deep brain stimulation (DBS). In 1948, a neurosurgeon at Columbia University implanted an electrode into the brain of a woman diagnosed with depression and anorexia. The patient improved – until the wire broke a few weeks later. Still, the stage was set for longer-term neuromodulation.

    It would be movement disorders, not depression, that ultimately catapulted DBS into the medical mainstream. In the late 1980s, French researchers published a study suggesting the devices could improve essential tremor and the tremor associated with Parkinson’s. The FDA approved DBS for essential tremor in 1997; approval for Parkinson’s followed in 2002. DBS is now the most common surgical treatment for Parkinson’s disease.

    Since then, deep brain stimulation has been used, often experimentally, to treat a variety of conditions, ranging from obsessive-compulsive disorder to Tourette’s to addiction. The advancements are staggering: Newer closed-loop devices can directly respond to the brain’s activity, detecting, for example, when a seizure in someone with epilepsy is about to happen, then sending an electrical impulse to stop it.

    In clinical trials, BCIs have helped people with paralysis move prosthetic limbs. Implanted electrodes enabled a blind woman to decipher lines, shapes, and letters. In July, Synchron – widely considered Neuralink’s chief competitor – implanted its Stentrode device into its first human subject in the U.S. This launched an unprecedented FDA-approved trial and puts Synchron ahead of Neuralink (which is still in the animal-testing phase). Australian research has already shown that people with Lou Gehrig’s disease (also called amyotrophic lateral sclerosis, or ALS) can shop and bank online using the Stentrode.

    With breakthroughs like these, it’s hard to envision any downsides to brain implants. But neuroethicists warn that if we don’t act proactively – if companies fail to build ethical concerns into the very fabric of neurotechnology – there could be serious downstream consequences. 

    The Ethics of Safety and Durability 

    It’s tempting to dismiss these concerns as premature. But neurotechnology has already gained a firm foothold, with deep brain stimulators implanted in 200,000 people worldwide. And it’s still not clear who is responsible for the care of those who received the devices from clinical trials. 

    Even if recipients report benefits, that could change over time as the brain encapsulates the implant in glial tissue. This “scarification” interferes with the electrical signal, says Dubljevic, reducing the implant’s ability to communicate. But removing the device could pose a significant risk, such as bleeding in the brain. Although cutting-edge designs aim to resolve this – the Stentrode, for example, is inserted into a blood vessel, rather than through open brain surgery – many devices are still implanted, probe-like, deep into the brain. 

    Although device removal is usually offered at the end of studies, the cost is often not covered as part of the trial. Researchers typically ask the individual’s insurance to pay for the procedure, according to a study in the journal Neuron. But insurers have no obligation to remove a brain implant without a medically necessary reason. A patient’s dislike for the device generally isn’t sufficient. 

Acceptance among recipients is hardly uniform. Patient interviews suggest these devices can alter identity, making people feel less like themselves, especially if they’re already prone to poor self-image.

    “Some feel like they’re controlled by the device,” says Dubljevic, obligated to obey the implant’s warnings; for example, if a seizure may be imminent, being forced not to take a walk or go about their day normally. 

    “The more common thing is that they feel like they have more control and greater sense of self,” says Paul Ford, PhD, director of the NeuroEthics Program at the Cleveland Clinic. But even those who like and want to keep their devices may find a dearth of post-trial support – especially if the implant wasn’t statistically proven to be helpful. 

    Eventually, when the device’s battery dies, the person will need a surgery to replace it. 

    “Who’s gonna pay for that? It’s not part of the clinical trial,” Fins says. “This is kind of like giving people Teslas and not having charging stations where they’re going.” 

    As neurotechnology advances, it’s critical that health care systems invest in the infrastructure to maintain brain implants – in much the same way that someone with a pacemaker can walk into any hospital and have a cardiologist adjust their device, Fins says.

“If we’re serious about developing this technology, we should be serious about our responsibilities longitudinally to these participants.”

    The Ethics of Privacy

It’s not just the medical aspects of brain implants that raise concerns, but also the glut of personal data they record. Dubljevic compares neural data now to blood samples 50 years ago, before scientists could extract genetic information. Fast-forward to today, when those same samples can easily be linked to individuals.

    “Technology may progress so that more personal information can be gleaned from recordings of brain data,” he says. “It’s currently not mind-reading in any way, shape, or form. But it may become mind-reading in something like 20 or 30 years.” 

    That term – mind-reading – is thrown around a lot in this field. 

    “It’s kind of the science-fiction version of where the technology is today,” says Fins. (Brain implants are not currently able to read minds.) 

    But as device signals become clearer, data will become more precise. Eventually, says Dubljevic, scientists may be able to figure out attitudes or psychological states.

    “Someone could be labeled as less attentive or less intelligent” based on neural patterns, he says. 

    Brain data could also expose unknown medical conditions – for example, a history of stroke – that may be used to raise an individual’s insurance premiums or deny coverage altogether. Hackers could potentially seize control of brain implants, shutting them off or sending rogue signals to the user’s brain.

    Some researchers, including Fins, say that storing brain data is no riskier than keeping medical records on your phone. 

“It’s about cybersecurity writ large,” he says.

    But others see brain data as uniquely personal. 

    “These are the only data that reveal a person’s mental processes,” argues a report from UNESCO’s International Bioethics Committee (IBC). “If the assumption is that ‘I am defined by my brain,’ then neural data may be considered as the origin of the self and require special definition and protection.” 

“The brain is such a key part of who we are – what makes us us,” says Laura Cabrera, PhD, the chair of neuroethics at Penn State University. “Who owns the data? Is it the medical system? Is it you, as a patient or user? I think that hasn’t really been resolved.”

Many of the measures put in place to regulate what Google or Facebook gathers and shares could also be applied to brain data. Some insist that the industry default should be to keep neural data private, rather than requiring people to opt out of sharing. But Dubljevic takes a more nuanced view, since the sharing of raw data among researchers is essential for technological advancement and accountability.

    What’s clear is that forestalling research isn’t the solution – transparency is. As part of the consent process, patients should be told where their data is being stored, for how long, and for what purpose, says Cabrera. In 2008, the U.S. passed a law prohibiting discrimination in health care coverage and employment based on genetic information. This could serve as a helpful precedent, she says. 

    The Legal Question 

    Around the globe, legislators are studying the question of neural data. A few years ago, a visit from a Columbia University neurobiologist sparked Chile’s Senate to draft a bill to regulate how neurotechnology could be used and how data would be safeguarded. 

    “Scientific and technological development will be at the service of people,” the amendment promised, “and will be carried out with respect for life and physical and mental integrity.”

    Chile’s new Constitution was voted down in September, effectively killing the neuro-rights bill. But other countries are considering similar legislation. In 2021, France amended its bioethics law to prohibit discrimination due to brain data, while also building in the right to ban devices that modify brain activity.

    Fins isn’t convinced this type of legislation is wholly good. He points to people like Greg – the 38-year-old who regained his ability to communicate through a brain implant. If it’s illegal to alter or investigate the brain’s state, “then you couldn’t find out if there was covert consciousness”– mental awareness that isn’t outwardly apparent – “thereby destining people to profound isolation,” he says. 

    Access to neurotechnology needs protecting too, especially for those who need it to communicate. 

    “It’s one thing to do something over somebody’s objection. That’s a violation of consent – a violation of personhood,” says Fins. “It’s quite another thing to intervene to promote agency.”

    In cases of minimal consciousness, a medical surrogate, such as a family member, can often be called upon to provide consent. Overly restrictive laws could prevent the implantation of neural devices in these people.

     “It’s a very complicated area,” says Fins. 

    The Future of Brain Implants

    Currently, brain implants are strictly therapeutic. But, in some corners, “enhancement is an aspiration,” says Dubljevic. Animal studies suggest the potential is there. In a 2013 study, researchers monitored the brains of rats as they navigated a maze; electrical stimulation then transferred that neural data to rats at another lab. This second group of rodents navigated the maze as if they’d seen it before, suggesting that the transfer of memories may eventually become a reality. Possibilities like this raise the specter of social inequity, since only the wealthiest may afford cognitive enhancement. 

    They could also lead to ethically questionable military programs. 

    “We have heard staff at DARPA and the U.S. Intelligence Advanced Research Projects Activity discuss plans to provide soldiers and analysts with enhanced mental abilities (‘super-intelligent agents’),” a group of researchers wrote in a 2017 paper in Nature. Brain implants could even become a requirement for soldiers, who may be obligated to take part in trials; some researchers advise stringent international regulations for military use of the technology, like the Geneva Protocol for chemical and biological weapons. 

    The temptation to explore every application of neurotechnology will likely prove irresistible for entrepreneurs and scientists alike. That makes precautions essential. 

“While it’s not surprising to see many potential ethical issues and questions arising from use of a novel technology,” a team of researchers, including Dubljevic, wrote in a 2020 paper in Philosophies, “what is surprising is the lack of suggestions to resolve them.”

    It’s critical that the industry proceed with the right mindset, he says, emphasizing collaboration and making ethics a priority at every stage.

“How do we avoid problems that may arise and find solutions prior to those problems even arising?” Dubljevic asks. “Some proactive thinking goes a long way.”
