ReportWire

Tag: Neural Networks

  • Liquid AI Is Redesigning the Neural Network


    Artificial intelligence might now be solving advanced math, performing complex reasoning, and even using personal computers, but today’s algorithms could still learn a thing or two from microscopic worms.

    Liquid AI, a startup spun out of MIT, will today reveal several new AI models based on a novel type of “liquid” neural network that has the potential to be more efficient, less power-hungry, and more transparent than the ones that underpin everything from chatbots to image generators to facial recognition systems.

    Liquid AI’s new models include one for detecting fraud in financial transactions, another for controlling self-driving cars, and a third for analyzing genetic data. The company touted the new models, which it is licensing to outside companies, at an event held at MIT today. The company has received funding from investors that include Samsung and Shopify, both of which are also testing its technology.

    “We are scaling up,” says Ramin Hasani, cofounder and CEO of Liquid AI, who co-invented liquid networks as a graduate student at MIT. Hasani’s research drew inspiration from C. elegans, a millimeter-long worm typically found in soil or rotting vegetation. The worm is one of the few creatures to have had its nervous system mapped in its entirety, and it is capable of remarkably complex behavior despite having just a few hundred neurons. “It was once just a science project, but this technology is fully commercialized and fully ready to bring value for enterprises,” Hasani says.

    Inside a regular neural network, the properties of each simulated neuron are defined by a static value, or “weight,” that affects its firing. Within a liquid neural network, the behavior of each neuron is instead governed by an equation that predicts its behavior over time, and the network solves a cascade of linked equations as it runs. This design makes the network more efficient and more flexible, allowing it to keep learning even after training, unlike a conventional neural network. Liquid neural networks are also open to inspection in a way that existing models are not, because their behavior can essentially be rewound to see how they produced an output.
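
    To make that concrete, here is a minimal sketch of the idea behind a liquid neuron, loosely following the liquid time-constant formulation Hasani and his collaborators published: each neuron’s state evolves under a small differential equation whose dynamics depend on the current input, integrated here with a simple Euler step. The equation form, names, and parameters are illustrative assumptions, not Liquid AI’s actual code.

    ```python
    import numpy as np

    def ltc_step(x, I, W_in, W_rec, tau, A, dt=0.01):
        """One Euler-integration step for a layer of liquid-style neurons.

        Instead of a single static weight, each neuron's state x follows an
        ODE whose behavior depends on the input at every instant:
            dx/dt = -x / tau + f(x, I) * (A - x)
        where f is a bounded nonlinearity mixing input and recurrent signals.
        """
        f = np.tanh(W_in @ I + W_rec @ x)   # input-dependent gating signal
        dxdt = -x / tau + f * (A - x)       # liquid time-constant dynamics
        return x + dt * dxdt                # explicit Euler update

    # Tiny demo: four neurons driven by a two-dimensional input stream.
    rng = np.random.default_rng(0)
    n, m = 4, 2
    x = np.zeros(n)
    W_in, W_rec = rng.normal(size=(n, m)), rng.normal(size=(n, n))
    tau, A = np.ones(n), np.ones(n)         # per-neuron time constants and targets
    for t in range(100):
        I = np.array([np.sin(0.1 * t), np.cos(0.1 * t)])
        x = ltc_step(x, I, W_in, W_rec, tau, A)
    print(x)
    ```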

    In 2020, the researchers showed that such a network with only 19 neurons and 253 synapses, remarkably small by modern standards, could control a simulated self-driving car. Whereas a regular neural network can analyze visual data only at fixed intervals, a liquid network efficiently captures the way visual information changes over time. In 2022, Liquid AI’s founders figured out a mathematical shortcut that sidestepped much of the costly equation-solving, making liquid neural networks feasible for practical use.
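
    That shortcut can be pictured as replacing the step-by-step integration above with a single closed-form expression: the new state is a time-gated blend of two candidate states, so no numerical solver has to run at inference time. The sketch below is a simplified, assumed rendering of that published closed-form continuous-time idea, not the company’s implementation.

    ```python
    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def cfc_step(x, I, Wf, Wg, Wh, t=1.0):
        """Closed-form continuous-time update (illustrative).

        Rather than integrating an ODE in many small steps, the next state
        is computed in one shot as a time-gated mix of two candidates.
        """
        z = np.concatenate([x, I])
        gate = sigmoid(-(Wf @ z) * t)        # gate decays with elapsed time t
        g = np.tanh(Wg @ z)                  # candidate state 1
        h = np.tanh(Wh @ z)                  # candidate state 2
        return gate * g + (1.0 - gate) * h   # closed-form blend; no ODE solver

    # Usage with illustrative random parameters.
    rng = np.random.default_rng(1)
    n, m = 4, 2
    Wf, Wg, Wh = (rng.normal(size=(n, n + m)) for _ in range(3))
    x = cfc_step(np.zeros(n), np.array([0.5, -0.5]), Wf, Wg, Wh, t=0.1)
    print(x)
    ```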

    Will Knight


  • Inside the Creation of the World’s Most Powerful Open Source AI Model


    This past Monday, about a dozen engineers and executives at data science and AI company Databricks gathered in conference rooms connected via Zoom to learn if they had succeeded in building a top artificial intelligence language model. The team had spent months, and about $10 million, training DBRX, a large language model similar in design to the one behind OpenAI’s ChatGPT. But they wouldn’t know how powerful their creation was until results came back from the final tests of its abilities.

    “We’ve surpassed everything,” Jonathan Frankle, chief neural network architect at Databricks and leader of the team that built DBRX, eventually told the team, which responded with whoops, cheers, and applause emojis. Frankle usually steers clear of caffeine but was taking sips of iced latte after pulling an all-nighter to write up the results.

    Databricks will release DBRX under an open source license, allowing others to build on top of its work. Frankle shared data showing that across roughly a dozen benchmarks measuring the AI model’s ability to answer general knowledge questions, perform reading comprehension, solve vexing logical puzzles, and generate high-quality code, DBRX was better than every other open source model available.
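
    For developers who want to build on such a release, the usual route is the Hugging Face transformers library. A minimal sketch follows; the checkpoint id is an assumption (substitute whatever id Databricks actually publishes), and a model of this size may need extra setup, such as multiple GPUs or the trust_remote_code flag, depending on how it is packaged.

    ```python
    # Minimal sketch of querying an open-weights model with Hugging Face
    # transformers. The checkpoint id below is an assumption, not confirmed
    # by the article; substitute the id Databricks publishes.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "databricks/dbrx-instruct"  # assumed checkpoint id
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

    prompt = "Databricks' DBRX is"
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=50)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))
    ```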

    AI decision makers: Jonathan Frankle, Naveen Rao, Ali Ghodsi, and Hanlin Tang. Photograph: Gabriela Hasbun

    It outshone Meta’s Llama 2 and Mistral’s Mixtral, two of the most popular open source AI models available today. “Yes!” shouted Ali Ghodsi, CEO of Databricks, when the scores appeared. “Wait, did we beat Elon’s thing?” Frankle replied that they had indeed surpassed the Grok AI model recently open-sourced by Elon Musk’s xAI, adding, “I will consider it a success if we get a mean tweet from him.”

    To the team’s surprise, on several scores DBRX was also shockingly close to GPT-4, OpenAI’s closed model that powers ChatGPT and is widely considered the pinnacle of machine intelligence. “We’ve set a new state of the art for open source LLMs,” Frankle said with a super-sized grin.

    Building Blocks

    By open-sourcing DBRX, Databricks is adding further momentum to a movement that is challenging the secretive approach of the most prominent companies in the current generative AI boom. OpenAI and Google keep the code for their GPT-4 and Gemini large language models closely held, but some rivals, notably Meta, have released their models for others to use, arguing that it will spur innovation by putting the technology in the hands of more researchers, entrepreneurs, startups, and established businesses.

    Databricks says it also wants to open up about the work involved in creating its open source model, something Meta has not done for some key details about the creation of its Llama 2 model. The company will release a blog post detailing the work involved in creating the model, and it also invited WIRED to spend time with Databricks engineers as they made key decisions during the final stages of the multimillion-dollar process of training DBRX. That provided a glimpse of how complex and challenging it is to build a leading AI model, but also of how recent innovations in the field promise to bring down costs. That, combined with the availability of open source models like DBRX, suggests that AI development isn’t about to slow down any time soon.

    Ali Farhadi, CEO of the Allen Institute for AI, says greater transparency around the building and training of AI models is badly needed. The field has become increasingly secretive in recent years as companies have sought an edge over competitors. Transparency is especially important when there are concerns about the risks that advanced AI models could pose, he says. “I’m very happy to see any effort in openness,” Farhadi says. “I do believe a significant portion of the market will move towards open models. We need more of this.”

    Will Knight


  • A Machine Learning Company in California Using Quantum Computers at MathLabs Ventures Is Building the First Q40 ME Fusion Energy Generator Using Advanced AI & Neural Networks


    Harvard mathematicians using artificial intelligence, machine learning, blockchain, and neural networks on a quantum computer have developed breakthrough algorithms and simulations that will enable the world’s most efficient Fusion Energy Power Plants to open 20 years earlier than planned, with a Q40 Mechanical Gain from Kronos Fusion Energy Algorithms.

    Press Release


    Jan 10, 2022

    Kronos Fusion Energy Algorithms LLC (KFEA-Q40) and MathLabs Ventures announced today that after 60 years of global research, the Fusion Energy industry is now poised to accelerate its growth rapidly and build commercially viable power plants 20 years earlier than planned, thanks to three recent major advances in technology. The three major barriers to commercial success in Fusion Energy have recently been overcome by these new technological advancements, which together will make it possible to build efficient Fusion Energy Power Plants on Earth by the mid-2030s. These innovations, ongoing contracts, and patents put KFEA’s current valuation at $530M, with $1.2B in projected earnings over the next two years.

    “We at Kronos are building a world-class team of mathematicians, physicists, scientists, and other professionals whose mission is to reverse global warming by helping to make Fusion Energy commercially viable in the near future,” said Michael Pierce Hoban, CEO of Kronos Fusion Energy Algorithms.

    Recreating the power of the sun on Earth in a controlled manner takes computing power, machine learning, artificial intelligence, blockchain, quantum computers, neural networks, and other technological advances that were not even dreamed of 60 years ago, when Fusion Energy research began globally. Now, with these three technological breakthroughs, the global competition to design next-generation Fusion Energy Power Plants that are more efficient than today’s carbon-burning power plants is fully underway.

    The first technological barrier to be overcome was raw computing power: the sun can now be modeled in simulations far more accurately, thanks to the Summit supercomputer at Oak Ridge, which set the world record in 2018, and Japan’s Fugaku supercomputer, which held the world record at 442 petaflops as of June 2021.

    The second technological barrier was overcome in September 2021 with the announcement of the most powerful magnet ever created on Earth (https://news.mit.edu/2021/MIT-CFS-major-advance-toward-fusion-energy-0908). This is the first magnet powerful enough to contain a fast-moving plasma field at temperatures in excess of 150 million degrees Celsius without the plasma touching and melting the containment barrier.

    The third technological barrier, and the most difficult to overcome, is the 1% efficiency rate (Q1 Mechanical Gain) of the top fusion energy demonstration reactors on Earth today. The first two breakthroughs should enable the world’s top fusion energy designers to reach a 25% efficiency rate (Q25 Mechanical Gain) by 2050. This has remained a major technological barrier because, until now, no fusion reactor design proposed anywhere in the world has exceeded 25% efficiency.
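
    For reference, mainstream fusion research defines the gain factor Q as the ratio of fusion power produced to external heating power supplied, with Q = 1 marking scientific breakeven; how the release’s percentage figures map onto that convention is its own usage. The standard definition, as a short LaTeX statement:

    ```latex
    Q = \frac{P_{\text{fusion}}}{P_{\text{heating}}},
    \qquad Q = 1 \;\text{(scientific breakeven)},
    \qquad Q \gg 1 \;\text{needed for net electricity}.
    ```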

    Kronos Fusion Energy Algorithms LLC announced that, after five years of studying global fusion energy research, it has developed advanced algorithms and simulations to achieve a 40% efficiency rate (Q40 Mechanical Gain) for Commercial Fusion Energy Power Plants, enabling a 20-year advancement in the launch dates of the world’s first Fusion Energy Power Plants that are more efficient than today’s carbon-burning power plants. Its algorithms and simulations use artificial intelligence, machine learning, neural networks, blockchain, quantum computing, and other advances to reduce the error rate at a fusion reactor from the 15% rate experienced today at the International Thermonuclear Experimental Reactor (ITER) in France to 1%, after the simulations have optimized the numerous variables to identify the disruptions that cause 31% of the maintenance shutdowns at ITER.

    Kronos Fusion Energy Algorithms: Developing ALGORITHMS & SIMULATIONS to build Micro Fusion Energy Generators with Q40 Mechanical Gain for a CLEAN + LIMITLESS Energy Future

    MEDIA CONTACT:

    PRIYANCA FORD  

    Founder & Chief Strategy Officer at Kronos Fusion Energy Algorithms

    Priyanca_Ford@post.harvard.edu

    Source: MathLabs Ventures
