

The big artificial intelligence (AI) news at Google I/O today is the launch of the company’s PaLM 2 large language model, but that’s not the only AI news at the event.

The company is also rolling out a series of open-source machine learning (ML) technology updates and enhancements for the growing TensorFlow ecosystem. TensorFlow is an open-source technology effort, led by Google, that provides ML tools to help developers build and train models.

Google is launching its new DTensor technology at Google I/O. DTensor brings new parallelism techniques to ML training, helping to improve model training and scaling efficiency.
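For a sense of what DTensor looks like in practice, here is a minimal sketch using TensorFlow's tf.experimental.dtensor module, loosely following the public DTensor tutorials. The mesh dimension name and the use of virtual CPU devices are illustrative stand-ins for a real GPU or TPU mesh.

```python
import tensorflow as tf
from tensorflow.experimental import dtensor

# Split the single physical CPU into 8 logical devices so the example runs
# anywhere; a real setup would use a mesh of GPUs or TPUs instead.
phys = tf.config.list_physical_devices("CPU")
tf.config.set_logical_device_configuration(
    phys[0], [tf.config.LogicalDeviceConfiguration()] * 8
)

# A 1-D mesh with a "batch" dimension spanning the 8 devices.
mesh = dtensor.create_mesh(
    [("batch", 8)], devices=[f"CPU:{i}" for i in range(8)]
)

# Shard the first (batch) axis across the mesh and replicate the second.
layout = dtensor.Layout(["batch", dtensor.UNSHARDED], mesh)

# Create an 8x16 tensor with that distributed layout; each device holds a
# 1x16 slice of the full tensor.
x = dtensor.call_with_layout(tf.ones, layout, shape=(8, 16))
print(dtensor.fetch_layout(x))
```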


There is also a preview release of the TF Quantization API, which is intended to help make models more resource-efficient overall and thus reduce the cost of development.
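The new quantization API itself is only in preview, so as a rough illustration of the general idea, the sketch below uses TensorFlow's existing post-training quantization path through the TFLite converter, not the new API. The toy Keras model is a stand-in for a real trained model.

```python
import tensorflow as tf

# A small Keras model standing in for a real trained model.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(64, activation="relu", input_shape=(32,)),
    tf.keras.layers.Dense(10),
])

# Post-training quantization: convert weights to lower-precision types to
# shrink the model and cut inference cost.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
tflite_model = converter.convert()

with open("model_quant.tflite", "wb") as f:
    f.write(tflite_model)
```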


A key part of the TensorFlow ecosystem is the Keras API suite, which provides a set of Python language-based deep learning capabilities on top of the core TensorFlow technology. Google is announcing a pair of new Keras tools: KerasCV for computer vision (CV) applications, and KerasNLP for natural language processing (NLP).
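As a rough sketch of how these libraries are used, assuming the keras_cv and keras_nlp packages are installed: the BERT preset name below comes from KerasNLP's published examples and may vary between releases, and the prompts are purely illustrative.

```python
import keras_cv
import keras_nlp

# KerasNLP: load a pretrained text classifier from a named preset and run
# inference on raw strings (preprocessing is handled by the preset).
classifier = keras_nlp.models.BertClassifier.from_preset(
    "bert_tiny_en_uncased_sst2"
)
print(classifier.predict(["The keynote demos were genuinely impressive."]))

# KerasCV: use a pretrained generative vision model off the shelf.
sd = keras_cv.models.StableDiffusion(img_width=512, img_height=512)
images = sd.text_to_image(
    "a photograph of a robot reading a book", batch_size=1
)
```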

“A big part of what we’re looking at in terms of the tooling and the open-source space is really driving new capabilities and new efficiency and new performance,” Alex Spinelli, Google’s vice president of product management for machine learning, told VentureBeat. “Absolutely Google will build awesome, amazing AI and ML into its products, but we also want to kind of create a rising tide that lifts all ships, so we’re really committed to our open source strategies, and enabling developers at large.”

TensorFlow remains the ‘workhorse’ of machine learning at Google

In an era where large language models (LLMs) are all the rage, Spinelli emphasized that it’s more critical than ever to have the right ML training tools.

“TensorFlow is still today the workhorse of machine learning,” he said. “It is still … the fundamental underlying infrastructure [in Google] that powers a lot of our own machine learning developments.”

To that end, the DTensor updates will provide more “horsepower” as the requirements of ML training continue to grow. DTensor introduces more parallelization capabilities to help optimize training workflows.

Spinelli said that ML overall is becoming increasingly hungry for data and compute resources. As such, finding ways to improve performance in order to process more data for increasingly larger models is extremely important. The new Keras updates will provide even more power, with modular components that let developers build their own computer vision and natural language processing capabilities.

Still more power will come to TensorFlow thanks to the new JAX2TF technology. JAX is an AI research framework widely used at Google as a computational library to build technologies such as the Bard AI chatbot. With JAX2TF, models written in JAX will be easier to use within the TensorFlow ecosystem.
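Here is a minimal sketch of the JAX-to-TensorFlow bridge, assuming the jax and tensorflow packages are installed; the tiny prediction function below is a stand-in for a real JAX model.

```python
import numpy as np
import jax.numpy as jnp
import tensorflow as tf
from jax.experimental import jax2tf

# A toy JAX function standing in for a real model's forward pass.
def jax_predict(params, x):
    return jnp.tanh(x @ params["w"] + params["b"])

params = {
    "w": np.ones((4, 2), dtype=np.float32),
    "b": np.zeros(2, dtype=np.float32),
}

# jax2tf.convert produces a function TensorFlow can trace; wrapping it in
# tf.function lets it be saved, served or compiled like any other TF graph.
tf_predict = tf.function(jax2tf.convert(jax_predict), autograph=False)

x = tf.random.normal((3, 4))
print(tf_predict(params, x))
```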

“One of the things that we’re really excited about is how these things are going to make their way into products — and watch that developer community flourish,” he said.

PyTorch vs TensorFlow

While TensorFlow is the workhorse of Google’s ML efforts, it’s not the only open-source ML training library.

In recent years the open-source PyTorch framework, originally created by Facebook (now Meta), has become increasingly popular. In 2022, Meta contributed PyTorch to the Linux Foundation, creating the new PyTorch Foundation, a multi-stakeholder effort with an open governance model.

Spinelli said that what Google is trying to do is support developer choice when it comes to ML tooling. He also noted that TensorFlow isn’t just an ML framework; it’s a whole ecosystem of tools that can help support training and development for a broad range of use cases and deployment scenarios.

“This is the same set of technologies, essentially, that Google uses to build machine learning,” Spinelli said. “I think we have a really competitive offering if you really want to build large-scale high-performance systems and you want to know that these are going to work on all the infrastructures of the future.”

One thing Google apparently will not be doing is following Meta’s lead and creating an independent TensorFlow Foundation organization.

“We feel pretty comfortable with the way it’s developed today and the way it’s managed,” Spinelli said. “We feel pretty comfortable about some of these great updates that we’re releasing now.”


Sean Michael Kerner

