

Microsoft unveiled a flurry of generative AI announcements at its annual Ignite 2023 conference in Seattle yesterday, but among the most important is its recommitment to open source generative AI, even in the face of the success of its multi-billion-dollar partnership with closed-source gen AI leader OpenAI.

The Redmond, Washington-headquartered company has emerged as a leader in gen AI thanks to its early and prescient backing of the Sam Altman-led startup and its integration of OpenAI’s GPT and DALL-E models into Microsoft products (Bing Chat, now rebranded as Copilot, and Bing Image Creator, respectively).

Bringing Llama and Mistral to Azure

But at Ignite over the last two days, Microsoft also made important moves to bolster its support for rival open source AI models from the likes of Facebook parent Meta Platforms, Inc. (another former Microsoft investment), which has become the de facto steward of open source generative AI with its Llama models.

Llama 2 is now available as a service for enterprises to use, fine-tune, and deploy directly on Microsoft’s cloud computing platform Azure, as is the rival open source model Mistral 7B from the well-funded French startup of the same name. The move was cheered by Meta AI pioneer Yann LeCun on X (formerly Twitter).
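For readers curious what calling one of these hosted open models might look like, here is a minimal Python sketch. The endpoint URL, API key variable, and the OpenAI-style chat payload are all assumptions for illustration; the actual request and response schema is defined by the Azure AI model catalog deployment you create.

```python
# Minimal sketch of calling an open model (e.g. Llama 2 or Mistral 7B) deployed
# as a hosted inference endpoint on Azure. The endpoint URL, API key variable,
# and request/response shape below are placeholders and assumptions; consult
# the documentation for your actual deployment for the real contract.
import os
import requests

ENDPOINT = "https://<your-deployment>.inference.ai.azure.com/v1/chat/completions"  # placeholder
API_KEY = os.environ["AZURE_MODEL_KEY"]  # placeholder environment variable

payload = {
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Summarize the benefits of open source LLMs for enterprises."},
    ],
    "max_tokens": 256,
    "temperature": 0.7,
}

response = requests.post(
    ENDPOINT,
    headers={"Authorization": f"Bearer {API_KEY}", "Content-Type": "application/json"},
    json=payload,
    timeout=60,
)
response.raise_for_status()
# Print the model's reply, assuming an OpenAI-style response body.
print(response.json()["choices"][0]["message"]["content"])
```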

Yet, because open source models are by definition free to use and deploy for enterprise applications (licensing permitting), Microsoft’s move to offer such models on its Azure platform, where the paid Azure OpenAI Service has already been available for the better part of the year, means that it may actually be costing its own investment and ally OpenAI some revenue. If you’re a business looking to deploy AI and you see a model nearly as capable as OpenAI’s GPT-3.5 Turbo or GPT-4 Turbo, but one that costs nothing or significantly less to deploy and call via API, why wouldn’t you use the cheaper option?

Will Microsoft’s new Phi-2 AI model go fully open source for commercial applications?

Microsoft also announced the release of its own new AI model, Phi-2, an upgrade from Phi-1.5. At 2.7 billion parameters, it sits on the smaller side of generative text-to-text models, allowing it to run efficiently even on machines without much graphics processing unit (GPU) power (the shortage of GPUs has been an ongoing challenge of the generative AI era, and has helped make GPU leader Nvidia a trillion-dollar company).
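To give a rough sense of why that size matters in practice: a model of roughly 2.7 billion parameters can be loaded and queried on an ordinary workstation with the Hugging Face transformers library, needing only around 11 GB of memory at full 32-bit precision. The sketch below assumes a hypothetical model identifier, since Phi-2 was announced through the Azure AI model catalog; substitute whatever identifier or local path you actually have access to.

```python
# Rough sketch of running a ~2.7B-parameter model locally for inference on a
# single modest machine (CPU or small GPU) using Hugging Face transformers.
# The model id is a placeholder assumption, not a confirmed distribution path.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "microsoft/phi-2"  # assumed/placeholder identifier

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)  # ~11 GB of RAM at fp32

inputs = tokenizer(
    "Write a one-sentence summary of what a small language model is.",
    return_tensors="pt",
)
outputs = model.generate(**inputs, max_new_tokens=60)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```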

Phi-2 itself is not yet open source for commercial use; rather, it is available only for research purposes. However, Microsoft senior principal research manager Sebastien Bubeck hinted on X that the license may change if the model sees significant enough usage and demand.

Microsoft’s embrace of open source AI at Ignite 2023 this week is notable in light of the fact that CEO Satya Nadella took the time to appear at OpenAI’s first-ever developer conference, DevDay, last week alongside Altman, and the two seemed very chummy and complimentary on stage. Altman even said he was excited to be working toward AGI (artificial general intelligence), that is, AI as capable as human beings, together with Microsoft.

Yet reports have emerged in outlets such as The Information suggesting that Microsoft is internally looking to wean itself off its dependence on OpenAI for AI services and products, given the high costs of using the company’s closed, resource-intensive models. There is also, of course, the fact that OpenAI remains a separate company with its own agenda and goals, which may not always align with Microsoft’s.

Furthermore, it is simply smart business for the second-largest cloud provider in the world, via Microsoft Azure, to offer enterprise customers a range of AI models, from open source options to more performant closed-source ones. It’s what cloud competitor Amazon Web Services (AWS) is also doing, and presumably what Google Cloud will do as well at some point. Offering a range of open and closed-source AI models and related tools to customers makes sense and will likely be table stakes as the AI cloud wars continue to rage.

Yet it is impossible not to see Microsoft’s embrace of open source AI this week at Ignite as something of a turn away from its full-throated embrace of OpenAI. While I wouldn’t go so far as to call it evidence of cracks in the relationship, it certainly appears that Microsoft is hedging its bets, or "playing both sides" a bit, to use the parlance of a popular It’s Always Sunny in Philadelphia meme.


Carl Franzen

