Assuming it can turn its Project Orion augmented reality glasses into a real product people can buy, Meta apparently wants to get into robots next. That’s according to Sources’ Alex Heath, who spoke to Meta CTO Andrew Bosworth and reports that much like Apple, Google and Tesla, Meta is researching robotics.
Unlike those other companies, though, Meta apparently isn’t all that focused on competing in hardware. It has a “Metabot” in the works, but its real goal is to create software that other companies can license, much like Google does with Android. “Software is the bottleneck,” according to Bosworth, and the hope is that the combined powers of Meta’s robotics team — led by Marc Whitten, the former CEO of Cruise — and its highly publicized Superintelligence Labs can produce a solution.
That work apparently starts with the development of a “world model” that can help a robot “do the software simulation required to animate a dexterous hand,” but will presumably extend to more complicated movements and tasks down the road. In February 2025, Meta was reportedly looking at building a robot that could handle household chores like cleaning or folding laundry. Given how early everything sounds, that’s likely a long way off.
Meta isn’t alone in pursuing robotics. Apple is reportedly working on its own home robots, starting with a table-mounted arm with a display. Tesla has regularly demoed versions of its Optimus robot to the public, though often in highly controlled scenarios. Meta has yet to realize its goal of usurping the smartphone with AR glasses. Whether or not it does, it sounds like robots will be the thing it burns money on next.
Meta chief technology officer Andrew Bosworth took to his Instagram to explain, in more technical detail, why multiple demos of Meta’s new smart-glasses technology failed at Meta Connect, the company’s developer conference, this week.
At several points during the event, live demos of the technology failed to work.
In one, cooking content creator Jack Mancuso asked his Ray-Ban Meta glasses how to get started with a particular sauce recipe. After he repeated the question, “What do I do first?” with no response, the AI skipped ahead in the recipe, forcing him to stop the demo. He then tossed it back to Meta CEO Mark Zuckerberg, saying he thought the Wi-Fi might be messed up.
Jack Mancuso at Meta Connect. Image Credits: Meta
In another demo, the glasses failed to pick up a live WhatsApp video call between Bosworth and Zuckerberg; Zuckerberg eventually had to give up. Bosworth walked onstage, joking about the “brutal” Wi-Fi.
“You practice these things like a hundred times, and then you never know what’s gonna happen,” Zuckerberg said at the time.
After the event, Bosworth took to his Instagram for a Q&A session about the new tech and the live demo failures.
Addressing the demo failures, he explained that it wasn’t actually the Wi-Fi that caused the issue with the chef’s glasses. Instead, it was a mistake in resource management planning.
Image Credits: Instagram (screenshot)
“When the chef said, ‘Hey, Meta, start Live AI,’ it started every single Ray-Ban Meta’s Live AI in the building. And there were a lot of people in that building,” Bosworth explained. “That obviously didn’t happen in rehearsal; we didn’t have as many things,” he said, referring to the number of glasses that were triggered.
That alone wasn’t enough to cause the disruption, though. The second part of the failure had to do with how Meta had routed Live AI traffic to its development server to isolate it during the demo. That routing applied to every device on the building’s access points, which included all the glasses in the audience.
“So we DDoS’d ourselves, basically, with that demo,” Bosworth added. (A DDoS attack, or a distributed denial of service attack, is one where a flood of traffic overwhelms a server or service, slowing it down or making it unavailable. In this case, Meta’s dev server wasn’t set up to handle the flood of traffic from the other glasses in the building — it had been sized to handle only the onstage demo.)
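To picture the failure, here is a minimal, hypothetical sketch (not Meta’s code; the function name, numbers, and capacity model are invented for illustration) of why a server provisioned for a handful of demo units falls over when every pair of glasses in the building answers the same wake phrase:

```python
# Hypothetical capacity model: one wake phrase triggers every pair of
# glasses in the room, and all of their Live AI traffic is routed to a
# single dev server sized only for the onstage demo.

def handled_and_dropped(num_glasses: int, server_capacity: int) -> tuple[int, int]:
    """Return (sessions served, sessions dropped) when `num_glasses`
    all open Live AI sessions against a server that can serve at most
    `server_capacity` concurrent sessions."""
    handled = min(num_glasses, server_capacity)
    dropped = num_glasses - handled
    return handled, dropped

# Rehearsal: only a few glasses in the room, so the server keeps up.
print(handled_and_dropped(5, 50))     # (5, 0) -- every session served

# Showtime: hundreds of glasses hear "Hey, Meta, start Live AI" at once.
print(handled_and_dropped(1000, 50))  # (50, 950) -- most sessions dropped
```

The numbers are made up, but the shape of the failure is the one Bosworth described: the system worked in rehearsal only because the load was small, not because the routing was safe.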
The issue with the failed WhatsApp call, on the other hand, was the result of a new bug.
The smart glasses’ display had gone to sleep at the exact moment the call came in, Bosworth said. When Zuckerberg woke the display back up, it didn’t show him the answer notification. The CTO said this was a “race condition” bug, one in which the outcome depends on the unpredictable, uncoordinated timing of two or more processes trying to use the same resource at once.
“We’ve never run into that bug before,” Bosworth noted. “That’s the first time we’d ever seen it. It’s fixed now, and that’s a terrible, terrible place for that bug to show up.” He stressed that, of course, Meta knows how to handle video calls, and the company was “bummed” about the bug showing up here.
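As a deterministic illustration of the kind of ordering bug Bosworth described (this is an invented sketch, not Meta’s firmware; the class and its behavior are assumptions), consider a naive display state machine that checks whether the screen is awake before delivering a notification, but never replays notifications that arrived while it was asleep:

```python
# Hypothetical sketch of a "lost notification" ordering bug: if the
# display goes to sleep in the same instant a call comes in, a naive
# check-then-deliver design silently drops the answer prompt.

class NaiveDisplay:
    def __init__(self):
        self.awake = True
        self.on_screen = None  # what the wearer currently sees

    def sleep(self):
        self.awake = False
        self.on_screen = None

    def wake(self):
        self.awake = True  # bug: missed notifications are never re-delivered

    def notify(self, message):
        if self.awake:             # check...
            self.on_screen = message  # ...then deliver
        # If the display slept between the check and the delivery,
        # the notification is lost with no trace.

display = NaiveDisplay()
display.sleep()                            # display dozes off at the wrong moment
display.notify("Incoming WhatsApp call")   # call arrives while asleep: dropped
display.wake()                             # the wearer wakes the display...
print(display.on_screen)                   # ...but sees no answer prompt: None
```

The fix for this class of bug is typically to queue notifications that arrive while the display is asleep and replay them on wake, so the outcome no longer depends on which event happened first.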
Despite the issues, Bosworth said he’s not worried about the fallout from the glitches.
“Obviously, I don’t love it, but I know the product works. I know it has the goods. So it really was just a demo fail and not, like, a product failure,” he said.
When Mark Zuckerberg announced Meta’s latest smart glasses at the company’s Connect 2025 keynote, he encountered two glitches that prevented him from properly demonstrating some of the devices’ features. Now, Meta’s Chief Technology Officer, Andrew Bosworth, said in an AMA on Instagram that they were demo failures and not actual product failures. The first glitch took place in the middle of a live demo with a cooking content creator, who asked Live AI for instructions on how to make a Korean-inspired steak sauce on his Meta glasses. Instead of giving him detailed instructions, his glasses’ AI skipped ahead by several steps and continued glitching. The chef told Zuckerberg that the “WiFi might be messed up” in the venue.
Bosworth said, however, that Wi-Fi was not the culprit. Apparently, when the chef said “Hey Meta, start Live AI,” it fired up every single Meta Ray-Ban’s Live AI in the building. And since the event was all about the company’s smart glasses, there were a lot of them in the venue at the time. The company had also routed Live AI’s traffic to its dev server to isolate it, but that ended up routing the Live AI traffic of everyone’s glasses in the building to the same server. “We DDoS’d ourselves, basically,” he said. He added that the problem didn’t surface in rehearsal, because far fewer people were wearing the glasses during testing.
Zuckerberg also ran into an issue when he tried demonstrating taking WhatsApp video calls on the Meta Ray-Ban Display. The audience could see him getting calls on the glasses’ HUD, but he couldn’t answer them to start the call. Bosworth said it was caused by a never-before-seen bug that had put the display to sleep at the very instant the incoming-call notifications arrived. Even after Zuckerberg woke the display, there was no option to answer the call. The CTO said Meta had never come across that bug before the demo and that it has since been fixed. “You guys know we can do video calling… we got WhatsApp, we know how to do video calling,” he said, but admitted it was a missed opportunity to show onstage that the feature actually works.