ReportWire

  • We Need to Talk About Smart Glasses

    With any new device category comes a whole host of novel and sometimes exhaustingly complex questions. Smartphones, for example, no matter how mundane they seem right now, are still nagging us with existential quandaries. When should we use them? How should we use them? What in God’s name happens to us when we use them, which, last I checked, is literally all the time?

    These are important questions, and most of us, even if we’re not spending all day ruminating on them, tackle the complexity in our own way, setting (or resetting) social norms for ourselves and other people as we trudge along. The only thing is, in my experience, we tend to ask these questions mostly in retrospect, which is to say after the cat (or phone, or smartwatch, or earth-shattering portal into the online world) is out of the proverbial bag. It’s easy to look back and say, “That was the time we should have thought about this,” and when I put Meta’s new smart glasses with a screen on, I knew that the time, for smart glasses in particular, was now—like, right f**king now.

    In case you missed it, Meta finally unveiled the Meta Ray-Ban Display, which are its first smart glasses with an in-lens display. I flew out to Meta headquarters for its annual Connect conference to try them, and the second I put them on, it was clear: these are going to be big. It probably seems silly from the outside to make a declaration like that. We have screens everywhere all the time—in our hands, on our wrists, and sometimes, regrettably, in our toasters. Why would smart glasses be any different? On one hand, I get that skepticism, but sometimes function isn’t the issue; it’s form. And when it comes to smart glasses, there is no other form like it.

    Meta’s Ray-Ban Display aren’t just another wearable. The screen inside them opens up an entirely new universe of capabilities. With these smart glasses and Meta’s wild new “Neural Band,” a wristband that reads the electrical signals in your arm and translates them to inputs, you’re able to do a lot of the stuff you normally do on your phone. You can receive and write messages, watch Reels on Instagram, take voice calls and video calls, record video and take pictures, and get turn-by-turn navigation. You can even transcribe conversations that are happening in real time. You’re doing this on your face in a way that you’ve never done it before—discreetly and, from my experience, fairly fluidly.

    Whatever boundary was left between you and a device, Meta’s Ray-Ban Display narrows it to a gap that only an iPhone Air could slide through. It’s incredibly exciting in one way, because I can see Meta’s smart glasses being both useful and fun. The ability to swipe through a UI in front of my face by sliding my thumb around like some kind of computer cursor made of meat is wild and, at times, actually thrilling. While not everything works seamlessly yet, the door to smart glasses supremacy feels like it’s been swung wide open. You are going to want a pair of these smart glasses whether you know it or not. These are going to be popular, and as a result, potentially problematic.

    Meta’s “Neural Band” looks as discreet as the glasses. © James Pero / Gizmodo

    We may have a solid grasp on where and when we’re supposed to use phones, but what happens when that “phone” in question becomes perfectly discreet, and the ability to use it becomes almost unnoticeable to those around us? When I use a smartphone, you can see me pick it up—you know there’s a device in my hand. When I use Meta’s Ray-Ban Display, however, there’s almost no indication. Yes, there’s a privacy light that tells outside people that a picture or video is being taken, but there’s also less than 2% light leakage through the lens, meaning you can’t tell when the screen inside the glasses is on. I certainly couldn’t tell when I watched others use them. It’s as ambient as any ambient computing I’ve witnessed so far.

    I talked to Anshel Sag, a principal analyst at Moor Insights & Strategy who covers the wearable market, and he says the privacy framework around technology like this is still in flux.

    “We are still very much in the infancy of the smart glasses, AI wearable, and AR privacy and etiquette era,” he said. “I think that the reality is that having a wearable with a camera on your face is going to change things, and there are going to be places where these things are banned explicitly.”

    Some of those environments, Sag said, are private areas like bathrooms or locker rooms, but the bans could extend beyond places where you might catch a glimpse of someone naked. Driving, for example, is a major question. Meta’s Ray-Ban Display have navigation built in, and while the company tells me the feature is designed for walking right now, it’s not actually preventing anyone from using its smart glasses in the car. Instead, it will detect how fast you’re moving and warn you before you do so. Other companies, like Amazon, don’t seem to have considered that navigating on smart glasses while driving could be a safety hazard at all: early reports indicate that Amazon is plowing forward, making smart glasses specifically designed for its delivery drivers to use in a car.

    While regulators like the NHTSA have issued warnings about people using VR headsets while driving (yes, people were actually doing that), as far as I can tell, the agency hasn’t addressed the impact of smart glasses, which are much more likely—especially if they become widespread—to enter the equation while driving. I reached out to the NHTSA for comment but have not yet received a response.

    Privacy concerns shouldn’t just stem from the form factor, either. You also have to think about the company that’s making the thing you’re wearing on your face all the time and whether it has shown itself to be a good steward of your data and privacy. In Meta’s case? Well, without going into an entirely separate diatribe, I think it could do a lot better. And other companies that are also in hot pursuit of screen-clad glasses, like Google? Well, they haven’t been much better.

    And makers of smart glasses shouldn’t be surprised if, when these things wind up on people’s faces, they get some shit for it. Google Glass, which came out in 2013, may seem like a different age, and in a lot of ways it is (people’s expectations for privacy are almost nonexistent now), but we also haven’t had to confront the idea of pervasive camera-clad wearables in a long time, so who’s to say things have really changed? Sag says, while he expects some backlash, it may not be like the Glasshole days of yore.

    “I think there will be some backlash, but I don’t think it’s gonna be as bad as Google Glass,” he says. “Google Glass had such an invasive appearance. You know, it didn’t really look normal, so it really caught people’s attention more. And I think that’s really what has made these glasses more successful, is that they’re just inherently less intrusive in terms of appearance.”

    I may not be an industry analyst, but I agree with Sag. I’m not sure there really will be a category-ending backlash like we saw back in the Google days, and a part of me doesn’t want there to be. As I mentioned, I got a chance to use Meta’s Ray-Ban Displays, and the idea all but knocked my socks off. These are the smart glasses that anyone interested in the form factor has been waiting for. What I really want is to be able to live in a world where we can all use them respectfully and responsibly, and one where the companies that are making them give us the same responsibility and respect back. But in my experience, the only way to get toward a more respectful, harmonious world is to try everything else first, and in this case, the first step might be your next pair of Ray-Bans.

    James Pero

  • Meta CTO explains why the smart glasses demos failed at Meta Connect — and it wasn’t the Wi-Fi | TechCrunch

    Meta chief technology officer Andrew Bosworth took to his Instagram to explain, in more technical detail, why multiple demos of Meta’s new smart-glasses technology failed at Meta Connect, the company’s developer conference, this week.

    Meta on Wednesday introduced three new pairs of smart glasses, including an upgraded version of its existing Ray-Ban Meta, a new Meta Ray-Ban Display that comes with a wristband controller, and the sports-focused Oakley Meta Vanguard.

    However, at different points during the event, the live technology demos failed to work.

    In one, cooking content creator Jack Mancuso asked his Ray-Ban Meta glasses how to get started with a particular sauce recipe. After repeating the question, “What do I do first?” with no response, the AI skipped ahead in the recipe, forcing him to stop the demo. He then tossed it back to Meta CEO Mark Zuckerberg, saying that he thinks the Wi-Fi may be messed up.

    Jack Mancuso at Meta Connect. Image Credits: Meta

    In another demo, the glasses failed to pick up a live WhatsApp video call between Bosworth and Zuckerberg; Zuckerberg eventually had to give up. Bosworth walked onstage, joking about the “brutal” Wi-Fi.

    “You practice these things like a hundred times, and then you never know what’s gonna happen,” Zuckerberg said at the time.

    After the event, Bosworth took to his Instagram for a Q&A session about the new tech and the live demo failures.

    On the latter, he explained that it wasn’t actually the Wi-Fi that caused the issue with the chef’s glasses. Instead, it was a mistake in resource management planning.

    “When the chef said, ‘Hey, Meta, start Live AI,’ it started every single Ray-Ban Meta’s Live AI in the building. And there were a lot of people in that building,” Bosworth explained. “That obviously didn’t happen in rehearsal; we didn’t have as many things,” he said, referring to the number of glasses that were triggered.

    That alone wasn’t enough to cause the disruption, though. The second part of the failure had to do with how Meta had chosen to route Live AI traffic to its development server to isolate it during the demo. When it did so, it rerouted that traffic for everyone in the building on those access points, which included all the headsets.

    “So we DDoS’d ourselves, basically, with that demo,” Bosworth added. (A DDoS attack, or a distributed denial of service attack, is one where a flood of traffic overwhelms a server or service, slowing it down or making it unavailable. In this case, Meta’s dev server wasn’t set up to handle the flood of traffic from the other glasses in the building — Meta was only planning for it to handle the demos alone.)
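
    To picture why that overwhelms things, here is a toy back-of-the-envelope sketch in Python. It is not Meta’s infrastructure or code; the device counts, request rates, and server capacity are invented numbers purely to illustrate how a server provisioned for a rehearsal-sized demo behaves when every pair of glasses in the room starts talking to it.

        def simulate_load(devices: int, requests_per_device: float, capacity: float) -> None:
            """Compare offered load (requests/sec) against what the server can process."""
            offered = devices * requests_per_device
            if offered <= capacity:
                print(f"{devices} devices: {offered:.0f} req/s vs capacity {capacity:.0f} -> healthy")
            else:
                backlog = offered - capacity
                print(f"{devices} devices: {offered:.0f} req/s vs capacity {capacity:.0f} "
                      f"-> overloaded, backlog grows by {backlog:.0f} req/s")

        simulate_load(devices=5, requests_per_device=2.0, capacity=50)    # rehearsal-sized crowd
        simulate_load(devices=500, requests_per_device=2.0, capacity=50)  # keynote-sized crowd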

    The issue with the failed WhatsApp call, on the other hand, was the result of a new bug.

    The smart glasses’ display had gone to sleep at the exact moment the call came in, Bosworth said. When Zuckerberg woke the display back up, it didn’t show him the answer notification. The CTO said this was a “race condition” bug, in which the outcome depends on the unpredictable, uncoordinated timing of two or more processes trying to use the same resource simultaneously.
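
    For readers unfamiliar with the term, here is the textbook version of a race condition in a few lines of Python. This is a generic illustration, not Meta’s code: several threads read a shared value, pause just long enough for another thread to run, and then write back stale data, so the final result depends entirely on timing.

        import threading
        import time

        counter = 0  # shared resource that multiple threads touch

        def unsafe_increment():
            global counter
            for _ in range(100):
                current = counter      # read the shared value
                time.sleep(0)          # yield: another thread may run right here
                counter = current + 1  # write back a possibly stale value

        threads = [threading.Thread(target=unsafe_increment) for _ in range(4)]
        for t in threads:
            t.start()
        for t in threads:
            t.join()
        print(f"expected 400, got {counter}")  # usually fewer: some updates were lost to the race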

    “We’ve never run into that bug before,” Bosworth noted. “That’s the first time we’d ever seen it. It’s fixed now, and that’s a terrible, terrible place for that bug to show up.” He stressed that, of course, Meta knows how to handle video calls, and the company was “bummed” about the bug showing up here.

    Despite the issues, Bosworth said he’s not worried about the results of the glitches.

    “Obviously, I don’t love it, but I know the product works. I know it has the goods. So it really was just a demo fail and not, like, a product failure,” he said.

    Sarah Perez

  • Mark Zuckerberg has begun his quest to kill the smartphone | TechCrunch

    If you can’t resist the urge to check your phone over and over, even if you’re out with friends, Meta has a solution: check your glasses instead.

    “The promise of glasses is to preserve this sense of presence that you have with other people,” said CEO Mark Zuckerberg at the Meta Connect 2025 keynote. “I think that we’ve lost it a little bit with phones, and we have the opportunity to get it back with glasses.”

    In reality, Meta wants its own hardware to eat into the market share of Apple and Google so that it doesn’t have to keep siphoning profits to them via app stores. Nevertheless, this is the angle Meta is taking to sell its most sophisticated smart glasses yet, the Meta Ray-Ban Display, which the company hopes could one day eclipse the market share of smartphones.

    Meta’s Reality Labs division burns cash at an alarming rate, which has concerned investors over the years. But Wednesday’s event finally showed us a glimpse of what the division’s $70 billion in losses since 2020 have gone toward.

    Meta has had its fair share of flops, like the entire promise of its social metaverse. (Remember when they announced that metaverse avatars would finally get legs?) But with the Meta Ray-Ban Display, Meta has created a remarkable piece of technology, unlike any other consumer-facing product on the market — we have yet to test it ourselves, so we can’t quite say just how groundbreaking this really is, but it looks promising.

    Like Meta’s existing smart glasses, which have sold millions of pairs, the new model has cameras, speakers, microphones, and an on-board AI assistant. The display on the glasses, which is offset so as not to obstruct one’s sightline, can display Meta apps like Instagram, WhatsApp, and Facebook, as well as directions and live translations.

    What most sets the Meta Ray-Ban Display apart is the Meta Neural Band, a wristband that uses surface electromyography (sEMG) to pick up on signals sent between your brain and your hand when performing a gesture.

    Meta’s keynote didn’t get into the specifics of how Zuckerberg was writing the text messages he sent during the demo, but according to Reality Labs’ research on sEMG, users can write out messages like this by holding their fingers together as if they were gripping a pen and “writing” out the text.

    While some live AI demos at the keynote failed — Zuckerberg blamed the Wi-Fi — we at least got to see the wristband in action, which is more novel. Zuckerberg quickly wrote out text messages, then sent them on his Ray-Bans.

    “I’m up to about 30 words a minute on this,” Zuckerberg said on stage at the company’s Menlo Park headquarters. “You can get pretty fast.”

    On a touchscreen smartphone like an iPhone, research has estimated that people text at about 36 words per minute, making Zuckerberg’s claim impressive. Reality Labs’ research participants averaged closer to 21 words per minute.

    Unlike past Meta Ray-Bans, this technology allows people to actually use the glasses without speaking aloud, which isn’t always natural in public settings. While Apple Watch users can send texts without voice prompting, the process is so tedious and slow that it’s only useful as a last resort.

    Other gesture controls on the wristband seem more similar to technology that consumers have used before, like Nintendo Joy-Cons and Apple Watches. But if the voiceless texting interface is as good as it seems, then the wristband will likely be capable of more complex gestures than we’re used to.

    Meta has invested heavily in research on sEMG since 2021, even showing us a prototype of a heftier product called Orion. Like Apple and Google, Meta is preparing for a not-so-impossible future where these smart glasses could potentially eclipse the smartphone.

    But as is the risk with any massive hardware investment, there’s no way to know if this will actually feel more natural to people in their day-to-day lives than pulling a sleek aluminum rectangle out of their pocket to tap out messages to their friends.

    This might be Meta’s biggest bet — perhaps a bigger bet than its subpar metaverse. That’s why it’s so striking that Zuckerberg is unveiling this technology as not just a fascinating innovation, but something that he wants to portray as more prosocial than the smartphone. It’s a way for him to capitalize on our growing malaise with our ever-increasing screen time, even though he’s the one making the apps that demand our attention.

    “The technology needs to get out of the way,” Zuckerberg said.

    Will the smartphone become an obsolete relic like a Nokia with a T9 keyboard? That depends on whether or not there’s truth to Zuckerberg’s narrative that these glasses will help us feel more present. But Meta and its competitors are betting big on the cultural shift from smartphones to smart glasses, and the Ray-Ban Display will give consumers their first taste of this possible future.

    Amanda Silberling

  • Meta’s New Wraparound Smart Glasses Are the Most Oakley Oakleys You Can Buy

    Ray-Ban wasn’t the only collaboration that got some shine at Meta Connect. The company also took the wraps (no pun intended) off a pair of wraparound shades made with Oakley that, like its recently released HSTN smart glasses, are aimed more at sporty types.

    Outside of the different shape, the $499 Oakley Meta Vanguard (yes, that’s the official name, in that order) also have a centered camera that’s meant to be better suited for capturing footage during “action” sports like snowboarding or cycling. Like Oakley’s HSTN glasses, the Vanguard have upgraded camera specs: they can capture video in up to 3K resolution with a 12-megapixel camera that has a 122-degree field of view.

    There are some new fitness integrations, specifically with Garmin and Strava, that allow you to use the smart glasses as a sort of augment for health wearables. For instance, you can ask Meta AI how you’re doing on your fitness goals, or you can get updates on other fitness metrics in real time.

    While the tech inside the Vanguard is significant, equally important is the form factor. Wraparound shades, while they’re probably not the style most normies would spring for, are ideal for skiing and snowboarding because of the superior wind blockage. Having used Meta’s HSTN smart glasses a little myself, I think the Vanguard will appeal more to people interested in the action sports side of things, since the HSTN double as just regular specs.

    One of the biggest upgrades, and one I got to hear for myself, is the speakers. According to Meta, the Vanguard are 6 decibels louder than the HSTN glasses, which is clutch if you’re tearing down a hill at 30 mph on a snowboard. Meta also tried to optimize the design for sports in a number of ways, including an IP67 rating, which makes them very resistant to water and dust. I don’t know any professional water skiers, but if I did, I’d probably recommend these smart glasses.

    Battery-wise, the Vanguard have decent longevity on paper. According to Meta, they get 9 hours of battery life with “mixed usage,” or up to 6 hours if you’re playing music continuously. With the charging case, Meta says its smart glasses get an additional 36 hours, and they can go from 0 to 50% in 20 minutes. There are all sorts of lens variations this go-around too, including black, sapphire, 24K (which is gold), and something called “Road.” Those lenses can be swapped around or replaced, but it’ll cost you a whole $85.

    I haven’t had a chance to really test out the Vanguards in depth, but I can see how these would be appealing to someone who wants a sturdy pair of action-sports-oriented smart glasses. They’re available on Oct. 21 if that’s your thing, or you can preorder now.

    James Pero

  • Meta launches Hyperscape, technology to turn real-world spaces into VR | TechCrunch

    Although today’s Meta Connect developer conference was largely about new smart glasses, the social networking company did announce a handful of metaverse updates during Wednesday’s keynote. Of these, one of the largest was the introduction of Hyperscape, first demoed at last year’s event, which allows developers and creators to build more photorealistic spaces in virtual reality.

    The company announced that Hyperscape Capture is now rolling out in Early Access, meaning Quest device owners will be able to scan a room in a few minutes, then turn it into an immersive and photorealistic world that’s like a digital replica of a real-world space.

    The capture process itself only takes a few minutes, but the room’s rendering will actually take a few hours, Meta notes.

    At launch, users won’t be able to invite others into their digital spaces, though that functionality will be supported in time, Meta says, through a private link.

    However, the tech has already been used to render some featured Hyperscape worlds, including Gordon Ramsay’s home kitchen in L.A., Chance the Rapper’s House of Kicks, The Octagon at the UFC Apex in Las Vegas, and Happy Kelli’s room filled with her Crocs shoe collection.

    Meta first demoed Hyperscape last year at its Connect conference, showing how it used Gaussian Splatting, cloud rendering, and streaming to make the digital worlds appear on a Meta Quest 3 headset. Now, it’s rolling it out to users 18 years old and up, who have either a Quest 3 or Quest 3S.

    The rollout will be gradual, starting today, so not all users may see it immediately.

    Meta also introduced more metaverse updates at today’s event, including a new lineup of fall VR games such as Marvel’s Deadpool VR, ILM’s Star Wars: Beyond Victory, Demeo x Dungeons & Dragons: Battlemarked, and Reach.

    Its streaming app, Horizon TV, will add support for Disney+, ESPN, and Hulu, while a partnership with Universal Pictures and horror company Blumhouse will offer movies like “M3GAN” and “The Black Phone” with immersive special effects. A 3D clip of “Avatar: Fire and Ash” will also be available for a limited time.

    Sarah Perez

  • Meta Ray-Ban Display Hands-On: The Smart Glasses You Were Waiting For

    There’s one thing people want to know when they see my first-gen Ray-Ban smart glasses, and it’s got nothing to do with AI, or cameras, or the surprisingly great open-ear audio they put out. They want to know what’s probably front-of-mind right now as you’re reading this: Do they have a screen in them? The answer? Sadly, no… until now.

    At Meta Connect 2025, Meta finally unveiled its Ray-Ban Display smart glasses that, as you may have gathered from the name, have a screen in them. It doesn’t sound like much on the surface—we have screens everywhere, all the time. Too many of them, in fact. But I’m here to tell you that after using them in advance of the unveil, I regret to inform you that you will most likely want another screen in your life, whether you know it or not. But first, you probably want to know exactly what’s going on in this screen I speak of.

    The answer? Apps, of course. The display, which is actually full-color and not monochrome like previous reporting suggested, acts as a heads-up display (HUD) for things like notifications, navigation, and even pictures and videos. For the full specs of that display, you can read the news companion to my hands-on here. For now, though, I want to focus on what that screen feels like. The answer? A little jarring at first.

    While the Ray-Ban Display, which weigh 69g (about 10 grams more than the first-gen glasses without a screen), do their best not to shove a screen in front of your face, it’s still genuinely there, hovering like a real-life Clippy, waiting to distract you with a notification at a moment’s notice. And, no matter what your feelings are about smart glasses that have a screen, that’s a good thing, since the display is the whole reason you might spend $800 to own a pair. Once your eyes adjust to the screen (it took me a minute or so), you can get cracking on doing stuff. That’s where the Meta Neural Band comes in.

    The Neural Band is Meta’s sEMG wristband, a piece of tech it’s been showing off for years now that’s been shrunk down into the size of a Whoop fitness band. It reads the electrical signals in your hand to register pinches, swipes, taps, and wrist turns as inputs in the glasses. I was worried at first that its wristband might feel clunky or too conspicuous on my body, but I can inform you that it’s not the case—this is about as lightweight as it gets. The smart glasses also felt light and comfortable on my face despite being noticeably thicker than the first-gen Ray-Bans.

    More important than being lightweight and subtle, it’s very responsive. Once the Neural Band was tight on my wrist (it was a little loose at first, but better after I adjusted it), using it to navigate the UI was fairly intuitive. An index finger and thumb pinch is the equivalent of “select,” a middle-finger and thumb pinch is “back,” and for scrolling, you make a fist and then use your thumb like it’s a mouse made of flesh and bone over the top of said fist. It’s a bit of Vision Pro and a bit of Quest 3, but with no hand-tracking needed. I won’t lie to you, it feels like a bit of magic when it works fluidly.

    Personally, I still had some variability on inputs—you may have to try to input something once or twice before it registers—but I would say that it works well most of the time (at least much better than you’d expect for a literal first-of-its-kind device). I suspect the experience will only get more fluid over time, though, and even better once you really train yourself to navigate the UI properly. Not to mention the applications for the future! Meta is already planning to launch a handwriting feature, though it’s not available at launch. I got a firsthand look… kind of. I wasn’t able to use handwriting myself, but I watched a Meta rep use it, and it seemed to work, though I have no way of knowing how well until I use it for myself.

    But enough about controls; let’s get to what you’re actually doing with them. I got to briefly experience pretty much everything the Meta Ray-Ban Display have to offer, and that includes the gamut of phone-adjacent features. One of my favorites is taking pictures in a POV mode, which superimposes a window on the glasses display that shows you what you’re taking a picture of right in the lens—finally, no guess-and-check when you’re snapping pics. Another “wow” moment here is the ability to pinch your fingers and twist your wrist (like you’re turning a dial) to zoom in. It’s a subtle thing, but you feel like a wizard when you can control a camera by just waving your hands around.

    Another standout feature is navigation, which imposes a map on the glasses display to show you where you’re going. Obviously, I was limited in testing how that feature works since I couldn’t wander off with the glasses in my demo, but the map was quite sharp and bright enough to be used outdoors (I did test this stuff in sunlight, and the 5,000 nits brightness was sufficient). Meta is leaving it up to you whether you use navigation while you’re in a vehicle or on a bike, but it will warn you of the dangers of looking at a screen if it detects that you’re moving quickly. It’s hard to say how distracting a HUD would be if you’re biking, and it’s something that I plan to eventually test in full.

    Another feature you might actually use is video calling, which pulls up a video of the person you’re calling in the bottom-right corner. The interesting part is that it’s POV for the person you’re calling, so they can see what you’re looking at. It’s not something I’d use in every situation, since usually the person you’re calling wants to see you and not just what you’re looking at, but I can confirm that it works, at least.

    Speaking of just working, there’s also a live transcription feature that can listen in on your environment and superimpose what the other person is saying onto the display of the smart glasses. I had two thoughts when using this feature: the first one is that it could be a game-changer for accessibility. If your hearing is impaired, being able to actually see a live transcript could be hugely helpful. Secondly, such a feature could be great for translation, which is something that Meta has already thought of in this case. I didn’t get a chance to use the smart glasses for translating another language, but the potential is there.

    One problem I foresee here, though, is that the smart glasses may pick up other conversations happening nearby. Meta thought of this too and said that the microphones in the Ray-Ban Display actually beamform to focus just on who you’re looking at, and I did get a chance to test that out. While one Meta rep spoke to me in the room, others had their own conversations at a fairly normal volume. The results? Kind of mixed. While the transcription focused mostly on the person I was looking at, it still picked up stray words here and there. This feels like a bit of an inevitability in loud scenarios, but who knows? Maybe beamforming and AI can fill in the gaps.
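
    For the curious, the core idea behind beamforming is simple: delay each microphone’s signal so that sound arriving from the direction you care about lines up, then average the channels. Below is a minimal delay-and-sum sketch in Python with NumPy. It is the generic textbook technique, not Meta’s implementation; the mic positions, sample rate, and test signal are made up for illustration.

        import numpy as np

        fs, c = 16_000, 343.0                          # sample rate (Hz), speed of sound (m/s)
        mic_x = np.array([-0.06, -0.02, 0.02, 0.06])   # four mics on a line (meters), invented layout

        def steering_delays(angle_deg):
            # Per-mic arrival delays (s) for a plane wave coming from angle_deg (0 = straight ahead).
            return mic_x * np.sin(np.deg2rad(angle_deg)) / c

        def delay_and_sum(mic_signals, angle_deg):
            # Undo each channel's arrival delay in the frequency domain, then average the channels.
            n = mic_signals.shape[1]
            freqs = np.fft.rfftfreq(n, d=1 / fs)
            aligned = [
                np.fft.irfft(np.fft.rfft(sig) * np.exp(2j * np.pi * freqs * tau), n)
                for sig, tau in zip(mic_signals, steering_delays(angle_deg))
            ]
            return np.mean(aligned, axis=0)

        # Simulate a 1 kHz talker at 30 degrees plus per-mic noise, then steer toward (and away from) them.
        t = np.arange(2048) / fs
        mics = np.array([np.sin(2 * np.pi * 1000 * (t - tau)) for tau in steering_delays(30.0)])
        mics = mics + 0.5 * np.random.randn(*mics.shape)
        print("steered at talker:", np.mean(delay_and_sum(mics, 30.0) ** 2))
        print("steered elsewhere:", np.mean(delay_and_sum(mics, -60.0) ** 2))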

    If you’re looking for a killer feature of Meta’s Ray-Ban Display smart glasses, I’m not sure there necessarily is one, but one thing I do know is that the coupling of the glasses with its Neural Band should be nothing short of a game-changer. Navigating the UI in smart glasses has been a constant issue in the space, and until now, I haven’t seen what I thought was a killer solution, but based on my early demos, I’d say that Meta’s “brain-reading” wristband could be the breakthrough we were waiting for—at least until hand or eye tracking at this scale becomes possible.

    I’ll know more about how everything works when I get a chance to use Meta Ray-Ban Display on my own, but for now I’d say Meta is still clearly the frontrunner in the smart glasses race, and its head start just got pretty massive.

    James Pero

  • Meta’s Ray-Ban Smart Glasses Now Have a Screen and a Magic Wristband

    Meta’s Ray-Bans are back with a new generation, and this time they’re finally giving people the one thing they really want—a screen. At Meta’s annual Connect developer conference, the company officially took the wraps off its Meta Ray-Ban Display, which are, as the name suggests, its first pair of AI-infused smart glasses to come with a full-color in-lens display.

    The smart glasses, which still bear the same Ray-Ban branding, will cost $799 and are available for preorder today. As you might expect, they can do quite a few things that their predecessor can’t, including message notifications, turn-by-turn navigation, and telling you when queries to Meta AI are processing. There are several app integrations, including WhatsApp and Instagram, allowing you to watch reels and make video calls natively in the glasses. One major upgrade on the message notifications front is that the Meta Ray-Ban Display will not be limited to only WhatsApp, meaning it will be able to show notifications on both iOS and Android devices.

    That’s not the only major shift in this generation. Meta says its first-ever display has a 600 x 600 resolution and 20-degree field of view. The display is monocular, which means it’s only in one lens—at the bottom right-ish corner—and has a refresh rate of 90Hz. Brightness goes up to 5,000 nits and as low as 30 nits, which makes them usable outdoors in full light. One of the coolest parts of the display is that Meta claims that there’s less than 2% light leakage, which means that you can’t see when someone has their display activated.

    Speaking of light leakage, all of Meta’s Ray-Ban Display smart glasses will come with transition lenses. On one hand, that feels like a weird choice, but it also makes sense since this is a gadget you’re going to want to use indoors as well as out, and for $800, you should be able to use them for as long as you like without having to take them off. “As long as you like,” in this case, will be no more than 6 hours, according to Meta, which is a lot longer than I was expecting. That solid battery life is thanks in part to what Meta is calling “ultra-narrow steelcan batteries.” I wish I knew exactly what that meant, but for now, I can only look forward to getting more of a deep dive in the future.

    Glasses are only half the appeal of Meta’s Ray-Ban Display, though. The other half is its sEMG wristband that you use to control the UI in the glasses.

    The Meta Neural Band, as Meta is calling it, is arguably the most innovative part of its new Ray-Ban package, since no other product like it exists on a commercial scale. Outside of being a first, it also offers a potential solution to a problem that no other maker of smart glasses has quite solved—that problem being, how the hell do you actually use smart glasses? While most smart glasses (Meta’s first-gen Ray-Bans included) have a voice assistant for shouting commands like “take a picture” and a fairly simple touch-sensitive bar for physical inputs (i.e., pause/play), neither is ideal in every situation.

    The fact is, adding a screen complicates smart glasses—the more you can do, the more you’ll need to convey to your glasses, and in order to do that, you need an input system as nuanced as the eyewear itself. Not only that, but if you want to use your smart glasses discreetly (or in a normal fashion at all, really), shouting into a crowded subway car is less than ideal. With the Neural Band, however, you can navigate the UI discreetly by pinching, swiping, and tapping through various menus in the smart glasses. My favorite gesture is a pinch to zoom for taking photos and videos. It’s Vision Pro-esque, but all done without cameras. In case you’re wondering, yes, the neural band is included in that $800 cost.

    I got a chance to use Meta’s Ray-Bans and its new Neural Band, and you can read my full impressions here.

    Like previous iterations of Meta’s Ray-Ban glasses, this year’s edition will also come equipped with cameras and speakers. Camera-wise, Meta is going with a 12-megapixel ultra-wide sensor that can also capture 1080p at 30fps in a 1,440 x 1,920 resolution. There’s also a 3x digital zoom.

    The camera also powers the computer vision in the glasses, aka Meta AI. I’m a fan of Meta’s Ray-Bans (they’re the only device I ever want to take calls with), but Meta AI has been a weak spot for me. While the voice assistant works well most of the time for basic stuff like taking pictures and videos, playing Spotify, and asking what your battery life is, the heavier AI lifting is hit-or-miss at best. Whether Meta’s new smart glasses fix that remains to be seen since I haven’t had a chance to use them, but I’m hoping for an upgrade here.

    But even if AI is still finicky and the cameras and audio are about the same, these smart glasses still have a freaking screen. That’s a big step forward, even if functionality is limited for now. When people ask me about my first-gen Ray-Bans, the first thing they want to know is whether they have a display in them, and they’re inevitably very disappointed when I have to let them down. Now, I’ll actually have something to show them, and if Meta’s wristband works, I’ll even have something to show them that only Meta can provide.

    James Pero

  • Meta unveils new smart glasses with a display and wristband controller | TechCrunch

    Meta on Wednesday unveiled a new pair of Ray-Ban-branded smart glasses with a built-in display for apps, alerts, and directions on the right lens. The smart glasses are controlled by a wristband, called the Meta Neural Band, that picks up on subtle hand gestures. It’s the same wristband Meta showed at last year’s Connect as part of its Orion demo.

    CEO Mark Zuckerberg announced the new product, called Meta Ray-Ban Display, onstage at the company’s annual developer conference, Meta Connect 2025. Unlike Orion, Zuckerberg says this is a product that people can buy in a couple of weeks, starting September 30, and they’ll cost $799.

    This is Meta’s latest attempt to ship a pair of consumer smart glasses that can handle many of the tasks users traditionally do on a smartphone. For years, Meta has been forced to reach users through its competitors’ devices, namely those sold by Google and Apple. While Meta has invested billions in virtual reality headsets, AI-powered smart glasses now seem like the most promising way for the company to connect with users on its own hardware.

    With the Meta Ray-Ban Display, Meta aims to build off the success of its original Ray-Ban Meta smart glasses, which the company has sold millions of pairs of with its eyewear partner, EssilorLuxottica. Much like Ray-Ban Meta, the Meta Ray-Ban Display comes equipped with an on-board AI assistant, as well as cameras, speakers, and microphones. The glasses let users connect to the cloud to access the internet and social media apps.

    Meta says the display enables users to do much more with their smart glasses. They can pull up Meta apps like Instagram, WhatsApp, and Facebook, as well as view directions and see live translations in the smart glasses’ display.

    The Neural Band that ships alongside the device looks similar to a Fitbit, but without a screen, and allows users to navigate apps with small hand movements. Zuckerberg said onstage that the Meta Neural Band has 18 hours of battery life and is water resistant.

    The device uses electromyography (EMG) to pick up on signals sent between your brain and your hand when performing a gesture. Meta is betting this interface will be a new way users can control their devices.
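
    As a rough mental model of how signals like these might become commands, the sketch below decodes gestures from windows of multi-channel sEMG using a simple amplitude feature and a nearest-template classifier. The data is synthetic and the pipeline is generic; it is not Meta’s Neural Band model, just an illustration of the signal-to-gesture idea.

        import numpy as np

        rng = np.random.default_rng(0)
        channels, window = 8, 200          # 8 electrodes, 200 samples per window (invented numbers)
        gestures = ["index_pinch", "middle_pinch", "thumb_swipe"]

        def make_window(gesture_id):
            # Synthetic sEMG: each gesture drives one electrode harder than the rest.
            x = 0.1 * rng.standard_normal((channels, window))
            x[gesture_id] += rng.standard_normal(window)
            return x

        def features(x):
            # Root-mean-square amplitude per channel, a classic sEMG feature.
            return np.sqrt((x ** 2).mean(axis=1))

        # "Train" by averaging the feature vector for each gesture over a few example windows.
        centroids = np.array([
            np.mean([features(make_window(g)) for _ in range(50)], axis=0)
            for g in range(len(gestures))
        ])

        def classify(x):
            # Nearest-centroid decode: pick the gesture whose template is closest.
            distances = np.linalg.norm(centroids - features(x), axis=1)
            return gestures[int(np.argmin(distances))]

        print(classify(make_window(1)))    # expected output: middle_pinch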

    Earlier this week, a video leaked of Meta’s latest smart glasses. CNBC and Bloomberg previously reported that the smart glasses, which were internally codenamed Hypernova, would be unveiled at this year’s Connect conference.

    It’s worth noting that Meta Ray-Ban Display are far less capable than the Orion smart glasses Meta showed off at Connect 2024. That device featured augmented reality lenses and eye tracking, while this pair uses a much simpler display. It may be years before Meta ever sells Orion.

    Still, Meta is hoping it can win the smart glasses market by being first to market with a real product. However, it seems likely that Google and Apple will launch smart glasses of their own in the years to come. Those devices will undoubtedly be able to integrate into Google and Apple’s respective operating systems, giving them a significant leg up over Meta.

    Maxwell Zeff

  • Meta unveils its new Oakley Meta Vanguard smart glasses for athletes | TechCrunch

    At Meta Connect 2025 on Wednesday, the company unveiled its new Oakley Meta Vanguard smart glasses that are geared toward runners, cyclists, and other athletes. The glasses retail for $499 and are launching on October 21.

    Meta Connect is the social networking giant’s biggest conference of the year, where it unveils smart glasses and VR headsets.

    The glasses feature a large unified front lens with a centered camera, instead of having cameras positioned at the top corners of the frames, as in previous Meta smart glasses and the Oakley Meta HSTN model. The new glasses can capture video in up to 3K resolution and feature a 12-megapixel camera with a 122-degree wide-angle lens.

    The Oakley Meta Vanguard smart glasses feature a programmable button that can trigger a custom AI prompt, which you can set up using the Meta AI app. Meta notes that all of the buttons on the smart glasses are located underneath to allow athletes to wear helmets comfortably while using them.

    The Oakley Meta Vanguard glasses have up to nine hours of battery life, or up to six hours of continuous music playback. The glasses come with a charging case that can provide an additional 36 hours of charge on the go. Meta notes that users can quickly charge the glasses to 50% in 20 minutes via the charging case.

    Meta says the open-ear speakers, which are six decibels louder than Oakley Meta HSTN, are the most powerful speakers yet on any of its glasses. They also feature a five-microphone array optimized to reduce wind noise while on calls, messaging, or using Meta AI with your voice.

    The glasses have an IP67 dust and water resistance rating for use during intense workouts, the highest yet of any of Meta’s smart glasses, according to the company. Meta says the wraparound design of the glasses features Oakley PRIZM™ Lens technology, which is designed to block out sun, wind, and dust.

    Keeping with the athletic theme, the glasses can be integrated with Garmin smartwatches. Users can ask the smart glasses for stats such as their heart rate and pace while running, cycling, or participating in other activities. The glasses also integrate with Strava; users can graphically overlay their performance metrics onto videos and photos captured with Oakley Meta Vanguard and share them directly to their Strava community.

    The glasses are available in four frame and lens colors: Oakley Meta Vanguard Black with PRIZM™ 24K, Oakley Meta Vanguard White with PRIZM™ Black, Oakley Meta Vanguard Black with PRIZM™ Road, and Oakley Meta Vanguard White with PRIZM™ Sapphire.

    Oakley Meta Vanguard will be available in the United States, Canada, U.K., Ireland, France, Italy, Spain, Austria, Belgium, Australia, Germany, Sweden, Norway, Finland, Denmark, Switzerland, and the Netherlands. Meta plans to launch the glasses in Mexico, India, Brazil, and the United Arab Emirates later this year.

    Wednesday’s announcement comes three months after Meta unveiled its Oakley Meta HSTN smart glasses. At the time, the company said the smart glasses were its “first product for athletes and fans alike.”

    Meta also unveiled Wednesday its new pair of Ray-Ban branded smart glasses with a built-in display for apps, photos, and directions alongside a wearable wristband to control them. Additionally, the tech giant announced its new Ray-Ban Meta 2 with up to 2x the battery life of the previous model and 3K Ultra HD video capture.

    Aisha Malik
