ReportWire

Tag: smart glasses

  • Meta’s Holy Grail AR Smart Glasses Have One Big Puck-Shaped Problem


    Meta’s “Phoenix” XR smart glasses might not be out yet, but if there’s one thing we already know, it’s that when they are, they’ll need a little assistance on the computing side. That help will likely come from a puck that you’ll have to carry around with you—a puck that we may have just gotten our first glimpse of.

    A new mockup render shared on X by Noridoesvr, who claims to have seen prototypes of Meta’s hardware—a rumored pair of lightweight XR glasses in a goggle-like form factor—offers more insight into what Project Phoenix’s puck will entail.

    At first glance, the compute puck looks pretty manageable. It’s not oversized, it looks relatively innocuous, and it even has a waistband clip so you can cart it around without pockets. It sounds like no big deal, but it might also be the one thing that turns people off from Meta’s AR glasses.

    As promising as the future of AR glasses suddenly is, cramming a whole computer into a pair of frames that rest comfortably on your face is no easy task. Miniaturization is rough, and at a certain point, maybe even impossible. Shrinking down a computer to fit on your face butts against Moore’s Law pretty directly—you need the power to do all sorts of stuff in a form factor that’s light and ergonomic, but you need to do all of that without burning a glasses-sized hole through someone’s head (thermals are no joke).

As a workaround to all of those issues, Meta seems interested in offloading the compute to a puck, a solution that Google and Xreal are also pursuing with Project Aura. Google and Xreal’s partnership was showcased in December and relies on a wired puck to enable a computer-like experience where people can use Android apps on a big virtual screen. Think Vision Pro, but in a much, much smaller form factor.

    Framed that way, the value proposition for tethered smart glasses makes sense. The Vision Pro might be an impressive technical feat, but wearing one for long periods of time sucks because of the weight and the resulting not-so-great battery. Tethered smart glasses take all the weight and put it… not on your face, which is objectively a win for your nose and forehead.

    In other ways, though, both of these form factors share the same problem. The Vision Pro, like Meta’s Phoenix smart glasses and Project Aura, also needs its own kind of puck—a battery pack. To shed weight, the Vision Pro connects to a battery that you have to carry along with you, along with a wire. It’s not ideal, but that’s the tradeoff for a face-worn computer that does more than mirror your connected device’s screen.

    It becomes even less ideal when you consider the puck is on your body. As you may have noticed, there appears to be an exhaust fan on the compute puck, which could presumably serve to direct heat away from your body. It’s hard to tell, but based on the renderings, it might be pointing up? That would be a strange choice, and there’s a chance that what I’m seeing as the correct orientation is actually the opposite. Here’s to hoping this thing doesn’t blast hot air at your torso.

    No matter which way you spin it, there are lots of downsides to using a puck for computing, and those downsides might be a little too much for some. The worst part is, if you’re waiting around for Google, or Meta, or eventually Apple to shrink the form factor down and fit it into glasses sans puck, you might be waiting forever. There’s no guarantee that the puck is a problem that can be solved, and Meta’s upcoming Phoenix smart glasses might be further proof.

    For now, we can at least enjoy the head-to-head between Meta and Google when they’re eventually released, which could be sometime in 2027 and late 2026, respectively. May the best XR video glasses with a portable computer puck win, I guess?


    James Pero


  • A rival smart glasses company is suing Meta over its Ray-Ban products


    Meta is being sued by Solos, a rival smart glasses maker, for infringing on its patents, Bloomberg reports. Solos is seeking “multiple billions of dollars” in damages and an injunction that could prevent Meta from selling its Ray-Ban Meta smart glasses as part of the lawsuit.

Solos claims that Meta’s Ray-Ban Meta Wayfarer Gen 1 smart glasses violate multiple patents covering “core technologies in the field of smart eyewear.” While less well known than Meta and its partner EssilorLuxottica, Solos sells multiple pairs of glasses with features similar to what Meta offers. For example, the company’s AirGo A5 glasses let you control music playback and automatically translate speech into different languages, and they integrate ChatGPT for answering questions and searching the web.

Beyond the product similarities, Solos claims that Meta was able to copy its patents because Oakley (an EssilorLuxottica subsidiary) and Meta employees had insights into the company’s products and road map. Solos says that in 2015, Oakley employees were introduced to the company’s smart glasses tech, and were even given a pair of Solos glasses for testing in 2019. Solos also says that an MIT Sloan Fellow who researched the company’s products and later became a product manager at Meta brought knowledge of the company to her role. According to the logic of Solos’ lawsuit, by the time Meta and EssilorLuxottica were selling their own smart glasses, “both sides had accumulated years of direct, senior-level and increasingly detailed knowledge of Solos’ smart glasses technology.”

    Engadget has asked both Meta and EssilorLuxottica to comment on Solos’ claims. We’ll update this article if we hear back.

    While fewer people own Ray-Ban Meta smart glasses than use Instagram, Meta considers the wearable one of its few hardware success stories. The company is so convinced it can make smart glasses happen that it recently restructured its Reality Labs division to focus on AI hardware like smart glasses and hopefully build on its success.


    Ian Carlos Campbell


  • Handwriting is my new favorite way to text with the Meta Ray-Ban Display glasses


When Meta first announced its display-enabled smart glasses last year, it teased a handwriting feature that allows users to send messages by tracing letters with their hands. Now, the company is starting to roll it out, with people enrolled in its early access program getting it first.

I got a chance to try the feature at CES and it made me want to start wearing my Meta Ray-Ban Display glasses more often. When I reviewed the glasses last year, I wrote about how one of my favorite things about the neural band is that it reduced my reliance on voice commands. I’ve always felt a bit self-conscious about speaking to my glasses in public.

    Up to now, replying to messages on the display glasses has still generally required voice dictation or generic preset replies. But handwriting means that you can finally send custom messages and replies somewhat discreetly.

Sitting at a table wearing the Meta Ray-Ban Display glasses and neural band, I was able to quickly write a message just by drawing the letters on the table in front of me. It wasn’t perfect — it misread a capital “I” as an “H” — but it was surprisingly intuitive. I was able to quickly trace out a short sentence and even correct a typo (a swipe from left to right will let you add a space, while a swipe from right to left deletes the last character).

    Alongside handwriting, Meta also announced a new teleprompter feature. Copy and paste a bunch of text — it supports up to 16,000 characters (roughly a half-hour’s worth of speech) — and you can beam your text into the glasses’ display.

    If you’ve ever used a teleprompter, Meta’s version works a bit differently in that the text doesn’t automatically scroll while you speak. Instead, the text is displayed on individual cards you manually swipe through. The company told me it originally tested a scrolling version, but that in early tests, people said they preferred to be in control of when the words appeared in front of them.

Teleprompter is starting to roll out now, though Meta says it could take some time before everyone is able to access it.

The updates are among the first major additions Meta has made to its display glasses since launching them late last year and a sign that, like its other smart glasses, the company plans to keep them fresh with new features. Elsewhere at CES, the company announced some interesting new plans for the device’s neural band and that it was delaying a planned international rollout of the device.


    Karissa Bell


  • Go Beyond 20/20 With These WIRED-Tested Smart Glasses


    Other Smart Glasses We’ve Tested

    We’ve tested several more pairs of smart glasses—some good and some bad.

    Even Realities G2

    Photograph: Julian Chokkattu

    Even Realities G2 for $599: We have not fully reviewed the Even Realities G2 yet—we have spent a little time with the hardware but are awaiting a prescription model for proper testing. There are a few bugs with the software, but Even Realities’ second-gen glasses are impressive. Thin, light, and easily passable for standard glasses, these don’t have a camera or speaker; instead, they focus on extending your smartphone with the display and microphone. The projected screen is 75 percent larger than the original G1, and you can thumb the R1 smart ring (separate purchase) to navigate the interface. You can see your notifications, translate real-time conversations, see navigation instructions, pin to-do lists, and talk to the company’s Even AI assistant about anything. There’s also a teleprompter function to convince people you’re a natural at public speaking. Again, the hardware is impressive, but we need to put these glasses through their paces; stay tuned for our full review soon. —Julian Chokkattu


    Photograph: Simon Hill

RayNeo Air 3s Pro for $299: TCL-owned RayNeo offers many models, and I’ve tested several. The latest Air 3s Pro glasses boast a 201-inch virtual screen (1080p, 120 Hz, 1200 nits), but the 46-degree field of view lets it down a little. Both color vibrancy and brightness offer major upgrades over previous releases, like the Air 2s and the older TCL RayNeo Air 2 XR Glasses, and you can just about see the whole screen clearly (even after adjusting, I had to slide them down my nose a little to avoid blurring at the bottom). You will also need the lens shade to use them in brighter environments. While they are cheaper than our other virtual screen picks, I found them inferior in design, fit, and comfort. RayNeo has added some more on-device options, including spatial sound, but it didn’t work well for me, though the standard sound is fine. RayNeo’s software, required for 3 DoF, is still buggy and unpolished. This is a good virtual screen for the money, and perfectly suitable for watching movies and light gaming, but if you want more from your smart glasses, I’d pick a different pair.

    Chamelo Music Shield for $260: With a cool touch-control tint capability that enables you to adjust for the conditions, and built-in Bluetooth speakers for music, the Chamelo Music Shield could be up your sporty street. You can take these dimmable glasses from 17 to 63 percent light transmittance (almost clear to mirrored) by sliding your finger along the right temple. The sound quality is decent for glasses, but even cheap wireless earbuds sound better, and these are on the pricey side for their limited functionality.


    Photograph: Simon Hill

    Lucyd Reebok Octane for $199: Designed in partnership with Reebok for cyclists and runners, these lightweight Bluetooth sunglasses feature silver polarized lenses, good quality speakers, and 8-hour battery life. I enjoyed listening to music and podcasts while hiking, and I like that you can hear the world around you. The sound leakage isn’t too bad, so you won’t bother the people you pass. They also have physical controls that are much easier to use than touch controls, even when your hands are wet (they’re water-resistant, too). You can take calls, get directions, and ask your preferred AI assistant questions. Lucyd has been making Bluetooth sunglasses for several years now and offers a wide range of different styles. We also tried the Lucyd 2.0 Bluetooth Sunglasses a couple of years ago.

    Rokid Max 2 Glasses for $429: The Spider-Man style lenses give these comfortable smart glasses a bit of character, though they won’t be to everyone’s taste. They project a 215-inch screen (1080p, 120 Hz, 600 nits, 50-degree FoV) and boast diopter dials for focus adjustments, but I struggled to eliminate blurring around the edges, and instead of stylish electrochromic dimming, there’s a clip-on plastic blackout shield. I also tried the Rokid Station 2, which adds an Android TV interface to access entertainment apps, but also a trackpad and an air mouse for easier control. The original Rokid Station was a more basic portable Android TV.

    Don’t Bother

    Here’s the eyewear that fell short.

    Halliday Glasses for $499: While these could almost pass for chunky regular glasses, with a clever ring controller and a tiny unobtrusive display, I can confidently say they are not the future of smart glasses. After spending several uncomfortable hours trying to adjust the display to be readable, all I got was a headache. The ring seemed like a smart idea, but it’s big, ugly, plastic, laggy, and frustrating to use. The eavesdropping AI is slow, and squinting up to try and see the screeds of useless text it churns out is actually painful. The sound quality and battery life are equally awful.

    Amazon Echo Frames for $300: The Amazon Echo Frames (3/10, WIRED Review) are a bit old now, but you can still purchase them. Too bad they don’t do much. They work as sunglasses, filter blue light, and are IPX4-rated. Tech-wise, they have a speaker and microphone in each temple, and you can use them to query or command Alexa, as you would with a smart speaker, but there are no cameras here, making them far less capable than the similarly priced Ray-Ban Meta glasses.


    Photograph: Simon Hill

Asus AirVision M1 for $399: I was excited to see Asus launch smart glasses, but the lack of fanfare was a red flag. My first impressions of the lightweight design were promising, and the M1 offers up to a 100-inch virtual display and impressive 1,100 nits brightness. Designed to plug into your phone, laptop, PC, or handheld gaming device, like the ROG Ally, via USB-C, the M1 also features built-in speakers and a microphone. Sadly, the refresh rate maxes out at 72 Hz and is limited to 60 Hz unless you employ the AirVision software, which also enables you to select different modes (working, gaming, infinity), tweak screen position, and set interpupillary distance (IPD). I found the in-focus sweet spot was small, and most of my virtual screen was blurry, no matter how I tweaked the settings, making them uncomfortable to use, especially for work. There’s also a basic plastic shield to block light, rather than electrochromic dimming, and the speaker quality is decidedly average, leaving me puzzled about why the price is so high.

    Solos AirGo Vision for $299: With a built-in AI assistant powered by ChatGPT, the Solos AirGo Vision adds a camera on top of the Bluetooth-connected speakers in the rest of its range. Grant it unfettered access to your location and photo library, and it can describe what you are seeing. The most obvious use cases are translation and navigation, though I’m not convinced about the accuracy of its suggestions. The design is interesting, with chunky temples housing the smarts and interchangeable frames. There’s no virtual screen or HUD, but you can get prescription lenses, and they look relatively normal. Sadly, the photo and audio quality are horrible, and the touch controls are frustratingly finicky. The app is also power hungry and demands too many permissions. The Ray-Ban Meta glasses do the same things better.




    Simon Hill


  • Black Friday deal: Get $50 off the Xreal One Pro smart glasses


    If you’ve been thinking about jumping on the smart glasses bandwagon, Black Friday deals could help you do so for less. Xreal has discounted its One Pro smart glasses by $50 for Black Friday, bringing them down to $599. The sale applies to both sizes.

    These glasses are the real deal. We praised them in our official review, saying they offer similar functionality to the Apple Vision Pro, but at a much friendlier price point. Wearing these glasses allows access to a massive 222-inch virtual display that can be used for just about anything. The glasses connect to laptops, gaming consoles and smartphones, among other gadgets.

    Xreal

    The 1080p Micro-OLED screens are surprisingly bright and sharp, which makes this device great for both work and consuming content. The frames even darken to give the illusion of wearing sunglasses.

    The glasses are light and comfortable, especially when compared to the Vision Pro. However, the comfort does come at a price. These oversized glasses are not as immersive as Apple’s product, because they don’t completely block out light and cover the entire face. They offer a 57-degree field of view. This is squarely an augmented reality product and not a virtual reality product.

    The company has also discounted its Xreal One AR glasses to $399, which is a discount of around $100. These offer smaller virtual screens than the Pro, with a 50-degree field of view.



    Lawrence Bonk


  • These Anti-Meta Smart Glasses Are Controlled by a Health-Tracking Smart Ring


    If you hate the idea of smart glasses with cameras in them, I have some good news: Even Realities still has your back. The new Even G2 smart glasses retain the company’s camera-less and speaker-less design while adding a whole new piece of hardware to the party—a health-tracking smart ring for input.

    The $599 Even G2 smart glasses and $249 Even R1 smart ring are available starting today and mark an evolution of Even Realities’ previous Even G1 glasses, which also notably do not have cameras or speakers. The choice not to include those features may seem odd, given the existence of smart glasses like the Meta Ray-Ban Display, but Even Realities says its decision is intentional.

    For one, not having cameras makes its smart glasses a lot more appealing to those who don’t like the privacy implications of walking around with cameras discreetly mounted on their face. Secondly, it also vastly decreases the weight of the Even G2, making them more akin to just wearing regular glasses.

    © Even Realities

The lack of speakers and cameras also makes the Even G2 (and the Even G1 before it) more about the screen than anything, and on that front, Even Realities says it’s made some improvements over the last generation. In the Even G2, it’s introducing new micro LED projectors that power a display in each lens. The projectors, according to the company, bring a few improvements over the last generation, including a 75% larger image that’s 50% sharper. Having used the Even G2 briefly for myself, I can attest to the display (which is still a monochrome green) being sharper than I expected. If you wear prescription lenses, the Even G2 supports those as well.

    As far as features, the Even G2 has similar functionality to other existing smart glasses on the market. Using a companion app for iOS and Android, the smart glasses can receive notifications for things like email and text messages; they have a navigation feature for turn-by-turn directions, a translation feature, and a live captions feature. And like the Rokid Glasses I tested earlier this year, they have a teleprompter feature that can show digital readouts of your presentation, following along with you as you read the words. Unlike AI glasses from Meta or the Ray-Ban Display, they do not have computer vision capabilities due to the lack of cameras.

    Even Realities is advertising a two-day battery life for its Even G2, and the included case has enough charge for a week.

    Even Realities R1
    © Even Realities

    Probably the biggest addition this time around, however, is the Even R1, a smart ring that serves a few different purposes. The main purpose is, like Meta’s Neural Band, for controlling the Even G2. With a touch-sensitive surface on the outside of the ring (which is made of zirconia ceramic), you can scroll through windows and tap to select. On top of that, the Even R1 also doubles as a health tracker, using an LED beam inside the ring to measure your heart rate and a temperature sensor for a thermometer feature. There’s also an accelerometer for counting steps, measuring burned calories, and monitoring your distance travelled. The Even R1 has a four-day battery life and is charged with an included magnetic charger. It’s admittedly a lot to cram into one device, which is why Even Realities is selling the Even R1 separately for $249.

What I’m most curious about isn’t the health tracking stuff; it’s whether Even Realities can deliver on the promise of a smart ring that lets you traverse the smart glasses UI smoothly. Having used other smart rings to control smart glasses in the past, the results can be… unrefined, as they were with the Inmo Air 3. While $249 is a lot of money to spend on top of the $599 smart glasses, Even Realities says it’s offering an “introductory period” where those who purchase the Even G2 can get the Even R1 for 50% off. Is that still a lot of money to spend on a pair of smart glasses? Yes, it is. But in a world where Meta’s Ray-Ban Display is being sold for $800, that’s just the way things are going right now. You can purchase the Even G2 and Even R1 starting now from Even Realities’ website.


    James Pero


  • A New Pair of Anti-Meta Smart Glasses Are on the Way


    If you’re icked out by cameras in smart glasses, you might be in for some good news. Even Realities, which makes the G1 smart glasses, is readying to drop a new pair next week. And if they’re anything like that pair, they’ll be a lot more privacy-friendly than competitors from Meta and elsewhere.

Even Realities doesn’t give much in the way of detail about the Nov. 12 launch, but a short teaser video does show a few things. Obviously, like the G1, the G2 will have a screen, which is monochrome as opposed to more premium displays like the full-color one on the Meta Ray-Ban Display. That screen, if the video is anything to go off of, will show a few things for sure, including calendar notifications and information surfaced when you use Even AI, the smart glasses’ voice assistant. None of this should come as a surprise, since those are capabilities carried over from the G1, but it’s good to see them return.

    The G1, pictured here, are light and feel like regular glasses. © Even Realities

    Even Realities does tease that “a new extraordinary power is almost ready to be unleashed,” which might have some people up in arms, expecting the company to add cameras and speakers just like other competitors from Meta, but I wouldn’t count on that. For Even Realities, the whole appeal is that its glasses don’t have those things, and if the company wanted to, it could have put them in from the get-go. The problem when you start adding those things is twofold.

    First, there are the implications for privacy. While some might not care about people walking around with cameras strapped to their faces, plenty more people do, and not having those kinds of things in the G1 sets the smart glasses apart from the rest of the field. Then, there’s the matter of bulk. A major part of Even Realities’ pitch is that its smart glasses feel more like regular glasses in that they’re lightweight. Adding cameras and speakers to the equation compromises that lightness, and the moment they’re there, you’re talking about a completely different kind of gadget.

Personally, even if I use the camera on the Ray-Ban Meta glasses sometimes, I don’t find that it’s critical, and I’d be bummed to see Even Realities take the camera route so soon—diversity, in this case, feels like the right move for smart glasses. I could always be wrong, but adding a camera to the G2 feels like a long shot, though it does raise the question: what is the new extraordinary power? For that, I guess we’ll have to wait about a week to find out for sure.


    James Pero


  • Meta’s smart glasses have a new shortcut to call and text without saying ‘hey Meta’


    One of the things that’s long irked me about Meta’s smart glasses is how often you have to say “hey Meta.” Even though the company’s AI assistant has gotten significantly more capable, there’s something a little cringey about using its voice commands in public spaces.

    Now, the company has rolled out an update that makes you a little less dependent on voice commands for all its glasses. A new “quick connect” feature allows you to create a one-touch shortcut for “frequently used communication actions,” like making a phone call or sending a text to a specific contact. The update is out now as part of the 19.2 software update.

    The idea of “quick connect” is similar to the functionality of the “action button” on the Oakley Meta Vanguard glasses, though it’s more limited in scope. The feature allows you to designate a specific contact that you can message, text or call just by holding down on the touchpad on the right side of the glasses. The Meta AI app will even let you choose your preferred method of reaching them, whether it’s WhatsApp, Instagram, Messenger or your phone’s native calling or messaging app. The same one-touch press and hold shortcut can also be used to send photos and videos shot on the glasses right after you take them (also via your chosen app).

    You can designate a specific contact and on which app you want to be able to contact them.

    During its Connect event, Meta previewed some updates for third-party apps that appear to allow developers to set their own wake words for the glasses when using their services. So there is some hope that eventually the company will offer a bit more flexibility with its voice commands. Even so, it’s unlikely you’ll be able to get away with never having to say “hey Meta,” but the quick connect update is a handy way to make messaging a bit more discreet.


  • How Wearable Tech Could Become ‘Big Brother’ in the Workplace


Wearable tech continues to be one of the Next Big Things in technology innovation, thanks to AR and VR headsets, which many experts expect to replace the humble smartphone, as well as other AI-powered devices. Meanwhile, wearables like fitness monitors and smartwatches are already part of some workplaces as a useful tool for monitoring employees — providing data on everything from performance to employee well-being. But this sometimes controversial data collection carries some risks, as a new report highlights.

A team of management researchers from the U.K.’s University of Surrey recently did a meta-analysis of previous studies on the benefits and risks of using wearable worker-monitoring tech. They found that most workplaces that have deployed wearable tech are using them to track employees’ well-being and health data. The devices were helpful for accurately tracking “sleep quality, stress markers, physical activity, and even team dynamics,” science news site Phys.org reported. That aligns with some of the ways devices like Fitbits and Apple Watches are promoted.

But the way some businesses roll out these devices is problematic, the researchers said, since many of these efforts aren’t fully transparent and leave employees guessing about what personal data is being collected by their companies and why it’s being gathered. Meanwhile, many businesses have inconsistent policies for analyzing collected employee data, and they may even store it insecurely. This behavior risks making workers feel insecure and suffer the effects of “invasive surveillance,” Phys.org says. That level of explicit oversight can harm workplace culture.

    When used properly, these wearables, many of which are commercial off-the-shelf products, can warn HR departments in real time about potential problems. One good example is their potential to spot “rising stress before burnout or to safety hazards before accidents,” wrote Dr. Sebastiano Massaro, a neuroscience lecturer and co-author of the study.

But unless companies have “robust methodological and ethical guardrails,” there’s a risk of blurring the lines between “science and pseudoscience, between real support and dangerous surveillance,” Massaro worries. In their best uses, wearables can “help create safer, healthier, and more responsive and productive workplaces,” he thinks. Done badly, they could “normalize unnecessary monitoring and paradoxically increase workplace stress rather than reduce it.”

    Recently, Amazon revealed it was developing smart glasses (a little like Meta’s recently unveiled AR glasses) the company said will help its delivery drivers “identify hazards, seamlessly navigate to customers’ doorsteps, and improve customer deliveries.” The goggles sound like powerful tech, melding “AI-powered sensing capabilities and computer vision” with cameras and a display so a driver can see “everything from navigation details to hazards to delivery tasks,” as well as spotting the right packages in their truck at a delivery address. It’s plausible that these devices could speed up deliveries—a form of 21st century optimization that’s akin to a business efficiency decision that means UPS delivery trucks almost never turn left.

But Amazon’s product announcement immediately triggered ethical and privacy worries, both about the drivers’ well-being and about data collected outside the trucks, when drivers are at a delivery location, for example. Amazon, after all, has repeatedly been in the news over the way it surveils its workforce, including landing a 32-million-euro fine ($36 million) in France in 2024 for doing so excessively.

    How can you best apply this research for your own company?

    Offering your workers wearable tech can be presented positively — the devices have a certain social cachet, and if they help workers monitor their health and fitness for their own purposes (as well as for more workplace-directed reasons, like monitoring stress levels) then they can be seen as an attractive workplace perk. The data they collect can, if used responsibly, also help you avoid complex health issues like burnout.

    But if you do deploy tech like this, it’s important to be open and transparent about what data is being collected, and what for, and also to be rigorous in protecting sensitive employee medical data. Otherwise you risk harming employee well-being and your company’s reputation. 



    Kit Eaton


  • I Can’t Help Feeling Like a Creep Wearing Meta’s New Gen 2 Glasses

    As I wore them on one of my walks through San Francisco, on the shore of Ocean Beach, I came upon a dolphin-like fish that had washed up on the sand. Though I got my camera glasses close enough to the thing that I could smell it, Meta’s AI assistant could not tell me what kind of animal it was. It correctly identified that it was very dead and that I should not touch it. It was then able to direct me to a number to call for city animal control services.

    Beyond instances like that, I tend to avoid the AI voice interaction because I haven’t gotten to the point where it feels natural. Getting it to search something is usually very quick, but doing so requires you to stop dead in your tracks, stare directly at another person’s purse or something, and say out loud, “Hey Meta. HEY META. Is this bag Gucci?”

    The glasses’ AI features are both their best asset and their biggest weakness. Features like live language translation and whispered map directions are very helpful. But if you’ve spent any time curating the AI slop out of your Facebook feed lately, you’ll know that Meta just can’t help but pack a firehose blast of AI features into everything it does.

    The software features are funneled through the same app as Meta’s AI services. That’s where pictures and videos go by default, and sometimes you have to go into the app to import the files from the glasses. There’s a very clear problem with using the app: bad vibes.

    The Vibes Are Off

    When you go into the Meta AI app to look at the pictures or videos you’ve taken, the first thing you’ll see is Meta’s terrible new Vibes service. It’s a constant barrage of AI slop videos that Meta just one day foisted upon its app users. Vibes is akin to OpenAI’s dubious Sora app, but somehow even worse in quality.

    Boone Ashworth

  • Meta Ray-Ban Display Review: Is This the Future We Really Want?

    My first reaction when I put on Meta’s $800 Ray-Ban Display was excitement. As frivolous as it may seem to have yet another screen in your life, there’s something that happens when you basically glue a display to your eyeball. You transform from a person with glasses to, like, a spy, or a cyborg—a cyborg spy! Yeah, that’s it. Ghost in the Shell fans will get it.

    When I initially donned these smart glasses at Meta Connect, I smiled because this was what I felt had been missing from my previous Ray-Ban smart glasses experience. A big, bright, full-color screen—the one thing people always wanted to know about when I showed them my deflatingly screenless Ray-Ban Meta AI glasses.

    That little dose of magic is even further heightened by Meta’s Neural Band, a small wristband that, when slipped over your hand, reads the electrical signals in your arm, allowing you to navigate the Meta Ray-Ban Display with a series of finger pinches and thumb swipes.

    Meta Ray-Ban Display

    Meta’s Ray-Ban Display is impressive hardware that’s limited by its lack of apps.

    Pros:

    • Display is impressive
    • Neural Band feels like magic
    • Navigation and notifications can be useful
    • Battery life holds up

    Cons:

    • Not enough apps
    • Camera isn’t upgraded
    • Neural Band can be uncomfortable over long periods
    • Probably a privacy nightmare
    • Existentially exhausting

    The only other experience I can liken this combo to is the first time I used Apple’s Vision Pro, which creates a similar kind of magic, sans wrist-worn wearable. In the Vision Pro and Meta’s Ray-Ban Display, you’re using technology the same way a wizard casts a spell, waving your hand to make the computer do the things computers do, which, if you’ve watched as much sci-fi and fantasy as I have, is pretty f*cking rad.

    Weirdly, I’m reminded of my grandma (my nonna, actually; sorry for being Italian), when I first showed her how to use a computer mouse on my family’s PC when I was a kid. You move this little plastic thing on a desk, and it moves something on a screen! Groundbreaking! It seemed silly to me at the time, but now, as I get older… I get it. Inputs and screens are exciting, no matter how jaded we get with the experience of using them.

    So, there it is. Excitement; that was my first reaction to the Meta Ray-Ban Display. My first reaction. It’s not, however, my last.

    A see change

    © Raymond Wong / Gizmodo

    If you’re like most people, the first thing you’re probably dying to know about the Meta Ray-Ban Display is how they actually look when they’re on your face.

    The titular display part of the Meta Ray-Ban Display is a 90Hz (30Hz minimum) 600 x 600-pixel full-color screen with a 20-degree field of view in the bottom-right corner of the right lens. The good news about having a screen in that area specifically is that it doesn’t obstruct your vision when you’re walking around and doing stuff. The bad part? Well, every time you look at it, you’re looking down and away as though you’re worried a snake might slither in and lunge at you. It’s not what I would call a natural resting face (let’s call it resting Meta face), but let’s be honest, there is nothing natural about walking around with a screen strapped to your eyeball.

    The screen inside the Meta Ray-Ban Display is also very bright, with a max brightness of 5,000 nits. This might not seem like a stat you want to pay attention to, but believe me, in a pair of smart glasses, it’s critical. I’ve used less bright screens in other pairs, and they’re hard to see outside. And if you’re spending $800 (before tax) on a pair of smart glasses, you’d better be able to use them while you’re walking around in the real world.

    Meta Ray Ban Display Review 01
    © Raymond Wong / Gizmodo

    In terms of style, you should know that all Meta Ray-Ban Display models have transition lenses by default. That may seem like a bummer if transitions aren’t your thing, but it makes sense, since the screen needs to be effective indoors and out, and the only way to do that is to give it contrast in direct sun. Conversely, the lenses stay see-through indoors so you don’t go stepping on your cat or something. I find the screen to be very visible even in direct sun, probably because of the added contrast from the transition lenses. Also, you can buy these with prescription lenses, so that’s good news for those reading this from behind a pair of regular glasses.

    But just because the Meta Ray-Ban Display are bright does not mean the screen is perfectly sharp. I find the screen sharp enough to satisfy the dream I had in my head of what a pair of display smart glasses from Meta would look like, but others might be less enthused. Some people also seem to see the screen differently than I do. One colleague in my office described the screen as “shaky,” though I wouldn’t describe it that way at all. Others said they struggled to see it or found it disorienting.

    Meta Ray Ban Display Review 23
    It’s hard to get a shot of the screen inside the Meta Ray-Ban Display, but in the right lighting, you can do it. © Raymond Wong / Gizmodo

    One thing I definitely found disorienting is that the lenses in the Meta Ray-Ban Display are actually mirrored. This, I assume, is part of the construction of the “geometric waveguides,” which is what the display tech inside the smart glasses is called. Geometric waveguides are special because they use mirrors to cut down on visual artifacts, reflecting light instead of splitting it the way the diffractive waveguides in other smart glasses do. They also make the screen hard to see from the other side, which, in my experience, holds true. People probably aren’t going to know your screen is on unless you’re in a dark area with the brightness turned up.

    The benefits of using a geometric waveguide are clear, but it can also be distracting at times, since you can see behind you if you look to the right, or even sometimes when you’re looking straight ahead. I do feel like my visibility actually decreases when I’m wearing the Meta Ray-Ban Display, probably more than when I wear other smart glasses with a screen in them.

    That being said, I find the screen to be up to snuff, if not the highest resolution in the world, but I highly recommend you go see for yourself before buying a pair. Luckily, Meta is requiring people to get sized for wristbands in-store anyway, but hey, maybe you’re considering buying them aftermarket! And, if you are, I would suggest… not. The lesson of the screen, if there is one, is that your experience may vary. It is surprisingly bright, if not always sharp or hi-res. Ultimately, the screen is just a part of the picture; it’s also about what you can do on said screen, and on that front, the possibilities are… not endless.

    So… now what?

    As cool as controlling a UI by waving your fingers in the air is, that thrill (for most at least) probably won’t last forever, and when it fades, you’re going to wonder to yourself, “Okay, so what now?” In Meta’s case, the “what now” part consists of a few things, and I really mean a few.

    You’ve got bread-and-butter phone-type stuff like messaging, which encompasses Meta’s first-party messaging apps like WhatsApp and Messenger on Facebook and Instagram. It also, thank god, works with both iOS and Android, allowing you to both send and receive messages from your phone. In the smart glasses display, all of those notifications can be shown as they roll in, popping up as a bubble. You can also opt, via the Meta AI app (where you’ll have to connect Instagram, WhatsApp, and your phone), to have messages read out loud through the built-in speakers. Personally, I find that feature a little annoying. You have a screen now; you might as well use it.

    Meta Ray Ban Display Review 18
    © Raymond Wong / Gizmodo

    As you might imagine, the Meta Ray-Ban Display are more integrated with Instagram and WhatsApp than they are with your phone. For example, in the Instagram app, you can even watch Reels that are sent to you via DMs, which is a nice touch if you want to catch up with that one friend who spams you with memes while you’re commuting on the subway. Sending pictures via your phone, however, is a little less clean. In iOS, pictures show up as a link that the person has to tap in order to see the picture. It’s a small hurdle, but one that creates just a little bit more friction than a first-party gadget would.

    Mostly, though, the friction isn’t a dealbreaker. Messaging someone from the Meta Ray-Ban Display connected to your phone (in my case iOS) is pretty simple, though you still have to use the voice assistant on the smart glasses to do so. I tried texting my colleague, Ray Wong, for example, by saying “Text Ray,” and fortunately, Meta AI asked me, “Which Ray?” After that, I was able to use my thumb to select the correct one and then pinch “dictate” to say my message out loud, which in this case was, “I’m texting you from my stupid glasses.” I was even able to respond with a thumbs-up emoji after Ray texted back, “You look like a dork.”

    There’s nothing revolutionary about being able to send and receive messages with smart glasses, but I will say being able to see notifications as they roll in is a novel experience, and the ease with which you can respond feels more refined than you’d expect from a category of device that feels like it’s only existed for five minutes.

    Outside of messaging, there’s also video calling, which works about the same as it did on the Ray-Ban Meta Gen 1 and 2 AI smart glasses, though with a new video calling feature via WhatsApp, Messenger, and Instagram that shows your POV from the Meta Ray-Ban Display’s camera. This obviously is not an ideal way to video chat, but if you’re trying to show someone something, it could be useful. I’m not really sure how the Meta Ray-Ban Display would ever overcome the fact that there’s not a camera pointed at your face for a more natural video calling experience, either. A Meta-made version of Apple’s spatial Personas in the Vision Pro, maybe? I certainly hope they’d be more high-res than its Horizon Worlds avatars.

    I tried video calling my partner through Instagram to test the feature out, and the results were… low-res. To rule out Instagram itself, I compared the quality against the same call from my iPhone 17, and the camera resolution on the Meta Ray-Ban Display was indeed the issue, since the video from my iPhone looked much clearer. For lots of reasons, I don’t think this is a feature I’d be using much.

    In addition to calling and messaging, there’s also navigation, which Meta says is still in beta. I used the Meta Ray-Ban Display’s navigation feature to do some walking in New York, and it was decent. I even used dictation to enter the address I was headed to, and it worked on a busy sidewalk in Times Square. Having turn-by-turn navigation glued to your eyeball like that isn’t always going to be useful, but in certain situations, it can be, and walking through Times Square definitely felt like one of them. Sure, I could have pulled out my phone just as easily, but there was something more freeing about being able to just glance down at the map on my face to make sure I was headed the right way. It also freed up a hand to double-check the address on my phone, which felt a little dystopian in some ways: screen-maxing like that.

    Meta Ray Ban Display Review 16
    © Raymond Wong / Gizmodo

    Meta does offer a zoomed-in and zoomed-out view inside the UI, which is good if you’re moving fast and need to see ahead, which could be useful on a bike. Right now, when I’m using a bike share and I want to make sure I’m going the right way, I usually pull over to the curb and check my phone, which is not ideal. Meta even went as far as to integrate some non-essential map features into navigation on the Meta Ray-Ban Display, too, including tabs that help you search for cafes, restaurants, parks, and “attractions.” Meta is pulling that information, according to a helpful information tab in the maps app, from sources like OpenStreetMap, which offers publicly accessible map data.
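    For the curious, here’s roughly what pulling nearby points of interest from OpenStreetMap looks like in practice. This is a minimal, hypothetical sketch using the public Overpass API’s query language; the endpoint, coordinates, and radius are illustrative, and it says nothing about how Meta’s software is actually built.

```python
# A hedged sketch of how an app might pull nearby-cafe data from
# OpenStreetMap via the public Overpass API. The query syntax below is
# standard Overpass QL; the coordinates and radius are illustrative.
def build_overpass_query(lat: float, lon: float, radius_m: int = 500) -> str:
    """Build an Overpass QL query for cafes within radius_m of (lat, lon)."""
    return (
        "[out:json];"
        f'node["amenity"="cafe"](around:{radius_m},{lat},{lon});'
        "out;"
    )

# Roughly Times Square. The resulting string would be POSTed to an Overpass
# endpoint such as https://overpass-api.de/api/interpreter (not done here,
# to keep the sketch offline).
query = build_overpass_query(40.758, -73.985)
```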

    Another nice-to-have feature you might be interested in is live captions, along with live translation, which are exactly what they sound like. Live captions uses the microphones in the smart glasses to hear your surroundings and then captions them in real time on the display, while translation does the same thing while converting one language to another. I tested both: live captions works fairly well, while the latter… well, you’ll find out.

    Live captions kept up with a fairly fast-paced YouTube video, and while it didn’t nail all of the words 100 percent, the broad strokes were all in place. If you’re hard of hearing or have an impairment, I can see live captions being useful, provided you’re in an environment where the microphones can pick up sound okay. One impressive thing is the Meta Ray-Ban Display’s ability to know when the wearer is talking and leave that speech out of the captions. In real conversations, however, this is sometimes a downside: two speakers often overlap, and Meta AI can miss what your conversation partner said in its effort not to capture your voice in the caption.

    Similarly, live translation works, but with some variability. I tested live translation in a conversation with my partner, who is bilingual (she speaks English and Spanish), and like live captions, it superimposes the translated text of your speaking partner onto the screen in real time. The only problem is, when I read back the translated text to confirm that it was correct, my partner frequently reported it being slightly off. It’s not that the meaning was wrong, per se (though sometimes it was), but the translated text, when ported over to English, was translated but not interpreted, if you catch my meaning. That’s to say the words were mostly correct, but it wasn’t rephrased to fit English grammar, making reading and understanding the conversation touch-and-go at points.

    meta ai app
    A screenshot from the Meta AI app showing a garbled translation. © James Pero / Gizmodo

    These hiccups are bound to happen in any translation app out there, and Meta AI is no different, but I wouldn’t say I was wowed by its acumen. That being said, I can see this feature coming in handy while traveling, if only because glancing at the translation on your smart glasses screen is marginally more natural than looking down at a phone. Google Translate is still more refined, though. And if you do need to look at a phone, all of the translated text appears in the Meta AI app, so it can be easily referenced.

    One other minor gripe with live translation is that each time you want to switch languages, you have to go into the Meta AI app and download that language onto the smart glasses, which takes a couple of minutes (and that’s on my home Wi-Fi, not LTE). It’s not a huge deal, and you probably won’t be switching a ton between languages, but if you’re visiting a country where multiple languages are frequently spoken, it could get kind of annoying.

    Meta Ray Ban Display Review 06
    There are still touch controls on the arm if you need them. @ Raymond Wong / Gizmodo

    There are other apps that I’ll get into later (camera and photos), but right now, those are your main features. And when I say “main features,” what I mean is they’re the only features you’re going to get. There is no app store here, which means if you were excited to doomscroll through TikTok on the Meta Ray-Ban Display, you’re out of luck. There is no Gmail. There is no Slack. And it’s not just third-party apps; there’s no proper Instagram or Facebook app, either. This, to me, seems like an odd choice, given that other smart glasses makers with far fewer resources manage to do a lot more. Even Inmo’s Air 3, which are otherwise not a good pair of smart glasses on a hardware level, have the Google Play Store and let you download pretty much any app and use it in 2D.

    It’s deflating to an extent to spend $800 on a future-forward device and then find that the future can’t run even a pared-down version of Instagram. I can only assume that will change at some point, but I don’t have a crystal ball, and can only review the Meta Ray-Ban Display as it is right now. And what it is right now is a pair of smart glasses with an impressive display that you can’t do a ton with.

    Time for (neural) band practice

    I think what gets lost in the hype of a bright and shiny screen inside Meta’s Ray-Ban Display is the screen’s companion, the Neural Band. One of the biggest hurdles to developing a functional pair of smart glasses with a screen in them is figuring out how exactly you should be controlling the UI that the screen displays. Meta’s solution is a pretty magical wearable called the Neural Band, which uses electromyography (EMG) sensors to detect electrical activity in your muscles and nerves and then translates that activity into inputs.

    To control the Meta Ray-Ban Display, you use a series of tiny gestures. For scrolling, make a fist and then move your thumb over the top to navigate left, right, up, or down, letting you select apps, scroll through lists, and so on. To select, you pinch your thumb and index finger together once. Going back is a single pinch of your middle finger and thumb, while a double pinch of those two digits wakes the screen or puts it to sleep, and a long middle-finger pinch brings up a quick menu for going home and opening apps. There’s also a double-tap gesture with your thumb on your fist that activates Meta AI, if you don’t want to use the wake command, “Hey Meta.” Every time the Neural Band registers an input, it gives you a nice little haptic buzz to tell you it’s working.
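
    For reference, that gesture vocabulary boils down to a small mapping from gesture to action. What follows is just an illustrative lookup-table sketch of the controls as I’ve described them; the gesture names and the handler are mine, not anything from Meta’s actual software.

```python
# Illustrative only: gesture names and action strings are paraphrased from
# the review. This models the Neural Band's EMG gesture vocabulary as a
# simple lookup table, not Meta's real API.
GESTURE_ACTIONS = {
    "thumb_swipe": "scroll / navigate (left, right, up, down)",
    "index_pinch": "select",
    "middle_pinch": "back",
    "middle_double_pinch": "wake or sleep the display",
    "middle_long_pinch": "open quick menu (home, apps)",
    "thumb_double_tap_on_fist": "activate Meta AI",
}

def handle_gesture(gesture: str) -> str:
    """Return the UI action for a recognized gesture; ignore anything else."""
    return GESTURE_ACTIONS.get(gesture, "no-op (unrecognized gesture)")
```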

    The Neural Band is surprisingly quick at reading your inputs, though you’ll most likely have to repeat an input now and then. Accuracy also has a lot to do with whether the band is strapped on tight enough, so if you’re having trouble, try making it a little tighter. Also, if you’re wearing the Neural Band all day, you can expect some accidental inputs, though I didn’t find this to be a huge issue. A couple of times here and there while I was typing, the Neural Band registered my finger motions as an input, but for the most part, it was reliable and steady.

    The band itself doesn’t have much to it; it’s a piece of cloth with plastic sensors inside that you slip on and strap to your wrist just like you would a fitness band from Whoop or Polar. While it doesn’t look like much on the surface, it’s made from a fairly high-tech material called Vectran, which is soft but strong and was used in the landing airbags for NASA’s Mars rovers. The band has 18 hours of battery life, a magnetic charging cable, and an IPX7 rating, which means it can be submerged in 1 meter of water for up to 30 minutes without ingress. (Full disclosure: I did not test the band’s water resistance for fear of damaging it, so dunk it at your own risk.)

    Meta Ray Ban Display
    © Raymond Wong / Gizmodo

    To me, it’s all as cool as it sounds. Being able to control smart glasses this way is novel, and companies are still figuring out the best input method (smart rings are also hot right now), but this is my favorite so far in terms of user experience. With that said, it’s also a wristband. I previously wrote that making people put on a wristband to use their smart glasses is a big ask, and I still mostly stick by that statement. Whether the process is difficult or not, wearing the band for long periods can get a little irksome.

    The Neural Band leaves a mark on your arm afterward from the sensors pressing into your flesh, and sometimes it has to be strapped on fairly tight to work properly, which isn’t ideal for those of us who can get the ick from wearing something on our wrist all day (this is why I don’t wear watches of any kind). In a world of fitness bands and smartwatches, the Neural Band looks banal enough, but I am unconvinced that this is the solution to navigating glasses UI. My guess? This band is going straight in the trash when smart glasses makers figure out how to fit hand and eye tracking into the frames themselves. Oh, and speaking of the trash, don’t accidentally throw your Neural Band out or lose it; that’ll cost you $199 for a replacement.

    Meta Ray Ban Display neural band
    The Neural Band is gonna leave a mark. @Raymond Wong / Gizmodo

    Despite those downsides, the Neural Band is impressive overall, but I should warn you that misusing it can have dire consequences. My first word of advice: make sure you put the band on correctly. While testing the Meta Ray-Ban Display, I accidentally put the band on backwards, since knowing which way is which can be confusing at first (just make sure the button on the top sensor is facing you). The result, obviously, was incorrect inputs, which, no big deal, right? Wrong. Don’t forget, the Meta Ray-Ban Display is connected to your phone, and because of that, I somehow ended up taking a picture and sending it to a friend of mine by accident. Luckily, everything was PG. Crisis averted, but it could have made for a very awkward situation.

    If that idea sends a shiver down your spine, my word of advice would be to practice with your Neural Band for a few days before you connect your phone, so by the time you’re ready to start zooming around, you can do so with a reasonable degree of certainty that you won’t accidentally give your Aunt Debra an eyeful.

    How does the future feel?

    One thing you should be very aware of when you’re choosing a pair of smart glasses is how they look, but also how they feel on your face. Again, these are $800 smart glasses, and to get your money’s worth, you’re probably going to want to wear them for fairly long periods of time. And if you’re wearing them for long periods of time, they need to not destroy your nose. Mostly, the Meta Ray-Ban Display were comfortable during longer periods of use, though they are objectively heavier than non-display models. The Meta Ray-Ban Display are 69g (70g for the larger size) compared to the Ray-Ban Meta Gen 2, which are 52g.

    The Meta Ray-Ban Display aren’t just heavier, they’re also thicker. You can see that the frames are a great deal thicker than Meta’s screenless smart glasses. Though, thanks to the acumen of EssilorLuxottica in designing the Meta Ray-Ban Display, I think they’re stylish on most faces. Don’t get me wrong, they make most people look like Nerds (capital N intentional) or Brooklyn hipsters from 2004, but compared to other smart glasses with screens, they’re stylish and comfortable for the most part.

    Meta Ray Ban Display Review 05
    © Raymond Wong / Gizmodo

    The style choices are limited right now, which is a bit of a bummer since the Ray-Ban Meta AI glasses (the ones without a screen) come in lots of different styles. Right now, the Meta Ray-Ban Display only come in Black or Sand, and both of those finishes are shiny instead of matte, which is not my preference when it comes to Ray-Bans, both because I don’t love the look and because shiny finishes attract fingerprints much more easily.

    One thing that I love this time around is the case, which is black and has the same pleather material as the ones for the Ray-Ban Meta AI Gen 1 and 2, but can be collapsed to lie flat, which is great for when you want to slip it into your pocket. That’s a bigger perk than you might think: these are very expensive smart glasses, and putting them back in the case whenever you take them off is the easiest way to take care of them. I wouldn’t plan on repairs for scratches or damage being cheap or easy, if you can even get them repaired at all.

    As long as we’re talking about wearing the Meta Ray-Ban Display for long periods, we should talk about battery life, too. There’s a new battery in all of Meta’s 2025 smart glasses (the same battery in the Oakley Meta Vanguard glasses and the Ray-Ban Meta Gen 2), and it pays off the same way it does in the non-display versions. After a full day’s worth of intermittent use, including about an hour of audio playback, messaging, navigation, and more, starting at about 10:30 a.m., the Meta Ray-Ban Display (which started with a full charge) were at about 18% by the time I got home at 8 p.m.
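
    For a rough sense of what that drain rate implies, here’s the back-of-envelope math on my numbers from that day. This is a single test run with my particular mix of use, so treat the result as an estimate, not a spec:

```python
# Back-of-envelope estimate from the review's numbers: full charge at
# 10:30 a.m., 18% remaining at 8:00 p.m. Purely illustrative arithmetic.
hours_elapsed = 9.5          # 10:30 a.m. -> 8:00 p.m.
battery_used = 1.00 - 0.18   # 82% of the battery consumed
drain_per_hour = battery_used / hours_elapsed
estimated_full_runtime = 1.00 / drain_per_hour

print(round(drain_per_hour * 100, 1))    # ~8.6% per hour
print(round(estimated_full_runtime, 1))  # ~11.6 hours of similar mixed use
```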

    For me, that feels more than sufficient, though I guess that depends on how much of a smart glasses junkie you are. Charging the Meta Ray-Ban Display is basically the same as always. You just slide them in between the arms inside the case so that the bridge rests on top, and the charging case will do the rest. You’ll get up to 50% charge in about 20 minutes, and the case holds 30 hours of battery in total.

    How you think the Meta Ray-Ban Display looks will be subjective, but you should be prepared for a little more heft than screenless versions, even if EssilorLuxottica does a good job of making that extra size work for the smart glasses.

    Cameras, speakers, and Meta AI

    There are some aspects of the Meta Ray-Ban Display that are somewhat unchanged from the screenless Ray-Ban Meta Gen 1 and 2. The audio is as solid as ever, both for calls and music playback, which is great since using the smart glasses for voice calls is still one of my favorite uses. Pictures are… fine. I was a bit disappointed to find that the Meta Ray-Ban Display doesn’t have the 3K 60 fps capability that the Ray-Ban Meta Gen 2 and Oakley Vanguard glasses have, and instead maxes out at 1440p at 30 fps.

    There’s also the same 12-megapixel sensor as the Ray-Ban Meta Gen 1 and Gen 2, which gives it similar video and picture capture. While shooting with the Meta Ray-Ban Display, I was underwhelmed. For $800, it would have been nice to see an improvement here, but you’re probably buying these smart glasses for the screen, anyway. If you want a more in-depth camera analysis, read our original Ray-Ban Meta Gen 1 review; it’s the same camera hardware, and as a longtime owner of those smart glasses, I can tell you that the results in the Meta Ray-Ban Display are about the same.

    Meta Ray Ban Display Review 17
    © Raymond Wong / Gizmodo

    What is new on the camera front is that you can get a real-time viewfinder of what you’re looking at in the display, which is nice. To take a picture, you just open the camera app via your preferred method (voice or selecting it in the UI) and then pinch your index finger and thumb to get snapping. One cool twist here (no pun intended) is that you can zoom using the Neural Band by pinching your index finger and thumb and then twisting your wrist counterclockwise to zoom in. There’s a very simple photo app where you can look at all the pics and vids you snapped, too, and you can send them to people from this menu as well.

    Meta AI works the same as it does on previous generations of glasses, though you get a pop-up circle in the Meta Ray-Ban Display that tells you when Meta AI is activated and thinking. It works fine for simple voice commands like “take a picture” and “launch Spotify,” but uses the same AI models as other generations, so more complex tasks like “what am I looking at?” or “what kind of flower is this?” can be hit or miss. AI is still one of the least compelling parts of Meta’s smart glasses, despite the company’s emphasis on that front. I would like to see Meta focus on making a smoother voice assistant over computer vision capabilities, but that’s also a very tough nut to crack. Just ask Google, Amazon, and Apple, which have been trying for like a decade now.

    As always, there’s the Meta AI app, which shoves annoying AI content in your face that I could do without, but if you’re going to use these smart glasses, you’ll have to make peace with that. In fact, there are quite a few things you might have to make peace with if you’re going to use the Meta Ray-Ban Display.

    Rose-tinted smart glasses?

    There were some things I expected to feel while wearing the Meta Ray-Ban Display, and some things I didn’t. One thing I expected to feel was a little distracted. Turns out I was right. In theory, smart glasses with screens in them could be less all-consuming than phones, but in practice, I just don’t think that pans out. Sure, you don’t have access to apps and all of the things that keep your head glued to a screen, but notifications are also distracting, and even more so when they’re plastered on your eyeballs. There is something that happens when you bring your body and your eyes that close to a screen, and I’m not sure I like what that something is. Which brings me to the next thing I felt, though unexpectedly this time: worried.

    Meta Ray Ban Display Review 07
    © Raymond Wong / Gizmodo

    Putting on Meta’s smartest glasses had me feeling surprisingly introspective and a little douchey. On one hand, zooming around a screen floating before your eyes while using just one hand is cool, but on the other, it’s a little depressing. Is the novelty worth the added distraction? Is it worth the implications for your privacy, or the privacy of the people around you? Is it worth wondering, as you’re walking around like a screen-zombie, staring at the ground, whether people think you’re a total tool? That’s a personal thing that you’ll have to decide for yourself, but they’re questions worth asking, and it’s better to ask them now, before it’s too late.

    And maybe I’m blowing things out of proportion. Maybe we won’t have to reconcile any of those questions. Maybe this whole smart glasses thing will fall flat on its face, and that will be that; just glowing rectangles in our hands from here on out. That’s a possible future, but one that I sincerely doubt. With companies like Google, Apple (reportedly), and Samsung all waiting in the wings to launch their own versions of the Meta Ray-Ban Display, I’m willing to wager we haven’t seen the last of the smart glasses boom, which means we’re going to have to make some decisions.

    So, what say you: are smart glasses the future? Or are they just a one-way ticket to glasshole 2.0?

James Pero

  • AI In Smart Glasses Is Missing the Point

I’ve used the Ray-Ban Meta Gen 1 and Gen 2 a lot over the last couple of years. One of my favorite things is the open-ear audio, which is a game-changer for hands-free calls and for listening to music while you’re biking. And you know what? The cameras are okay, too, if you’re not looking for anything super hi-res. It’s good stuff! It’s also the opposite of the supposed headline feature being shoehorned into Meta’s description of the device—AI.

    Unfortunately, the titular AI in Meta’s self-described AI glasses is still the worst part about using them. As useful as I find the ability to call and message hands-free (or on display glasses, navigation, and notifications), none of that ties into the one thing that Meta seems to think is the defining feature of these devices. Sure, voice assistants use AI, and those can be incredibly useful (if not critical) on a pair of smart glasses—especially those without a screen—but there are still a lot of misses there, too.

Voice assistants, in case you haven’t checked up on them lately, are about as useful as they were 10 years ago, which is to say, moderately helpful. Google Assistant, Alexa, and Siri have all struggled mightily to improve over the years, and while two of those three currently have next-gen voice assistants at least semi-available to the public, the jury is still out on whether they’re actually more helpful beyond turning our lights on and off or launching music. Meta AI, the voice assistant inside Meta’s Ray-Ban smart glasses, isn’t immune to those struggles either. It gets stuff right sometimes, and other times… not so much.

    © Raymond Wong / Gizmodo

    And the worst part is, the struggles of Meta AI in the voice assistant department pale in comparison to the computer vision side of things. If you’re not familiar with smart glasses, let me back up for a moment; in the Ray-Ban Meta Gen 1 and 2, you can use your smart glasses’ camera to ask Meta AI about your surroundings. That can be useful for stuff like translation or for low-vision people who need help reading things… if it works correctly. For everyone else and for most other scenarios, the feature can feel a bit obvious.

    When I’m testing Meta AI and its computer vision abilities, I often struggle to come up with things to throw at it. Sometimes I’ll look at an object and ask, “Hey Meta… what’s that?…” Oh, it’s a scooter, thank god! You saved me, Meta. Arguably worse is when you do actually think to use Meta AI and it doesn’t get the job done, like the time it told me every shell I picked up at the beach was a shark’s tooth.

    I’m singling Meta out here (in my defense, they put AI in the name of the smart glasses), but they’re not alone in their emphasis on AI. Prototypes of future smart glasses shown off by Google also lean heavily into computer vision, and even suggest a world where the camera in them is on literally all the time, watching everything you do. That sounds like the worst possible smart glasses future from a privacy standpoint. And based on Magic Leap’s recent presentation, it looks like other companies in the smart glasses game aren’t trailing far behind.

    I’m not writing all of this just to rag on AI. I think smart glasses can be useful, fun, and maybe even augmentative, but it’s going to take attention to detail and focus in the right direction. And sure, maybe AI can be a part of that, but there has not been evidence yet to suggest that it should be the main part, even if Meta’s naming conventions would suggest otherwise.

James Pero

  • Oakley Meta Vanguard review: Sporty to a fault

    By now, I have a well-established routine when I set up a new pair of Meta smart glasses. I connect my Instagram, WhatsApp and Spotify accounts. I complete the slightly convoluted steps in my Bluetooth settings to make sure Meta AI can announce incoming phone calls and text messages. I tweak the video settings to the highest quality available, and change the voice of Meta AI to “English (UK)” so it can talk to me in the voice of Judi Dench.

    But with the $499 Oakley Meta Vanguard glasses, there’s also a new step: deciding what the customizable “action button” should do. The action button isn’t even my favorite part of using the glasses, but it’s a sign of just how different these shades are from the rest of Meta’s lineup.

    While the second-gen Ray-Ban and Oakley HSTN glasses iterated on the same formula Meta has used for the last few years, the Vanguard glasses are refreshingly different. They aren’t really meant to be everyday sunglasses (unless you’re really committed to your athletic pursuits) but they are in many ways more capable than Meta’s other smart glasses. The speakers are louder, the camera has new abilities and they integrate directly with Strava and Garmin. And while these won’t replace my go-to sunglasses, there’s more than enough to make them part of my fitness routine.

    Engadget

    Wraparound frames aren’t for everyone, but the new look enables some unique capabilities that will appeal to even casual athletes.

    Pros

    • Better battery life, speakers and durability than Meta’s other glasses
    • Redesigned camera makes photos and videos more usable
    • Action button means you can do more without saying “Hey Meta”
    Cons

    • Hyperlapse clips are kind of jittery
    • More third-party app integrations, please

    $499 at Meta

    New look, new setup

    The sunglasses were very clearly made with athletes in mind. The Oakley Meta Vanguard glasses are the type of shades a lot of people probably think of when they hear “Oakley sunglasses.” The wraparound frames with colorful, reflective lenses are the style of glasses you might associate with a high school track coach, or your neighbor who is really serious about cycling.

    The pair I tested had black frames and Oakley’s orange “Prizm 24K” lenses, which aren’t polarized but are favored by a lot of athletes for their ability to dial up the contrast of your surroundings. I was able to comfortably wear my pair in bright, sunny conditions and also in more overcast lower light. I also appreciate that the lenses are swappable, so you can switch them out for a dedicated low-light or different-colored lens depending on your conditions. (Extra lenses cost $85 each and will be available to purchase separately soon, according to Meta.) These glasses don’t, however, support prescription lenses of any kind.

    I wouldn’t wear these as everyday sunglasses, but I don’t mind the look for a trail run. (Karissa Bell for Engadget)

    I realize this style of sunglasses won’t be appealing to everyone, but the frame shape does enable a slightly different setup than what we’ve seen with any of Meta’s other smart glasses. Most noticeably, the camera is in the center of the glasses, just above the nosebridge. The LED that lights up when the camera is on is also in the center, near the top of the frames.

    As with Meta’s other smart glasses, you can control volume and music playback via a touchpad on the right side of the glasses, but the capture button to take photos and videos is now on the underside of the glasses rather than on top. This is meant to make it a bit easier to reach if you’re wearing a hat or helmet, though I found it took me a few tries to get used to the new placement. Behind the capture button is the previously mentioned “action button,” which can be customized to trigger specific functions via the Meta AI app.

    The capture button (left) and the action button (right) are both on the underside of the frames rather than on top. (Karissa Bell for Engadget)

I haven’t yet figured out the best use for the action button, though I’ve tried a few different setups. On one hike, I set it to automatically call my husband, kind of like a speed dial. During a bike ride, I had it set to record a hyperlapse video. I’ve also tried it as a shortcut for launching a specific Spotify playlist, and as a general trigger for Meta AI. With all of these, I appreciated that the action button let me do something without saying the “Hey Meta” command. Repeating “hey Meta” to my glasses in public has always felt a bit cringey, so it was nice to have a much more subtle cue available.

    Did I mention it’s for athletes?

    The Vanguard’s athlete-focused features go beyond the sportier frames. The shades come with new integrations for two of the most popular run and bike-tracking platforms: Garmin and Strava. If you have a supported Garmin watch or bike computer, you can set up the glasses to automatically capture video clips based on metrics from your activity, like hitting a particular heart rate zone or other milestone. You can also ask Meta AI directly to tell you about stats from your Garmin watch, like “hey Meta, what’s my pace.”

    I don’t have a Garmin watch, though I did briefly test out some of these features during my hands-on at Meta Connect. I suspect a lot of runners and cyclists may still find it easier to simply glance at their watch to see stats, but having it all available via voice commands doesn’t seem like a bad thing either.

    Strava’s integration isn’t quite as deep. If you’re tracking a run, hike or ride while wearing the glasses, you can overlay your stats directly onto photos and videos from your activity. This includes metrics like distance and elevation, as well as heart rate if you’re also wearing an Apple Watch or other tracker that’s connected to the Strava app. Here’s what it looks like with a photo from a recent bike ride.

    You can overlay your Strava stats onto the photos and videos you record. (Karissa Bell for Engadget)

I typically don’t share stats from runs or bike rides (usually because they aren’t that impressive), but this is a bit more appealing than just sharing a straight Strava screenshot. Another neat feature: if you share a video, you can watch the stats change in real time alongside your recording. That level of detail isn’t particularly interesting for a mostly flat bike ride on a city street, but I can see how it would be a lot more compelling on a more technical trail ride or in a race.

    My only complaint, really, is that Meta has limited these kinds of features to Garmin and Strava’s platforms so far. I’d love to have support for my favorite ski-tracking app, Slopes, and I’m sure there are plenty of people who’d be happy to have an integration with their running or workout-tracking app of choice. Meta has announced some plans to bring more third-party apps onto its smart glasses platform so there might be hope here.

There are other improvements, though, that will appeal to even casual athletes. The speakers are a lot louder to account for potentially noisy conditions, like a congested roadway or a high-wind environment. I never had to crank the volume anywhere near the max during my bike rides or runs, and the speakers were loud and clear enough that I could comfortably listen to a podcast at full volume with the glasses lying next to me on the couch.

    The new centered camera placement is meant to make it harder for a hat or helmet to interfere with your shots, which has been a consistent issue for me with Meta’s other smart glasses. The new position didn’t totally solve this — I still found that my bike helmet made it into the top of my pics — but at least it’s easier to crop out now that my headgear is centered over the top of my image rather than awkwardly sticking out on one side.

    The 12MP ultra-wide camera also comes with new video stabilization settings that make it feel a bit more like a replacement for an action cam. The glasses are set to automatically select a level of stabilization based on your motion, but you can also manually choose between low, medium or high stabilization (stabilization is locked at “medium” if you opt to record in 3K). I’ve mostly left it with the default settings and have been impressed with the results.

    The LED light is also a bit more subtle than on Meta’s other smart glasses. (Karissa Bell for Engadget)

The Vanguard glasses are also Meta’s first smart glasses that can record hyperlapse and slow-motion video. Hyperlapse should be familiar to Instagram users who used the now-defunct app of the same name to record timelapse clips. Now, you can say “Hey Meta, start a hyperlapse” and the glasses will record a similar sped-up clip. My hyperlapse clips ended up looking a bit jittery, though, compared to the timelapse shots I’m used to getting with my GoPro. And unfortunately, there’s no way to adjust the cadence of the video like you could with the dedicated app.

    My slow-motion clips, on the other hand, came out better. It’s not something I’d expect to use very often during a bike ride or trail run, but the POV angle is great for recording clips of pets or kids. Meta is also planning to bring support for hyperlapse and slow-motion videos to the rest of its glasses lineup, though, so you don’t need to get these particular shades to take advantage of the feature.

    The other major improvement is battery life. The Vanguard glasses have a notably better battery life compared with the second-gen Ray-Ban glasses or the HSTN frames (probably because the bigger frames allow for a larger battery). According to Meta, the Vanguard glasses can go nine hours on a charge with “typical use” or six hours with continuous audio playback. I was actually able to get a little over six hours of audio on a single charge, so they should hold up pretty well if you’re running marathons or competing in longer races. As usual, exact battery life can vary a lot depending on how much you’re using more resource-intensive features like video recording or Meta AI.

    The bigger frames and charging case give the glasses a battery life boost.

    The bigger frames and charging case give the glasses a battery life boost. (Karissa Bell for Engadget)

I’m especially looking forward to seeing how these glasses hold up during a day of snowboarding. Meta previously told me that the battery has been optimized for a wider range of temperatures, so hopefully it won’t drain as quickly on the mountain as the batteries in Meta’s other glasses do. And with increased water resistance — the shades have an IP67 rating — I wouldn’t worry about dropping them in the snow.

    Should you buy these?

While Meta and EssilorLuxottica have gotten very good at making smart glasses (sorry, Mark Zuckerberg, I won’t call them “AI glasses”), they are still somewhat of a niche product. And the ultra-sporty Oakley Vanguard glasses are even more niche. At $499, they’re also more expensive than other models.

    That, understandably, may feel too steep for a pair of sunglasses you’re likely only going to wear during specific activities. But if you’re a dedicated cyclist, runner, hiker or [insert outdoor activity of your choice], there’s a lot to like. The camera makes a lot more sense for action cam-like POV footage, and better video stabilization means you’re more likely to get shots you actually want to share. Ready-made Garmin and Strava integrations are practically begging for you to brag about your latest PR or race time, which will certainly appeal to many.


  • We’re Already Barrelling Toward a Smart Glasses Bubble

    Things tend to move fast in the world of gadgets—blink your eyes and you’re liable to go from “What’s an iPhone?” to seeing one in the hands of everyone you’ve ever known. That rapid acceleration can be hard to identify in the moment, but when you’ve written about gadgets as extensively as I have, you start to see the early signs, and I’m here to tell you that, if you’re a fan of smart glasses, I suggest you buckle up now.

If smart glasses are just cropping up on your radar, I don’t blame you. While the form factor has been creeping along for years now, Meta’s entry into the equation with a pair of smart glasses that actually have a display has made quite a splash, especially after the company made the Meta Ray-Ban Display the highlight of its annual Connect conference. Not only that, but Meta now has its name attached to not one but five pairs currently on sale—yes, five. And it’s not just about Meta, nor is it just about Samsung and Apple, both of which are most likely in varying stages of releasing their own pairs of smart glasses.

    © Raymond Wong / Gizmodo

Beneath all of those massive names, there are tons of companies that already have their own smart glasses for sale (with screens and without), some of them already on multiple generations. Companies like Rayneo, Viture, Even Realities, Solos, Brilliant Labs, Inmo, Rokid… Should I keep going? You see where I’m going with this—things are getting crowded all of a sudden, which in a lot of ways is great. The more companies making smart glasses, the more options you’ll have as a consumer, and theoretically, the more innovation you’ll see in the category.

    I say “theoretically” here because, despite all of that attention, smart glasses haven’t been an easy nut to crack. While initial startups have offered some surprisingly compelling use cases (navigation and open-ear audio, to name a couple of my favorites), the category still has a bit of an Apple Watch issue, which is to say, compelling hardware without the Holy Grail killer feature that makes people rush out in droves to buy their own pair.

    And whether companies will be able to stick around long enough to figure those issues out fully is a whole different issue. Without resources to burn like Google, Apple, Samsung, or Amazon, which also recently dipped its toes into the smart glasses world with a pair of delivery driver-focused glasses, hanging in won’t be an easy task. And when Google, Samsung, Apple, and friends do arrive on the scene… What then?

    As surprisingly functional as smart glasses made by startups are right now, they don’t exactly have the robust ecosystem of mobile titans like Apple and Google, which own the platforms that smart glasses have to connect to. Why does that matter? Well, the more interconnectivity, the easier and more useful smart glasses become; suddenly, critical features like messaging, calling, and taking pictures feel seamless. Call me a cynic, but I doubt that third-party smart glasses will ever enjoy the same level of integration as a pair of Apple-made smart glasses connecting to an iPhone.

    I’m leaving room to be wrong here, too. Maybe a startup will figure out something game-changing and beat Samsung or Apple to the punch, though the clock is ticking for that to happen, since Samsung and Google have already gone as far as to preview a prototype of smart glasses. Or maybe, those tech titans pushing towards smart glasses just don’t have the chops to make things work, and a startup with some groundbreaking waveguides, or an awe-inspiring input system, will swoop in and steal the thunder. But if I’m being honest, that feels less likely to me, which is a bummer—to think that all this excitement could actually just be one big bubble.

James Pero

  • Amazon’s smart glasses with AI will help its drivers deliver packages faster

Amazon has revealed that it’s currently working on smart glasses designed for delivery drivers, confirming previous reports about the project. The company said the glasses use AI-powered sensing capabilities and computer vision to detect what their cameras are seeing. Drivers then get guidance through a heads-up display (HUD) embedded right into the lens. Based on Amazon’s announcement, it’s been working on the glasses for a while, and hundreds of delivery drivers have already tested early versions to provide the company with feedback.

    The glasses automatically activate after the driver parks their vehicle. They then show users the right packages to deliver, according to their location. Users will see the list of packages they have to take out on the HUD, and the glasses can even tell them if they pull out the right package from their pile. When they get out of their vehicle, the glasses will display turn-by-turn navigation to the delivery address and will also show them hazards along the way, as well as help them navigate complex locations like apartment buildings. Simply put, the device allows them to find delivery addresses and drop off packages without having to use their phones. Drivers will even be able to capture proof of delivery with the wearable.

    Amazon’s glasses will be paired with a vest that’s fitted with a controller and a dedicated emergency button drivers can press to call emergency services along their routes. The device comes with a swappable battery to ensure all-day use and can be fitted with prescription and transitional lenses if the drivers need them. Amazon expects future versions of the glasses to be able to notify drivers if they’re dropping a package at the wrong address and to be able to detect and notify them about more hazardous elements, like if there’s a pet in the yard.

At the annual event where the company announced the device, Amazon transportation vice president Beryl Tomay said it “reduces the need to manage a phone and a package” and helps drivers “stay at attention, which enhances their safety.” She also said that among the testers, Amazon had seen time savings of 30 minutes for a given shift.

    The company didn’t say anything about developing smart glasses for consumers, but The Information’s previous report said that it’s also working on a model for the general public slated to be released in late 2026 or early 2027.

Mariella Moon

  • Samsung is working on XR smart glasses with Warby Parker and Gentle Monster

    As part of its Galaxy XR headset presentation, Samsung also briefly teased another wearable product. It’s working in collaboration with two eyewear companies, Warby Parker and Gentle Monster, on AI-powered smart glasses to go up against Meta’s Ray-Ban models, Samsung’s head of customer experience Jay Kim announced at the end of the livestream.

    “We’re also really excited about the AI glasses that we’re currently building together with Google,” Kim said. “We’re working with two of the most forward-thinking brands in eyewear, Warby Parker and Gentle Monster, to introduce new devices that fit into your lifestyle.”

    Samsung will focus on two different markets with those brands, though both will include “cutting-edge” AI features co-developed with Google. With Gentle Monster, it’s developing “fashion-forward” glasses that will likely be aimed at the higher end of the market. The Warby Parker collaboration, meanwhile, will yield eyewear designed for general consumers, probably at a lower price point.

    Samsung only said that the AI glasses will bring “style, comfort and practicality” to everyday life via Android’s XR ecosystem. As we saw in May with Google’s prototype XR smart glasses, it will likely employ a Gemini-powered display that will show notifications and small snippets of info from your apps, like the music you’re listening to or turn-by-turn GPS directions. It should also have a built-in camera, of course, along with speakers and a microphone.

    Design and appearance will also be key, but Samsung has yet to show any images of the upcoming smart glasses and didn’t reveal a release date. However, it will have a tough climb against Meta’s lineup given the Ray-Ban branding and that company’s head start on the technology. Last week, Meta introduced its Ray-Ban Display model that includes a screen for a true extended reality experience.

Steve Dent

  • Meta Ray-Ban Display review: Chunky frames with impressive abilities

    I’ve been wearing the $800 Meta Ray-Ban Display glasses daily for ten days and I’m still a bit conflicted. On one hand, I’m still not entirely comfortable with how they look. I’ve worn them on the bus, at the office, on walks around my neighborhood and during hangouts with friends. Each time, I’m very aware that I probably look a bit strange.

    On the other hand, there’s a lot I really like about using these glasses. The built-in display has helped me look at my phone less throughout the day. The neural band feels more innovative than any wrist-based device I’ve tried. Together, it feels like a significant milestone for smart glasses overall. But it’s also very much a first-generation device with some issues that still need to be worked out.

    Meta

    An exciting first-gen product, if you can get past the thick frames.

    Pros

    • Display is bright, clear and doesn’t feel overwhelming
    • Ability to preview and zoom in with the camera makes it way easier to frame shots
    • Visual feedback for Meta AI prompts is surprisingly helpful
    • Neural band is very accurate and reduces reliance on voice commands
    Cons

    • Frames are way too thick for most people’s comfort
    • Display is only compatible with a handful of apps
    • Text messages can be wonky

    More info at Meta

    Chunky statement glasses or hideously nerdy?

    To once again state the obvious: The frames are extremely chunky and too wide for my face. The dark black frames I tried for this review unfortunately accentuate the extra thickness. I won’t pretend it’s my best look and I did feel a bit self-conscious at times wearing these in public. Meta also makes a light brown “sand” color that I tried at the Connect event, and I think that color is a bit more flattering, even if the frames are just as oversized. (Sidenote: Smart glasses companies, please, please make your frames available in something other than black!)

But everyone has a different face shape, skin tone and general ability to “pull off” what one of my friends charitably described as “chunky statement glasses.” What looks not-great on my face may look good on someone else. I really wish Meta could have squeezed this tech into slightly smaller frames, but I did get more used to the look the more I wore them. Overall, I think the size is a reasonable tradeoff for a first-generation product that’s pretty clearly aimed at early adopters.

    Here’s how they look in the lighter “sand” color.

    (Karissa Bell for Engadget)

The glasses are so much thicker than Meta’s other frames because there are a lot of extra components needed to power the display, including a mini projector and a waveguide. And at 69 grams, the display glasses are noticeably heavier. I didn’t find them particularly uncomfortable at first, but there is a noticeable pressure after six or seven hours of wear. Plus, the extra weight and width made them consistently slide down my nose. I’m not sure I’d feel comfortable wearing these on a bike ride or a jog, as I’d worry about them falling off.

    While I tested these, I was very interested to get reactions from friends and family. I didn’t get many positive comments about how they looked on my face, though a few particularly generous colleagues assured me I was “pulling them off.” But seeing people’s reactions as soon as the display activated was another matter. Almost everyone has had the same initial reaction: “whoa.”

    Quality display with some limitations

    As I discussed in my initial impressions, these glasses have a monocular display on the right side, so it doesn’t offer the kind of immersive AR I experienced with the Orion prototype last year. You have to look slightly up and to the right to focus on the full-color display. It’s impressively bright and clear, but doesn’t overtake your vision.

At 20 degrees, the field of view is small, but it never felt like a limitation. Because the content you see isn’t meant to be immersive, it never feels like what’s on the display is being cut off, or like you have to adjust where you’re looking to see it properly. The display has three main menus: an app launcher; a kind of home screen where you can access Meta AI and view notifications; and a settings page for adjusting brightness, volume and other preferences.

    The display is in the right lens.

    (Karissa Bell for Engadget)

For now, there are only a handful of Meta-created “apps” available. You can check your Instagram, WhatsApp and Messenger inboxes and chat with Meta AI. There’s a simple maps app for walking navigation, a music/audio player, a camera and live translation and captioning features. There’s also a mini puzzle game called “Hypertrail.”

    One of my favorite integrations was the ability to check Instagram DMs. Not only can you quickly read and respond to messages, you can watch Reels sent by your friends. While the video quality isn’t as high as what you’d see on your phone, there’s something very cool about quickly watching a clip without having to pull out your phone. Meta is also working on a standalone Reels experience that I’m very much looking forward to.

I also enjoyed being able to view media sent in my family group chats on WhatsApp. I would often end up revisiting the photos or videos once I pulled out my phone, but being able to instantly see these messages as they came in tickled whatever part of my brain responds to instant gratification.

    There’s some impressive tech inside those thick frames.

    (Karissa Bell for Engadget)

    The display also solves one of my biggest complaints with Meta’s other smart glasses: that it’s really difficult to frame photos. When you open the camera app on the display model, you can see a preview of the photo and even use a gesture to zoom in to properly frame your shot. Similarly, if you’re on a WhatsApp video call you can see both the other person’s video as well as a small preview of your own like you would on your phone’s screen. It’s a cool trick but the small display felt too cramped for a proper video call. People I used this with also told me that my video feed had some quality issues despite being on Wi-Fi.

The glasses’ live captioning and translation features are probably the best examples of Meta bringing its existing AI features into the display. I’ve written before about how Meta AI’s translation abilities are one of my favorite features of the Ray-Ban smart glasses. Live translation on the display is even better, because it delivers a real-time text feed of what the person in front of you is saying. I tried it out with my husband, a native Spanish speaker, and it felt even more natural than on the non-display glasses because I didn’t have to pause and wait for the audio to relay what he was saying. The translation still wasn’t perfect, and there were a few occasions when it didn’t catch everything he said, but it made the process much simpler overall.

Likewise, live captioning transcribes conversations in real time into a similar text feed. I’ve found that it’s a cool way to demo the glasses’ capabilities, but I haven’t yet found an occasion to use it outside of a demo. Still, I think it could be useful as an accessibility aid for anyone who has trouble hearing or processing audio.

    Another feature that’s useful for travel is walking navigation. Dictate an address or location (you can say something like “take me to the closest Starbucks”) and the glasses’ display will guide you on your route. The first time I tried this was the roughly 10-minute walk from my bus stop to Yahoo’s San Francisco office. The route only required two turns, but it didn’t quite work. My glasses confidently navigated me to an alleyway behind the office building rather than the entrance. These kinds of mishaps happen with lots of mapping tools — Meta’s maps rely on data from OpenStreetMap and Overture — but it was a good reminder that it’s still early days for this product.

    I don’t use Meta AI a ton on any of my smart glasses, but having a bit of visual feedback for these interactions was a nice change. I retain information much better from reading than listening, so seeing text-based output to my queries felt a lot more helpful. It’s also nice that for longer responses from the assistant, you can stop the audio playback and swipe through informational cards instead.

    Meta AI on the glasses’ display delivers information in a card-like interface.

    (Meta)

    While cooking dinner one night, I asked for a quick recipe for teriyaki salmon and Meta AI supplied what seemed like a passable recipe on the display. The only drawback was that the display goes to sleep pretty quickly unless you continue to interact with the content you’re seeing, so the recipe I liked disappeared before I could actually attempt it. (You can view your Meta AI history in the Meta AI app if you really want to revisit something.)

    My main complaint is that I want to be able to do much more with the display. Messaging app integrations are nice, but I wish the display worked with more of the apps on my phone. When it worked best, I was happy to be able to view and dismiss messaging notifications without having to touch my phone; I just wish it worked with all my phone’s notifications.

    There are also some frustrating limitations on sending and receiving texts. For example, there’s no simple way to take a photo on your glasses and text it to a friend with the glasses. You have to wait for the glasses to send a “preview” of your message to your phone and then manually send the text. Or, you can opt in to Meta’s cloud services and send the photo immediately as a link, but I’m not sure many of my friends would readily open a “media.meta.com” URL.

    The glasses also don’t really support non-WhatsApp group chats, at least on iOS. You can receive messages sent in group chats, but there’s no indication the message originated in a group thread. And, it’s impossible to reply in the same thread; instead, replies are sent directly to the person who texted, which can get confusing if you’re not checking your phone. It was also a little annoying that reading and even replying to texts from my glasses wouldn’t mark the text as read in my phone’s inbox. Meta blames all this on Apple’s iOS restrictions, and says it’s hoping to work with the company to improve the experience. The company tells me that group messaging should work normally for people with Android devices and that there is also a dedicated inbox for checking texts on the glasses. I haven’t tested this out yet.

    The band + battery life

    The glasses are controlled using Meta’s Neural Band, which can translate subtle gestures like finger taps into actions on the display. Because the band relies on electromyography (EMG), you do need a fairly snug fit for it to work properly. I didn’t find it uncomfortable, but, like the glasses, I don’t love how it looks as a daily accessory. It also requires daily charging if you wear the glasses all day.

    But the band does work surprisingly well. In more than a week, it almost never missed a gesture, and it never falsely registered a gesture, despite my efforts to confuse it by fidgeting or rubbing my fingers together. The gestures themselves are also pretty intuitive and don’t take long to get used to: double tapping your thumb and middle finger wakes up or puts the display to sleep, single taps of your index and middle fingers allow you to select an item or go back, and swiping your thumb along the side of your index finger lets you navigate around the display. There are a few others, but those are the ones I used most often.

    The Meta Neural Band requires a snug fit to work properly.

    (Karissa Bell for Engadget)

    Each time you make a gesture, the band emits a small vibration so you get a bit of haptic feedback letting you know it registered. I’ve used hand tracking-based navigation in various VR, AR and mixed reality devices and I’ve always felt a bit goofy waving my hands around. But the neural band gestures work when your hand is by your side or in your pocket.

    The other major drawback of these glasses is that heavy use of the display drains the battery pretty quickly. Meta says the Ray-Ban Display’s battery can go about six hours on a single charge, but it really depends on how much you’re using the display. With very limited use, I was able to stretch the battery to about seven hours, but if you’re doing display-intensive tasks like video calling or live translation, it will die much, much more quickly.

    The Meta Ray-Ban Display glasses, charging case and neural band.

    (Karissa Bell for Engadget)

    The glasses do come with a charging case that can deliver a few extra charges on the go, but I was a bit surprised at how often I had to recharge the case. With my normal Ray-Ban Meta glasses I can go several days without topping up the charging case, but with the Meta Ray-Ban Display case, I’m charging it at least every other day.

    Privacy and safety

    Whenever I write or post on social media about a pair of Meta-branded glasses, I inevitably hear from people concerned about the privacy implications of these devices. As I wrote in my recent review of Meta’s second-gen Ray-Ban glasses, I share a lot of these concerns. Meta has made subtle but meaningful changes to its glasses’ privacy policy over the last year, and its track record suggests these devices will inevitably scoop up more of our data over time.

    In terms of privacy implications of the display-enabled glasses, there isn’t a meaningful difference compared to their counterparts. Meta’s policies are the same for all its wearables. I suppose you could use live translation to surreptitiously eavesdrop on a conversation you wouldn’t typically understand, though that’s technically possible with Meta’s other glasses too. And the addition of a wrist-based controller means taking photos is a bit less obvious, but there’s still an LED indicator that lights up when the camera is on.

    The neural band allows you to snap photos without touching the capture button or using a voice command.

    (Karissa Bell for Engadget)

    I have been surprised at how many people have asked me if these glasses have some kind of facial recognition abilities. I’m not sure if that’s a sign of people’s general distrust of Meta, or an assumption based on seeing similar glasses in sci-fi flicks, but I do think it’s telling. (They don’t, to be clear. Meta currently only uses facial recognition for two safety-related features on Facebook and Instagram.) Meta hasn’t done much to earn people’s trust when it comes to privacy, and I wish the company would use its growing wearables business to try to prove otherwise.

    On a more practical level, I have some safety concerns. The display didn’t hinder my situational awareness while walking, but I could see how it might for others. And I’m definitely not comfortable using the display while driving. Meta does have an audio-only “driving detection” setting that can automatically kick in when you’re traveling in a car, but the feature is optional, which seems potentially problematic.

    Should you buy these?

    In short: probably not. As much as I’ve been genuinely impressed with Meta’s display tech, I don’t think these glasses make sense for most people right now. And, at $800, the Meta Ray-Ban Display glasses are more than twice as much as the company’s very good second-generation Ray-Ban glasses, which come in a wide range of much more normal-looking frame styles and colors.

    The Meta Ray-Ban Display glasses, on the other hand, still look very much like a first-gen product. There are some really compelling use cases for the display, but its functionality is limited. The glasses are also too thick and bulky for what’s meant to be an everyday accessory. At the end of the day, most people want glasses that make them look good. There’s also the fact that right now, these glasses are somewhat difficult to actually buy. They are only available at a handful of physical retailers, which currently have a very limited supply. Meta is also requiring would-be buyers to schedule demo appointments in order to buy, though some stores — like the LensCrafters where I bought my pair — aren’t enforcing this.

    Still, there’s a lot to be excited about. Watching people’s reactions to trying these has been almost as much fun as using them myself. Meta also has a solid lineup of new features already in the works, including a standalone Reels app, a teleprompter and gesture-based handwriting for message replies. If you’re already all-in on smart glasses or, like me, you’ve been patiently waiting for glasses with a high quality, usable display, then the Meta Ray-Ban Display glasses are worth the investment now — as long as you can accept the thick frames.

    Update, October 17, 2025, 3:42PM PT: Added more information about group text functionality on Android.


  • Niantic’s Peridot, the Augmented Reality Alien Dog, Is Now a Talking Tour Guide

    Imagine you’re walking your dog. It interacts with the world around you—sniffing some things, relieving itself on others. You walk down the Embarcadero in San Francisco on a bright sunny day, and you see the Ferry Building in the distance as you look out into the bay. Your dog turns to you, looks you in the eye, and says, “Did you know this waterfront was blocked by piers and a freeway for 100 years?”

    OK now imagine your dog looks like an alien and only you can see it. That’s the vision for a new capability created for the Niantic Labs AR experience Peridot.

    Niantic, also the developer of the worldwide AR behemoth Pokémon Go, hopes to build out its vision of extending the metaverse into the real world by giving people the means to augment the space around them with digital artifacts. Peridot is a mobile game that lets users customize and interact with their own little Dots—dog-sized digital companions that appear on your phone’s screen and can look like they’re interacting with real-world objects in the view of your camera lens. They’re very cute, and yes, they look a lot like Pokémon. Now, they can talk.

    Peridot started as a mobile game in 2022, then got infused with generative AI features. The game has since moved into the hands of Niantic Spatial, a startup created in April that aims to turn geospatial data into an accessible playground for its AR ambitions. Now called Peridot Beyond, it has been enabled in Snap’s Spectacles.

    Hume AI, a startup running a large language model that aims to make chatbots seem more empathetic, is now partnering with Niantic Spatial to bring a voice to the Dots on Snap’s Spectacles. The move was initially announced in September, but now it’s ready for the public and will be demonstrated at Snap’s Lens Fest developer event this week.

    Snap’s latest Spectacles, its augmented reality smart glasses.

    Courtesy of Snap

    Boone Ashworth


  • Apple Finally Found a Use for the Vision Pro—and It’s Smart Glasses

    The Vision Pro has had a rough go of it. Not only has it struggled to find an audience since its release in 2024, but recent reports indicate that Apple is abandoning a cheaper and lighter version to instead work on 2025’s hottest new gadget: smart glasses. If you’re reading between the lines and thinking “the Vision Pro is cooked,” I can’t say I blame you. But even if it is cooked, a new report suggests at least one facet could live on, and perhaps finally find its footing, in the new form factor.

    Bloomberg’s Mark Gurman has reported that visionOS, the Vision Pro’s operating system, will make its way to Apple’s rumored smart glasses (which Bloomberg reported it was working on earlier this month). On the one hand, duh. Using visionOS, the only Apple operating system designed for mixed reality, is an obvious choice, especially because the Vision Pro’s UI is easily one of its best and biggest selling points. But it’s not just a matter of porting things over, according to the report: There’s a twist.

    Per Bloomberg, visionOS on a pair of smart glasses will have two modes: one when it’s paired with your iPhone, which is stripped down and more useful on the go, and the other when your glasses are paired with—and this is where things get interesting—a MacBook. That’s one of the few concrete details we have about Apple’s very-much-in-development smart glasses, and it gives us a hint of how they might work.

    The decision to differentiate between modes suggests that Apple’s smart glasses could compete with not only existing glasses like Meta’s Ray-Ban Display, which have a simple UI for navigation, messaging, photos, videos, and phone pairing, but also bigger, more headset-like devices (such as the Vision Pro) that parallel a MacBook. What those more advanced capabilities could be is anyone’s guess, but Apple’s smart glasses, if the display is nice enough and the chip is powerful enough, could lean into entertainment, gaming, or other more compute-intensive, laptop-like features.

    Apple’s smart glasses may go above and beyond Meta’s Ray-Ban Display. © James Pero / Gizmodo

    There’s a potential hint about UI here, too. While Apple could very well tweak visionOS to conform to different input methods on a pair of smart glasses, the OS, in its current form, is suited for the Vision Pro’s UI, which combines hand and eye tracking for a novel “spatial computing” experience that uses pinches and other finger gestures. The resulting user experience feels genuinely more refined than those that competitors like Meta and its Quest 3/3S offer. Does that mean Apple’s glasses will use hand and eye tracking? Who’s to say? But if Gurman’s reporting is accurate, the foundation for an Apple-like smart glasses UI is there.

    No matter how this shakes out, one thing is clear: Though Apple may not see a ton of promise in the Vision Pro’s hardware, it clearly sees value in visionOS. And, to be honest, so do I. As responsive and novel as Meta’s Neural Band (the wristband that registers inputs into Meta’s smart glasses) is, needing to combine smart glasses with a wearable doesn’t feel ideal. If Apple can port the convenience and smoothness of visionOS to a pair of smart glasses (especially in a wearable-free way), it’s got a big leg up, and that’s not even taking into account the opportunities presented by Apple’s direct integration with iPhones and MacBooks. This is all to say that it looks like Apple may have finally figured out what to do with the Vision Pro, and the answer is turning it into a pair of smart glasses.

    James Pero


  • Samsung’s Rumored Ray-Ban Smart Glasses Killer May Arrive Sooner Than You Think

    Everyone wants a piece of the smart glasses pie. With Meta plowing full steam ahead, releasing three new pairs of smart glasses in one go at its Connect conference this year (including the Ray-Ban Display with a screen), other big-name competitors are following suit. Apple, for example, is reportedly attempting to expedite its first pair of smart glasses by diverting resources from the Vision Pro team. To give you a sense of how urgent that pivot is, Apple is reportedly de-prioritizing the development of a cheaper and lighter version that people might actually, ya know, buy, to pursue said smart glasses.

    Now, it looks like another smartphone titan is being swept up in that push, and the result could be Samsung-made smart glasses on shelves sooner than you think. According to a report from the Financial News in South Korea, we could see “Project Haean,” Samsung’s rumored (Google-powered) AR glasses, as soon as early next year. It’s hard to say how much stock to put in that rumor without any official timeline from Samsung, but if you’ll allow me to speculate for a moment, it does feel very possible.

    On top of the palpable push toward smart glasses, there’s also solid evidence that both Google and Samsung are heavily invested in AR. This year at I/O, Google showed off a preview of its XR glasses, which have a similar feature set to Meta’s Ray-Ban Display. While there was no indication of when those glasses may see the light of day, it’s clear that this isn’t some pie-in-the-sky prototype. Gizmodo’s Senior Editor, Consumer Tech, Raymond Wong, got to try the XR glasses a little bit, and while the demo only ran for a grand total of 90 seconds, they were at least real in the sense that Google was letting people try them on.

    Our time with Google’s XR glasses was brief, but at least we confirmed they exist. © Gizmodo

    That’s all to say that new hardware is clearly in the pipeline, and while Samsung hasn’t gone as far as announcing anything official (it certainly hasn’t offered demos in a public setting), we have gotten some strong hints. At the I/O keynote in May, for example, the GM of Android XR, Shahram Izadi, said, “We’re taking our partnership with Samsung to the next level by extending Android XR beyond headsets to glasses. We’re creating the software and reference hardware platform to enable the ecosystem to build great glasses alongside us. Our glasses prototypes are already being used by trusted testers.” Well, well, look at that; Samsung and Google sittin’ in a tree…

    That’s not a clear yes or no, to be sure, but it’s not, not a no either. No matter how this plays out (or when), I’m personally looking forward to Samsung entering the fray. Even more so than Google, Samsung has a chance to make smart glasses that feel truly useful. Given the breadth of its presence in phones and other hardware, it could offer tighter integration between mobile devices and smart glasses than Meta could ever dream of, and that’s huge for delivering a quality smart glasses experience right now, since they still rely on phones for all the big-time computing.

    In the U.S., Samsung might not have the same weight as an Apple-scale ecosystem, but it’s still a huge player and would be the biggest one (sorry, Meta) that smart glasses with a screen have seen yet. Personally, my body (or my face, I guess) is ready for the varied and non-Meta-dominated smart glasses field that many of us have been waiting for.

    James Pero
