Meta is being sued by Solos, a rival smart glasses maker, for infringing on its patents, Bloomberg reports. Solos is seeking “multiple billions of dollars” in damages and an injunction that could prevent Meta from selling its Ray-Ban Meta smart glasses as part of the lawsuit.
Solos claims that Meta’s Ray-Ban Meta Wayfarer Gen 1 smart glasses violate multiple patents covering “core technologies in the field of smart eyewear.” While less well known than Meta and its partner EssilorLuxottica, Solos sells multiple pairs of glasses with features similar to what Meta offers. For example, the company’s AirGo A5 glasses let you control music playback and automatically translate speech into different languages, and integrate ChatGPT for answering questions and searching the web.
Beyond the product similarities, Solos claims that Meta was able to copy its patents because Oakley (an EssilorLuxottica subsidiary) and Meta employees had insights into the company’s products and road map. Solos says that in 2015, Oakley employees were introduced to the company’s smart glasses tech, and were even given a pair of Solos glasses for testing in 2019. Solos also says that an MIT Sloan Fellow who researched the company’s products and later became a product manager at Meta brought knowledge of the company to her role. According to the logic of Solos’ lawsuit, by the time Meta and EssilorLuxottica were selling their own smart glasses, “both sides had accumulated years of direct, senior-level and increasingly detailed knowledge of Solos’ smart glasses technology.”
Engadget has asked both Meta and EssilorLuxottica to comment on Solos’ claims. We’ll update this article if we hear back.
While fewer people own Ray-Ban Meta smart glasses than use Instagram, Meta considers the wearable one of its few hardware success stories. The company is so convinced it can make smart glasses happen that it recently restructured its Reality Labs division to focus on AI hardware like smart glasses and hopefully build on its success.
When Meta first announced its display-enabled smart glasses last year, it teased a handwriting feature that allows users to send messages by tracing letters with their hands. Now, the company is starting to roll it out, with people enrolled in its early access program getting it first.
I got a chance to try the feature at CES and it made me want to start wearing my Meta Ray-Ban Display glasses more often. When I reviewed the glasses last year, I wrote about how one of my favorite things about the neural band is that it reduced my reliance on voice commands. I’ve always felt a bit self-conscious about speaking to my glasses in public.
Up to now, replying to messages on the display glasses has still generally required voice dictation or generic preset replies. But handwriting means that you can finally send custom messages and replies somewhat discreetly.
Sitting at a table wearing the Meta Ray-Ban Display glasses and neural band, I was able to quickly write a message just by drawing the letters on the table in front of me. It wasn’t perfect — it misread a capital “I” as an “H” — but it was surprisingly intuitive. I was able to quickly trace out a short sentence and even correct a typo (a swipe from left to right will let you add a space, while a swipe from right to left deletes the last character).
Alongside handwriting, Meta also announced a new teleprompter feature. Copy and paste a bunch of text — it supports up to 16,000 characters (roughly a half-hour’s worth of speech) — and you can beam your text into the glasses’ display.
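Meta’s half-hour figure squares with typical speaking rates; here’s a quick back-of-the-envelope check, where the characters-per-word and words-per-minute values are our own assumed English averages, not Meta’s numbers:

```python
# Sanity check on "16,000 characters is roughly half an hour of speech."
# Both constants below are assumed typical English averages, not Meta specs.
chars = 16_000
chars_per_word = 6       # ~5 letters plus a trailing space
words_per_minute = 90    # an unhurried presentation pace
minutes = chars / chars_per_word / words_per_minute
print(f"~{minutes:.0f} minutes of speech")  # about 30 minutes
```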
If you’ve ever used a teleprompter, Meta’s version works a bit differently in that the text doesn’t automatically scroll while you speak. Instead, the text is displayed on individual cards you manually swipe through. The company told me it originally tested a scrolling version, but that in early tests, people said they preferred to be in control of when the words appeared in front of them.
Teleprompter is starting to roll out now, though Meta says it could take some time before everyone is able to access it.
The updates are among the first major additions Meta has made to its display glasses since launching them late last year, and a sign that, as with its other smart glasses, the company plans to keep them fresh with new features. Elsewhere at CES, the company announced some interesting new plans for the device’s neural band and said it was delaying a planned international rollout of the device.
If you’ve been on the fence about trying the sort of “AR glasses” that, until recently, were called “personal cinemas,” then check this out. Xreal has turned up to CES 2026 with an updated version of its entry-level Xreal One glasses, first launched at the end of 2024. The new model, dubbed the 1S (yes, with a numeral rather than the word), gets marginally better specs and $50 knocked off the asking price.
If you’re unfamiliar, the One is a wearable spatial display that connects over USB-C to any compatible device, including smartphones, tablets, laptops and consoles. It has two teeny-tiny displays in the eyecups that, when worn close to the eyes, trick your brain into thinking you’re looking at a big screen. The average view measures around 171 inches, but it’s possible to push it to a screen closer to 500 inches if your eyes are capable of focusing that far.
As for the changes, they’re all firmly in the welcome nip-and-tuck department, boosting many of the original’s key specs. For instance, the 1080p screens have been swapped out for 1200p full HD, while the field of view has gone from 50 degrees to 52. Brightness has been boosted from 600 nits on the old model to 700 nits here, while the aspect ratio has grown from 16:9 to 16:10. But the change Xreal is arguably most proud of is the drop in price, from $499 down to $449.
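Those quoted screen sizes follow from simple geometry: a display spanning a fixed field of view looks bigger the farther away your eyes focus it. A rough sketch of that math, where the ~4.4 m virtual focal distance is our illustrative assumption rather than a published Xreal spec:

```python
import math

def virtual_diagonal_inches(fov_deg: float, distance_m: float) -> float:
    """Apparent diagonal of a virtual screen spanning `fov_deg` degrees
    (treated here as the diagonal field of view) when the image is
    focused at `distance_m` meters."""
    diagonal_m = 2 * distance_m * math.tan(math.radians(fov_deg) / 2)
    return diagonal_m / 0.0254  # meters to inches

# With the 1S's 52-degree FOV and an assumed ~4.4 m focal distance,
# the virtual screen works out to roughly 170 inches diagonal; focusing
# farther away scales the apparent size linearly, which is how the
# ~500-inch figure becomes possible.
print(round(virtual_diagonal_inches(52, 4.4)))
```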
At the same time, the company is launching the Xreal Neo, an external battery-cum-DisplayPort hub for your glasses. Inside is a 10,000mAh power bank to keep your glasses going for longer and, more importantly, a better connection for your Switch consoles. After all, before now, if you wanted to play with your Switch or Switch 2, you’d need to hook it up to its own dock. With the Neo, however, you can eliminate that from your bag when you’re playing out and about. The Neo is available as a standalone purchase for $99 and, like the new 1S, is ready to buy right now.
Being a member of the “four-eyes” club is 10% looking like a sophisticated intellectual and 90% cleaning a mysterious smudge that somehow has the consistency of industrial-grade epoxy.
We’ve all endured the physical trauma of lying down sideways only to have our frames try to lobotomize us, or the sheer indignity of hearing better the second we put our specs on.
So stop squinting at your screen, push those frames back up your nose, and let’s dive into these relatable and hilarious memes!
One of the things that’s long irked me about Meta’s smart glasses is how often you have to say “hey Meta.” Even though the company’s AI assistant has gotten significantly more capable, there’s something a little cringey about using its voice commands in public spaces.
Now, the company has rolled out an update that makes you a little less dependent on voice commands for all its glasses. A new “quick connect” feature allows you to create a one-touch shortcut for “frequently used communication actions,” like making a phone call or sending a text to a specific contact. The update is out now as part of the 19.2 software update.
The idea of “quick connect” is similar to the functionality of the “action button” on the Oakley Meta Vanguard glasses, though it’s more limited in scope. The feature allows you to designate a specific contact that you can message, text or call just by holding down on the touchpad on the right side of the glasses. The Meta AI app will even let you choose your preferred method of reaching them, whether it’s WhatsApp, Instagram, Messenger or your phone’s native calling or messaging app. The same one-touch press-and-hold shortcut can also be used to send photos and videos shot on the glasses right after you take them (also via your chosen app).
You can designate a specific contact and choose which app you want to use to reach them.
During its Connect event, Meta previewed some updates for third-party apps that appear to allow developers to set their own wake words for the glasses when using their services. So there is some hope that eventually the company will offer a bit more flexibility with its voice commands. Even so, it’s unlikely you’ll be able to get away with never having to say “hey Meta,” but the quick connect update is a handy way to make messaging a bit more discreet.
By now, I have a well-established routine when I set up a new pair of Meta smart glasses. I connect my Instagram, WhatsApp and Spotify accounts. I complete the slightly convoluted steps in my Bluetooth settings to make sure Meta AI can announce incoming phone calls and text messages. I tweak the video settings to the highest quality available, and change the voice of Meta AI to “English (UK)” so it can talk to me in the voice of Judi Dench.
But with the $499 Oakley Meta Vanguard glasses, there’s also a new step: deciding what the customizable “action button” should do. The action button isn’t even my favorite part of using the glasses, but it’s a sign of just how different these shades are from the rest of Meta’s lineup.
While the second-gen Ray-Ban and Oakley HSTN glasses iterated on the same formula Meta has used for the last few years, the Vanguard glasses are refreshingly different. They aren’t really meant to be everyday sunglasses (unless you’re really committed to your athletic pursuits) but they are in many ways more capable than Meta’s other smart glasses. The speakers are louder, the camera has new abilities and they integrate directly with Strava and Garmin. And while these won’t replace my go-to sunglasses, there’s more than enough to make them part of my fitness routine.
Wraparound frames aren’t for everyone, but the new look enables some unique capabilities that will appeal to even casual athletes.
Pros
Better battery life, speakers and durability than Meta’s other glasses
Redesigned camera makes photos and videos more usable
Action button means you can do more without saying “Hey Meta”
The sunglasses were very clearly made with athletes in mind. The Oakley Meta Vanguard glasses are the type of shades a lot of people probably think of when they hear “Oakley sunglasses.” The wraparound frames with colorful, reflective lenses are the style of glasses you might associate with a high school track coach, or your neighbor who is really serious about cycling.
The pair I tested had black frames and Oakley’s orange “Prizm 24K” lenses, which aren’t polarized but are favored by a lot of athletes for their ability to dial up the contrast of your surroundings. I was able to comfortably wear my pair in bright, sunny conditions and also in more overcast lower light. I also appreciate that the lenses are swappable, so you can switch them out for a dedicated low-light or different-colored lens depending on your conditions. (Extra lenses cost $85 each and will be available to purchase separately soon, according to Meta.) These glasses don’t, however, support prescription lenses of any kind.
I wouldn’t wear these as everyday sunglasses, but I don’t mind the look for a trail run. (Karissa Bell for Engadget)
I realize this style of sunglasses won’t be appealing to everyone, but the frame shape does enable a slightly different setup than what we’ve seen with any of Meta’s other smart glasses. Most noticeably, the camera is in the center of the glasses, just above the nosebridge. The LED that lights up when the camera is on is also in the center, near the top of the frames.
As with Meta’s other smart glasses, you can control volume and music playback via a touchpad on the right side of the glasses, but the capture button to take photos and videos is now on the underside of the glasses rather than on top. This is meant to make it a bit easier to reach if you’re wearing a hat or helmet, though I found it took me a few tries to get used to the new placement. Behind the capture button is the previously mentioned “action button,” which can be customized to trigger specific functions via the Meta AI app.
The capture button (left) and the action button (right) are both on the underside of the frames rather than on top. (Karissa Bell for Engadget)
I haven’t yet figured out what the best use for the action button is, though I’ve tried out a few different setups. On one hike, I set it up to automatically call my husband, kind of like a speed dial. During a bike ride, I had it set to record a hyperlapse video. I’ve also tried it out as a shortcut for launching a specific Spotify playlist or as a general trigger for Meta AI. With all of these, I appreciated that the action button allowed me to do something without saying the “Hey Meta” command. Repeating “hey Meta” to my glasses in public has always felt a bit cringey, so it was nice to have a much more subtle cue available.
Did I mention it’s for athletes?
The Vanguard’s athlete-focused features go beyond the sportier frames. The shades come with new integrations for two of the most popular run- and bike-tracking platforms: Garmin and Strava. If you have a supported Garmin watch or bike computer, you can set up the glasses to automatically capture video clips based on metrics from your activity, like hitting a particular heart rate zone or other milestone. You can also ask Meta AI directly to tell you about stats from your Garmin watch, like “hey Meta, what’s my pace?”
I don’t have a Garmin watch, though I did briefly test out some of these features during my hands-on at Meta Connect. I suspect a lot of runners and cyclists may still find it easier to simply glance at their watch to see stats, but having it all available via voice commands doesn’t seem like a bad thing either.
Strava’s integration isn’t quite as deep. If you’re tracking a run, hike or ride while wearing the glasses, you can overlay your stats directly onto photos and videos from your activity. This includes metrics like distance and elevation, as well as heart rate if you’re also wearing an Apple Watch or other tracker that’s connected to the Strava app. Here’s what it looks like with a photo from a recent bike ride.
You can overlay your Strava stats onto the photos and videos you record. (Karissa Bell for Engadget)
I typically don’t share stats from runs or bike rides (usually because they aren’t that impressive) but it’s a bit more appealing than just sharing a straight Strava screenshot. Another neat feature is that if you share a video, you can watch the stats change in real time alongside your recording. That level of detail isn’t particularly interesting for a mostly flat bike ride on a city street, but I can see how it would be a lot more compelling on a more technical trail ride or in a race.
My only complaint, really, is that Meta has limited these kinds of features to Garmin and Strava’s platforms so far. I’d love to have support for my favorite ski-tracking app, Slopes, and I’m sure there are plenty of people who’d be happy to have an integration with their running or workout-tracking app of choice. Meta has announced some plans to bring more third-party apps onto its smart glasses platform so there might be hope here.
There are other improvements, though, that will appeal to even casual athletes. The speakers are a lot louder to account for potentially noisy conditions like a congested roadway or high-wind environment. I never had to crank the volume anywhere near the max during my bike rides or runs, and the speakers were loud and clear enough that I could comfortably listen to a podcast at full volume with the glasses lying next to me on the couch.
The new centered camera placement is meant to make it harder for a hat or helmet to interfere with your shots, which has been a consistent issue for me with Meta’s other smart glasses. The new position didn’t totally solve this — I still found that my bike helmet made it into the top of my pics — but at least it’s easier to crop out now that my headgear is centered over the top of my image rather than awkwardly sticking out on one side.
The 12MP ultra-wide camera also comes with new video stabilization settings that make it feel a bit more like a replacement for an action cam. The glasses are set to automatically select a level of stabilization based on your motion, but you can also manually choose between low, medium or high stabilization (stabilization is locked at “medium” if you opt to record in 3K). I’ve mostly left it with the default settings and have been impressed with the results.
The LED light is also a bit more subtle than on Meta’s other smart glasses. (Karissa Bell for Engadget)
The Vanguard glasses are also Meta’s first smart glasses that can record hyperlapse and slow-motion video. Hyperlapse should be familiar to Instagram users who used the now-defunct app of the same name to record timelapse clips. Now, you can say “Hey Meta, start a hyperlapse” and the glasses will record a similar sped-up clip. My hyperlapse clips ended up looking a bit jittery, though, compared to the timelapse shots I’m used to getting with my GoPro. And unfortunately, there’s no way to adjust the cadence of the video like you used to be able to with the dedicated app.
My slow-motion clips, on the other hand, came out better. It’s not something I’d expect to use very often during a bike ride or trail run, but the POV angle is great for recording clips of pets or kids. Meta is also planning to bring support for hyperlapse and slow-motion videos to the rest of its glasses lineup, though, so you don’t need to get these particular shades to take advantage of the feature.
The other major improvement is battery life. The Vanguard glasses have a notably better battery life compared with the second-gen Ray-Ban glasses or the HSTN frames (probably because the bigger frames allow for a larger battery). According to Meta, the Vanguard glasses can go nine hours on a charge with “typical use” or six hours with continuous audio playback. I was actually able to get a little over six hours of audio on a single charge, so they should hold up pretty well if you’re running marathons or competing in longer races. As usual, exact battery life can vary a lot depending on how much you’re using more resource-intensive features like video recording or Meta AI.
The bigger frames and charging case give the glasses a battery life boost. (Karissa Bell for Engadget)
I’m especially looking forward to seeing how these glasses will hold up during a day of snowboarding. Meta previously told me that the battery has been optimized for a wider spectrum of temperatures, so hopefully it won’t drain as quickly on the mountain as Meta’s other glasses’ batteries do. And with increased water resistance — the shades have an IP67 rating — I wouldn’t worry about dropping them in the snow.
Should you buy these?
While Meta and EssilorLuxottica have gotten very good at making smart glasses (sorry, Mark Zuckerberg, I won’t call them “AI glasses”), they are still something of a niche product. And the ultra-sporty Oakley Vanguard glasses are even more niche. At $499, these are also more expensive than other models.
That, understandably, may feel too steep for a pair of sunglasses you’re likely only going to wear during specific activities. But if you’re a dedicated cyclist, runner, hiker or [insert outdoor activity of your choice], there’s a lot to like. The camera makes a lot more sense for action cam-like POV footage, and better video stabilization means you’re more likely to get shots you actually want to share. Ready-made Garmin and Strava integrations are practically begging for you to brag about your latest PR or race time, which will certainly appeal to many.
Amazon has revealed that it’s currently working on smart glasses designed for delivery drivers, confirming previous reports about the project. The company said the glasses use AI-powered sensing capabilities and computer vision to detect what their cameras are seeing. Drivers then get guidance through a heads-up display (HUD) embedded right into the lens. Based on Amazon’s announcement, it’s been working on the glasses for a while, and hundreds of delivery drivers have already tested early versions to provide the company with feedback.
The glasses automatically activate after the driver parks their vehicle. They then show users the right packages to deliver, according to their location. Users will see the list of packages they have to take out on the HUD, and the glasses can even tell them if they pull out the right package from their pile. When they get out of their vehicle, the glasses will display turn-by-turn navigation to the delivery address and will also show them hazards along the way, as well as help them navigate complex locations like apartment buildings. Simply put, the device allows them to find delivery addresses and drop off packages without having to use their phones. Drivers will even be able to capture proof of delivery with the wearable.
Amazon’s glasses will be paired with a vest that’s fitted with a controller and a dedicated emergency button drivers can press to call emergency services along their routes. The device comes with a swappable battery to ensure all-day use and can be fitted with prescription and transitional lenses if the drivers need them. Amazon expects future versions of the glasses to be able to notify drivers if they’re dropping a package at the wrong address and to be able to detect and notify them about more hazardous elements, like if there’s a pet in the yard.
At the annual event where the company announced the device, Amazon transportation vice president Beryl Tomay said it “reduces the need to manage a phone and a package” and helps drivers “stay at attention, which enhances their safety.” She also said that among the testers, Amazon had seen time savings of 30 minutes for a given shift.
The company didn’t say anything about developing smart glasses for consumers, but The Information’s previous report said that it’s also working on a model for the general public slated to be released in late 2026 or early 2027.
As part of its Galaxy XR headset presentation, Samsung also briefly teased another wearable product. It’s working in collaboration with two eyewear companies, Warby Parker and Gentle Monster, on AI-powered smart glasses to go up against Meta’s Ray-Ban models, Samsung’s head of customer experience Jay Kim announced at the end of the livestream.
“We’re also really excited about the AI glasses that we’re currently building together with Google,” Kim said. “We’re working with two of the most forward-thinking brands in eyewear, Warby Parker and Gentle Monster, to introduce new devices that fit into your lifestyle.”
Samsung will focus on two different markets with those brands, though both will include “cutting-edge” AI features co-developed with Google. With Gentle Monster, it’s developing “fashion-forward” glasses that will likely be aimed at the higher end of the market. The Warby Parker collaboration, meanwhile, will yield eyewear designed for general consumers, probably at a lower price point.
Samsung only said that the AI glasses will bring “style, comfort and practicality” to everyday life via Android’s XR ecosystem. As we saw in May with Google’s prototype XR smart glasses, it will likely employ a Gemini-powered display that will show notifications and small snippets of info from your apps, like the music you’re listening to or turn-by-turn GPS directions. It should also have a built-in camera, of course, along with speakers and a microphone.
Design and appearance will also be key, but Samsung has yet to show any images of the upcoming smart glasses and didn’t reveal a release date. However, it will have a tough climb against Meta’s lineup given the Ray-Ban branding and that company’s head start on the technology. Last week, Meta introduced its Ray-Ban Display model that includes a screen for a true extended reality experience.
The rumors were true. Meta’s first pair of AR glasses with a built-in screen is the Meta Ray-Ban Display. They’ll cost $799 and will come to a limited number of brick-and-mortar stores in the United States on September 30. Those retailers include Best Buy, LensCrafters, Ray-Ban and Verizon, and availability will expand to Canada, France, Italy and the United Kingdom in early 2026.
The Ray-Ban Displays have a camera, audio functionality, and a translucent heads-up display that shows and allows the wearer to respond to text chats, AI prompts, directions and video calls. You’re able to use gestures to interact with the HUD, including small actions like swiping your fingers to type out a chat reply. Each pair requires and comes with a dedicated EMG wristband, the Meta Neural Band, which enables these interactions.
At least, that’s what Meta promises. The glasses failed to receive a phone call in a live demo during their announcement at the Connect 2025 conference, but they did perform other actions just fine. Meta CEO Mark Zuckerberg opened Spotify and played a song, took and viewed photos, and successfully demonstrated a real-time subtitle feature that looks legitimately useful. As outlined by Meta, the HUD supports Meta AI with visuals, messaging and video calling, previewing and zooming in on photos, turn-by-turn pedestrian navigation, live captions and translations, and music playback.
Connect 2025 kicked off with Zuckerberg streaming his POV from a pair of Ray-Ban Displays, including a HUD on the right side showing Spotify, calendar reminders, text chats and incoming images with options to respond by dictating a message, dropping an emoji or selecting a typed phrase. The glasses and wristband come in two colors, black and sand, and two sizes, standard and large. All pairs have Transitions lenses that automatically adjust to light conditions.
The glasses’ display is “extremely high resolution,” Zuckerberg was stoked to report. The HUD is full-color and supports 42 pixels per degree of the field of view — compare that with the Meta Quest 3S, which has 20 pixels per degree. The glasses boast “six hours of mixed-use battery life and up to 30 hours of battery life total,” while the Meta Neural Band has 18 hours of battery life and an IPX7 water rating.
The Meta Ray-Ban Display glasses join a lineup of smart spectacles revealed at Connect 2025, including the second generation of the Ray-Ban Meta glasses (which also hilariously failed during a live demo of their AI assistant capabilities), and the sporty Oakley Meta Vanguard.
A leak earlier this week spoiled the Meta Ray-Ban Display surprise, capping off a year of rumors around Meta’s HUD-based efforts.
When Meta announced its first pair of Oakley-branded sunglasses, the HSTN frames, earlier this year, it called them “performance AI” glasses even though they only came with modest upgrades compared with Meta’s Ray-Ban lineup. But the new Oakley Meta Vanguard glasses, which were just unveiled at Connect, are much more clearly aimed at serious athletes and they have the features to back it up.
The $499 sunglasses feature Oakley’s familiar wraparound frames and shiny (swappable) lenses. They are the first of Meta’s smart glasses to change the placement of the camera, which is now in the center of the frames above the nose. According to Meta, this should make it harder for a hat or a helmet to ruin your shots, which was a consistent issue for me with the HSTN glasses.
Meta is making other camera adjustments that should make the glasses more reliable for capturing first-person action cam-style footage. The 12-megapixel camera now has a wider, 122-degree lens and adjustable video stabilization. There are also now dedicated modes for capturing slow motion videos as well as Instagram-ready hyperlapse clips.
There are other spec upgrades too. Battery life has been improved to six hours of continuous music playback and nine hours of “mixed use”. The charging case can provide another 36 hours of battery life. Meta also told me the glasses have been optimized for a wider range of temperatures, so the battery should hold up better in very cold or very hot environments.
The onboard speakers are more powerful. Mark Zuckerberg said during the Connect presentation that the open-ear speakers are 6 decibels louder than before. He said he took a call on a jet ski “a few weeks ago… it was great.”
When I cranked up the volume during my demo, I had to pause the music in order to hear the person next to me speaking. The glasses are also much more water resistant than their predecessors, with an IP67 rating that means they can be fully submerged.
Meta has also changed up the button placement on the glasses, putting the capture button on the bottom right side of the glasses instead of the top. There’s also a new “action button” that’s particularly intriguing. This is a customizable button that users can program to trigger specific actions. For example, it could start playing a specific Spotify playlist or it could trigger a hyperlapse video. It can also be mapped to actions that take advantage of Meta AI, like providing a surf report or identifying what you’re looking at. I’m not sure what I would use this button for, but I’m looking forward to trying it out when I get my hands on a pair for more than a few minutes.
The Oakley Meta Vanguard glasses will come with integrations for Strava and Garmin. In my demo, I walked on a treadmill while wearing a Garmin watch and the Vanguard glasses. This meant I could ask Meta AI for info about my heart rate and my pace. If you’re a Strava user, you can overlay the stats you get at the end of your run onto photos and videos from it.
Like the HSTN glasses, I have a feeling the Vanguard frames could be a bit… polarizing. Most people do not want to wear big wraparound sunglasses for daily activities. I definitely don’t! But Meta has added enough new features that the $499 sunglasses might actually make sense for athletes. I’ve been wearing Oakley ski goggles for years and I suspect a pair of Vanguard glasses could easily replace them in most conditions.
The Oakley Meta Vanguard glasses are available now for pre-order. They officially go on sale October 21.
In Ubisoft’s open-world game Watch Dogs (and its sequels), you can quickly scan any NPC you meet and discover facts about them, including their name, address, criminal record, and so on. And now two people have essentially created this tech in real life using Meta’s smart glasses and mostly off-the-shelf tech and software, providing a scary glimpse at our future.
As reported by 404 Media, two Harvard students have built working smart glasses that use facial recognition technology to automatically identify someone via their face. Not only that, but the glasses then use that information to track down other details about the stranger including their address, phone number, past photos, and family members. According to the two students, AnhPhu Nguyen and Caine Ardayfio, they did this to raise awareness of what is possible with current tech and they have no plans to release it publicly.
Nguyen and Ardayfio call the project I-XRAY and showed a demo of it in action earlier this week on social media. In the video posted to Twitter, the pair were able to identify multiple strangers without asking them for any details, though some of the data proved to be inaccurate when the duo talked to the people.
“The motivation for this was mainly because we thought it was interesting, it was cool,” Nguyen told 404 Media. Apparently, other people they showed it to also thought it was “really cool” and some suggested it could be used for “networking” or to “make funny videos.” However, thankfully, someone also mentioned to them how incredibly dangerous this tech could be in the wrong hands. “Some dude could just find some girl’s home address on the train and just follow them home,” said Nguyen.
As pointed out by 404 Media, this kind of smart-glasses facial-scanning tech has been around for a few years now. But Google and Facebook, two companies that worked on it, ultimately decided not to release their software.
But you don’t need big tech resources and money to build your own Watch Dogs super glasses that can instantly dox anyone you meet on the street. Nguyen and Ardayfio’s I-XRAY uses Meta’s Ray-Bans and the publicly available face recognition service Pimeyes to scan someone’s face with the glasses’ discreet cameras and then identify them. That info is then used to scrape the web for phone numbers, other photos, family information, and addresses.
“We would show people photos of them from kindergarten, and they had never even seen the photo before,” said Ardayfio. “Most people were surprised by how much data they have online.” One time, they were able to show a stranger their mom’s phone number after simply scanning their face.
“I think people could definitely take [the idea of I-XRAY] and run with it,” Ardayfio said. “If people do run with this idea, I think that’s really bad. I would hope that awareness that we’ve spread on how to protect your data would outweigh any of the negative impacts this could have.” The duo has included information on how to protect yourself in a large document about the project that is freely available online.
Meta Platforms chief executive Mark Zuckerberg unveiled Orion, the company’s first working prototype of augmented-reality glasses, during its annual Connect conference on Wednesday.
“It is a completely new kind of display architecture, with these tiny projectors in the arms of the glasses that shoot light into waveguides, that have nanoscale 3D structures etched into the lenses so they can diffract light and put holograms at different depths and sizes into the world in front of you,” Zuckerberg said.
Users will be able to interact with the glasses through hand-tracking, voice and a wrist-based neural interface.
Zuckerberg said Meta plans to make Orion smaller, sleeker and cheaper before releasing it to consumers.
Zuckerberg positioned AR technology as a sort of magnum opus when he first pivoted toward building immersive “metaverse” systems in 2021.
Delivering products, however, has been hampered by high development costs and technological hurdles.
The company’s metaverse unit Reality Labs lost $8.3 billion in the first half of this year, according to the most recent disclosures. It lost $16 billion last year.
Meta also announced a slew of new AI chatbot capabilities.
Meta AI will now respond to voice commands and users will have the option to make the assistant sound like celebrities like Judi Dench, John Cena and Awkwafina.
ZUCKERBERG: “Are live demos risky?”
META AI USING VOICE OF AWKWAFINA: “Live demos can be risky. Yes.”
ZUCKERBERG: “Thanks Awkwafina.”
Later this year, the company plans to add video-generation capabilities and the ability to perform some real-time language translations.
As 3D printing technology evolves, some people are using it to make things more practical.
One of the newer creations is 3D-printed glasses.
Inside Nebraska Furniture Mart’s new optical center is the Yuniku Design Center for 3D-printed glass frames.
“It’s going to take a scan of your face, and then allow us to 3D print glasses based off of your facial measurements, so we can get an exact fit so that they fit well,” said NFM optical supervisor Christian Robertson.
From there, people can virtually try on a variety of different frames.
“So, the Yuniku frames are in about the $250-280 range, which is about our average frame price,” he said.
Robertson thinks this is the future of eyeglasses, and he’s not the only one.
“I’ve been designing eyewear for about 25 years,” said frame designer Alan Tipp. “There’s a big problem if you go into any retail store, whether it’s sunglasses or optical, the department has to make some decisions as to what they’re going to carry.”
And that’s why he said there’s a growing market for 3D-printed frames.
“This technology behind that allows them to scan your face, interpret that data, get the right pantoscopic tilt angle on your face, get the right size of your A and B dimensions and then the distance between your nose,” he said.
Tipp prints custom frames out of his home studio in Elkhorn, and he’ll soon start selling 3D-printed glasses from his company Mtrl Objects.
“We started this process using artificial intelligence. So, I am prompting AI to deliver me concepts,” Tipp said.
With growing demand, Tipp hopes more people take a look at a new way to see the world around them.
Excited for Monday’s eclipse, but didn’t plan ahead? Here’s how to find a last-minute method to safely witness the historic event.
If you didn’t get ahold of special glasses for viewing Monday’s total solar eclipse yet, you’re not alone. Here are some tips on how to find a last-minute method to safely witness the historic event.
If you don’t have special glasses to view the eclipse, you run the risk of causing serious damage to your eyes. The American Astronomical Society put out a guide on which glasses are safe and how they can help shield your eyes from the sun’s glare.
If you didn’t plan ahead, here are some suggestions to make sure you can view the eclipse safely.
Without a pair of those special glasses, experts advise that you really shouldn’t look directly at the eclipse.
Make your own cereal box eclipse viewer
As an alternative, indirect way of witnessing the eclipse, NASA has a special pattern for a cereal box eclipse viewer to help you get a glimpse of history.
NASA has the pattern on its website; it requires a cereal box, a piece of heavy-duty foil, a piece of white cardboard, paper, markers, scissors and tape or glue.
With a handcrafted viewer, you can look through the opening in the box to watch the projected image of the sun go from a full circle to a crescent as the eclipse progresses.
Construct a pinhole projector with index cards, pushpins
If you have some index cards and pushpins lying around, you can create a pinhole projector for eclipse day. The Planetary Society has special instructions online on how to use the pushpin to make a small hole.
The society also has tips on where to stand to stay safe and get the best results from your makeshift projector.
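Both the cereal-box viewer and the index-card projector rely on the same bit of geometry: the sun spans roughly half a degree in the sky, so the projected image grows with the distance between the pinhole and the screen. As a rough sketch (the 0.53° figure and the function name are my own illustration, not part of NASA’s or the Planetary Society’s instructions), you can estimate the image size like this:

```python
import math

# The sun subtends roughly 0.53 degrees in the sky, so a pinhole
# projects a solar image whose diameter scales with the distance
# from the pinhole to the projection surface.
SUN_ANGULAR_DIAMETER_DEG = 0.53

def projected_sun_diameter_cm(pinhole_to_screen_cm: float) -> float:
    """Approximate diameter of the projected solar image, in cm."""
    return pinhole_to_screen_cm * math.tan(math.radians(SUN_ANGULAR_DIAMETER_DEG))

# A cereal box about 30 cm deep gives an image a bit under 3 mm across,
# while holding two index cards a meter apart yields just under 1 cm.
print(projected_sun_diameter_cm(30))
print(projected_sun_diameter_cm(100))
```

The practical takeaway is the same one the society’s tips make: the farther you hold the projection surface from the pinhole, the larger (but dimmer) the crescent you’ll see.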
CUPERTINO, CA—Muttering “Come on, come on” under his breath as he attempted to bind the two objects together, an unprepared Tim Cook was frantically taping a battery to a pair of sunglasses ahead of his keynote at the Apple Worldwide Developers Conference, sources confirmed Monday. “I’ll call them, uh, the all-new Apple Lookers—or no, how about the Apple Eye Mirrors?” said the company’s CEO, who wiped a bead of sweat off his forehead and cried out “Just a minute!” from behind stage as he struggled to tear off a piece of duct tape with his teeth. “Goddammit, why didn’t anyone tell me this thing was today? If they ask too many questions, I’ll tell them it’s a prototype. Too bad there’s not any time to paint them. I have some white-out in my desk that would have looked great.” At press time, Cook was giving a demonstration of the device by putting the sunglasses on upside down and muttering “Beep boop” out of the corner of his mouth.
Have you been told that your child needs glasses or contacts? Health experts estimate that almost half the U.S. population — 42% — has myopia (nearsightedness), a figure that has almost doubled over the past 3 decades and continues to grow. But being nearsighted is more than just an inconvenience: It can pose long-term hazards.
While glasses, contact lenses, eyedrops, and surgery can help kids see clearly, they don’t stop myopia from getting worse. Myopia occurs due to a slightly elongated eyeball or a steep curve in the cornea (the clear part in front of the eye). As a result, the light coming into the eye focuses in front of the retina instead of on it. This causes blurred vision.
“When the eye becomes longer, the tissue of the retina and the structures supporting the optic nerve stretch and become thinner,” says Andrei Tkatchenko, MD, PhD, associate professor of ophthalmic sciences at Columbia University Irving Medical Center in New York. “This thinning increases the risk of retinal detachment, cataracts, glaucoma, and even blindness. The faster myopia progresses and the more the prescription increases, the greater the risk of these diseases.”
Children with nearsighted parents are more likely to be nearsighted themselves, and scientists have identified a lot of myopia-related genes. But genes usually combine with a person’s environment to cause a disease. The top thing in the environment linked to myopia is close-up work such as reading or working on a computer or smart device. “Over the past 3 decades, the level of near work has significantly increased in most of the world,” Tkatchenko says. He and other researchers are studying new methods for treating myopia.
How to Slow Down Myopia
Can the advance of myopia be slowed or even halted to prevent long-term complications? Tkatchenko says yes: “There is a clearly defined treatable period between ages 8 and 25 during which there is the greatest progression of myopia, and myopia control is most effective during those years.”
For those diagnosed with severe myopia, known as high myopia, special contacts worn at night can help reshape the cornea and stabilize the eye to help with vision. That’s called orthokeratology, or Ortho-K. But experts aren’t sure if it can stop myopia from getting worse.
Multifocal contact lenses worn during the day work to slow the progression of myopia in kids while also creating clear vision. MiSight lenses are the only ones approved by the FDA to stop myopia progression. A 3-year study showed that they slowed myopia down by 43% and slowed eye growth by 36%. When researchers tested MiSight over a 6-year period, they saw that myopia slowed even in children who started with single-vision contact lenses (which make vision clearer but don’t slow progression of myopia) and later switched to MiSight.
Nicholas Onken, OD, assistant professor at the University of Alabama at Birmingham, says NaturalVue is another daily disposable multifocal contact lens that can be used on an off-label basis to help with myopia. This means the product is approved by the FDA but not to stop myopia from getting worse. “I use this lens as a second option in the rare event that a patient finds the MiSight lenses uncomfortable,” Onken says.
“Myopia control is most effective when initiated as early as possible, so I am often recommending MiSight as soon as a child fits the parameters,” Onken says. “In a child who is not yet nearsighted but who appears to be headed in that direction, I will make sure to mention myopia control contact lenses to the parents so they have time to think over it and digest the idea of putting their child in contact lenses.”
Doctors also use atropine eye drops for myopia control, but they’re also prescribed only on an off-label basis, Onken says. He only recommends atropine eye drops before MiSight contacts if kids are under 4 years old or so.
Researchers are looking into other potential treatments for myopia including red light exposure and chromatic aberrations, Onken says.
How to Deal With Myopia Now
It’s important to make sure your child has the proper prescription for their glasses or lenses. An under-corrected or, more importantly, over-corrected lens could make myopia worse.
But there’s one simple prescription that could protect your child from getting myopia in the first place: spending time outside. “A number of studies have shown that outside activities suppress the development of myopia,” Tkatchenko says. Scientists aren’t sure why this happens, but one theory is that outdoor light stimulates the release of chemicals that signal the eye to slow its growth to a normal rate.
“Go outside and play. That’s the best thing parents can tell their children to help prevent myopia,” Tkatchenko says.
By the Numbers
66%: Percentage increase in myopia in the U.S. between the early 1970s and early 2000s.
50%: Percentage of the world’s population that will have myopia by 2050.
4 in 10: Proportion of adults in the U.S. who have myopia.
1.25: Number of daily hours of outdoor time needed to cut the chance that a child will get myopia by 50%.