ReportWire

Tag: Karissa Bell

  • The first e-bike from Rivian spinoff Also has a virtual drivetrain

    Ever since Rivian spun off its “micromobility business” into a standalone startup called Also earlier this year, there’s been much speculation about what kind of vehicles the company is working on. Now, Also is showing off its first products: a lineup of e-bikes and two pedal-assisted electric quads.

    The TM-B e-bike is Also’s attempt at a do-it-all e-bike that can adapt to different use cases, whether that’s daily commuting, trail riding or kid- and cargo-hauling. It sports a modular frame that can also accommodate a bench seat or a rear cargo rack that supports up to 35 kg of weight.

    The different seats can be easily swapped out without extra tools. Instead, a button on the bike’s touchscreen display controls a latching mechanism that releases the seat. It only comes in one frame size, but Also says it should be able to adapt to “multiple body sizes,” thanks to different seat sizes and styles.

    The bench seat for the TM-B.

    (Karissa Bell for Engadget)

    The removable USB-C battery comes in two sizes: standard, which can power up to 60 miles of riding, and large, which maxes out at 100 miles of range. When you’re not riding, the batteries can also double as an external battery pack.

    In terms of power, the TM-B’s throttle tops out at 20 mph, though the bike can reach speeds up to 28 mph with added pedaling. Also is taking an interesting approach to its drive system, with a setup it’s labeled “DreamRide.” Instead of a mechanical connection between the bike’s rear wheel and the pedals, the TM-B uses “software-defined pedaling.”

    In practice, this means your pedaling is actually feeding a generator that charges the bike’s battery rather than directly pushing you forward. However, an Also rep told me that there is a “limp mode” for when the bike runs out of juice, so riders won’t get stranded. In those situations, pedaling will give the bike enough charge to hopefully get you to a spot where you can recharge.

    Also has envisioned the TM-B in a lot of scenarios, many of which involve hauling a lot of cargo.

    (Karissa Bell for Engadget)

    Software-controlled pedaling probably won’t appeal to purists, but Also says it enables a much more customizable riding experience. When in auto mode, the bike will adapt to the speed you’re pedaling, though you can push on the throttle to get a boost. There’s also a manual mode that lets you select a “gear” (these are also software-controlled).

    It also uses regenerative braking, so tapping on the brakes helps recharge the battery. In my short test ride, though, I found that I didn’t need to use the brakes much, because the bike slowed down pretty quickly when I stopped pedaling, kind of like taking your foot off the accelerator in an EV.

    The Also app and Portal display.

    (Also)

    Given the bike’s roots at Rivian, it’s not surprising that there are also a bunch of other tech-enabled features, including a 5-inch touchscreen display, called “Portal,” that supports navigation, music playback and calling features via an accompanying app. There’s also a built-in security system that automatically locks the frame and rear wheel when you walk away. On the handlebars, there are customizable controls that can be used to adjust the volume and music playback, answer calls or navigate through the display.

    Customizable controls on the left side of the handlebar and a throttle on the right.

    (Karissa Bell for Engadget)

    Also is selling the TM-B in three configurations. The first to ship, next spring, will be the $4,500 TM-B Limited Launch Edition, which has a range of up to 100 miles, supports standard and sport ride modes and features transparent purple accents. The $4,500 TM-B Performance has the same features as the Limited Launch Edition but a slightly different color scheme, and will be available within the “first half” of 2026. Finally, there’s a base-level TM-B model with a range of up to 60 miles that only comes with standard ride modes. Also hasn’t announced an exact price, but says it will cost less than $4,000 when it ships “later in 2026.”

    Pre-orders for the Launch Edition are open now and the other two bikes are available to reserve with a $50 deposit. The bikes will also be on display in Rivian showrooms later this year.

    Also’s quad for commercial use cases (left) and a smaller quad for families (right).

    (Also)

    The company also previewed two electric, pedal-assisted quads it’s calling TM-Q. The smaller quad is apparently meant for “families and individuals seeking a safe, compact alternative to cars” that can still haul “significant loads.” The larger TM-Q, on the other hand, is meant for commercial deliveries.

    Also has partnered with Amazon to develop fleets of such vehicles that can be used by delivery drivers. Both quads are intended to be used in bike lanes, according to Also.

    Also will partner with Amazon for a Prime-branded TM-Q.

    (Karissa Bell for Engadget)

    The company didn’t share details about when these vehicles might be available or how much they’ll cost.

    Update, October 22, 2025, 2:29PM PT: Added more details and photos from Also’s launch event.

  • Ray-Ban Meta (2nd Gen) review: Smart glasses are finally getting useful

    In a lot of ways, Meta hasn’t changed much with its second-gen Ray-Ban glasses. The latest model has the same design and largely the same specs as the originals, with two important upgrades: longer battery life and improved video quality.

    At the same time, the Ray-Ban Meta glasses have a lot of features that didn’t exist when I first reviewed them two years ago, largely thanks to AI. And with the release of its second-generation frames, there’s still a lot to look forward to, like new camera features and AI-powered audio. The good news is that Meta isn’t limiting these updates to its newest frames, so if you have an older pair you’ll still see the new features. But, if you’ve been on the fence about getting a pair, there’s never been a better time to jump in. 

    Engadget

    Meta’s second-generation smart glasses are becoming a genuinely useful accessory.

    Pros

    • Noticeably better battery life
    • YouTuber-friendly 3K video
    • Meta AI translations are a game-changer for travel
    Cons

    • Framing POV photos and video is still a challenge
    • Pricey lens upgrades

    $379 at Meta

    Same look, (slightly) better specs

    Meta and EssilorLuxottica haven’t strayed too far from the playbook they’ve used for the last two years. The second-generation Ray-Ban Meta glasses come in a handful of frame styles with a number of color and lens variations that start at $379. I tried out a pair of Wayfarer frames in the new “shiny cosmic blue” color with clear transition lenses. 

    I personally prefer the look of the slightly narrower Headliner frames, but the second-gen glasses still look very much like traditional Wayfarer glasses. I’ve never been a fan of transition lenses for my own prescription eyewear, but I’m starting to come around on them for smart glasses. As Meta has improved its cameras and made its AI assistant more useful, I’ve found more reasons to wear the glasses indoors.

    The second-generation Ray-Ban Meta glasses come with clear lenses, with polarized and transition lenses available as an upgrade.

    (Karissa Bell for Engadget)

    Also, if you’re going to be paying $300 or more for a pair, you might as well be able to use them wherever you are. It also helps that the transition lenses on the second-gen Ray-Ban Meta glasses get a bit darker than my first-gen Wayfarers with transition lenses. Upgrading from the standard clear lenses will cost you, though. Frames with polarized lenses start at $409, transitions start at $459 and prescription lenses can run significantly more. 

    As with the recent Oakley Meta HSTN glasses, the second-gen Ray-Bans come with a longer battery life and better camera. Meta says the battery can last up to eight hours on a single charge with “typical use.” I was able to squeeze a little more than five and a half hours of continuous music playback. That’s a noticeable step up from the battery on my original pair which, after two years, is starting to show its age. The glasses also now support higher-resolution 3K video recording, but the 12MP wide-angle lens shoots the same 3,024 x 4,032 pixel portrait photos as earlier models.

    The second-gen glasses have the same design as the first-gen, with a capture button on the right side of the frames. The charging case provides an additional 48 hours of battery life.

    (Karissa Bell for Engadget)

    For videos, there’s a noticeable quality boost, but I still think it’s probably not necessary for most people who primarily share their clips on social media. It does make the glasses more appealing for creators, though, and judging by the number of them in attendance at Connect, I suspect Meta sees them as a significant part of its user base. I’m also looking forward to Meta adding the ability to record Hyperlapse and slow-motion videos, as I think these may be more interesting than standard POV footage for everyday activities.

    Meta AI + what’s coming

    Two years ago, I was fairly skeptical of Meta’s AI assistant. But since then, Meta has steadily added new capabilities. Of those, the glasses’ translation abilities have been my favorite. On a recent trip to Argentina, I used live translation to follow along with a walking tour of the famous Recoleta cemetery. It wasn’t perfect — the feature is meant more for back-and-forth conversations rather than extended monologues — but it allowed me to participate in a tour I would have otherwise had to skip. (A word of warning: using the live translation for an extended period of time is a major battery killer.)

    Meta AI can also provide context and translations in other scenarios, too. I spent some time in Germany while testing the latest second-gen Ray-Ban glasses and found myself repeatedly asking Meta to translate signs and notices. For example, here’s how Meta AI summarized this collection of signs. 

    Meta AI was able to translate these signs (left) when I asked it "what do these signs say?"

    (Karissa Bell for Engadget)

    As I wrote in my review of the Oakley Meta HSTN glasses, I still haven’t found much use for Live AI, which lets you interact with the assistant in real-time and ask questions about your surroundings. It still feels like more of a novelty, but it makes for a fun demo to show off to friends who have never tried “AI glasses.” There are also some very interesting accessibility use cases that take advantage of the glasses’ cameras and AI capabilities. Features like “detailed responses” and support for “Be My Eyes” show how smart glasses can be particularly impactful for people who are blind or deal with low vision.

    One AI-powered feature I haven’t tried out yet is Conversation Focus, which can adjust the volume of the person you’re speaking to while dampening the background noise. Meta teased the feature at Connect, but hasn’t said exactly when it will be available. But if it works as intended, I could see it being useful in a lot of scenarios.

    I’m also particularly intrigued by Meta’s Connect announcement that it will finally allow third-party developers to create their own integrations for its smart glasses. There are already a handful of partners, like Twitch and Disney, which are finding ways to take advantage of the glasses’ camera and AI features. Up to now, Meta AI’s multimodal tools have shown some promise, but I haven’t really been able to find many ways to use the capabilities in my day-to-day life. 

    Allowing app makers onto the platform could change that. Disney has previewed a smart glasses integration for its parks that would allow visitors to get real-time info about rides, attractions and other amenities as they walk around. Golf app 18Birdies has shown off an app that delivers stats and other info while you’re on the course.

    Should you buy these? And what about privacy?

    When the Ray-Ban Meta glasses came out two years ago, this was a pretty straightforward question to answer. If the idea of smart glasses with a good camera and open-ear speakers appealed to you, then buying a pair was a no-brainer. 

    Now, it’s a bit more complicated. Meta is still updating its first-gen Ray-Ban glasses with significant new features, like Conversation Focus, new camera modes and third-party app integrations. So if you already have a pair, you won’t be missing out on a ton if you don’t upgrade. (And with a starting price of $299, the first-gen glasses are still solid if you want a more budget-friendly option.)

    There are also other options to consider. The upcoming Oakley Meta Vanguard glasses come with more substantial hardware upgrades and other unique features that will appeal to athletes and anyone who spends a lot of time outdoors. And on the higher end, there are the $799 Meta Ray-Ban Display glasses that blend AR elements with its existing features in an intriguing way. 

    Meta has already previewed several new features, like new camera modes and Conversation Focus.

    (Karissa Bell for Engadget)

    I also have many of the same concerns about privacy as I did when I reviewed Meta’s first Ray-Ban branded glasses back in 2021. I’m well aware Meta already collects an extraordinary amount of data about us through its apps, but glasses just feel like they provide much more personal, and potentially invasive, access to our lives.

    Meta has also made some notable changes to the privacy policy for its glasses in recent months. It no longer allows users in the United States to opt out of storing voice recordings in its cloud, though it’s still possible to manually delete recordings in the Meta AI app. 

    The company says it won’t use the contents of the photos and videos you capture to train its AI models or serve ads. However, images of your surroundings processed for the glasses’ multimodal features like Live AI can be used for training purposes (these images aren’t saved to your device’s camera roll). Meta’s privacy policy also states that it uses audio captured via voice commands for training. And it should go without saying, but anyone using Meta’s glasses should be very careful about sharing their interactions with its AI app, as a bunch of users have already seemingly inadvertently shared a ton of highly-personal interactions with the world. 

    If any of that makes you uncomfortable, I’m not here to convince you otherwise! We’re still grappling with the long-term privacy implications of generative AI, much less generative AI on camera-enabled wearables. At the same time, as someone who has been wearing Meta’s smart glasses on and off for more than four years, I can say that Meta has been able to turn something that once felt gimmicky into a genuinely useful accessory. 

  • Meta Ray-Ban Display hands-on: Discreet and intuitive

    I’ve been testing smart glasses for almost a decade. And in that time, one of the questions I’ve been asked the most is “oh, but can you see anything in them?” For years, I had to explain that no, glasses like that don’t really exist yet.

    That’s no longer the case. And while I’ve seen a bunch of glasses over the last year that have some kind of display, the Meta Ray-Ban Display glasses feel the closest to fulfilling what so many people envision when they hear the words “smart glasses.”

    To be clear, they don’t offer the kind of immersive AR that’s possible with Meta’s Orion prototype. In fact Meta considers “display AI glasses” to be a totally separate category from AR. The display is only on one lens — the right — and its 20-degree field of view is much smaller than the 70 degrees on Orion. That may sound like a big compromise, but it doesn’t feel like one.

    Karissa Bell for Engadget

    The single display feels much more practical for a pair of glasses you’ll want to wear every day. It’s meant to be something you can glance at when you need it, not an always-on overlay. The smaller size also means that the display is much sharper, at 42 pixels per degree. This was especially noticeable when I walked outside with the glasses on; images on the display looked even sharper than in indoor light, thanks to automatic brightness features.

    I also appreciated that you can’t see any light from the display when you’re looking at someone wearing the glasses. In fact, the display itself is only barely noticeable when you look at them up close.

    Having a smaller display also means that the glasses are cheaper, at $799, and that they don’t look like the chunky AR glasses we’ve seen so many times. At 69 grams, they are a bit heavier and thicker than the second-gen Meta Ray-Bans, but not much. As someone who has tried on way too many pairs of thick black smart glasses, I’m glad Meta is offering these in a color besides black. All Wayfarer-style frames look wide on my face but the lighter “sand” color feels a lot more flattering.

    The Meta Ray-Ban Display (left) and second-gen Ray-Ban Meta glasses (right.) The display glasses are a little thicker.

    (Karissa Bell for Engadget)

    The Meta Neural Band wristband that comes with the display glasses functions pretty much the same as the band I used on the Orion prototype. It uses sensors to detect the subtle muscle movements on your hand and wrist and can translate that into actions within the glasses’ interface.

    It’s hard to describe, but the gestures for navigating the glasses’ interface work surprisingly well. I can see how it could take a little time to get used to the various gestures for navigating between apps, bringing up Meta AI, adjusting the volume and other actions, but they are all fairly intuitive. For example, you use your thumb to swipe along the top of your index finger, sort of like a D-pad, to move up and down and side to side. And you can raise and lower the speaker volume by holding your thumb and index finger together and rotating your wrist right or left like it’s a volume knob.

    It’s no secret that Meta’s ultimate goal for its smart glasses is to replace, or almost replace, your phone. That’s not possible yet, but having an actual display means you can look at your phone a whole lot less.

    The Neural Wristband.

    (Karissa Bell for Engadget)

    The display can surface incoming texts, navigation with map previews (for walking directions), and info from your calendar. I was also able to take a video call from the glasses — unlike Mark Zuckerberg’s attempted live demo during his keynote — and it was way better than I expected. I could not only clearly see the person I was talking to and their surroundings, I could turn on my glasses’ camera and see a smaller version of the video from my side.

    I also got a chance to try the Conversation Focus feature, which gives you live captions of the person you’re speaking with, even in a loud environment where they may be hard to hear. There was something very surreal about getting real-time subtitles to a conversation with a person standing directly in front of me. As someone who tries really hard not to look at screens when I’m speaking to people, it almost felt a little wrong. But I can also see how this would be incredibly helpful to people who have trouble hearing or processing conversations. It would also be great for translations, something Meta AI already does very well.

    I also appreciated that the wristband allows you to invoke Meta AI with a gesture so you don’t always have to say “Hey Meta.” It’s a small change, but I’ve always felt weird about talking to Meta AI in public. The display also addresses another one of my longtime gripes with the Ray-Ban Meta and Oakley glasses: framing a photo is really difficult. But with a display, you can see a preview of your shot, as well as the photo after the fact, so you no longer have to just snap a bunch and hope for the best.

    I’ve only had about 30 minutes with the glasses, so I don’t really know how having a display could fit into my daily routine. But even after a short time with them, they really do feel like the beginning of the kind of smart glasses a lot of people have been waiting for.

    Karissa Bell

  • Engadget Podcast: A deeper dive into the iPhone 17 and iPhone Air

    This week, managing editor Cherlynn Low and senior reporter Karissa Bell are joined by The Verge’s Allison Johnson to talk all about the new iPhone Air, iPhone 17 Pro and iPhone 17. We also answer some questions from Threads and talk about our hopes and dreams for the next Apple event. Also, Devindra and Ben chat about some recent news, including a truly awful AI podcasting company.

    Subscribe!

    Topics

    • Cherlynn, Karissa, and a special guest break down the iPhone 17 news from Apple headquarters – 1:04

    • Notes from the iPhone Air hands-on – 14:59

    • Once again, a big Apple event with no mention of Apple Intelligence – 40:27

    • Animated movie Critterz will use OpenAI’s tech to try to make a CGI movie on a shoestring budget – 59:24

    • Inception Point AI wants to use virtual hosts to make 5,000 new podcast episodes a week – 1:04:26

    • David Zaslav thinks HBO Max should be more expensive, because of course he does – 1:23:27

    • Pop culture picks – 1:28:29

    Credits

    Hosts: Devindra Hardawar and Cherlynn Low
    Guests: Karissa Bell and Allison Johnson
    Producer: Ben Ellman
    Music: Dale North and Terrence O’Brien

    Devindra Hardawar
