Meta Enhances Smart Glasses with AI-Powered Features

Meta is rolling out significant upgrades to its Ray-Ban Meta smart glasses, featuring real-time AI video capabilities and multilingual translation. The update lets users converse with Meta AI without repeating the wake word and enables smooth, real-time language translation.

Meta’s Ray-Ban Meta smart glasses are getting a serious AI upgrade, and it’s pretty cool. If you’re in the U.S. or Canada and got in early, firmware v11 is your ticket to chatting with Meta AI without constantly saying ‘Hey Meta’—because let’s be honest, that gets old fast. Dubbed live AI, this feature lets you have a back-and-forth with your AI assistant like you’re texting a friend who actually knows everything.

But wait, there’s more. Ever looked around and thought, ‘What am I even looking at?’ Now, you can ask your glasses. The real-time AI video feature uses the camera to answer questions about your surroundings. It’s like having a tour guide in your frames, minus the cheesy jokes. Meta showed this off at their Connect dev conference, and it’s clear they’re not just playing in the smart glasses space—they’re leading the pack, giving even Google’s Project Astra a run for its money.

And for the polyglots out there, the glasses now offer live translation between English and a handful of other languages (Spanish, French, Italian—you know, the usual suspects). Chat in one of those, and your glasses will relay the English translation through your phone. No more nodding along pretending you understand.

Oh, and Shazam’s in the mix now. Heard a bop and can’t name that tune? Your glasses have got you covered. Meta’s upfront about these features being a work in progress, but hey, Rome wasn’t built in a day. The Shazam addition is just the cherry on top of an already impressive update.
