Meta Introduces Live AI, Translation, and Shazam Features on Smart Glasses
Meta has unveiled three new features for its Ray-Ban smart glasses: live AI, live translations, and Shazam integration. Live AI and live translations are available through Meta’s Early Access Program, while Shazam support is rolling out to all users in the US and Canada. Together, the updates bring conversational AI, real-time language translation, and music recognition to the glasses.

Meta’s just dropped some seriously cool upgrades for its Ray-Ban smart glasses, and honestly, it’s like something out of a sci-fi movie. Live AI, real-time translations, and Shazam—because who hasn’t been in a café, heard a banger, and thought, “What is this song?” If you’re part of Meta’s Early Access Program, you’re in for a treat with live AI and translations. But hey, Shazam’s up for grabs for everyone in the US and Canada, so no FOMO there.
Remember when Meta teased us at Connect 2024? Well, live AI is here, and it’s as slick as promised. You can chat with Meta’s AI assistant while it continuously sees what you see through the glasses’ camera—imagine it suggesting a pasta recipe because you’re staring at tomatoes. Just a heads-up, though: it runs for only about 30 minutes on a single charge, so maybe don’t start a cooking marathon.
And the translation feature? It’s like having a personal interpreter in your glasses, translating speech between English and Spanish, French, or Italian in real time. You can hear the translation through the glasses or read a transcript on your phone—whatever floats your boat.
Oh, and Shazam? Yeah, just ask Meta AI, and boom, you’ve got your song. Zuckerberg showed it off, and it looked effortless. But before you go all tech wizard, make sure your glasses are running the v11 software and your Meta View app is updated to v196. Not in the Early Access Program? No sweat—just apply online.
All of this lands as the smart glasses scene heats up around AI. Google just threw its hat in the ring with Android XR, and Meta’s CTO, Andrew Bosworth, is calling 2024 the year AI glasses go big. Buckle up, folks.