Meta Introduces Live AI, Translation, and Shazam Features on Smart Glasses
Meta has unveiled three new features for its Ray-Ban smart glasses: live AI, live translation, and Shazam integration. The updates bring conversational AI, real-time speech translation, and music recognition to the glasses, with the first two limited to Meta's Early Access Program.

Meta is rolling out three new features to its Ray-Ban smart glasses: live AI, live translation, and Shazam. Live AI and live translation are limited to members of Meta's Early Access Program, while Shazam support is available to all users in the US and Canada.
First previewed at Meta Connect 2024, live AI lets users converse naturally with Meta's AI assistant while it continuously views their surroundings. For example, while browsing a grocery store, it can suggest recipes based on the items in view. Meta says live AI runs for about 30 minutes at a time on a full charge.
Live translation provides real-time speech translation between English and Spanish, French, or Italian, with the option to hear translations through the glasses or view written transcripts.
Shazam support lets users identify songs simply by prompting Meta AI, as CEO Mark Zuckerberg demonstrated. To access the new features, make sure your glasses are running the v11 software and that the Meta View app is updated to v196. Users not yet in the Early Access Program can apply through Meta's website.
The update arrives amid a surge of AI-driven smart glasses, with Google recently announcing its Android XR OS. Meta CTO Andrew Bosworth has called 2024 a pivotal year for AI glasses.