Meta Enhances Smart Glasses with AI, Translations, and Music Identification
Meta is adding new features to its Ray-Ban smart glasses: live AI conversations, real-time translations, and Shazam-powered music recognition. The live AI and translation features are limited to Early Access Program participants, while music identification is available to all users in the US and Canada. Check for software updates to access these features.

Meta is rolling out three new features for its Ray-Ban smart glasses: live AI conversations, real-time translations, and music recognition powered by Shazam. While the live AI and translation features are exclusive to participants in Meta’s Early Access Program, the Shazam functionality is available to all users in the US and Canada.
The live AI capability, first teased at Meta Connect 2024, lets you converse naturally with the AI assistant as it continuously scans your surroundings. For instance, while shopping, you can ask Meta’s AI for recipe suggestions based on the ingredients in front of you. A session lasts roughly 30 minutes on a full charge.
The live translation feature converts speech in real time between English and Spanish, French, or Italian. Users can choose to hear translations aloud or read them as transcripts, but must download the relevant language pairs in advance.
Music recognition via Shazam is straightforward: when a song is playing, just ask Meta’s AI to identify it.
To access these features, make sure your smart glasses are running the v11 software and the Meta View app is updated to version v196. Users not yet in the Early Access Program can apply online.
These updates underscore the growing role of AI assistants in smart glasses, a space where Meta and other tech giants continue to push the boundaries.