Meta Introduces Live AI, Translations, and Shazam for Ray-Ban Smart Glasses

Meta unveils new features for Ray-Ban smart glasses, including live AI, live translations, and Shazam support.

Meta has announced the rollout of three new features for its Ray-Ban smart glasses: live AI, live translations, and Shazam. The live AI and live translation features are currently exclusive to members of Meta’s Early Access Program, while Shazam support is available to all users in the US and Canada.

Live AI enables natural conversations with Meta’s AI assistant, allowing users to interact with the assistant while it continuously views their surroundings. For instance, while browsing the produce section at a grocery store, users can ask the AI to suggest recipes based on the ingredients they are looking at. This feature can be used for approximately 30 minutes on a full charge.

Live translation allows the glasses to translate speech in real time between English and Spanish, French, or Italian. Users can choose to hear translations through the glasses or view transcripts on their phone. Language pairs must be downloaded in advance, and users need to specify their own language and that of their conversation partner.

Shazam support is more straightforward; users simply need to prompt the Meta AI when they hear a song, and the glasses will identify it. These features are part of the v11 software update for the glasses and require the v196 version of the Meta View app.

These updates come amid Big Tech's push to make AI assistants the core experience of smart glasses. Meta CTO Andrew Bosworth recently said that 2024 was the year AI glasses hit their stride, suggesting that smart glasses may be the best form factor for a truly AI-native device.
