Meta rolls out live language translations and Shazam to its smart glasses


Meta just announced three new features are rolling out to its Ray-Ban smart glasses: live AI, live translations, and Shazam. Both live AI and live translation are limited to members of Meta’s Early Access Program, while Shazam support is available for all users in the US and Canada.
Both live AI and live translation were first teased at Meta Connect 2024 earlier this year. Live AI allows you to naturally converse with Meta’s AI assistant while it continuously views your surroundings. For example, if you’re perusing the produce section at a grocery store, you’ll theoretically be able to ask Meta’s AI to suggest some recipes based on the ingredients you’re looking at. Meta says users will be able to use the live AI feature for roughly 30 minutes at a time on a full charge.
Meanwhile, live translation allows the glasses to translate speech in real time between English and Spanish, French, or Italian. You can choose either to hear translations through the glasses themselves or to view transcripts on your phone. You do have to download language pairs beforehand, as well as specify which language you speak and which your conversation partner speaks.
Shazam support is a bit more straightforward. All you have to do is prompt Meta AI when you hear a song, and it should be able to tell you what you’re listening to. You can watch Meta CEO Mark Zuckerberg demo it in this Instagram reel.
If you don’t see the features yet, check to make sure your glasses are running the v11 software and that you’re also running v196 of the Meta View app. If you’re not already in the Early Access Program, you can apply via this website.
The updates come just as Big Tech is pushing AI assistants as the raison d’être for smart glasses. Just last week, Google announced Android XR, a new OS for smart glasses, and specifically positioned its Gemini AI assistant as the killer app. Meanwhile, Meta CTO Andrew Bosworth just posted a blog opining that “2024 was the year AI glasses hit their stride.” In it, Bosworth also asserts that smart glasses may be the best possible form factor for a “truly AI-native device” and the first hardware category to be “completely defined by AI from the beginning.”