
Ray-Ban Meta Glasses Add Live AI, Live Translation, & Shazam Support
Whether you already own a pair or you’re hoping Santa drops one down the chimney, Ray-Ban Meta glasses are the gift that keeps on giving. Right out of the box, they help you experience the world, share your POV, and capture the moments that matter, completely hands-free.
And with regular software updates, they keep getting better over time. Since Connect, we’ve released several new features, including reminders, the ability to use your voice to search and play content on Spotify and Amazon Music, Be My Eyes integration, adaptive volume, and the option to set Meta AI on your glasses to celebrity voices: Awkwafina, John Cena, Keegan-Michael Key, and Kristen Bell under the US voice options, and Dame Judi Dench under the UK voice options.
Starting today, we’ll begin rolling out the v11 software update, which makes your Ray-Ban Meta glasses more capable, useful, and fun than ever before. Make sure your glasses and Meta View app are updated to the latest versions. Let’s take a look at what’s new.
Live AI & Live Translation via Our Early Access Program
Members of our Early Access Program are about to tap into two new superpowers that we announced at Connect 2024.
The first is live AI, which adds video to Meta AI on your glasses. During a live AI session, Meta AI can see what you see continuously and converse with you more naturally than ever before. Get real-time, hands-free help and inspiration with everyday activities like meal prep, gardening, or exploring a new neighborhood. You can ask questions without saying “Hey Meta,” reference things you discussed earlier in the session, and interrupt anytime to ask follow-up questions or change topics. Eventually live AI will, at the right moment, give useful suggestions even before you ask.
The second is live translation, which Meta Founder & CEO Mark Zuckerberg demoed live onstage at Connect this year. With this new feature, your glasses will be able to translate speech in real time between English and Spanish, French, or Italian. When you’re talking to someone speaking one of those three languages, you’ll hear what they say in English through the glasses’ open-ear speakers or view it as a transcript on your phone, and vice versa. Not only is this great for traveling, it should help break down language barriers and bring people closer together.
As you try these new experiences, bear in mind that these AI features are still in testing and may not always get it right. We’re continuing to learn what works best and improving the experience for everyone. Your feedback will help make Ray-Ban Meta glasses better and smarter over time. Our Early Access Program is open to Ray-Ban Meta glasses owners in the US and Canada. Those interested can enroll here. Please make sure you have the latest version of the app installed and your smart glasses are updated as well.
Name That Tune With Shazam Integration — Available Now
We all know the feeling: You’re out on the town when an absolute banger starts playing, but it’s either new, obscure, or an old favorite whose title or artist escapes you at that particular moment.
Now, your glasses can do the heavy lifting for you. When you hear a great track out in the wild, you can just say, “Hey Meta, what is this song?” and voilà! Hands-free music recognition. Shazam on Ray-Ban Meta glasses is available in the US and Canada.
The Best Is Yet to Come
We’re always on the lookout for new ways to make Ray-Ban Meta glasses better, and we value your feedback. We’ll be back with more software updates—and maybe some surprises—in 2025. Until then, follow us on Threads and Instagram for tips, tricks, and more.