Meta’s Smart Glasses Gain Live AI and Live Translation
Meta today added new features to its Ray-Ban smart glasses, including live translation and live AI. With live AI, the glasses can see what the wearer sees through the built-in camera and hold real-time conversations about it.
According to Meta, the glasses can provide hands-free help with meal prep, gardening, exploring a new neighborhood, and more. Questions can be asked without the “Hey Meta” wake word, and the AI maintains context across requests, so follow-up questions can reference prior queries. Meta says that eventually, the AI will be able to “give useful suggestions before you even ask.”
Along with live AI, there’s a new live translation feature that works in real time between English and Spanish, French, or Italian. When someone speaks one of those three languages, the glasses translate what they say into English through the speakers or on a connected smartphone, and vice versa.
The Meta glasses can now use Shazam to identify songs: ask “Hey Meta, what is this song?” and Shazam will provide the song title. Shazam is now owned by Apple and is deeply integrated into iOS.
All of these features are part of the Early Access Program, which any Meta glasses owner can sign up for on Meta’s website, though there are a limited number of slots for customers in the United States and Canada.
Meta’s smart glasses have attracted considerable consumer interest, and rumors suggest that Apple may enter the smart glasses market with a similar device. Back in October, Bloomberg’s Mark Gurman said that Apple is considering a pair of smart glasses comparable to the Meta Ray-Bans, offering Siri support, integrated cameras, and more.