Meta’s $300 smart glasses, made in collaboration with Ray-Ban, let users take pictures, record videos, make calls, listen to music, and more. Now, new AI features are coming to Meta’s Ray-Ban smart glasses.
New AI Features in Meta’s Ray-Ban Smart Glasses
Meta’s Ray-Ban smart glasses will gain new artificial intelligence software next month. With the new AI software, users will be able to do a variety of things, such as scanning landmarks, translating languages, and identifying objects and animals.
To ask the AI assistant a question, users say “Hey Meta” followed by a prompt or question. The assistant replies in a computer-generated voice, heard through speakers built into the frames of the glasses.
The New York Times got early access to the new AI features of Meta’s Ray-Ban smart glasses and tested them. The glasses correctly identified pets and artwork, but struggled with animals that were far away or behind cages. They also failed to identify an exotic fruit, even after multiple tries. Meta acknowledges that the features are not perfect and says it is still working to improve them.
The New York Times also tried the AI translation feature and found that it currently supports only English, Spanish, Italian, French, and German.
When will the new features be available?
The AI features will roll out to US users next month. US users can also get early access by joining a waitlist.