This is an archive article published on April 25, 2024

Meta’s Ray-Ban Smart Glasses can now describe their surroundings with Meta AI

With their new AI capabilities, Meta’s Ray-Ban smart glasses have transformed into an AI assistant on the go.

Meta said that it is rolling out Meta AI with Vision, so users can ask their glasses about what they are seeing and get helpful information. (Image: Meta)

Meta has announced multimodal capabilities for its Ray-Ban smart glasses. With this, the smart glasses will now be able to process and comprehend their user’s surroundings. While Meta’s AI assistant was earlier limited to audio interactions, it can now process visual data via the built-in camera and give relevant insights to the user.

With the latest AI capabilities, users can ask the glasses to translate text, identify objects, or provide other contextual information, all hands-free. Wearers can also share their view during video calls on WhatsApp and Messenger, allowing hands-free, real-time sharing. Meta said that the multimodal AI upgrade will be available as a beta feature to users across Canada and the US.

Meta has said that it is expanding the Ray-Ban Meta smart glasses collection with new styles. It is also adding WhatsApp and Messenger video calling so that users can share their view during calls. The tech giant said that it is rolling out Meta AI with Vision, so users can ask their glasses about what they are seeing and get helpful information.


The second generation of the smart glasses has been introduced in collaboration with eyewear brand EssilorLuxottica. Meta has also announced that it is expanding the Ray-Ban Meta smart glasses collection with new styles designed to fit more face shapes. The new glasses come in a hundred different custom frame and lens combinations on the Ray-Ban Remix platform, where users can mix and match options. The new styles are available in 15 countries, including the US, Canada, Australia and across Europe.

With Meta AI, users can now control the Meta glasses with voice commands – all they need to say is ‘Hey Meta!’ and then proceed with their prompts. The company began testing the multimodal AI updates in December last year, and it is now rolling them out as a full feature across the US and Canada.

The company’s introduction of multimodal capabilities into smart glasses is seen as a major leap forward. This transforms wearables into powerful, context-aware smart assistants.
