Meta updates Ray-Ban smart glasses with real-time AI video, reminders, and QR code scanning

Meta CEO Mark Zuckerberg announced updates to the company’s Ray-Ban Meta smart glasses at Meta Connect 2024 on Wednesday. Meta continued to make the case that smart glasses can be the next big consumer device, announcing some new AI capabilities and familiar features from smartphones coming to Ray-Ban Meta later this year.

Some of Meta’s new features include real-time AI video processing and live language translation. Other announcements — like QR code scanning, reminders, and integrations with iHeartRadio and Audible — seem to give Ray-Ban Meta users the features from their smartphones that they already know and love.

Meta says its smart glasses will soon have real-time AI video capabilities, meaning you can ask the Ray-Ban Meta glasses questions about what you’re seeing in front of you, and Meta AI will verbally answer you in real time. Currently, the Ray-Ban Meta glasses can only take a picture and describe that to you or answer questions about it, but the video upgrade should make the experience more natural, in theory at least. These multimodal features are slated to come later this year.

In a demo, users asked Ray-Ban Meta questions about a meal they were cooking or about city scenes unfolding in front of them. The real-time video capabilities mean Meta’s AI should be able to process live action and respond out loud.

This is easier said than done, however, and we’ll have to see how fast and seamless the feature is in practice. We’ve seen demonstrations of these real-time AI video capabilities from Google and OpenAI, but Meta would be the first to launch such features in a consumer product.

Zuckerberg also announced live language translation for Ray-Ban Meta. English-speaking users can talk to someone speaking French, Italian, or Spanish, and their Ray-Ban Meta glasses should be able to translate what the other person is saying into their language of choice. Meta says this feature is coming later this year, with more languages to follow.

The Ray-Ban Meta glasses are getting reminders, which will allow people to ask Meta AI to remind them about things they look at through the smart glasses. In a demo, a user asked their Ray-Ban Meta glasses to remember a jacket they were looking at so they could share the image with a friend later on.

Meta announced that integrations with Amazon Music, Audible, and iHeart are coming to its smart glasses. This should make it easier for people to listen to music on their streaming service of choice using the glasses’ built-in speakers.

The Ray-Ban Meta glasses will also gain the ability to scan QR codes and phone numbers. Users can ask the glasses to scan something, and the scanned link will immediately open on their phone with no further action required.

The smart glasses will also be available in a range of new Transitions lenses, which respond to ultraviolet light to adjust to the brightness of your surroundings.
