AI-powered visual search arrived on the Ray-Ban Meta sunglasses last year with some impressive (and alarming) capabilities, but a new one in the latest beta looks genuinely useful. It identifies landmarks and tells you more about them, acting as a sort of tour guide for travelers, Meta CTO Andrew Bosworth wrote in a post.

Bosworth shared some sample images showing the feature explaining why the Golden Gate Bridge is orange (to make it easier to see in fog), giving a history of San Francisco’s “painted lady” houses, and more. In these examples, the descriptions appeared as text beneath the images.

Mark Zuckerberg, meanwhile, showed off the new capability on Instagram with several videos filmed in Montana. In these clips the glasses respond with audio, verbally describing Big Sky Mountain and the history of the Roosevelt Arch, and even explaining (caveman-style) how snow forms.

Meta previewed the feature at its Connect event last year as part of the new “multimodal” capabilities that let the glasses answer questions based on your surroundings. That, in turn, became possible when all of Meta’s smart glasses gained access to real-time information (rather than being limited to a 2022 knowledge cutoff as before), powered in part by Bing Search.

The landmark feature is part of the glasses’ Google Lens-like functionality, which lets users “show” the AI things they see through the glasses and ask questions about them, such as a piece of fruit or foreign text that needs translating. It’s available to everyone in Meta’s early-access program, which is still limited in size. “For those who don’t have access to the beta yet, you can add yourself to the waiting list while we work to make this available to more people,” Bosworth said in the post.

