Smart glasses with microphones, cameras, onboard computing and even AR (augmented reality) have been the stuff of spy thrillers for decades. But the reality of face-worn wearables with genuinely meaningful utility has become something of a white whale to the consumer tech industry.
That’s not for lack of trying, of course. I trialled Google Glass at the company’s offices back in 2013, I was among the first in the UK to purchase the original Snapchat Spectacles in 2017, and big names like Sony and Oppo have been dabbling with the form factor for years.
Despite all their attempts, however, no company has cracked the balance of form and function needed for mainstream acceptance or adoption.
Foundry | Alex Walker-Todd
The most recent entrant into the space has been the Ray-Ban Meta smart glasses (pictured), which, despite originally launching in late 2023, resurfaced in the headlines in October this year, thanks to the expansion of Meta AI functionality into more markets internationally, including the UK.
This marks the biggest distinction between this product and the first foray into smart glasses from Meta and Luxottica (owner of the Ray-Ban brand). 2021’s Ray-Ban Stories contained a similarly interesting assortment of technologies but ultimately proved to be less than the sum of their parts, and AI could well have been the missing ingredient.
Different shades of AI
One of the big hooks of Meta AI’s inclusion in the Ray-Ban Meta glasses is its newfound multimodality. Using a feature dubbed ‘look and ask’, the glasses can grab a snap of whatever’s in front of you and ‒ using machine vision paired with Meta’s Llama AI model ‒ explain what it is you’re looking at.
Depending on your request, you can even use ‘look and ask’ to quickly summarise signage or documentation, extract dietary information from food packaging or learn about new recipes, inspired solely by the ingredients in front of you. For the majority of users in the UK right now, however, this prominent facet of the Ray-Ban Meta experience remains inaccessible, with no set date for its addition.
Aside from those who’ve updated to the latest Ray-Ban Meta software by way of a VPN regionally set to the US, or who’ve been whitelisted to access the Meta View app’s beta updates, most UK users are still without this device-defining upgrade.
The reason? A combination of the EU’s AI Act and GDPR, which have collectively hampered Meta’s AI efforts in the region, resulting in a limited experience for users locally.
Meta AI on your face
If, like me, you’re a UK-based Ray-Ban Meta user, you’re likely acutely aware of the limitations the integrated Meta AI experience currently suffers from.
Having Meta’s assistant ever-present and completely hands-free is a pretty neat boon in day-to-day use; more accessible than turning to Gemini or Siri on my phone and less distracting, as there’s no interface to look at. I can still fire off conventional digital assistant requests ‒ like checking the weather or getting step-by-step instructions to bake the perfect brownies ‒ but beyond that, the experience still feels decidedly sparse and incomplete.
On paper, the pairing of the Ray-Ban Meta’s form factor and hardware configuration is a recipe for success, in terms of making AI interaction meaningful in day-to-day use.
Without the machine vision-led multimodal component, however, these smart glasses’ most helpful features are instead photo and video capture, integrated Bluetooth audio and the ability to field calls with solid voice clarity (courtesy of a quintet of microphones set about the frame) ‒ a far less ‘smart’ skill set than the one Meta wishes to focus on.
Great for you, less so for everyone else
When it comes to face-worn wearable tech, as mentioned up top, no manufacturer has yet cracked the code to mainstream adoption or acceptance, but the Ray-Ban Metas are arguably the industry’s best attempt yet.
Although the company’s augmented reality efforts remain reserved for its Meta Quest XR headsets and Meta Orion concept, the addition of AI renders the Ray-Ban Meta specs the most approachable smart glasses to date. That said, the same privacy concerns levelled at the previous Ray-Ban Stories, not to mention at Meta directly, aren’t exactly deflecting off this latest generation of intelligent spectacles.
If anything, such concerns are the driving force behind the stunted expansion of the glasses’ game-changing Meta AI integration and multimodality (outside of the US, Canada and Australia). My frustrations are squarely from the perspective of a user who knows they’re unable to fully utilise the cutting-edge tech they have to hand.
That said, I already know that if I wait, my experience is going to improve. You could argue that the opposite is true for everyone else on the other side of Ray-Ban Meta’s camera lens. The lack of multimodality actually grants bystanders in the UK a greater degree of privacy than those in regions where the glasses’ full functionality is already available.
Unless you’re someone who can already spot a pair of Ray-Ban Meta smart glasses and understands that they have an integrated camera, one that can be used to shoot stills or video hands-free and even livestream, chances are you’ll have no say as to whether your likeness gets captured, shared online, passed through Meta’s servers or any combination of those factors.
On the one hand, products like the Ray-Ban Meta glasses are gaining acceptance in society, if only because people don’t necessarily know they’re smart at a glance. As for their Meta AI integration, while the wearer benefits, those around them will likely be less thrilled with these specs’ growing AI-powered repertoire, if they’re even aware of its presence.