Meta AI is now available on the Ray-Ban Meta glasses in the UK & Australia.
Previously, Meta AI was available only in the US & Canada.
In the UK, Meta AI on the glasses is voice-only "to start", while in Australia it also supports vision, letting it see the world through the camera when needed to answer a query or take an action you asked for.
Meta AI is a digital assistant powered by Meta's Llama series of large language models (LLMs), the same kind of technology that powers ChatGPT.
Last week, Meta released an update letting you ask Meta AI on the glasses to remind you of something you see or say (for example, where you parked, or that you're low on milk), set timers, scan QR codes, call phone numbers on posters or flyers, and send and receive voice messages on WhatsApp or Messenger.
The update also removed the need to explicitly say "look" at the start of queries to trigger the visual AI capability, allowing more natural invocation.
Later this year, Meta plans to add live translation, letting Meta AI on the glasses translate in real time between English and French, Italian, or Spanish.
The company also plans to add a feature letting Meta AI see a livestream of your first-person view, not just a single shot, enabling continuous interaction over time. Meta says this will allow its AI to "help you more naturally, in real-time as you’re doing things like exploring a city or preparing a meal".