Meta is betting on a future where glasses-wearing consumers interact with its AI assistant, and it previews upcoming voice tech it calls ‘full-duplex speech technology.’
Meta CEO Mark Zuckerberg is a big fan of smart glasses, but how do you get more people to try these futuristic specs? Step one: A new “Meta AI” app that will help people move more seamlessly between their glasses and other smart devices, and get them hooked on the company’s version of an AI assistant.
The standalone Meta AI app will merge with the Meta View companion app, which is currently needed to set up the Ray-Ban Meta Smart Glasses.
“In some countries, you’ll be able to switch from interacting with Meta AI on your glasses to the app,” Meta says. “You’ll be able to start a conversation on your glasses, then access it in your history tab from the app or web to pick up where you left off. And you can chat between the app and the web bidirectionally.” (You can’t, however, start a conversation in the app or on the web and pick it up on your glasses.)
Built with Llama 4, the Meta AI app will “bring you responses that feel more personal and relevant, and more conversational in tone,” Meta says.