Meta has officially unveiled its standalone AI assistant, designed to compete with ChatGPT. Users can engage with the assistant through text or voice commands, generate images, and obtain real-time web information.
A notable feature of the Meta AI app is the Discover feed, which merges AI interactions with social media elements. This feed allows users to view and interact with AI-generated content shared by others, including friends on platforms such as Instagram and Facebook.
Users can like, comment on, share, or remix these AI interactions. Connor Hayes, Meta’s VP of product, said the feature is meant to demystify AI and show users what it can be used for.
Meta’s move to build a social layer into its AI assistant looks like part of a broader trend across the tech sector. Elon Musk’s X has tightly integrated Grok, and OpenAI is reportedly planning a social feed within ChatGPT.
The Meta AI app also puts a heavy emphasis on voice, offering an opt-in beta for a more conversational voice experience, similar to ChatGPT’s advanced voice mode. For now, however, this version of Meta’s voice assistant cannot access online information.
The opt-in option builds on Meta’s research into a “full-duplex” AI model, which allows for more dynamic, back-and-forth conversation. In a demonstration, the full-duplex mode showed noticeably more personality than the standard voice option. It is currently available in the US, Canada, Australia, and New Zealand.
In the US and Canada, Meta personalizes the assistant’s responses based on users’ activity on Facebook and Instagram, meaning what you do on those platforms can shape its replies. As with ChatGPT, users can have the assistant save specific details for future conversations. The app runs on a customized version of Meta’s Llama 4 model.
So far, most people have encountered Meta AI through its integration in Instagram, Facebook, and WhatsApp. According to Hayes, nearly one billion users have interacted with the feature that way, though he says a dedicated app is the “most intuitive way to interact with an AI assistant.”
The Meta AI app isn’t launching as an entirely new application; instead, it replaces Meta View, the companion app for Meta’s Ray-Ban smart glasses. A dedicated tab provides access to content that previously lived in the View app, such as photo and video galleries.
Meta folded the glasses companion features into its AI assistant because the company envisions tighter integration between its software and hardware. The Ray-Ban Meta glasses already offer AI capabilities, including object recognition and real-time translation, and later this year Meta plans to unveil a more advanced version with a small heads-up display.