Meta announced today that it plans to update Quest 3 with AI vision features similar to those found in the company’s latest Ray-Ban Meta smart glasses, while Apple has yet to confirm whether Apple Intelligence will arrive on Vision Pro at the same time as it does on other Apple devices this fall.
Meta has positioned AI, alongside XR, as a key area of R&D investment, and the company is now beginning to integrate the two in its consumer-facing products.
Meta announced that it will roll out an update to Quest 3 later this summer in the US and Canada to enable its “Meta AI with Vision” feature. The feature adds AI voice chat to the headset and also allows it to “see” what’s in the user’s real-world field of view, so users can ask general questions or ask about things in front of them. Meta gave some examples in the announcement:
Imagine you’re watching a YouTube video of a breathtaking hike in mixed reality while preparing for a trip to Joshua Tree. You can ask Meta AI for advice on the best outfit for summer weather. Or you can hold up a pair of shorts and say, “What top would go best with this outfit?” You can also get weather forecasts to prepare for upcoming weather or ask for local restaurant recommendations to satisfy your foodie cravings.
Or maybe you’re working on a school paper on a giant virtual monitor while wearing a headset and listening to music. You can ask Meta AI to identify some of the most memorable quotes from Shakespeare’s Hamlet and explain the meaning of the “To be or not to be” monologue and the play-within-a-play.
While playing Assassin’s Creed® Nexus VR and parkouring across the rooftops, your curiosity may be piqued. Why not ask Meta AI if there were actually assassins in colonial Boston? The answer may surprise you…
Currently, the feature can only see the real world; it does not perceive virtual content displayed on the headset. However, Meta has hinted that Meta AI with Vision will eventually perceive both the real and virtual worlds.
Unfortunately, Meta has confirmed that the feature won’t be coming to Quest 2 or older headsets, and it likely won’t reach Quest Pro either.
Meta didn’t say whether the feature’s requests are processed on-device or in the cloud, nor did it address encryption or other privacy protections, though it did confirm that the feature is based on Bing AI. The company has not yet responded to a request for more information about its privacy architecture.
Meta has been rapidly introducing AI features to its devices, but Apple has yet to confirm whether the so-called “Apple Intelligence” features will be coming to the Vision Pro.
Earlier this year, Apple announced a suite of Apple Intelligence features coming in beta to iPhone, iPad, and Mac this fall. Despite announcing visionOS 2 at the same time, Apple has yet to confirm Apple Intelligence for Vision Pro. As such, it is unclear whether the headset will get the features alongside other Apple devices, or whether users will have to wait for a later version of visionOS.
According to Apple, many Apple Intelligence features are processed on-device, but some requests, including those that rely on ChatGPT, require off-device processing. Apple claims that off-device requests “never store your data” and are “only used for your requests,” and the company says it will open up the underlying server code for privacy audits.