Google has been working on Android XR, its platform for smart glasses and mixed reality headsets, for some time, and it offered a first look at the software late last year. At the TED2025 conference, Google gave the first live demonstration of Android XR running on a pair of prototype smart glasses, and that demo is now available for anyone to watch.
Until now, our only look at Android XR on smart glasses came from a video Google released in December, which showed off several examples of what the software can do. But we've all seen polished clips like that for various smart glasses, and they often don't hold up in real-world use. The live demo at TED2025 was led by Google's Shahram Izadi, who showed off Android XR's capabilities with the help of the prototype glasses and fellow presenter Nishtha Bhatia.
During the demonstration, the lenses could be seen displaying Android XR features, and Izadi noted that they can be made with prescription adjustments. On stage, he used the glasses to view his speaker notes, pointing out that they connect to a smartphone and work with "all your phone apps." The demo started with Gemini, Google's AI assistant, generating a haiku on demand, then quickly escalated as Bhatia turned around and asked Gemini for "the title of the white book on the shelf behind me." Gemini recited the title instantly.
That functionality was first teased as part of Project Astra, which has since been integrated into Gemini Live on smartphones. The demo also showed Gemini locating a hotel key card on a shelf, explaining a diagram, translating a sign into English, and then translating that same sign into Farsi (Persian). Bhatia then spoke to Gemini in Hindi, and without any change to settings, Gemini promptly responded in Hindi.
Still on the glasses, Gemini was shown recognizing a record and pulling up a song from that record on demand. Navigation was also highlighted, with heads-up directions and a 3D map view. The presentation then moved on to Android XR on headsets, specifically Samsung's Project Moohan headset, which Izadi confirmed is set to launch later this year.
This portion of the demo showed what Gemini can do on headsets, in line with our hands-on experience from last year. Demonstrated features included Google Maps' Immersive View and using Gemini for tips while playing Stardew Valley. The prototype glasses used in the demo are believed to be a precursor to a product that Samsung plans to release, potentially next year.
What do you make of this look at Android XR? For more on Android XR and its development, check out our coverage of Samsung's work in XR and our extended look at Gemini's Project Astra.