At Google I/O 2025, Google revealed major updates to its immersive computing efforts with the launch of the Android XR platform – designed specifically for glasses, headsets, and next-gen spatial devices. This is the first Android platform born in the Gemini era, combining AI and spatial computing for context-aware, intelligent interactions.
Android XR Glasses: Gemini-Powered and Phone-Connected
Google introduced smart glasses built to work seamlessly with your phone, offering on-lens displays, built-in cameras, microphones, and speakers – all in a stylish, comfortable form factor. The glasses provide real-time, heads-up access to notifications, navigation, translations, and even live subtitles during conversations.
When paired with Gemini, the glasses gain environmental awareness, task assistance, and context-sensitive features – designed to be helpful in daily use without compromising privacy.
Design + Fashion Partnerships
To ensure style and wearability, Google is collaborating with brands like Gentle Monster, Warby Parker, and soon Kering Eyewear. The goal: bring fashionable, comfortable smart glasses to market without compromising functionality.
Android XR SDK Developer Preview 2
Matthew McCullough, VP of Product Management, unveiled Developer Preview 2 of the Android XR SDK with powerful new tools for immersive development:
- MV-HEVC support for 180° and 360° stereoscopic videos
- Jetpack Compose for XR with adaptive layout support via SubspaceModifier
- Expanded Material 3 UI components like TopAppBar and AlertDialog
- Hand tracking in ARCore with 26 joint points
- Emulator improvements: AMD GPU support, enhanced stability, and full Android Studio integration
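To make the Compose piece of that list concrete, the snippet below sketches how an app might host its existing 2D UI on a panel in 3D space using Jetpack Compose for XR. This is a minimal sketch based on the developer-preview API surface – package names and the `SubspaceModifier` extensions shown here may change between previews – and `MainScreen` is a hypothetical composable standing in for your app's current UI.

```kotlin
import androidx.compose.runtime.Composable
import androidx.compose.ui.unit.dp
import androidx.xr.compose.spatial.Subspace
import androidx.xr.compose.subspace.SpatialPanel
import androidx.xr.compose.subspace.layout.SubspaceModifier
import androidx.xr.compose.subspace.layout.height
import androidx.xr.compose.subspace.layout.movable
import androidx.xr.compose.subspace.layout.resizable
import androidx.xr.compose.subspace.layout.width

@Composable
fun SpatialMainScreen() {
    // Subspace opens a 3D coordinate space inside an otherwise 2D Compose tree.
    Subspace {
        // SpatialPanel hosts ordinary 2D Compose content on a panel floating in
        // the user's space; SubspaceModifier sizes it and lets the user move
        // and resize it.
        SpatialPanel(
            modifier = SubspaceModifier
                .width(1280.dp)
                .height(800.dp)
                .movable()
                .resizable()
        ) {
            MainScreen() // hypothetical: the app's existing 2D Compose UI
        }
    }
}
```

On a device or emulator without spatial capabilities, the same composable can simply render `MainScreen` directly – the adaptive-layout pattern these `SubspaceModifier` APIs are intended to support.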
Unity + Firebase: AI Meets XR
Unity developers can now use version 2 of the Unity OpenXR: Android XR package, which brings:
- Dynamic refresh rates, SpaceWarp shaders, and occluded hand meshes
- Sample demos for hand tracking, face tracking, passthrough, and plane detection
In addition, Firebase AI Logic for Unity (now in public preview) brings Gemini-powered generative AI to Unity projects, with support for real-time multimodal input and output. It integrates with Firebase services such as App Check (for abuse protection), Remote Config, and Cloud Storage.
Open Standards + Future Devices
Google is working with the Khronos Group on the glTF Interactivity standard, enabling 3D models to respond to user input. This will roll out in Jetpack XR later in 2025.
The first Android XR devices launching include:
- Samsung’s Project Moohan (late 2025)
- XREAL’s Project Aura, a portable tethered developer device
Both will support Android apps and developer tools.
XR App Store and Developer Readiness
The upcoming Android XR Play Store will list compatible 2D apps and allow developers to showcase immersive content like 180°/360° videos and interactive screenshots. Testing tools are available now via the Android XR Emulator.