
ZDNET Highlights
- Google announced three advancements in its Android XR platform.
- The advancements are headlined by AI glasses that will be available to developers first.
- The Galaxy XR and Project Aura also get updates that improve their immersive experiences.
Last week, in the confines of Google’s Hudson River office, I donned the Android XR glasses and chatted with Gemini as I walked around the room. These weren’t the Warby Parker or Gentle Monster models that were teased at Google I/O in May, but a developer kit that will soon be in the hands (and faces) of Android developers around the world.
Demos ranging from visual aids to gyroscopic navigation moved quickly and, to my surprise, efficiently. At one point, I tried to trip up Gemini by asking it for a recipe that paired the fruit salad with the pasta on the shelf, only to be steered toward a more traditional tomato sauce dish instead. This is a testament to both Gemini's smarts and the glasses' multimodal hardware.
Also: I invested in Samsung’s $1,800 XR headset to replace my dual monitors — and it’s paying off
By the time my briefing was over, I’d switched from Android XR glasses to Samsung’s Galaxy XR headset and Xreal’s upcoming pair, Project Aura. This seamless transition between wearables, most of which will also take advantage of your Android phone and smartwatch for additional functionality, is one of Google’s moonshots for 2026.
From what I’ve seen, the future can’t come soon enough.
Google’s approach to AI glasses is two-fold
Google's plans for AI glasses come in two forms: one with audio and camera only, similar to Meta's Ray-Bans, and another that integrates a display for visual cues and a floating interface, like Meta's Ray-Ban Display. Obviously, there is competition in this area. However, Google has a major advantage even before launch: a well-established software ecosystem, with developer preview 3 of the Android XR SDK (including its APIs) scheduled for release this week.
No, we're not just talking about first-party apps like Gmail, Meet, and YouTube, the way Messenger, Instagram, and WhatsApp anchor Meta's ecosystem. Instead, the abundance of existing third-party Android apps, home-screen and notification-panel widgets, and hardware products will, in theory, be easily converted to the Android XR operating system.
Also: Careful, Meta: Samsung just confirmed its smart glasses plans (with some spicy hints)
I got an early taste of this when I requested an Uber ride from the Google office to the third-best-rated pizzeria in Staten Island (as I had previously asked Gemini as a test). In addition to populating a navigation route to my Uber pickup location, the glasses’ display projected driver information when I was nearby. Google told me that this functionality is taken directly from the native Uber app for Android, and it’s a good representation of how seamless development for the wearable platform will be.
Another interesting aspect of my demo was how Gemini provided environmental context as I put on the glasses. Instead of my having to ask the assistant about my location, the weather, or the random objects strategically placed around me for demo purposes, the Android XR experience began with a summary of relevant information and prompts for follow-up questions. It's a thoughtful touch that makes interactions with the assistant feel more natural.
The Galaxy XR gets better, but I'm more attracted to these glasses
As I mentioned earlier, I also tried the Samsung Galaxy XR headset (again), this time with some new features: PC Connect, which syncs with a Windows PC or laptop for an extended, more immersive viewing experience; Travel Mode, for better anchoring while in motion; and a digital avatar generator similar to Apple's spatial Personas.
As a Windows user, I was mostly invested in the PC Connect feature, which let me project (albeit virtually) a very large screen of the game "Stray." With the wireless controller in hand, inputs were surprisingly responsive, and image quality was stable in terms of refresh rate.
Also: Samsung Galaxy XR headset comes with $1,000 worth of freebies – here’s what’s included
However, what upstaged the Galaxy XR headset was a more portable and comfortable pair of Xreal glasses, called Project Aura. It was first announced months ago at Google I/O, and using the wired-in wearable for the first time made me realize that the future of comfortable face computers isn't so far off.
Project Aura has a large 70-degree field of view, complemented by Xreal's tinting feature, which dims ambient light to make the display appear brighter. It runs on the same Android XR platform as the Galaxy XR headset, allowing you to raise your hand for pinch and swipe gestures, view multiple floating windows simultaneously (via PC Connect), and access various Android apps on your phone.
The big question with Project Aura is undoubtedly its price. Xreal's current lineup of extended reality glasses ranges from $300 to $650. Given Project Aura's more advanced computing (and innovation), I expect it to be closer to the $1,000 mark at launch. Google and Xreal have not yet shared an official release date for the glasses, but they have suggested the wearable will arrive late next year.
Bottom line (for now)
My journey through Google’s Android XR demo confirms that competition is growing in the wearable computing arena, driven less by speculative concepts and more by tangible, functional hardware and software integration. The main strength of Google’s strategy lies not just in Gemini’s ingenuity but in taking advantage of the established Android ecosystem. This should be music to the ears of developers.
Also: The 40 Best Products We Tested in 2025: Editors’ Choices for Phones, TVs, AI, and More
As with most beta products, there were stutters and crashes in my demo, too, but the ease of switching between different tools, from bulky developer kits to Xreal's Project Aura, underscores Google's commitment to flexibility.
Ultimately, this shows that the company's 2026 vision for seamless, multifunctional smart glasses is not just marketing hype, but a technologically robust and rapidly maturing reality that could redefine the way we interact with information and the digital world.

