When Google unveiled Android XR last year, it seemed like a clear response to Apple’s Vision Pro: a plan for a true mixed reality platform that could easily shift between AR, VR and smart glasses like Meta’s Ray-Bans. Today at Google I/O 2025, Google announced the second developer preview for Android XR and showed a bit more of how it will work in headsets and smart glasses. It will still be a while before we see Android XR devices in action, but Google revealed that Samsung’s Project Moohan headset is coming later this year. Additionally, Xreal is building Project Aura, a pair of tethered smart glasses powered by the platform.
Update: Google demoed its prototype Android XR smart glasses at I/O with live translation, which Engadget’s Karissa Bell described as lightweight but with a limited field of view. Google isn’t planning to sell those devices, but it is partnering with Warby Parker and Gentle Monster to provide frames for future smart glasses.
Beyond that, there isn’t much to get truly excited about yet. It’s clear that Google is working hard to catch up with both Apple and Meta, which already have XR products on shelves. And given Google’s habit of abruptly killing off its ambitious projects (just look at Google Glass, Cardboard and Daydream, all early stabs at AR and VR), it’s hard to have much faith in the future of Android XR. Will the availability of much better XR hardware be enough to make the platform a success? At this point, it’s hard to say.
For now, though, it seems Google is aiming to deliver all the features you’d expect from Android XR. Its second developer preview adds support for 180-degree and 360-degree immersive videos, brings hand-tracking to apps and supports dynamic refresh rates (which could seriously help battery life). As expected, Google is also making it easier to integrate its Gemini AI into Android XR apps, something the company promised when it first announced the platform last year.
In a series of pre-produced videos, Google showed off idealized ways to use Gemini in smart glasses and headsets. If your glasses have a built-in display (something Meta’s Ray-Bans don’t offer yet), you could view a small Google Map, get directions while cooking dinner, or snap a picture while dancing with your partner at sunset (seriously). All I can say is: “Cool demo, bro.” We’ll see how it all holds up when we’re actually wearing the headsets and glasses.
Update 5/21, 2:45PM ET: This story has been updated with references to Google’s prototype XR glasses.