Google today wrapped its I/O 2025 keynote with a sweeping demo of its Android XR glasses, showing off Gemini-powered translation, navigation, and messaging features. The on-stage showcase was slick, but we didn't really see anything new, and I walked away unimpressed. Then I got to wear the glasses for five minutes, and now I can't wait to buy a pair.
I've been wearing Ray-Ban Meta smart glasses for more than a year now, and while I don't regret my purchase, they mostly serve as sunglasses with built-in Bluetooth headphones for dog walks. The camera takes genuinely great pictures, and the AI features can be useful for the right person, but there isn't much more to them.

Ray-Ban Meta Smart Glasses
Embraced by a new generation of culture makers, its journey continues with cutting-edge technology. Listen, call, capture, and livestream features are seamlessly integrated within the classic frame.
On paper, Google's Android XR glasses are very similar. They have a built-in camera and speakers, and you can interact with AI. But the moment I put the glasses on and saw the AR display light up, I was ready to throw my Meta glasses out.
Going around the Android XR glasses, the controls and buttons are similar to those on other smart glasses. There's a touch-sensitive area on the right arm to activate or dismiss Gemini, a camera button on the right temple, speakers located near your ears, and a USB-C charging port on the left arm. Google isn't sharing the glasses' weight, but they felt equal to or lighter than my Ray-Ban Metas. I audibly expressed my shock when I first put them on and felt how comfortable they were.
Google's choice to use a monocular display was something that threw me for a loop. Instead of rendering the interface on both lenses, there's a single display in the right lens. This might not bother everyone, but my initial reaction was to try to shift the frames because something felt and looked slightly off. I wasn't seeing double, but it took a moment for my mind to adjust.
I was greeted with the time and weather. It was nothing but white text, but I was impressed by how bright and legible everything was. Of course, I was in a dimly lit room, so it might look different out under the sun. The Google employee giving me the demo then had me tap the right arm to launch Gemini.
The rest of my demo was essentially a recreation of what we saw on the I/O stage. Using Gemini Live, which is basically already available on your smartphone, I looked at several books on a shelf and asked the glasses for information about the authors. I then used the shutter button on the frame to take a picture, which was instantly transferred to the employee's Pixel.
The Googler also launched walking directions, which gave me a moment to try out a new interface concept. When you're looking straight ahead, Google Maps shows basic navigation instructions like "Walk north on Main Street." But when you look down, the display switches to a map, with an arrow reflecting your current trajectory. Splitting the two views ensures that only minimal information floats in front of you as you walk, but if you need more context, it's just a glance away.
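To make the gaze-dependent concept concrete, here's a minimal sketch of how such a mode switch could work, written as plain Java. This is purely illustrative: the class, method, and threshold are my own invented names and values, not anything from the real Android XR SDK or Google Maps.

```java
// Hypothetical sketch of a gaze-dependent navigation view, as described above.
// None of these names come from the actual Android XR APIs; the -20 degree
// threshold is an assumed value chosen for illustration only.

enum NavView {
    TURN_BY_TURN, // text instructions shown when looking straight ahead
    MINI_MAP      // map with heading arrow shown when looking down
}

class GazeNavSwitcher {
    // Head pitch in degrees: 0 = looking straight ahead, negative = looking down.
    static final double LOOK_DOWN_THRESHOLD_DEG = -20.0;

    // Pick which navigation view to render based on the wearer's head pitch.
    static NavView viewFor(double pitchDegrees) {
        return pitchDegrees < LOOK_DOWN_THRESHOLD_DEG
                ? NavView.MINI_MAP
                : NavView.TURN_BY_TURN;
    }
}
```

The design choice worth noting is that the switch is driven by head pose alone, with no button press, so the richer map view is always one glance away without cluttering the forward view.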
It's the ability to see Gemini's responses and interact with various applications that makes these smart glasses so impressive. Having a full-color display that's easy to read also makes Google's Android XR glasses feel more advanced than options from other companies; Even Realities' glasses, for instance, use single-color waveguide technology.
There was no way to capture the display during my brief demo, so you'll have to rely on the keynote demo below to get a sense of what I was seeing.
It should be noted that Google's Android XR glasses currently serve as a tech demo to show off the in-development operating system, with no word on whether the company actually plans to sell them. They're also still a little buggy; Gemini had a hard time distinguishing my instructions from background conversations, but I wasn't expecting perfection. Additionally, most of the computation happens on the paired Pixel handset, which is good for the glasses' battery life, but you'll notice a slight delay as information transfers between the two devices.
Thankfully, even if Google never launches "Pixel Glasses," third parties like Xreal are already working on their own glasses. And, of course, Samsung's co-developed Project Moohan XR headset is set to launch later this year. But given that developers still don't have access to Android XR to start building apps for the OS, I wouldn't expect a potential Google offering to hit shelves until 2027 at the earliest.