Google's Android XR can't do much yet. At Google I/O 2025, I got to wear the new glasses and try some major features – three features, to be exact – and then my time was up. These Android XR glasses are not the future, but I could definitely see the future through them, and my Meta Ray-Ban smart glasses can't match anything I saw.
The Android XR glasses I tried had only one display, and it didn't fill the entire lens. A small frame was projected in front of my vision, invisible until it was filled with content.
To start, a small digital readout showed me the time and the local temperature, pulled from my phone. It was so small and unobtrusive that I could imagine leaving it active in my periphery.
Google Gemini is impressively responsive on this Android XR prototype

The first feature I tried was Google Gemini, which is making its way onto every Google device. Gemini on the Android XR prototype glasses is already more advanced than the version you may have tried on your smartphone.
I walked up to a painting on the wall and asked Gemini to tell me about it. It described the pointillist artwork and the artist. I said I wanted to look at the art more closely, and I asked for suggestions on interesting aspects to consider. It pointed me to the artist's use of pointillism and color.
The conversation felt very natural. Google's latest voice model for Gemini sounds like a real human. The glasses also did a good job of pausing Gemini while someone else was talking to me. There was no long delay or frustration. When I asked Gemini to resume, it said 'no problem' and picked up quickly.
This is a big deal! The responsiveness of smart glasses is a metric I hadn't considered before, but it matters. My Meta Ray-Ban smart glasses have an AI agent that can see through the camera, but it works very slowly. It is slow to react at first, and then it takes a long time to answer a question. Google's Gemini on Android XR was much faster, and it felt more natural.
Google Maps on Android XR was unlike any Google Maps I have seen

Then I tried Google Maps on the Android XR prototype. I didn't get a big map dominating my view. Instead, I got simple directions with an arrow telling me to turn right in half a mile. The best part of the entire XR demo was how the display changed as soon as I turned my head.
If I looked straight down at the ground, I could see a circular map from Google, showing me where I was and where I should go. The map moved smoothly as I wandered in circles to get my bearings. It was not a very large map – about the size of a big cookie (or biscuit, for our UK friends) in my field of view.
As soon as I raised my head, the cookie-map slid upwards. The Android XR glasses don't just stick a map in front of my face. The map is an object in space – a disc that stays parallel with the floor. If I look straight down, I can see the entire map. As I tilt my head up, the map rises and I see it from an oblique angle as it climbs higher in my field of view.
By the time I'm looking straight ahead, the map has disappeared entirely, replaced with directions and arrows. It's a very natural way to get updates on my route. Instead of pulling out and unlocking my phone, I just glance at my feet and Android XR shows me where they should be pointing.
The colorful display shows off with a photo

The last demo I saw was a simple photo taken with the camera on the Android XR glasses. After taking the shot, I got a small preview on the display in front of me. It was about 80% transparent, so I could clearly make out the details without it completely blocking my view.
Sadly, Google only gave me a short time with the glasses, and the experience was limited. In fact, my first thought was to wonder whether the Google Glass I wore back in 2014 had features similar to today's Android XR prototype glasses. It was surprisingly close.
My old Google Glass could take photos and videos, but it didn't offer a preview on its small, head-mounted display. It had Google Maps with turn-by-turn directions, but it lacked the animation and head-tracking that Android XR provides.
There was clearly no interactive AI like Gemini on Google Glass – nothing that could see what you see and offer information or suggestions. What else do the two have in common? Both lack apps and features.
Which comes first, the Android XR software or the smart glasses to run it?

Should developers write code for a device that doesn't exist? Or should Google sell smart glasses even though there are no developers yet? Neither. The problem with AR glasses isn't just a chicken-and-egg question of which comes first, the software or the device. It's that AR hardware isn't ready to lay eggs. We have no chicken and no eggs, so there is no use debating which comes first.
Google's Android XR prototype glasses are not a chicken, but they are a fine-looking bird. The glasses are incredibly light, considering the display and all the tech inside. They are relatively stylish for now, and Google has great partners in Warby Parker and Gentle Monster.
The display itself is the best smart glasses display I have seen so far. It isn't very large, but it has a better field of view than the rest; it is well placed within your right eye's field of vision; and the images are bright, colorful (if transparent), and flicker-free.

When I first saw the time and weather, the text was small and it didn't block my view. I could imagine keeping a small heads-up display on my glasses all the time, just to give me a quick flash of information.
This is just the beginning, but it is a very good start. Other smart glasses haven't felt this good at the starting line, let alone on retail shelves. Eventually, the display will be bigger and there will be more software. Or any software, since the feature set seemed incredibly limited.
Still, with just Gemini's impressive new multimodal abilities and Google Maps on XR, if the price isn't terrible, I could see myself being an early adopter.

Of course, Meta Ray-Ban smart glasses lack a display, so they can't do most of this. Meta's smart glasses have a camera, but the images sync to your phone. From there, your phone can save them to your gallery, or the smart glasses can even livestream directly to Facebook. Just Facebook – this is Meta, after all.
With its Android provenance, I'm hoping that whatever Android XR smart glasses we get will be much more open than Meta's gear. They should be. Android XR runs apps, while Meta's smart glasses are run by an app. Google intends Android XR to be a platform. Meta wants to collect information from the cameras and microphones you wear on your head.
I've had a lot of fun with my Meta Ray-Ban smart glasses, but I honestly haven't turned them on and used their features in months. I was already a Ray-Ban Wayfarer fan, so I wear them as my sunglasses, but I never had much luck getting the voice recognition to wake up and respond on command. I loved using them as open-ear headphones, but not when I'm in New York City and the street noise overpowers them.
I can't imagine my Meta glasses ever having a full platform with apps and extensibility – the promise of Android XR. I'm not saying I saw the future in Google's smart glasses prototype, but I now have a better view of what I want the smart glasses of the future to look like.

