On Wednesday, I tried the new Meta Ray-Ban Display glasses at Meta Connect 2025. And while they still have a long way to go before replacing your smartphone, they make it clear that smart glasses are much better with a heads-up display.
Based on what I experienced on Wednesday, we are definitely taking a step toward a world where we spend less time with our heads buried in our smartphones.
Also: Meta Connect 2025 live updates: Ray-Ban Display, Oakley Meta Vanguard smart glasses, more
During the Meta Connect keynote at the company's headquarters on Wednesday evening, Meta CEO Mark Zuckerberg said that we have lost something as a society with the spread of smartphones. Meta hopes that smart glasses can help people restore what has been lost, allowing them to be more present.
Meta is trying to do this by releasing its next-generation glasses, which pair a color display with a smart wristband that reads subtle gestures from your hand, making possible many new tasks that until now weren't feasible in a pair of smart glasses.
Zuckerberg was clearly more excited about the launch of the Meta Ray-Ban Display glasses than about anything else announced at Meta Connect 2025.
"We have been working on glasses at Meta for 10 years, and this is one of those special moments," Zuckerberg said. "I'm really proud of it, and I'm really proud that we achieved this."
Also: I tried the Oakley Meta Vanguard smart glasses, and they're even cooler than they look
The glasses will retail for $799, a price that includes both the glasses and the neural wristband. They come in two colors, black and brown, and all pairs have Transitions lenses. However, due to the embedded heads-up display, these glasses cannot accommodate prescription lenses.
They have 18 hours of battery life and are water-resistant. They arrive in stores on September 30, and you can go get a demo at the same EssilorLuxottica stores, such as Sunglass Hut and LensCrafters, where Meta Ray-Ban glasses have been a staple for demos over the last two years.
What the Meta Ray-Ban Display is like to wear
The Meta Ray-Ban Display glasses put a full-color screen in front of your right eye. The display sits slightly off to the side so that it doesn't distract you, and it outputs up to 5,000 nits of brightness so that you can see it whether you're indoors or outside.
They also have a complete operating system that lets you capture photos and videos, listen to music, apply live captions during a conversation, get visual responses to questions you ask the AI, handle WhatsApp messaging hands-free, and use the AI to capture notes on important conversations.
Also: I tried smart glasses with a built-in display, and they beat my Meta Ray-Bans in major ways
The primary way you interact with the new smart glasses is with the neural wristband, which is as much of a breakthrough as the smart glasses themselves. Within 5-10 minutes of using the wristband, I was pinching my thumb and middle finger to go back, swiping my thumb across my hand to move around the screen, pinching my thumb and index finger to select things, and pinching my thumb and index finger and twisting to adjust a setting such as volume.
Zuckerberg called the wristband the "world's first mainstream neural interface."
The wristband itself is made of a durable fabric and uses a combination of a metal cinch and magnets to fit snugly around your wrist. I found it very comfortable, and within a few minutes, I stopped thinking about it being on my wrist.
The display itself is crisp and easy to read, although several times I found myself closing my left eye to see the screen more clearly through my right eye. But I liked being able to use the screen to frame shots before taking a photo, or to adjust the framing during video recording. You can also use a pinch-and-twist gesture to zoom up to 3x. Then you can preview photos or videos on the glasses and send them to someone or post them to social media without using your phone.
Also: Meta Ray-Ban vs. Oakley Meta: I tested both smart glasses, and there's a clear winner
One of the cleverest things I saw the glasses attempt in my time with them was the live captioning feature: I identified the person standing in front of me whom I wanted to focus on, and the glasses transcribed only their words, ignoring the other voices in the same room.
You can easily imagine being in a noisy place where the device pays attention only to the person you're talking to, transcribing their words on the screen while filtering out the voices around you. It would be super helpful when you're trying to hold an important conversation with someone in a loud room and want to make sure you catch what they're saying.
It's related to a new feature called Conversation Focus, which is also coming to Meta's audio-only smart glasses. That feature amplifies the audio of the person you're speaking with directly.
Similarly, the Meta Ray-Ban Display glasses seemed better at acting on my voice alone when I used voice commands, without being distracted by the voices of people talking around me. That makes them much more capable than the standard Meta Ray-Ban and Meta Oakley glasses currently are.
Also: Are smart glasses with built-in hearing aids viable? My verdict after months of testing
Finally, one feature Meta discussed that I wasn't able to try was Live AI. The idea with this feature is that the glasses can see what you see and hear what you hear.
You can start a Live AI session when someone is about to tell you something important that you need to remember, such as instructions, clarifications, an important idea, or a message to pass along to someone else. Then the glasses can take notes, create a summary, record steps to remember, and generally make sure you get it right without having to memorize the important details.
I'm eager to try that feature and the other aspects of the Meta Ray-Ban Display glasses in the weeks and months ahead. Look for more hands-on insights, reviews, and breakdowns from ZDNET.

