- Meta is developing its Aria Gen 2 smart glasses, which are packed with sensors and AI features
- The smart glasses can track your gaze, movement, and even heart rate, giving them a picture of what is happening around you and how you feel about it
- For now, the glasses are research tools, helping researchers train robots and build better AI systems that could eventually appear in consumer smart glasses
The Ray-Ban Meta smart glasses are still relatively new, but Meta is already working on its Aria Gen 2 smart glasses. Unlike the Ray-Bans, these glasses are for research purposes only, at least for now. Still, they pack enough sensors, cameras, and processing power that it seems inevitable some of what Meta learns from them will end up in future wearables.
Project Aria supplies research-grade hardware, like the new glasses, to people working in computer vision, robotics, and related fields that interest Meta. The idea is for developers to use the glasses to build better ways of teaching machines to navigate, make sense of, and interact with the world.
The first Aria smart glasses surfaced in 2020. Aria Gen 2 is far more advanced in both hardware and software. The glasses are lighter, more accurate, and more powerful, and they look much more like glasses people wear in everyday life, although you still would not mistake them for a standard pair.
Four computer vision cameras can see an 80° arc around you and measure depth and relative distance, so the glasses can tell how far your coffee mug is from your keyboard, or where a drone's landing gear is headed. And that is just the start of the sensors in the glasses, which also include an ambient light sensor with an ultraviolet mode, a contact microphone that can pick up your voice even in noisy environments, and a pulse detector embedded in the nose pads that can estimate your heart rate.
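The basic idea behind measuring distance with two overlapping cameras is stereo triangulation: the farther away a point is, the less it shifts between the two views. Here is a minimal sketch of that relationship; the focal length, baseline, and disparity numbers are invented for illustration, and Meta has not published Aria Gen 2's actual depth pipeline.

```python
# Stereo depth from disparity: z = f * B / d
#   f: camera focal length in pixels
#   B: baseline (distance between the two cameras) in meters
#   d: disparity (horizontal pixel shift of the same point between views)
# All numbers below are illustrative, not Aria Gen 2 specifications.

def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Estimate the distance in meters to a point seen by both cameras."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a visible point")
    return focal_px * baseline_m / disparity_px

# A mug whose image shifts 40 px between cameras 10 cm apart,
# imaged at a 600 px focal length, sits about 1.5 m away:
print(depth_from_disparity(600, 0.10, 40))  # 1.5
```

Note that nearby objects (large disparity) give precise estimates, while distant ones (tiny disparity) get noisy, which is why depth sensing like this works best at arm's-length to room scale.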
Future face
There is also a suite of eye-tracking technology capable of telling where you are looking, when you blink, how your pupils change, and what you are focusing on. The glasses can also track your hands, measuring joint movement in a way that could help with training robots or interpreting gestures. Combined, the glasses can work out what you are looking at, how you are holding an object, and whether what you are seeing is raising your heart rate in an emotional response. If you are holding an egg and glaring at a friend, the AI might deduce that you want to throw the egg at them, and perhaps even help you aim.
As noted, these are research tools. They are not for sale to consumers, and Meta has not said whether they ever will be. Researchers must apply for access, and the company is expected to start taking applications later this year.
But the implications are big. Meta's plans for smart glasses go well beyond checking your messages. The company wants to teach machines to interact with the world the way humans do: seeing, hearing, and interpreting their surroundings much as we do.
That is not going to happen tomorrow, but Aria Gen 2 proves that smart glasses are closer to it than you might think. And it is probably only a matter of time before some version of the Aria Gen 2 technology goes on sale to the average person. You will have a powerful AI brain sitting on your face, remembering where you left your keys and sending a robot to fetch them for you.