Apple is reportedly developing AI-powered smart glasses designed to compete directly with Meta's Ray-Ban line. According to Bloomberg, the Cupertino-based company is targeting a launch by the end of 2026 and has begun preparing large-scale prototypes for production.
This new report expands on and updates information previously stated by our publication, which estimated that Apple would build its smart glasses around custom chips based on the Apple Watch architecture. At the time, sources suggested chip production would begin in the summer of 2026. Bloomberg now reports that Apple aims to begin producing the smart glasses themselves, not just their components, in significant quantities by the end of 2025, indicating a more aggressive schedule than previously thought.
While Apple's true AR glasses remain years away, this new device, codenamed N401, is said to bring real-world awareness to Siri in a more stylish form factor than a headset. Think of them as a "Vision lite": smart glasses with onboard cameras, speakers, and microphones, capable of live translation, call handling, and turn-by-turn directions. Unlike the Vision Pro headset, the glasses will not feature an augmented reality display. Instead, they will rely on audio and voice feedback, a deliberate step to reduce complexity, cost, and bulk. This confirms earlier reports that Apple is working on both an AR model and a non-AR model, with N401 clearly representing the latter.
Apple's glasses are part of a broader strategy to stay relevant in the rapidly growing market for AI-enabled devices. Meta and Google are already there, and OpenAI's newly announced hardware partnership with Jony Ive is poised to shift the landscape further.
It is also worth noting: this latest report makes no mention of the custom chips named in earlier leaks, Glennie for AirPods and Nevis for the Apple Watch, although it reiterates that Apple faces challenges integrating cameras and sensor data into a lightweight frame. The concept of offloading processing to the iPhone, also mentioned in earlier reporting, remains relevant as Apple continues to wrestle with balancing performance and power efficiency.
What about Apple’s other wearable experiments?
Not all of Apple's ideas made the cut. The company has reportedly shelved development of a smartwatch with an integrated camera and real-world analysis capabilities. The device was meant to bring environment-aware features to the wrist, but it was scrapped due to technical hurdles and privacy concerns.
Internally, the smart glasses project is said to have faced some similar constraints. While Meta's Ray-Ban glasses benefit from Llama, and Google's Android XR glasses lean on Gemini, Apple has so far relied on third-party AI such as OpenAI's models or Google Lens for visual understanding on the iPhone. Analysts expect Apple to debut its own proprietary model soon, possibly alongside the new smart glasses.
If all goes according to plan, Apple's AI glasses will serve as the company's first AI-first wearable and promise to redefine the way we interact with the world around us. Whether Apple can beat Meta at its own game remains to be seen, but the company is clearly not content to sit on the sidelines.