iOS 26 was unveiled on Monday at Apple's Worldwide Developers Conference (WWDC) 2025. The company previewed several new features and artificial intelligence (AI) capabilities that will soon be available on the iPhone. One of the most notable changes is to the operating system's design language: Apple has introduced a new UI dubbed Liquid Glass, bringing glass-like iconography and other translucent elements. The latest version of iOS also adds Apple Intelligence features, a unified interface in the Phone app, an updated Camera app layout, typing indicators in Messages, new CarPlay features, and more when it rolls out to eligible iPhone models towards the end of this year.
iOS 26 compatible models
Apple says iOS 26 will arrive as a free over-the-air (OTA) software update for the iPhone 11 and later handsets. However, Apple Intelligence features will be limited to the iPhone 16 series and iPhone 15 Pro models.
Registered Apple developers can download the iOS 26 developer beta on their devices and start testing the new features. A public beta will be available through the Apple Beta Software Program next month.
Visual improvements in iOS 26
According to Apple's Senior Vice President (SVP) of Software Engineering, Craig Federighi, Liquid Glass in iOS 26 is a new translucent material that reflects and refracts the visual elements around it. It is applied to OS controls, navigation, app icons, and widgets. The new design language also brings fresh customisation options to the home screen and lock screen, enabling users to give app icons and widgets a new clear look.
Photo Credit: Apple
On the iPhone's lock screen, the time display now adapts to the free space in the wallpaper image, automatically shrinking and expanding to fit. Meanwhile, there is also a 3D effect that appears when the user moves their iPhone.
The Camera app in iOS 26 gets a streamlined layout, the Photos app gains separate tabs for Library and Collections views, web pages in Safari flow edge to edge, and apps like Apple Music, News, and Podcasts get a redesigned tab bar. This tab bar is said to float above the content and dynamically shrink to keep the focus on what is on the screen.
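For developers, Apple has said the same Liquid Glass material can be adopted in third-party apps through its UI frameworks. Below is a minimal SwiftUI sketch, assuming the glassEffect() modifier Apple previewed for iOS 26; the exact API names and availability may differ in the shipping SDK.

```swift
import SwiftUI

// A minimal sketch of applying the new translucent material to a custom
// control in SwiftUI. Assumes the glassEffect() modifier Apple previewed
// for iOS 26; exact names and availability may differ in the final SDK.
struct NowPlayingBadge: View {
    var body: some View {
        Label("Now Playing", systemImage: "music.note")
            .padding(12)
            .glassEffect() // previewed iOS 26 modifier for the Liquid Glass material
    }
}
```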
Apple Intelligence updates
Having introduced Apple Intelligence features on the iPhone last year, Apple is now expanding their capabilities. It has introduced Live Translation, which is integrated into apps such as Messages, FaceTime, and Phone. Powered by the company's proprietary on-device AI models, this feature automatically translates text and audio in English, French, German, Italian, Spanish, Chinese, and other languages.
Photo Credit: Apple
An update to Visual Intelligence, Apple's alternative to Google's Circle to Search, enables users to ask ChatGPT questions about what they are currently looking at on their screen. They can also find similar images and products on Google, Etsy, and other supported apps. In addition, it automatically recognises when a user is looking at an event and suggests adding it to their calendar with the date, time, and other key details.
Apple has also introduced AI-powered Shortcuts, with dedicated actions for Writing Tools and Image Playground. The company's AI models can also identify and summarise details, such as order information in emails from merchants. Users can also combine their favourite emoji and descriptions to create Genmoji and other unique compositions. Meanwhile, the new Foundation Models framework gives developers access to the on-device foundation model that powers Apple Intelligence.
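For context, the Foundation Models framework is meant to let third-party apps prompt the same on-device model directly. Here is a minimal Swift sketch based on the API Apple previewed at WWDC 2025; the type and method names may change in the shipping SDK.

```swift
import FoundationModels

// Minimal sketch of prompting Apple's on-device foundation model via the
// Foundation Models framework previewed at WWDC 2025. The types used here
// (LanguageModelSession, respond(to:)) follow Apple's preview and may
// differ in the final SDK.
func summarizeOrderEmail(_ emailBody: String) async throws -> String {
    let session = LanguageModelSession()           // runs entirely on device
    let prompt = "Summarize the order details in this email:\n\(emailBody)"
    let response = try await session.respond(to: prompt)
    return response.content                        // the model's text output
}
```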
Changes in apps
Apple has rebuilt the Phone app with a unified layout that combines the Favourites, Recents, and Voicemail tabs. It also gains Call Screening, which uses Live Voicemail to gather information from the caller, enabling the recipient to decide whether they want to pick up. Meanwhile, Hold Assist can come in handy when the user is stuck on hold, notifying them when the person at the other end of the line, such as a support agent, becomes available.
With iOS 26, users can screen messages from unknown senders in the Messages app; these appear in a dedicated folder and remain silenced unless the user chooses otherwise. Users can also create polls, generate backgrounds for chats using Apple Intelligence, and see typing indicators in group chats.
Apple Music gains Lyrics Translation and Lyrics Pronunciation features, as well as AutoMix, which uses AI to transition from one track to the next using time stretching and beat matching. Meanwhile, Apple Maps gets a feature dubbed Visited Places, which enables users to remember the places they have been. Apple says the iPhone can use on-device AI to analyse a user's daily routes and present them with a preferred option, while also informing them about potential delays.
Apart from these updates, Apple has introduced a new app dubbed Apple Games. It is claimed to act as a one-stop shop for all the gaming needs of iPhone users. It gives users an overview of activity in their games, including major events. In addition, they can launch games from the app and switch between them.
New CarPlay features
With iOS 26, Apple has brought a compact view for incoming calls in CarPlay. This allows users to see who is calling them without obscuring the upcoming directions. The update also brings widgets and Live Activities, as well as Tapbacks and pinned conversations in Messages.
New features for AirPods
Apple says that iOS 26 expands the current capabilities of its TWS products, i.e., the AirPods 4 and AirPods Pro (2nd Generation). Interviewers, podcasters, singers, and others can now record audio with clearer quality using a studio-quality audio recording feature. Apple also promises "more natural vocal texture and clarity" in phone calls on the iPhone.
Meanwhile, the new camera remote feature allows AirPods to act as a camera shutter; a simple press-and-hold of the AirPods stem takes a photo through the native Camera app or any third-party camera app. It can also start or stop video recording on an iPhone.