
ZDNET's key takeaways
- Apple's event had AI updates hidden throughout its products.
- The company is going against the grain of other smartphone manufacturers.
- These subtle features streamline the user experience.
As an AI reporter, I have covered every major smartphone launch event over the last year. The line between hardware and software blurs with each release, and the new AI features are often as notable as the devices themselves. This week, Apple took a quiet approach to embedding AI in its products.
Also: Apple iPhone 17 event recap: iPhone Air, Apple Watches, AirPods Pro 3, and more
After a streak of overpromising and underdelivering in the AI space, Apple went back to basics with its new product drops. The new smartphones, watches, and AirPods focused on hardware: cameras, battery life, form factor, and more.
But if you were paying close attention, AI was responsible for some of the most exciting new experiences on the devices, even though they were not necessarily presented as Apple Intelligence features.
To help you catch anything you may have missed, here is a roundup of those features, starting with the most obvious.
1. Live Translation in AirPods
The irony is that the biggest AI upgrade was not in the iPhone at all, but in the new AirPods Pro 3. Live Translation brings real-time translation capabilities to AirPods; Apple announced it back in June at WWDC as part of iOS 26. As the name implies, when the feature is activated, users can take part in hands-free, natural conversations and hear a live translation of what the other person is saying in their ears. Users who do not have AirPods can see a live transcription on their iPhone screen.
Also: The iPhone 17 lineup came with a higher price tag – are tariffs to blame?
ZDNET's Jason Hiner tried the feature live from Apple Park and shared with me that he was impressed, as the experience exceeded his expectations. In general, LLMs have a deep understanding of language and how people speak, and as a result they are particularly good at translating speech accurately, drawing not only on literal meaning but also on additional context. This integration could well prove to be a genuinely useful application of AI.
2. Center Stage for your selfies
With the Center Stage feature, you no longer need to rotate your phone to take a horizontal selfie. You can manually toggle the orientation, and using AI, the camera can automatically widen the shot when it detects a group of people, rotating from vertical to landscape to fit everyone in the frame. Although this is not the flashiest feature, it is one of those conveniences that will simply make your life a little easier on a daily basis.
3. High blood pressure notifications
One of the biggest announcements from the event was the addition of high blood pressure notifications for the Apple Watch Series 11 and Apple Watch Ultra 3. With the feature, your Apple Watch can alert you when it detects signs of chronic high blood pressure, also known as hypertension. Although this is not a standalone AI feature in any way, Apple shared that it was developed using advanced machine learning and training data from several studies. It is a clear example of Apple applying AI in a way that is genuinely helpful in people's everyday lives.
4. New Photographic Styles filter
Apple's iPhone 16 lineup introduced a new version of Photographic Styles, which lets users scroll through different "styles," or micro-filters, that adjust the colors and tones in a picture before taking it. Now, according to Apple, iOS 26 adds a new Bright style that can "brighten skin tones and apply a pop of vibrancy to the image."
Also: Why I'm breaking the 5-year iPhone upgrade cycle – and I'm not alone
The company said these Photographic Styles are powered by the Apple Neural Engine, which means some AI underlies the technology.
5. Updated Photonic Engine
The iPhone 17 Pro and Pro Max have an updated Photonic Engine, the computational photography pipeline behind how the phone processes your photos. Apple said the "image pipeline uses even more machine learning to preserve natural detail, reduce noise, and significantly improve color accuracy." Similar to the Center Stage feature, this is another example of how AI can quietly improve your everyday phone experience.
Bonus: New chipsets
Beyond the new features, Apple unveiled new chipsets to power its devices across the board. Although that may sound less exciting than an upgraded Siri, it lays the groundwork for more advanced AI features.
Also: Every iPhone that can be updated to iOS 26 (and when you can install it)
For example, the new A19 and A19 Pro chipsets feature a 5-core GPU with a neural accelerator built into each core, which should enable the devices to run more powerful generative AI models. Another notable example: the Apple Watch SE 3 has the S10 chip, which enables on-device processing for Siri in the budget model for the first time.

