
Tim Cook and Craig Federighi at Apple WWDC 2025.
Jason Hiner/ZDNET
After the WWDC 2025 keynote, it's official: Apple is going in a different direction on AI than the rest of the tech industry.
Unlike last year's moonshots on "personal context" and a reimagined Siri, there were no overpromises or predictions of future breakthroughs. And there was barely a mention of chatbots.
Also: CNET survey: Only 11% of people upgrade their phones for AI features. What do they want instead?
Instead, Apple is taking the less speculative, more established strengths of large language models (LLMs) and integrating them piece by piece into the iPhone and its other devices, without ever needing to mention the word AI.
AI for the iPhone (and other Apple devices)
The first and most important is Apple's Live Translation feature. Language translation is one of the things LLMs genuinely do well. In most cases, you have to copy and paste into a chatbot or use a language app such as Google Translate to take advantage of those superpowers. In iOS 26, Apple is integrating live translation directly into the Messages, FaceTime, and Phone apps, so you can use the feature in the places where you're actually communicating.
Also: PCMag: Apple's WWDC was all glass, no vision: What's on tap from OpenAI to counter Google?
Next up is Visual Intelligence. Apple will now let you use it from any app or screen on your phone by integrating it directly into the screenshot interface. With iOS 26, Visual Intelligence can recognize what's on your screen, understand the context, and recommend actions. The example shown in the keynote was an event flyer: you take a screenshot, and Visual Intelligence automatically creates a calendar event from it.
This is actually a step toward an AI agent, one of the most hyped (and sometimes overhyped) tech trends of 2025. I'm eager to try this feature and see what else it can do. I've had good luck using the Samsung/Google version of this kind of feature, called Circle to Search. Another new thing Visual Intelligence will let you do in iOS 26 is ask ChatGPT questions about whatever you've captured on your screen. Visual Intelligence can also take the text it captures in a screenshot and read it aloud or summarize it.
One of the best extensions of LLM capabilities this year can be seen in Shortcuts, which can now tap into Apple Intelligence models. For example, on macOS 26 you can create a shortcut that takes any file saved to the desktop, uses Apple Intelligence to examine the file's contents (while preserving privacy), and then classifies it and moves it to one of several folders you've named, based on categories you've defined. You can automate this to run every time you save a file to the desktop, which again makes it work like an AI agent.
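Shortcuts itself is a no-code tool, but for a sense of what that kind of on-device classification could look like for a developer, here is a minimal Swift sketch. It leans on the FoundationModels framework and LanguageModelSession API Apple showed for developers (covered below); the folder names, prompt, and response handling are my own assumptions, not Apple's published workflow.

```swift
import Foundation
import FoundationModels  // Apple's on-device model framework; API shape assumed from the WWDC demo

// Hypothetical folder categories the user has created on the desktop.
let categories = ["Invoices", "Screenshots", "Notes", "Misc"]

func classifyAndMove(fileURL: URL) async throws {
    // Read the file's text (this sketch only handles plain-text files).
    let contents = try String(contentsOf: fileURL, encoding: .utf8)

    // Ask the on-device model to pick exactly one of the predefined categories.
    let session = LanguageModelSession()
    let prompt = """
    Classify the following document into exactly one of these categories: \
    \(categories.joined(separator: ", ")). Reply with the category name only.

    \(contents.prefix(2000))
    """
    let response = try await session.respond(to: prompt)
    let category = response.content.trimmingCharacters(in: .whitespacesAndNewlines)

    // Fall back to "Misc" if the model answers with something unexpected.
    let folderName = categories.contains(category) ? category : "Misc"

    // Move the file into the matching folder on the desktop.
    let fm = FileManager.default
    let desktop = fm.urls(for: .desktopDirectory, in: .userDomainMask)[0]
    let destination = desktop.appendingPathComponent(folderName, isDirectory: true)
    try fm.createDirectory(at: destination, withIntermediateDirectories: true)
    try fm.moveItem(at: fileURL, to: destination.appendingPathComponent(fileURL.lastPathComponent))
}
```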
Also: Apple's Goldilocks approach to AI at WWDC is a winner. Here's why
Another way Apple is tapping into LLMs can be seen in the new functionality of the Share button in iOS 26. For example, you can now take a list of items from a PDF or web page in Safari, select the text, tap the Share button, and then choose the Reminders app. Apple Intelligence will use its generative models to analyze the list and turn each item into a task in the Reminders list you select. If it's a long list, you can also have the AI break it into subcategories for you, again using the LLM's natural language processing (NLP) capabilities.
(Disclosure: Ziff Davis, ZDNET's parent company, filed an April 2025 lawsuit against OpenAI, alleging it infringed Ziff Davis copyrights in training and operating its AI systems.)
Apple’s AI for Developers
Finally, while most of the leading players in generative AI typically offer a chatbot for the general public and a coding partner for software developers (because those are the two things LLMs are best known for), Apple didn't have much to say about either at WWDC 2025. With thousands of developers on the Apple Park campus, it seemed like the right time to talk about both.
Also: The 7 best AI features announced at Apple's WWDC that I can't wait to use
But all Apple would say about the next version of Siri, which has needed a rethink for a long time, was that it's still working on it and won't release the next Siri until it meets the company's high standards for the user experience.
And when it comes to programming partners, Apple didn't unveil its own coding copilot for developer tools like Swift and Xcode, despite promising Swift Assist at last year's WWDC. Instead, Apple took some big steps to empower developers. It opened up its Foundation Models framework to let developers tap into the power of Apple Intelligence with as little as three lines of Swift code, by Apple's claim. On top of that, it all happens on-device and at no cost.
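Based on what Apple demonstrated, the three-lines-of-Swift claim maps roughly to a sketch like this one. Treat the type and method names as illustrative of the WWDC demo rather than a definitive API reference.

```swift
import FoundationModels

// Open a session with the on-device Apple Intelligence model and ask for a response.
// (API shape as shown in Apple's WWDC demo; exact names may differ in the shipping SDK.)
let session = LanguageModelSession()
let response = try await session.respond(to: "Suggest three names for a hiking-trip photo album.")
print(response.content)
```

Because the model runs on-device, there is no API key, network round trip, or per-token charge involved.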
Also: How Apple just changed the developer world with this AI announcement
And in Xcode 26, Apple will now let developers use the generative coding partner of their choice. ChatGPT is integrated into Xcode 26 by default, but developers can also use API keys from other providers to bring those models into Xcode. Apple sees this as a fast-moving space and wants developers to have access to the latest tools of their choice, rather than limiting them to tools built by Apple alone.
All in all, Apple is making a lot of pragmatic choices in how it talks about AI, leaning into the things LLMs do best and simply using generative AI to build better features on your phones, laptops, and other devices.
Stay on top of all the latest AI developments from Apple and the rest of the AI ecosystem with ZDNET's free Tech Today newsletter.