If you have recently upgraded to a newer iPhone model, you have probably noticed that Apple Intelligence is showing up in some of your most-used apps, such as Messages, Mail and Notes. Apple Intelligence (yes, also abbreviated to AI) arrived in Apple’s ecosystem in October 2024, and it is here to stay as Apple competes with Google, OpenAI, Anthropic and others to build the best AI tools.
What is Apple Intelligence?

Cupertino marketing executives have branded Apple Intelligence “AI for the rest of us.” The platform is designed to leverage the things generative AI already does well, such as text and image generation, to improve existing features. Like other platforms, including ChatGPT and Google Gemini, Apple Intelligence was trained on large information models. These systems use deep learning to form connections, whether in text, images, video or music.
The text side, powered by the LLM, presents itself as Writing Tools. The feature is available across various Apple apps, including Mail, Messages, Pages and Notifications. It can be used to summarize long text, proofread and even write messages for you, using content and tone prompts.
Image generation is integrated in similar fashion, albeit a bit less seamlessly. Users can prompt Apple Intelligence to generate custom emoji (Genmoji) in an Apple house style. Image Playground, meanwhile, is a standalone image generation app that uses prompts to create visual content that can be shared via Messages, Keynote or social media.
Apple Intelligence also marks a long-awaited facelift for Siri. The smart assistant was early to the game but has been mostly neglected over the past several years. Siri is now integrated much more deeply into Apple’s operating systems; for example, instead of the familiar icon, users see a glowing light around the edge of their iPhone screen when Siri is doing its thing.
More important, the new Siri works across apps. That means, for example, that you can ask Siri to edit a photo and then insert it directly into a text message. It’s a frictionless experience the assistant has previously lacked. Onscreen awareness means Siri uses the context of the content you are currently engaged with to provide an appropriate answer.
Leading up to WWDC 2025, many expected Apple to introduce an even more souped-up version of Siri, but we will have to wait a bit longer.
“As we’ve shared, we’re continuing our work to make Siri even more personal,” Apple SVP of Software Engineering Craig Federighi said at WWDC 2025. “This work needed more time to reach our high-quality bar, and we look forward to sharing more about it in the coming year.”
The more personalized version of Siri should be able to understand personal context, such as your relationships, communication routines and more. But according to a Bloomberg report, the in-development version of this new Siri was too error-ridden to ship, hence its delay.
At WWDC 2025, Apple also unveiled a new AI feature called Visual Intelligence, which helps you perform an image search for things you see as you browse. Apple also unveiled a Live Translation feature that can translate conversations in real time in the Messages, FaceTime and Phone apps.
Visual Intelligence and Live Translation are expected to become available later in 2025, when iOS 26 launches to the public.
When was Apple Intelligence unveiled?
After months of speculation, Apple Intelligence took center stage at WWDC 2024. The platform was announced in the wake of a torrent of generative AI news from companies like Google and OpenAI, prompting concern that the famously tight-lipped tech giant had missed the boat on the latest tech craze.
Contrary to such speculation, however, Apple had a team in place working on what proved to be a very Apple approach to artificial intelligence. There was still pizzazz amid the demos (Apple always loves to put on a show), but Apple Intelligence is ultimately a very pragmatic take on the category.
Apple Intelligence is not a standalone feature. Rather, it’s about integrating into existing offerings. While it is a branding exercise in a very real sense, the large language model (LLM)-driven technology operates behind the scenes. As far as the consumer is concerned, the technology mostly presents itself as new features for existing apps.
We learned more during Apple’s iPhone 16 event in September 2024. During the event, Apple touted a number of AI-powered features coming to its devices, from translation on the Apple Watch Series 10 to visual search on iPhones and a number of tweaks to Siri’s capabilities. The first wave of Apple Intelligence arrived in late October, as part of iOS 18.1, iPadOS 18.1 and macOS Sequoia 15.1.
The features launched first in U.S. English. Apple later added Australian, Canadian, New Zealand, South African and U.K. English localizations. Support for Chinese, English (India), English (Singapore), French, German, Italian, Japanese, Korean, Portuguese, Spanish and Vietnamese arrived in 2025.
Who gets Apple Intelligence?

The first wave of Apple Intelligence arrived in October 2024 via the iOS 18.1, iPadOS 18.1 and macOS Sequoia 15.1 updates. These updates included integrated Writing Tools, image cleanup, article summaries and a typing input for the redesigned Siri experience. A second wave of features became available as part of iOS 18.2, iPadOS 18.2 and macOS Sequoia 15.2. That list includes Genmoji, Image Playground, Visual Intelligence, Image Wand and ChatGPT integration.
These offerings are free to use, so long as you have one of the following pieces of hardware:
- All iPhone 16 models
- iPhone 15 Pro Max (A17 Pro)
- iPhone 15 Pro (A17 Pro)
- iPad Pro (M1 and later)
- iPad Air (M1 and later)
- iPad Mini (A17 or later)
- MacBook Air (M1 and later)
- MacBook Pro (M1 and later)
- iMac (M1 and later)
- Mac Mini (M1 and later)
- Mac Studio (M1 Max and later)
- Mac Pro (M2 Ultra)
Notably, only the Pro versions of the iPhone 15 got access, owing to shortcomings in the standard model’s chipset. The entire iPhone 16 line, however, is able to run Apple Intelligence.
How does Apple’s AI work without an internet connection?

When you ask GPT or Gemini a question, your query is sent to external servers to generate a response, which requires an internet connection. But Apple has taken a small-model, bespoke approach to training.
The biggest benefit of this approach is that many of these tasks become far less resource-intensive and can be performed on-device. This is because, rather than relying on the kind of kitchen-sink approach that fuels platforms like GPT and Gemini, the company has compiled datasets in-house for specific tasks, like, say, composing an email.
That doesn’t apply to everything, however. More complex queries utilize the new Private Cloud Compute offering. The company now operates remote servers running on Apple Silicon, which it claims allows it to offer the same level of privacy as its consumer devices. Whether an action is being performed locally or via the cloud is invisible to the user, unless their device is offline, at which point remote queries will toss up an error.
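To make that routing behavior concrete, here is a conceptual sketch in Swift. To be clear, none of these types or names correspond to a real Apple API; they are hypothetical and only model the behavior described above: small tasks run on-device, complex ones fall back to Private Cloud Compute, and remote requests fail when the device is offline.

```swift
// Hypothetical illustration of the routing described above.
// These types are NOT a real Apple API.

enum RouteError: Error {
    case offline // a remote query was attempted without a network connection
}

enum Route {
    case onDevice            // handled locally by the small model
    case privateCloudCompute // escalated to Apple Silicon servers
}

struct IntelligenceRequest {
    let prompt: String
    let isComplex: Bool // e.g., long-context or open-ended queries
}

func route(_ request: IntelligenceRequest, isOnline: Bool) throws -> Route {
    // Simple tasks (summaries, proofreading, Genmoji) stay on-device.
    guard request.isComplex else { return .onDevice }
    // Complex queries need Private Cloud Compute, which requires a network.
    guard isOnline else { throw RouteError.offline }
    return .privateCloudCompute
}
```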
Apple Intelligence and third-party apps

Ahead of the launch of Apple Intelligence, there was a lot of noise about Apple’s pending partnership with OpenAI. Ultimately, however, it turned out that the deal was less about powering Apple Intelligence and more about offering an alternative platform for the things it isn’t really built for. It’s a tacit acknowledgment that building a small-model system has its limitations.
Apple Intelligence is free. So, too, is access to ChatGPT. However, those with paid accounts for the latter get access to premium features free users don’t, including unlimited queries.
ChatGPT integration, which debuted with iOS 18.2, iPadOS 18.2 and macOS Sequoia 15.2, has two primary roles: supplementing Siri’s knowledge base and adding to the existing Writing Tools options.
With the service enabled, certain questions will prompt Siri to ask the user to approve its accessing ChatGPT. Recipes and travel planning are examples of questions that may surface the option. Users can also directly prompt Siri to “ask ChatGPT.”
Compose is the other primary ChatGPT feature available through Apple Intelligence. Users can access it in any app that supports the new Writing Tools feature. Compose adds the ability to write content based on a prompt, joining existing writing tools like Style and Summary.
We know for sure that Apple plans to partner with additional generative AI services. The company has all but said that Google Gemini is next on that list.
Can developers build on Apple’s AI model?
At WWDC 2025, Apple announced what it calls the Foundation Models framework, which lets developers tap into its AI models while offline.
This makes it easier for developers to build AI features into their third-party apps that leverage Apple’s existing systems.
“For example, if you’re getting ready for an exam, an app like Kahoot can create a personalized quiz from your notes to make studying more engaging,” Federighi said at WWDC. “And because it happens using on-device models, this happens without cloud API costs (…) We couldn’t be more excited about how developers can build on Apple Intelligence to bring you new experiences that are smart, available when you’re offline, and that protect your privacy.”
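For a rough sense of what building on this looks like, here is a minimal sketch using the Foundation Models framework as Apple presented it at WWDC 2025. The `SystemLanguageModel`, `LanguageModelSession` and `respond(to:)` names follow Apple’s announcement, but treat the exact signatures as illustrative rather than definitive.

```swift
import FoundationModels

enum QuizError: Error {
    case modelUnavailable // device ineligible, Apple Intelligence off, or model not ready
}

// Turn a user's study notes into a single quiz question, entirely on-device.
func generateQuizQuestion(from notes: String) async throws -> String {
    // Confirm the on-device model is usable before prompting it.
    guard case .available = SystemLanguageModel.default.availability else {
        throw QuizError.modelUnavailable
    }

    // A session runs against the local model: no network round trip required.
    let session = LanguageModelSession(
        instructions: "You turn study notes into one short quiz question."
    )
    let response = try await session.respond(
        to: "Create a quiz question from these notes:\n\(notes)"
    )
    return response.content
}
```

Because the session runs against the on-device model, the call works offline and incurs no cloud API cost, which is the trade-off Federighi describes above.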

