Last year, Apple’s WWDC keynote highlighted the company’s ambitious progress in AI. This year, the company placed less emphasis on Apple Intelligence and focused on its operating systems, services, and updates, including a new design aesthetic called “Liquid Glass” and a new naming convention.
Nevertheless, Apple still tried to please the crowd with some AI-related announcements, such as an image analysis tool, a workout coach, a live translation feature, and more.
Visual Intelligence
Visual Intelligence is Apple’s AI-powered image analysis technology that lets you gather information about your surroundings. For example, it can identify a plant in a garden, tell you about a restaurant, or recognize an item of clothing someone is wearing.

Now, this feature can also interact with the content on your iPhone screen. For example, if you come across a post on a social media app, Visual Intelligence can search for an image you see while browsing. The tool performs searches using Google Search, ChatGPT, and similar apps.
To access Visual Intelligence, open the Control Center or customize the Action Button (the same button typically used to take a screenshot). The feature becomes available with iOS 26 when it launches later this year. Read more.
ChatGPT image generation comes to Image Playground
Apple integrated ChatGPT into Image Playground, its AI-driven image generation tool. With ChatGPT, the app can now produce images in new styles, such as “anime,” “oil painting,” and “watercolor.” There will also be an option to send a prompt to ChatGPT to create additional images. Read more.
Workout Buddy
Apple’s latest AI-powered workout coach does what it sounds like: it uses a text-to-speech model that mimics a personal trainer’s voice to encourage you while you exercise. When you start a run, the AI in the Workout app delivers a motivational pep talk, highlighting key moments, such as when you ran your fastest mile, along with your average heart rate. After you complete the workout, the AI summarizes your average pace, heart rate, and whether you achieved any milestones. Read more.
Live Translation
Apple is powering a new live translation feature for Messages, FaceTime, and phone calls. The technology automatically translates text or spoken words into the user’s preferred language in real time. During FaceTime calls, users will see live translated captions, while for phone calls, Apple will speak the translation aloud. Read more.

AI helps with unknown callers
Apple has introduced two new AI-powered features for phone calls. The first, called Call Screening, automatically answers calls from unknown numbers in the background. It gathers the caller’s name and reason for calling, letting users decide whether to pick up.
The second feature, Hold Assist, automatically detects hold music while you wait for a call center agent. Users can choose to stay connected while on hold, freeing them to use their iPhone for other tasks, and a notification will alert them when a live agent becomes available. Read more.
Poll suggestions in Messages
Apple also introduced a new feature that allows users to create polls within the Messages app. The feature uses Apple Intelligence to suggest polls based on the context of your conversation. For example, if people in a group chat are having trouble deciding where to eat, Apple Intelligence will suggest starting a poll to help land on a decision. Read more.

AI-powered Shortcuts
The Shortcuts app is becoming more useful with Apple Intelligence. The company explained that when building shortcuts, users will be able to select AI models to enable features such as AI-generated summaries. Read more.

Context-aware Spotlight
Spotlight, the on-device search feature for Mac, is getting a minor update. It will now incorporate Apple Intelligence to improve its contextual awareness, surfacing suggestions for actions tailored to the tasks users typically perform and to what they are currently working on. Read more.
Foundation Models for developers
Apple is now giving developers access to its AI models, even when offline. The company introduced the Foundation Models framework, which enables developers to build more AI capabilities into their third-party apps on top of Apple’s existing systems. This will likely encourage more developers to create new AI features as Apple competes with other AI companies. Read more.
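To give a sense of what this could look like in practice, here is a minimal sketch of a third-party app calling Apple’s on-device model. The specific type and method names (`LanguageModelSession`, `respond(to:)`) are assumptions based on Apple’s Foundation Models framework announcement, not details from this article:

```swift
import FoundationModels  // Apple's on-device model framework (name per WWDC announcement)

// A hypothetical helper: ask the on-device foundation model to summarize text.
// Because the model runs on-device, no network connection is required.
func summarize(_ text: String) async throws -> String {
    let session = LanguageModelSession()
    let response = try await session.respond(
        to: "Summarize the following in one sentence: \(text)"
    )
    return response.content
}
```

In this sketch, the app never ships its own model or calls a cloud API; it reuses the system model Apple already bundles with the OS, which is presumably why Apple expects the framework to lower the barrier for adding AI features.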
Apple’s AI-powered Siri setback
The most disappointing news to emerge from the event was that the much-awaited upgrades to Siri are not yet ready. Attendees were hoping for a glimpse of the promised AI-powered capabilities, which had been expected to launch. However, Craig Federighi, Apple’s SVP of Software Engineering, said the company would not have much to share until next year. The delay may raise questions about Apple’s strategy for its voice assistant in a rapidly competitive market. Read more.