At last year's WWDC keynote, Apple showcased ambitious advancements in artificial intelligence. This year, the company shifted its focus away from Apple Intelligence and toward updates to its operating systems, services, and software, most notably a new design language dubbed “Liquid Glass” and a revised naming convention. Even so, Apple made several AI-related announcements, including an updated image analysis tool, a virtual workout coach, and a live translation feature.
One of the standout announcements is an update to Visual Intelligence, Apple's AI-powered image analysis tool for learning about your surroundings: it can identify plants in a garden, surface details about nearby restaurants, or recognize clothing items. The upcoming version extends those capabilities to whatever is displayed on your iPhone's screen. If you come across a post on social media, Visual Intelligence can run an image search on the content you see, using tools such as Google Search and ChatGPT. Users can access Visual Intelligence by opening the Control Center or by customizing the Action button. The feature will arrive with iOS 26 later this year.
Apple has also integrated ChatGPT into Image Playground, its AI-driven image generation tool. The app can now produce images in styles such as “anime,” “oil painting,” and “watercolor,” and users can send prompts to ChatGPT to create additional customized images, broadening the app's creative range.
In fitness, Apple unveiled a new AI-driven workout coach. The feature mimics a personal trainer's voice using a text-to-speech model, offering encouragement as you exercise. When you start a run, it delivers a motivational talk that highlights milestones such as your fastest mile and your average heart rate; after the workout, it summarizes your performance, including average pace and milestones reached.
Apple Intelligence also powers a new live translation capability for Messages, FaceTime, and phone calls, automatically translating text and spoken audio into the user's preferred language in real time. FaceTime calls get live translated captions, while phone calls get audible translations, easing communication across language barriers.
Apple also introduced two AI-driven features for phone calls. The first, call screening, automatically answers calls from unknown numbers in the background, letting users hear the caller's name and reason for calling before deciding to pick up. The second, hold assist, detects hold music on calls with customer service agents; users can stay on hold while doing other things on their iPhone and receive a notification when a live agent is available.
The Messages app is also gaining polls, with Apple Intelligence suggesting poll options based on the conversation's context. If a group chat is struggling to choose a restaurant, for instance, Apple Intelligence will recommend starting a poll to settle the question.
The Shortcuts app is also benefiting from Apple Intelligence enhancements. Users will soon be able to select an AI model while creating shortcuts, enabling features like AI summarization. This improvement aims to streamline user tasks and boost productivity by automating routine actions.
A minor but impactful update to the Spotlight search feature on Mac will incorporate Apple Intelligence to enhance contextual awareness. This upgrade will provide users with tailored action suggestions based on their current tasks, improving the overall search experience.
Finally, Apple introduced the Foundation Models framework, which gives developers access to its on-device AI models, ones that work even without an internet connection, so they can build advanced AI capabilities into third-party applications. The move positions Apple as a more competitive player in the AI development landscape.
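As a rough illustration of what calling the framework might look like, here is a minimal Swift sketch. It is based on the `LanguageModelSession` API Apple documented for Foundation Models, but treat the exact names and signatures as assumptions rather than a definitive implementation:

```swift
import FoundationModels

// A minimal sketch: ask the on-device model to summarize some text.
// Because the model runs locally, no network connection is required.
func summarize(_ text: String) async throws -> String {
    // A session wraps a conversation with the on-device foundation model;
    // the instructions steer how it responds to every prompt in the session.
    let session = LanguageModelSession(
        instructions: "Summarize the user's text in one sentence."
    )
    let response = try await session.respond(to: text)
    return response.content
}
```

The key point for developers is that this is a local call into the operating system's model, not a request to a cloud service, which is why it can run offline and why Apple is pitching it to third-party apps.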
While this year's WWDC lacked the AI-centric spotlight of last year's, Apple still made significant strides in integrating artificial intelligence into its products and services. With features like Visual Intelligence, live translation, and its other AI-powered tools, the company continues to weave these capabilities deeper into its platforms.