Apple is developing a new version of Siri built around large language models (LLMs), the same class of technology behind chatbots such as ChatGPT and Claude. Unlike the current Siri, whose foundations date back to its debut more than a decade ago, the upgraded assistant is intended to be smarter, more capable, and able to handle complex tasks with far less friction.
This LLM-based Siri is expected to hold natural, back-and-forth conversations and give more human-like answers. It should also be able to take on multi-step requests that the current assistant cannot, turning Siri into a genuinely powerful digital assistant.
Apple's effort to overhaul Siri hasn't gone smoothly. A smarter, Apple Intelligence-powered version of Siri was first unveiled at the 2024 Worldwide Developers Conference alongside iOS 18 and generated significant excitement. After months of waiting, however, Apple announced that the features were delayed. In March 2025, the company confirmed that the new Apple Intelligence Siri upgrades would not ship in iOS 18 as planned, pushing the timeline to 2026.
The delayed functionality was not the highly anticipated LLM version of Siri, but rather an intermediate upgrade that promised enhanced intelligence without full chatbot capabilities. Some key features awaiting release include:
Personal Context: Siri will remember details from your emails, messages, files, and more, allowing it to assist you more effectively. Imagine asking, "Show me the files Eric sent last week" or "What's my passport number?"

Onscreen Awareness: Siri will be able to interpret what's displayed on your screen and act on it. For instance, if you receive a text containing an address, you can ask Siri to add it to a contact.

Deeper App Integration: Siri will be able to perform tasks that span multiple applications, such as moving files between apps, editing images, and sharing estimated arrival times with contacts (a rough idea of what that could look like for developers is sketched after this list).

Apple addressed its missteps with these Siri features around the launch of iOS 26. Craig Federighi, Apple's software engineering chief, explained that the first-generation architecture behind the personalized Siri features was too limited. Recognizing the need for a more robust solution, Apple pivoted to a second-generation architecture built on large language models. That transition forced the delays, but it promises a more substantial upgrade than originally envisioned.
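Apple hasn't said exactly how third-party apps would plug into the smarter Siri, but its existing App Intents framework is the most plausible surface for the "deeper app integration" described above. The sketch below is a minimal, hypothetical example, not Apple's announced API for LLM Siri: the intent name, parameter, and dialog text are illustrative assumptions showing how an app might expose a "share my ETA" action that Siri could invoke.

```swift
import AppIntents

// Hypothetical sketch: an app exposes a "share my ETA" action through the
// App Intents framework so Siri (or Shortcuts) can invoke it. The intent name,
// parameter, and dialog are illustrative assumptions, not a documented LLM Siri API.
struct ShareETAIntent: AppIntent {
    static var title: LocalizedStringResource = "Share Estimated Arrival Time"
    static var description = IntentDescription("Sends your estimated arrival time to a contact.")

    // Siri would fill this parameter from the user's request,
    // e.g. "Share my ETA with Eric."
    @Parameter(title: "Contact Name")
    var contactName: String

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // A real app would look up the contact, compute the ETA from the
        // active route, and send a message; this sketch only returns a dialog.
        let eta = "6:45 PM" // placeholder value for the example
        return .result(dialog: "Shared your \(eta) arrival time with \(contactName).")
    }
}
```

The design question for an LLM-driven Siri is less about the plumbing above and more about whether the assistant can chain such intents together across several apps in response to a single spoken request.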
The challenges faced by Siri led to significant changes within Apple’s AI team. John Giannandrea, Apple’s AI chief, was removed from the Siri leadership team, with Mike Rockwell, head of Vision Pro, stepping in to oversee the project. This restructuring aims to revitalize the development of Siri and restore confidence in its capabilities.
Amid low morale among AI employees and competition from companies like Meta, there are rumors suggesting that Apple may collaborate with external AI firms, including OpenAI and Anthropic, to enhance Siri. These partnerships could facilitate the development of a more sophisticated Siri while allowing Apple to continue refining its own large language model in the background. However, no final decisions have been made yet.
Apple's commitment to improving Siri reflects its strategy to stay at the forefront of AI technology. With the integration of large language models and a focus on delivering a comprehensive digital assistant, Apple aims to redefine what users can expect from Siri. The upcoming years will be crucial as Apple navigates these enhancements and potential partnerships, striving to meet its high standards for innovation and user satisfaction.