The Siri joke is ending
For years, "Siri is bad" has been one of the most consistent complaints in tech — a running joke that Apple acknowledged with "improvements" that never quite addressed the fundamental gap between Siri and competitors.
2026 is when that finally changes. Apple has officially announced a completely reimagined AI-powered Siri, and for once, the underlying architecture change is significant enough that the skepticism feels less justified than usual.
What's actually different
The old Siri was fundamentally a voice interface to structured Apple services. Ask about the weather, set a timer, call a contact — it handled these tasks reliably. Ask it to do anything involving reasoning, synthesis, or context across apps, and it fell apart.
The new Siri is being rebuilt around "on-screen awareness" — the ability to understand and act on whatever is currently visible on the device. This sounds small. It isn't.
Imagine asking Siri to "send this to my manager" while looking at an email, and Siri knows who your manager is from your contacts, knows you mean email (because you're in Mail), and completes the action. Or asking "summarize what we discussed" while in Messages, and getting an intelligent summary of a conversation thread.
This is cross-app integration at a level Siri has never achieved. The on-screen awareness capability is what makes this architecturally different rather than incrementally improved.
The Google Gemini partnership
Here's the detail that surprised everyone: Apple is partnering with Google to use its Gemini model for the underlying AI capabilities.
The model in question is Google's 1.2-trillion-parameter Gemini — among the largest and most capable publicly documented models. Apple is not building this from scratch. They're building the interface, the privacy layer, and the Apple-specific integration on top of a model they didn't train.
The privacy implications matter enormously here. Apple's primary AI differentiation has been on-device processing. With Gemini integration, the architecture involves sending some queries to Google's infrastructure. Apple is trying to address this with Private Cloud Compute, but it still represents a genuine departure from a fully on-device approach.
What it means for the App Store ecosystem
The cross-app integration capability has significant implications for developers. If Siri can intelligently navigate between apps, execute multi-step tasks, and understand context from any app on the device, it creates a layer above individual apps that didn't exist before.
Apple has announced an expanded Siri API that allows apps to declare capabilities and actions that Siri can invoke. The developer opportunity here is essentially: "make your app's key functions available as Siri actions and benefit from users discovering your app through voice."
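Declaring an action for Siri would likely build on Apple's existing App Intents framework. Here's a minimal sketch of what exposing an app function as a Siri-invokable action could look like; the intent name, parameter, and summarization logic are illustrative assumptions, not details from Apple's announcement.

```swift
import AppIntents

// Hypothetical intent exposing a "summarize this thread" action to Siri.
// The type name and parameter are invented for illustration.
struct SummarizeThreadIntent: AppIntent {
    static var title: LocalizedStringResource = "Summarize Thread"
    static var description = IntentDescription(
        "Summarizes the currently open conversation thread."
    )

    // Siri would resolve this from on-screen context or ask the user.
    @Parameter(title: "Thread ID")
    var threadID: String

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // App-specific logic would fetch and summarize the thread here.
        let summary = "Placeholder summary for thread \(threadID)."
        return .result(dialog: "\(summary)")
    }
}
```

The idea is that once an app declares actions like this, Siri can chain them across apps on the user's behalf, which is where the voice-driven discovery opportunity comes from.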
Whether Apple maintains the appropriate balance between enabling developer integration and leveraging it for their own apps will be a question regulators and developers watch closely.
The timeline
A completely rebuilt AI system launching in 2026 means a phased rollout. The core on-screen awareness features are launching with iOS 20 / iPadOS 20 in the fall 2026 update cycle. Some features, particularly the most sophisticated cross-app scenarios, are described as "later this year" rather than day one.
Historically, "later this year" from Apple can mean Q1 of the following year. Manage expectations accordingly.
My honest take
Apple moving from "Siri is a voice shortcut system" to "Siri is a genuine AI assistant that understands your context" is the right direction. The on-screen awareness architecture is genuinely clever — it lets the model work with context that users have already generated rather than asking them to provide context explicitly.
The Google partnership is pragmatic, and surprising. Building a frontier model yourself takes years and resources that even Apple would feel. Using Gemini gets them to capability parity faster.
The risk: Apple's differentiation in AI has been privacy. If that's now shared with Google, what's the Apple advantage?
The bet they're making: the integration quality, the Apple ecosystem coherence, and the hardware (Apple Silicon) will create an experience better than anything Google offers in its own products, even if Gemini is the underlying model.
That bet could pay off. Apple has been good at integration before.