Breaking: January 21, 2026 – Apple just confirmed what everyone feared and nobody wanted to admit: the old Siri is dead. Completely dead. And instead of building their own AI to replace it, they’re paying Google $1 billion per year to power the future of iPhone voice interaction.
Let that sink in. Apple, the company built on “we own the entire stack,” is outsourcing the brain of Siri to their biggest rival.
The news dropped via Bloomberg’s Mark Gurman two days ago, and the tech world is still reeling. iOS 27, launching in September 2026, will feature a completely rebuilt Siri codenamed “Campos” that transforms Apple’s clunky voice assistant into a full ChatGPT-style conversational AI chatbot. Powered by Google Gemini.
This isn’t an incremental update. This is Apple admitting they lost the AI race, swallowing their pride, and handing Google the keys to the most valuable real estate in tech: your iPhone’s voice interface.
And honestly? It’s simultaneously the smartest and most embarrassing thing Apple has done in years.
Let me explain why this matters way more than you think.
The Admission Nobody Saw Coming: Craig Federighi’s Bombshell
Before we dive into what’s coming, we need to talk about WHY this is happening. Because the backstory here is absolutely wild.
At WWDC in June 2024, Apple unveiled Apple Intelligence and promised a revolutionary new Siri. The demos were impressive. The AI seemed capable. Tim Cook said the features would ship with iOS 18.
They didn’t.
Then Apple said iOS 18.1. Nope. Then iOS 18.4. Nope. Then iOS 19.1. Still nope.
Finally, in a rare moment of brutal honesty, Craig Federighi, Apple’s SVP of Software Engineering, admitted the truth:
“The new Siri did not reach the level of reliability Apple wanted, with internal testing revealing unacceptable error rates.”
Translation: Apple’s own AI sucked. They knew it. And they couldn’t ship it without destroying their reputation.
This is HUGE. Apple executives almost NEVER publicly admit product failures before launch. The fact that Federighi came out and said “our AI wasn’t good enough” tells you exactly how serious the problem was.
Apple had two choices:
- Delay indefinitely and build their own LLM from scratch
- Swallow their pride and partner with someone who already has world-class AI
They chose option 2. And the partner they chose? Google. Their biggest mobile OS competitor. The company they’ve been fighting in court for years. The company they compete with in virtually every product category.
The irony is almost painful.
What’s Actually Coming: Meet “Campos” – The Siri That Actually Works
Okay, so what IS this new Siri chatbot? Let me break down everything we know from the Bloomberg report and subsequent leaks.
- Official Code Name: Campos
- Launch Timeline: iOS 27 / iPadOS 27 / macOS 27 (September 2026)
- Preview: WWDC 2026 (June 2026)
- Underlying Technology: Google Gemini (custom Apple-configured version)
- Interface: Replaces the current Siri completely
Here’s How It Actually Works:
Instead of the current Siri, where you say “Hey Siri, set a timer for 10 minutes” and it works (maybe), the new Siri chatbot will function like ChatGPT or Claude: full conversational AI that understands context, remembers what you said, and can handle complex multi-step requests.
Example Interaction:
You: “I need to prepare for my dinner party tomorrow.”
Current Siri: “Sorry, I didn’t understand that.”
New Siri (Campos): “I can help with that! Based on your calendar, you have 8 guests arriving at 7 PM. Would you like me to:
- Create a shopping list based on your saved recipes
- Set cooking timers for each dish
- Adjust your smart home lighting for ambiance
- Send calendar invites if you haven’t already
- Suggest a playlist based on your music preferences?”
You: “Yes, and remind me to start the appetizers by 6 PM.”
New Siri: “Done. I’ve created your shopping list, set a 6 PM reminder for appetizers, and queued up your dinner party playlist. Need anything else?”
THAT’S the difference. It’s not just command execution. It’s actual conversation with context awareness and proactive assistance.
The 6 Features That Make This Revolutionary
Based on the reports, here’s what the new Siri chatbot will ACTUALLY be able to do:
1. Natural Conversational Interface
The chatbot will have the same natural language conversation functionality as chatbots like ChatGPT.
No more rigid commands. You can talk to Siri like you’d talk to a person, with follow-up questions, clarifications, and context that carries across the conversation.
2. Deep Personal Context Understanding
Siri will be able to understand and use information from emails, messages, photos, calendar entries and files stored on the device.
Example: “What was that restaurant Sarah recommended?” Siri will search your Messages, find the conversation, pull the restaurant name, and potentially even make a reservation.
3. Visual Intelligence and Screen Awareness
Apple is designing a feature that will let the Siri chatbot view open windows and on-screen content, as well as adjust device features and settings.
Siri will be able to SEE what’s on your screen and interact with it. “Summarize this article I’m reading” or “Reply to this email saying I’ll be there at 3 PM.”
4. Complete App Integration
Siri will integrate into all Apple apps, including Photos, Mail, Messages, Music, and TV, and it will be able to access and analyze content in the apps to respond to queries and requests.
Every first-party app becomes Siri-controllable. “Find photos from my trip to Japan last year and create an album” or “Play the song from that movie we watched last week.”
5. Content Generation Capabilities
The chatbot will be able to search the web, generate content like images, help with coding, summarize information, and analyze uploaded files.
Siri becomes a creative assistant. Write emails, generate images, produce code snippets, create summaries: all the stuff ChatGPT does, but integrated into your phone.
6. Voice AND Text Interface
There will be voice and type-based interface options.
You can type to Siri when you’re in a meeting or public space, or speak when it’s convenient. Finally, flexibility that matches how people actually use their phones.
The Google Deal: $1 Billion Per Year for AI Supremacy
Let’s talk money. Because this is where things get really interesting.
Apple and Google have entered an AI technology partnership reportedly worth as much as $5 billion, with Apple paying roughly $1 billion annually.
One. Billion. Dollars. Per. Year.
That’s what Apple is paying Google for access to Gemini technology to power Siri. And it’s a multi-year deal, potentially totaling $5 billion over the contract duration.
But here’s the fascinating part: Apple isn’t just licensing Gemini as-is. They’re getting a custom-configured version that runs on Apple’s own Private Cloud Compute servers.
Apple can independently customize Google’s Gemini models without interference. It can ask Google to tweak aspects of how the model works, but otherwise Apple can fine-tune Gemini on its own so that it responds to queries the way Apple prefers.
Translation: Apple pays for the technology, but maintains complete control over how it works, what data it sees, and how it responds. Google doesn’t get access to your data. They don’t control the experience. They just provide the AI engine.
And crucially: Any integration of Gemini into Apple products will not have any additional branding. Any responses Siri provides using the new models will not have any mention of other companies.
From a user perspective, this will just be “Siri.” No “Powered by Google” watermark. No branding. Just Apple’s interface with Google’s brain underneath.
It’s brilliant positioning, if you think about it. Apple gets world-class AI without building it themselves. Google gets $1 billion per year and distribution to billions of devices. Users get a Siri that finally doesn’t suck.
Everyone wins. Except Apple’s ego.
Why Google Won (And OpenAI Lost)
This deal is a MASSIVE win for Google in the AI wars. Here’s why:
1. Distribution at Scale
There are 2+ billion active Apple devices. Suddenly, Google’s Gemini AI is powering voice interaction for iPhone users worldwide. That’s distribution money can’t buy (well, $1B/year can, but you get the point).
2. Validation Over OpenAI
Apple’s decision to go with Google solidifies the narrative that Google has not only caught up with OpenAI, but has now edged past it in having the best AI models in the market.
Apple evaluated OpenAI (ChatGPT), Anthropic (Claude), and Google (Gemini). They chose Google. That’s a stamp of approval that’s worth billions in market perception.
3. Revenue Sharing Potential
With Gemini powering the new version of Siri, Google may get a share of any revenue those users generate through product discovery and purchases made through a Gemini-powered Siri.
If Siri helps you shop, book travel, or discover products, Google could get a cut. That turns a $1B annual payment into a potential revenue-generating partnership.
4. The Search Partnership Stays Intact
Google already pays Apple $18-20 billion per year to be the default search engine on Safari. This deal doesn’t disrupt that. In fact, it strengthens the Apple-Google relationship, making it even harder to break apart.
Meanwhile, OpenAI got nothing. They have a ChatGPT integration in iOS, but it’s a side feature that users can optionally enable. It’s not the core AI powering Siri. OpenAI CEO Sam Altman must be FURIOUS.
The Privacy Angle: How Apple Squares This Circle
The elephant in the room: how does Apple maintain its privacy reputation while sending user data to Google’s AI?
Apple’s Solution: Private Cloud Compute
Apple Intelligence will continue to run on Apple devices and its Private Cloud Compute system.
Here’s how it works:
Simple Tasks: Handled entirely on-device using Apple’s own small language models. “Set a timer,” “Call mom,” “Play music”: all processed locally. Google never sees this data.
Complex Tasks: Sent to Apple’s Private Cloud Compute servers (NOT Google’s servers) running a custom version of Gemini. The data is encrypted end-to-end, processed, and immediately deleted. Google doesn’t get access. Apple controls the infrastructure.
No Data Sharing: Apple has emphasized that Apple Intelligence will continue to prioritize its established privacy standards by utilizing a combination of on-device processing and Private Cloud Compute.
So technically, your queries never touch Google’s infrastructure. They run on Apple’s servers using Google’s AI models. It’s like renting an engine and putting it in your own car that you drive on your own roads.
Is it perfect? No. Any cloud processing introduces some privacy risk. But it’s about as good as you can get while still having cutting-edge AI capabilities.
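For the architecture-minded: here’s a minimal Swift sketch of what that simple-vs-complex routing could look like. To be crystal clear, every type and function name below is hypothetical, my own illustration of the split described above, not any real Apple API.

```swift
// Hypothetical sketch of the on-device vs. Private Cloud Compute routing
// described above. None of these types or functions are real Apple APIs.

enum SiriRoute {
    case onDevice      // timers, calls, media playback
    case privateCloud  // multi-step reasoning, generation, summarization
}

struct SiriRouter {
    // Crude stand-in for whatever classifier Apple actually uses.
    func classify(_ query: String) -> SiriRoute {
        let simpleKeywords = ["timer", "call", "play", "volume"]
        let isSimple = simpleKeywords.contains { query.lowercased().contains($0) }
        return isSimple ? .onDevice : .privateCloud
    }

    func handle(_ query: String) async -> String {
        switch classify(query) {
        case .onDevice:
            // Small local model; the query never leaves the device.
            return runLocalModel(query)
        case .privateCloud:
            // Encrypted request to Apple-operated servers running the
            // customized Gemini model; processed, answered, then deleted.
            return await sendToPrivateCloudCompute(query)
        }
    }

    private func runLocalModel(_ query: String) -> String {
        "(on-device response to: \(query))"
    }

    private func sendToPrivateCloudCompute(_ query: String) async -> String {
        "(Private Cloud Compute response to: \(query))"
    }
}
```

The key property is in the second branch: the cloud call goes to Apple’s servers, not Google’s. That’s the entire privacy argument in one switch statement.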
What This Means for iOS 26.4 (Launching March-April 2026)
Here’s where it gets confusing: there are TWO Siri updates happening in 2026.
iOS 26.4 (March-April 2026): A “more personalized version” of Siri with some Apple Intelligence features that were promised back at WWDC 2024. These are improvements to the current Siri, not the full chatbot replacement.
The Siri chatbot will be an upgrade to the more personalized version of Siri that Apple plans to roll out in iOS 26.4.
iOS 27 (September 2026): The full chatbot replacement codenamed Campos. This is the big one. This replaces the Siri interface entirely.
So if you’re keeping track:
- Now: Current Siri (pretty bad)
- March-April 2026: Improved Siri with Apple Intelligence (better, but not revolutionary)
- September 2026: Campos chatbot (completely new experience)
The Privacy Trade-Off Nobody’s Talking About
Here’s the uncomfortable question: how much will Siri “remember”?
ChatGPT remembers your conversations. Claude builds a profile of your preferences. Gemini learns from your interaction history. That’s how they get so good at personalization.
But that conflicts directly with Apple’s privacy stance.
Apple is considering how much the Siri chatbot will remember. Claude and ChatGPT are able to glean information about users from past conversations for personalization purposes, but Apple may limit Siri’s memory for privacy purposes.
Translation: Apple might cripple the chatbot’s memory to protect privacy, which would make it LESS useful than ChatGPT.
Imagine this scenario:
Monday: “Siri, I’m allergic to peanuts.”
Friday: “Siri, recommend a restaurant for dinner.”
Siri: “How about this Thai place? They have great peanut sauce dishes!”
If Siri doesn’t remember personal context, it can’t provide truly personalized assistance. But if it DOES remember everything, it’s collecting and storing massive amounts of personal data.
This is the fundamental tension Apple has to solve. And based on the report, they haven’t figured it out yet.
My Prediction: Apple will offer user-controlled memory settings. You can opt-in to full conversational memory for better personalization, or keep it limited for maximum privacy. Best of both worlds, but requires user education.
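If that prediction holds, the underlying model could be as simple as a per-user memory policy. Here’s a minimal Swift sketch of the idea; the types and policy names are entirely hypothetical, nothing Apple has announced:

```swift
// Hypothetical memory policy for the Siri chatbot. Purely illustrative;
// Apple has announced no such setting or API.

enum MemoryPolicy {
    case off                       // nothing retained: maximum privacy
    case sessionOnly               // context kept until the conversation ends
    case full(retentionDays: Int)  // persistent, user-reviewable memory
}

struct AssistantMemory {
    var policy: MemoryPolicy = .sessionOnly
    private var facts: [String] = []

    mutating func remember(_ fact: String) {
        switch policy {
        case .off:
            break                   // store nothing at all
        case .sessionOnly, .full:
            facts.append(fact)      // e.g. "allergic to peanuts"
        }
    }

    mutating func endSession() {
        if case .sessionOnly = policy { facts.removeAll() }
    }

    func context() -> [String] { facts }
}
```

With `.off`, the Thai-restaurant failure above is exactly what you’d get. With `.full`, Monday’s peanut allergy survives until Friday. The whole privacy-versus-personalization tension fits in one enum.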
The Testing Reality: Why This Could Still Fail
Here’s something buried in the reports that should worry everyone: Apple is testing the Siri chatbot as a standalone app, but it won’t be offered in app form. Instead, it will be built directly into Apple devices.
Translation: They’re building a standalone chatbot app for testing, then integrating it into the OS. That’s a HUGE technical challenge.
Why? Because chatbots as apps can fail gracefully. If ChatGPT crashes, your phone still works. But if the core Siri system crashes? Your whole voice interface breaks. Notifications might fail. Shortcuts stop working. It’s a much higher-stakes integration.
And remember: Craig Federighi said the new Siri did not reach the level of reliability Apple wanted, with internal testing revealing unacceptable error rates.
If Apple’s OWN AI had unacceptable error rates, what makes anyone think Google’s Gemini will be perfect when integrated into iOS?
The risk is real. Apple is betting the entire Siri experience, a feature used billions of times per day, on a third-party AI model they didn’t build and don’t fully control.
If it works: Apple looks like geniuses who made the smart business decision to partner rather than waste years building inferior technology.
If it fails: Apple looks incompetent for outsourcing their core voice interface and losing control of the user experience.
There’s no middle ground here.
The Snow Leopard Comparison: What It Really Means
Bloomberg’s Mark Gurman said that iOS 27 will be similar to Mac OS X Snow Leopard, in the sense that it will focus on stability improvements and bug fixes.
For those who don’t remember: Mac OS X Snow Leopard (2009) was a “refinement” release. No major flashy features. Just making everything that already existed work BETTER.
Translation for iOS 27: Don’t expect a ton of new features. Expect Apple to focus on making the AI integration solid, fixing bugs, improving performance, and ensuring reliability.
That’s actually smart. The Siri chatbot is such a massive change that layering on a bunch of other new features would be risky. Better to nail the foundation, then build on it in iOS 28.
But it also means iOS 27 might feel underwhelming to users who don’t care about Siri. If you don’t use voice assistants much, this entire update might not matter to you.
The Timing: WWDC 2026 and the 50th Anniversary Play
WWDC 2026 is in June. iOS 27 launches in September. And April 1st, 2026 is Apple’s 50th anniversary.
Connect the dots:
- April: 50th anniversary celebrations, possibly hardware announcements (HomePad with the new Siri?)
- June: WWDC preview of iOS 27 with the Campos chatbot demo
- September: iOS 27 public launch alongside the iPhone 18
It’s a perfect narrative arc: “For our 50th year, we’re reinventing how you interact with technology through the most advanced AI assistant ever created.”
Apple loves a good story. And this one writes itself.
Who This Actually Helps (And Who It Doesn’t)
Winners:
Power Users Who Gave Up on Siri: If you abandoned Siri years ago because it was useless, this brings you back. Finally, a voice assistant worth using.
People Deep in the Apple Ecosystem: If you have an iPhone, iPad, Mac, Apple Watch, and HomePod, having a competent AI that works across all of them seamlessly is huge.
Developers: SiriKit gets a massive upgrade. Apps that integrate with Siri suddenly become way more powerful when Siri can actually understand complex requests (see the sketch after this list).
Apple (If It Works): Credibility restored. AI narrative fixed. User engagement increases. Potential new revenue streams from AI-powered commerce.
Google: $1B/year, plus distribution to billions of devices, plus validation as the best AI. Total win.
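A quick aside on what that developer upgrade could look like in code. The modern plumbing here is Apple’s App Intents framework, the successor to SiriKit, and the minimal intent below uses the real, existing API. Whether Campos will drive intents exactly this way is my assumption, not something from the Bloomberg report.

```swift
import AppIntents

// A minimal App Intent: the kind of hook third-party apps already expose
// to Siri and Shortcuts. The App Intents API shown here is real; how the
// new conversational Siri will invoke it is speculation.
struct CreateAlbumIntent: AppIntent {
    static var title: LocalizedStringResource = "Create Photo Album"

    @Parameter(title: "Album Name")
    var albumName: String

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // App-specific logic would actually create the album here.
        return .result(dialog: "Created the album \(albumName).")
    }
}
```

Today, Siri and Shortcuts invoke an intent like this more or less one at a time. The promise of a conversational Siri is chaining many of them from a single natural-language request.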
Losers:
OpenAI: They lost the biggest AI partnership in tech to Google. That’s going to hurt long-term growth.
Users Who Don’t Trust Google: If you hate Google and refuse to use their products, well… tough. Siri runs on Gemini now. Your options are: use it anyway, or don’t use Siri.
Privacy Purists: Even with Private Cloud Compute, sending data to cloud-based AI is inherently less private than on-device processing. If you’re hardcore about privacy, this might feel like a betrayal.
Apple’s Engineering Pride: This is the company that used to build EVERYTHING in-house. Now they’re buying AI from Google like a startup that couldn’t build its own tech stack. That’s gotta sting.
The Unanswered Questions That Keep Me Up At Night
1. What happens if Google and Apple have a falling out? This is a multi-year deal. But what if antitrust regulators force them apart? What if the relationship sours? Apple would be left with no AI backbone for Siri.
2. Will Apple ever build their own LLM? Probably. The Google deal buys them time to develop their own technology. But that could take 3-5 years. Are they okay being dependent on Google that long?
3. How will this affect battery life? ChatGPT-level interactions require significant processing power. Even with the on-device + cloud hybrid approach, the battery impact could be noticeable.
4. What about offline functionality? Current Siri works (mostly) offline for basic tasks. Will the new chatbot require internet? That’s a huge downgrade for users in areas with poor connectivity.
5. Will other languages launch simultaneously? Siri already supports dozens of languages. Training a chatbot on that many languages is HARD. Will English launch first and other languages lag?
6. What’s the fallback if Gemini fails? If there’s an outage, does Siri just… stop working? Or is there a degraded mode that falls back to the old command-based system? (I sketch what that might look like right after this list.)
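On that last question, my guess (and it is only a guess) is a degraded mode: try the Gemini-backed path first, and fall back to the old command matcher when the cloud is unreachable. A Swift sketch of that pattern, with entirely made-up function names:

```swift
// Hypothetical graceful-degradation wrapper. Neither underlying function
// is a real Apple API; this only illustrates the fallback pattern.

enum AssistantError: Error {
    case cloudUnavailable
}

func answer(_ query: String) async -> String {
    do {
        // Preferred path: the Gemini-backed conversational model.
        return try await geminiBackedResponse(to: query)
    } catch {
        // Degraded mode: the legacy command matcher still covers
        // timers, calls, and playback when the cloud is down.
        return legacyCommandResponse(to: query)
    }
}

func geminiBackedResponse(to query: String) async throws -> String {
    throw AssistantError.cloudUnavailable  // simulate an outage for the sketch
}

func legacyCommandResponse(to query: String) -> String {
    "(basic command handling for: \(query))"
}
```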
My Brutally Honest Take: Smart Move, Risky Execution
Apple did the right thing partnering with Google. Building a competitive LLM from scratch would take years and billions of dollars, and they’d still probably end up with something worse than Gemini.
But the execution risk is ENORMOUS.
Integrating a third-party AI model as the core intelligence layer of your operating system is unprecedented. No major tech company has done this before. Everyone else either built their own AI (Google, OpenAI, Anthropic) or used it as an optional add-on (Microsoft with OpenAI).
Apple is betting their most-used feature, voice interaction, on technology they didn’t create and don’t fully control.
If it works: This is remembered as the moment Apple pragmatically admitted they couldn’t win the AI race alone and made the smart business decision to partner.
If it fails: This is remembered as the moment Apple lost their way, abandoned their vertical integration philosophy, and handed control of the user experience to their biggest competitor.
There’s no middle ground.
My Prediction: 70% chance this works reasonably well. Users will mostly be happy. Siri will finally be good enough that people actually use it.
30% chance there are significant problems (reliability issues, privacy concerns, performance degradation) that force Apple to backtrack or heavily modify the approach in iOS 28.
The Bottom Line: A New Era for Siri (For Better or Worse)
Siri has been a punchline for years. The assistant that couldn’t assist. The AI that wasn’t intelligent. The voice interface that made you long for buttons and keyboards.
That era is over.
iOS 27 brings Campos, a Google Gemini-powered chatbot that replaces Siri completely with actual conversational AI that (theoretically) works.
Apple is paying $1 billion per year for this privilege. They’re swallowing their pride. They’re admitting they lost the AI race. They’re outsourcing the brain of their devices to their biggest rival.
And you know what? It’s probably the smartest thing they could do.
Because the alternative, shipping an inferior AI and watching users flee to Android or stop using voice assistants entirely, would be way worse than the ego hit of partnering with Google.
We’ll get our first look at WWDC in June 2026. We’ll get our hands on it in September 2026 with iOS 27.
Until then, all we can do is hope that Apple and Google pull off one of the most audacious technology partnerships in history.
Because if they don’t? Every “Hey Siri” interaction is about to become a whole lot more frustrating.
Are you excited about the new Siri chatbot, or worried that Apple is outsourcing too much to Google? Would you trade some privacy for actually useful AI, or is that a line you won’t cross? And be honest—when was the last time Siri actually understood what you wanted on the first try? Drop your thoughts in the comments.
P.S. – The absolute WILDEST part of this story? Apple executives literally said “our AI wasn’t good enough” and nobody made fun of them for it. That’s how desperate everyone is for Siri to not suck. We’re so beaten down by years of “Sorry, I didn’t understand that” that we’re actually rooting for Apple to succeed with someone else’s technology. What a time to be alive.

