
Apple’s AI Overhaul: A Hybrid Siri Using Apple Models and a $1B Google Engine

by admin477351

Apple’s plan to fix Siri is a hybrid one, combining its own in-house AI models with a powerful $1 billion-a-year engine from Google. The company is finalizing a deal to license Google’s 1.2-trillion-parameter Gemini AI to handle the most complex parts of the new Siri, specifically its “summariser” and “planner” functions. Simpler features will continue to run on Apple’s proprietary models, creating a two-tiered system designed to balance power and privacy.
This “interim solution” is the result of Apple’s “Glenwood” project, an internal effort to make Siri competitive. After testing models from OpenAI and Anthropic, Apple’s leadership, including Craig Federighi, chose Gemini as the best short-term path forward. The new Siri, code-named “Linwood” and set for a spring release (iOS 26.4), will leverage this 1.2-trillion-parameter model—a massive upgrade from Apple’s 150-billion-parameter AI—to process complex, multi-step user requests.
The most critical part of this hybrid system is its privacy architecture. The Google-built Gemini model will be hosted on Apple’s Private Cloud Compute servers. This “walled-off” environment ensures that Google never sees or processes any Apple user data, a non-negotiable point for Apple. The company has already allocated the significant server hardware required to run this advanced model independently.
This partnership will be a silent one. Apple will not publicly market Google’s involvement, treating it as a “behind-the-scenes” technology supplier. This is a starkly different approach from the visible Safari search deal and is also separate from earlier, failed talks about a standalone Gemini chatbot integration. Apple is embedding Google’s tech at a deep, functional level.
While this $1 billion-a-year deal provides a necessary immediate boost, Apple’s long-term ambition is to replace Google’s engine. Its internal AI teams are developing a 1-trillion-parameter model to achieve this, but they are in a difficult race against Google’s Gemini 2.5 Pro, which already dominates AI benchmarks. This “temporary” hybrid system may be in place for some time.
