Apple is apparently on the verge of unveiling the first concrete results from its collaboration with Google Gemini. After years of struggling to develop its own powerful AI models for Siri and Apple Intelligence, the company is now relying on external support. Users should be able to see the practical effects of this partnership as early as next month.
The announcements surrounding Apple Intelligence at WWDC in June 2024 raised high expectations. At the same time, implementation took significantly longer than planned. Many features were postponed multiple times or only vaguely promised. With the switch to Gemini models, Apple now seems to have adopted a pragmatic approach to finally deliver on its promises.
Apple and Google: Gemini integration is getting closer to launch
Earlier this month, Apple and Google officially confirmed their collaboration in the field of artificial intelligence. In the future, Gemini models will form the basis for key Apple Intelligence features.
According to Mark Gurman of Bloomberg, this partnership is scheduled to begin with the iOS 26.4 beta. The rollout is expected in the second half of February. Apple plans to publicly showcase the new features and present concrete demos.
It's still unclear what format this presentation will take. Both a larger event and a smaller, tightly controlled press briefing are possible, perhaps at Apple's media loft in New York. Regardless of the format, the presentation is only a few weeks away.
New Siri features with iOS 26.4
With iOS 26.4, Siri is set to become noticeably more powerful. The assistant will better understand what's happening on the screen, recognize personal connections, and actively perform actions within apps. This brings Siri closer, for the first time, to the range of functions Apple promised almost two years ago.
Technically, these innovations are based on the so-called Apple Foundation Models v10. These run on Apple's Private Cloud Compute infrastructure and utilize Gemini models. Internally, a model with approximately 1.2 trillion parameters is used. For Apple, however, this is not the end of development, but rather a first step.
Preview of iOS 27 and macOS 27
Apple is already working on a further development stage. iOS 27 and macOS 27 are expected to integrate additional chatbot-like functions. These will be supported by the Apple Foundation Models v11, which, according to Gurman, are intended to be of a quality approaching that of Gemini 3.
Compared to the models in iOS 26.4, they are considered significantly more powerful. However, due to their higher demands, these functions may be more reliant on Google's infrastructure. Internal discussions regarding the scope and implementation of the iOS 27 features are still ongoing.
Why Apple chose Google
Before deciding on Google Gemini, Apple considered several alternatives. It was already revealed last year that Apple was considering using third-party models. Internally, this approach initially met with resistance. Mike Rockwell, who is responsible for Siri, is even said to have dismissed such reports as nonsense in a crisis meeting.
Nevertheless, Apple held talks with several AI providers. Negotiations with Anthropic took place but fell through after the company demanded billions of dollars per year over a multi-year term. A deal with OpenAI also proved complicated: OpenAI was actively poaching Apple employees and simultaneously working with Jony Ive on new hardware.
Another factor was a court ruling regarding the existing partnership between Apple and Google. After a court decided that the agreement for the default search engine on the iPhone was legally permissible, an additional deal with Google became significantly easier. This cleared the way for Gemini.
New Siri features mark a turning point for Apple
Everything indicates that Apple will soon deliver tangible Apple Intelligence features for the first time. Even though a considerable amount of time has passed between announcement and implementation, public sentiment could quickly change once the new Siri features can be tested in everyday life. The collaboration with Google marks a clear strategic shift for Apple – away from going it alone internally and towards functional solutions. (Image: Mojahid_Mottakin / DepositPhotos.com)