Apple is planning the most comprehensive Siri overhaul for iOS 27 since the voice assistant's introduction. A new system-wide gesture, a dedicated Siri app with chat history, and a permanent spot in the Dynamic Island are just three of the key elements. Just weeks before WWDC, further concrete design details are leaking out.
The information comes from reports by Mark Gurman at Bloomberg. They outline a Siri that operates much more like ChatGPT, Claude, or Gemini than the classic iPhone voice assistant users have known since 2011. Apple has sketched the general framework in several stages over recent months – the newly leaked details fill in the concrete plan that Apple intends to present at WWDC 2026 on June 8th.
The new system gesture "Search or Ask"
The key new feature is a system-wide search gesture that works throughout iOS 27. Swiping down from the top of the screen in any app activates a new "Search or Ask" bar that appears in the Dynamic Island. Tapping the bar then allows you to switch between Siri, ChatGPT, Gemini, or other search providers. A microphone icon switches to voice mode.
Functionally, this is very similar to today's Spotlight search. In terms of content, however, Apple's approach goes further: according to the report, the bar will display "more advanced results and additional data from apps." Spotlight thus sheds its identity as a pure file and contact search – instead, search becomes the primary interface between the user, the system, and AI-generated answers from the web.
Siri in Dynamic Island
When Siri is activated via the wake word or the iPhone's side button, a pill-shaped animation appears in the Dynamic Island. If the user asks a question or gives a command, Siri initially responds via a transparent results card. Swiping down transforms this card into a full-fledged chat mode that is visually reminiscent of iMessage.
In this chat mode, Apple integrates additional small cards – for example, for weather, notes, upcoming appointments, or other information relevant to the specific query. The answer, the result card, and the follow-up conversation thus flow seamlessly together. Those with a quick question get a fast answer. Those who want to delve deeper can slip into a full-fledged conversation without having to switch apps.
For the first time, a dedicated Siri app
Alongside the system integration, Apple is building its first dedicated Siri app. The interface is clearly inspired by OpenAI's ChatGPT app and Anthropic's Claude: there's an overview of previous conversations, a search bar, a "+" button for new chats, and a grid of preview tiles that allow users to tap to access previous conversations.
The app supports uploading images and documents. Users who access Siri through the app can both type and speak. This represents a clear departure from the previous logic, where Siri existed solely as a situational overlay assistant. With its own app, Siri becomes a permanently accessible space where users can start, retrieve, or pin conversations – similar to what they are used to with ChatGPT or Gemini.
Gemini in the engine room, freedom of choice takes center stage
Behind the design decisions lies the architecture that Apple has been working on for months. The new Siri is based on a model developed specifically for Apple by Google's Gemini team – a partnership officially confirmed by Google. At the same time, Apple is opening the system to third-party AI via the new "Extensions" framework. Users who have installed the Claude or Gemini app can set it as their preferred AI source system-wide – not only in Siri, but also in the writing tools, Image Playground, and summaries.
This results in an unusual dual structure: Apple ships Gemini as the default engine but gives users genuine freedom of choice. The arrangement represents remarkable openness for Apple and at the same time reduces risk – Apple is not dependent on a single AI provider.
What the search gesture means for iOS
The new search-or-ask gesture changes more than a single feature – it shifts one of the operating system's central entry points. Previously, Spotlight was the quickest way for many users to reach apps, contacts, and web searches. iOS 27 moves this entry point up into the Dynamic Island and replaces the simple full-text search with an intelligent blend of Spotlight, Siri, and an AI chatbot.
For Apple, this is a risky move in two respects. First, the new gesture has to compete with an established pattern: Swiping from the top of the screen currently opens either the Control Center or the Notification Center, depending on the area. A third gesture in the same location inevitably requires a learning curve. Second, the question of search engine economics arises: If users across iOS have to choose between Siri, Gemini, and ChatGPT as their search source, the relationship with Apple's existing multi-billion-dollar Google Search deal could change in the medium term.
The bigger picture: Siri becomes a standalone product
With its own app, its own animated presentation in the Dynamic Island, and its own system gesture, Apple is taking Siri out of its role as a background assistant and making it an independent product within the iOS stack. This is precisely the positioning needed to be taken seriously alongside ChatGPT, Claude, and Gemini. Anyone arriving from competing services with hundreds of millions of active users expects an app in which conversations persist and can be recalled – not just a voice command that disappears after 20 seconds.
At the same time, Apple is integrating Siri so deeply into the system interface that the barrier to using AI drops. The search-or-ask gesture is precisely that: an attempt to establish AI not as an app but as a ubiquitous function. Together, the two approaches form a coherent picture – in 2026, Apple will finally deliver the answer it has been working on since WWDC 2024.
WWDC will deliver the first live impression in four weeks
The keynote on June 8th will reveal how much of this vision Apple can already present in a way that goes beyond slides. The previous delays to personalized Siri are still fresh in observers' minds – and the AI architecture behind it is undergoing no less sweeping a transformation. What Apple shows in June is likely to become the benchmark for its AI narrative – the standard by which buyers, developers, and analysts will measure the company over the next year. (Image: Shutterstock / miss.cabul)