Apple is facing one of the most interesting chapters in its recent corporate history. While the tech world discusses artificial intelligence, Apple CEO Tim Cook is working quietly but very purposefully on a new hardware category that has the potential to lead the company into a completely new era: AI-powered wearables based on visual intelligence.
What at first glance appears to be speculation is reinforced by a combination of concrete rumors, internal signals, and Cook's own public statements, forming a strategically coherent picture.
When Tim Cook promotes something publicly, a product usually follows
Anyone who has followed Apple's communications over the years is familiar with the pattern. Tim Cook frequently and publicly praises a particular technology or feature, and shortly afterward, a product appears that puts precisely that technology front and center. This is no coincidence, but rather part of a deliberate strategy with which Apple prepares markets and shapes expectations.
Currently, Cook's marketing efforts are heavily focused on Visual Intelligence, the image-based branch of Apple Intelligence. Bloomberg journalist Mark Gurman picked up on this pattern in his weekly newsletter "Power On" and analyzed it carefully. His conclusion: Apple's CEO wouldn't be investing so much energy in marketing a feature that is essentially still a shell for external AI services if Apple didn't intend to significantly expand its presence in this area.
The pattern behind Cook's announcements
Tim Cook's method of announcing products through upfront technology PR has proven successful twice in the past. In 2013, Cook spoke remarkably often and at length about the growing importance of sensors in mobile devices, while Apple was already working internally on a wearable health device. The result was the Apple Watch, which was introduced in 2014 and subsequently expanded with more and more health sensors.
The pattern repeated itself before the launch of the Apple Vision Pro. Long before the headset was presented to the public, Cook regularly spoke about the advantages of augmented reality and virtual reality, the societal benefits of spatial computing, and Apple's unique position in this field. By the time the Vision Pro finally appeared, the groundwork had long been laid in terms of communication.
Gurman sees Cook's recent statements on Visual Intelligence as the next chapter in this pattern. In his assessment, the frequency, the platforms, and the tone with which Cook speaks about visual AI match the approach the CEO took ahead of the Apple Watch and the Vision Pro.
Visual Intelligence: What's behind it and why it's so strategically important
Visual Intelligence is currently the part of Apple Intelligence that deals with image-based and photo-based queries. The feature allows users to analyze photos or camera images and extract information, answer questions, or trigger actions. In its current form, however, Visual Intelligence still relies heavily on external services: most queries are forwarded to OpenAI's ChatGPT or Google's AI models. Apple essentially acts as an intermediary here.
That's precisely what's about to change. Apple is working intensively on its own algorithms and systems for visual recognition, on several fronts simultaneously. Through the Apple Car project, the company built up in-depth expertise in computer vision—that is, the machine perception and interpretation of camera images in real time. Work on the Apple Vision Pro has given Apple additional know-how in the field of augmented reality. Furthermore, recent releases like Ferret-UI Lite, an Apple-developed model for analyzing app user interfaces, demonstrate that the company is actively developing and communicating its AI capabilities in the visual domain.
Cook's public praise for Visual Intelligence is therefore seen as a sign that Apple intends to replace these external AI dependencies with its own in-house developed models in the medium term. If this step succeeds, Visual Intelligence will become a standalone, powerful platform on which a new generation of hardware can be built.
Cook's own statements as concrete evidence
Gurman's analysis is based on specific public statements by Cook. During the earnings call for the holiday quarter, Cook explicitly named Visual Intelligence as one of the most popular Apple Intelligence features. This is noteworthy because, at that time, Visual Intelligence did not yet function independently but relied on OpenAI and Google.
In an internal employee meeting, Cook emphasized Apple's massive installed base. With 2.5 billion actively used Apple devices worldwide, the company possesses a "tremendous advantage" in the AI race. In this context, Cook again mentioned Visual Intelligence as a central element of Apple's AI strategy. The fact that Cook highlighted a feature that is still incomplete and heavily reliant on third-party providers in two different, publicly impactful contexts supports Gurman's theory that Apple has significantly bigger plans in this area.
AirPods with cameras: Probably the first step
On the hardware side, AirPods with built-in cameras are considered the first concrete product in the new AI wearables category. Current rumors point to a launch at the end of 2026. The cameras are not intended as photography tools; they are expected to be low-resolution or infrared sensors whose sole purpose is to give Apple Intelligence a visual channel to the outside world.
The concept is similar to the smart glasses that Meta launched in collaboration with Ray-Ban: a camera on the body, AI in the background, and information delivered directly to the ear. The difference from glasses is that the AirPods don't require a visible frame or a display. The AI would communicate exclusively via audio, i.e., through speech and sounds. What seems like a simple solution at first glance is anything but trivial from a technical standpoint.
Apple Glass: First camera and audio, then full AR
Alongside the AirPods with camera, Apple Glass is expected, also with a target launch window around the end of 2026. The first version is said to be strongly based on the concept of the Meta Ray-Ban glasses: a normal eyeglass frame, equipped with cameras and audio functions, without a display and without full augmented reality projection.
The full AR experience, meaning the overlay of digital content onto the user's field of vision, is intended to be reserved for a later version of Apple Glass. This is an important distinction: the first step is a barely noticeable wearable with passive environmental awareness. Only in a subsequent generation would the device become a fully-fledged AR headset. This strategy is typical for Apple: a manageable first product that opens up the market before the full technological vision is unveiled.
The AI pendant: From pin to wearable companion
The third and most recent rumor in this category concerns a small, wearable device that initially appeared in reports as an AI pin or badge. More recent information describes it as a pendant that can be flexibly attached to clothing. This device is also said to be equipped with a camera to continuously provide Apple Intelligence with visual data.
Together with AirPods with camera and Apple Glass, this creates a coherent third product category alongside iPhone, iPad, and Mac. Apple would thus be introducing not just a single gadget, but an entire family of devices, all built on the same visual AI infrastructure.
Possible application scenarios: From practical to visionary
Gurman describes a range of possible applications for these devices, from everyday to far-reaching. At a basic level, such a wearable could identify the ingredients of a dish on the dining table and provide information about them. This sounds simple, but it illustrates how low the barrier to entry for these devices could be.
Things get much more interesting in the area of navigation. The cameras would allow the AI to analyze the surroundings in real time and give location-based instructions keyed to actually visible landmarks. Instead of "Turn right in 50 meters," an instruction like "Turn right at the red mailbox" would be possible. This is a fundamentally more intuitive form of navigation, based on human perception rather than abstract metric data.
Another concept is proactive reminders. The device could actively notify the user when it detects certain interesting or relevant objects in the environment. These could be signs, familiar faces, or products previously searched for. The transition from a passive information source to an active assistant is fluid.
The technical challenges: miniaturization, bandwidth, and Siri
As compelling as the vision sounds, the path to achieving it is fraught with significant technical hurdles. The biggest of these is miniaturization. Cameras require considerably more space and, above all, significantly more bandwidth than the sensors Apple has previously used in wearables. Heart rate and motion sensors in the Apple Watch are comparatively energy-efficient. A camera that continuously streams images to an iPhone for real-time processing, by contrast, places entirely different demands on data transmission, battery life, and thermal management.
This presents a particular challenge for AirPods: the casing is extremely small, its proximity to the ear canal strictly limits how much heat it may generate, and at the same time it has to house the camera, electronics, antenna, and battery. The problem is similar with Apple Glass: Apple wants the frame to remain as slim and light as regular glasses. This is one of the main reasons why the development of the smart glasses has been taking several years.
Beyond the hardware challenges, there's a structural problem that could hamper the entire AI wearables strategy: the sluggish development of the new Siri generation. Siri is intended to become the central control element of Apple Intelligence, with context-aware capabilities far exceeding the current version. However, the launch of this new Siri has been repeatedly delayed. Without a powerful, context-aware Siri, the planned wearables cannot reach their full potential. The Siri delays will therefore inevitably impact the timeline for the new hardware products.
Apple is building its next platform, but at its own pace
All in all, a coherent, albeit still fragmented, picture emerges. Apple is working on several fronts simultaneously: on its own visual AI, on miniaturized camera wearables, and on an improved generation of Siri that will bring all these components together. Tim Cook's public statements about Visual Intelligence are no coincidence, but rather follow an established communication pattern that has reliably hinted at upcoming product categories in the past.
AirPods with cameras, Apple Glass in its first camera-based version, and an AI pendant are the visible endpoints of this strategy. The invisible foundation is the gradual replacement of external AI services with Apple's own visual models. Only when this foundation is in place will Apple be able to combine its wearable hardware with the full potential of its own AI.
There won't be a quick breakthrough. The technical challenges of miniaturization and bandwidth are real, Siri is lagging behind, and Apple won't release a product that doesn't meet its own high standards. However, anyone familiar with Apple's pace in previous categories knows that the company is often a latecomer, yet frequently delivers a product that redefines the market. Whether it will succeed again with AI wearables remains to be seen. The signs certainly suggest that Apple is giving it a serious try.