Apple is apparently working on a new camera technology that could significantly change iPhone photography in the long run. A recent rumor from China suggests that the company is exploring multispectral imaging for future iPhone models. The goal is to improve the camera's visual intelligence, recognize materials more precisely, and accelerate overall image processing. While the project is still in its early stages, the potential impact is remarkable.
The camera has been one of the iPhone's most important selling points for years. Apple continuously invests in new sensors, software algorithms, and machine learning that runs directly on the device. The multispectral imaging now under discussion would represent a further step that goes beyond classic improvements in resolution or light sensitivity. Instead, it's about a deeper understanding of what the camera actually "sees."
Information from the supply chain
The current discussion was triggered by a post from the well-known leaker Digital Chat Station on the Chinese platform Weibo. According to the post, Apple is currently testing components for multispectral imaging within its supply chain. At the same time, it is emphasized that official tests have not yet begun. This suggests that the technology is still in an evaluation and concept phase and is not about to be used in production devices.
What multispectral imaging means
Unlike traditional smartphone photography, which is limited to red, green, and blue light, multispectral imaging captures data from multiple wavelength ranges. This can include narrow spectra or the near-infrared range. In this way, information can be captured that remains largely invisible to conventional camera sensors.
Advantages in material and object recognition
One potential benefit of this technology lies in the significantly improved differentiation of materials and surfaces. Different materials reflect light differently depending on the wavelength. An iPhone camera with multispectral data could distinguish skin, fabrics, vegetation, or highly reflective surfaces more precisely. This would not only improve object recognition but also enable cleaner portrait effects and more accurate subject isolation.
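To make the material-recognition idea concrete, here is a minimal, purely illustrative Python sketch (not anything Apple has confirmed). It uses the well-established NDVI index from remote sensing: healthy vegetation reflects strongly in the near-infrared but absorbs red light, so comparing those two bands separates plants from surfaces like asphalt. The reflectance values are hypothetical.

```python
# Illustrative sketch: how per-band reflectance can separate materials.
# NDVI = (NIR - Red) / (NIR + Red) is a standard remote-sensing index;
# the sample reflectance values below are invented for demonstration.

def ndvi(red: float, nir: float) -> float:
    """Normalized Difference Vegetation Index from two band reflectances (0..1)."""
    if red + nir == 0:
        return 0.0
    return (nir - red) / (nir + red)

# Hypothetical readings for two pixels:
leaf = ndvi(red=0.05, nir=0.60)      # vegetation: low red, high NIR
asphalt = ndvi(red=0.20, nir=0.25)   # asphalt: similar in both bands

print(f"leaf NDVI:    {leaf:.2f}")     # high value -> likely vegetation
print(f"asphalt NDVI: {asphalt:.2f}")  # near zero -> likely not vegetation
```

An RGB-only sensor cannot compute such an index because it never captures the near-infrared band; that is precisely the kind of information a multispectral sensor would add.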
Improved image processing and AI
General image processing could also benefit from multispectral data, especially in scenarios with mixed or challenging lighting. Additionally, this data could enhance Apple's on-device visual intelligence and machine learning. This could lead to improvements in scene recognition, understanding complex image content, and depth estimation, without relying solely on software tricks.
Technical hurdles and costs
However, integrating additional spectral sensitivity also presents challenges. More complex sensor designs require more space and drive up costs. In an already densely packed iPhone interior, this is a critical factor. This could explain why Apple is currently only testing the technology instead of actively incorporating it into prototypes. Therefore, its immediate deployment in future iPhone generations is considered unlikely.
More camera rumors about the iPhone 18 Pro
In the same Weibo post, Digital Chat Station provided further insights into Apple's camera plans. According to the post, the iPhone 18 Pro models will feature a main lens with a variable aperture. The telephoto camera will also reportedly have a larger aperture, which would be particularly advantageous in low light. At the same time, the post emphasizes that Apple is not currently developing any prototypes for 200-megapixel cameras. The focus, therefore, seems to be more on smart sensors and optics than on sheer megapixel count.
Apple is taking a new approach to the iPhone camera
Tests of multispectral imaging show that Apple continues to work on fundamental innovations for the iPhone camera. Instead of simply increasing technical specifications, the focus is on a better understanding of light, materials, and scenes. Even if the use of this technology is still a long way off, it suggests the direction in which iPhone photography could develop: away from pure hardware power and toward more intelligent perception. For Apple, this would be a logical next step to differentiate itself from the competition in the long term.