With visionOS 26, the Apple Vision Pro expands its collection of immersive environments with a new planetary view. This further enhances the spatial experience for which the headset was designed. The goal is to transport users from their physical surroundings into realistic, fully designed virtual spaces – be it the shores of Mount Hood or the surface of the moon looking back at Earth.
Now, two key figures in the project have provided insight into the development process. In an interview with Cool Hunting magazine, Yuri Imoto from visionOS product marketing and Matt Dessero, Human Interface Designer at Apple, discuss how these environments are conceived and implemented.
The Apple Vision Pro embodies spatial computing, in which digital content is no longer confined to flat displays. visionOS 26 develops this concept further: the new environment is based on Jupiter's moon Amalthea and exemplifies how design, science, and technology are integrated at Apple.
The creation process of this environment demonstrates that it's not just about graphic quality. Atmosphere, scale, materiality, and spatial perception play a crucial role. Every environment must feel authentic and evoke a specific emotional response.
Nature as a design basis
According to Matt Dessero, nature forms the basis of all environments in visionOS. The starting point is the question of what kind of atmosphere should be created. Specifically, this concerns emotions: Should the environment convey tranquility? Promote concentration? Or evoke a feeling of wonder?
These considerations guide all design decisions. Lighting conditions, color scheme, perspective, and composition are specifically tailored to them. In this context, the Apple Vision Pro is not merely a technical device, but a medium that conveys moods.
Why Amalthea was chosen
For visionOS 26, the team chose Amalthea, a moon of Jupiter. The celestial body was a deliberate choice: unlike with real landscapes on Earth, Apple could not rely on LiDAR scans or on-site photography here.
To depict the surface as realistically as possible, the team collaborated with NASA's Jet Propulsion Laboratory (JPL). There, Apple received scientific assessments of the moon's presumed composition.
According to JPL, Amalthea is likely composed of rocks held together by Jupiter's strong gravity, with a high ice content believed to act as the binding element. This information formed the basis for the visual design: the team attempted to translate these scientific assumptions into the immersive environment as accurately as possible.
Development directly in the headset
A key aspect of the project was the working method. The environment wasn't developed on conventional monitors. Instead, Matt Dessero wore the Apple Vision Pro himself during the design process and guided artists directly within the spatial context.
This approach was crucial for the layout. Only through a headset can one judge how large objects actually appear and how they need to be arranged in space. The precise placement of the rocks in the foreground was particularly important. Scale, depth, and spatial effect cannot be reliably assessed in a two-dimensional view.
The composition was thus created in real time in virtual space. This demonstrates how significantly spatial computing design differs from classic interface design.
Importance for VFX, photography, and UI design
The interview with Cool Hunting clearly demonstrates the complexity of the development process behind a single environment in visionOS 26. It involves scientific research, emotional impact, technical implementation, and spatial composition.
For those interested in VFX, photography, and UI design, this report provides concrete insights into a new way of designing. The Apple Vision Pro is presented not just as hardware, but as a platform that enables new creative workflows.
Apple Vision Pro and the demand for true immersion
With visionOS 26, the Apple Vision Pro expands its immersive offerings to include a scientifically grounded planetary view. The choice of Amalthea, the collaboration with NASA's Jet Propulsion Laboratory, and the development directly within the headset demonstrate Apple's meticulous and detailed approach.
The report makes it clear that spatial computing places different demands on software development than traditional methods. Immersive environments are not created on a screen, but in the physical space itself. This is precisely the crucial difference. (Image: Apple)