Imagine the spatial immersion and physical presence of an AR/VR headset, but on the existing screens and devices we already own. Viewmorphism unlocks this new magic window for everyone.
Today’s wearables are both a bridge and a barrier to spatial computing. They can unlock incredible spatial depth and immersion (the feeling of “presence”), and can make digital experiences feel like an extension and expansion of physical experience. They are also too heavy, isolating, and impractical for most daily life.
Digital presence is primarily driven by two things: 6DoF tracking, which gives virtual objects persistent placement, and stereo rendering, which creates depth through binocular disparity (your eyes see slightly different angles, and your brain reconstructs a 3D scene).
When you step outside of a headset and return to a standard (monoscopic) display, the depth cues collapse into a flat projection, and the rich perceptual feedback that helps drive digital immersion disappears.
Your physical view is dimensional, and with a viewmorphic interface, so is your digital experience. Using computer vision and AI, we're able to create interfaces that adapt live, in the moment, to your physical presence.
In the example above, we see a 3D Gaussian splat that updates dynamically based on the physical presence of the viewer. As you move and your viewpoint changes, our viewmorphic interface subtly morphs what you see in response. This is achieved using a device's standard front-facing camera to track the viewer's head and body position in real time.
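As a rough illustration of the loop involved, here's a minimal TypeScript sketch: the front camera feeds a head tracker, the estimate is smoothed, and the scene is re-rendered from the viewer's eye point each frame. `estimateHeadPosition` and `renderSplatFrom` are hypothetical stand-ins for a face-tracking model and a splat renderer, not real library calls or our production code.

```typescript
// Minimal sketch: front-camera head tracking drives the render loop.
// `estimateHeadPosition` and `renderSplatFrom` are hypothetical stand-ins
// for a face/head-tracking model and a Gaussian-splat renderer.

interface HeadPose {
  x: number; // horizontal offset from screen center, meters
  y: number; // vertical offset from screen center, meters
  z: number; // distance from the screen, meters
}

declare function estimateHeadPosition(video: HTMLVideoElement): HeadPose;
declare function renderSplatFrom(eye: HeadPose): void;

// Exponential smoothing so per-frame tracking noise doesn't jitter the view.
function smooth(prev: HeadPose, next: HeadPose, alpha = 0.2): HeadPose {
  return {
    x: prev.x + alpha * (next.x - prev.x),
    y: prev.y + alpha * (next.y - prev.y),
    z: prev.z + alpha * (next.z - prev.z),
  };
}

async function start(): Promise<void> {
  const video = document.createElement("video");
  video.srcObject = await navigator.mediaDevices.getUserMedia({ video: true });
  await video.play();

  let pose: HeadPose = { x: 0, y: 0, z: 0.6 }; // assume ~60 cm starting distance
  const tick = () => {
    pose = smooth(pose, estimateHeadPosition(video));
    renderSplatFrom(pose); // re-render the splat from the viewer's eye point
    requestAnimationFrame(tick);
  };
  tick();
}
```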
Layer and perspective shifts provide monocular depth cues like motion parallax and perspective deformation, which in turn support 3D perception in the absence of stereoscopy. Navigating digital space feels like an extension of your physical environment, with a similar sense of dimensionality.
Just as responsive design successfully adapted content across various devices and screen sizes, spatially aware systems help content and experiences intelligently adapt and respond to your physical presence and space.
Here are some principles we're playing with to shape natural, alive, and dimensional experiences:
A Viewmorphic Interface dynamically updates its visual field to perfectly synchronize with your physical point of view—like looking through a window. This alignment removes the mental effort of interpreting a flat UI, so the interface stops feeling like a picture you manipulate and starts feeling like a space you inhabit.
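One well-known way to get this window-like alignment (a sketch of the general technique, not necessarily our exact implementation) is an off-axis, asymmetric-frustum projection: the screen is treated as a fixed window in space, and the viewing frustum is skewed toward the tracked eye so the image stays locked to the physical window. This assumes eye coordinates in meters relative to the screen center, with +z toward the viewer:

```typescript
// Sketch: an off-axis ("window") projection. The screen is a fixed window
// in space; the frustum is skewed so the image stays aligned with the
// viewer's eye. Screen dimensions and eye coordinates are assumptions.

function offAxisProjection(
  eye: { x: number; y: number; z: number }, // meters from screen center
  screenW: number, // physical screen width, meters
  screenH: number, // physical screen height, meters
  near = 0.01,
  far = 100
): number[] {
  // Project the window edges onto the near plane as seen from the eye.
  const s = near / eye.z;
  const left = (-screenW / 2 - eye.x) * s;
  const right = (screenW / 2 - eye.x) * s;
  const bottom = (-screenH / 2 - eye.y) * s;
  const top = (screenH / 2 - eye.y) * s;

  // Standard OpenGL-style asymmetric frustum matrix, column-major.
  return [
    (2 * near) / (right - left), 0, 0, 0,
    0, (2 * near) / (top - bottom), 0, 0,
    (right + left) / (right - left),
    (top + bottom) / (top - bottom),
    -(far + near) / (far - near), -1,
    0, 0, (-2 * far * near) / (far - near), 0,
  ];
}
```

Feeding the tracked head pose into this matrix every frame is what makes the display behave like a window rather than a picture.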
Viewmorphic Interfaces translate your physical presence into system commands: your proximity, position, posture, perspective, and motion are sensed in real time to dynamically control what you see and experience, creating a sense of digital presence.
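Even a single sensed quantity can act as input. For instance, viewer distance alone can drive how much the interface shows; the thresholds and states below are illustrative assumptions, not measured values:

```typescript
// Sketch: sensed distance as a system command. Thresholds are illustrative.
type DetailLevel = "ambient" | "glanceable" | "focused";

function detailForDistance(zMeters: number): DetailLevel {
  if (zMeters > 1.5) return "ambient";    // across the room: big, sparse UI
  if (zMeters > 0.6) return "glanceable"; // roughly arm's length: headlines
  return "focused";                       // up close: full detail
}
```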
Viewmorphic Interfaces use layers to create occlusion and movement to produce motion parallax, which establish natural depth cues that turn a flat view into a believable extension and expansion of your physical experience.
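A sketch of the parallax half, assuming head position in meters relative to the screen center and an assumed pixels-per-meter scale: a point `d` meters behind a window tracks a head displacement by the factor d / (viewerDistance + d), which is zero at the screen plane and approaches 1 at infinite depth.

```typescript
// Sketch: layered motion parallax for layers "behind" the screen plane.
// Occlusion comes free from layer stacking order; parallax comes from
// shifting each layer by a depth-dependent fraction of the head's motion.

interface Layer {
  el: HTMLElement;
  depth: number; // meters behind the screen plane (0 = at the screen)
}

const PX_PER_METER = 4000; // assumed display density

function applyParallax(
  layers: Layer[],
  head: { x: number; y: number; z: number } // meters from screen center
): void {
  for (const layer of layers) {
    // A point `depth` behind the window shifts with the head by
    // depth / (head.z + depth) of the head's own displacement.
    const k = layer.depth / (head.z + layer.depth);
    // CSS y grows downward, so the vertical shift is negated.
    layer.el.style.transform =
      `translate(${head.x * k * PX_PER_METER}px, ${-head.y * k * PX_PER_METER}px)`;
  }
}
```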
A viewmorphic approach isn’t just aesthetic. It’s about giving you the benefits of stereo displays without having to wear them, and better aligning digital experience with how we already think, perceive, and move. Viewmorphism offloads cognitive effort, supports embodied interaction, and restores some of the spatial legibility that we subconsciously expect from a physical environment.
We see this as a step toward more natural technology and computing, where the system and screen don’t just show you things; they understand you and adapt to you.
+ See our Aura Touchless Spatial Interface for more thinking on touchless control.
+ Shoutout to Johnny Lee’s early Wii Remote experiments that showed how everyday hardware can be transformed into spatially responsive input/output systems.