Aura - Touchless Spatial Interface

What if digital interaction felt as intuitive and natural as physical interaction?

Movement is our body’s natural language, and the way we interact with the physical world feels seamless. The digital side of the equation still feels rough in comparison, like a language we can barely speak, one that limits how we express ourselves.

Tapping on glass, swiping up and down, one finger or two, moving a mouse around to click on a square. Each of these input languages unlocked a better way to interact with digital tools and experiences, but something always gets lost or limited in translation.

Introducing Aura

Aura is a touchless spatial interface built with Computer Vision and AI to understand the natural language of your body movements.

Aura uses your device camera to passively sense your body’s ambient context, controlling and adapting experiences based on your gestures and movements, and on changes in your proximity, position, posture and activity.
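Aura's own sensing pipeline isn't described here, but the general idea above, inferring proximity and posture passively from an ordinary webcam, can be sketched with open-source tools. The example below is a minimal illustration in Python using OpenCV and MediaPipe Pose (our assumption for illustration only, not Aura's actual stack); the shoulder-width proximity heuristic and both thresholds are arbitrary placeholders.

import cv2
import mediapipe as mp

mp_pose = mp.solutions.pose

# Open the default device camera.
cap = cv2.VideoCapture(0)

with mp_pose.Pose(min_detection_confidence=0.5, min_tracking_confidence=0.5) as pose:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break

        # MediaPipe expects RGB input; OpenCV captures BGR.
        results = pose.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if not results.pose_landmarks:
            continue

        lm = results.pose_landmarks.landmark
        left = lm[mp_pose.PoseLandmark.LEFT_SHOULDER]
        right = lm[mp_pose.PoseLandmark.RIGHT_SHOULDER]
        nose = lm[mp_pose.PoseLandmark.NOSE]

        # Crude proximity proxy: apparent shoulder width in normalized
        # image coordinates grows as the user moves closer to the camera.
        # The 0.35 threshold is a placeholder, not a tuned value.
        shoulder_width = abs(left.x - right.x)
        proximity = "near" if shoulder_width > 0.35 else "far"

        # Crude posture proxy: how far the nose drifts from the shoulder
        # midline (in image coordinates) suggests a lean to one side.
        mid_x = (left.x + right.x) / 2.0
        posture = "leaning" if abs(nose.x - mid_x) > 0.05 else "upright"

        print(f"proximity={proximity} posture={posture}")

cap.release()

A real interface would go far beyond fixed thresholds like these, but the underlying signals, gestures, proximity, position and posture, all come from the same passive camera stream.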

Aura requires no new hardware, no calibration and no learning curve. It’s so deeply intuitive that you already know how to use it before you've even tried it.

Less Interface

Designed to work across spatial experiences on the devices you use every day, Aura is powered by photons rather than atoms and lives in the air around you, so your interactions become touchless and weightless.

It’s a first-principles rethink of how we connect with spatial technology, one that gives you many of the benefits of wearables without the physical limitations.

We're hiring brilliant creative and technical talent to work at the cutting edge of Computer Vision, AI and Spatial Computing. If you love to reason and build from first principles, let's chat. Based in Los Angeles.
