2016

Project Soli: Designing the no-UI future

Are you ready to wave goodbye to physical controls?
First thing in the morning we switch off our alarm, swipe through our emails and press the buttons on a machine for a quick caffeine fix. And while these little habits may feel natural, imagine if, in our increasingly connected world, man and machine could interact without physical controls; if we could free ourselves from the limitations of touchscreens and buttons.

A team at Google’s Advanced Technology and Projects group (ATAP), led by Dr. Ivan Poupyrev, is working to make this future a reality. The team is testing its Soli sensing technology in different contexts to enable people to control devices without touching a single button, knob or screen. We at IXDS recently carried out a close, embedded collaboration with Google ATAP on Project Soli, putting the high-performance radar-based gesture sensor into action for connected home audio. Working with the technology in the JBL by Harman speaker, we set out to give the ATAP team a deep understanding of how users experience interacting with audio in mid-air.

Soli is a miniature radar that detects touchless gesture interactions. The purpose-built interaction sensor can track sub-millimeter motion with high speed and accuracy, allowing the creation of what Google describes as a “ubiquitous gesture interaction language that will allow people to control devices with a simple, universal set of gestures.”

It’s this kind of technological advancement that could enable our interaction with devices to become more fluid than ever before – imagine interacting with your speaker while cooking up a storm in the kitchen. There would be no need to wipe your hands or risk splashing your device with water. But for the possibilities to be truly realized, it’s important to put users in the foreground. It may initially be thrilling to magically control the volume, skip songs and change radio stations with a wave of your hand, but what changes when people can no longer feel or touch a device? And what will it take to get used to using gestures rather than touch?

One of the main factors to consider with this new form of interaction is how users receive feedback. With physical controls and an interface, the feedback is instant and clearly defined – you can feel a device turn off when you press a button, or see numbers count down on a microwave screen. If there’s nothing but air, how do you guide the user?

Exploring the possibilities of the human hand in the virtual world.

UI (User Interface) is our specialty, so to answer these questions our multidisciplinary team, combining skills in engineering, service design, UX design and industrial design, set out to explore the changing audio landscape and its trends. We then visited people in their homes to discuss how they currently use connected audio, their likes and dislikes, and future possibilities. Armed with the prototypes our team created to test a variety of interaction forms with potential users, we shot videos showing how the gesture controls worked and how users reacted to them, telling the story of their overall experience.

The insights derived from these tests allowed us, in close cooperation with ATAP and JBL by Harman, to recommend and develop various industrial design adaptations. And with I/O 2016, Google’s annual developer conference, fast approaching, our team continued to support ATAP and JBL by Harman in producing a functional prototype that was presented at the event.

This kind of revolutionary technology may be what finally gets us out from behind the screen, breaking down the interface to forever change the way we merge our digital and analog lives. Project Soli has opened our eyes to the possibilities of what’s not in front of us, and this is only the beginning.