Soli is a miniaturized radar that understands human gestural communication, from the significant to the subtle. For Soli’s debut on the Pixel 4, Google ATAP needed to clearly communicate touchless interactions to an audience accustomed to touch-centered interfaces.
Our response was to create a transient visual system that builds on familiar patterns to form a solid foundation for communicating touchless interactions.
Gestural communication comes so naturally to many of us that it is almost invisible. Movement detected by radar is similarly invisible and intangible; you can neither feel nor see it, but it is there.
The Soli visual system references these spatial and temporal qualities of radar and human communication to reinforce the touchless interaction paradigm and improve comprehension.
In natural human communication, a change in position, posture, or gesticulation can signify vastly different meanings. In some social contexts, certain gestures become inappropriate. Both of these are also true for Soli.
To communicate how and when a gesture can be performed, the visual elements reference natural human interaction. When touchless interactions are available, Soli’s movement resembles a person entering a room. When a left swipe gesture is performed, it responds by mimicking the gesture’s direction. If a touch event is detected, Soli moves out of the way to give focus to touch interactions.
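The event-to-response pairings above can be pictured as a simple mapping from detected interaction events to visual behaviors. The sketch below is purely illustrative; the event and response names are hypothetical and do not reflect the actual Soli or Motion Sense APIs.

```python
# Hypothetical sketch of the mapping described above.
# Event and response names are illustrative, not real Soli API identifiers.
RESPONSES = {
    "presence": "enter",        # like a person entering a room
    "swipe_left": "move_left",  # mimic the gesture's direction
    "touch": "recede",          # step aside to give focus to touch
}

def soli_response(event: str) -> str:
    """Return the visual response for a detected interaction event."""
    return RESPONSES.get(event, "idle")
```

A lookup like this keeps the interaction vocabulary small and legible, which mirrors the design goal: each gesture maps to one unambiguous visual reply.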
To help users understand this novel interaction, we built a range of touchpoints, including games, tutorials, videos, illustrations, and an open-source sandbox.
These gave users confidence, an understanding of how different actions trigger Soli, and a sense of the technology’s potential beyond the Pixel 4 use cases.