The in-car environment is a challenging place for mobile user experience. User attention is divided between the road, the driving controls, the dashboard and the console. Any attention paid to a mobile device is necessarily partial and, in many countries, is now restricted by law, particularly around hands-free usage.
Waze, a navigation application which relies on user input to build an accurate picture of road conditions, is using gestural activation and voice recognition to overcome this challenge. The application utilises the mobile device’s proximity sensor to detect a hand being waved in front of the screen and then activates voice input.
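To illustrate the pattern (this is a hedged sketch, not Waze's actual code), the gesture can be reduced to a simple rule over proximity sensor readings: most phone proximity sensors report an approximate distance, and a "wave" is a near reading followed quickly by a far reading. The threshold and time window below are assumptions chosen for illustration:

```python
NEAR_THRESHOLD_CM = 3.0   # assumption: readings below this count as "near"
MAX_WAVE_SECONDS = 1.0    # assumption: near-then-far must complete within this window

def detect_wave(readings):
    """readings: list of (timestamp_seconds, distance_cm) samples.

    Returns True if a hand covers the sensor and then clears it within
    the time window -- the cue the app would use to activate voice input.
    """
    near_at = None
    for t, distance in readings:
        if distance < NEAR_THRESHOLD_CM:
            near_at = t                        # hand is covering the sensor
        elif near_at is not None:
            if t - near_at <= MAX_WAVE_SECONDS:
                return True                    # hand moved away quickly: a wave
            near_at = None                     # too slow; treat as a new attempt

    return False

# Example: hand passes over the sensor at t=0.2s and clears by t=0.5s
samples = [(0.0, 8.0), (0.2, 1.0), (0.5, 8.0)]
print(detect_wave(samples))  # True
```

On a real device this function would be fed by the platform's sensor API (for instance, a proximity sensor listener on Android), with the `True` result triggering the voice recogniser.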
It is a simple workaround which uses existing technology to improve the user experience in a multi-touchpoint environment. One of the findings of MEX Pathway #2, our ongoing exploration of multi-touchpoint design, is that, far from being restrictive, partial attention environments give practitioners an opportunity to create experiences greater than the sum of their parts. To achieve this, designers must think creatively about all of the input and output capabilities available within their users’ digital spheres, combining them in a way which supports each customer’s particular context.
Here’s a video demonstration of Waze’s approach: