Sensation


Humans rely on many sensations to confirm their interactions with the physical world. Hitting a nail with a hammer produces visual, audible and tactile responses. Food is experienced through sight, smell, taste, sound and touch. The dominant sense varies according to task, but rarely is anything experienced through a single dimension of sensation; even when hearing is enraptured by the most moving concert, the experience is subtly augmented by the smell of the venue, the feel of the air and the lighting conditions.

However, with the advent of touchscreens, digital interactions are increasingly dominated by a single sensory dimension: sight. The smooth glass beneath the fingertip never varies its tactile response. Most actions are achieved in eerie silence. It is a remarkable change for a species which, hitherto, lived in a world where the scale of achievement correlated with the strength of sensory response. The larger the rock lifted, the greater the tactile sensation and the louder the noise when it was thrown.

Today’s mono-dimensional digital experiences ignore humans’ innate, multi-dimensional scale for measuring influence on the physical environment.

The MEX community has for some time been exploring this issue through Pathway #9, entitled ‘Expand mobile interactions with the neglected dimensions of sound and tactility’. Although the Pathway was officially initiated in February 2011, its objectives have been a consistent theme at MEX for many years. The exploration continues at the next MEX in London on 19th – 20th September 2012.

Pathway #9 proposes all digital interactions should be considered multi-dimensional experiences and that design should not default to the visual medium. The goal is not to force the use of additional senses or tack unnecessary beeps and buzzes onto visual interaction sequences, but rather to encourage designers to embrace a wider sensory toolkit for achieving great user experience.

Examples are emerging to provide inspiration:

  • The Lake is an iPhone application which uses sound to stimulate the user’s imagination, inviting them to generate their own story in response. It relies on sight merely to focus the eyes on a single point (in this case a playing card displayed on the screen), preventing visual attention from wandering elsewhere and ensuring the ears remain the primary channel of interaction.
  • Nokia’s Maps application, integrated into all of its Symbian OS handsets, can guide users through a combination of beeps and vibrations, allowing the phone to remain in the pocket when used for walking navigation.
  • Chirp, developed by MEX speaker Patrick Bergel, expands on the use of sound as a communication medium. This elegant application (currently on iOS, shortly to be released for Android) literally sings data between devices, generating sound prints inspired by bird song to create a micro-broadcast environment where web links, images and other data can be communicated. The experience of sending a Chirp is compelling in itself, but the real value is likely to come from enabling low cost appliances to communicate with the web. In theory, any device (think washing machines and light switches) could be equipped with a basic tweeter, costing a fraction of the equivalent wireless communications chip, and used to sing data to a listening smartphone or other digital touchpoint. A simplified sketch of the data-over-sound principle follows this list.
  • Clear, the iOS task manager lauded by designers for its buttonless interface, is another example where sound and tactility enhance the visual experience. The audible ‘bings’ and vibrations are as much a part of its appeal as its visual simplicity, providing sensory reference points which add depth to the user experience.
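
For anyone curious how singing data between devices might work in principle, here is a minimal sketch of the data-over-sound idea, written in Python. To be clear, this is not Chirp’s actual protocol: the frequencies, symbol length and nibble-per-tone alphabet below are invented assumptions, purely for illustration.

```python
# A toy data-over-sound encoder: map each 4-bit nibble of a payload to an
# audible tone and write the resulting "song" to a WAV file.
# NOTE: every constant here is hypothetical; Chirp's real encoding differs.
import math
import struct
import wave

SAMPLE_RATE = 44100      # samples per second
TONE_LENGTH = 0.08       # seconds per symbol
BASE_FREQ = 1000.0       # frequency representing nibble 0, in Hz
FREQ_STEP = 50.0         # spacing between adjacent nibble values, in Hz

def tone(freq, length=TONE_LENGTH):
    """Generate one sine-wave tone as a list of 16-bit samples."""
    n = int(SAMPLE_RATE * length)
    return [int(32767 * 0.5 * math.sin(2 * math.pi * freq * i / SAMPLE_RATE))
            for i in range(n)]

def encode(payload):
    """Turn a byte string into a tone sequence, one tone per 4-bit nibble."""
    samples = []
    for byte in payload:
        for nibble in (byte >> 4, byte & 0x0F):  # high nibble, then low
            samples.extend(tone(BASE_FREQ + nibble * FREQ_STEP))
    return samples

def write_wav(path, samples):
    """Write mono 16-bit samples to a WAV file."""
    with wave.open(path, "wb") as f:
        f.setnchannels(1)
        f.setsampwidth(2)
        f.setframerate(SAMPLE_RATE)
        f.writeframes(struct.pack("<%dh" % len(samples), *samples))

write_wav("chirp_sketch.wav", encode(b"http://example.com"))
```

A listening device would reverse the process, running a short-time Fourier transform over microphone input to pick out the tone sequence and reassemble the bytes.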

One of the challenges uncovered during our ongoing exploration of this Pathway is the dominance of visual skills in design education and day-to-day process. It is difficult to integrate additional sensory dimensions into established, visually-dominated prototyping routines. Indeed, at the most recent MEX in December 2011, a team led by Alyson Webb, Lindsey Green, Peter Law and Sam Billington was tasked with proposing new ways to prototype experiences across all the senses. The facilitators quickly recognised that their first challenge was helping visually led designers break out of their usual modus operandi, and used a series of tactility- and sound-based games, including blindfolding participants, to do so.

At an earlier MEX event we again deprived participants of their visual sense by asking them to listen, blindfolded, to some original compositions specially created for MEX by audio designer Peter ‘PDX’ Drescher. They then had to connect what they’d heard back to the visual dimension by drawing what they saw in their mind’s eye and writing down how it made them feel. The responses were extraordinary both in their variety and in the depth of emotion they revealed.

This can be partly explained by an insight from the May 2011 MEX session by Ed Maklouf, who described how the brain prioritises its response to sound because of our evolutionary reliance on this sense to wake us if we hear danger while sleeping. It is the one sense we never truly switch off. The strength of the link between hearing and brain presents both an opportunity to create truly inspiring digital experiences augmented by sound and a risk that a poor soundscape can irritate and even offend users.

The exploration of Pathway #9 continues at the 19th – 20th September MEX in London, focusing on six initial provocations:

  1. Have today’s smooth, glass touchscreens caused designers to forget the senses of sound and tactility when developing user interaction sequences?
  2. How can all elements of the user experience – visual, tactile and audible – adapt in response to changes in the ambient environment? How should the priority afforded different senses change in response to context? (One possible adaptation policy is sketched after this list.)
  3. How do cultural attitudes to sound and tactility vary? How can these variances be reflected in mobile interactions?
  4. If the presence of sound and tactile elements within user experiences increases, will interactions which lack these sensations seem incomplete?
  5. Which tools can be used to include sound and tactility within predominantly visual prototyping?
  6. How can sound and tactility support free air gestural interfaces?
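
As a thought-starter for the second provocation, the sketch below shows one possible policy for adapting feedback channels to context. The Context fields and the noise threshold are assumptions invented for illustration; on a real handset they would be fed by the microphone, proximity sensor and user preferences.

```python
# A hypothetical policy for choosing feedback channels based on context.
# All fields and thresholds are invented for illustration.
from dataclasses import dataclass

@dataclass
class Context:
    ambient_noise_db: float  # estimated from the microphone
    in_pocket: bool          # inferred from the proximity / light sensor
    screen_on: bool

def choose_channels(ctx):
    """Return the set of feedback channels suited to the current context."""
    channels = set()
    if ctx.in_pocket:
        channels.add("haptic")        # touch works when sight cannot
    elif ctx.screen_on:
        channels.add("visual")
    if ctx.ambient_noise_db < 60:     # quiet enough for sound to register
        channels.add("audio")
    else:
        channels.add("haptic")        # too noisy: fall back to vibration
    return channels

# Walking navigation with the phone pocketed on a loud street:
print(choose_channels(Context(ambient_noise_db=75, in_pocket=True, screen_on=False)))
# -> {'haptic'}
```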

In the meantime, I’d love to hear from anyone working on user experience techniques which expand beyond the visual dimension.

