

Humans rely on many sensations to confirm their interactions with the physical world. Hitting a nail with a hammer produces visual, audible and tactile responses. Food is experienced through sight, smell, taste, sound and touch. The dominant sense varies according to task, but rarely is anything experienced through a single dimension of sensation; even when hearing is enraptured by the most moving concert, the experience is subtly augmented by the smell of the venue, the feel of the air temperature and the sight of the lighting conditions.

However, with the advent of touchscreens, digital interactions are increasingly dominated by a single sensory dimension: sight. The smooth glass beneath the fingertip never varies its tactile response. Most actions are achieved in eerie silence. It is a remarkable change for a species which, hitherto, lived in a world where scale of achievement correlated with strength of sensory response. The larger the rock lifted, the greater the tactile sensation and the louder the noise when it was thrown.

Today’s mono-dimensional digital experiences ignore humans’ innate, multi-dimensional scale for measuring influence on the physical environment.

The MEX community has for some time been exploring this issue through Pathway #9, entitled ‘Expand mobile interactions with the neglected dimensions of sound and tactility’. Officially initiated in February 2011, Pathway #9 builds on objectives which have been a consistent theme at MEX for many years. The exploration continues at the next MEX in London on 19th – 20th September 2012.

Pathway #9 proposes all digital interactions should be considered multi-dimensional experiences and that design should not default to the visual medium. The goal is not to force the use of additional senses or tack unnecessary beeps and buzzes onto visual interaction sequences, but rather to encourage designers to embrace a wider sensory toolkit for achieving great user experience.

Examples are emerging to provide inspiration:

One of the challenges uncovered during our ongoing exploration of this Pathway is the dominance of visual skills in design education and day-to-day process. It is difficult to integrate additional sensory dimensions into established, visually dominated prototyping routines. Indeed, at the most recent MEX in December 2011, a team led by Alyson Webb, Lindsey Green, Peter Law and Sam Billington was tasked with proposing new ways to prototype experiences across all the senses. The facilitators quickly recognised that one of their first challenges would be to use a series of tactility- and sound-based games, including blindfolding participants, to help visually led designers break out of their usual modus operandi.

At an earlier MEX event we again deprived participants of their visual sense by asking them to listen, blindfolded, to some original compositions specially created for MEX by audio designer Peter ‘PDX’ Drescher. They then had to connect what they’d heard back to the visual dimension by drawing what they saw in their mind’s eye and writing down how it made them feel. The responses were extraordinary both in their variety and in the depth of emotion the exercise inspired.

This can be partly explained by an insight from the May 2011 MEX session by Ed Maklouf, who described how the brain prioritises its response to sound because of humans’ evolutionary reliance on this sense to wake us if we hear danger while sleeping. It is the one sense we never truly switch off. The strength of the link between hearing and brain presents both an opportunity to create truly inspiring digital experiences augmented by sound and a risk that a poor soundscape can irritate or even offend users.

The exploration of Pathway #9 continues at the 19th – 20th September MEX in London, focusing on six initial provocations:

  1. Have today’s smooth, glass touchscreens caused designers to forget the senses of sound and tactility when developing user interaction sequences?
  2. How can all elements of the user experience – visual, tactile and audible – adapt in response to changes in the ambient environment? How should the priority afforded different senses change in response to context?
  3. How do cultural attitudes to sound and tactility vary? How can these variances be reflected in mobile interactions?
  4. If the presence of sound and tactile elements within user experiences increases, will interactions which lack them seem incomplete?
  5. Which tools can be used to include sound and tactility within predominantly visual prototyping?
  6. How can sound and tactility support free air gestural interfaces?

In the meantime, I’d love to hear from anyone working on user experience techniques which expand beyond the visual dimension.

About the author
Marek Pawlowski is the founder of MEX. Since 1995, he has focused the MEX business on helping digital industries to develop better, more profitable products through improved understanding of user behaviour with mobile devices and wireless networks. MEX is best known for its events, research and consulting, which balance commercial, technical and user insights to define the future of mobile user experience.

Posted on 2 August 2012