“To this day, I still feel that surge of adrenalin when I hear the ringtone I used to set for hospital emergencies. It is like it’s been hardwired into my brain…” #mexuservoice
For the majority of people, sound is the sense most likely to wake you from sleep and alert you to danger. This was explored in a fascinating talk by Ed Maklouf at MEX/9 in 2011.
The reliance on sound as a signalling channel goes some way towards explaining why it is equally possible for music to move people to tears of joy or noise to be used as a form of torture.
We know sound drives powerful emotional connections, and yet it remains woefully under-addressed in most digital experience design projects. You can start to tap into its power with a simple step: consider, experiment with and plan to implement the non-visual elements of your experience design from the earliest stages of your next project – don’t leave them to become an afterthought at the end of the process.
This MEX principle represents an overarching recognition of the more detailed work conducted within the MEX initiative over the past 13 years, exploring all forms of non-visual interface design. This includes both audible and tactile interfaces, from haptics to the design of physical controls.
MEX predates the dominance of flat, smooth glass rectangles. While these have become most users’ primary digital touchpoint, we can draw on lessons learned from an earlier time, when smartphones were judged as much on the tactility and sound of their physical keys as on their displays.
Thanks to the talents of the MEX community, there is truly an embarrassment of riches to explore in the archives (filed under MEX Pathway #9), from the conceptual importance of non-visual interface elements, to the detail of why specific haptic and sound combinations work better than others.
Dig into the Pathway #9 archives, or use this reading list as your starting point:
- Experimenting with audio in digital experience design
- Episode 3 of the MEX Design Talk podcast, including an interview with Peter Law on multi-sensory design
- Paul Bennun & Nicky Birch’s MEX/12 talk on audio design in mobile UX
- Ed Maklouf’s MEX/9 talk on natural interfaces
- ‘Whispering to the future – a tale of navigating with Google Now’, Marek Pawlowski’s 2014 essay on voice UIs
- Jim Kosem’s MEX/10 talk on audible UIs
- Charlotte Magnusson’s MEX/11 talk on audible and haptic UIs
The principle, part of an emerging series in the MEX journal, is summarised below in a tweetable, shareable graphic. Please share it, and thank you for citing appropriately.