Overtones in multi-sensory UX design

The way dogs experience the world, with their superior senses of smell and hearing deeply intertwined with sight, may provide an inspiring model for a creative approach to multi-sensory digital design.

The thought was prompted by an episode of the Fresh Air podcast in which Alexandra Horowitz, an expert in canine behaviour, talks to host Terry Gross about the nuances of dogs’ olfactory behaviour. In particular, she believes dogs associate changes in scent with the passing of time: as well as judging how recently a particular scent was added to a place, they can tell the time of day from the gradual movement of air currents within a room.

You can hear the conversation in the 3rd November 2017 episode of Fresh Air. I can recommend the whole programme, but the part about the link between time and smell starts at about 8 minutes 50 seconds.

I find this canine ability interesting as an example of sensory blending, particularly because it sparks creative tangents about how similar blended experiences might be created for humans. It reminded me of another, tangentially related phenomenon, this time from the world of barbershop music, where singers can produce a psychoacoustic effect known as ‘overtones’. According to Wikipedia, when overtones occur, listeners perceive a fifth pitch, higher than and distinct from the fundamentals of the four pitches being sung by the quartet.
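As a rough way to hear the effect for yourself, here is a minimal Python sketch (assuming numpy is installed) that synthesises a four-voice chord and writes it to a WAV file. The root note, the just-intonation ratios and the output filename are my own illustrative choices, not drawn from any particular barbershop arrangement; the point is simply that voices locked into simple frequency ratios reinforce shared harmonics, which is what listeners report as the extra pitch.

```python
import wave
import numpy as np

SAMPLE_RATE = 44100
DURATION = 4.0  # seconds

# An illustrative barbershop-style dominant seventh chord in just
# intonation (frequency ratios 4:5:6:7 above a 220 Hz root).
# Locked ratios like these let the combined waveform reinforce
# shared harmonics, producing the perceived extra pitch.
root = 220.0
frequencies = [root * r for r in (1.0, 5 / 4, 3 / 2, 7 / 4)]

t = np.linspace(0, DURATION, int(SAMPLE_RATE * DURATION), endpoint=False)

# Sum the four "voices" as pure sine tones and normalise.
chord = sum(np.sin(2 * np.pi * f * t) for f in frequencies)
chord /= np.max(np.abs(chord))

# Write a 16-bit mono WAV file for listening.
samples = (chord * 32767).astype(np.int16)
with wave.open("barbershop_chord.wav", "wb") as wav:
    wav.setnchannels(1)
    wav.setsampwidth(2)
    wav.setframerate(SAMPLE_RATE)
    wav.writeframes(samples.tobytes())
```

Real sung overtones depend on matched vowels and timbre as well as tuning, so pure sine tones only approximate the effect, but the demonstration is enough to hear how four pitches can suggest a fifth.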

Might it be possible to produce similar, overtone-like experiences across the sensory spectrum by combining various digital tools such as virtual reality headsets, haptic actuators and audio?

We tend to think about new forms of digital design in relation to new classes of hardware: hence the drive to design for mobile and now the drive to design for virtual reality. What if, instead, the next big driver of digital design were the search for new experiences in the conceptual space where several of our senses overlap?

There are hints of this in Apple’s delightful Animojis. The facial recognition capabilities built into the iPhone X allow it to watch users’ faces and map their expressions onto a shareable digital avatar. Apple is transposing the sound of users’ voices and the physical movement of their faces into a visually consumable experience. It is easy to imagine a set of digital tools for applying filters to this process of transposition, letting users amplify their facial expressions with a ‘dramatic’ filter or add concern with a ‘frown’ filter.
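To make the idea of expression filters concrete, here is a hypothetical Python sketch. It treats a tracked facial expression as a set of per-frame blendshape coefficients in the range 0.0 to 1.0, which echoes the style of coefficients a face-tracking system such as ARKit exposes; the blendshape names, function names and gain values below are all illustrative, not Apple’s actual Animoji API.

```python
# Hypothetical sketch: expression "filters" as transforms over
# per-frame blendshape coefficients (each 0.0 to 1.0).
# All names here are illustrative, not a real Animoji API.

Blendshapes = dict[str, float]

def dramatic_filter(shapes: Blendshapes, gain: float = 1.5) -> Blendshapes:
    """Amplify every expression, clamping to the valid 0-1 range."""
    return {name: min(1.0, value * gain) for name, value in shapes.items()}

def frown_filter(shapes: Blendshapes, amount: float = 0.4) -> Blendshapes:
    """Blend a fixed 'concerned' pose on top of the live expression."""
    frown_pose = {"browDownLeft": 0.7, "browDownRight": 0.7,
                  "mouthFrownLeft": 0.5, "mouthFrownRight": 0.5}
    mixed = dict(shapes)
    for name, target in frown_pose.items():
        base = mixed.get(name, 0.0)
        mixed[name] = min(1.0, base + (target - base) * amount)
    return mixed

# A single tracked frame, with the two filters applied in sequence.
frame = {"browDownLeft": 0.1, "mouthSmileLeft": 0.6, "mouthSmileRight": 0.55}
stylised = frown_filter(dramatic_filter(frame))
print(stylised)
```

Because each filter is just a function from coefficients to coefficients, filters compose naturally, which is what would let users stack a ‘dramatic’ amplification under a ‘frown’ overlay.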

We’re no strangers to multi-sensory design within the MEX community, of course, having hosted sessions such as Flying Object’s workshop at MEX/16, which explored how brands can use taste, smell, hearing, touch and sight to influence digital design decisions. I’d also recommend taking a look at:

Part of MEX Inspirations, an ongoing series exploring tangents and their relationship to better experience design.
