Designing audible experiences in the multi-platform world


Music will be one of the main drivers of multi-platform user experience. Content designed for the ear can be consumed in numerous environments, from the car to the home, not least because it doesn’t require use of our hands.

Mobile phones are the natural connectors for multi-platform user experience. They are with us constantly, and they provide either a link to cloud-based music storage or enough memory to cache large quantities of music on the device itself. It is hard to imagine any other digital touchpoint better suited to being the main interface for our personal music collections.

However, quality can be a significant barrier. Anyone who has plugged a mobile phone into their home audio system and turned up the volume on an MP3 file will know the sound is often compromised by loss of definition. If phones are to become the drivers of multi-platform music, they need to deliver sufficient quality.

I had an enlightening conversation with Jonathan Jowett and Mark Price of Dolby about these issues. Dolby is the audio processing technology found in some 4 billion devices around the world. It comes in numerous variations, but the principle is the same: it either takes existing audio content and optimises the sound output, or it uses audio created specifically for Dolby's algorithms to deliver the highest-quality listening experience.

For instance, it is often Dolby technology which creates the surround sound and dimensional effects you hear at the cinema.

According to Jowett, Dolby is now experiencing strong demand from handset manufacturers to embed its sound engine within mobile phones. This facilitates two things: first, it improves the audio experience on the device itself; second, it ensures the mobile can power considerably enhanced music playback in a multi-platform scenario, such as using a phone to supply music to an in-car or home sound system.

Currently Dolby can support up to 5.1 channel sound on phones, the same level supported by most entry- and mid-level home cinema systems (professional cinemas are now reaching 11.1 channels, where sound has direction, depth and height).

The engineering challenge behind this is significant. It means integrating at chipset level with mobile phone hardware and then tweaking the sound experience on each device to ensure the best possible quality.

Jowett was vocal about the importance of maintaining consistent quality and upholding the integrity of the Dolby experience. Put simply, the company survives by virtue of its brand strength and association with premium sound. If it loses that by poor implementation or variable performance across apps or devices, it could do irreparable harm to the rest of the business.

As multi-platform experiences grow in number, more and more companies will face these same risks of peripheral brand damage through poor user experience.

Jowett and I also discussed the emerging possibilities for designing an audible dimension into mobile UIs using Dolby's technology. After seeing demonstrations of visual 3D in the FUSE concept and from Nvidia and other chipset manufacturers, it strikes me that audible feedback in the UI should also be in 3D.

For instance, a visual event occurring lower on the Z axis could generate a lower-volume alert. Content being called forward from the visual depths could be accompanied by a corresponding 3D sound effect.
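To make the idea concrete, here is a minimal sketch (in Kotlin, and not any real Dolby or platform API) of how a UI element's depth on the Z axis might drive the loudness and notional distance of its audible feedback. The names, ranges and the linear gain mapping are illustrative assumptions, not an existing implementation.

```kotlin
// Sketch: map a UI element's Z-axis depth to audible feedback parameters.
// Depth is normalised: 0.0 = furthest back, 1.0 = frontmost.
data class UiEvent(val label: String, val depth: Double)

// Perceived loudness as a gain in decibels relative to full volume.
// Deeper elements sound quieter; the -30 dB floor is an arbitrary choice.
fun gainDbForDepth(depth: Double): Double {
    val clamped = depth.coerceIn(0.0, 1.0)
    val minDb = -30.0
    return minDb * (1.0 - clamped)
}

// A notional "distance" a 3D audio engine could use to place the sound
// further away as the element sits deeper in the UI.
fun sourceDistanceForDepth(depth: Double, maxDistanceMetres: Double = 5.0): Double =
    maxDistanceMetres * (1.0 - depth.coerceIn(0.0, 1.0))

fun main() {
    val events = listOf(
        UiEvent("background notification", depth = 0.1),
        UiEvent("content called forward", depth = 0.9)
    )
    for (event in events) {
        println(
            "%s -> gain %.1f dB, distance %.1f m".format(
                event.label, gainDbForDepth(event.depth), sourceDistanceForDepth(event.depth)
            )
        )
    }
}
```

In practice, these parameters would be fed to whatever spatial audio engine the handset exposes; the point is simply that depth in the visual UI and depth in the soundscape can share a single model.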

Dolby is not yet addressing this opportunity, but it would be a desirable addition to the range of effects available to mobile interface designers.

