Personal mobile devices have a growing role in driving experiences across multiple screens, something highlighted in MEX Pathway #2. This includes using a device such as a phone to control one or more physically separate screens, e.g. a home cinema display, as well as products where two or more screens are embedded within a single device, such as the Nintendo DS.
Momentum is gathering swiftly in the former category, with companies such as Logitech developing apps which allow Android and iOS phones to act as control devices for its Google TV box. Apple are doing the same with iOS devices and Apple TV.
Sonos, which offers a system for multi-room home audio, pioneered this concept by offering a free iOS application to supplement and – in many cases – replace its own proprietary controller. It has been a popular feature for the company and, by getting to market early, helped to differentiate Sonos’ offering on its core brand promise of simplicity and quality.
Designing experiences which work across multiple touchpoints requires a specific approach. The challenge becomes as much about designing the connections between products as the products themselves.
In particular, there are difficult issues with partial visual attention, where designers need to create control interfaces which can be used with minimal visual bandwidth, allowing the user’s focus to remain on the main display. Activities which require this visual attention to be refocused away from the primary display must deliver unique benefit to justify the interruption.
This topic will be explored in more detail at MEX on 30 Nov – 01 Dec in London. Pathway #2 includes contributions from Jason DaPonte of The Swarm, looking at how storytelling could evolve to support multiple screens, Gianluca Dianese of Toshiba, talking about design challenges with multi-screen experiences, and Greg Taylor of TigerSpike, looking at the future of abstracted display technology. Jason DaPonte will also lead a MEX breakout challenge on Pathway #2.