Sony Ericsson LiveView splits mobile UX across 2 devices


The focus of multi-screen user experiences, where different elements of a service are delivered across a number of displays, has traditionally been on using a handheld device to push content to a larger, fixed screen. However, Sony Ericsson’s recently announced LiveView accessory targets another possibility: a wearable, ultra-portable screen, smaller than the display of the mobile phone, which can deliver snippets of information to the user.

LiveView can show text messages, tweets, Facebook updates, RSS feeds, calendar entries, missed calls and music tracks. Other features include a locator service, which helps you find the paired phone by buzzing it, and an option to mute incoming calls.

It is primarily a viewing device, but there is also a limited range of control functions, such as pausing, skipping and adjusting the volume of music using buttons on the side of the device.

The LiveView’s screen is a 128 x 128 pixel OLED. It is designed to be worn, with a clip on the back to attach it to clothing, a handbag strap or folder.

The link works over Bluetooth, pairing the accessory with Android devices. Sony Ericsson is opening the system to developers, allowing them to build applications for its Android phones which take advantage of the dual-screen user experience. One of the first will be the Sony Ericsson Sports Pack, a bundle which includes the LiveView, a carrying case and a fitness application which works across both screens.
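
To illustrate the kind of split this invites, here is a minimal, self-contained Java sketch of one way a phone-side application might decide which events are worth mirroring to a small secondary screen. All of the names and values here (SecondaryDisplay, SnippetForwarder, the 80-character threshold) are hypothetical placeholders, not part of Sony Ericsson’s actual LiveView SDK, whose interfaces had not been published at the time of writing.

```java
// Illustrative sketch only: hypothetical interfaces, not the LiveView SDK.

/** Abstraction over a small secondary screen (e.g. a 128 x 128 wearable display). */
interface SecondaryDisplay {
    void showSnippet(String title, String body);   // push a short text snippet
    boolean isConnected();                          // is the Bluetooth link still up?
}

/** Decides which phone events are worth mirroring to the wearable screen. */
class SnippetForwarder {
    private final SecondaryDisplay display;

    SnippetForwarder(SecondaryDisplay display) {
        this.display = display;
    }

    /** Forward only short, glanceable updates; drop anything too long to read at a glance. */
    void onPhoneEvent(String source, String text) {
        if (!display.isConnected()) {
            return; // accessory out of range, nothing to do
        }
        if (text.length() <= 80) {
            display.showSnippet(source, text);
        }
    }
}

public class Demo {
    public static void main(String[] args) {
        // Stand-in display that prints to the console instead of a real accessory.
        SecondaryDisplay console = new SecondaryDisplay() {
            public void showSnippet(String title, String body) {
                System.out.println("[" + title + "] " + body);
            }
            public boolean isConnected() { return true; }
        };

        SnippetForwarder forwarder = new SnippetForwarder(console);
        forwarder.onPhoneEvent("SMS", "Meeting moved to 3pm");
        forwarder.onPhoneEvent("RSS", "A long article body that would not fit on a tiny "
                + "wearable display, so the forwarder ignores it.");
    }
}
```

The point of the sketch is the design decision it encodes: the phone remains the source of truth, and the wearable screen only ever receives snippets short enough to be glanceable, which is exactly the contextual filtering discussed below.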

Multi-screen user experiences require designers to think in detail about context, asking themselves which features they can make more convenient or compelling for customers by making them available through another display device. They also present challenges at the input layer. Using different input methods on each device in a multi-screen user experience can confuse the user, but sometimes it is impossible to offer the same input mechanisms across all elements. Conversely, multi-screen user experiences also offer possibilities to enhance input, by allowing the device with the most natural method (e.g. touch) to control a screen with a less satisfying input system (e.g. a TV with a complicated remote).

With an open system for LiveView application development, the Sony Ericsson device should provide a testing ground for designers keen to experiment with multi-screen mobile user experience. It ships in Q4 2010.

‘Research the implications of supporting more than one screen from a single device’ is 1 of 6 MEX Pathways at the next MEX event in London on 30 Nov – 01 Dec. 100 of the deepest thinkers from across mobile, media and design will be coming together to create new ideas around the MEX Pathways over 2 days of expert presentations and practical working sessions.

