Handsets are no longer just for the hand


Marek Pawlowski, PMN

This is one of a series of articles by Marek Pawlowski, Editorial Director at PMN and founder of the MEX conference, examining the key mobile user experience issues facing the telecoms industry in 2008. These themes are highlighted in PMN’s 2008 MEX Manifesto and will be at the heart of the agenda for the 4th annual MEX conference in London on 27th – 28th May 2008.

Mobile phones were traditionally designed with the comfort of the ear in mind. The original Motorola flips, the Nokia ‘banana phone’ and the numerous chunky ‘bricks’ of the 1990s were all built primarily around the need for a device which could be held to the face for extended periods of time. If we look at how the market has evolved today, the design requirements are very different because phones are as much about visual activities like texting, email, photos and web pages as they are about the traditional function of voice.

Increasing ratio of screen size to overall interaction area in mobile phones since 1980

Consider the ratio of screen size versus the overall ‘face’ area of the device. Over time, displays have come to dominate the main interaction surface of the mobile phone. If you could track this ratio over the lifetime of the mobile industry, it would show a steadily increasing trend, starting with the single-line ‘dot matrix’ displays of the 1980s and rising through to the massive screens of the iPhone, Prada phone, Viewty and HTC Touch.
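
As a rough, back-of-the-envelope illustration of this trend, a few lines of Python make the point. The dimensions below are approximate, illustrative figures rather than measured specifications:

    # Rough comparison of screen area vs. front-face area.
    # All dimensions are approximate (in mm) and serve only to show the trend.
    devices = {
        "1990s candybar": {"face": (130, 50), "screen": (30, 20)},
        "early 2000s smartphone": {"face": (110, 55), "screen": (55, 40)},
        "iPhone (2007)": {"face": (115, 61), "screen": (74, 50)},
    }

    for name, d in devices.items():
        face_area = d["face"][0] * d["face"][1]
        screen_area = d["screen"][0] * d["screen"][1]
        print(f"{name}: screen covers {screen_area / face_area:.0%} of the face")

The exact percentages are beside the point; what matters is the steady climb towards a front face that is almost entirely display.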

The iPhone and its touchscreen have ushered in a boom for the UI design industry. Faced with Apple as a new competitor, rival handset manufacturers are recruiting UI experts as never before. Spurred into action by the fear of being left behind, management teams throughout the device business are now mandating a selection of touchscreen products in their portfolio. iPhone sales volumes may still be less than a single-digit percentage of the market, but there is no doubting the device has established a new design benchmark.

This sudden willingness to embrace the touchscreen is providing UI designers with more scope than ever before to create flexible interaction layers which adapt to provide the best interface method for individual applications.

What we are seeing is the digitisation of the man-machine interaction (MMI) layer, and the consequences will be profound.

The iPhone was the first device brave enough to implement the MMI entirely in software. In doing so, Apple prompted the industry to consider what could be achieved once it was freed from having to interact with every application through the same three or four hardware buttons.

The manufacturers with an established and consistent DNA for hardware-based MMI are now pondering how they can maintain the value of their existing investment in MMI consistency and still introduce new innovations with the same ‘wow’ factor as the Apple UI. It’s a very tough question and one that is currently keeping a huge number of UI designers and consultants in well-paid work!

However, while UI teams around the world are getting to grips with this major strategic issue, I would like to sound two notes of warning.

Firstly, a funky new UI is never the answer to all your user experience problems – there’s no silver bullet. Any new UI or MMI innovations must be part of an overall commitment to user experience. This is the most fundamental principle of everything we do with our MEX research and consultancy work – it is also the main theme of our 2008 MEX conference and the MEX Design Competition.

User experience is not a set of technologies or a layer within the product design process: it is about having a customer-centred approach at the heart of everything you do, from marketing strategy to after-sales support.

You need only spend a couple of hours with a device like the HTC Touch to recognise that, however attractive the top layer of the UI, the overall user experience will be fatally flawed if you don’t invest in the deep level of integration required to make a new interaction methodology really work.

Secondly, the priorities of interaction design are about to change again. Handsets will no longer just be for the hand (this is one of 10 key Manifesto statements for the 2008 MEX conference).

The mobile phone started as a device for the ear and has since become a device that is also for the eye. In both of these scenarios, the consistent factor is that the phone remains cradled in the palm of the hand – in 30 years of mobile handset design, this has been one of the few constants.

Finally, that is starting to change. Driven by applications like mapping, music, video and tele-conferencing, the handset is increasingly migrating from our palms and finding a new place in the environment around us.

We are starting to see phones attached to the car dashboard or pumping out music from the bookshelf of a teenager’s bedroom. They are being propped up on tables so kids can watch videos on holiday and plugged into TVs to drive photo slideshows.

Over time, the average interaction distance between users and their phones will increase significantly from the few centimetres we see today. Interaction designers can no longer take it for granted that the user will be holding the device in their hand, with their face close to the screen.

This has big implications for the design of software, the choice of input method, the use of haptics and the role of accessories to extend the experience.

As an example, I have my Nokia N95 mounted on the dashboard of the car. It can provide GPS-enabled mapping, act as a speakerphone and even play my music tracks through the car audio system. However, many of these features are simply too difficult to use unless I’m actually holding the device in my hand.

The keys are too small to press accurately while driving, so searching for an address in the mapping application is impossible unless you are parked. Similarly, I am unable to find the song I want in my music library or build a new playlist. The on-screen text is also too small to read comfortably at that distance. At night, when the dashboard of the car dims to make it easier to see the road, the handset continues to blaze at full brightness.

This is not meant to be a criticism of the N95 in particular, but rather an illustration of how the new capabilities of mobile phones are enabling out-of-hand applications while the user interaction model is still centred on in-hand scenarios.
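
To make this concrete, here is a minimal sketch, in Python and purely for illustration, of the kind of adaptation an out-of-hand interaction model implies. The thresholds, the comfortable visual angle and the idea of feeding viewing distance and ambient light into the UI are assumptions for the sake of the example, not features of any particular handset platform:

    import math

    def required_char_height_mm(viewing_distance_mm, visual_angle_deg=0.4):
        """Character height needed to subtend a comfortable visual angle
        (roughly 0.3 to 0.5 degrees) at the given viewing distance."""
        return 2 * viewing_distance_mm * math.tan(math.radians(visual_angle_deg) / 2)

    def backlight_level(ambient_lux):
        """Dim the display in dark surroundings, e.g. a car cabin at night."""
        if ambient_lux < 10:
            return 0.2   # night-time: match the dimmed dashboard
        if ambient_lux < 200:
            return 0.6   # indoors
        return 1.0       # bright daylight

    # In the hand (roughly 30 cm) versus on the dashboard (roughly 70 cm):
    print(round(required_char_height_mm(300), 1))  # ~2.1 mm characters suffice
    print(round(required_char_height_mm(700), 1))  # ~4.9 mm characters needed
    print(backlight_level(5))                      # 0.2 at night

The numbers themselves are not the point; the point is that font size and brightness become functions of the user’s environment rather than fixed, in-hand defaults.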

There are all sorts of technologies emerging which could improve this experience. Voice recognition is getting better all the time (e.g. Nuance’s ‘speak-to-search’ application). Nokia is implementing touchscreen support in Series 60, allowing for more flexible, adaptive UI design. Start-ups like Zeemote have even developed Bluetooth remote controls, allowing you to interact with your mobile phone at a distance (Zeemote’s initial focus is on handheld gaming).

Microvision, which has a long history in new display technologies, is one of several companies that have created a ‘pico’ projector using laser technology to beam videos and photos onto remote surfaces. Along with others, Microvision has also developed wearable glasses which display the screen as a tiny image just in front of the eye; because of its proximity, the image appears equivalent to a large home cinema screen.
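
The ‘home cinema’ effect of such near-eye displays is simple geometry: what the eye responds to is visual angle, not physical size, so a tiny image very close to the eye can fill the same portion of your view as a huge screen across the room. A rough worked example, with illustrative figures rather than any vendor’s specification:

    import math

    def visual_angle_deg(width_mm, distance_mm):
        """Angle subtended at the eye by an image of the given width."""
        return math.degrees(2 * math.atan(width_mm / (2 * distance_mm)))

    # An image 10 mm wide presented 25 mm from the eye...
    print(round(visual_angle_deg(10, 25), 1))      # ~22.6 degrees
    # ...fills the same angle as a 1 m wide screen viewed from 2.5 m.
    print(round(visual_angle_deg(1000, 2500), 1))  # ~22.6 degrees

In practice the optics place a focused virtual image further from the eye, but it is this angular size that determines how big the picture feels.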

Bowers & Wilkins iPod and iPhone dock by Native

For music, more and more handset manufacturers and third parties are offering speaker docks which turn mobile phones into compelling audio systems. One of the most attractive I’ve seen is the Bowers & Wilkins iPhone speaker dock designed by Native (Thomas Kleist, Director of UI Design at Native, is one of our speakers at the 2008 MEX Conference on 27th – 28th May in London). It transforms the iPhone from a personal media player into a room-filling audio experience that puts the mobile phone at the heart of the environment.

The industry faces a real and complex challenge over the next few years. On the one hand, device manufacturers must grapple with the immediate competitive implications of the iPhone and the growth in touchscreen devices. On the other, companies throughout the industry are seeking to expand the role of the phone into every area of our daily lives, including many scenarios where the handset will actually no longer be held in our hands.

We’ll be tackling these issues from several angles at MEX, the 4th annual PMN Mobile User Experience conference, in London on 27th – 28th May 2008. ‘Handsets are no longer just for the hand’ is one of the 10 key statements in our MEX Manifesto and will be addressed by Steve Chambers, President of Mobile and Consumer Services at Nuance. He will give a presentation to provoke and inspire a series of breakout discussions, where 100 leading thinkers from across the mobile business will work together to explore a number of questions relating to this topic.

Thomas Kleist, Director of UI Design at Native, will speak on ‘Content itself is the new interface’. Also addressing this topic will be Ocean Observations, before we open the session to a conference-wide debate.



1 comment

    Mike Grenville

    “The keys are too small to press accurately while driving, so searching for an address in the mapping application is impossible unless you are parked.”

    As a cyclist (and I know you are too, Marek), I find this rather reassuring!
