360 degrees of touch and protecting user behavioural investment


The FUSE concept handset by TAT, Immersion, TI, Synaptics and Alloy has been the topic of much discussion at MWC, helping to visualise many of the experimental user experience elements that will be reaching mainstream handsets in 24 months’ time.

The device itself is covered in detail in last week’s diary article about TAT and Malmo, but I also had the chance to speak with Tom Duckery, Andrew Hsu and Nada Matic of Synaptics about some of the challenges they encountered when introducing new touch surfaces into the FUSE. The handset incorporates touch sensors along the sides, on the main screen and across the entire rear surface of the device. This has highlighted some new possibilities for user interaction, but also some potential failure points in the experience.

For instance, it has become obvious that a much higher level of filtering is necessary to ensure good usability when combining front and rear touch sensors. While the device is held in the hand, there is a danger that it receives input from both the front display and the rear capacitive panel at once. Similarly, the side sensors respond to a squeeze action, but squeezing also deforms the plastic casing of the handset, changing the way the pressure is detected.
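
As a rough illustration of the kind of filtering involved, here is a minimal sketch of how grip contacts might be rejected on a dual-sided device. The event structure, thresholds and surface-arbitration rule are my own assumptions for the sake of the example, not Synaptics’ implementation:

```python
# Hypothetical sketch of grip filtering for a dual-sided touch device.
# TouchEvent and the threshold values are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class TouchEvent:
    surface: str      # "front" or "rear"
    x: float
    y: float
    area: float       # contact patch size in mm^2
    duration_ms: int  # how long the contact has been held

def is_intentional(event: TouchEvent, front_active: bool) -> bool:
    """Reject contacts that look like grip rather than deliberate input."""
    # Large, long-lived contact patches are characteristic of a
    # holding hand rather than a tap or swipe.
    if event.area > 80 and event.duration_ms > 500:
        return False
    # When both surfaces report contact simultaneously, favour the
    # front panel and suppress the rear one to avoid double input.
    if event.surface == "rear" and front_active:
        return False
    return True
```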

Duckery also put forward the idea of protecting user investment in a particular input method or interface. Interfaces like these have the potential to learn behavioural patterns, so that over time the device becomes more responsive to a user’s particular nuances of touch. However, that investment would currently be lost if the user moved to a new device, so a new challenge is emerging around storing, protecting and porting that behavioural information over the course of a user’s lifetime.
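
One way to picture this is a learned touch profile that can be exported from one handset and imported into the next. The schema and field names below are entirely hypothetical, just a sketch of what such portable behavioural data might contain:

```python
# Illustrative sketch of a portable behavioural touch profile;
# the schema and field names are assumptions, not a real format.
import json

def export_profile(path: str, model: dict) -> None:
    """Serialise learned touch parameters so they can follow the user
    to a new handset rather than being relearned from scratch."""
    profile = {
        "schema_version": 1,
        "touch_offset_bias": model.get("offset_bias"),    # habitual aim error
        "mean_contact_area": model.get("contact_area"),   # finger size proxy
        "squeeze_pressure_curve": model.get("pressure"),  # grip calibration
    }
    with open(path, "w") as f:
        json.dump(profile, f)

def import_profile(path: str) -> dict:
    """Load a previously exported profile on a new device."""
    with open(path) as f:
        return json.load(f)
```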

I talked with Matic, Synaptics’ Usability Director, about studies that could ‘heatmap’ touch patterns on capacitive panels during different usage sessions, helping UI designers understand input flow in much the same way eye tracking has helped desktop interface designers.
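
The underlying mechanics would be straightforward: log touch coordinates during a session and bin them into a density grid, much as eye-tracking heatmaps are built. A minimal sketch, assuming events arrive as simple (x, y) pixel coordinates:

```python
# Minimal touch-heatmap sketch: bin logged touch points into a 2D grid.
# The event format (an iterable of (x, y) tuples) is an assumption.
import numpy as np

def build_heatmap(events, width_px: int, height_px: int, bins: int = 32):
    """Accumulate (x, y) touch points into a bins x bins density grid."""
    grid = np.zeros((bins, bins))
    for x, y in events:
        col = min(int(x / width_px * bins), bins - 1)
        row = min(int(y / height_px * bins), bins - 1)
        grid[row, col] += 1
    # Normalise so sessions of different lengths are comparable.
    return grid / max(grid.sum(), 1)
```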

