Smartphones moved by machines

Now that Sharp is controlled by Foxconn – the company which quietly makes many of the world’s smartphones – it has taken a logical next step: combining Foxconn’s economies of scale with Sharp’s brand recognition to re-enter the high-end smartphone market. The result is the Aquos R, a largely conventional offering which looks like the iPhone (coincidentally, also made by Foxconn).

So far, so unremarkable.

However, look closer and there are some juicy bits to be found at the periphery – in this case with ROBOKUKU, the Aquos R’s docking station, which swivels to face the user when receiving an incoming call. It pairs with the phone and uses its sensors to detect the user’s location, then orientates itself automatically using motors embedded in the base of the accessory.
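
As a thought experiment, here is a minimal sketch of the kind of control loop such a dock might run. Everything in it – the sensor call, the motor interface, the method names – is an assumption for illustration, not Sharp’s actual implementation.

```python
# Hypothetical sketch only: the sensor and motor interfaces are assumptions,
# not Sharp's actual API for the Aquos R or its dock.

def shortest_rotation(current_deg: float, target_deg: float) -> float:
    """Signed turn in degrees (-180..180) that brings the dock to face the target."""
    return (target_deg - current_deg + 180.0) % 360.0 - 180.0


class FacingDock:
    def __init__(self, motor, phone):
        self.motor = motor    # assumed to expose rotate_by(degrees)
        self.phone = phone    # assumed to report the user's bearing from its sensors
        self.heading = 0.0    # dock's current orientation in degrees

    def on_incoming_call(self):
        bearing = self.phone.estimate_user_bearing()  # e.g. a mic-array or camera estimate
        turn = shortest_rotation(self.heading, bearing)
        self.motor.rotate_by(turn)
        self.heading = (self.heading + turn) % 360.0
```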

It would be easy to dismiss this as a gimmick but it hints at a longer-term trend: pairing self-moving machines with smartphones.

As smartphones have gained sensors, they’ve become veritable powerhouses capable of recording, interpreting and taking action in response to their physical surroundings. The tendency for them to remain gripped in users’ hands is actually something of a limiting factor. When a smartphone is instead attached to a drone or some other movable machine, it opens up new possibilities.

ROBOKUKU enables the simple convenience of turning to face the user when receiving a call, but it could easily be extended for live tracking and speakerphone functions during a video conference.

It doesn’t take much imagination to go a little further and envisage a symbiotic relationship between, say, a smartphone, a wireless charging station and a drone. When the phone dropped below a certain battery threshold, the drone would be dispatched to collect it with a magnetic attachment arm and fly it back to the wireless charging pad, where it would be gently lowered into position.
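
Purely to make that scenario concrete, here is a toy version of the dispatch logic it implies; the device objects, method names and the 15% threshold are all invented for the sketch.

```python
# Toy dispatcher for the phone / charging pad / drone scenario above.
# Every object, method and threshold here is invented for illustration.

LOW_BATTERY = 0.15  # dispatch the drone once charge falls below 15%


def maybe_dispatch(phone, drone, charging_pad) -> bool:
    """Send the drone to fetch the phone if its battery is low and it is not charging."""
    if phone.battery_level() >= LOW_BATTERY or phone.is_charging():
        return False
    drone.fly_to(phone.location())
    drone.attach_magnet(phone)           # magnetic attachment arm
    drone.fly_to(charging_pad.location())
    drone.lower_onto(charging_pad)       # gently lower the phone onto the wireless pad
    drone.release()
    return True
```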

I suspect you will be able to think of many more possibilities.

These developments, of course, create challenges for experience designers.

For instance, how will the subtleties of motion design influence user perception of something like the ROBOKUKU? The minutiae of how quickly it turns, how soon it returns to its original position, whether it becomes over-excited and performs a little spin when receiving a call from someone on the user’s ‘favourites’ list – all of these factors could play a role in determining the quality of experience.
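
One way to appreciate how much room for character there is in those details is to parameterise them. The easing curve and the ‘favourites flourish’ below are hypothetical, but they show the kind of knobs a motion designer would be tuning.

```python
import math

# Illustrative only: hypothetical motion parameters for a dock of this kind.

def ease_in_out(t: float) -> float:
    """Smooth acceleration and deceleration, so the turn feels calm rather than abrupt."""
    return 0.5 - 0.5 * math.cos(math.pi * t)


def plan_turn(angle_deg: float, duration_s: float, favourite_caller: bool, steps: int = 50):
    """Yield (time, angle) keyframes; calls from favourites earn an extra 360-degree spin."""
    total = angle_deg + (360.0 if favourite_caller else 0.0)
    for i in range(steps + 1):
        t = i / steps
        yield t * duration_s, total * ease_in_out(t)
```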

It is yet another example of the expanding range of interconnections between physical and virtual entities which will test the skills of the user-centred design practitioner.

Thankfully, the MEX community has already been puzzling over this. A MEX/15 working group facilitated by Louisa Heinrich developed a set of early principles for the etiquette of robot user experience – a starting point for further exploration.
