Honda’s new advertising campaign for the Jazz vehicle enables iPhone users to ‘grab’ characters from the TV screen by swiping their device through the air. It provides an interesting example of how the illusion of communication can be created between TV programmes and mobile devices, providing a multi-platform user experience which is greater than the sum of its parts.
Developed by Wieden + Kennedy, the application uses technology from Gravity Mobile to listen to the soundtrack of the advertisement. When a certain point is detected in the audio, the app illuminates an image of one of the cartoon characters on the iPhone’s screen, and the user can swipe the device through the air to ‘capture’ the TV character onto their mobile device.
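Gravity Mobile has not published how its detection works, but as a rough illustration of the idea, here is a minimal Python sketch of one way an app might listen for a known audio cue: the Goertzel algorithm checks whether a particular frequency is present in a frame of microphone samples. The frequencies, threshold and function names here are my own assumptions, not Honda’s implementation.

```python
import math

def goertzel_power(samples, sample_rate, target_freq):
    """Estimate the power of a single frequency in a block of samples
    using the Goertzel algorithm -- cheaper than a full FFT when only
    one frequency is of interest."""
    n = len(samples)
    k = round(n * target_freq / sample_rate)  # nearest DFT bin
    omega = 2.0 * math.pi * k / n
    coeff = 2.0 * math.cos(omega)
    s_prev, s_prev2 = 0.0, 0.0
    for x in samples:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    return s_prev2 ** 2 + s_prev ** 2 - coeff * s_prev * s_prev2

def cue_detected(samples, sample_rate, cue_freq, threshold):
    """Flag a cue when the target frequency's power exceeds a threshold."""
    return goertzel_power(samples, sample_rate, cue_freq) > threshold
```

A real implementation would most likely match a fingerprint of the whole soundtrack rather than a single tone, but the principle – continuously analysing incoming audio for a known signature – is the same.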
There are several characters, including a mystery bonus, and a different type of swipe seems to be required to ensure each one is captured correctly. The application uses input from the iPhone’s accelerometer to detect the swipes.
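Honda has not documented its gesture logic, but classifying accelerometer input into swipes can be as simple as thresholding the peak reading on each axis and picking the dominant direction. The sketch below is illustrative only – the axis names, threshold and direction labels are assumptions, not the app’s actual code.

```python
def classify_swipe(ax, ay, threshold=1.5):
    """Classify a swipe from peak accelerometer readings (in g).

    A 'capture' swipe is assumed to be a sharp movement dominated by
    one axis; readings below the threshold on both axes are ignored."""
    if abs(ax) < threshold and abs(ay) < threshold:
        return None  # too gentle to count as a swipe
    if abs(ax) >= abs(ay):
        return "right" if ax > 0 else "left"
    return "up" if ay > 0 else "down"
```

Requiring a different type of swipe per character would then just mean checking the classified direction against the one that character expects.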
Once captured, users can interact with the characters on the screen of the iPhone. The application also provides a link to a YouTube clip of the advertisement and some basic information about the Jazz vehicle.
Unfortunately the first version of the application (a new edition is promised as ‘coming soon’ as of 9th February 2011) delivers a poor user experience for actually booking a test drive of the Jazz. The link simply loads Honda’s standard web page inside the app, where the tiny text and desktop-style interface make it difficult to complete a booking request. I would hope to see this addressed in a future version.
Overall, however, the application is an achievement. By making clever use of existing technology, it enriches the central theme of the campaign – the unpredictability of life’s journey – by allowing the viewer to become part of Honda’s beautifully illustrated cartoon story. It picks up on and extends behavioural trends such as the rise of gestural input for gaming consoles (e.g. Wii and Xbox Kinect) and growing mobile usage while watching TV.
MEX has been looking at multi-platform experiences for some time and will do so again in MEX Pathway #2 at the MEX event in London on 4th – 5th May 2011. Most recently, this Pathway found that existing multi-screen interactions tend to be driven from a small screen to a bigger one, such as using Apple’s AirPlay to display an iPhone photo on an Apple TV. This campaign from Honda reverses that flow, giving users the illusion they are transferring content from the big screen to the small.
The use of sound recognition to achieve this suggests an interesting future for the technique. Separately, a social TV guide application – IntoNow – launched recently in the US, using similar audio recognition to detect the programme a user is watching and display information and community options specific to the show.
Also, we previously reported on this concept video (first video, from 02:00 onwards) from Berg and Dentsu, which shows a wireless device listening to background conversation and searching the web to display images related to what it hears.
Thanks to MEX alumna Julie Strawson of Monotype Imaging for pointing us to the app. Try it for yourself by downloading the iPhone application from the App Store and watching the original advert on YouTube.
The question is: what other experiences could be created with this approach? I’d love to hear feedback from you on the possibilities. Please post a comment to the blog below.