Interviews with TAT, Colormonkey and Polar Rose – Malmo diary
Malmo was in the grip of icy cold when I visited last week, but the Swedish city is home to a hot mobile scene. This is my diary of interviews with those at the heart of the Malmo mobile hub, covering their focus on user experience and what’s driving the city as a centre of innovation.
TAT, which provides the interface environment for some 300m handsets worldwide, is based here. So too is Illusion Labs, developer of some of the most popular iPhone apps; Tactel, a 300-strong mobile software house; Colormonkey, best known for its Runstar app on Android; and Polar Rose, the face recognition pioneer now focused on getting its platform into mobile devices.
Several of Malmo’s mobile start-ups are located in the MINC, a warehouse-style incubator building supported by local government, which offers subsidised office space and the opportunity to co-operate with like-minded companies.
My first visit was with TAT, who are today officially announcing many of the innovations I saw last week as part of their presence at MWC.
TAT’s Kastor platform and Cascades development tool enable the user interface layer on handsets from Sony Ericsson, HTC and several other handset manufacturers. The Swedish company integrates deeply into a number of mobile operating systems, adding a user interface framework which provides better graphical controls, animations, support for 3D and the ability to easily port UIs across platforms.
I spent the day with Fredrik Ademar, CTO; James Halliburton, Head of Concepts; and Ola Larsen, Vice President of Marketing. Many of the 130-strong team in TAT’s Malmo offices were busy putting the finishing touches to their demonstrations for this week’s Mobile World Congress.
The level of commitment to pushing the boundaries of mobile user experience was impressive. The company has created a wide range of concepts, from augmented reality facial recognition in partnership with their Malmo neighbours Polar Rose to a prototype interface using the dual screen support coming to a number of mobile graphics chipsets in 2011.
In addition, I was able to spend some time with the FUSE concept handset, developed in partnership with TI, Immersion, Synaptics and Alloy to showcase new ways of using multiple sensors in mobile devices.
Below is a short video filmed in TAT’s labs, with one of their engineers talking me through the capabilities of the FUSE:
The prototype was developed to push the boundaries of what’s possible with today’s interface, input and feedback technologies. It uses capacitive touch panels from Synaptics, with touch sensing on both the main screen and the whole back side of the device. It also has pressure sensors on the edges of the handset, allowing it to detect how hard it is being gripped.
Immersion has supplied haptic feedback, using 3 actuators beneath the screen to create detailed sensations under your fingertips.
The chipset at the core is a TI OMAP, which powers the TAT interface platform running on top of Linux. Alloy co-ordinated the overall design effort.
As you can see in the video, the whole interface has been implemented in 3D and moves smoothly when the device is tilted. Squeeze the sides and it will flip between the main homescreen and a panel of widgets.
You can also navigate the screen using the capacitive panel on the rear. For instance, if you touch behind one of the icons on the homescreen, it will show you a brief summary of the latest event in that application – such as the most recent email received, the latest Facebook updates or weather.
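To make that interaction model a little more concrete, here is a rough sketch of how the three input channels (front touch, rear touch and edge pressure) might be routed to actions. This is purely my own illustration rather than anything running on the FUSE, and the event names and threshold values are hypothetical:

```python
# A rough sketch (my own illustration, not TAT's code) of how the FUSE's three
# input channels described above could be routed to interface actions.
# All names and thresholds here are hypothetical.

SQUEEZE_THRESHOLD = 0.7  # below this, treat side pressure as an ordinary grip


def handle_input(event):
    kind = event["type"]
    if kind == "front_touch":
        return f"select and launch the icon under ({event['x']}, {event['y']})"
    if kind == "rear_touch":
        # Touching behind an icon previews its latest event (email, Facebook, weather)
        return f"peek at the item behind ({event['x']}, {event['y']}) without launching it"
    if kind == "edge_squeeze":
        if event["pressure"] >= SQUEEZE_THRESHOLD:
            return "flip between the main homescreen and the widget panel"
        return "ignore: normal grip"
    return "unknown input"


if __name__ == "__main__":
    print(handle_input({"type": "rear_touch", "x": 120, "y": 300}))
    print(handle_input({"type": "edge_squeeze", "pressure": 0.85}))
```

The interesting design question, of course, is exactly where that squeeze threshold sits, which is precisely what TAT’s engineers were still tuning when I visited.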
It is very much a work in progress and is quite difficult to use in its current form. The presence of the touch panel on the rear of the device caused a lot of accidental icon selections and the pressure-sensing side panels were still being fine-tuned to activate with the correct level of grip.
However, the potential was clear to see: mobile interfaces in 18 to 24 months’ time will support many additional layers. You will be able to touch them in new ways, explore in 3D and feel the device responding subtly beneath your fingertips.
TAT has always invested heavily in creating concepts and prototypes to highlight the capabilities of its platform. As a result, it is often better known for its professional services team, which helps operator and handset OEM clients to refine their user interfaces, despite its primary business in providing the underlying framework.
According to Ademar and Larsen, TAT is also now experiencing strong demand from application providers who are keen to use Cascades to develop advanced interfaces and visuals, allowing them to go beyond the standard controls offered in most mobile operating systems. Spotify, the popular mobile music client, is the first of these, utilising TAT’s cross-platform technology to develop its app for Symbian and Android.
TAT has also been working with a data visualisation company, using its advanced UI platform to create some rich graphing applications.
Another of the demonstrations I saw was TAT’s homescreen concept for Android OS devices. Running on the Nexus One, it added several enhancements.
The most impressive of these was also the most subtle: a small indicator in the lower right-hand corner showed where the user was in the carousel of six homescreen panels. Pressing on this brought up a quick 3D overview of the panels, which could then be scrolled and selected.
However, there was another layer of functionality hidden beneath the obvious navigation controls: if a user swiped from the panel indicator towards the virtual location of another panel, it opened automatically. For instance, swipe at a 90-degree angle and the first panel would open; at 75 degrees the second panel would open, and so on. It is difficult to describe without experiencing it for yourself, but in use it is a very quick and natural way to move around the homescreen.
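Out of curiosity, I tried to reconstruct the logic afterwards. The sketch below is purely my own guess at how such a mapping might work, not TAT’s implementation; the function name and the simple 15-degree step between panels are assumptions on my part:

```python
import math

# A rough sketch (my guess at the logic, not TAT's code) of mapping a swipe from
# the panel indicator to one of six homescreen panels: 90 degrees opens panel 1,
# 75 degrees opens panel 2, and so on in 15-degree steps.


def panel_for_swipe(dx, dy, panel_count=6):
    # Screen y grows downwards, so negate dy to get a conventional angle.
    angle = math.degrees(math.atan2(-dy, dx))
    index = int((90.0 - angle) // 15.0) + 1
    return max(1, min(panel_count, index))


if __name__ == "__main__":
    print(panel_for_swipe(0.0, -1.0))   # straight up, about 90 degrees -> panel 1
    print(panel_for_swipe(0.27, -1.0))  # roughly 75 degrees -> panel 2
```

Clamping the result to the panel range means an imprecise swipe still lands somewhere sensible, which may be part of why the gesture felt so forgiving in practice.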
The TAT interface also added some interesting widgets, all of which were built on the principle of displaying only the most essential information on the homescreen, but allowing quick access to more detail. A watch face displayed the time, but could also be used to easily set an alarm by tapping and dragging a control on its edge.
Similarly, the minimalist music player could be tapped to flip up a ‘cover flow’-style control for browsing through additional tracks. The daily weather display also flipped over when tapped to show additional days of forecast.
The integration with the Android OS was strong, such that the homescreen could support both native Android and TAT widgets within the same panels.
I talked at some length with Ademar, Larsen and Halliburton about the new challenges presented by building UIs for the ever-expanding range of form factors and device classes now embedded with wireless connectivity.
They are already working with manufacturers on concepts for dual-screen tablet devices, which can be held open like a traditional book. I saw a prototype UI for this on a TI demonstrator platform, where one screen was used to explore media and the other to display it.
The iPad, with its focus on e-books and media consumption, was seen as a driver for this market and a rich canvas on which to experiment with new UIs.
TAT also works with companies from other industries, such as automotive and home audio, helping them to visualise the new UI concepts which can enhance the value of their products.
As we discovered at our most recent MEX Conference in December 2009, companies all over the world and in many different sectors are now struggling with the challenge of creating experiences which transcend traditional product categories. These multi-platform services will need to combine numerous digital and physical elements within an overall interaction flow, combining different devices and touchpoints into an experience greater than the sum of its parts.
The final demonstration I saw at TAT was the augmented ID concept created in partnership with fellow Malmo residents Polar Rose. Again built using Android (this time running on a Samsung handset), the application utilises the Polar Rose facial recognition engine to instantly identify someone in the camera viewfinder and then show you their latest social network posts.
It was an impressive technology demonstration, if a somewhat worrying portent of a future in which our human interactions are initiated through the viewfinder of a camera phone!
Polar Rose
The Polar Rose engine uses patented image recognition techniques to detect which parts of a photo represent faces and then rapidly process the pixels within those areas to recognise the identity of the individual. Once it knows their identity, there are numerous potential services which can be offered.
For instance, the people within the photo could be automatically informed about the image, allowing them to see themselves on whatever social networking service they use. Alternatively, as in the TAT demo, it could be used to provide more information about those people, within the constraints of their privacy settings.
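As a way of thinking about how such services hang together, here is a rough sketch of that two-stage flow: detect the face regions, identify each one, then act within the person’s privacy settings. The API and the stubbed results are entirely hypothetical and are not Polar Rose’s actual SDK:

```python
# A rough sketch, assuming a hypothetical recognition API rather than Polar Rose's
# actual SDK, of the two-stage flow described above: detect face regions in a photo,
# identify each one, then offer a service within the person's privacy settings.


def detect_faces(photo):
    """Stage 1: return the regions of the photo that contain faces (stub)."""
    return [{"x": 40, "y": 60, "w": 120, "h": 120}]


def recognise(photo, region):
    """Stage 2: match the pixels in a region against known identities (stub)."""
    return {"name": "Alice", "allows_tagging": True}


def process_photo(photo):
    notifications = []
    for region in detect_faces(photo):
        person = recognise(photo, region)
        if person and person["allows_tagging"]:
            # e.g. notify them on their social network, or send an SMS link on mobile
            notifications.append(f"Tell {person['name']} they appear in this photo")
    return notifications


if __name__ == "__main__":
    print(process_photo(photo=b"...jpeg bytes..."))
```

Splitting detection from recognition matters on mobile: finding the face regions is cheap enough to run on the handset, while matching identities can be handed off to a server or a cached index of the user’s contacts.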
I spent some time chatting with Carl Silbersky, who took over recently as CEO of Polar Rose. He has an entrepreneurial background and was formerly Vice President of Business Development for mobile software developers Tactel, which negotiated a successful sale to a private equity firm in July 2009 despite the global economic situation.
Silbersky has been brought in to rebuild Polar Rose’s strategy with a mobile focus. The company originally launched its service as a free web offering and, while it attracted attention, revenues did not materialise as expected. Armed with additional funding and a new management team, Silbersky is now talking to manufacturers, operators and social networks about embedding the image recognition technology in their mobile applications.
He believes the accuracy and speed of their image recognition engine puts them in a unique position, despite a growing number of competitors in this space.
It is easy to see how these capabilities could be attractive to users. Photo tagging is already a popular activity on Facebook and other social networking services, with users manually indicating the identity of individuals in their photos. Once tagged, these users automatically receive a notification that they are in a photo and an invitation to view it.
Polar Rose could automate the recognition part of this and bring it to mobile. For instance, a photo taken on a camera phone could automatically generate an SMS link for everyone featured in that image to download the picture.
Colormonkey
In another part of the MINC building, Per Ogren of Colormonkey was busy with his two colleagues creating new applications for Android and iPhone. Their most successful title to date is Runstar, a beautifully crafted app designed to motivate reluctant users to run more often.
It represents a new approach in a market currently dominated by highly technical pieces of software which put features ahead of user motivation. Ogren cannot understand why most route tracking applications for mobile try to replicate the numeric interfaces of professional GPS or heart monitor watches designed for ‘fitness freaks’.
The target market for Runstar is very different and, as such, Colormonkey has invested considerable effort in creating a UI which allows users to start, track and end their runs as simply as possible. It is also working on a variety of gaming elements which will encourage competition among training partners and local communities.
Quite apart from the graphical prowess of the UI (which one would expect given Ogren’s history as a senior member of the creative team at Sony Ericsson), Runstar also has some simple but highly effective features: the application can be programmed to play a user-defined ‘power up’ song at the start of the run. This can also be blasted by hitting a single button whenever the user is feeling tired. In the future Colormonkey will explore ways to make the interface completely hands-free, using audible elements to replace graphical controls.
The Colormonkey team has been paying the bills initially with a variety of client work, taking advantage of the strong demand for its UI development skills. However, Ogren’s long-term strategy is rather different: he and his colleagues are developing products across channels ranging from mobile applications to physical toys.
Ogren is combining a recently developed love of sewing and craft with his mobile development skills: plans are afoot to create a series of cute toy characters, which will eventually tie into virtual mobile gaming elements. Two of the dolls already sit in Colormonkey’s Malmo offices, keeping watch on the work of their ‘parents’.
Why does Malmo work as a mobile hub?
Sitting on the train which carries you across the huge Oresund Bridge linking Swedish Malmo with Danish Copenhagen, I had some time to reflect on what’s driving this hub of mobile innovation before my flight back to London.
Close links with the local universities and major employers such as Ericsson are a clear benefit. Centrally-supported initiatives such as the MINC incubator also seem to be playing a significant role.
Above all, however, there was a tremendous sense of community. Arriving at TAT’s offices in the morning, I bumped into the former CEO of Polar Rose, who has now become a VC specialising in mobile. Per Ogren of Colormonkey gave TAT one of their first big client wins when he was at Sony Ericsson and subsequently went on to work for TAT himself.
Silbersky, now CEO of Polar Rose, had a long history in mobile with Tactel, which counts Ericsson among its clients. During my conversation with him, the founder of another startup working in the MINC dropped in to talk about the funding they’ve just won.
These strong local links are married with a down-to-earth approach centred on a good understanding of mobile user experience issues. It’s no coincidence that local success stories such as TAT, Tactel, Colormonkey and Illusion Labs are all vocal advocates of best practice in UX.
I’d be interested to hear about other mobile hubs around the world, particularly those with strong university links or a user experience focus. Post your ideas to the blog.