How artificial intelligence nuances experience design

Through our recent podcast interviews and conference sessions (e.g. Nathan Benaich and Ed Rex), I’ve become more aware of how the intricacies of machine learning affect the end user experience. Two choices stand out: the neural network technique itself (there are several different approaches) and the balance between human classifiers and purely machine-classified data.

A team from Google is sharing a multi-part series on their research blog about how they’re using these techniques to improve the effectiveness of the ‘You might also like’ and ‘Similar apps’ suggestions in the Google Play Store:

Compared to the control (where no re-ranking was done), we saw a 20% increase in the app install rate from the “You might also like” suggestions. This had no user perceivable change in latency.

From the end user’s perspective, there has been little visual change in the appearance of the Play Store, but behind the scenes Google has redesigned the entire process around machine learning to provide more relevant suggestions.
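To make the mechanics a little more concrete, here is a minimal sketch of the re-ranking idea described in the quote above: an existing list of candidate suggestions is re-ordered by a relevance score. The scoring values and app names are hypothetical placeholders for illustration, not Google’s actual model.

# Minimal sketch of re-ranking: re-order candidate suggestions by a
# relevance score. Scores and app names below are invented placeholders.
from typing import Callable, List, Tuple

def rerank(candidates: List[str],
           score: Callable[[str], float]) -> List[Tuple[str, float]]:
    """Return the candidates ordered by descending relevance score."""
    return sorted(((app, score(app)) for app in candidates),
                  key=lambda pair: pair[1], reverse=True)

# Illustrative usage with hard-coded scores.
toy_scores = {"fitness_tracker": 0.41, "recipe_planner": 0.87,
              "step_counter": 0.35}
for app, s in rerank(list(toy_scores), toy_scores.get):
    print(f"{app}: {s:.2f}")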

Discovering these nuances requires both an understanding of what an app does, and also the context of the app with respect to the user. For example, to an avid sci-fi gamer, similar game recommendations may be of interest, but if a user installs a fitness app, recommending a health recipe app may be more relevant than five more fitness apps. As users may be more interested in downloading an app or game that complements one they already have installed, we provide recommendations based on app relatedness with each other (“You might also like”), in addition to providing recommendations based on the topic associated with an app (“Similar apps”).

One particularly strong contextual signal is app relatedness, based on previous installs and search query clicks. As an example, a user who has searched for and plays a lot of graphics-heavy games likely has a preference for apps which are also graphically intense rather than apps with simpler graphics.
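As a rough illustration of how a relatedness signal could be computed, the sketch below treats each app as the set of users who installed it and takes the cosine similarity of those sets. This is an assumption made purely for illustration: the quoted post does not specify Google’s actual formulation, and the install data is invented.

# Rough sketch, assuming relatedness is estimated from co-installs:
# each app is represented by the set of users who installed it, and
# relatedness is the cosine similarity of those sets.
from math import sqrt

installs = {  # hypothetical install data: app -> set of user ids
    "space_shooter": {1, 2, 3, 4},
    "galaxy_racer":  {2, 3, 4, 5},
    "recipe_box":    {6, 7},
}

def relatedness(app_a: str, app_b: str) -> float:
    users_a, users_b = installs[app_a], installs[app_b]
    return len(users_a & users_b) / sqrt(len(users_a) * len(users_b))

print(relatedness("space_shooter", "galaxy_racer"))  # high: shared users
print(relatedness("space_shooter", "recipe_box"))    # 0.0: no overlap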

In part 1 of their series, the team describes how they chose their particular machine learning approach, and also the steps they went through to improve the work of the human classifiers tasked with producing the training data for the system.
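One practical way to improve labelling work of this kind is to measure how consistently different human classifiers label the same items. The sketch below computes Cohen’s kappa for two hypothetical raters; the post does not say this is the metric Google used, so treat it only as an illustration of the sort of quality check such a pipeline needs.

# Hedged sketch: inter-rater agreement (Cohen's kappa) between two
# human classifiers labelling the same apps. Raters and labels are
# invented for illustration.
from collections import Counter

def cohens_kappa(labels_a, labels_b):
    n = len(labels_a)
    observed = sum(a == b for a, b in zip(labels_a, labels_b)) / n
    counts_a, counts_b = Counter(labels_a), Counter(labels_b)
    expected = sum((counts_a[c] / n) * (counts_b[c] / n)
                   for c in set(labels_a) | set(labels_b))
    return (observed - expected) / (1 - expected)

rater_1 = ["game", "game", "fitness", "fitness", "game", "music"]
rater_2 = ["game", "game", "fitness", "game",    "game", "music"]
print(round(cohens_kappa(rater_1, rater_2), 2))  # ~0.71: substantial agreement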

This shift towards machine learning refinements is happening throughout Google’s product range and is arguably driving the most significant changes in overall user experience, above and beyond any visual design advances the company has introduced.

Part of Friday Inspirations, an ongoing MEX series exploring tangents and their relationship to better experience design.  We explain the origins of the Inspirations series in this MEX podcast and article.  Share your own inspirations on Twitter at #mexDTI.
