There’s always a first warm day in March, when winter’s low skies open into the blue promise of spring. It rarely lasts. Not in England.
But today the lawns of the garden square were dry for the first time that year and already filling with the lunch crowd. They bustled from the offices nearby, carrying plastic bags of sandwiches and salads, sitting in ones and twos, legs stretched on the grass and pale faces tilted upward to make up for lost time.
A little apart from it all, a gentleman was making his way around the sandy path at the park’s perimeter. He’d stop, adjust his shoulders almost imperceptibly, then lift his gaze for a few moments, as if thinking. He was muttering to himself. So did everyone these days. It was nearly a year ago now, at MWC 2023, that Google announced user interactions from voice commands had surpassed those from touch, many of them linked to its new smart contact lens display.
In the garden, almost everyone else was in shirt sleeves, while the man was still wearing his suit jacket. If you looked closely – really looked – you could see a slight bump in the line of the fabric stretched over his shoulders. An array of pale LEDs formed a narrow eye atop each shoulder, the aperture no wider than a pencil.
The camera eyes were almost invisible when the LEDs weren’t illuminated, fading into the dark fabric of the suit. Each time he paused, the lights glowed briefly, he muttered a few words, his eyes flicked skyward to activate the contact lens display, and then he moved on.
On sunny days like this John always ate his lunch at his desk, and as quickly as he could. The sooner he finished, the sooner he could get out into the garden square in front of the office and make the most of the light. Embodied Vision, the manufacturer of the jacket, had talked a good game at the launch about the sensitivity of ‘super pixels’, but in reality he’d found this first-generation product needed good natural light for his particular purposes.
The better lit the scene, the more likely it was that the twin camera units on each shoulder would accurately gauge depth information.
The stereo separation of the twin camera arrays, and the multiple image sensors contained within each, were crucial. They’d enabled a new direction in John’s passion for photographing flowers. After years of buying the best camera phones and amassing thousands of macro shots of flowers for his collection, the Embodied Vision jacket was enabling him to create 3D models of the flora he loved. At first he kept them purely in digital form, exploring the colours and the detail from different angles on his twenty-minute commute.
However, when Embodied Vision had announced their 3D printing partnership, he’d seen the potential for more.
The windowsill in the living room of his small apartment now housed a growing collection of 3D printed specimens. It was as close to a garden of his own as he was going to get on his budget in London.
Embodied Vision’s blog showed other users were doing similar things, from models of classic cars to estate agents producing little replicas of the houses they were trying to sell. John’s interests, though, remained resolutely focused on his flowers.
The models were imperfect, but the quality was improving with each one he produced. They typically arrived on Wednesdays, delivered to the office so he could be there to sign for the parcel. He’d bring the package home, unopened, and wait for Saturday morning. In the quiet of the apartment, he liked to sit on the sofa and compare the models with what he remembered in his mind’s eye.
Two weeks ago, upon opening his package of miniature daffodils – printed at quarter life-size – he’d discovered the slightly blurred form of a bumblebee preserved in the matte-textured plastic of the 3D print. It was a beautiful surprise. He hadn’t realised it was there when he was capturing the scene with his Embodied Vision. This one John moved to the centre of the windowsill, pride of place.
He was waiting now for the software update Embodied Vision had promised, which would introduce a gestural UI based on hand tracking.
The visual clarity of his smart contact lenses allowed him to review each shot as he took it, and at surprisingly good quality. It was the constraints of the voice commands that were limiting the overall fidelity. There was only so much he could achieve by muttering quietly to the software as he walked around the garden square. With the gestural UI, which he’d seen in Embodied Vision’s teaser video, he would gain the fine control he needed to ensure each shot was tuned before he moved on to the next plant.
One day, when he had a garden of his own, he wanted to produce a series: capturing an allium – his favourite – through each stage of its life, from the first shoots of the bulb to its unfolding into its full form.
Design fiction grounds future technologies in user-centred principles before they’ve fully emerged. It is at its most effective as a circular conversation between engineers, user researchers and product strategists. The key, as with any good story, is believability: either because the listener imagines it to be possible, or because it is sufficiently desirable that they are willing to suspend disbelief. This story is part of the MEX User Stories series.
The scenario above touches several ongoing threads within the MEX initiative:
- Emerging imaging technologies, like Intel’s RealSense, Google’s Project Tango and the appearance of early depth-sensing cameras in smartphones like the iPhone 7 Plus and Lenovo Phab 2 Pro. In the near future, multiple camera sensors will become the default for personal digital imaging, combining with computational photography to allow users to capture the world in 3D. Timeline: 3 years.
- Digital eyewear. For all the failings of experimental efforts like Google Glass, the potential of delivering a digital display directly to the eye is too compelling to discount. Companies, including Google, continue to work on the long-term goal of a smart contact lens. Dispelix is making progress on bringing displays closer to the eye. In tandem, increasingly affordable virtual reality experiences, delivered through eyewear and headsets, are preparing the market for consumer acceptance. Timeline: 8 years.
- 3D printing continues to fall in price and improve in capability. Computer companies, from HP to Microsoft, are beginning to understand the highest margins in personal computing will come from products which increase users’ creative capabilities. The first signs are appearing in HP’s Sprout and Microsoft’s Surface Studio. User-centred digital experiences can strengthen the maker movement. Timeline: 5 years.
- Gestural UI. The groundwork has already been laid and is beginning to seep into the consumer market through virtual reality products like HTC’s Vive, Sony’s Playstation VR and Google’s Daydream – all of which come with hand controllers. Within 4 years, these gestural inputs will rely on tracking cameras rather than embedded accelerometers, allowing physical hand controllers to become an optional extra rather than a necessity for gestural input on the move.
- Smart fabrics. Research for episode 26 of the MEX podcast on wearables revealed interesting long-term opportunities in truly embedded digital capabilities, where clothing itself gains new function, rather than the clip-ons and wrist-worn devices we’ve seen so far. Narrative, which created a stylish life logging clip-on paired with clever software for managing the minute-by-minute image stream, could be seen as the ‘too early’ pioneer of the fictitious Embodied Vision product described above.
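The stereo-capture thread above rests on a simple geometric relationship: for two rectified, parallel cameras, the depth of a matched point is inversely proportional to its disparity between the left and right views. A minimal sketch of that calculation follows; the focal length and baseline figures are illustrative assumptions, not specifications of any real (or fictional) product.

```python
# Depth from stereo disparity for a rectified camera pair: Z = f * B / d
# f: focal length in pixels, B: baseline (camera separation) in metres,
# d: disparity in pixels between the left and right views of the same point.

def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Return depth in metres for a matched point in a rectified stereo pair."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive; zero disparity means the point is at infinity")
    return focal_px * baseline_m / disparity_px

# Illustrative values: shoulder-mounted cameras ~0.4 m apart, 1000 px focal length.
# A flower 0.5 m away produces a large disparity (1000 * 0.4 / 0.5 = 800 px),
# while a building 40 m away produces only 10 px -- which is why close-range
# macro subjects suit stereo depth capture far better than distant scenes.
print(depth_from_disparity(1000.0, 0.4, 800.0))  # 0.5
print(depth_from_disparity(1000.0, 0.4, 10.0))   # 40.0
```

The wide shoulder-to-shoulder baseline in the story is a plausible design choice for exactly this reason: a larger separation yields larger disparities, and therefore finer depth resolution, at the close ranges where macro photography happens.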
Would you like to share a design fiction story of your own? 3 tips to get started:
- Human first. Think about ‘who’, before you consider ‘what’ and ‘where’.
- Why? There has to be a believable reason for the characters’ actions, as well as the motivations for any of the companies – real or imagined – involved in the story.
- Forwards, then backwards. While your story will be forward facing, can you find a believable path back to the present day from the future you’ve created?