A growing number of our everyday tasks no longer need our human input. How has this happened, and what’s still to come?
Think of early mornings, a time when few of us are at our best and many need a routine to get them ready for the day. Instead of relying on your morning muscle memory to get you through the motions, what if your alarm clock knew the best time to wake you up? What if your shower knew you were heading its way and started warming the water beforehand? What if your coffee was waiting for you as you walked downstairs, and your car turned on as you locked your front door? Mornings would be a breeze.
Getting to that level of artificial assistance, one which can conveniently and reliably anticipate our needs and make decisions to accommodate them, requires a cohesive platform that draws in disparate devices and data sources. As personal assistants such as Alexa, Siri, and the Google Assistant grow in capability, this is the challenge designers and developers will face in the coming years. But we aren’t starting from scratch: anticipatory experiences have been edging into our lives for years now.
Anybody who has opened Google and started typing a query will be familiar with its suggested search, which can at times be startlingly accurate even when searching for something seemingly obscure. Google has moved on from simply trying to understand what we are looking for, and now tries to anticipate what we’re going to look for. The more it learns about us and our web usage patterns, the faster it finds the information we want.
Autocorrect is another example. While it might seem like a technology that is still struggling with accuracy, leading digital keyboard makers such as SwiftKey have begun adding neural networks to their predictive text algorithms. These work alongside a user’s historical behaviour, not only recognising and ignoring mistakes more easily, but also accurately predicting phrases a user has never typed before, drawing on a database of over one billion words and 100 million sentences.
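SwiftKey’s real models are neural networks trained on vast corpora; as a much simpler illustration of the underlying idea, here is a toy sketch (in Python, with hypothetical names) of blending a general-language model with a user’s own typing history, weighting personal habits more heavily:

```python
from collections import Counter, defaultdict

def build_bigrams(text):
    """Count word -> next-word transitions in a body of text."""
    words = text.lower().split()
    model = defaultdict(Counter)
    for current, nxt in zip(words, words[1:]):
        model[current][nxt] += 1
    return model

def predict_next(word, general, personal, personal_weight=2.0):
    """Blend general-language counts with the user's own history,
    so the user's habits outrank the population average."""
    scores = Counter()
    for nxt, count in general[word.lower()].items():
        scores[nxt] += count
    for nxt, count in personal[word.lower()].items():
        scores[nxt] += count * personal_weight
    return [w for w, _ in scores.most_common(3)]

# Tiny illustrative corpora, not real training data.
general = build_bigrams("i am going to the shop i am going to work")
personal = build_bigrams("i am going to the gym")
print(predict_next("the", general, personal))  # the user's "gym" ranks first
```

A real keyboard replaces the bigram counts with a neural language model, but the personalisation step, folding the user’s history into the ranking, works on the same principle.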
Beyond language, retail companies have also been experimenting with and making breakthroughs in anticipatory design. Companies such as Kroger and JuiceBeauty have invested in proactively suggesting or purchasing certain products before the user runs out, while Amazon has gone so far as to explore predicting what a user is likely to buy and moving products towards them before an order is placed, with the aim of expediting delivery.
As more services start attempting to predict our actions, more of the small decisions and menial tasks in our everyday lives will be handed over to algorithms. Remaining in control of this automation is a key concern, particularly where financial decisions are involved, but personal assistants such as Siri, Alexa, and the Google Assistant are well positioned to develop into both an interface for the user and a gateway for companies.
Right now these assistants are useful for reactive actions, such as playing music or turning the lights on, but they’ve yet to make the leap to proactive actions, such as anticipating that we would like the lights on and suggesting it to us.
Getting to this point is partly a question of technology and partly one of data. Expanding assistants to more devices expands their range of data points, such as behaviours and location: they learn where we go and when; when we are working and when we are relaxing at home; when and where we check our bank accounts; which friends we text the most; and more, slowly building up the personal behavioural profiles needed to make accurate and relevant guesses at our future actions.
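In its simplest form, a behavioural profile is just a record of what a user does and when. As a rough sketch (all names hypothetical, and far simpler than a production system), logging timestamped actions and predicting the most frequent action for the current hour already captures the core of the idea:

```python
from collections import Counter, defaultdict
from datetime import datetime

class BehaviourProfile:
    """Toy behavioural profile: log what a user does and when,
    then guess the most likely action for a given hour of day."""

    def __init__(self):
        self.by_hour = defaultdict(Counter)

    def log(self, action, when):
        """Record one observed action with its timestamp."""
        self.by_hour[when.hour][action] += 1

    def predict(self, when):
        """Return the most frequent action seen at this hour, if any."""
        observed = self.by_hour[when.hour]
        if not observed:
            return None
        return observed.most_common(1)[0][0]

profile = BehaviourProfile()
profile.log("make coffee", datetime(2024, 1, 1, 7, 30))
profile.log("make coffee", datetime(2024, 1, 2, 7, 45))
profile.log("check bank balance", datetime(2024, 1, 2, 7, 50))
print(profile.predict(datetime(2024, 1, 3, 7, 15)))  # prints "make coffee"
```

A real assistant would condition on far more than the hour, including location, day of week and which devices are nearby, but the shape is the same: accumulate observations, then rank likely next actions.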
Soon, our smart assistants will be able to order more coffee and milk when we start running low, push a bank balance update to us just as we go to check it, curate and push us the day’s news stories just as we find our seat on the bus, or even flag that traffic on the route to work is heavy today and advise us to leave a few minutes earlier. In that final case, it would also alert our alarm clock, shower, coffee machine and car to make sure they all run ahead of their usual schedule too. However, all of this depends not only on people accepting this kind of automation, but on them actively adopting it.
If automation is introduced seamlessly and delivers clear benefits, adoption is smooth, or even goes unnoticed by the user, as was supposedly the case with eBay’s site colour change.
Intelligent personal assistants need data to develop their predictive behaviour, and there are still a lot of issues to be worked out in that domain, particularly around privacy. The challenge for designers and developers is to ensure that the growing role of predictive assistants in our lives is a smooth change and, above all, a beneficial one. There will certainly be, and have been, some misfires: assistants making the wrong assumptions, or users resisting them. But this is all part of the process of innovation and development.
Anticipatory experiences are one of the next steps in the evolution of user experience. Trusting technology to analyse, predict, and decide better than we can - using historical data, real-time data, and even a little creative licence - and relinquishing control of some of our decisions to AI will be a choice we have to make if we really want more of our valuable time back. An easy choice for some.
So now what?

For Screenmedia and our clients, it’s back to the lab, where we’ve been exploring the use cases to come. If you're looking for more than just words, get in touch to see how we can help you out with your next project.