Categories
Communication Innovation UX

Voice interface is still overpromising and underdelivering

Around 15 years ago Flash technology was in the ascendancy. One of the odd conventions to emerge at the time was the ‘Flash intro’. Very often, to build your anticipation for the website awaiting you, you would be entertained with what was essentially an opening title sequence. And if you were really unlucky, on the other side of it was a website fully rendered in Flash.

What you wanted was content; what you got was an extended journey through a designer’s ego trip (and yes I should know, I was one of those designers). The basic premise of a Flash-built website was that tricking out the interface would make for a better user experience. That assumption turned out to be wrong.

With Siri, Alexa et al entering our lives, our interfaces now have personalities. If a digital assistant misunderstands our requests we are likely to learn about it through a witty quip. TV ads featuring virtual assistants often make a particular show of droll one-liners emanating from the device.

But as Nielsen Norman Group research shows, voice interfaces are falling far short of user expectations. It seems that priorities need to be reassessed.

A little less conversation

As part of a project last year I began designing for a command line interface. To someone with no previous experience, a terminal window or console can be a daunting place. Initially I was puzzled as to why user prompts and feedback in this world were so clinical and abrupt. Why would command line users not want to be addressed in a more human fashion? The answer lies in task efficiency.

Command line interface evolved from single-line dialogue between two human teleprinter operators. Over time, one end of the human-human dialogue became a computer, and the conventions remained. These interfaces provide users a more efficient method of performing tasks. In short, command line users are just like the rest of us: that is, trying to perform a lot of tasks in as short a time as possible, without surplus dialogue or clutter getting in the way.
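That efficiency shows up in a long-standing Unix convention: a command that succeeds says nothing, and output appears only when there is information to convey or an error to report. A minimal, illustrative exchange (the file and directory names here are invented for the example):

```shell
# Unix convention: silence means success. Commands emit output only
# when they have something to tell you, or when something went wrong.
mkdir -p reports                          # succeeds silently
echo "Q1 figures" > reports/summary.txt   # succeeds silently
ls reports                                # prints: summary.txt
rm -r reports                             # cleans up, again silently
```

No greetings, no confirmations, no quips: the absence of dialogue is itself the feedback.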

This method of working is totally in keeping with our tendency towards ever more concise communication. Email is on the wane due to the long, unwieldy threads it encourages. The rise of chat apps such as Slack is due in large part to the preference for shorter, punchier messages. We’re making fewer mobile calls, opting instead for text messages using abbreviations, acronyms and emojis.

Many rivers to cross

As designers we are not always trying to mimic a conversation. We are creating an exchange which delivers for the user as efficiently as possible. To re-cast all human-computer interactions as conversations is to misunderstand our relationship with machines and devices.

The obstacles to success with voice UI are many. Users need to think more than once about the commands they give. They are required to speak in a manner that often isn’t natural for them. Even relatively simple queries may need to be broken down into smaller questions before reaching anything like the right answers.

When barriers are placed between a user and the outcomes they want the end result is predictable: they will simply opt out. A report from The Information suggests that only 2% of Alexa speakers have been used to make a purchase from Amazon in 2018. Additionally, 90% of the people who try to make a purchase through Alexa don’t try again.

We are still some distance away from the dream that voice UI promised. Perhaps this is voice’s Flash period, where the user needs to work hard to access the content they want. And I’m willing to bet that most frustrated users would be willing to trade every ounce of their virtual assistant’s sassy responses for just a little more efficiency.

The fact is that voice UI is still pretty hard work, no matter how hard Siri or Alexa try to entertain us.

Categories
Innovation UX

UX as a new year’s resolution

Gym membership is about to undergo its traditional annual boost; even now the introductory offers have been readied to greet the queues of earnest individuals who feel that the time has come to make that long–overdue change to their regimen. Sports shops will be visited, gym wear and protein shakes will be purchased.

Television ads, with easy answers to any number of personal improvement and personal transformation challenges, will race to fill the vacuum left by the stores trying to convince us that they ‘do’ Christmas better than all the others. Exercise gadgets, diet plans, fitness DVDs by an endless string of B and C list celebrities will be paraded across our screens. We all know the pattern. It is recurrent, seasonal.

In boardrooms and meeting rooms across the land, similar cyclical activities may be underway: discussions centering around the need to improve performance in particular areas. Just possibly, this may involve performance online. Maybe that website that no–one believes is really pulling its weight for the organisation has had its time. Yes, that’s it, it’s time for a new website.

Fulfilling Einstein’s definition of insanity – doing the same thing over and over again and expecting a different result every time – a lack of attention to underlying challenges will result in repeated, failed attempts to meet any tangible goals. No questions are asked of the old website. No specific, relevant targets set for the new one. Clarity of purpose gets lost in interdepartmental wrangling.

In many cases, a vague sense of the website having to achieve something will exist, but what that something is will too frequently be weighted towards the organisation’s view of the world.

Increasingly, discussion of this nature will turn towards the need for a UX/UI/[insert your acronym of choice here] guru. Sometimes this will involve hiring a single individual to provide the essential missing ingredients – as evidenced recently, with a company seeking to recruit a “Creative Front–end / UI / UX Engineer”. Nothing could signal more clearly ‘we don’t know what we’re doing but we hope that by throwing some terms around something magic will happen’.

Without something changing beyond the veneer then, like the hopeless dieter who is all talk at the water cooler while gorging on nachos and chocolate at night, very little is likely to change. Without a commitment to a customer-centred approach the new digital venture, be it an app, a website, whatever, becomes that oddest of man-made endeavours: a folly.

The lessons from recent history are clear right across the digital industries: innovation and market advantage are gained through customer insights and user-centred action.

GOV.UK is quietly revolutionising transactional services through a commitment to understanding the needs of the user. If you have renewed your car tax online recently, you’ll perhaps know what we mean. 

Umpqua Bank has gone from 6 branches in 1994 to some 364 branches today, across 5 U.S. states, counter to conventional wisdom which says that physical branches are a thing of the past. Through deeper understanding of customer needs, Umpqua created spaces that customers actually want to visit.

Zappos built a billion-dollar business by eschewing traditional media and investing instead in a superior user and customer experience, well ahead of that offered by its competitors.

These advantages require a change that is more than just a fad – or a diet if you like. It is a lifestyle change, a cultural change that requires buy-in from all levels within an organisation.

So when those familiar conversations begin again around you, ask yourself: “How are we actually going to win this time? What is going to be different this time round?”.

A Happy and Prosperous New Year to you, and whatever your New Year resolutions are, may they bring the lasting change that you aspire to.

Categories
Innovation Process UX

One-eyed kings: the masters of innovation

One sure way to spice up any conversation around design or innovation is to note that Apple have never invented any device.

While it may be met with some resistance, the fact remains that they didn’t invent the PC, nor the MP3 music player. They are not responsible for the tablet computer, the smartphone or indeed the smartwatch. For all of their (deserved) reputation as innovators, Apple have yet to debut any mass market device that didn’t exist previously.

Given that one of the greatest innovators of our age has achieved this position by essentially coming to the market second or third with their offerings, how have they managed to achieve such a lofty status?

“In the land of the blind the one–eyed man is king” – Desiderius Erasmus c.1500

Apple’s greatest asset by far has been their fieldwork; relentlessly studying how people behave, discovering what they need – and why. They then do one of two things: either a) successfully define a problem, and apply existing technology to solve it in a superior manner, or b) identify inherent problems within existing technologies and successfully solve those.

In doing so they stimulated latent consumer needs which then triggered demand for their product. This is Apple’s genius, and this is innovation.

The company’s rare failures tend to be situations where they tried to solve problems that didn’t yet exist. As an example, the Apple Newton MessagePad was a tablet by any other name but it came too soon to an unprepared market. In retrospect the consumer PC market itself hadn’t yet been properly established; the public hadn’t yet come to value personal computing of any kind, let alone tablet computing. For a modern-day comparison, one has to wonder whether Apple’s reported removal of the headphone jack from the next generation of iPhones is straying into just this territory.

Build a better mousetrap and the world will beat a path to your door – attributed to Ralph Waldo Emerson, c. 1889

No matter what a business or product offers, someone else has either already tried it, or is currently doing it. To innovate, you need simply to do it better than those others. And by “better” read ‘in a more customer-centric fashion’. A surprising number of founders, businesses, organisations don’t appear to have grasped this. Investing heavily in what may be incremental improvements might not set the boardroom alight, but that’s where the gold is. As with design, innovation is a process not an event. A verb, not a noun.

Correctly defining the problem is more than half the battle in product development. Putting the customer at the centre of the solution is the rest. And to any cries of “but… what about marketing??” in response to that last point, let’s answer it with a look at the banking industry.

Banks are so far behind where they should be with their services it’s tempting to be embarrassed for them. Banks are prime examples of organisations that have tried to market their way out of problems that should have been answered by simply providing better services. This approach has led banks into the unenviable position of being some of the least customer-centric businesses in the world. Millions of RBS customers unable to withdraw money from cash machines for days on end would attest to that.

The financial sector is gradually waking up to the fact that design thinking can be applied to services every bit as much as products. Design thinking doesn’t need to be the territory of the ‘big thinker’ or genius designer. It belongs to everyone on a team – including designers.

Of course everyone wants to innovate. And innovation can be managed through a pragmatic process of observation, competitive benchmarking and incremental improvement. Just ask Apple.

This post first appeared on the FATHOM_ blog.

Categories
Innovation Product UX

One size fits none

Who doesn’t love easy answers? Don’t we all feel good whenever the way forward comes quickly to hand?

With less effort, pain and outlay involved, the lure of easy answers is such that we tend to place undue emphasis on initial conclusions. This cognitive heuristic is called ‘anchoring’ or focalism, where we rely too heavily on the first piece of data we’re exposed to. Where the information is in line with an existing belief or preconception, confirmation bias simply compounds the issue.

Psychology aside, often the easy answers we seek are expediently classified as ‘best practice’. Indeed, when we ask “what is best practice?”, it can often be a thinly-veiled substitute for “what’s the easiest answer?”. The same question can similarly mask sentiments such as “if it’s good enough for company X, it’s good enough for us.”

Ten or twelve years ago, whenever a tricky decision point was reached on a website project, it wasn’t uncommon to hear someone chip in with: “Well, what do Microsoft do?”. The assumption was that a) Microsoft had all the right answers (it didn’t) and b) that the context was the same (it almost certainly wasn’t).

‘Best practice’ can be an excuse for all manner of evils. Less than two years ago, the use of infinite scrolling – where a webpage would continuously load new content every time you reached the bottom of the page – was running rampant. It was a trend, and one that became de rigueur for any new website that wanted to appear ‘with it’. No doubt somewhere at some point it was referred to as best practice. But like any trend it is increasingly being left behind as we discover lots of sound reasons why it is bad practice. In the case of infinite scrolling, it meant that the website footer – use of which is definitely best practice – was forever just out of reach of the user.

No sooner has one best practice established itself than extrinsic factors change, and something else (albeit not necessarily better) comes along to take its place. What we call ‘best practice’ evolves in parallel with technology, user behaviour and established norms across industries. It should never follow trends blindly. Sometimes trends are completely in harmony with firmly established principles. Sometimes they are not. And so it is with best practice.

There’s an old meme in the design industry that goes something like this: Good. Quick. Cheap. Pick any two. Meaning: good, quick design won’t be cheap, and so on. To paraphrase this for user research and customer insights: “Easy. Quick. Right. Pick any two.”

Whenever best practice is referenced as rationale for decision–making, the immediate follow-up should always be: “Okay. But is it best practice for our customers?”. Context is everything. Easy answers might be cheap, but they won’t necessarily be right.

Learning from others through benchmarking, both within and outside your sector, is an important element in establishing a direction of travel. However research and validation with customers are the true path to success.

Don’t let a quick and easy path to answers under the guise of ‘best practice’ either stifle your organisation’s chance to innovate and excel, or inspire outcomes which simply mirror your organisation’s in-house desires. The path to the dark side this is.

Categories
Innovation UX

There’s nothing new about innovation

If you are familiar with phrases such as ‘early adopter’, or the bell curve model of adoption then – whether you know it or not – you are also familiar with the work of Everett Rogers.

In 1962 (53 years ago at time of writing), Rogers published Diffusion of Innovations which contained not only enduring ideas like the bell curve, but a wealth of material that continues to be relevant in a world hungry for the silver bullets of success.

Still in print, and in its fourth edition, Diffusion of Innovations remains a central text when it comes to assessing the potential of innovations in the marketplace. Building on research gleaned from over 1500 field studies, Rogers identified that an innovation could be rejected, and therefore fail, based on one or more of the following characteristics:

Relative Advantage – the degree to which an innovation is perceived to be better than something comparable

Compatibility – the degree to which an innovation is compatible with existing habits or behaviours

Complexity – the level of complexity associated with adopting the innovation

Trialability – the level of opportunity to test out or trial the innovation

Observability – the extent to which the results of an innovation are visible to others (particularly peers)

Many producers or technologists believe – indeed, need to believe – that they will be the next Steve Jobs or Henry Ford. Quite a number will claim that Rogers’ criteria don’t apply to their product, in the same way that Jobs or Ford would never be tethered by rules of any kind.

However the vast majority of everyday innovations tend to build on and improve something already familiar. Taking the iPod as an example, the idea of a portable music player was well established – the Walkman was first introduced in 1979 after all. Indeed, the Walkman’s innovation had been to miniaturise the music player, as the opportunity was already manifest. Sony’s Akio Morita had observed young people in New York carrying boom boxes around on their shoulders and recognised the desire for portability.

Further, the iPod was not even the first of its kind; that honour belonged to The Audible Player from Audible.com, which arrived on the market almost 4 years before the launch of the iPod. By the time Apple’s response appeared there were over half a dozen MP3 players available on the market.

So the ‘innovation’ of the iPod was building on already established behaviours and needs. Where Jobs’ vision triumphed was in the exquisite execution of the concept, something that other companies didn’t come close to. Viewed through the lens of Rogers’ criteria, the iPod matched many of the requirements:

Relative Advantage – it had a much greater capacity than the Walkman

Compatibility – the idea of personal stereos was ingrained in the consumer mindset (although the iPod brought with it the iTunes ecosystem)

Complexity – the iconic scrollwheel made the task of navigating huge amounts of content not only easy, but pleasurable

Trialability – many high street stores stocked the iPod

Observability – the white earbuds were instantly identifiable, and formed the centrepiece of advertising campaigns of the time

Assessing the potential for innovations to succeed is prime territory for user experience work. UX is often mistaken for user interface work alone, or dismissed as a trendy nom de plume for what we should simply call ‘design’. But in 1962, Rogers was already speaking a new language:

Determining felt needs is not a simple matter, however. Change agents must have a high degree of empathy and rapport with their clients in order to assess their needs accurately. Informal probing in interpersonal contacts with individual clients, client advisory committees to change agencies, and surveys of clients are sometimes used to determine needs for innovations.

This was new thinking in 1962, and it remains a challenge for businesses and organisations today. But it is evergreen advice, and words that we in Fathom adhere to day and daily.

In the scramble to innovate, don’t overlook the fundamentals.

This post first appeared on the Fathom_ blog.