One-eyed kings: the masters of innovation

One sure way to spice up any conversation around design or innovation is to note that Apple have never invented any device.

While it may be met with some resistance, the fact remains that they didn’t invent the PC or the MP3 music player. They are not responsible for the tablet computer, the smartphone or indeed the smartwatch. For all their (deserved) reputation as innovators, Apple have yet to debut any mass-market device that didn’t exist previously.

Given that one of the greatest innovators of our age got there by essentially coming to market second or third with their offerings, how have Apple managed to achieve such a lofty status?

“In the land of the blind the one-eyed man is king” – Desiderius Erasmus, c. 1500

Apple’s greatest asset by far has been their fieldwork: relentlessly studying how people behave and discovering what they need, and why. They then do one of two things: either a) successfully define a problem and apply existing technology to solve it in a superior manner, or b) identify inherent problems within existing technologies and successfully solve those.

In doing so, they stimulate latent consumer needs, which in turn trigger demand for their products. This is Apple’s genius, and this is innovation.

The company’s rare failures tend to be situations where they tried to solve problems that didn’t yet exist. The Apple Newton MessagePad, for example, was a tablet by any other name, but it came too soon to an unprepared market. In retrospect, the consumer PC market itself hadn’t yet been properly established; the public hadn’t yet come to value personal computing of any kind, let alone tablet computing. For a modern-day comparison, one has to wonder whether Apple’s reported removal of the headphone jack from the next generation of iPhones is straying into just this territory.

“Build a better mousetrap and the world will beat a path to your door” – attributed to Ralph Waldo Emerson, c. 1889

No matter what a business or product offers, someone else has either already tried it or is currently doing it. To innovate, you simply need to do it better than those others. And by “better”, read “in a more customer-centric fashion”. A surprising number of founders, businesses and organisations don’t appear to have grasped this. Investing heavily in what may be incremental improvements might not set the boardroom alight, but that’s where the gold is. As with design, innovation is a process, not an event. A verb, not a noun.

Correctly defining the problem is more than half the battle in product development. Putting the customer at the centre of the solution is the rest. And to any cries of “but… what about marketing??” in response to that last point, let’s answer with a look at the banking industry.

Banks are so far behind where they should be with their services that it’s tempting to be embarrassed for them. They are prime examples of organisations that have tried to market their way out of problems that should have been answered by simply providing better services. This approach has led them into the unenviable position of being among the least customer-centric businesses in the world. The millions of RBS customers unable to withdraw money from cash machines for days on end would attest to that.

The financial sector is gradually waking up to the fact that design thinking can be applied to services every bit as much as products. Design thinking doesn’t need to be the territory of the ‘big thinker’ or genius designer. It belongs to everyone on a team – including designers.

Of course everyone wants to innovate. And innovation can be managed through a pragmatic process of observation, competitive benchmarking and incremental improvement. Just ask Apple.

This post first appeared on the FATHOM_ blog.

People-centred design™

I’m a little late to the party here, but I’m still bemused enough at the storm in a teacup that I couldn’t let it go. Jack Dorsey’s suggestion that we need to talk about “customers” rather than “users” sparked one heck of a debate and gathered a lot of backing, but it strikes me as having more than a whiff of the PR exercise about it.

The term “users” remains relevant and essential. Anyone with any experience of designing user interfaces knows, for instance, that marketing personas are, and should remain, distinct from user personas. One can inform the other, of course; much good data can be gleaned from well-thought-out and comprehensive marketing personas. But we cannot allow the term “customers” to dominate.

We use devices, we interact with content. Within those two simple statements lie myriad questions that require answers and challenges that need to be addressed. To apply the term “customer” regardless of context is to give undue emphasis to a marketing-centric approach. The art and science of designing for the web has many facets, of which designing for customers is just one.

If anyone practising user experience or user-interface design is so caught up in the science of their work that users become some kind of abstraction, then something is wrong. If that was Jack’s point, I’d be right behind him. We are designing for people.

Jack also emphasises the importance of semantics in support of his argument, but it is a flawed point. Before “customers” is a fit term to apply in these contexts, the word itself would require redefining. He also argues that “the word ‘customer’ is a much more active and bolder word. It’s honest and direct”. It is not. I’d suggest there are many people interacting with their favourite apps or sites who would be horrified to find that they are regarded as “customers”.

Apple has recently put Jonathan Ive in charge of what it has historically called its “Human Interface (HI)” team, a term which, if anything, sounds even more clinical and impersonal than “UI design”. No matter though; Apple know they are dealing with people, with customers, with consumers… with users. Whatever terms they choose to use in internal processes, what really matters is the products that emerge from them. Everything else is hot air.

Leaving Flash behind

Even before the Apple vs. Adobe bunfight erupted, we had begun assessing every rich media requirement on every one of our projects to see whether it could be met without Flash. If it could be done in code, we avoided Flash. That is now standard practice.
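
To illustrate what “done in code” means here, a minimal sketch – purely illustrative, not taken from any real project; the element id and the effect are assumptions – of the kind of fade-and-slide intro that once meant reaching for a Flash timeline, written in TypeScript against the browser’s standard Web Animations API:

    // Fade and slide a banner into view using the browser's built-in
    // Web Animations API; no plugin required. The element id is hypothetical.
    const banner = document.getElementById("promo-banner");

    if (banner) {
      banner.animate(
        [
          { opacity: 0, transform: "translateY(20px)" }, // start state
          { opacity: 1, transform: "translateY(0)" },    // end state
        ],
        { duration: 600, easing: "ease-out", fill: "forwards" } // hold the end state
      );
    }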

Flash, as the default choice for delivery of rich media, has had its day. For us at least.

Adobe has had plenty of time to look for alternative directions for its product, but has preferred to continue the ‘all-things-to-all-men’ marketing approach: “It’s a video deployment platform! It’s an animation tool! It’s a fully-fledged development platform! It’s… give us a while to think of something else!”

For me, and for many others, the leap to ActionScript 3 (AS3) was too much. In trying to appeal to developers, Flash lost part of its core user base: the designers who had managed to build rich media experiences in ActionScript 2 (AS2). Adobe eventually realised this and spent a lot of time trying to convince us that AS3 was the way to go.

I’ll hold my hands up: for years I threw myself into conquering Flash, so sure that it would yield a fantastic future. But let’s be brutally honest: Flash was never quite the universal, cross-browser, cross-OS platform that Adobe would have had us believe it was. It never really worked that way.

One of my abiding memories of working on advanced Flash projects was that moment when you suddenly felt very alone. You had promised the client some feature or other, completely convinced that of course Flash would be capable of it. And then you would try to implement something relatively straightforward. Then you would get stuck. Then you would hit the LiveDocs and the Flash forums, and everything would go silent. You would find yourself pursuing the issue in some obscure forum, the web equivalent of a country road with grass growing in the middle. You would get that sinking feeling somewhere in the pit of your stomach and realise that you may actually be asking something of Flash that it was never able to deliver. I had that feeling too many times, and I never want to go there again.

Adobe and its bloggers may post all they like and try to retro-evangelise that we were never meant to develop certain work on their highly legitimate platform. The fact is that for years we were asked to buy in to the idea of Flash as a site development tool. Hundreds of dire ‘Macromedia Site of the Day’ examples – usually the more complex, the better – testify to what Flash’s masters were trying to promote. Adobe never seriously tried to change that agenda, except when the early warning signs started to show and video delivery became Flash’s saviour.

The simple fact is that the medium has had its time. At 12 years, it has had a good innings. The truth is that we are finding other ways of doing what Flash used to show off about. In the early days of this migration, it even irked me. “Why is all this stuff suddenly cool to do in code when Flash has been able to do it for years?” I would whine. Then the penny dropped.

I am no anti-Flasher now. I’m a realist. So long Flash. Sorry Adobe. It’s not you, it’s me.

And then there was… iPad

I can’t offer an objective opinion of Apple’s new progeny; I’m an Apple fanboy, a design groupie and a tech non-purist.

Apple’s ability to distil the technological zeitgeist of the near future into exhilarating new products is indisputable. I remember being underwhelmed by the iPod when it arrived. I had barely a handful of MP3 files on my hard drive at the time, and I still bought CDs by the dozen. Within 12 months I had an iPod, and I bought my last CD in 2004.

I haven’t quite succumbed to iPhone fever yet – I’m still a pay-as-you-go luddite when it comes to phones – but the iPad could well find its way into our home. As a device for the casual consumption of digital media it looks perfect to these eyes.

The gamut of reactions to the iPad has been entirely predictable. One reaction I read left me staggered, though. Twitter’s Alex Payne is “disturbed” by the iPad, suggesting it is the sunset of the tinkerer: “if I had an iPad rather than a real computer as a kid, I’d never be a programmer today.” Wha? Huh?

The opinions and ideas Payne trots out in his blog post are beyond ludicrous. Will Rock Band kill off real guitarists? Did the Walkman kill off DJs? Payne is completely missing the point: the iPad is not a PC replacement. It may be for some; for casual, low-level users who are no more likely to hustle some C++ than they are to learn karate from scratch, it will fit like a glove.

The iPad is designed for the convenient consumption of digital media, period. The user experience is likely to be, as with most Apple products, superior. The same inquisitive folk who might feel drawn to tinker with PCs will be inspired to look under the hood and create something for the platform. And guess what? For those people, there are SDKs, PCs, Macs and all the same tools there always have been.

To suggest that innovation in the development industry has somehow been assassinated because Apple have added the iPad to their product line is simply inane.

Relax, Alex: PCs still exist. All Apple have done is add a new member to their ‘nuclear family’ of products.