Leaving Flash behind

Even before the Apple vs. Adobe bunfight erupted, we were beginning to assess every rich media requirement on every one of our projects to see if it could be done without Flash. If it could be done in code, then we would avoid Flash. That is now standard practice.

Flash, as the default choice for delivery of rich media, has had its day. For us at least.

Adobe has had plenty of time to look for alternative directions for its product, but has preferred to continue the ‘all-things-to-all-men’ marketing approach. “It’s a video deployment platform! It’s an animation tool! It’s a fully-fledged development platform! It’s… give us a while to think of something else!”

For me, and for many others, the leap to AS3 was too much. In trying to appeal to developers, Flash lost part of its core user base: those designers who had managed to develop rich media experiences in AS2. Adobe eventually realised this, and spent a lot of time trying to convince us that AS3 was the way to go.

I’ll hold my hands up: for years, I threw myself into conquering Flash, so sure that it would yield a fantastic future. But let’s be brutally honest: Flash was never quite the universal, cross-browser, cross-OS platform that Adobe would have had us believe. It never really worked quite that way.

One of my abiding memories of working on advanced Flash projects was that moment when you suddenly felt very alone. You had promised the client some feature or other, completely convinced that of course Flash would be capable of it. And then you would try and implement something relatively straightforward. Then you would get stuck. Then you would hit the Live Docs, the Flash forums, and everything would go silent. You would find yourself pursuing the issue in some obscure forum, the web equivalent of a country road with grass growing in the middle. You would get that sinking feeling somewhere in the pit of your stomach, and realise that you may actually be asking something of Flash that it was never able to deliver. I had that feeling too many times, and I never want to go there again.

Adobe and its bloggers may post all they like and try to retro-evangelise that we were never meant to develop certain work on their highly legitimate platform. The fact is that for years we were asked to buy in to the idea of Flash as a site development tool. Hundreds of dire ‘Macromedia Site of the Day’ examples – usually the more complex, the better – will testify to what Flash’s masters were trying to promote. Adobe never seriously tried to change the agenda, except when early warning signs started to show, and video delivery became Flash’s saviour.

The simple fact is that the medium has had its time. At 12 years, it has had a good innings. The truth is we are finding other ways of doing what Flash used to show off about. In the early days of this migration, it even irked me. “Why is all this stuff suddenly cool to do in code, when Flash has been able to do it for years?” I would whine. Then the penny dropped.

I am no anti-Flasher now. I’m a realist. So long Flash. Sorry Adobe. It’s not you, it’s me.

Simplicity vs minimalism

My work tends toward simplicity to the point that I sometimes think it marks me out as some kind of design luddite.

As I grow older I’ve become increasingly aware of how difficult it can be to simply stop designing. There is a time when a design – or, to be more accurate, a style – is done, finished.

Simplicity is the art of removing complexity (my definition but apologies if this has been subconsciously lifted from an official definition from elsewhere). For the user, complexity is a usability killer. For the designer, complexity is a time drain and a creative anathema. The more complex a design is, the less sustainable it becomes and the more work is required at later stages to uphold the extreme level of detail or clutter that has been established.

Simplicity is not minimalism. I see minimalism as an aesthetic, whereas simplicity is an imperative. Very different. And simplicity does not mean “plain” or “bland”. Just simple.

I’m barely touching on this huge subject, but I’ll revisit it again and again, I have no doubt.

Here’s a good point well made which I think illustrates what I’m getting at: http://www.usabilitypost.com/2010/07/23/a-mild-case-of-borderitis/

There’s something about airports

The architecture of airports makes them enjoyable places to be. I rarely travel by air, but I do know that it’s possible to waste hours at a time in them. It seems that ever since the Pan Am terminal at JFK airport in New York appeared, airport terminal design took a turn for the transcendent. I can’t say that Stansted Airport is up there with the best, but it’s beautiful by night.

[Image: Stansted Airport by night]

Vancouver 2004/2010

It’s been kind of odd watching the 2010 Winter Olympics. I was in Vancouver at the time that the city won the bid, back in 2004. Although on a holiday, my professional interest was soon piqued when a story related to the Games’ logo started to emerge.

The Vancouver Olympic Committee (VANOC) launched a competition to design the Games’ logo, open to amateurs and professionals alike. Nothing short of a huge speculative pitch. I documented the story for CSD’s ‘The Designer’ magazine and the piece was kindly reproduced by the GDC on its website (now offline).

The dogged determination shown by the GDC at the time to ensure that professional designers were represented intelligently and fairly was fantastic to witness, and renewed my faith in design associations at a time when the virtual community provided by the web was quickly making them obsolete.

The logo competition controversy aside, there were some neat little design touches that accompanied the Games, right down to the competitor bibs for Games events. CBC’s Olympic website was a concise and comprehensive reference point for events, results and Games news. The winners’ podium likely went unnoticed by most, save for those fortunate enough to step up onto it.

The success of the Games overall was a credit to Vancouver, and the infectious enthusiasm the city brought to the party eventually won over even the most cynical of observers.

Too late the hero

This is one of a number of archive posts, drawn from articles I wrote between 1999 and 2001. Some read as terribly naive now, but I thought them worth retaining for posterity.

—–

Cast your mind back, if you will, to the mid-nineties. The internet was emerging as the next mass communications medium, and a gauntlet was being thrown down to business to come to terms with a new way of reaching potential customers and clients. A more potent challenge was resonating around the graphic design industry. For decades, designers had dominated the communications field, handling and refining the information we digest in all its myriad forms. In what is now generally accepted as the ‘Information Age’, one would think that designers might have come into their own and stamped their authority on an innovation that was clearly to become widespread in the short to medium term. Sadly, not so. The challenge laid down by the internet was one that designers singularly failed to meet.

We designers can be a vain and – if truth be told – lazy bunch. We are reluctant at times to admit that we need to augment our pool of knowledge and offer anything over and above our basic talent, a truth cruelly highlighted with the web’s arrival. To publish online back in 1995/96, we had to roll our sleeves up and get our hands dirty with some rudimentary computer code. Those now familiar with HTML might be afforded a quiet laugh at it being labelled “code”, but it was a hurdle that had to be overcome nonetheless. This was a reality that the majority of designers failed to recognise, instead opting to wait until accommodating software packages arrived to make things easy for them.

In that moment of hesitancy (two years in truth), web design became the domain of the “all-rounder”. Web companies sprang up offering “design” that in any other medium would not even be afforded the name; anyone even remotely capable of producing a web site was deemed good enough. Thus the standard of web design was set at a very low mark whilst designers at large stood idly by, scoffing at the bad results, but unable and unwilling to put things right. The residue of this period is still with us; web sites are by and large poorly designed. To be blunt, design was devalued. And we, the design industry, let it happen.

Our industry was in a state of denial. Only recently (read: two years) have the mainstream design magazines really begun to take web design seriously; only in the same period have design agencies started to take on web-savvy employees, and belatedly offered web design as a service. While I would stop short of suggesting that the boat has been missed, I would stress that there is a fair amount of frantic paddling to do to catch it. Young designers who have grown up with the medium find that their traditional mentors, the senior designers and creative directors, have no significantly greater experience in the field, therefore anything that looks “cool” is deemed suitable to put live on the web. A million shoddy Flash intros are testament to that. The user experience is rarely, if ever, considered.

If I appear to be over-dramatising the significance of this blip in the recent history of graphic design, then let me give you an example to illustrate my point: for years a print designer’s brief has included at least a little technical knowledge of printing processes, paper stock and inks. In short, a firm grasp on the constraints of the medium. I would suggest that to truly describe oneself as a web designer, surely the same applies: what can and cannot be done? To this end, at least an outline appreciation of HTML is required, along with the basic fundamentals of dynamic scripting languages. Flash, which at one point seemed like a short-cut to web wizardry, now features an indigenous scripting language (ActionScript) that poses as steep a learning curve as many a “real” programming language. And that is just the beginning: usability issues, information architecture and more are now part of the designer’s remit.

This is not designed in any way to be a finger-pointing exercise from some kind of moral high ground. I became involved in web design in 1996; earlier than most, certainly, but sadly for my bank balance not early enough to claim “guru” status and all that goes with it (stand up Messrs. Zeldman, Davis et al). I arrived late on the scene with the rest of our industry. However, my motivation for becoming involved when I did was born of a duty that all designers have, whether they care to acknowledge it or not. I felt very strongly, albeit belatedly even then, that designers should be at the heart of any information revolution. When a new medium springs into existence, designers have a responsibility to become involved, to tame that medium and ensure that design is never allowed to be regarded as mere eye candy. This is as fundamental an issue for each and every designer as I can think of; it challenges our very purpose and role in society. Do we wait for a medium to come to us? Or do we in fact have an obligation to engage with a medium, even if that involves doing so on its terms, not our own? The answer is of course obvious.

Hopefully we are now through the phase when the internet was seen as one huge retail opportunity, and into a time when the ability to inform a global audience is seen as the central principle of the internet – the web is moving on. We as an industry have a duty not only to play catch-up, but to keep pace in future.