Nineteen billion is a big number. Dr. Evil big. And like the Instagram purchase before it, the WhatsApp acquisition betrays Facebook’s utter desperation for relevance and, in contrast to pundits’ breathless projections, signals a likely end to Facebook’s mobile survival.

If you don’t work for Facebook, and you’re not invested in it, you are probably comfortable acknowledging the obvious signs that the Facebook social network is losing relevance.

As Facebook’s users age, and become associatively uncool, the network has become less a place where young, influential, upwardly-mobile users go to “hang out”, and more a place where they “reconnect”, get updates on high school reunions, and share the occasional cute cat picture with grandparents.

Facebook made sense in a web-browser universe, back when digital social connections were still new, few, and cumbersome. But users don’t live in that world anymore; they have increasingly numerous and convenient options for connecting. This has sent Facebook scrambling to find relevance, breaking itself into digestible mobile parts only to find itself competing with a million other apps with similar attributes.


Yes, I’ve seen the amazing numbers and projections. Every investor has a slightly wide-eyed, positive spin on the WhatsApp deal, lining the trajectories of popular mobile apps up next to the web’s old guard. But I’m still shaking my head, certain the cards are not stacked in Facebook’s favor. Not because the current numbers aren’t impressive, but because those numbers exist in the eye of a hurricane. They only make sense so long as the landscape remains recognizable and the natural laws consistent – so long as we don’t acknowledge the inevitability of exponentially disruptive players.

The mobile world is fundamentally different from the one Facebook was born into. The metabolism of business is rapidly increasing before our eyes. Powerful and unpredictable forces swirl around every business today – let alone those that exist solely on objects of convenience, like mobile apps.

The democratization of development and distribution makes the mobile app ecosystem a whole new world. Never before in history have there been so many competing software developers with so much power to utterly disrupt. The distance between market dominance and failure is now one person, and a day.

Add to this that the app store itself, as the portal of distribution, concentrates attention on the value of new discoveries – on trying new apps that might be better than, say, whatever you use today. Face it, app stores are like news outlets; old news isn’t good for business.


And here you have a perfect storm – one specifically designed to remove dominant players from power. Once you’ve enjoyed a run, the entire ecosystem is optimized to make room for the next thing.

Take the case of Dong Nguyen, a developer in Vietnam who created Flappy Bird. In a few days. Single-handedly. One guy. It quickly became the most downloaded game in the iOS App Store, and the Android version, released later, was catching up. Was that predictable? Did Rovio or King see that upset coming? How many people stopped playing Angry Birds to addictively play Flappy Bird? Lucky for them, Nguyen inexplicably pulled the app from both platforms – a virtual get-out-of-jail-free card for every other contender. But see, it was predictable. Because this is the very nature of the mobile app landscape.

Facebook’s $19 billion deal does not appear to take into account the high likelihood – the inevitability, rather – that some deceptively simple upstart app, like WhatsApp and Instagram before it, will come along and do something different, better, cooler. Just enough that it gets attention, gets downloaded, spreads, and eclipses or replaces the incumbents.

Mobile apps are not platforms; they are disposable instances, they are trends. The sturdy limitations that held Microsoft Office in place for so long do not exist here. Nor do the ones that have kept Facebook warm on the web. Every popular third-party mobile app is destined to face an unprecedented, massive, and relentless onslaught of unpredictable new ideas from divergent competition.

I’m not sure how many multi-billion-dollar app acquisitions Facebook is prepared to close over the next 5–7 years, but I can tell you with absolute certainty that WhatsApp is far from the last acquisition Facebook will have to make to retain a position of relevance in mobile users’ lives. If sheer acquisition of disruptive apps is to remain the sole basis of Facebook’s mobile strategy, it’s on a very expensive treadmill.

As professional creatives – designers and artists in any medium, staff or freelance – we tend to share a common career goal. After entering the workforce and working in our chosen field for a number of years, we imagine naturally progressing to directing, where we will inspire teams of people to do what we have done. We may imagine loftier goals than that, but surely directing is part of our journey.

Although often eager for this promotion, few creatives understand the implications of directing, and therefore fail to prepare themselves adequately for the role.

Let me state emphatically – the hardest thing any talented creative person will ever have to do in his/her career – and truly nothing is fraught with more hidden challenge – is face the moment of transitioning from being a person who makes things, to a person who directs people who make things.

I have watched and mentored countless creatives through this transition, and at 50 I still face its challenges myself. As such, I can report that upon finding yourself in a directing role, many of you will not be happy, won’t be any good at it, or both. At least not for many more years than you expect.

And that’s because directing is a completely new medium, one that has almost nothing to do with the creative medium you are an expert in. You will likely find yourself, painfully, all but starting over in your career; you will have to let go of your reliance on so many of the expert skills you have acquired; and, as when confronting any new medium, you will have to confront not knowing the basics.

Despite expectation and intuition, directing is in no way a natural progression from wherever you are as a creative today.


Your Internal Director

As a maker of things, as a designer or artist, your workflow is often intuitive and non-verbal; you feel your way. It’s how virtually all of us started – by making things ourselves, satisfying our inner voices. You form ideas, you sculpt – internally debating, making decisions, and solving problems as you feel best, all in the flow, without uttering a word or articulating a thought. If the work doesn’t look, feel, or sound right, you simply know it at a glance. You don’t have to articulate why – you only need to respond to that powerful creative intuition you have developed, trusting your hands, your growing skills, and your feelings to take you to the answer.

There is nothing lost in translation at each step because for you it happened organically.

Often it’s only when your work is completed that you step back and analyze why it works. But either way, in the end it does.

So says your intuitive internal director.

Directing: The Art of People

Most assume that because they know how to design or make things that they are suited to direct, succumbing to the illusion that directing is merely a progressive step.

However, what you soon discover is that when you direct people in making things you don’t get to use most of the skills that brought you here. The tools that you spent 10 or more years cultivating. You soon discover that you’re standing there holding a new palette, new tools. The new tools of your trade are interpersonal relationships, the ability to sense feelings, to encourage artists to do what they do, to analyze and diagnose creative, strategic and emotional conditions and articulate them back – all with words. Words. Words.


Remember that intuitive, internal director? The one who worked so confidently, who felt its way, who, without a single utterance, instructed your mind and hands to create stunning works of art? That director must now step out, stand on stage, and articulate everything it thinks and does – with words alone – in such a finely tuned way that it encourages this trusting ego, or that passive-aggressive, defensive ego, or the gentle, sensitive ego over there.

Every creative I have ever known grossly underestimated the difficulty of this; they mistakenly believed directing was a natural evolution, the next step in being the artist they are. Which is ironic because, truth be told, a large number of us became artists specifically because we were not good at interacting with people. But despite this, I think most creatives naturally believe they would excel at directing.

We’re all quite used to being directed ourselves, and as the receiver of someone else’s direction, it just doesn’t seem all that hard to do. Good directors and clients appear to do it effortlessly, and bad ones (of which we encounter many more) fail so plainly that you feel naturally emboldened to do better. The problem is, this game isn’t about doing better than the bad ones. The game is doing it great. And doing it great means, among other things, that you must be terrific at motivating, challenging, inspiring, and analyzing people.

Directing Someone Else’s Good Idea To The Target

Aside from turning interpersonal relationships into creative solutions, there is another aspect of directing that is often a very new experience: Encouraging someone else’s creative voice to occupy the space.

For someone who has come to define his or her aesthetic sensibility through hands-on action, the act of letting go of execution – while still being responsible for the outcome – of motivating someone else to create great work in their own creative voice, not yours, is a daunting challenge.

I now know that when the team’s work is poor, 9 times out of 10 it’s my fault. And when their work is good, 9 times out of 10 it’s not because of me. That’s directing.


And it’s not because we, as directors, don’t occasionally have good creative ideas, but because the director’s tactical creative solutions are not those that finally manifest. Sure you inspire, and guide and you might even get the team to design down a path that you originally conceived, and you are ultimately responsible if the work sucks. But the work, the image, the site, the art is not yours. It can’t be. The artwork simply is somebody else’s – and it must be allowed to be. It has to come from their heads. They hold the brush, and their heart needs to move it.

There is a close corollary when directing film and theater actors.

A bad theater or film director will give his actor a “line-reading”. This is when the director acts out dialogue from the script by speaking with specific emphasis, and then directing his actor to repeat the line with that emphasis. This is micromanaging, forced, and does not result in a realistic, believable performance.

A great film or theater director will never have to tell an actor how to say a line. That does not mean he won’t manage to get the actor to say the line differently, however. Our hypothetical great director will sit down with the actor and discuss the character – he may revisit the character’s back story, or the impact some event must have had on the character’s current emotions. A dramatic event, the context of the scene. The director may further sense a personal conflict in the actor himself, one the director must emotionally counsel the actor through. Armed with that context, feeling, and emotional therapy, the actor is then able to do his job – to lose himself in the real emotion, to use his own instrument to become the character. When the actor is truly in character – when he believes what he says, with the emotion of his back story in his heart – the performance will feel real, and it will be consistent. And any emphasis on that line of dialogue, and all the others, will come from the actor alone.

The same is true for all great directors, no matter the medium. Designers need to understand the goal, the intent, the strategy, the feelings the piece needs to convey. The artist will likely need emotional counsel from time to time – sensitivity to the challenges she faces. And the director must trust the voice of that good designer. If he does not – if he says “do it like I do, do this, do that,” if in exasperation he sits down and creates a piece of art to show his designer what he means – he is essentially giving his designer a “line reading”. He is cheating. And he is undermining his designer’s ability to be great, to do the best work she can do.


Often new directors gravitate back to their creation tools, simply because it is how they think. It’s how they have grown up communicating. The art-making tools are a young director’s comfort zone. Even if you don’t think that’s why you’re doing it, it usually is. It feels safe. You know where you stand when you wield Photoshop, or whatever your tool is. You have power there.

But when you let go, when you don the director’s straitjacket and try to merely talk… well, what does one do? How does one “create”? If the work isn’t right, how does one get the team from point A to point B? How does one get the artist to change the art without telling her what to do? Does one repeat the original direction – again? Does one simply reject bad ideas? Does one compromise? Does one make forms, or charts, or plans? Does one come up with ideas for off-sites to motivate the team? Does one make sure everyone has the best equipment? What’s the job?

The idea that the director’s contribution has little or no physical deliverable is often an alien sensation to someone who has been promoted from making things.

Skills and Credibility

With all this talk about being “hands off,” I hope I haven’t misled you into believing that a director does not need a solid foundation of hands-on skills in his background. Having watched so many directors from different fields and backgrounds, I’ve come to realize that those who have done the work before, who have solved problems like these many times, who could otherwise sit down, take up the tools, and do the job now – these directors are almost always better. (Again, assuming they apply the knowledge but refrain from doing the work themselves!) They know what their team is going through. A director who lacks such direct hands-on skills neither understands the nuanced challenges his team faces nor tends to command respect and belief from his team. The extent to which the director or client has not done this type of work is the extent to which the creative team will likely doubt the integrity of any direction he provides.

It’s why clients and directors who lack creative or hands-on backgrounds but who provide creative comments are notoriously lampooned and ridiculed by creatives in all fields.

Authority without experience. Creatives are a cynical lot. And few things trigger their cynical response more than an inexperienced client or director giving feedback.


And this brings me to the last main challenge for most directors.

Navigating the Corporation

Even the title triggers measured sighs and eyes to roll. But this is another arena that often comes as a shock, and where great directors can excel.

Almost all creative jobs exist within a company. Very few of those companies – even among ad and design agencies – are truly designed to nurture creatives’ needs, disciplines, and sensibilities. And it’s here, in the organizational world of profit and loss, of business plans and strategy, of budgets and Excel spreadsheets, that creative directors sink or swim.

Nothing elicits such a strong show of cynicism as when corporate machinations impact the creative team. If you run a company, you are all too familiar with the fear, uncertainty, and doubt that seem to plague your design teams. You feel they often make unrealistic demands, disconnected from what it takes to run a business. They complain when things change; they always seem to look on the dark side when the company grows or changes, never seeing the positive.

But you need to know: your creative teams are not just irrationally “whiny”. They behave this way because creatives, by and large, really are victims of the corporate world.


See, the reason creatives enter the fields they do is that they were designed for that work. It’s how their brains work. And being designed for that often (though perhaps not always) means not being designed for other types of roles: strategy, management, accounting, and sales, for example.

Unfortunately for creatives, creating great artwork does not automatically explain or justify its benefits to the business. The disciplines and skills involved in being a great creative do not make one great at conceiving of, and arguing for, organizational change that will both improve the work they produce and make the company more money.

Not the way, say, salesmen can. Or strategists. These people can assemble a compelling argument and compare the numbers – they can show how the bottom line will improve by funneling more money and resources to their departments in ways that make their lives easier and allow them to do better work. They are verbal creatures. They think in quite literal, logical terms. And they can sell their ideas. Their job skills actually align with organizational operation.

But creatives generally don’t have those skills. They are intuitive thinkers. They have feelings that manifest through their hands into objects and artwork that none of the rest of us can fully explain but that we love and appreciate.

So it goes that when things happen in a company – when teams move or get reorganized, when budgets and schedules are allocated – the creative team is carried along for the ride, in whatever way some executive, armed with reasoned arguments from other, more articulate teams, decided was best. This often results in suboptimal conditions for the creative teams. When they are lucky, the creatives have a team of executives who look out for them. But that is most often not the case.

So creatives the world over are victims of the corporate system. And they act like it.

This is where a solid director has an opportunity to make a difference: navigating the corporate world, selling into the business, justifying the need for greater budgets, schedules, and resources. And defending the creative product itself in the face of dissension.

If you can do all this, your creative team will do better work – and to me there need be no more reason to do this part well.

But what exactly does any of this have to do with those wonderful creative skills that brought you to this role?

Very little indeed.

It’s just another unexpected challenge that most directors discover after the fact, and struggle against for years.

Love What You Do

Like most things, the transition to directing often works itself out eventually – if you enter with your eyes open, aware of these otherwise hidden factors, and remain committed, always willing to learn a new lesson.

Mainly though, and I’m sorry if I sound like a broken record, it’s important to be aware that directing is not a natural step in the progression of your role as an artist.

If you love your art, if you love designing – if your heart thoroughly enjoys the skills you have developed, my emphatic recommendation is: don’t be too eager to leave that behind you. Because in many ways – that is what directing results in.

Conclusion

To recap, there are four main qualifications you’ll end up confronting, if indeed directing is your calling. You’ll have to:

Know the art and have mastered the hands-on skills. If you can’t make things yourself, if you haven’t done it before, you don’t really know what your team is going through – you’re guessing – and therefore can’t direct well. Having these skills behind you is how you will relate to your teams, how your feedback will carry credibility, and, more importantly, how you will gauge what they are and aren’t capable of.

Become an expert in interpersonal relationships. People are now your medium – where the art form itself no longer is. You must be able to read people’s concealed emotions, you must intuitively know what they need from you and from others to do great work. Your own ego has little place here. You must have nothing to prove, you cannot be defensive. You must be a therapist and a leader. If this one qualification doesn’t come naturally to you – directing may not be up your alley.

Direct with context and words, not “line readings” and hands. You must be a strong speaker – you must be able to form and articulate thoughts that are valid and make sense. You must wear the director’s straitjacket, able to motivate and redirect your teams without doing their jobs – they must be allowed to own and invent the solution. They must be allowed to create the art. If you do it for them, and it does not manifest from their consciousness, their ongoing performance will be weaker.

Navigate the corporate organization. You will have to defend your team’s creative ideas in such a way that clients and executives can buy in to the creative executions. This is about much more than the “pitch”. You need to be able to explain to them how it improves their business. You need to defend your team when corporate changes are likely to impact them. You need to be able to wrangle the corporate machinery to your team’s best interest.

This is directing.

It’s all about the art – but the art is not your medium. Now your medium is people.

And that is why creatives who’ve advanced to directorship often find themselves longing for the days when they made things.

If you want to identify an embarrassing trend that will iconify outdated, wrong-headed web design circa 2012-2014, you need look no further than this.

Though probably not in the way you expect.

For the better part of two years, and largely ushered to popularity on the back of scroll-friendly platforms like the iPad, scrolling has become one of the most useful – but most sorely abused and overused – interface tools available to web developers today.

Like a lot of people, I breathed a sigh of relief when it became clear that the tide had changed and the dark ages of “above the fold” had lost a fair bit of their gravitational strength – that scrolling had finally absorbed its fair share of the design conversation. Always important, but more articulately understood today than in years past, the fold just doesn’t have to work as autocratically as it used to.

Today, scrolling unfolds and meters stories and arguments as it was always intended to.

That said…

As often happens in the world of interactive trends, give them an inch and they’ll take a mile. And for script-gimmicky developers (undoubtedly suffering from Flash withdrawal), scrolling has entrenched itself as one of the industry’s latest misappropriated novelties.

Site creators have long embellished the basic scrolling function by creating parallax effects – where layers move at different increments; sometimes to create subtle, pseudo 3D effects.

And I bet you thought these were the sites I meant to deconstruct. Well, not today. The parallax effect is admittedly overused, but it’s generally ambient and doesn’t overtly undermine the UX or the content it carries. Despite parallax, users still scroll as expected, content may be consumed as intended, and no one is unduly surprised or confused.


No, today I’m talking about the sites that take it further. Too far. To the point of utterly undermining the content and user experience. These are the most embarrassing of acclaimed executions.

These executions are characterized by something I call the “scroll-powered movie”. Projects where the scroll function is employed to advance lengthy animated sequences to tell a story. And it’s my opinion that use of this technique reveals a weak understanding of the medium.

I’m regularly bemused at the poor judgement web award groups display in selecting sites that are so clearly off the path to our future. It leads me to believe that these organizations have little in the way of a philosophical understanding or stance on interactive language to inform their decisions, instead apparently basing their awards on the interactive equivalent of “ooh shiny”.

The Quest for Consistent Speed

As long as mankind has had the ability to record time-based images and sound on a medium that could be stored and played back later, we have well understood the need for a consistent, repeatable playback speed. Record players, tape recorders, movie cameras, projectors, VCRs and even video codecs all had the same dependency on consistent playback speeds.

If the speed of playback was not identical to the speed of the recording, content was presented inaccurately.

In analogue audio, play back too fast and you got the chipmunk effect (increased pitch and tempo); too slow and you sounded like James Earl Jones. On film, a well-paced dramatic scene would lose all dramatic tension as the characters zipped around like Keystone Cops. Even worse, inconsistent speeds result in all manner of warbling and stuttering. And with rare exception (say, the specific intention to reproduce poorly), none of this is “good”.

But welcome to 2014, where such past obviousnesses are overrated – hey, we’re in the future now, right? We use computers. Lessons from the past have no relevance here. /sarcasm

Interactive, at its “true-use”, is not about linear, prerecorded experiences. It’s an art form based on gesture and response.

Even so, we often consume linear content within the context of an interactive experience.

And it’s here, between interactive design, and linear self-play, that so many site creators continue to struggle; failing to find rational hand-offs between these sometimes opposing concepts.


This struggle is iconified by the scroll-powered movie.

Despite its rampant popularity, the scroll-powered movie never (ok, rarely) serves a useful or even aesthetically superior purpose. In fact, as I will show, it usually diminishes the value of the content provided in these pieces.

(I say rarely, because there can be practical uses of the interaction, if, say, the user were enabled to carefully analyze motion or footage to some practical or aesthetic end: for example the way a film editor scrubs video to find a specific cut point. The problem is, the vast majority of acclaimed instances of this practice are not of this rational sort.)

How it Breaks

The scrollable “movie” contains a story, sequence, or idea that is expressed via a series of frames, or positions that were intentionally designated – laid out at specific increments or percentages from one another – to create a meaningful sequence.

But the careful relationship of all those increments and percentages is ultimately worthless in random users’ hands.

A user is in the habit of scrolling however he or she prefers: fast or slow, jerky or smooth. Scrolling speed further varies wildly from device to device and from system preference to system preference. So immediately a wildly unpredictable “random speed generator” is introduced. Other technical factors play a role here too – browser, processor speed, network – all sorts of things impact the ramp-up, momentum, speed, smoothness, and stop points of the scroll function. In short, there is an absolute guarantee that scrolling speed and action will be wildly unpredictable.

And yet all of that was generally functional until our young creator got the bright idea of tying those unpredictable variables to the speed and progression of his linear movie.
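Mechanically, that “bright idea” reduces to making the displayed frame a pure function of scroll position – which means playback speed is whatever the user’s scroll speed happens to be. A minimal sketch of the pattern (the function name and numbers are mine, not taken from any particular site):

```typescript
// Sketch of the "scroll-powered movie" anti-pattern: the visible frame is
// derived directly from scroll position, so the movie's tempo is entirely
// at the mercy of how fast (and how smoothly) the user scrolls.
function frameForScroll(
  scrollY: number,      // current scroll offset in pixels
  scrollRange: number,  // total scrollable distance assigned to the movie
  frameCount: number    // number of frames in the animated sequence
): number {
  // Clamp the scroll position into the movie's range.
  const y = Math.min(Math.max(scrollY, 0), scrollRange);
  // Map the position linearly onto the frame sequence.
  return Math.floor((y / scrollRange) * (frameCount - 1));
}
```

A fast flick through the range skips most frames outright, while a hesitant inch-by-inch scroll replays the sequence at a crawl – exactly the loss of pacing described above.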

The comical result of this brand of “experience” is that users inch, inch, inch their way through linear animated segments that would have benefited hugely from consistent, smooth action – and at other times unpredictably zip through, skipping and stuttering past important segments, wholly missing key ideas and moments, and losing all sense of dramatic pacing and timing.

To these site creators one might ask: what did you think you were giving the user control of, and to what end? What value has the user derived?

The lesson we should take away from this kind of UX failure is that we must functionally honor the true nature of our content.


If what you have conceived is a linear movie – one that utilizes lengthy (more than binary or iconic) animated sequences to tell your story – then honor the inherent nature of that, embrace the linear “movie”, and let the machine manage your ideal speed and pacing. With rare exception, your linear story is never going to be improved by handing your user an inconsistent, kludgy interface and saddling them with the job of managing consistent and smooth playback – in short, with being your virtual projector motor.

There is nothing wrong with letting a movie be a movie. It’s how you handle users’ potential desire to interrupt it that matters.

Embracing Linearity – A Successful Variant

Despite the failure of these continuous scroll-powered movies, there is a different scroll-based movie model that does work, one that more effectively utilizes the function to advance linear sequences.

Here the true, linear, movie-like quality of a number of discrete animated segments is fully honored, even while the user is given the freedom to advance or back up within the greater story. In this case scroll is used as a trigger instead of as the virtual motor: scrolling simply triggers an animated sequence, which is then played (powered by the machine) at as smooth and consistent a speed as possible, until it comes to rest on a predetermined idle point. Although users might be momentarily disoriented as control is wrested away, the feeling passes quickly. Here, one discovers that scrolling has become a page-turn or “next” button.

In this first “scroll-as-trigger” model the vertical response we are accustomed to does not exist. The subject (a Mac Pro) appears to exist continuously on screen animating in arbitrary ways, somewhat in contrast perhaps to the expectation of scrolling up and down.

But closely related to this is a second model, where the act of scrolling does move the subject up and down, but in a semi-automatic way – here again in a triggered, page-turn manner.

The system, once instructed by the user to scroll, ignores the user’s increments, and does the work of setting the speed and stop point of the scroll – usually ending when the next “fold space” is aligned perfectly in the browser window.
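That semi-automatic behavior reduces to a snap computation: wherever the user's gesture leaves the viewport, the system animates to the nearest fold-aligned offset. A sketch under my own naming (the animation itself, with its speed and easing, would be handled separately by the system):

```typescript
// Sketch of fold-aligned snap scrolling: given where the user's gesture left
// the page, return the scroll offset the system should animate to, so the
// next "fold space" ends up perfectly aligned in the browser window.
function snapTarget(
  scrollY: number,        // offset where the user's gesture stopped
  viewportHeight: number, // height of one "fold space" / page
  pageCount: number       // number of fold-aligned pages
): number {
  const maxY = (pageCount - 1) * viewportHeight;
  // Clamp into the scrollable range.
  const y = Math.min(Math.max(scrollY, 0), maxY);
  // Snap to the nearest page boundary; the machine animates the rest.
  return Math.round(y / viewportHeight) * viewportHeight;
}
```

Modern CSS scroll snapping offers a declarative version of this idea, but the arithmetic is the same: the user's increments are discarded in favor of a predetermined stop point.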

While I am not a particularly eager fan of the latter two techniques, they do represent sensible alternatives to the page-scroll function, and can enhance the UX.

But this of course is in contrast to the embarrassing scroll-powered movies employed by so many site creators today.

The time will come, and for some perhaps that time is now, that we will look back on these years of scroll-powered movies, roll our eyes and wonder with embarrassment what the Hell we were thinking.

The greatest creative expressions are the direct result of an individual’s inspiration, vision and guidance.

It may take a mammoth team to execute on that vision. But the best work starts with an idea, a visionary seed, one that must be defended and guided through a myriad of decisions a project meets along its growth.

I often think of creative ideas like trees in a forest. In a forest, the trees that stand out, those that get your attention, that make you stop and marvel, those are the trees that are unusual in some way. The ones that defy the average vertical pattern. The tree that is bent and twisted against the norm. A tree that quite literally goes out on a limb. This tree is not average. To some, this tree may seem awkward, or ugly. To others it is unquestionably the – one – beautiful standout.

At this point some feel compelled to point out that the forest – made up of my average trees – is itself a thing of beauty. And indeed that is true – but taken at that scale, the forest then is the unique, unusual object against a larger experiential backdrop.

Either way, our tree is its own. It is unique.

Such uniqueness is only possible because it was subjected to a one-of-a-kind force or condition that the other trees were not.

If however, you averaged the shape of all the trees in the forest, the unique beauty of this one unusual tree would be lost. Averaged out.

In the development of creative ideas, devoid of an individual’s guided vision, the more voices, the more inspirations, the more filters, doubts and preferences that collide and direct, the less distinct the eventual expression becomes. A variety of inspirations naturally causes a canceling effect – an averaging of the distinct, unique exceptions. They pull the limb closer to the middle, closer to an average.

It’s a simple truth.

And this is how I think of closed vs. open.

It’s why artists tend to prefer a closed condition. It allows for authorship – for an individual’s vision. For expression of (potentially) a truly unusual, unique idea. One that goes out on a limb. One-of-a-kind.

There are often many flaws and possible pitfalls in the structure of closed projects. Being non-standard, they are more often prone to systemic deformities and challenges. But this is why the whole process, the whole team must be working in agreement to support the originating vision. Because more technical rigor is required to overcome this natural weakness – to ensure the integrity of the unique structure. While each member of the team has a role that will impact the project, still above all directives is the one that defends the vision.

This is not to say that Closed is naturally superior. Open has its own benefits. An open project naturally resists many of the risks of systemic deformity. In fact it excels at evading deformity. It more easily reveals and repairs structural flaws and more readily results in a functional system. But what it more easily gains in structural integrity, it gives up in uniqueness, in surprise, in drama, creative integrity, and delight. It is merely a tree – out on no limb. Standard, functional, and utilitarian.

And it’s why so many of the engineers I know prefer an open environment. Not all, but most. It is sensible if your aim is above all to ensure technical integrity.

I don’t mean to split artists and engineers, that’s a generality and not entirely fair. I’ve known rare exceptions on both sides.

But to me, all this does ring true when I reflect on debates and sensibilities surrounding iOS and Android.

When I use each system I can see the difference in the originating process and sensibility.

My experience with Android is one of utility and functionality. It works. And for some that utilitarian functionality is plenty. It’s preferable even. These people look on the unusual bends and twists of iOS and they see flaws, a focus on gratuity that feels odd and unnecessary.

But an open system will never surprise you. It will function rationally, but it will not surprise and delight.

And to me – my heart drops when I use Android. It works, yes. I get from point A to point B. But (heavy sigh) I don’t enjoy it. There is no joy. Perhaps acknowledgement of this is part of the reason Google has been taking a more “closed” approach to parts of Android.

Users of iOS, and all other Apple products, I think generally appreciate the ongoing lengths Apple has gone to in engineering and fortifying its twists – the obsessive attention to detail that makes Apple products surprising, delightful and unique. Apple is the very product of going out on a limb.

Creativity requires a vision. A great movie, a pointed work of art, a gripping book, a great design, a delightful OS experience, all require a vision. And these further require strong direction and leadership – on whatever scale may be relevant. There are easier ways to create – but none that result in strongly differentiated creativity. Great creative expressions are not originated by communities. Executed, perhaps, but not originated and directed.

And for this reason I assert, with exceedingly rare exception, outstanding creative expression is the result of a closed model. And it’s why I prefer the closed model myself.

There is a reason the word “face” is found in “interface”. Your face (and its senses) is the primary conduit through which you receive information. And when we talk I tend not to look at your elbows, but at your face, since most of the information I receive comes from it. In addition to verbal responses, your face communicates non-verbally – where your elbows, for example, tend not to.

And this is why Google Glass, as conceived today in hardware, is doomed.

In sitting persistently between the world and your face, Google Glass screams self-centeredness, persistently communicates contradicted attention, and confirms a flip in the social subtext from “occasionally about me” – to “always about me”.

With a design that belies an effort to both persistently engage but not interfere at the same time, Glass appears plainly two-faced and is predictably regarded with social suspicion.

Proponents of Google Glass will argue that Glass – by virtue of it being persistently available – will reduce the annoyance others experience when you look away to your phone, or maybe, someday, your iWatch. That pausing a conversation to look into space, up and right, at an email is somehow less intrusive.

But that’s ridiculous.

You call on these other devices only as needed, and yes, it’s always slightly annoying to have mutual communication interrupted by a glance at your phone. But I can assure you, it doesn’t solve the problem when you mount your phone over your right eye. At least you can put those other devices away and once again plainly give yourself back to our communication.

Despite the many flavors of self-centeredness ushered in by digital technology, few consumers, no matter their age, will be willing to outwardly don such an obvious “fuck you, I’m actually all about me” to the world.

For this reason, Google Glass will never work – it will never be adopted en masse – until it fully fades from view. Until you, the wearer, no longer broadcast utter self-centeredness to all passersby.

Even a telltale bump and lens on your tortoise-shelled Warby Parkers will not save you the heavy-lidded eye rolls (that’s Mime language for “Jesus, one of these guys”) and sudden camera-shy self-consciousness that the Google Glass wearers I know are encountering today.

Until such time that Google Glass recedes into invisibility, until there is no outward evidence that you are a Google Glass wearer, only then does the technology stand a chance of penetrating the greater world.

And only then will the real product design problem start.

For when aesthetics of the physical device is no longer a consideration, the entirety of the experience becomes a software problem.


And on this point it seems to me that Google Glass software – with its slightly kludgy behavior, mediocre design, and limited overall experience – is a very, very long way from the target.

I remember when Steve Jobs demoed the iPhone. Do you remember the shocking fluidity of the interface? What it did seemed like magic. It was delightful, and seemed some factor more sophisticated than every other device you’d ever used. It solved problems gracefully and with striking originality. It was at once charming and incredibly hi-tech. The physical form factor was great for its time – necessary, considering its handheld status – but the real story was how it behaved. The software experience.

Had he demoed the iPhone with software that was merely utilitarian and lacking in surprise and delight – had he dumped the responsibility of inventing a delightful user experience on the developer community, rather than leading with one – no one, aside from a few geeks, would have wanted it.

And that’s exactly where we are with Google Glass.

We have a long way to go. The hardware has to recede starkly to make up for its current social failure, and the software experience has to balloon into something profound.

In the meantime Google is now jumping through hoops with Warby Parker. But I don’t think it will matter. They’ll probably try to make Glass look like real glasses, (hopefully for them fat, chunky geek glasses stay in style a little longer) and maybe that will go some distance in making the tech a little less blatant. But the second you catch wind of a battery pack and a camera – it will start all over.

Whatever the specific brand of industrial design applied to Google Glass, no matter how fashionable the obscured right eye, it will not play the slightest part in the future of a successful solution.

Delightful software is the product, the sole playing field on which augmented reality will succeed or fail. Software so great that you’ll want it everywhere you go.

“iOS7’s UI is flat (etc.) to focus on content (etc.), and if you don’t make your app flat (etc.) to focus on content (etc.) too, it won’t look ‘at home’ in iOS7, it will look old and nobody will want it”.

I’m paraphrasing but that’s basically it. And I refer only to the belief that the aesthetics need to conform, that it needs to look more like the OS. I am not referring to functional adaptation.

Some of you might take issue with my use of the word “flat” (vs. deep, or whatever). I know, that’s incomplete, because iOS7 is layered, with its illusion of depth, light and materials. That’s an important point – and I’ll get to it. But for now I’m talking about the general practice of removing everything from the UI that doesn’t communicate functionality, and of the focus on graphic minimalism.

Before I explain why that message is misguided, let me say – I love most of the aesthetic changes in iOS7. I think it’s a handsome, on-trend and functional design update, with some niggling exceptions that others have done a fine job of addressing (font issues, icons – some of which are already improved), and I expect it will just keep getting better in coming releases. I am generally a fan.

Although this flat, minimalist movement is based on a rational devotion to better, more communicative UI – and I suppose seems truer in some pure UX sense, because we have essentially moved closer to the very wireframe – “flat”, as it is being advocated, is still just a design trend.

And as with all design trends, “flat” will have a popular lifespan, following which, it will fade.

One of the main points I want to make is that this “flat” UI minimalism will, I believe, go stale quite a lot faster than previous interface design trends, for two primary, synergistic reasons:

Because we have such an uncommonly concentrated community of app designers in the iOS ecosystem, trends get identified and adopted en masse at increasingly rapid rates. But more critically:

Because the very nature of flat design – or rather, of minimalism – is the provisioning of a vastly reduced design palette. A palette that, by design, offers far fewer areas of adjustment, which are instead defined by attention to detail and subtlety: the restrained, disciplined modification of the most basic UI building blocks.

So, as more designers than ever are working with fewer design elements than ever, together these factors will result in a sudden commonality in design across apps. Frankly, if you watch for these things, you know it’s already happening on the web (the Squarespace Syndrome). And with it comes a lack of clear differentiation. Indeed, I argue, minimalist app and web design will run toward a kind of commodity.


As soon as this realization hits – that their apps are homogenizing (and it will hit) – designers are going to start looking for unique ways to move past this commonality. They will start to add, and embellish. They will expand their design vocabulary and re-embrace varying degrees of gratuity.

That said, and perhaps thankfully, the best of them will not revert back to the pre iOS7 trends.

Like most shakeouts, the focus on minimalism in app design has been healthy; it’s bringing the developer community closer to understanding the rigor required for working with type and layout, of prioritizing elements, of limiting the palette to better communicate. And hopefully that awareness will remain.

So what form will the “new embellishment” take?

Virtually all of my designer friends are talking about a new “Maximalism” (half-jokingly, perhaps, but that’s how these things start) as a way to break through this inevitable homogenization. I’ve heard half a dozen rather cool ideas that push past the current focus on “flat”, moving forward in a new direction – adding back elements that are, once again, completely gratuitous (if joyfully so), and sometimes functional in a new way. These will be new, surprising elements that are, under the current flat dogma, “unnecessary” and “distracting” – allowing for random surprise and spontaneity, where rigid minimalism is clearly challenged.

But I think many of the minimalist designers looking at iOS7’s UI aesthetics are mistaking the larger challenge for a graphic design problem. Dribbble is teeming with designers who are offering up alternative “flat designs” – a point that in some way reveals a basic weakness in the Dribbbles of the world: that these groups focus inordinately on the graphic layer. On how a UI looks.


Whereas the vast majority of designers I interview barely focus on how an interface behaves. And how a UI behaves – how it responds, the alchemy of interaction – that is “interactive design”, of which the graphic layer is a mere portion.

Now, if you look again at iOS7 you can see that Apple is acknowledging this, in those parts of iOS7 that the staunch minimalists are having such an allergic reaction to – things like the parallax on the home screen, and the wiggle of the text bubbles in iMessage. The so-called “flat” graphic design is there, yes. But it sits within an interactive design that, while restrained, is not minimalist at all; it’s embellishment. But it’s also delightful, and surprising.

This is one of the ways design complexity will necessarily reassert itself through the minimalist homogenization.

For me the main take away here is recognizing that one can honor the rigor that design minimalism has forced to the table – even while one expands the vocabulary. Where “Flat” maybe reduces to a kind of baseline, a jumping off point.

But I think we all need to find our own unique approaches.

And I guess that’s my parting thought. That I don’t believe the answer is to just jump into the specific iOS7 design approach as though it is some sort of ideal design guideline. In fact, depending on your app’s function or audience, it may even make perfect sense for your app to be utterly, cartoonishly skeuomorphic.

Namely because, from where I sit, the world of communication and UX is just infinitely bigger than iOS7. That’s just what Apple did – with the platform. Ok. I’m glad they did it; it is an improvement over the previous. But surely you have something to say that is different. Surely your content – your idea – your app – is a unique invention of its own. Surely it wants to be itself. Surely it does not need to look just like it belongs inside the OS.


I mean, if a platform with one aesthetic approach always dictated the form of its content, what would that mean for, say, movies? Is it better if movies all self-reflectively share the aesthetic approach of the theater interior, or maybe of your home? I know that’s ridiculous, but I guess I feel like reflecting the design choices of iOS7 is just some percentage less ridiculous.

The trees you notice in the forest are the ones that are bent over funny. The ones that are unique. This is where I completely lose the rationale for following Apple’s design solution in the development of apps. I get that there are best practices, and a basic growing language that we share in the interactive space. But the point should not be to copy or align with Apple’s design approach. It should be to honor your unique vision. Learn from the masters, of course, embrace best practices, but where aesthetic choices are open to you, strive to find your own voice.

I know that some strong thinkers out there agree with Tapadoo, like John Gruber, who linked to the post above, and with whom I almost never disagree. So I must say – it’s left me scratching my head. Because on this, I do fundamentally disagree that updating your app to “look like an iOS7 app” is even remotely as urgent as updating an app to accommodate the larger screen of the iPhone 5. Not even close. With the iPhone 5 the screen was bigger and your legacy app looked broken. Of course any app needs to work, and by “work” I mean the app needs to adapt to the new system’s basic technical and functional conditions. So I guess, yes an update is necessary, but where we are talking about aesthetics – of “looking like iOS7” – no, following such a design trend is not necessary.

UPDATE 2:

John Gruber graciously answered my question:

“I use iOS 7 as my main OS on both iPhone and iPad. The non-Apple apps stuck out like sore thumbs. They don’t even have the new keyboard.

“I’m not saying all apps should look just like Apple’s. I’m saying only that they need to look and work like they were designed with iOS 7 in mind, and they need to be updated with the new SDK. That’s all.”