
Aurora: Writing the Script

Writing the script for Aurora turned out to be a tricky balancing act. We wanted to illustrate the interesting design solutions we had come up with, but we also had to provide enough context to make the solutions meaningful. We needed the movie to have a narrative flow and momentum, but it also had to cover a diverse array of interactions.
And just for an extra challenge, we set ourselves the goal of avoiding what we considered the clichés of the design concept video: affluent mid-thirties professionals in sleek modern environments, or chic young urbanites out on the town.

As a result, we ended up spending a lot more time working on the script than we expected. Dan Harrelson, Julia Houck-Whitaker, and I went through several iterations of sticky-note exercises: first prioritizing the interactions we wanted to illustrate, then brainstorming user tasks that would involve those interactions, then trying to stitch the tasks together into plausible scenes.

We decided pretty early on that it was impractical to work all of these interactions into a single overarching story. Instead, we settled on three segments: one focused on using a desktop computer in a work context, one focused on using a mobile device in a social context, and one focused on using a large-screen home device in a family context.

Of these three, the mobile segment turned out to be far and away the most difficult. We developed and discarded idea after idea as we realized that each one focused too much on the functionality of the device, and not enough on the functionality of the browser. It was fairly late in the scriptwriting process, after we thought we had all the scenes plotted out, when I decided we really needed two mobile sequences: one oriented around location-aware services, and one around interaction between the web and the physical environment.

When we did our first timed read-through of the script, we were dismayed to discover that it came in at nearly double our target length of six minutes. (The film industry has a rule of thumb that a page of script equates to about a minute of screen time; this didn't apply to us because our script contained long, complex descriptions of user interface behavior that would take just a few seconds to unfold on screen.)

So I cut it down. And cut it some more. And cut it some more. Much of what I cut was dialogue, intended to provide a bit more context on what the characters were doing and to hint at some of the technological changes suggested by the scenarios Jamais wrote. Finally, I decided I’d cut too much. I went back and rewrote the script from beginning to end, reworking it so that certain lower-priority scenes could be included if we had time to do them (as a possible “extended cut” of the movie) but ensuring that the narrative flow didn’t depend on their presence. (Only one of these, the exchange between Harry and Beth about MapQuest, found its way into the final movie, which still ended up over nine minutes long.)

Scrivener was an enormously valuable tool in the scriptwriting process. I didn’t use all of its functionality, but it did provide vital, specialized tools above and beyond what a word processor can do. Next time, we’re eager to try out Celtx, for the functionality it provides to bridge the gap between writing and production.

As written, the script doesn’t accurately reflect what ended up on screen. Many of the interactions I described in the script just didn’t work when I saw rough versions of the animations produced by Whiskytree, and in a couple of places I had to rethink the interaction flow very late in production in order to make the interface consistent and realistic.
