Bluntly: Best thing that could happen, especially if you’re a Star Wars fan. George Lucas is the most influential filmmaker of the last half century and we all owe him a debt in terms of how he’s advanced the technical aspects of cinema. But the dude can’t write or direct his way out of a paper bag. Equally bluntly: The best parts of the Star Wars extended universe are the parts where he’s not writing or directing. Thirdly and most bluntly: George Lucas hates all nerds now, because they poop all over him and his can’t-write, can’t-direct ways, and now wouldn’t go back to the Star Wars universe if you paid him.

Now: Disney. Yes, soulless corporate monster that gives babies adorable Mickey Mouse ears and tickles their chins before it swallows their souls. On the other hand: Also smart enough to buy Pixar and Marvel, give their respective brain trusts the keys to the castle, and say “Get to work.” Result: Films that are relentlessly commercial, entertaining and profitable, not only in themselves but in all sorts of ancillary markets. The mouse is a monster because it knows how to entertain the very living crap out of you. This can only be good for the Star Wars universe, freed now as it is from its cranky, frustrated Emperor Lucas.

In fact, if Disney had any brains at all, it would give the administration of the Star Wars property over to its Marvel Studios and say “That thing? That thing you did with The Avengers? Yes, that. Here. Now.” And then let them do their thing. And come 2015, when Episode VII thumps its way across the screen and you, you damn fool, you who ground your way through the Prequel Trilogy out of a patent sense of duty to your Dread Lord George, trudge off in your Jedi robes to go see it, by the sweet and merry mouse above, you will be entertained.

And George Lucas? Well, who knows? Who cares. Let him be happy on his enormous pile of money, away from the likes of you. Everybody’s better off that way.

I’ve written popular science articles and books, and one of my personal philosophies is that about 80% of any subject can be understood by any ordinary person — if you can manage to explain it correctly. Robert St. Amant has written a book to explain computer science to everyday folks — appropriately entitled Computing for Ordinary Mortals — and in the writing, he found himself confronting the task of making approachable what is often considered an unapproachable field. How did he do it? I will let him tell you this story.

ROBERT ST. AMANT:

When I was ten years old or so, I saw a battered paperback copy of Triplanetary on my grandfather’s bookshelf. I borrowed it… and found myself in ten-year-old heaven. Science fiction led me to popular science, with Isaac Asimov (and Edgar Cayce, embarrassingly enough) to help me cross the boundary. I read about physics, space, biology, math, and psychology. It was formative reading. Today I’m a computer scientist, and I’ve just written my own book.

The big idea in Computing for Ordinary Mortals is that the basics of computer science can be conveyed through stories. Not stories about computers and how we use them, but stories about other kinds of everyday things we do. Computing is more about abstract concepts than about hardware or software, and we can understand these concepts through analogies to what happens in the real world.

For example, imagine you’re shooting a low-budget horror movie, set in a haunted mansion. Unfortunately, you don’t have a mansion, much less a ghost, but you’ve found a couple of big, empty rooms that you can redecorate from one scene to the next, so that in the finished movie they’ll look like different places. You’re taking advantage of the locality principle. Movie-making is a complex activity that needs a lot of space, in theory, but it can be broken down into smaller activities that fit into much smaller spaces and work at different times; each part only needs what’s in its own neighborhood. So you can reuse the space you have, over time. We see the same thing happening when people play half-court basketball or timeshare a vacation apartment.

Analogies like these can be spun out into short-short stories, with characters and a minimalist plot, to make the how and why of computing a little more memorable. Why do computers have caches? How does virtual memory work? Can a gaming environment be infinitely large? “Well, you can think of it as if you’re making a movie…” I’ll skip the detailed explanations to get to the most interesting part: if a story works, it means that we can understand computing through some ordinary experience, and the reverse. Real life as computation.
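The locality principle behind the movie-set analogy is also what makes caches pay off: a small, fast memory in front of a big, slow one works only because programs tend to revisit the same few addresses, just as the film crew reuses the same two rooms. Here is a minimal sketch in Python (my own illustration, not an example from the book) of a least-recently-used cache in front of a slow backing store:

```python
from collections import OrderedDict

class Cache:
    """A tiny least-recently-used (LRU) cache in front of a slow backing store.

    It only helps because of locality: the same few addresses tend to
    be read again and again, so they stay in the small fast memory.
    """
    def __init__(self, store, capacity=4):
        self.store = store          # the big, slow memory (a plain dict here)
        self.capacity = capacity    # how many entries fit in fast memory
        self.lines = OrderedDict()  # cached entries, oldest first
        self.hits = self.misses = 0

    def read(self, address):
        if address in self.lines:            # fast path: already cached
            self.hits += 1
            self.lines.move_to_end(address)  # mark as recently used
            return self.lines[address]
        self.misses += 1                     # slow path: fetch from the store
        value = self.store[address]
        self.lines[address] = value
        if len(self.lines) > self.capacity:  # evict the least recently used
            self.lines.popitem(last=False)
        return value

# A "program" with good locality: most reads touch the same few addresses.
store = {a: a * a for a in range(100)}
cache = Cache(store, capacity=4)
for a in [1, 2, 1, 2, 3, 1, 2, 3, 1, 2]:
    cache.read(a)
print(cache.hits, cache.misses)  # 7 3
```

Run the same cache against a sequence with no locality (every address distinct) and every read is a miss, which is the point of the analogy: the trick of reusing a small space over time only works when the activity comes back to the same neighborhood.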

That’s exciting, to me. How hard could it be to write an exciting book full of computer-relevant stories? Hmm. Harder than I’d expected. The explanation part was straightforward, but the stories themselves didn’t come as easily. Eventually, though, I realized that I was writing something close to modern parables or fables, following strict conventions about how a story should unfold (with a bit of science, math, or engineering at the end instead of a moral insight). Most computing concepts are about making sense of problems and how to solve them; I just had to figure out how these problems might arise in an interesting way in some real or imaginary world.

For example, the opening story in a late draft was an Alice in Wonderland pastiche. I liked it, but one reviewer was irritated with the pacing, and another just said, “Alice has to go.” So Wonderland changed into a balloon ride over a coastal town, then became a scientific expedition to Mars, and ended up being a conversation with an alien on a spaceship. I was rewriting the “same” story, in a sense, but that was worthwhile; some stories express a given theme (or analogy, in my case) better than others.

Telling stories in popular science carries some risk. Are the stories true? No; analogies and metaphors are never literally true. Charles Petzold even argues against such story trappings in his excellent book, Code: “Metaphors and similes are wonderful literary devices but they do nothing but obscure the beauty of technology.” My analogies do approach metaphor at times. But I think a better question is whether the stories work, whether they give us appropriate insight. After all, we understand the world around us through stories. If those stories happen to encompass the computers and computations in our modern lives, then so much the better.

Not a lot, and it won’t last until noon, but here it is: The first snow of late 2012. I’m not terribly happy to see it, but considering at the moment there are entire cities submerged in standing seawater, I’m not going to complain.