Your Permanent Record

Microsoft CTO David Vaskevitch explains why tons and tons of memory will serve us well.


My 81-year-old mother – until recently a computer-phobe – is living the future. Not so long ago, whenever I would visit her, the first question she'd ask was "Which pictures did you bring?" And, as she is a Jewish mother, whichever pictures I brought were, somehow, the wrong ones – until one day I said, "All of them. All the pictures I have taken since the last time I saw you." I had my laptop with me. We went through the photos, and when we got to the last one, my mother said, "I want that." I assumed she meant the image on the screen and offered to print it. But no, she wanted the computer itself. I bought her one and loaded it up with pictures.



Recently, Jim Lewis suggested in this space that the ability to store ever more digital artifacts would lead to a form of forgetting ("Memory Overload," Wired 11.02). He also suggested that traditional media for preserving memories, such as paper photographs, not only avoided this overload problem but acquired a patina of nostalgia as the pictures faded and cracked. What a scary thought: Not only do the memories that we've chosen to document degrade, but this is supposed to be a privilege! In fact, memory – the ability to record, store, organize, play back, expand, edit, and elicit experiences – is the future of computing. What we need is more memory, plus better software tools to manage this abundance.

Not everybody uses a word processor or a spreadsheet or even email – but everybody takes snapshots. If you're like me, you have them stored in a shoe box. Finding one in particular used to take a full day of sorting by hand. Today, using my laptop as a virtual shoe box, I can flip through thousands of thumbnails in just a few hours. I'm taking pictures at four times my usual rate – which, research shows, is typical. I'll max out my 50-gig hard drive soon. Serious photographers (not to mention serious music collectors) are routinely building databases that are 200, 300, even 500 gigs. Apple promotes multimedia management as the center of its strategy. Here at Microsoft, we named the most recent version of Windows – XP – for its "experience" management features.

In truth, the computer industry has a ways to go before we come up with a memory-management machine that really, fully complements the one between our ears. Terabyte disk drives will become standard, as will a revolutionary new operating system. The current OS paradigm – the file-based model – dates from the '60s. Database designs (relational ones, at least) hark back to the '70s. Both are just barely adequate. Currently, we spend too much precious time putting pictures into folders, assigning keywords, and orchestrating slide shows.

In a fully realized digital memory management system, your camera will come with a self-setting clock, a built-in GPS locator, and perhaps 100 gigs of flash memory. Every picture or video snippet that you shoot will be embedded with date and location information. Your standard OS will include sophisticated face-matching software. Your computer will be your shoe box – one with a storage capacity approaching the largest paper-and-ink archive on earth. And ferreting out every picture of Granny at your daughter's graduation will become a matter of simply setting a few parameters in Photo Find and pressing Return.
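To make the idea concrete: once every shot carries an embedded date and location, "Photo Find" is little more than a filter over metadata. Here is a minimal sketch in Python, assuming each photo record holds the timestamp and GPS coordinates the column describes; the `Photo` record, the `photo_find` function, and all coordinates are hypothetical, not any real product's API.

```python
from dataclasses import dataclass
from datetime import date
from math import radians, sin, cos, asin, sqrt

@dataclass
class Photo:
    name: str
    taken: date    # stamped by the camera's self-setting clock
    lat: float     # stamped by the built-in GPS locator
    lon: float

def distance_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points (haversine formula)."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + \
        cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))  # Earth radius ~6371 km

def photo_find(photos, on, near, within_km=1.0):
    """Return every photo taken on a given date within `within_km` of a place."""
    return [p for p in photos
            if p.taken == on and distance_km(p.lat, p.lon, *near) <= within_km]

# A tiny "shoe box" of tagged photos (invented examples).
shoebox = [
    Photo("graduation_001.jpg", date(2003, 6, 14), 47.655, -122.308),
    Photo("beach_042.jpg",      date(2003, 8, 2),  36.974, -122.026),
]

# Every picture taken at the (hypothetical) graduation venue that day:
hits = photo_find(shoebox, on=date(2003, 6, 14), near=(47.654, -122.307))
```

Add face matching on top of the same query, and "every picture of Granny at the graduation" really does reduce to setting a few parameters.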

We'll have our digital memory managers by the decade's end. And it won't be Jim Lewis' "memory overload." Au contraire; overload is a problem created by physicality, by pictures in a book, prints in an album, slides in a box, negatives in an envelope. Photos will categorize themselves "automagically" and provide us with a second memory system – a backup for our brains – that eventually will be, in its own way, as powerful as the first.

When that time comes, the computer itself will make a historic transition from something that is used for analytic tasks – number crunching and word processing – to something that can elicit emotion. You could say that it will become a right-brain machine instead of a left-brain machine. It's already happened for my teenagers. After a family trip, they'll steal my laptop to show their friends our pictures.

A dozen kids will cluster around the screen, talking, laughing, exclaiming, joined in a truly emotional moment. I ask you, Jim, is there really such a thing as too many memories?
