Media commentary from a recovering journalist.

February 02, 2012

Google’s mission is simple, bold, and in the annals of silicon culture, tantamount to sacred gospel: “Organize the world’s information and make it universally accessible and useful.”

Today, however, there is a New Testament being written: “Organize Google’s information about the world and make it selectively accessible and mostly useful.”

Okay, not as sexy. But in light of Google’s new focus on social results born from its own Google Plus product, this New Testament is arguably more accurate.

No longer Google Search, but Google’s Search. Not Google results, but Google’s Results. No longer the Web, but Google’s Web.

Google used to be a Web search site. Now it’s becoming an App. And not just any App, but a Super App that pulls you into Google’s Web with an insistent gravity.

Google calls its New Testament “Search Plus Your World,” delivering results from Google Plus pages and profiles more prominently than other Web properties like Wikipedia. This is supposed to make search more personal and relevant – who you know and what your friends like, or share, or do, will have a greater influence on what you see in your search results.

Personalization is hardly new (neither are the inevitable privacy concerns, which go hand in hand.) A personal Web is generally good, faster and easier to use. The “Semantic Web” envisioned by Tim Berners-Lee is today ever more possible, that dream of search delivered according to each person’s interests and desires, by computers that “learn” from us, understand us and make choices on our behalf.

But Google the App -- far more so than Google the Web site -- also runs the risk of making Search more exclusive and incomplete. One person’s customization is another’s exclusivity. Choices made for us, however well intentioned, are not always the best choices for us.

Google’s search gambit is part of a much broader trend, of course – “aggregation” or “curation” or whatever ridiculous made-up name it goes by these days is all over the place, most notably on social platforms like Twitter and Facebook.

The difference is, when I’m using Twitter, I’m making the choice to curate content. I know that searching on Twitter gives me results from and favored by Twitter.

But for now at least, I can’t get results from Facebook or Twitter via Google (the latter has been the case for months.) And as mentioned earlier, results from the rest of the open Web are favored less than results from Google properties.

This is Google’s right. It’s a for-profit company and can do whatever the hell it wants, and what it wants is to secure more ad revenue and expand its own social network and services.

The rub is that Google changed the game first and then told everyone the new rules. Google went from giving us information about the world to giving us information about “our” world, provided our world includes a Google Plus account.

It’s too early to say whether this is bad, but it’s certainly different and should not be automatically assumed to be better. Search is done best with your eyes wide open, and the commercial aspects of Google’s New Web Order should be taken into account when deciding what to click.

For companies like Google, data is a conceit; but humans are far more complex than the data they create. This is why a Semantic Web is the Internet’s Holy Grail – we want and believe it to exist, despite the fact that we may never find it.

Whether Google’s Web is another step toward that discovery remains to be seen, but one thing is certain: The Web of Apps is here.

December 04, 2011

There’s a little-known scene in the movie “Star Trek: First Contact,” where Lt. Commander Data, an android, observes Captain Jean-Luc Picard touching the hull of an historic spacecraft. The captain smiles and taps the ship with his bare fingers, to which Data asks, “Sir, does tactile contact alter your perception of (the ship)?”

“Oh yes,” Picard says. “For humans, touch can connect you to an object in a very personal way. It makes it seem more real.”

Data then touches the ship himself, and says, “I am detecting imperfections in the titanium casing. Temperature variations in the fuel manifold. It’s no more real to me now than it was a moment ago.”

Data’s reaction is similar to how many of us interact with both objects and information today. More and more, our “things” are trapped under a layer of glass. We can “touch” but we can’t always feel. We live our lives from screen to shining screen.

It's a digital world – and overall, it’s a far better world. But digital should enhance physical experiences, not replace them or separate us completely from what makes us human.

This is why I believe it’s time to make a place for real objects and real connections. It’s time for a truly disruptive technology to bring us back to our senses.

It’s time for paper.

No, I’m not joking. And no, I’m not just an old journalist pining for the days of picas and ink-stained fingertips (okay, it’s true I’m old, just not insane.) There is a place for paper in a digital world, because although all information will soon be converted into bits, human beings are still made of atoms.

Our sense of touch, and the emotional resonance it carries, sets us apart from other animals. Paper transmits feeling in ways screens can’t begin to deliver. Paper is far from perfect, but it’s more than ready for a renaissance.

Despite the meteoric rise of e-books, the largest growth is in the print-on-demand sector, which increased 169 percent in 2010 and grew even faster in 2011. Lulu.com, a leading print-on-demand press, expects $40 million in revenue this year, up from $34 million in 2010.

And investor Warren Buffett’s latest purchase? The Omaha World-Herald, a newspaper. Said Buffett, “I wouldn’t do this if I thought this was doomed to some sort of extinction.”

I’m overstating the case a bit – of course paper will never be what it once was. And this shift to digital didn’t start with the iPad or Kindle; remember LexisNexis in the 1970s?

But we have reached the point where paper is again disruptive, where incorporating paper or print into communication can set you apart as an innovator rather than make you appear old fashioned.

Access to information should not be confused with connection to that information. Commander Data from Star Trek was able to access information about the spacecraft, but he couldn’t connect with it in the same way as Captain Picard.

The key question we must ask ourselves is this: should traditional mediums transition to digital, or should digital technology serve to enhance and improve traditional experiences and interactions? Can we live a life under glass, or do we need to get out once in a while?

October 21, 2011

“The Semantic Web is not a separate Web but an extension of the current one, in which information is given well-defined meaning, better enabling computers and people to work in cooperation.” -- Tim Berners-Lee

The promise of technology almost always outstrips the reality of technology.

Back when WarGames came out in 1983 – ten years before NCSA Mosaic, 15 years before Google – I wanted to talk to machines, not just through machines. I felt for Scotty in Star Trek IV when he picked up a computer mouse and tried to give it voice commands, only to give up in disgust. “The keyboard, how quaint,” he said.

How quaint, indeed.

First social media killed the “delete” key. Now touch devices and Apple’s Siri voice assistant interface are killing the rest of the traditional physical keyboard.

But Siri goes much further. While still very early in its evolution and application, Siri is collecting a monster database of human behavior. Siri goes beyond “need” to “intent” – not what somebody wants, but why.

Call it technographic, call it behavioral or call it semantic – whatever the term, Apple, not Google or Microsoft, may be ushering in the era of Web 3.0 and language-based search (and with it, capturing the ad dollars that will surely follow.)

“The Semantic Web is not a separate Web but an extension of the current one, in which information is given well-defined meaning, better enabling computers and people to work in cooperation.” This is the promise of Siri -- or at least it seems so at this early stage.

Once Siri moves to other iPhones (it won’t stay on the iPhone 4S for long; there are no hardware restrictions preventing it from being added to older iPhones), and then to iPads and Macs, we will start to see whether Siri is just a fun diversion or, rather, a founding mother of Berners-Lee’s semantic web.

June 30, 2011

The universe was born in near-perfect order: small, organized and perhaps even serene. Over time, the universe expanded and became messy, diverse and beyond comprehension. Chaos reigned.

This effect is called entropy, the process by which order decreases in a system over time. This is admittedly a simple description, as a single system can still produce order provided there is a balance of disorder somewhere else in that system.

But since I’m just a writer and marketing guy, I’ll stick with the main concept: Order begets chaos.

The same thing has happened with social media – but this time it’s different. This time we are trying to reverse the Entropy Effect.

Our brains are tied to the universe’s beginnings, when order was the norm. Our lives – learning about the world, going to school, following rules and laws – are a constant battle against entropy, a fight for order against chaos.

The Web grows out of control, barely manageable. So we create tools like browsers, RSS and folksonomies to give it order, because our brains are wired for order.

Then the Web becomes social, a platform built for people – not machines – to connect with each other. It quickly grows out of control, so we create better search engines and hash tags, data mining and tools to “curate” content and determine influence.

Google+ is the latest weapon in our war against entropy, against chaos. Better information streams, closer and more relevant circles of relationships, stronger real-time connections to people we care about and actually know (notice I didn’t say brands – they will be a part of this too, but how, or even whether they should be, remains to be seen.)

Google began as a way to organize the world’s information. Now it wants to organize the world. It’s taking the Entropy Effect head on and of course it wants to win.

Facebook isn’t Google’s competition, nor is Microsoft (client) or Twitter or Yahoo! or any other company or social media service. In this case, they are all on the same team.

Google’s enemy is entropy. Its nemesis is the underlying force of the universe itself, of order vs. chaos.

It’s hard to believe that companies like Google will succeed where millennia of nature have failed. But like us silly humans, with our dreams of something better just beyond the horizon, Google is going to try.

February 07, 2011

“I do certainly see the day when more people will be buying their newspapers on portable reading panels than on crushed trees. Then we’re going to have no paper, no printing plants, no unions. It’s going to be great." -- Rupert Murdoch, Sept. 14, 2009

“Thanks for the trial subscription, but so far, it is selling itself.” -- commenter on The Daily’s first edition

It was no gimmick that Rupert Murdoch’s The Daily was launched at the Guggenheim Museum in New York. After all, The Daily, a subscription-based, iPad-only “newspaper,” is less a marriage of technology and news than it is a work of news-as-art powered by technology.

Journalism has always been more of a subjective art form than a craft or trade -- yet historically, the presentation of news has had all the beauty of a macaroni sculpture and the functional efficacy of a rotary telephone.

That was fine when there were fewer ways for media companies to distribute the news and for the public to consume it. But in a world of endless choices, the medium matters more than ever. User experience becomes part of the story. Information becomes art.

And because art matters, the very rationale for a publication like The Daily to exist becomes even more subjective. Is it a newspaper or a new kind of “real time” magazine? Do stories with better visuals and interactive features have priority over stories with greater news value but fewer bells and whistles? Is the “art” of good writing less valuable to modern audiences than the “art” of video clips and 360-degree images?

My hope is that this convergence doesn’t become mere art for art’s sake. I hope that it will compel journalists to look at the news in different ways. I hope it lifts their art and drives them toward better journalism -- for all the technology in the world is no substitute for a good story.

January 13, 2011

The annual Consumer Electronics Show (CES), held last week in Las Vegas, is both a birthplace and a graveyard for technology. Some devices will “make it” and even change our lives, while others will never get to market. If T.S. Eliot were alive today and a tech nerd, he would have called January the cruelest month.

The big story of CES wasn’t tablets, or phones, or 3D, or the fact that not one of the 150,000 attendees seemed unduly distracted by the Adult Entertainment Expo at the nearby Sands Hotel (there were a few exceptions, but I promised to keep the names confidential.) For me, the real headline was a shift away from social “communal” interactions and toward socialized personal experiences.

In other words, while many new devices enable sharing with friends if desired, the true purpose is immersion, not connection -- access to content, not conversation. Internet TVs, tablet computers and movie theater sound in mobile phones all serve to take us deeper into our own worlds rather than open us to connecting with each other. “Personal” has become the new “social.”

We see this also with the trend toward curation and aggregation of news, personalized feeds and niche networks within networks. People no longer want to be part of the entire universe, just the parts that matter to them.

To be clear, while sharing and social actions are woven into today’s tech DNA, sharing in itself is not “connecting.” That last step is up to the individual (of course), yet that becomes more difficult when the individual is wearing 3D glasses or otherwise sucked into solitary activities.

We still connect, but more and more those connections are with systems and content, not with people. And in the case of brands, getting people to connect with products and services is still more important than getting to know the people themselves. It’s 20th Century marketing with 21st Century technology.

Technology is the heart of social media and conversation is the soul -- but are we at risk of losing some of that soul as we immerse into richer digital worlds? Cruel, indeed.

January 03, 2011

Predictions are a fickle business, but with the new year just begun and CES still days away, the writing for 2011 is already on the wall (and I don’t mean Facebook).

There will be stories, tweets, posts, photos, videos, slides and all sorts of content spewing from CES about the latest technological gadgets and guesses about consumer demands. But underlying it all, almost unnoticed, will be something bigger and far more powerful than any collection of microchips: The return of long-form reading.

Yes, reading, that thing we do with our eyes and brains, requiring no additional functionality other than desire and perhaps more time in the day.

In the ‘90s the “killer app” of the Internet was e-mail. This past decade saw the rise of People as the killer app thanks to social media, easily the most disruptive and powerful force of the new post-modern digital age. Now reading has re-entered the mix in a big way, becoming the main raison d’être for many technological innovations as well as invigorating traditional media businesses:

-- E-book sales were 10 percent of trade sales in 2010, and that number is expected to double in 2011

-- Surveys suggest that people who purchase e-reading devices subsequently increase the number of books they buy. “For the first time in a long time, there are many more places for people to buy books,” said David Shanks, CEO of Penguin Group USA

-- Kindles and Nooks are great and are indeed moving well, but that’s nothing compared to the wave of tablets about to hit the market, from Microsoft and HP to others running on Android (even Vizio is getting into the game.) Tablets have led to a kind of rebirth for long-form reading and especially for magazines, and though app sales have slowed recently, that will likely change once these new devices and the iPad 2 become available

-- Services like Storify are helping to bring back long-form narrative, assembling the disparate social layers of a story to make it whole. Then there’s Readability, an app that frees web pages from ads and other distractions from the text. Instapaper, an app for storing online content to read later, has nearly a million users

-- One person’s tweet is another’s cue to dig deeper. As a recent Wired piece put it, “The torrent of short-form thinking is actually a catalyst for more long-form meditation.”

Of course modern media consumption isn’t just about text; it’s not like video is going away (quite the contrary.) The way we tell stories will continue to evolve and require multimedia narratives told in layers across platforms and with multiple inputs.

But reading will remain at the core. In a way, the shift toward tablets and mobile devices is merely a new way to do something very old. We can change technology and our culture can mature, but we can’t change who we are as human beings.

We are readers, we demand quality, and we will ultimately voice those demands with our wallets.

December 11, 2010

I’m a technology evolutionist. Granted I just made that up, but it fits.

I don’t believe new technology is just created -- I believe it evolves, with each innovation building on the other. No printing press, no iPad. These technologies are from different centuries yet share the same DNA.

And like technology, content evolves too:

-- Radio started with low-quality programming — but better content (and a world war) drove mass adoption

-- Television started with little original programming, mostly re-purposing radio serials — but better content and Madison Avenue made TV must-see

-- Then came the Internet, which began with grainy postage-stamp sized video and the Star Wars Kid. Now we have Old Spice, “amateurs” making HD-quality videos and former presidents doing live Q&As on Facebook

Media is about sociology and not technology. As the stories we tell get better, the mediums we use will adapt and interfaces will morph to meet modern cultural demands. Better content will drive scale.

I know it’s cliche, but the technology “revolution” is just an evolution. What we are experiencing today is nothing less than history in motion.

November 02, 2010

IN THE UNITED STATES we have a two-party system -- well, technically. There are the Democrats, the Republicans, and then the dozens of mini parties within and outside the main Parties, not to mention the multitudes of non-affiliated, disenfranchised and disillusioned.

But for all its failings, Democracy is an open system which, for the most part, works. It beats the opposite, that being a closed system akin to oligarchy or monarchy. We can do better with our current form of Democracy, but I’d rather let everyone have a voice even if I disagree, rather than have no voice at all.

The Internet now faces a similar choice -- stay open, as World Wide Web founder Tim Berners-Lee intended, or revert to a more closed system of apps and pay walls, akin to the ‘80s when CompuServe, Prodigy and AOL reigned.

I admit, this is an over-simplification of a complex issue. Like the two major political parties, the truth rarely lies at the extremes but rather somewhere in the middle.

Nevertheless, in our desire to bring order to our virtual universe, we are in danger of losing our ability to innovate.

As with an election, we have a choice here, too: we can either play with Legos or build model airplanes.

Legos -- the real kind, before they came pre-packaged with directions on how to build an Imperial Cruiser -- left everything to the imagination. Legos are open, endless, iterative and collaborative. Legos are a doorway to innovation.

Model airplanes, however, require deliberateness and order (not to mention a lot of rubber cement and a cool temper.) If you stray from the model or the approved directions, the plane won’t fly. Model airplanes are finite; they have rules and are largely solitary endeavors.

The Web is the ultimate Lego set, a digital tapestry of zeros and ones capable of endless discovery. Yet all this potential is tempered by the sheer volume of information, and it’s this overload which, in part, has prompted the rise of apps and walled-off social gardens.

I believe both Legos and model airplanes can co-exist -- neither needs to spell the death of the Web or stem innovation.

You can have the Apple approach, where all apps for the iPhone, iPad and soon the Mac need to meet certain standards and approvals, both technical and societal -- and you can also have apps for the Android mobile platform and Mozilla app store, where developers have more freedom to play with their digital Legos and create without constraint.

The Internet needs the Web, as do the millions in developing countries who still rely on the open Web for information and education. Not everyone has an iTunes account and a credit card.

The Web’s beauty is in its simplicity -- a universal digital language that crosses borders with the ease of sunlight. And its strength is in its freedom, in the unknown potential of what it is and yet could be.

I know if I follow the directions, my model airplane will fly on the first try. But given the choice, I’d rather have the freedom to experiment and learn from the crash.

September 23, 2009

I’ve never liked the term Guru – it’s a throwaway word, much like Paradigm, Content, or Kanye. Plus, I wonder if calling a marketing person a “guru” is offensive to actual gurus, and whether by using the term I’ll get punished with some karmic payback, like being reincarnated as a Fox News anchor.

But I particularly dislike the word when preceded by two other overused words, “social” and “media.”

Any blowhard with a blog can self-designate as a social media guru, and because any blowhard can, many blowhards do. Same goes for Twitter, the only difference being that Twitter allows people to become assholes much faster and with more grammatical errors.

If you say you are a social media guru, then you are focusing on the wrong thing. It’s important to understand the tools and channels and all that, totally fine – twenty years ago it was important to understand fax machines too, but not a lot of people touted themselves as gurus in “faxable media.”

What really matters is understanding consumer behavior: how people communicate and why, what they are saying and why, and to whom, and where. We use the word “social” as often as a person with a cold reaches for a tissue, yet we forget that “social” is about sociology – you know, people, not platforms.

All media today is social, so in my opinion there is no “social media.” And there are no gurus either, only those who know a little more than some others – and trust me, the others aren’t too far behind.