In the twisted minds of some business journalists, business is all about charismatic leadership. Judging by some business school curricula, business is all about leadership, period. So because Yahoo's CTO is leaving, we're supposed to believe that the company is now in turmoil and facing a "leadership void."

But Zod Nazem has been with the company for 11 years! Just imagine what that must be like at a company that was pretty much synonymous with the growth of the Internet.

Every company's got issues. But I wouldn't rule out the continued success of a company with a range of digital businesses that are profitable, and a brand known globally in mostly positive terms.

I think the problem is, half the world is an armchair manager, possibly even an armchair "C level exec." Even some folks at Yahoo are armchair managers! (but I digress)

But ideally, great companies will run pretty well without a rock star CEO, without a rock star CTO, and definitely without anyone from the press understanding what's under the hood. It's about execution on one hand, and sound fundamental business units on the other. And hopefully a quiet, Level Five leader or two.

Nazem's work on the Panama rollout is a great example of building a make-money-automatically business that works so well that even top management cannot screw it up. "Playing catchup with Google" isn't the sin it is being made out to be: it was the responsible thing to do!

This just in: Google is doing better than Yahoo. :) In turn, Yahoo is doing better than a million other companies, including some of the biggest and best.

Funniest anecdote of the day: Craigslist CEO Jim Buckmaster highlighted the "girl with the blue urn" in the Missed Connections area of Craigslist. She "spilled her grandmother" on a "sad looking boy" in Spadina subway station.

Apparently Michael Seaton isn't quite up there with Michael Keaton (the name Google suggests I really mean when I type "Michael Seaton blog"), but the director of online marketing for Scotiabank just got a little old media ink.

Patricia Best at the Globe and Mail noted that while the digs at competitor BMO from Seaton ("the director of online marketing for The Bank of Nova Scotia") were "by no means revelatory or even original," his competitive post was "a departure from normal practice in the gentlemanly and closed club of Canadian banking."

Hmm, maybe it's that gentlemanly reserve that got some of them in this mess in the first place.

Best's mention was published a gentlemanly ten days after Seaton's post.

As anyone with a logfile analyzer for their own site knew already, Google has continued to take market share away from the next two search engine companies. comScore's numbers seem strangely two to three years behind reality, as always. Which probably means the trend has already leveled off!

But also, as usual, we'd love to see the breakdown among all the "Google properties," "Yahoo properties," and "Microsoft properties," and an explanation of what counts as a search. Not quite enough to pay $100k for the data, though.
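You can approximate your own breakdown from your own logs, of course. Here's a minimal sketch (the engine-to-hostname mapping is a simplified assumption, and the log lines in any real file vary; a real log analyzer handles many more cases) of tallying search-engine referral share from a combined-format access log:

```python
import re
from collections import Counter

# Hypothetical mapping of referrer hostname substrings to engine names
ENGINES = {"google": "Google", "yahoo": "Yahoo", "msn": "Microsoft", "live": "Microsoft"}

def engine_share(log_lines):
    """Return each search engine's share of referred visits,
    given lines in Apache combined log format."""
    counts = Counter()
    for line in log_lines:
        # In combined log format, the referrer is the second-to-last quoted field
        quoted = re.findall(r'"([^"]*)"', line)
        if len(quoted) < 2:
            continue
        referrer = quoted[-2]
        host = re.match(r'https?://([^/]+)', referrer)
        if not host:
            continue  # direct visits show "-" as the referrer
        for key, name in ENGINES.items():
            if key in host.group(1):
                counts[name] += 1
                break
    total = sum(counts.values())
    return {name: n / total for name, n in counts.items()} if total else {}
```

Run over a month of logs, this gives you your own private market-share chart, with no $100k price tag attached.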

Jeff Braverman at NutsOnline.com (a client) has gotten mixed up in something huge (David Braverman is pictured at left). We (my colleague, the humble genius Scott Perry) even ran an AdWords ad group for it briefly, sending visitors to this landing page. Now viewers are linking to the page from a wide variety of blogs and forums.

OK, I admit it, I'm having trouble piecing this together as I don't watch the show... but did I hear right? Did they show a "cliffhanger" episode, and then cancel the show?

The show is being cancelled. The last episode included a line with the word "NUTS," and by this time irate/loyal fans have sent seven tons of nuts to CBS.

Is it just me, or do investment banking types like their summers off? There has been an uptick in massive mergers and acquisitions of late in general. In our online ad industry, four major unclaimed pieces of the ad services and ad inventory pie have been snapped up. Yahoo took control of Right Media; Aquantive, owner of agency Avenue A/Razorfish and the bid management tool Atlas, was bought by Microsoft for so much money ($6 billion) that it counts as Microsoft's largest ever acquisition; mega-agency WPP bought 24/7 Real Media (among other assets and agency services, it owns the bid management technology Decide DNA); and starting the whole domino effect in the first place was Google's $3.1 billion acquisition of DoubleClick (which has a number of interesting assets in the ad serving and bid management field, but also owns an agency, Performics).

We're a long way from Google picking up tiny Sprinks (an ad system that mostly served customers placing ads on About.com) for low millions. But this example might help us better understand what's going to happen next. Google replaced Sprinks inventory with its own, eliminated Sprinks' unique methodology from the marketplace, and more or less gave the employees their walking papers (no doubt politely and amicably).

So my reaction to the recent acquisitions - particularly DoubleClick and Aquantive - was that it would throw our industry into short-term chaos, on a number of fronts. The diversified nature of the acquired companies meant that people and products would be moving around some more before they came to rest and re-formed altered relationships with customers.

In each case, I think the key question to ask is, what part(s) of the acquired company are the acquirers really buying? In spite of statements to the contrary, the acquiring companies do have plans to sell, eliminate, or drastically reorganize big chunks of the companies they've acquired. This is plain.

I checked out industry reaction, both by polling some industry insiders for their viewpoints, and by reading some of the commentary. Here's a selection:

In MediaPost, Mark Simon pointed out that a bid management tool like Atlas (remember, owned by Aquantive which is now going to be owned by Microsoft) is used to manage a large number of high-spend search accounts. Atlas has a lot of detailed data about search campaigns, particularly those run on Google. Great competitive intelligence, right? Too great. Simon followed the logic to argue that there's no way Google won't block API access from bid tools owned by major competitors. I would take this a step further to try to imagine exactly how Google will reorient its policies so they don't seem discriminatory. Let's make that the next bullet point...

I believe, along with some others, that Google will study and redesign some of what they've acquired from DoubleClick (the DART Search bid management system) to begin offering sophisticated bid management in-house. Listen, they were on a path to implementing this anyway, and it was going to hurt a lot of third-party bid management software firms because we'd be able to get this functionality free within AdWords. That now looks like a certainty within six to twelve months; as with the Urchin-to-Google-Analytics transition, improved versions will likely hit the market (at no cost to you) in 18-24 months.

I'll shift gears and offer my take again. WPP and 24/7 is an odd one. They need the agency services side. The inventory and network parts seem out of place. The bid management tool will be ineffective or actively blocked by Google and Yahoo. So this means WPP drastically overpaid for an agency add-on.

Danny Sullivan told me that he sees conflict-of-interest problems:

"I simply don't see how either Google or Microsoft think they are going to be able to hang on to interactive marketing companies that are involved with gaining placement with their search listings. It is simply not compatible with trust for searchers or advertisers. Even if information isn't exchanged, the perception will be that it is.

"Overall, I feel like the acquisitions are a grand rush to build up interactive ad networks to rival, in particular, the contextual ad network that Google has already built and is mining. I especially understand the desire with Google and Microsoft to gain better tools (Yahoo actually purchased a real network). But in the rush to get the tools, they've gained a lot of other baggage they'll have to deal with, whether they like it or not."

Richard Zwicky of Enquisite considers the WPP acquisition to be particularly notable because it's indicative of a belated (panicked?) acceptance by traditional ad agencies of the interactive ads space and even the search ads space. Richard also drew attention to the Interpublic acquisition of Reprise Media and the "outstanding team there." Looking to the future, Zwicky predicts a rise in M&A activity targeting boutique online services firms, especially in search. But he believes boutique agencies need to become diversified boutique agencies, not mere one-trick ponies stuck on, for example, SEM or SEO.

Matt Van Wagner of Find Me Faster thinks "Microsoft will have to spit Razorfish back out or they will be in conflict with many of their advertisers."

John Krystynak of GotAds told me that Microsoft's purchase of Aquantive was monumental stupidity. On his blog he pointed out that they paid 14X revenues for what is essentially an agency, and "agencies aren't exactly known for pristine revenue reporting." He says that if he were an Aquantive customer, he'd be looking for a new agency right now. (But John, what about Atlas? That's the non-agency part.)

In a detailed post too long to paraphrase, Linda Burlison outlines potential chaos in bid management and media buying across the various competitors, particularly now that so many conflicts of interest have been created by these acquisitions.

I've just been briefly reviewing the history of Google and competitors in capsule form, as I write the new edition of the book. Kept me up late thinking about it, to be honest.

There is something so compelling about an open source search engine: maybe search can actually get better if it goes in that direction - tapping into distributed developer expertise. In non-public or low-scale settings, search engines like Nutch and its cousin Lucene SOLR have so much promise. And why not? It becomes "our" search engine that allows "us" to customize, while not being beholden to a particular overlord.

Some of that vibe, though, was what led to the Open Directory Project many years ago -- and what happened?

On balance, it seems to me that Nutch et al. (open machine algorithm) and Wiki-something are two very different approaches to the problem. Open source search in the traditional sense is open to a community of developers, and freely licensable. Wikified search is bound to be open in that looser, sometimes chaotically obscure or corrupt way somewhat analogous to the (problems and opportunities of) old ODP. Importantly, the Wiki concept still relies too much on people to produce content. This will not necessarily scale. It's useful for some things, hopeless for others. Another problem is that Wikipedia users won't necessarily be better at the production side than users distributed across many involved online communities. They might be worse.

This is a draft of some thoughts that might go into a book (below). A few older bits still need cleaning up. What are your thoughts?

--

Beginning life in 1998 as GnuHoo and then NewHoo, the Open Directory Project (ODP) was conceived as a competitor to the Yahoo Directory. The work was to be done by volunteer editors, and the end product was to be licensed to any portal or site that wanted to take advantage of the information. Doesn’t sound like much of a business? Well, it turned out to be a pretty good deal for the founders. The directory’s popularity led to its acquisition by Netscape, which was later acquired by AOL.

AOL became the Open Directory’s major distributor, but the directory was also licensed (at no charge to the publisher) in many other places around the Web. Google began using ODP data fairly early on, calling it the Google Directory. An innovative feature was Google’s use of an “overlay” technique, ranking results in a given ODP category in order based on the site’s Google PageRank score. This was illustrated with a green bar (on a scale of 0 to 10, similar to the way the info is displayed by searchers using the Google toolbar). This could have been a very useful feature indeed had there been more consistency to the underlying content in the directory. The so-called Google Directory still exists, but it has been completely de-emphasized in the Google Search user interface.

A couple of key Open Directory players, founder Rich Skrenta and marketing exec Chris Tolles, eventually moved on to a new venture: Topix, a sophisticated news search engine that competes directly with Google News. Topix is now 75% owned by three major media companies: Gannett, Knight-Ridder, and Tribune.

The ODP came under criticism for many of the same reasons Wikipedia is maligned in some quarters today: a lack of “professional” editorial quality control. The lack of transparency of site submission procedures to the website-owning public, and the huge variations in the degrees of disclosure of editors’ biographical information meant, for me, that this so-called open directory was far from it.[i]

The construction of a comprehensive high-quality human-edited directory remains an elusive (and perhaps now irrelevant) task. The ODP founders were correct in their assumption that a distributed model for vetting editorial recommendations was the only possible way to get a comprehensive categorized directory to scale with the growth of the Web. But they also oversold the value of human contributions insofar as even tens of thousands of these couldn’t scale adequately to cover the enormous explosion of online information – not as compared with improved search algorithms and search interfaces, to say nothing of the massive acceptance of the concept of online collaboration and a wider range of tools to support this. In the past I had come across a couple of alternatives to ODP; two notable ones were put forward by Steve Thomas (Wherewithal, Inc.) and Dave Winer (RSS pioneer). Both would rectify the problem of a fixed category structure being controlled solely by the category owner. They’d allow for collaborative taxonomy, so to speak. Ho, hum. Many of these seemingly radical critiques of ODP have become staples of today’s so-called Web 2.0 movement. Thus many of these early debates have been surpassed by growing acceptance of the need to develop technologies to subtly handle the “upstream” of self-organizing editorial output from many users, rather than a top-down (if seemingly democratic) categorization scheme.

Contributing to the organization and sharing of information has to seem fun or worthwhile, and much of the ODP community moved on to other passions. A spinoff site called ChefMoz – also a good idea – held little appeal for the broader public and proved that ersatz claims of “officialdom” for an open-source, .org-based human review site were grandiose; sites (mere websites!) such as Chowhound and Yelp now achieve considerably greater popularity pursuing virtually the same goals. The emergence of a range of ODP offspring proved that it was never really the open directory. It was a human-powered directory that chose to call itself “open.” (A similar realization will no doubt dawn on users of and contributors to Wikipedia, too. It won’t prove to be the be-all and end-all.)

“Humans do it better,” the ODP slogan, was proved wrong in the sense that algorithmic approaches such as Google Search won mass acceptance from users over and above hand-categorized, ostensibly quality-controlled directories. That said, new methodologies of tapping the so-called “wisdom of crowds” (such as Digg) have meant that the machine algorithmic approach isn’t the only winner in the marketplace for ranking and rating online content. And certainly, algorithms can’t create content as tens of thousands of Wikipedia participants have managed to do in their improbable construction of a huge online resource.

In the meantime, then, a whole world of user-built information sources has exploded on the scene, with Wikipedia and Digg leading the pack. Many of the pathologies (and opportunities) that bedevil (and excite) Wikipedia and Digg users today were endemic to ODP. In hindsight, this makes Skrenta and Tolles pioneers to a greater extent than perhaps they realized.

The Google Difference: A Third-Generation Algorithm

If Google hadn’t moved to fill the void left by its struggling predecessors, someone else would have. Scientists in various research projects were working on new ideas about how to rank the importance of web pages vis-à-vis a given user query. What Google did was to popularize some of the best emerging ideas about how to design a large-scale search engine at a time when others were losing momentum. Some of these ideas are so central to the task of ranking pages in today’s web environment that they were adopted in some form or another by all of Google’s main competitors (including Inktomi, AltaVista, and FAST).

The working paper that explains Google’s PageRank methodology, “The Anatomy of a Large-Scale Hypertextual Web Search Engine,” is frequently cited.[ii] But the field of information retrieval technology is rich with ongoing experimentation by hundreds of well-funded scientists, some well known, some not. Some scientists take a slightly different approach to the problem tackled by Page and Brin, organizing the Web into topic-based “communities.” Teoma, a search engine acquired by Ask Jeeves (now Ask.com), is the most public example of this approach.[iii] The two approaches tend to provide somewhat different results, but they are clearly cousins of a similar generation of thinking about the “hyperlinked environment,” and both have been a boon to researchers seeking that elusive piece of information online. In practice, algorithms such as Google’s and Ask’s today are really meta-algorithms, looking for “signals” on a wide and shifting spectrum of measures of quality and relevancy, while attempting to filter out or devalue huge volumes of junk and spam results. Today’s search engines might be clever enough to measure website usage patterns, background business data, and more. (One potential signal, the age of a website, is now seen as so matter-of-fact that search marketers have a nickname for the apparent difficulty in getting well-indexed in Google if you’re a new website owner: “The Google Sandbox.”) In addition to all that, there are attempts to determine user intent in search queries, to serve up personalized results or even different types of results (news search, maps, financial charts, weather) based on the user’s history or the nature of the query. In today’s mature world of search, no one methodology is billed as “the” best way of arriving at the ultimate ranking of results on a given search query. But arguably, Google consolidated its lead in search based on the mythology that its PageRank system was an invention that led to brilliantly accurate search results.

In any case, the idea behind PageRank was brilliant and intuitive when it was brought to market in 1998. The governing principle revolves around a map of the linking structure of the Web. Pages that have a lot of other important pages pointing to them are deemed important. “PageRank can be thought of as a model of user behavior,” wrote Brin and Page. “We assume there is a ‘random surfer’ who is given a web page at random and keeps clicking on links, never hitting ‘back’ but eventually gets bored and starts on another random page. The probability that the random surfer visits a page is its PageRank.”
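The random-surfer idea translates directly into a short calculation. What follows is a toy sketch – not Google’s implementation, just the textbook power-iteration form of PageRank with the customary 0.85 damping factor – showing how a heavily linked-to page accumulates rank:

```python
def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links to.
    Returns the stationary probability that the 'random surfer'
    is found on each page."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        # The (1 - damping) term models the surfer getting bored
        # and jumping to a random page
        new = {p: (1 - damping) / n for p in pages}
        for p, outs in links.items():
            if outs:
                share = rank[p] / len(outs)  # split rank among outlinks
                for q in outs:
                    new[q] += damping * share
            else:
                # Dangling page with no outlinks: jump anywhere
                for q in pages:
                    new[q] += damping * rank[p] / n
        rank = new
    return rank

# Pages a, b, and d all point to c; c points back only to a
web = {"a": ["c"], "b": ["c"], "c": ["a"], "d": ["c"]}
ranks = pagerank(web)
```

On this toy web, page "c" ends up with the highest rank because three pages "vote" for it, and "a" comes second because the important page "c" votes for it – exactly the important-pages-pointing-to-important-pages intuition.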

This was a significant advance over previous generations of web search. Although most major engines had experimented with a variety of ranking criteria, many of them had depended heavily on basic keyword matching criteria. Not only did this make good information hard to find because so many pages were locked in a virtual tie for first place, it made it easier for optimizers to feed keyword-dense pages into the search engine in a bid to rank their commercially oriented pages higher. Although this game of keyword optimization is quite effective to this day in ranking pages well on unpopular queries (even on Google Search), it seems to work rather poorly on common queries.

The ascendance of PageRank means that on a Google Search for auto insurance comparison, for example, it’s likely that a well-known site will rank well here rather than some random site that just happens to contain those keywords. When I tried the query, I saw a number of leading insurance comparison sites, and very little “junk.” This dovetails with the notion that authoritative recommendations do indeed confer authority as far as Google’s algorithm is concerned. But it won’t take you long to find a few head-scratchers in the mix. It’s difficult to get a monolithic sense of which types of pages rank well. But few would dispute the fact that a high volume of quality links pointing to one’s site is a great way of getting Google Search to treat you well. PageRank isn’t dead, it’s just part of a bigger mix of factors than ever before.

The ability to break all these “virtual ties” among similar search results was a breakthrough for search engines. Almost all major search technologies today are significantly more sophisticated than those from the mid-1990s. I recall a time when many websites used a free licensed version of Excite Search for their internal site search. The technology was weak, often providing a clutter of irrelevant results. If search was this bad in closed corporate environments, it was definitely in need of improvement if it was to help users sort through the enormous clutter of pages available on the Web. For searching relatively fixed data sets, such as finding pages within a single website, today’s technology is significantly improved over yesteryear’s. The open source movement has even brought us libraries of sophisticated search engine code (such as Lucene SOLR), meaning that a powerful small-scale search engine can be customized at a reasonable cost.

A public web crawler in the same family, Nutch, has gained notice as well. A free, open-source web search technology in 2007 is nearly as sophisticated as the industry-leading search engines of a decade ago, which were valued in the hundreds of millions of dollars, but it’s still far from beating Google at its own game. Why? Nutch – like many other search technologies – doesn’t scale as well. In the understatement of the search engine century to date, the Nutch founders write: “Much of the challenge in designing a search engine is making it scale. Writing a Web crawler that can download a handful of pages is straightforward, but writing one that can regularly download the Web's nearly 5 billion pages is much harder.”[iv]

It doesn’t stop there. Taking those billions of pages, now you’ll have to assess them all and determine how much authority each link on each page should be allowed to “pass on” to other websites and pages. Because some site owners will be up to no good (premeditated linking schemes), or simply because fortunes change, the map of how much authority (or, what type of authority) is conferred by all hyperlinks on record is going to need to be updated regularly. A web search engine must also be able to sort out “duplicate” (often stolen or “scraped”) content from the original content, so it doesn’t end up giving visibility to the wrong source. The calculation of link structures and associated authority weights alone – let alone getting the underlying approach to how to do the calculation right – is beyond the capacity of any small-scale search engine infrastructure.

Beyond massive computing power and indexing technology, then, Google’s advantage continues to rely in part on the ability of PageRank and other related technologies to sort out valuable information from information that “dumbly matches” the user’s query. Want proof? Do a search on your favorite topic at Technorati.com, the blog search engine. It’s powered by Nutch. I’m betting you’ll find quite a number of “spammy” results in the mix, in spite of some recent tinkering with a weak cousin to PageRank, an “authority score.” What’s surprising is that Google’s own Blog Search also appears easier to flood with duplicate content and spammy sites than its main search index.

To be clear, the calculations involved in determining PageRank are just the beginning when it comes to determining how high a page ranks for a given user’s query on Google....

[ii]. Sergey Brin and Lawrence Page, “The Anatomy of a Large-Scale Hypertextual Web Search Engine,” Stanford University Department of Computer Science, 1998. Jon Kleinberg, widely considered to be the leading contributor to this generation of search technology, has published many important papers on search, including “Authoritative Sources in a Hyperlinked Environment,” 1998.

[iii]. For a user-friendly overview, see Mike Grehan’s interview with Paul Gardi, “Inside the Teoma Algorithm,” July 2003, archived at e-marketing-news.co.uk.

For some reason Dave Winer's blog doesn't seem to have the readership you'd expect. Many of those who are unabashedly self-referential - like the many others using "social media" to talk about social media - have done very well for themselves, traffic-wise. I guess this may just prove that powering, fostering, and promoting a whole generation of communication may or may not make you fabulously wealthy and popular.

Related to that general subject is the matter of whether you see the communications environment as one big "coral reef," to be treated with (semi-commercial or noncommercial) respect.

Feedburner is part of the RSS coral reef. And rumors say they're selling their piece of the reef for $100 million to Google.

The danger is that Google is a super-power, and coral reefs depend on harmony and no one entity being too powerful. Such an entity might disrupt the fragile ecology of the whole reef. Of course they'll say they won't, but...

In the type of corkscrew-like irony we've come to expect from media who report on other media...

I was looking forward to seeing the finished version of Ilana DeBare's SF Chronicle article after she called to ask about how local businesses can use paid search to best effect. In effect, "paid local search" options compete directly with mainstream newspapers' ad models. Remember, we heard recently that the SF Chronicle itself is struggling, and laying off considerable numbers.

I liked some of what I saw in the article, but other tips seem to talk down to small businesses a bit. Do we expect too little of the "local" business? Some of the "local" businesses I frequent do tens of millions in business; others are at least in the low millions. That's more than many broad-based, but not very viable, online entrepreneurs are able to achieve. We're not talking about a hot dog stand all the time.

It's a bit of a myth that local businesses who don't do "online" business are ill-advised to pursue paid search. And if they don't, is there any business with a website that would find it particularly onerous to build in at least one decent measurement of buyer intent, such as a lead form, appointment form, etc.? Sure, many will just phone the hair salon. But Pure MedSpa takes online information requests and appointment booking requests, and so should many of their competitors, even if they only have a single location.

Obviously, if you're watching every penny, you don't want to burn money needlessly. But from what I've observed, in the chaotic world of the local and partially virtual business, a bit of mad money probably can't hurt in goosing your online presence. Take Shelly Purdy, a local jewelry designer in Toronto. These are medium-priced items, but by no means small-ticket. Now that e-commerce is enabled on the site, you might think - great, it would have been silly to buy paid search before, but now it's a good idea. But what if Shelly has a competitor who really doesn't emphasize online jewelry sales? What if that competitor is doing $2mm a year in sales, and spends a couple hundred thousand dollars a year on various promotional methods, including craft shows? Would it be so wrong to set aside $10,000 for paid search keywords, targeted tightly to a small geographic area, aimed at driving visitors to the showroom, a special event, or special promotion? But it's unmeasurable!! Gaahhh!! $10,000 out the window!! Hmm, $10,000 isn't all that much considering all the other unmeasurable marketing they already do.

So in my opinion, it's high time we raised the bar for "local" businesses and expected them to study the online targeting options in more depth. Saying "paid search isn't for you if you don't have a clear online outcome" is giving lazy people an out. Perhaps some of those online outcomes need to be built or pursued.

Similarly, I'm not too impressed with the notion that a business-to-business advertiser might not want to bother with online because the old-school network of "business cards" and "rolodexes" is the ticket. Face-to-face is cool, but this is 2007, and people search. Want to rake in more business than the other junior commercial real estate brokers who are relying on the old traditional methods? Get visible online. Take the time and trouble, and reap the reward.

So in the spirit of high expectations, here are at least four things, out of the long potential to-do lists, that ought to be recommended to small or local businesses studying ways of improving their online presence:

Google Local Business Center is free. People increasingly use Google Local Search, and maps, so you want to show up here. Get a listing. Upgrade it if you know how. And wait, did you see that Google actually facilitates the process of creating a coupon so you can see if your local paid search ads are working? So much for excuses about "unmeasurability"!

Make sure you're covered on key local vertical sites. If Toledo.com is important to your local audience, get visible there. Just make sure you understand what you're paying for.

Be aware that your reputation is going to be affected by how you appear on the latest generation of local business search and consumer review sites: Yelp, InsiderPages, CitySearch, Judysbook... you get the idea. Figure out whether free "claimed" listings will enhance your image. Consider upgrading your listing for added visibility.

Do conventional, but localized, online and offline public relations (in a savvy way). This is especially great if you have a website and you can get folks to point to it. For this, maybe find a local boutique PR firm. Again, perhaps not for the very small or very boring, but if you're very small and very boring, why am I even writing about you?

A longer discussion: at what size of business is it viable to have a quality website designed for you - and what type of website? What functionality is required? Should you blog? Twitter? Listen to your inner voice here as you study what you can. There are tens of thousands of restaurant listings in my metro area. Do any of the owners blog? Twitter? I really don't know. Will they in the future? Don't know either. Is it vitally important for them to do so? Well, no, but culturally, personally I'd take some guerrilla vlogging from a head chef over a canned flash video treatment any day. But culture is culture. Most of these businesses should be mastering the basics rather than worrying about the bleeding edge stuff.

Having your own cyber-business opinions helps. Why abdicate your online personality solely to some indifferent web design firm that is expected to give you an "online presence"? The concept of an online presence sounds pretty static to me. Nope, you've gotta take ownership of this stuff and work with that firm - if you can find a good one. So that leads to a broader question: can you cost-effectively outsource a good portion of your marketing efforts if you're a small local business? Yes, but I think you'll need to get lucky. There are probably 1,000 people on the planet qualified to help you without doing counterproductive stuff. It's tough to find them.

Or is it storming the tunnel? Having coffee with some pals at BCE Place, I noticed this video ad for Search Engine Strategies Toronto 2007. These are showing on screens throughout the PATH, the interconnected series of underground concourses that connect pretty much the entire downtown. Sorry for the shaky-cam. Blame Starbucks Tall Bold.

The next day, to convey the proper statesmanlike image as Chair of this conference, I decided to hold a baby. However, possibly because the coffee had worn off, I not only slept like a baby - I slept while holding a baby! My neighbor looks worried.

In this wave of the Internet economy, investors are self-consciously considering investing in plays that depend on "user-generated content." In the past, Internet investments were more haphazard. Today's discourse is much more sophisticated.

So naturally, in discussions of the viability of burgeoning online communities, a big question is: why would anyone contribute? Why would you upload a video? Why would you write a product review?

Certainly the puzzle of how to induce someone to "give up" valuable information and to share and contribute their ideas is an interesting one. In spite of the apparent growth of the so-called "pro-am" movement, the impulse to contribute isn't universal, so maybe it's unnatural? Unlikely? Unrealistic to expect it from people? I guess that depends on how you look at it.

If you're too much of a skeptic on this question, of course, you assume no bonds of reciprocity; no impulse towards gossip; no sense of duty; you in fact assume that we do not live in a society.

Now if you're the victim of a survey research call at dinner, you might answer as the Alberta farmer did when I was making those putrid calls many years ago: "I've worked hard for what I got, and I'd like to keep it that way." Answering a few questions about attitudes towards the environment, and who he'd be voting for in the next election, would be tantamount to theft, in his mind.

But what if you're part of a viable community, rather than just being interrupted at dinner?

What if John Rawls, the last century's pre-eminent American philosopher, was right? That we have a "sense of justice" necessary to be "fully cooperating members of society"? To figure out why any society would share values and information to the extent that they agree on how power and communications roughly work (e.g. shared signals, common laws, etc.), philosophers like Rawls made certain assumptions. In some way these were based on modern empirical reality, given the structures of laws in most modern societies.

Rawls also defined the sense of justice as "the capacity to understand, to apply, and to act from the public conception of justice which characterizes the fair terms of cooperation." The fact that Eliot Spitzer brought major business leaders to account says a lot about the sense of justice that an Attorney-General must have if we're to live in a just society, but the broader point is that his work was by and large endorsed by society.

Many online peering and sharing activities don't require a sense of justice, but some do. There is something else at work in the reciprocal uploading of music and other content -- it doesn't require a "sense of justice," precisely -- but it's probably part of the same family. It is, at least, a sense of reciprocity or mutuality.

On RateMyProfessors or HotOrNot or something else entirely, the activity's a bit more frivolous than a "sense of justice" would require, but again, it's in the same family.

Writing book reviews on Amazon is part of a more mature - if often unfair - process of contributing to that general social discourse. This is getting closer to what I'm talking about.

There are a great many communities that have yet to be built online, that look a little like some of the things we've seen so far, but simply aren't yet as mature. Consumer Reports is published out of something like a sense of justice and is consumed mostly passively. These kinds of impulses will soon take a more contemporary, distributed form online, a more mature form than a click on a "Hot or Not" button.

Recent news items about Optionable, Inc., a trading firm somehow mixed up in a natural gas trading scheme with a much larger partner, BMO Financial, indicate that a habitual financial "whistleblower" began noticing unusual trading patterns in the futures markets. As she had done on numerous occasions in the past, the woman, a 51-year-old president of an investment club, began sending notes to regulatory authorities alerting them to the problems. One response might be to ask what's special about this woman to cause her to have such an elevated sense of justice; moral outrage, even. Then again, if she plays in the same markets, she also has a vested interest in a level, transparent playing field. I'd like to think there's a little of that in all of us. Whatever the case, you can't argue with the fallout when the information inevitably gets out. Optionable's stock has fallen from $9 to 50 cents, and the company is now embroiled in scandal.

As part of that discussion, usually someone will chime in with "I hate LinkedIn." Various reasons are given about cheesy insurance salesmen types trying to ride on one's coattails or intrude on one's "business privacy."

Meanwhile, though, the same folks are proudly talking about Facebook.

I think I see what's going on here. 45 is the new 22. Isn't it!?!? Please say it is!!??

(The population is aging, after all. If a lot more people are turning 70 and 80, young is, well, a lot of people under 50. Just the other day, a neighbor announced that she thought I was an "angry, frustrated young man" for a minor jolt I gave to my own fence while parking my car. I was flattered!)

God forbid anyone be seen to be actively networking or doing their job. Talk about a dork-fest!

But being on Facebook - this will be cool, for another 38 weeks.

One famous graybeard in the blogosphere actually announced that he couldn't use LinkedIn in his company because he'd be "made fun of". Self-esteem issues?!?

Personally, I don't hate LinkedIn. But that's because I'm not under the impression that the tools I use for business define me. Although, I will grant you, I'm definitely not sending people emails from an AOL address using large, orange fonts.

It can't be news to Google that the ads have to be more relevant to some users than the adjacent organic results, at least some of the time, or Google's main cash cow is kaput.

Luckily for them, they've been thinking about it a long time.

Would I rather see something like:

or...

At the very least, it's not a slam dunk either way.

You get the feeling Google has thought a fair bit about the relative attractiveness of the organic and paid listings on commercially-oriented queries.

Users, not me, not Google, have to agree, or they'd be out of business. But Google can do plenty to gently tip the balance towards the ads, to ensure that Jakob Nielsen's "box blindness" scenario (now four years old!) doesn't sink the company. Part of that is how you regulate and display the ads. But surely another side of it has to do with assessing the attractiveness of how the organic SERPs are displayed: placements, usefulness of text snippets, and yep, even what counts as "relevancy." In the most generous interpretation, Google has it neatly bifurcated so more commercially-oriented searchers get what they need, while informational searchers also get theirs.

To those of us who have been toiling away trying to explain the benefits of targeted, measurable, interactive advertising: Bill Gates is on our side.

He pulls no punches, and believes that the shift will be ahead of schedule, and not smooth.

Yellow Pages: doomed.

Newspapers: doomed.

Traditional TV ad models: doomed.

And all sooner than people think.

While his linebacker, Steve Ballmer, might have said as much already, the point is important, and worth driving home. An excerpt:

"When you say something like 'plumber' the presentation you get will be far better than what you get in the Yellow Pages," Gates said. "After all, we know your location and so we can cluster [results] around that. ... Yellow Page usage amongst people in their, say below 50, will drop to near zero over the next five years."

Not surprising. On a related note, I'm pretty sure that the formats offered by your web analytics reporting affect not only how you think, but how you discuss among yourselves in your company. In many companies -- admit it -- discussion revolves around the easiest reports to generate. Shouldn't it revolve around the information that relates most closely to profitability and to your key user behavior goals?

For most analytics packages in the past, the effort required to implement segmentation often meant that you made a conscious decision not to think about certain things until someone in your company set it up so you could get the information you needed.

Speaking with Brett Crosby of Google yesterday, I was delighted to hear about all the new features that are being rolled out in the revamped Google Analytics. One big one is that they'll offer by default more reports that online marketers typically want, rather than forcing you to go in and customize in order to generate those reports. As Brett said, the old version used to be a bit like a 747, and the new version, well I forget the car he used for his example, so let's say a 2008 Subaru Legacy GT. Pretty easy to drive, but if you want to impress, there's a sporty leather-wrapped steering wheel just so you feel tough and quantitative like you did before.

This will take some time to digest. I'll comment more extensively as I get comfortable with the platform myself.

Great column today by Greg Sterling, jumping off a Pew Research study. Most of the world is not as enamored of connectivity tools and Internet trends as the "breezy attitudes about user behavior" espoused by the cadre of Web 2.0 architects tend to assume. User behavior is quite varied across the spectrum so we need to take a "nuanced" view of it.

Andrew Coyne is on the lineup of speakers at Mesh, Canada's Web 2.0 conference. The last time I met Mr. Coyne in person was ten-plus years ago, at a social policy conference at Mount Allison University in New Brunswick. (This was before blogging, so it's not fair to share some of his impolitic barroom banter.) I was giving a theoretical talk on the role of societal input into the policy process. The right-leaning Mr. Coyne took the trouble to stand up and critique the paper, for which I thank him. Through no influence of his, by the time my research was done a couple of years later I had developed serious reservations about aspects of my earlier positions. I discovered that expert research and opinion, as opposed to unreflective citizen participation, was the real orphan in contemporary governance trends -- generally under-represented at the table. Novice "democrats" were often undermining their own positions by attacking those with substance and insisting on open processes that invited chaos and excessive degrees of money influence into public decision-making. They didn't foresee, also, that the very shape of "forums" could be manipulated. They can be opened, closed, and reshaped. Or simply designed not to work. Contrary to some fashionable theories at the time, I concluded that the failures of some governments under study didn't have to do with a dearth of trendy internal democracy or a lack of (what we now call) crowd wisdom, but simply a lack of intelligent decision-making and a dearth of fresh policy of any recognizable sort at the end of the day, let alone any talent for implementation.

I don't know if Coyne or I have official positions on these matters now, but ten years later, the discourse has shifted to how online collaboration affects business and politics. It's been a long journey. I daresay longer for me than Coyne, but I'll wait to hear what he has to say.

Thanks to the evolution of forms of online collaboration, today's concepts of wikinomics and crowd wisdom are extremely important to economic development and sound decision-making when applied properly. But similar to naive interpretations of "participatory democracy," there is too much hype around them: they don't solve problems requiring depth and persistence, though they do provide far superior data inputs in a whole range of endeavors. So like democracy itself, wikinomics is an explosive force that is, by itself, nonetheless insufficient for excellent results, in whatever sphere you're operating. Crowd wisdom is a misnomer. Experts have wisdom. Crowds have more and potentially better data inputs. Ignore these, and efficiency plummets. But tapping them willy-nilly is just annoying fashion, and attracts all sorts of yahoos and tinpot despots.

I realize I'm not the target demographic, but apparently in the name of trying to appeal to youth with "short attention spans" who "want it now," the advertising that continues to be churned out by those ad people just floors me.

Before I say this, a painstaking reminder that I am not the Church Lady and am not easily offended. But!

The premise of a TV spot for Amp'd Mobile is, roughly: "entertain yourself."

It portrays the fantasy world of a twenty-something bus rider (I assume the idea is to appeal to people 3-8 years younger than him). He tells a big hulking guy and a defenseless old man to "fight each other". They begin going at it, and the big guy tosses the elderly one around like a rag doll. A pathetic-looking bearded chap is asked to turn the radio up louder. Fantasy boy then tells a large African American woman to "shake her junk," which she obligingly does. Then the bus driver slams on the brakes, as requested. Everyone goes flying.

Maybe ads have too much violence and ridicule in them, and not enough sex, for my taste. :) Or maybe they just aren't uplifting or fun, which they could have been. Racy? A big guy whupping an elderly man's ass? Huh? That's just unglued.

Or maybe it's a more general phenomenon: much advertising just remains nearly impossible for 80% of us to watch, but it's broadcast to all of us... so it's grating. The problem there is (as we think about how much more targeted the ads need to be in future): it's not just about "I don't want that product." (I might, after all - wouldn't a lot of people want a new type of mobile plan with video content?) It's that I don't want that type of advertising. It's a distinction that's hard to put your finger on: the line between funny and just plain mean.

Does more than 2% of the buying population actually want it or respond to it?

It makes me wonder why they continue to create it. This guy appears to agree with me. I'm sure there are many others who don't. My take? Not complicated. To Taxi, the ad agency that made the ad, I'd say: make less money if you have to, and refuse to sink this low.

Looking for a white hat SEO or SEM firm to help out, company X lists their URL in their request for proposal. The site, specializing in heating & air conditioning products, has a panoply of ugly-ass, older-generation link farm links at the bottom. They're way off topic. "DSL brokers." Etc. Obviously the poor company has been hoodwinked by a link farm vendor.

But that red flag is going to make it hard for them to find a good vendor. They probably need to be cleaning up their site of their own accord, lest reputable helpers shun them like the plague, wondering what other skeletons may lie in the closet.

If a potential vendor is this shy about your home page, imagine how it must look to a customer. Conversion rate woes? Think about how credible you look to an unbiased third party... or even a biased one!

We're still trying to digest the relative impact of the DoubleClick (Google) and Right Media (Yahoo) acquisitions, and to piece together their current and future impacts. There is already some speculation out there that Right Media specializes mainly in junk social media ad inventory -- if so, that's something for Yahoo to fix post-acquisition, or perhaps it's the part of the inventory pie that was available, so not to be sneezed at, especially not by the producers of large-scale junk social media ad inventory. (I do worry when you're using phrases like "non-premium inventory". In the past decade or so, non-premium online ad inventory has run the gamut from banners in less-visible positions, to forced page views. In other words, from near-worthless to fraudulent.)

ContextWeb seeks to broker both high-end and remnant inventory more efficiently by allowing publishers to communicate their ask price. ContextWeb is actively fighting the industry tendency to relegate publishers to "remnant" status by default.

The role of an independent growth player in the marketplace is a real wildcard. We know that product quality hasn't been the main reason Google won the PPC wars (reach was), but quality was a real catalyst for pulling away from the laggards. ContextWeb's success will depend on the coolness of its platform (I'm confident of that) and the size of the marketplace it's able to build through business development and self-serve publisher signups.

Owning the premier properties seems to be the most favorable place to be here, given the emergence of rough-and-tumble competition among robust middlemen. Yahoo would really love to acquire Facebook, wouldn't they? So Right Media sounds like a consolation prize.

Compared to some of my competitors, who tend to specialize, my firm has probably worked for the most eclectic client list in the search marketing business. In my view, it's important to see potential success stories in all industries. (I've certainly also seen plenty of high-risk or low-potential projects in RFPs that we are shy about handling.)

Like any consulting company we certainly "cherrypick" the projects that seem to have the potential to be workable, but beyond that, we've experienced a wide diversity of expectations and management styles, from the tiniest to the biggest companies.

Preamble: Yesterday somebody forwarded me a saying: "unrealistic expectations are planned resentments." Example: a company with an ROI or cost-per-order target that is better than anything we've ever seen in the industry. If costs are rising because it's a competitive auction and new players haven't yet stopped entering and driving costs up, we'd be crazy to promise to hit unrealistic ROI targets short term. What we can do is implement better testing protocols or work on fundamental changes to messaging, or a dozen other things, but we can't just draw a tiny speck on a wall and hit it with a dart from 100 paces. One way to guarantee pristine ROI numbers, conversion rates, etc. is to reject growth and to focus only on a few keywords and a few slam-dunk customer relationships. That's why aggregate conversion rates aren't as meaningful as folks think. Volume matters too.

Another example of unrealistic expectations that constantly get built into our (paid search) industry: some folks can't resist touting the lowball bid strategy. We have a few of those accounts working pretty well, but that's sort of because there is demonstrably high quality content or products at the other end of the click. Beyond a few remaining skilled players, the belief in a lowball bid strategy making you rich is like waiting for Christmas on Jupiter while taking LSD. The hallucinations are great while they last, but then you start shivering uncontrollably, and Christmas never makes it to your planet. By all means play around on your own time, but if you work for a company, budget realistically for a media buy based on what "high quality" competitors are likely to bid to stay in the top 4-5 ad positions.

Moving on.

Let's start with a broad premise about process: true process is good and leads to efficiency and better communications. Results, the end goal to which we're all oriented, aren't achieved by simply stating goals, although it isn't bad to have targets.

I'll further break process down into two types: technique process and communications process. And hey, I'm not trying to write a textbook - I'm sure there are many better thoughts on this. It's just a blog post.

Broadly paraphrasing theories of the firm, I'll suggest that process has a price and that by and large bigger companies can afford more of it, but no matter what the size of your company, you need to worry about overinvesting in this costly good, just as you need to worry about ignoring it entirely. The decision to outsource comes from inadequate in-house resources in technique process (as well as its higher-order cousin, domain wisdom). That decision now creates slightly higher costs and requirements in the area of communications process. But the key is that results in a competitive, creative field are generally only achieved with a long-term focus combining technique process and domain wisdom. Results orientation is actually assumed. No one in our industry is being paid to watch a clock or to look busy.

Imagine four golf "head coaches," and Tiger Woods. (There is no such thing as a head coach in golf. The golfer himself is the head coach. Work with me.) We all know that "shooting 65" is the goal. That's a given. Coach #1 - we'll call him "Hockey Mom" - nonetheless assumes that constantly reminding Tiger to hit his "birdie targets," that he "needs to sink this 30-footer," or that he "shouldn't have missed that four-footer" is the sum total of what his supervisory role entails. Tiger doesn't improve.

Coach #2 - we'll call him Sarge - realizes that improved technique, including fitness fundamentals, will improve Tiger's overall game. He follows Tiger to every fitness session, logging every activity, suggesting very small changes to his situp and treadmill technique. He fires the swing coach and begins changing Tiger's hand positions. He also tells Tiger what clubs to use; not a small point. Tiger benefited from that type of minute help when he was a small boy; Earl Woods' exacting practice regimens helped make him what he was. But now, he's Tiger Woods. His game soon goes south on the over-processed regimen.

Coach #3 - we'll call him Big Company Guy - doesn't meddle in the exact performance of the situps, but does expect an elaborate reporting regime. Instead of asking Tiger if his body was responding well to last week's agreed-upon workouts, he asks Tiger for color bar charts, heart rate and blood pressure readings, and so forth. Unable to process the information Tiger submits at first, he asks for a different kind of presentation, as well as an hour-long recap and explanation of the workout program and explication of the reporting documents. Various colleagues join the conference calls, and suggest additional statistical reports for next time, in case they might shed light on the fitness regimen. Unfortunately a few of these meetings and reporting requirements make Tiger late for a meeting with a sponsor, and in one case, he gets to the course only 30 seconds before tee time.

Coach #4 - we'll call him Goldilocks Excellent - places heavy weight on technique process in determining outcomes, just as Tiger himself - as a motivated, recognized professional - does. He manages that process and understands the purpose of various elements of the training program. When Tiger's scores are subpar, he doesn't assume Tiger "failed," but wants to know whether it was just an unlucky day (randomness, missed putts) or something specific that might have gone wrong in terms of deviating from sound process. He spends not too much but also not too little of Tiger's and his own time in the necessary communications process. Tiger is held to exacting standards, but is also facilitated in pursuing his own, self-motivated exacting standards. Goldilocks Excellent gets 2% of Tiger's winnings (less than Tiger's caddy, Steve Williams, but still pretty good), and becomes wealthy by bringing out the best in Tiger.

Now that I've offended just about everyone, let's try to put it back into dry economic terms.

The smallest companies are at a disadvantage if all they look at is short-term results. Larger companies do this too, but very small companies are particularly susceptible to it, because they lack capital. It's pretty hard to intellectualize getting kicked in the shins (or worse) for weeks on end in the name of some long-term goal. (Again, The Dip by Seth Godin seems like it's going to address this, and no, I haven't read it just yet.) Results, though, are just a common, assumed goal. The real key is to have enough of a plan and enough resources to devote to technique process and domain wisdom. Often, the smallest companies don't, so they get killed. Making a few extra mistakes -- say, overinvesting in communications process or telling Tiger what golf clubs he should use -- can throw the economics completely off course.

Big companies can afford to make many more mistakes. Where they thrive is in executing long-term plans based on domain wisdom coupled with technique process. Whether these are found in-house or outside the organization is not as important as some may think. This comes down to cost and timing, more or less. Contracting relationships abound, so every day, many calculations are made that point to a more efficient outcome by using a specialist. Otherwise, every company would have its own shipping department, web designer, auto mechanic, and massage therapist. Yep, some have all those things, but those company names tend to be Chrysler or Google. Very big companies with unique all-encompassing cultures.

Big companies generally overinvest in communications process. It's a quirk of corporate culture. Internally, this makes sense because the cost is absorbed into a larger entity. But if the outsourced expertise is a much smaller entity, the cost of communications process too easily crowds out the technique process. But while annoying, most larger companies' overinvestment in communications process is not economically harmful to them.

Either way: in spite of small company romanticism and some major advantages of nimbleness, big companies have many levers in the marketing of products and services. Smaller companies have to get it exactly right. They also have to bite the bullet and invest money they don't have into processes and medium-term planning exercises needed to compete. No wonder successful small businesspeople are considered heroes.

Final point: none of this stuff seems to matter on the days Tiger shoots 65.

The Rimm-Kaufman Group posts the latest share of spend across their client base in paid search. Big retail campaigns are spending 73% of their budgets with Google, 22% with Yahoo, and 5% with Microsoft. Plus or minus 3%, and factoring in the aggravation of dealing with the editorial quirks of the third-place player, this roughly equates to search market share, so it comes as no surprise. Google also likely gets an extra 2-3% bump based on its better-developed contextual program (managed to strict ROI objectives).