February 2006

23 February 2006

This morning I tested Google Pages and, while I was underwhelmed, I can say that it looks like the first step in ... some direction. The only thing that makes me wonder whether it will remain anemic is that Blogger's feature set has stayed underwhelming while every other service continues to improve.

I am beginning to feel that Google is primarily interested in creating new subdomains and less interested in extending existing services. I was a happy Blogger user shortly before and for a while after their acquisition -- but I left long ago due to lack of innovation.

If Google Pages is the first glimpse into a global CMS or a potential publicly viewable, group-editable wiki, I'm all over it. If Google Pages freezes at this feature set, I'm bracing for the flood of spam from spammers using Google Pages for nefarious ends.

Nik Cubrilovic has written a longer review of GooPages. He and his commenters also echo my issues with bugginess. Having said that, I would point out that our love of eternal beta should be accompanied by tolerance for bugs. Eternal beta does not mean eternal rapid wish fulfillment.

22 February 2006

Nik is co-hosting for Michael Arrington while he's on vacation. Today he talks about Blogbeat, yet another blog stats package. This is in the wake of Google acquiring Measure Map.

I find myself perplexed by all these blog stats sites. MeasureMap was certainly pretty and had some nice UI innovations, but in the end delivered the same stats information you can get from any stats package and was nowhere near as comprehensive as Google's existing stats package, which they also acquired.

Blogbeat will cost you $6 a month to peer into your stats with the same level of error, presumably, as other sites. I have several tracking systems attached to this blog. They are never in agreement. I simply can't see spending $6 a month for off-server stats tracking. Can someone tell me why I'm wrong about this?

If it's a Web 2.0 evolutionary development thing like I blogged about yesterday ... maybe that's okay.

21 February 2006

Periods of innovation usually come on the backs of adepts -- those who form a cabal, focus strongly on an area, and pull out the arcane secrets. These adepts can get quite mystical about their craft, sometimes scaring or alienating the general public.

Web 2.0 is no exception.

Dion Hinchcliffe and Adam Green have both commented on the growing "walls of confusion" surrounding Web 2.0. Dion lists these as buzzwords, hype, complexity, ignorance and significance. These are classic innovation elements. Brain surgery, the electric lightbulb and farming are all filled with these five elements. So, while I'm certain these may perplex people who haven't yet joined the Web 2.0 conversation -- I wouldn't say that these walls are specific to Web 2.0.

The Web 2.0 community is made up primarily of wonks, geeks, enthusiasts, and so forth. In short, we're a bunch of ham radio operators, complete with the weird glimmer in our eyes when we passionately tell people that there is nothing in life more spiritually fulfilling than our ham radio (RSS).

The world of Web 2.0 right now is highly fragmented, with a million billion points of presence. Lots of little applications out there that do one or two things. Most people don't have the time or the interest to keep up with that. They look at Web 2.0 calendars and say "Outlook does a lot more than that", they look at Writely and say "Word already does more than that", they look at del.icio.us and say "What the hell is that?"

So, at present, Web 2.0 has a stylized face that only a mother could love.

We, in the community, are really excited about it. But just as most blogs are about blogging - most Web 2.0 apps are currently about web 2.0ing. They want to show that they are super Web 2.0y so they can be flipped.

Dion has argued and I have grudgingly accepted that this might not be a bad thing. The Web 2.0 cauldron of applications bubbles out a few gems which are picked up by a huge corporation. That's where public acceptance would logically occur. Maybe one or more Web 2.0 startups will become a huge corporation on their own - but most successful ones would be bought out.

And why is this? Context. Yes, it's context again! At present, Web 2.0 apps serve sliver needs and usually need to be introduced into a larger context for the public to grasp.

To overextend the metaphor, Web 2.0's Walls of Confusion surround its ivory tower. Inside the tower we are creating elements that test the limits of Web 2.0 theory - the applications exist in the context of theory. Outside, people are looking for a different context. It's just a little different now because the previously opaque ivory walls are now transparent. Everyone can see what we're doing - they see works in progress - they see elements out of context. Perhaps these walls of confusion are the transparent walls of our Web 2.0 tower.

Photo: Kenn Kiser

20 February 2006

Here is an epiphany: you can help someone be happy, but you can't make them happy.

Epiphanies are breakthroughs. Little peak experiences where you wake up one day and notice a profound truth that significantly alters your outlook on life. This truth was likely building over time, manifesting itself in a variety of situations until it matured into an epiphany. Epiphanies are fun, they are dramatic, and when they happen you feel smarter and better about yourself.

John Hagel posted today about business's focus on "Breakthrough Management Innovation" as opposed to "Continuous Management Innovation". He notes that, to be successful, these two types of innovation cannot exist without each other. He also notes a blind spot in American business:

....in practice, I find that American management is much too focused on breakthrough innovation and seriously neglects the much broader opportunity for continuous innovation.

I would argue that true breakthrough innovation necessarily comes from an ethic of continuous innovation. Continuous innovation as a corporate culture requires the organization to be open to and accepting of change. The true breakthrough innovation is a corporate epiphany that comes - more often than not - from the coalescing of good management innovations over time. It's giving things time to mature to epiphany - to breakthrough.

Trying to force an epiphany is like preaching to the unconverted. It is more likely to get, at best, polite nods and, at worst, a doctrine-based argument. John gets at this when he says:

[There] are great ways to get executives to think creatively about new approaches to management processes but, on their own, they are unlikely to lead to much more than some creative workshops.

This comes back to an on-going theme in my blog posts: context. Management workshops to build disruptive change in an organization with no corporate culture to support change are workshops out of context. Epiphanies are the result of the maturation of context.

Context requires communication and understanding. Show me any organization with a culture hostile to continuous innovation and I will show you an organization that lacks communication and understanding. I would also bet that I can show you an organization entirely populated by people with ideas of what could be improved.

Everyone wants to be part of the conversation. It's self-organizing, directed and results-oriented. As that conversation evolves, both the participants and the organizations will certainly do two things: (1) make mistakes and (2) have epiphanies. The fear of the former has historically stifled the latter. Management stifled these conversations and then found they were wanting. They needed to innovate but there was no culture to support it. Thus, they focus on the quick fix: "Someone come in and help us massively innovate!"

This, I believe, comes back to business's historic drive toward predictability. They feel they can fix costs if they can make everything predictable. It never works out that way; something always goes wrong in the cost-matrix. This ... is an epiphany. The old adage "If it ain't broke, don't fix it," has been supplanted with "If it ain't broke, refactor it!"

19 February 2006

Life is about change. Life is about constant evolutionary and de-evolutionary forces. Life is about self-actualization. Life is about sharing with others. Life is about growth and entropy.

Recently, I've noticed that much of the criticism people level at just about everything is based on an assumption that things (especially bad things) are permanent. Tolerance for human error, the discovery process, and growth seems severely limited.

In the past few weeks, critiques of both web 2.0 and agile programming have highlighted this for me. In the web 2.0 critique, which I blogged about here, the author clearly envisioned a Web 2.0 army imposing its will on the world. In the agile debate, the critical party focused upon elements of one implementation of a philosophy and incorrectly applied them to something else, making the principles look incorrect when it was actually his application that was.

Both these critiques / rants arose because the authors didn't take the time to think about how these things might work well. Instead, both authors chose to envision and expound upon the negative interpretations. This is because "bad" things seem more important to us than "good" things. "Bad" things are seen as emergencies that must be immediately and forcefully dealt with.

The problem is that when you go looking for bad things, you'll find them in everything you encounter. Food is never quite perfect, lighting is never exactly optimal, programmers are never quite fast enough, and so on.

Life is unrewarding in this mode.

Despite my growing body of work that is not in lock-step with the Web 2.0 central core, I love Web 2.0 because it is an obvious step in a wide array of processes. Previously the web was interesting, but not personal. Web 2.0's philosophies and technologies mark an evolutionary milestone in our relationships with information and each other. Is this evil, like the CBS wonk says? Could be, but probably not.

Agile methodologies, at heart, are also very optimistic. They assume, at their core, that people of good conscience can work together rationally to create the most useful software at the lowest cost. Agile, like Web 2.0, is conversational, personal and evolutionary. The actual activities of agile methodologies are best practices in a rapidly maturing field.

At the core of both of these is an assumption that things can be improved, that collaboration is vital in human endeavors, and that good and consistent information is the bedrock for progress. It seems to me that this is a good fundamental set of beliefs that transcend software and information exchange. If more people followed them throughout their lives, we'd have fewer altercations and greater growth.

17 February 2006

Ed Vielmetti commented in this post the other day on my desire to treat individual entries of RSS or OPML -- basically treat individual pieces of information created in blogs or other services -- as objects. He says:

Of course, more often than now, you don't want to see what you're interested in, you want to see what someone else is interested in - as long as we're getting all meta, it might as well be meta from a different vantage point.

What Ed points out here is the joy of exploration, which overpersonalization all but removes. As we create (the illusion of) greater control over our environment, we will be surprised by less and less. We will have fewer chance encounters - fewer happy accidents.

Google asks us if we're feeling lucky. Or, from a web context, if we'd like to get lucky.

There is the very real danger that the greater the personalization we have over our information retrieval, the less likely we are to find things that challenge us, make us willing to experiment, or make us grow. It's the difference between eHarmony and meeting people in real life. Few people we meet in real life have their self-imposed character sketches available for quick reading. Yes, this means that we might go home with some people we don't marry - but it also means that we personally have to do some work. We have to learn to ask the questions that get people to open up. We have to develop queries of personal interaction.

Overpersonalization of news sources could lead to an escalation of what I perceive as an epidemic of entrenchment. With the personalization of your news sources, you stand a very real chance of buying into a variety of flavors of groupthink, locking yourself into ever-spiraling channels of self-perpetuating opinion and conversation streams that, in the end, stifle growth, understanding and the desire to reach out to others.

The good news is that we still have some degree of free will in this life. We can sign up for eHarmony and search that way - we can do the traditional searches of meeting people through friends, going to parties, and seeing the person on the bus. Being able to personalize need not destroy our ability to socialize - but we must remember to keep that balance.

16 February 2006

Andrew Keen, an apparent dot.bomb era victim, had dinner in Palo Alto with a (now likely ex)-friend who apparently expounded Web 2.0 rhetoric which pushed more than a dozen of Keen's buttons. Keen, from this single conversation, assumed he now understood Web 2.0 and wrote an article for big media.

He is very mistaken in his understanding of Web 2.0, the assumptions of those working with it, and the degree to which people learned from the dot com bubble.

What is most amusing is that he argues for silencing conversation. He assumes wrongly that Web 2.0 is about supplanting "big media" when actually it is an energizing agent for it. Right now, I, McManus, and others have blogged about his errant understanding. Web 2.0 doesn't want to kill CBS; people merely want the ability to discuss what is said there.

Keen quotes Marx against Web 2.0, but that is a double-edged sword. He is, I believe, saying that being idealistic is bad. Which it may be. But having goals and vision is not.

In his desire to silence conversation in this argument, he actually shows why Web 2.0 is vital. His single point source of (mis-)information has enraged him. He has written an ill-informed article that will further spread this misinformation. By arguing against Web 2.0 and against the conversation, he is arguing for limiting the point sources for information.

15 February 2006

In the attention economy, we don't want to be bothered with anything that isn't worth our attention. We want filters to only bring us the sweet stuff. The information that will make us faster, stronger, better.

Initially we had to go to individual web sites and read stuff. Then we got blogs that directed us to some information and that was nice. Then we got RSS which came to an aggregator and allowed us to find information even faster. Now we want the RSS culled and reified into pure info-gold.

As Richard McManus points out ... this is the next logical step. Richard's post also quotes Gabe Rivera and Nik Cubrilovic who note that coming up with this solution "is hard."

Perhaps we're finding that the Web 2.0 revolution to date has been laying infrastructure for future innovation. We have made sites that let us build folksonomies, share information, aggregate information, and find information. Now we need to actually do something with that information. Now we need to take this to the next step.

We need to cull the information, to manipulate the information, and to take possession of our own information. The site that launched this conversation, Megite, imports your OPML and builds a Memeorandumic page for you. Scoble notes that it's a good step and frees it from other people's biases (replacing them handily with your own!).

But Megite isn't quite there - it's still value neutral and still merely parrots what your network is thinking rather than tracking what you are interested in. It also operates entirely from the seed of the OPML file you provide. In order to give it any type of value inertia, you'd have to customize your OPML file, I suppose.

It won't be until we can treat the individual entries of RSS / OPML as objects and tag the individual objects that we will be able to give ourselves truly personalized clustering. Until we can do that, the information we are truly interested in will remain a mystery to technology, or at least a difficult puzzle for it to solve.
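To make the idea concrete, here is a minimal sketch of what "entries as objects" might look like; the class, field and tag names are my own illustration, not any existing service's API:

    from dataclasses import dataclass, field

    @dataclass
    class FeedEntry:
        # One item from an RSS feed, treated as a first-class object
        # rather than a transient line in an aggregator.
        feed_url: str
        entry_id: str
        title: str
        link: str
        tags: set = field(default_factory=set)

        def tag(self, *labels):
            # Tags attach to this single entry, not to the whole feed.
            self.tags.update(labels)

    # Hypothetical usage: tag one post, then cluster by tag later.
    entry = FeedEntry("http://example.com/rss", "post-123",
                      "Walls of Confusion", "http://example.com/post-123")
    entry.tag("web2.0", "context")
    interesting = [e for e in [entry] if "context" in e.tags]

Once entries carry their own tags, clustering can key on what I have marked as interesting rather than on whatever my OPML file happens to subscribe to.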

It never ceases to amaze me that I will have a conversation about something esoteric with someone and it will appear the next day in someone else's blog who was totally unaware of the conversation.

We are currently planning an enterprise scale piece of software and have been going back and forth about what our UI strategy is going to be. There are many choices for us, as this application involves web services, geographic information systems, asset management, real time data visualization, hardware control interfaces, diagnostics and still more.

It would be nice to make this application run on a central server and provide a browser based UI or a set of browser based UIs. This means that nothing needs to be installed on local machines, that multiple users across an enterprise can have access, and that people could easily work remotely.

On the other hand, browsers place serious constraints on how people can interact with their data and with the UI itself. One of the main points I made yesterday was that browser UIs are lousy for giving us context sensitive right click menus. Then, I wake up this morning and Robert Anderson shoots off this post.

Now, in the interest of full disclosure, Robert is my cousin - so it's possible that there's some direct link between us on the astral plane. (Though this means I'd be even more closely related to some of our more ... interesting ... relatives.)

Given the Mac's higher evolution in usability, I'm not sure I buy the single-button mouse concept, but implied in his criticism of Web 2.0 is one that I, myself, have weighed in on: Web 2.0's focus on "web" also focuses it on "browser", which is currently a highly limiting platform under which to create applications.

An internet-aware Java desktop application can be much more powerful, and lend itself much better to data integration, than all these stand-alone Web 2.0 apps we see today.

We keep trying to force the browser out of its limitations with AJAX, .NET, and so forth. But it's like adding levels to a house sinking in quicksand. We want it to be more than it ever was meant to be. Sort of like Windows and DOS.

Having said this, we still want to keep a lot of what browsers represent:

A highly standardized, open platform for development

A familiar look and feel

A customizable platform for users

An easily understandable environment for non-techs

Freedom from installation contingencies and support for widely varying system configurations

What we want to get away from is:

A system built thin for computers with 16 megs of RAM and running at 166 MHz

A system built for access to the internet through a 1200 baud modem

A system with a highly limited user interaction paradigm

Ultimately, for me, web 2.0 is about creating portability in data, ownership of your own information, and freedom to associate. None of these elements are reliant on the browser. Perhaps it will rely on the astral plane.

14 February 2006

In the few years I have known David Anderson, I've always been relieved by his self-effacing finger waggling. He is so much better at it than I. In this post he reminds us, again, that agile thinking isn't merely relegated to the office. And that agile requires you to put your own ideas in motion.

He also reminds us that, as with Agile, life is usually more deadline driven than it is budget driven. Successful people in life figure out how much they can get done in a period of time and they strive for that. No more, no less. You miss the deadlines and you miss the opportunities. The trick is ... we get to choose our opportunities.

Good Agile managers also understand that there is overhead, competing interests and other factors that impact the number of "ideal working hours" a person has. Ideal working hours measure the time it takes to do the actual work of a particular task, free of everything else. An 8-hour task is rarely completable in one standard work day.
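As a back-of-the-envelope illustration (the 60% load factor below is an assumption for the example, not a figure from David's post):

    # Rough sketch: convert ideal working hours into calendar days,
    # assuming overhead eats a fixed fraction of each work day.
    def calendar_days(ideal_hours, hours_per_day=8, load_factor=0.6):
        # load_factor: fraction of the day actually available for task work
        effective_hours_per_day = hours_per_day * load_factor
        return ideal_hours / effective_hours_per_day

    # An "8 hour" task at 60% availability takes closer to a day and two-thirds.
    print(calendar_days(8))  # ~1.67 days

The exact numbers matter less than the habit of planning against the effective hours rather than the nominal ones.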

Life is also important, even beyond Valentine's Day. Figure the other elements of life into your schedules and be tolerant when they come up. Other elements of life often need to tolerate work's sometimes demanding schedules.

In the end, agility always requires balance.

And when I meet up with my wife and she grills me for hours about every detail of what I did that day and provides me with every detail of hers ... that's our standup meeting.