You don't invent the future, you unleash it by leveraging the global community mind.

John Seely Brown, Xerox PARC

Our heroes change along with our technologies. And thanks in part to the Internet, the Lone Ranger has ridden off into the sunset. Even if he hadn't, the Lone Ranger could never have survived as a hero for the kids of the Internet age. He was, well, lone. One. Uno. Singular. Sure, he had a big white horse and a subservient, racial stereotype of a sidekick. But it was really the kemosabe himself that made all the decisions, saved all the days, and performed all the heroics. Just like the captains of industry, generals and other rugged individuals of his time, the Lone Ranger didn’t need anything except his horse and his six-shooter. Hi ho!

But the terrain on the digital frontier is different from the Wild West. There’s really no place for the loner, because the wagons are already circled, the towns are well under construction, and the fields are already planted. Online, the collective is already here.

In the age of the collective, we need new heroes: Power Rangers. The Power Rangers may be a lot of things, but they are decidedly not lone, uno, or singular. They are the epitome of collective action – at least, to as great an extent as is possible in the two-dimensional world of kids’ TV reruns. Alone, the Power Rangers are ordinary, even bland, kids. Sure, they can ‘morph’ into individual superheroes, who happen to look a lot like the Lone Ranger wearing a motorcycle helmet instead of a Stetson. Sure, they ride mechanical Zords rather than trusty steeds named Silver. But morphing has its limitations. When Power Rangers fight their enemies all by their lonesome, they inevitably get their pimply asses kicked.

On the other hand, when Power Rangers put their individual failings behind them and pool their talents, skills, personalities and technological assets together, they form the MegaZord, an unstoppable technological colossus. Collectively, they can whip the stuffing out of all the rubber-suited monsters (always loners!) that their nemeses Rita Repulsa and Lord Zedd can throw at them. Alone, they’re just kids in candy-coloured spandex. Together, they’re the Power Rangers – and they’re invincible.

As the collective defenders of the commons (a.k.a. the town of Angel Grove), the Power Rangers are the perfect superheroes for the commonspace era. They show how ...

Internet Super Ingredient #1

PEOPLE

combines with

Internet Super Ingredient #2

WORKING TOGETHER

to create

COMMONSPACE

(+ digital collective-ness)

With this gestalt comes a comprehensive shift in how everything around us works, looks and feels.

The Lone Ranger has slunk out of town, broken-down mythology trailing behind him in the dust. The Power Rangers have ushered in a world of collective superheroes. We are now living in the world of digital gestalts, where everything connects, merges and becomes MegaZordish. Excellent.

Your New Superpowers

Congratulations. You’re now an official commonspace Power Ranger. When you choose to work collectively online, you can accomplish some amazing things because you have the following superpowers:

·Speed. You can move faster than people who aren't online. The small, fluid, Internet-enabled project/idea/enterprise/meme moves faster than the majority of bloated corporate processes.

·Creativity. You can come up with astounding new ideas and solutions. There’s no need to think outside the box, because the box is gone.

·The MegaZord Effect. Like the Rangers melding into the MegaZord, you can use collective action and synergy to create products and services far more impressive than the resources and time that went into creating them. The collective gives you the leverage to move mountains and kick rubbery monster butt.

Fortunately, doing things quickly and effectively has another side-effect: it allows people to be people without being dehumanized, restricted or circumscribed by their work. The collective power of the Internet is not important simply because it flattens hierarchies. In many respects, it also helps to create a more humane work environment.

A Penguin Is Born

What does the MegaZord look like online? Well, despite the sorry lack of a Ranger who rides in a mechanical Penguin Zord, the online collective looks a lot like Linux.

Think back. About 10 years ago, an unknown Finnish university student started working on a home-made operating system that worked much like the Unix systems that ran huge corporate and academic networks. Once he'd created a reasonably stable version of his software, he started distributing it for free online. Moreover, he invited others to hack around with his source code and share their changes and fixes. The operating system grew. The number of users grew. The MegaZord grew. Together, Linus Torvalds and his users snatched a huge market share from Microsoft and Sun and other ‘monster’ software companies, and created something that really worked. Something called Linux.

Sure, it’s a story that we’ve all heard a million times by now. But it’s still astounding. Linux happened, and is still spreading like wildfire. The software is free, and so robust that many IT professionals consider it more powerful and more reliable than the commercially produced alternatives.

The crux of the matter is that Linux and other open source packages evolved through self-organizing, reasonably ‘flat’ collectives. For those who’ve never been involved in one, an open source project works in a substantially different manner from other types of authoring. Eric Raymond’s classic essay ‘The Cathedral and the Bazaar’ <www.tuxedo.org/~esr/writings/cathedral-bazaar/> provides, among other things, a kind of approximate flowchart for open source development gleaned from his experiences developing Fetchmail. If arranged in chronological order, the steps involved would look something like this:

·Begin with a project that’s useful to you, and enjoy your work.

·Plan your data structures well.

·Establish what to recycle from other projects.

·Be ready to start over at least once… but don’t throw anything away.

·Develop a base of users/beta-testers/co-developers to facilitate debugging.

·Release early, release often.

·Listen to your customers/users.

·If you get tired of your project, hand it over to someone who cares about it.

Some of these stages are merely the extension of good planning. But some of them highlight the essential differences between open source philosophy and other approaches to development and authoring.

First of all, nothing in open source is garbage. If you can’t find a use for a given piece of your project, there may well be someone else who can. Accordingly, everything should be archived somewhere. When storage is as cheap as it is now, there’s absolutely no reason not to save everything for posterity.

Secondly, customers and users are an important part of the creation process, because they will often catch bugs and other types of problems that you’ve missed. As soon as your product becomes stable and usable, send it out into the world, and be industrious about soliciting opinions from your client base on how to improve it. This allows you to take advantage of what Raymond calls ‘Linus’ Law’: Given enough eyeballs, all bugs are shallow. In an open source environment, more releases means more corrections, and the result is a better product.

The third major difference in the open source approach has to do with intellectual property. If you have no interest in maintaining something, give it to someone who does. In an era of domain-name speculation and other forms of unleashed capitalist greed, this may seem like an astonishing demand. But the benefits of this approach are demonstrated by the success of the open source model. And if the property you’ve donated to the common pool develops some sort of second wind, it’ll only look good on you.

Raymond also takes conventional project management strategy by the horns and wrestles it to the ground. In ‘On Management and the Maginot Line,’ one of several ‘version upgrades’ that have been made to his paper over the years (NB: even prose can benefit from the adoption of open source practices), he points out:

Traditionally-minded software-development managers often object that the casualness with which project groups form and change and dissolve in the open-source world negates a significant part of the apparent advantage of numbers that the open-source community has over any single closed-source developer. They would observe that in software development it is really sustained effort over time and the degree to which customers can expect continuing investment in the product that matters, not just how many people have thrown a bone in the pot and left it to simmer[1].

But Raymond points out that EMACS, the standard text editing software in Unix/Linux, ‘has absorbed the efforts of hundreds of contributors over fifteen years into a unified architectural vision, despite high turnover and the fact that only one person (its author) has been continuously active during all that time. No closed-source editor has ever matched this longevity record.[2]’ And the effect of this continuous focus is evident in Linux’s continued success and growth. Almost a decade old now, Linux has weathered technological changes such as the migration from 16-bit to 32-bit with a much greater degree of ease and stability than competing systems such as Windows.

Raymond then takes the standard claims for what software managers contribute, one at a time:

·Managers define goals and keep everybody pointed in the same direction.

Raymond concedes that this function might be necessary to some extent. Every project has to have some sort of long-term direction if it’s going to be of continued use. But he objects to the notion that middle managers can do this better than the ‘tribal elders’ of the open source world.

·Managers monitor projects and make sure that crucial details don’t get skipped.

According to a story on ZDNet <www.zdnet.com>, an internal Microsoft memo viewed by Sm@rt Reseller revealed that the first release of Windows 2000 was shipped with 65,000 bugs. Crucial details don’t get skipped? There’s no point in even debating the merits of the open-source model over monitored workflow in terms of product quality. Decentralized peer review clearly kicks the tar out of all the prevailing traditional methods for debugging.

·Managers motivate people to do boring but necessary drudgework.

What interests Raymond here are the underlying assumptions: that without someone dangling a big bag of money in front of them, programmers in an office environment will turn in substandard work, and in fact, that this form of management is useful only in circumstances when work is perceived as ‘boring’. But the moment that a competing open source solution for a ‘boring’ problem appears, customers are going to know that the problem was solved by a highly motivated person who tackled it because they enjoyed the process. Any guesses on which product the customer will see as superior?

·Managers organize the deployment of people for best productivity.

Raymond’s argument becomes unabashedly elitist here, claiming that open source productivity is a result of the community’s ‘ruthless self-selection’ for competence. Office workers, by implication, are slow and feebleminded. Whatever you think of this contention, consider ‘that it is often cheaper and more effective to recruit self-selected volunteers from the Internet than it is to manage buildings full of people who would rather be doing something else.[4]’

·Managers marshal resources needed to sustain the project.

In an office environment, where people, machines and spaces are limited and in constant demand, the need to pull the wagons into a circle and defend a project’s resources is a sad fact of life and the subject of many Dilbert cartoons. But in an open source world, where there are no offices, and everyone is a volunteer, there’s no ‘enemy within’ to fend off. Hence the only limited resource is skilled attention.

Raymond’s parting shot on the issue of traditional management vs. open source development is a rhetorical zinger: ‘if the open-source community has really underestimated the value of conventional management, why do so many of you display contempt for your own process?’[5]

The cynicism of office culture is omnipresent, from movies like Office Space to the tales of worker alienation in zines like Processed World. If Raymond is correct in his assumption that ‘It may well turn out that one of the most important effects of open source’s success will be to teach us that play is the most economically efficient mode of creative work,’[6] well, that wouldn’t be such a bad thing, now would it?

The Aggregation Nation

As the notion of collective work builds steam, the walls between organizations are beginning to topple. Lines are disappearing – inside and outside, customer and partner, supplier and buyer. Amid the rubble, astounding pools of collective information and ideas are emerging.

Just think for a minute about how companies have changed their tune about 'links.'

When the Web first appeared, most companies feared links. Linking offsite: Don't do it, it's like giving away customers. Linking in: We don't want it, unless you are linking to our majestic homepage. ‘Please, please, please Mr. Internet MegaZord, don't deep-link to the individual stories within our site,’ was the plea of the marketing wonks. Most companies were afraid that the connections the Internet provided would ruin their business. They believed that they'd be fine if they could just corral people inside their sites as they could in the mall.

The situation has changed drastically in the last five years. At least at the basic level of links, most companies have realized that their Web site doesn’t amount to squat if it’s not linked to the larger collective resources of the net.

The emergence of Rich Site Summary (RSS)-based headline sharing is clear evidence of this shift. Sites as diverse as HotWired, Slashdot and CNN have all started broadcasting their headlines and links for free to the rest of the Internet. As a result, anyone can easily and automatically pick up these headlines and incorporate them into their site’s contents. Not only has this aggressively encouraged the practice of deep linking; it has also led to more off-site linking, because it ultimately generates more traffic.

The copyright-obsessed corporate media world has finally twigged to the fact that online, links are life itself. Of course, what they have really discovered is something more revolutionary than ‘links’. They've discovered the power of content aggregation, a phrase that means exactly what you think it does: bringing content from a variety of online sources into one place (literally or metaphorically). With headline-sharing, information becomes a huge pool from which anyone can draw the pieces they want. This kind of aggregated data pool – and the opportunity to take what you want from it – is one of the most powerful forces in commonspace.
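The mechanics behind headline sharing are simple enough to sketch in a few lines. Here is a minimal, self-contained example – the feed content and URLs are hypothetical, and it uses only Python’s standard library to pull the item titles and links out of an RSS 2.0 document of the sort Slashdot or HotWired broadcasts:

```python
import xml.etree.ElementTree as ET

# A hypothetical RSS feed; real sites would serve something like this over HTTP.
SAMPLE_FEED = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Example News</title>
    <item>
      <title>Linux snatches more market share</title>
      <link>http://example.com/linux-share</link>
    </item>
    <item>
      <title>Portals reconsidered</title>
      <link>http://example.com/portals</link>
    </item>
  </channel>
</rss>"""

def headlines(rss_text):
    """Extract (title, link) pairs from an RSS 2.0 document."""
    root = ET.fromstring(rss_text)
    return [(item.findtext("title"), item.findtext("link"))
            for item in root.iter("item")]

# Any site can fetch a feed like this and drop the headlines into its own pages.
for title, link in headlines(SAMPLE_FEED):
    print(title, "->", link)
```

That is the whole trick: because the format is an open, trivially parsed standard, every consumer of the feed becomes a deep-linking redistributor of the producer’s content.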

When most people say 'content aggregation' they mean 'portal.' For several years now, becoming a popular content aggregator (or ‘portal’) has been the Holy Grail of Internet business. Building an aggregation site like Yahoo! or MySimon has meant lots of traffic, lots of eyeballs, and the potential for lots of advertising dollars. It was (and remains) a quick way to create a Big New Media Brand.

A portal is also a centralized pool of information. Anyone can come to draw on it, but only within the confines of the rules and categories imagined by the people who created the site. Anyone can provide content to the pool, but they have to accept that users will only find their content through the rigid filter provided by the portal. Of course, portals are useful in the same manner that the Yellow Pages are useful. But despite their helpfulness, they’re also boring, over-regulated, and frequently out-of-date.

[diagram here: SlashDot, Linux.com and Wired each supply content to Yahoo! (or any other portal), which passes referrals and links to content on to Users]

The standard portal is not that different in structure from the hierarchical one-to-many media models that we are hoping to leave behind us. Despite the shiny technological trappings, it’s still one site, one brand, one group of editors and webmasters pushing content out to the masses.

The New Zodiac

Isn’t there a way to draw on the power of content aggregation that is more creative and interesting than the Yellow Pages? Is there a way to take better advantage of the collective power of the Internet than a one-to-many portal?

Of course there is. In fact, there are many.

The most recent approaches to aggregation are more collective in structure than their predecessors. They recognize that people online want constant access to filters, context and opinion. Such approaches blur the lines between producer and audience and between organizations. They provide X-ray vision that lets you see through the cubicle walls and super-strength to break down the gates around carefully hoarded data and open up an omni-directional flood of information. Most importantly, they recognize that the Internet works best when we all contribute to and draw from collective pools of aggregated information.

These emerging models look less like the Yellow Pages and more like the chaotic headline-sharing that content providers are starting to embrace. The closest analogy would be a networked, many-to-many version of the old media concept of ‘syndication’. By sharing headlines freely across the Internet, information pools feed the diverse needs of the network and its users.

The result of these linked information pools resembles a constellation more than a portal. Vital pieces of information flow freely within the network. Just as in a constellation, information appears bigger/more important or smaller/less important depending on the location and orientation of the viewer. For example, on Slashdot, the ‘Slashback’ section (a summary of commentary on important current stories) appears in large type with a long summary. But stories from other sites such as Salon or Blue’s News appear as small headlines in the margin column of the page. The opposite would be the case if you were on Salon or Blue's News.

[diagram here: SlashDot, Linux.com and Wired exchange content + referrals directly with one another and with Users]

These are considerable advantages. The traditional portal is still too much like a firewall between the user and the content provider. It benefits the portal owner in the short term, as the site’s large size generates a lot of ad dollars. But the trade-off is that the content provider is isolated and disconnected, and the user has to dig for related content across disconnected sites. In contrast, the collectivized constellation model – the MegaPortal – is more useful to both content providers and users. Content sites receive traffic and additional material from related sites. Users discover links to material they wouldn't otherwise have found. The constellation model gives every site the potential to be the know-it-all – the expert in their field. Sites become more vibrant and users find content more easily and intuitively.

This is not to say that there is no need for aggregator sites. They still provide the road maps and the ability to pinpoint specific information. But over time, aggregator sites will simply become players within larger constellations. Often, they will aim to fill a particular niche, acting as the main aggregator for a constellation focused on a particular topic.

One World <www.oneworld.net>, a human rights network, is already playing this kind of role. At first glance, One World appears to be a slick online magazine and portal with a strong focus on human rights. The front page is a tight mix of well-written headlines, photos and links that push through to an eco-friendly, people-friendly shopping mall. While it isn’t the New York Times, it’s a pretty impressive site by non-profit Web standards.

What’s brilliant about One World is that it’s really a collectively written digital media product and knowledge base. All of the content comes from One World Partners – non-profits who pay a membership fee to be listed in the search engine. Partners push content to the One World editors, who pick up the best content for their front pages and sector-based theme sites. Essentially, One World is a traffic co-op for its members and a great magazine for its users. Professional-quality media, created by the collective. Amazing.

Internet syndication and content co-ops like One World and Slashdot are creating a collective media universe that is more flexible, configurable, and responsive than analog media models, and even first-generation digital models. Small players can instantly become a part of something much larger – and bootstrap themselves in the process – by contributing to the syndication pool and picking up headlines from others. Constellation members can provide their own editorial take on information. Users can quickly and easily find what they need from anywhere in the constellation. In content syndication, everyone contributes a little, and the result is a media universe that is much, much bigger than the sum of its parts.

And so publishing, the bastion of the copyright, becomes an act of sharing rather than hoarding. Small players leverage off each other, and everyone gains in the process. Hucksterism wanes as publicity becomes dependent on high-quality writing and content-sharing.

Of course, such growth is impossible for businesses that retain outmoded, silo-style organizational thinking, where information only gains value from being stockpiled. Constellations require mutual access to yield mutual success. Businesses have to be willing to see the benefits of their information appearing on websites run by their partners and even their competitors. Getting over this conceptual hump is going to be a huge challenge for many. In fact, some people just don’t get it no matter how many times you show them or tell them. They see their Web site as their outpost in a vast and chaotic universe of information. They don’t see the constellations, only individual stars, cold and lonely.

But once they do .... bang! It’s time for fireworks.

Frequently Answered Questions

The fireworks of collective media aren’t being created solely by professional media-makers and people consciously seeking to ‘get their message out’. In fact, amazing new kinds of collective media are being produced by all of us every time we open our metaphorical mouths online. Our words, our ideas, our complaints are transforming into self-generating knowledge bases and information resources that we could never have dreamed of ten years ago. Unconsciously or consciously, we are all becoming a part of the collective mind.

Collective minds begin to form when online communities, discussion forums, listservs and other ongoing digital conversations turn into searchable archives. As in the real world, online discussions often contain much wisdom. But unlike real-world discussions, online talk is recorded more often than not. Once recorded, discussions can be searched, analyzed and referred to by others. It’s as if every community meeting, kaffeeklatsch and bull session down at the pub or around the cooler had a set of written minutes.

In the majority of cases, the response to all this recorded chatter might justifiably be ‘So what?’ The minutiae of daily life online are usually as tedious and banal as they are in the real world. But because the archives exist, and because we already have sophisticated search tools that alleviate the gruntwork of having to sift through mountains of hay for the digital needle, online communities and other kinds of discussions have the potential to be knowledge bases and answer pools of unprecedented power.

The first real example of a conversational archiving form was the USENET FAQ. For those who’ve never stumbled across it, USENET was (and is) a vast Internet-wide discussion system that contains thousands of different ‘newsgroups’ and has been in constant use since 1979. Each newsgroup focuses on a particular topic – Java programming, stamp collecting, bondage, you name it. Before it became a high-noise haven for pranksters, trolls and spammers (around 1995), USENET was a place that people could turn to for useful answers from their peers.

In order to capture the essential knowledge generated by a given newsgroup and to avoid having to answer the same questions repeatedly (not surprisingly, telling potential new users to RTFM – Read The Fucking Manual – tended to alienate them), people in these mini-communities started developing documents containing the answers to ‘Frequently Asked Questions’. These collectively generated documents became known as FAQs. Many FAQs are available at AskJeeves <www.ask.com> in their Internet FAQ Archives, if you’re curious.

It’s something of an exaggeration to call a USENET FAQ a knowledge base, because FAQs only skim the tip of the USENET knowledge iceberg. But they do serve a valuable purpose as the distilled core knowledge of any given group. Think of them as the Cliff’s Notes of community knowledge.

Once the practice of mining USENET for information had developed, it didn’t take long for people to start digging deeper – and more broadly. Enter Deja News <www.deja.com>. Deja’s great revelation was that USENET could become a hugely more powerful information resource if someone just pointed a search engine at it. So they did. They didn’t really know what else they would do with their site, or how it would make money; they just hoped that people would search. And they did. In droves. They searched for solutions to their technical problems, for feedback about what laptop to buy, for hints on where to find stolen software or what to feed their iguanas.

Over time, the patterns and the opportunities became clear. People were searching for the opinions of others. Building on this knowledge, Deja News became Deja.com – a collective opinion pool. Over time, the site began to organize information from newsgroups most likely to include opinions on consumer items, such as laptops and camcorders. It also started adding proprietary content-gathering tools to supplement the opinions generated through Usenet. For example, product-rating polls were added to collect – and reflect – the opinions of searchers. Every time you searched for ‘laptop,’ you’d be discreetly asked for your opinion on various laptop models. The aggregated opinions of other Deja users began to supplement the information available from USENET. And so the ‘collective opinion mind’ was born.

The sad part of the story is that the collective mind is still kind of an idiot savant. The commercial side of Deja has gradually swallowed almost all of the site’s original purpose, Jekyll-and-Hyde fashion. The USENET search engine is still there, but you have to do a bit of tunneling to find it. In any event, the Deja archive only extends back to 1995, so it’s hardly a comprehensive research tool. But we can always hope that some day the archive will be extended to cover earlier years as well, by Deja or someone else.

In some respects, opinion sites take on the same role as Consumer Reports or Edmunds used car guides. They provide feedback on the quality and usefulness of products we might consider buying. While the collective Internet mind lacks the objectivity and rigor of such publications, it offers a kind of information that could never come from a print publication or even from top-down Web publishing: current, dynamic and unmediated.

Together We Compute

No matter how brawny individual computers become, they will never be as powerful individually as collectively.

Howard Rheingold

For the most part, the power of the collective comes from some sort of active participation, such as words, ideas or code. But slowly, collective work is beginning to extend from ideas to resources, such as spare CPU cycles.

‘Community computing’ used to refer to the setting up of a FreeNet or some kind of socially-minded online community. But increasingly, it’s come to mean sharing your spare computer resources with a large project that needs extensive computing power. Small computing tasks are divided up and sent out to participating computers across the network by a central server. Once the ‘work’ on each little packet has been completed, it is sent back to the central server for integration into the main data pool.[7]
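In miniature, that packet round trip looks something like the following sketch. This is a toy stand-in (summing numbers, simulated on one machine), not any real project’s protocol, but the shape – split, farm out, integrate – is the whole idea:

```python
# A toy version of community computing: the "server" splits a big job
# into small packets, "volunteer" computers each process one packet,
# and the results are merged back into a single pool.

def split_into_packets(data, size):
    """Server side: carve the dataset into bite-sized work units."""
    return [data[i:i + size] for i in range(0, len(data), size)]

def process_packet(packet):
    """Volunteer side: do the actual computation on one packet.
    (Here, a stand-in task: summing numbers.)"""
    return sum(packet)

def integrate(results):
    """Server side: fold the returned results into the main data pool."""
    return sum(results)

# Simulate the whole round trip on a single machine.
data = list(range(1000))
packets = split_into_packets(data, 100)          # 10 work units
results = [process_packet(p) for p in packets]   # done "in parallel" by volunteers
total = integrate(results)
print(total)  # same answer as doing the whole job on one big computer
```

Because each packet is independent, it doesn’t matter which volunteer handles it, when the answer comes back, or in what order – which is exactly what makes a loose swarm of desktop machines usable as one big computer.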

SETI@home <setiathome.ssl.berkeley.edu> was the first to try this ‘voluntary collective processing’ on a large scale. In the SETI (Search for Extra-Terrestrial Intelligence) project, over two million users worldwide lend their computers to aid in the search for alien life. The project gathers a huge pool of galactic surveillance information with its telescopes. Then, using a little program disguised as a screen saver, it sends small amounts of this data to each of the users’ computers for processing. The computers, in turn, send back the processed data. This huge network of regular desktop computers gives SETI the equivalent of a powerful supercomputer that would otherwise be far beyond its budget.

SETI@home has proven so popular that it’s even inspired Internet hoaxes. A group of young programmers posing as a Russian computing company posted a notice that they’d developed a computer accelerator board designed exclusively to speed up the SETI@home software. They were deluged with so many inquiries from people interested in acquiring this nonexistent product that they were forced to reveal their spoof almost immediately and subsequently removed their website.

At one level, the emergence of this kind of networked parallel processing is a useful yet boring development. The most obvious applications for it will probably involve the sharing of processing across unused machines on large corporate networks. Who really cares if Shell can speed up the analysis of geological data by farming the task out to idle computers in the secretarial pool? It’s certainly a nifty technical feat, and an efficient use of resources. But it’s not commonspace.

The really interesting applications are in projects like SETI@home, where huge numbers of people can pool their spare resources to support a project that inspires them. The participants in SETI@home aren’t really giving up anything they’ll miss, but they are giving something nonetheless. In doing so, they are creating a collective supercomputer, a computer that is literally bigger than the sum of its parts. Who knows how far these kinds of projects will go? What if SETI really discovers some extraterrestrial life? We are on the verge of a huge shift in how we share resources and participate in large-scale projects.

We Are Each Other

As a species, we have always needed each other in order to grow and thrive as individuals. For a long time, we’d forgotten that. But it’s possible once again to work together in short-term, non-hierarchical groups in order to accomplish highly specific goals with very little individual work.

People always have varying degrees of interest in any project. The beauty of collective work is that it doesn’t require everyone to contribute all of their resources, so long as some people contribute some of their resources. And everyone benefits from these casual moments of philanthropy. Collective synergies yield results that can never be produced deliberately and mechanistically, no matter how large the project budget.

This is not wide-eyed, New Age optimism. It is clear and present reality. Hierarchies and other structures that traditionally drive collective work are dying out. Now, success demands flat systems and commonspace:

To compete ->

To lead ->

To teach ->

To change the world ->

.... seize the power of the collective.

In practice, this means 'think open source' and 'contribute to the collective data pool.' You need to put your ideas into the collective to benefit and thrive.

And here’s the dirty little secret about joining the collective: your work doesn't just get more powerful… it gets easier.

[1] Raymond, Eric. The Cathedral and the Bazaar. <www.tuxedo.org/~esr/writings/cathedral-bazaar/>