Jimmy Wales, founder of Wikipedia, and Dale Hoiberg, editor-in-chief of Encyclopedia Britannica, engage in an entertaining smackdown over at the Wall Street Journal. I give the match to Wales. Let’s go to the videotape:

Mr. Hoiberg: No, we don’t publish rough drafts. We want our articles to be correct before they are published. We stand behind our process, based on trained editors and fact-checkers, more than 4,000 experts, and sound writing. Our model works well. Wikipedia is very different, but nothing in their model suggests we should change what we do.

Mr. Wales: Fitting words for an epitaph…

But it’s a shame we’re not past this us-v-them narrative in the worlds of Wikipedia, encyclopedias, and shared knowledge. Newspapers are finally starting — just starting — to figure out how to work together. How should the publishers and the people in this world work together?

Some time ago, I suggested that if I were a publisher, I’d piggyback onto Wikipedia and put effort into vetting articles there, in what Fred Wilson called the Red Hat version of Wikipedia.

If I were a reference publisher, a library association, a university, a media company, or a foundation, I’d take Wikipedia as raw material and vet entries, perhaps even charging for the service: On demand or on the basis of traffic and links, I’d go in and vet already-written pieces and bless that version of it. Then maybe I’d publish a book from it. Subsequent changes would be unvetted until and unless I chose to or the audience asked me to review them.

So that’s what I would do starting from Wikipedia. Britannica could use the work of Wikipedia and its experts to create the world’s largest vetted encyclopedia. If only it opened itself up to the possibilities.

And starting from Britannica? Well, they could put up the encyclopedia as a wiki and invite people to correct it, add to it, and propose articles that aren’t there but ought to be. They could turn it into something the community cares about, instead of merely buys.

Britannica isn’t going to have the latest weird word (woot!) that I just saw on the internet. If I’m going to do brain surgery on my cat, I want a source that has been fact-checked.

Both have their strengths and weaknesses: Wikipedia articles often get worse over time, while Britannica articles are often yesterday’s news.

http://katrinacoverage.com/ KatrinaCoverageDotCom

On demand or on the basis of traffic and links, I’d go in and vet already-written pieces and bless that version of it.

Wikipedia already does that to a certain extent (tinyurl.com/g4e9l). Of course, those reviews may suffer from the same problems as the entries there.

One of the major problems with Wikipedia is not necessarily what’s there, but what isn’t there. It might not be there because no one has visited that page and added it, or it might have been there at one time but was deleted dozens or hundreds of edits before. Wikipedia would have to wait for a volunteer to come along and add it, but might end up waiting in vain. Encyclopedia Britannica hires people.

For example, the claim “the sky is red” is easily corrected. But a sky expert would need to come along and explain why it’s blue. And that sky expert probably isn’t a history expert and probably has no clue that different civilizations had different thoughts on the color of the sky. Unless, like Wikipedia, you can get people to do it for free, that means money: you have to go out and hire those experts.

A more tangible example of “what Wikipedia may not be telling you” is the linked blog, which, despite having far more information on the politics of the subject (1000+ posts, 400+ tags) than the Wikipedia page on the same topic (tinyurl.com/kbo6e), was removed from that page (tinyurl.com/gb23b) after having been there for six months.

It’s also ironic that Wikipedia looks on blogs the same way that Encyclopedia Britannica looks on Wikipedia. Many editors at Wikipedia are anti-blog, and their guidelines strongly frown on using blogs as sources (tinyurl.com/hojfs). And yet perhaps it’s all the links Wikipedia has gotten from blogs that lead to it usually turning up on the first page of Google searches for various terms.

http://katrinacoverage.com/ KatrinaCoverageDotCom

Here’s another example. Check out the entry for Media Matters for America: tinyurl.com/kkst6

Notice anything odd about that page?

To a more than fair extent, it reads like a press release MMFA might write about itself (“Employing methods such as content analysis, fact checking, monitoring, and comparison of quotes or presentations from media figures to primary documents such as Pentagon or Government Accountability Office reports, MMfA provides daily analysis and more comprehensive overviews to its readers.”)

And, there’s no mention of any connections they have to George Soros (tinyurl.com/ednlx).

And, aside from one blog link, I hardly see any negative or contrary information at all.

In fact, several months ago I added a paragraph detailing one of the cases where their “analysis” was faulty.

That negative paragraph was deleted and, while I put it back once or twice I soon tired of the game.

How exactly would the “vetters” mentioned above find this missing information, or would they simply be forced to deal with what’s there, incomplete as it may be?

http://www.thesportstv.com Andrew Beinbrink

Jeff,

Here is a response to an article you wrote a while back about networks, the long tail, Metcalfe, etc.

Wow, this network conversation is out of control. This law vs. that law. The bottom line: I agree there are too many people trying to develop the next best social network. The issue I try to address for our intended market is HOW TO GENERATE VALUE. How can we provide value? To me personally it means a unique and well-positioned blend of content, communications, social network strength, target network strength, credibility, verification, marketing power, interoperability, UI design/feel, brand, etc.

For social networks: focus on a target market and power it with compelling media and sophisticated technology. Run like hell, try to generate a strong base of users, and get acquired so you don’t die when the shakeout and consolidation happens within 3–5 years.

Just one entrepreneur’s thoughts at 3 AM on the whole Web 2.0 craze!

Mike

I imagine they could go on being pissy for as long as they wish. You see, the critical difference between wiki and Brit is that, as far as most profs and teachers are concerned, Brit is a legit reference source and wiki is not. Every course I’ve had where we were supposed to write papers and list our sources forbade the use of wiki as a reference.

http://sethf.com/ Seth Finkelstein

One problem with vetting Wikipedia articles is that by the time you add up the additional cost of dealing with potential copyright violations or plagiarism, and of rewriting the piece for acceptable grammar and style (one of Wikipedia’s weakest aspects), you might as well just hire an expert to do it from scratch.

Comparing something to Linux is often misleading, for at least two reasons:

1) Many of the top Linux-writers are paid, either directly or indirectly

2) Validating a program, while difficult in theory, is a fairly objective task, a true “neutral point of view”.

I think Seth made a good point about timeliness, though. Maybe the vetting could note when the information was vetted, so it doesn’t take responsibility for later edits? I know you can track edit history on Wikipedia; maybe they could also make “frozen” versions of pages available for a vetting project?

This idea mirrors what Gracenote did with CDDB, which was compiled for free by hundreds if not thousands of volunteers. Gracenote monetized all the donated efforts of a bunch of now-pissed people without returning any of that value to the creators. That broke the group effort (which resulted in the forever-free freedb) and made at least some of us, such as myself, vow never to donate time to any effort that could end in the same result.
I don’t think this would work, or even that it’s a very good idea.