Incredible that the Mac is still around; the 90s were a dire time for Apple and it’s amazing to see the current fantastic iMacs and MacBooks that came after some epically bad mid-90s machines. Here’s Steve Jobs introducing the original Mac in 1984 (a snippet of the full introduction video):

First, I met the machine. From the instant the woman running the demo switched on that strange-looking contraption (inspired in part by the Cuisinart food processor), I knew the Macintosh would change millions of lives, including my own. To understand that, you must realize how much 1984 really was not like 2014. Until that point, personal computers were locked in an esoteric realm of codes and commands. They looked unfriendly, with the letters of text glowing in sickly phosphorescence. Even the simplest tasks required memorizing the proper intonations, then executing several exacting steps.

But the Macintosh was friendly. It opened with a smile. Words appeared with the clarity of text on a printed page - and for the first time, ordinary people had the power to format text as professional printers did. Selecting and moving text was made dramatically easier by the then-quaint mouse accompanying the keyboard. You could draw on it. This humble shoebox-sized machine had a simplicity that instantly empowered you.

If you have had any prior experience with personal computers, what you might expect to see is some sort of opaque code, called a “prompt,” consisting of phosphorescent green or white letters on a murky background. What you see with Macintosh is the Finder. On a pleasant, light background (you can later change the background to any of a number of patterns, if you like), little pictures called “icons” appear, representing choices available to you. A word-processing program might be represented by a pen, while the program that lets you draw pictures might have a paintbrush icon. A file would represent stored documents - book reports, letters, legal briefs and so forth. To see a particular file, you’d move the mouse, which would, in turn, move the cursor to the file you wanted. You’d tap a button on the mouse twice, and the contents of the file would appear on the screen: dark on light, just like a piece of paper.

The rollout on January 24th was like a college graduation ceremony. There were the frat boys, the insiders, the football players, and developers played a role too. We praised their product, their achievement, and they showed off our work. Apple took a serious stake in the success of software on their platform. They also had strong opinions about how our software should work, which in hindsight were almost all good ideas. The idea of user interface standards was controversial at the time. Today, you’ll get no argument from me: it’s better to have one way to do things than two or more, no matter how much better the new ones are.

That day, I was on a panel of developers, talking to the press about the new machine. We were all gushing, all excited to be there. I still get goosebumps thinking about it today.

What’s clear when you talk to Apple’s executives is that the company believes people don’t have to choose between a laptop, a tablet, and a smartphone. Instead, Apple believes that every one of its products has particular strengths for particular tasks, and that people should be able to switch among them with ease. This is why the Mac is still relevant, 30 years on: sometimes a device with a keyboard and a trackpad is the best tool for the job.

“It’s not an either/or,” Schiller said. “It’s a world where you’re going to have a phone, a tablet, a computer, you don’t have to choose. And so what’s more important is how you seamlessly move between them all…. It’s not like this is a laptop person and that’s a tablet person. It doesn’t have to be that way.”

The Macintosh is the future of Apple Computer. And it’s being done by a bunch of people who are incredibly talented but who in most organizations would be working three levels below the impact of the decisions they’re making in the organization. It’s one of those things that you know won’t last forever. The group might stay together maybe for one more iteration of the product, and then they’ll go their separate ways. For a very special moment, all of us have come together to make this new product. We feel this may be the best thing we’ll ever do with our lives.

I like to claim that I bought the second Macintosh computer ever sold in Europe in that January, 30 years ago. My friend and hero Douglas Adams was in the queue ahead of me. For all I know someone somewhere had bought one ten minutes earlier, but these were the first two that the only shop selling them in London had in stock on the 24th January 1984, so I’m sticking to my story.

The Next Web has an interview with Daniel Kottke (no relation) and Randy Wigginton on programming the original Mac.

TNW: When you look at today’s Macs, as well as the iPhone and the iPad, do you see how it traces back to that original genesis?

Randy: It was more of a philosophy - let’s bring the theoretical into now - and the focus was on the user, not on the programmer. Before then it had always been let’s make it so programmers can do stuff and produce programs.

Here, it was all about the user, and the programmers had to work their asses off to make it easy for the user to do what they wanted. It was the principle of least surprise. We never wanted [the Macintosh] to do something that people were shocked at. These are things that we just take for granted now. The whole undo paradigm? It didn’t exist before that.

Like Daniel says, it’s definitely the case that there were academic and business places with similar technology, but they had never attempted to reach a mass market.

Daniel: I’m just struck by the parallel now, thinking about what the Mac did. The paradigm before the Mac in terms of Apple products was command-line commands in the Apple II and the Apple III. In the open source world of Linux, I’m messing around with Raspberry Pis now, and it terrifies me, because I think, “This is not ready for the consumer,” but then I think about Android, which is built on top of Linux. So the Macintosh did for the Apple II paradigm what Android has done for Linux.

Make the list entirely random consisting of selections from the entire Twitter userbase. After signing up, each new user sees 100 recommended accounts randomly chosen out of a HUGE pool of non-spam accounts (where HUGE = hundreds of thousands) that have been active for more than 3 months, tweet more than 5 times a week & fewer than 10 times a day, and have 2 times as many followers as followees (or something like that). Twitter has to be doing similar calculations to find spam accounts…just reverse it and whitelist accounts for the recommended list. That way, Twitter gets what they want (new users following people) and the super-user & conflict of interest problems are eliminated.
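That whitelist-and-sample idea can be sketched in a few lines of Python. Everything here is hypothetical: the field names, the thresholds (taken straight from the guesses above), and the `is_spam` flag, since Twitter’s actual data model and spam signals aren’t public:

```python
import random
from dataclasses import dataclass

@dataclass
class Account:
    handle: str
    months_active: int       # how long the account has existed
    tweets_per_week: float   # average tweeting rate
    followers: int
    followees: int
    is_spam: bool            # assume a spam classifier already provides this

def eligible(a: Account) -> bool:
    """Whitelist test using the (guessed) criteria from the post."""
    return (
        not a.is_spam
        and a.months_active > 3
        and a.tweets_per_week > 5
        and a.tweets_per_week / 7 < 10       # fewer than 10 tweets a day
        and a.followers >= 2 * a.followees   # 2x as many followers as followees
    )

def recommend(accounts: list[Account], n: int = 100) -> list[Account]:
    """Randomly sample n accounts from the whitelisted pool for a new user."""
    pool = [a for a in accounts if eligible(a)]
    return random.sample(pool, min(n, len(pool)))
```

The key design point is that the pool is huge and the sample is uniform, so no individual account accumulates outsized follower counts from the recommendation list itself.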

Advertising will get more and more targeted until it disappears, because perfectly targeted advertising is just information. There’s little point in saying something until the time is right, then you just have to say it once, and the idea takes over and does all the work.

That sounds overly optimistic to me but there’s definitely something of substance there.

In a Google search of five keywords or phrases representing the top five news stories of 2007, weblogs will rank higher than the New York Times’ Web site.

I decided to see how well each side is doing by checking the results for the top news stories of 2005. Eight news stories were selected and an appropriate Google keyword search was chosen for each one of them. I went through the search results for each keyword and noted the positions of the top results from 1) “traditional” media, 2) citizen media, 3) blogs, and 4) nytimes.com. Finally, the scores were tallied and an “actual” winner (blogs vs. nytimes.com) and an “in-spirit” winner (any traditional media source vs. any citizen media source) were calculated. (For more on the methodology, definitions, and caveats, read the methodology section below.)
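The tallying step described above can be sketched as a small function. The data shape is my own invention for illustration: each story maps category names to the rank of that category’s top result, with `None` meaning the category didn’t appear in the first 100 results; the lower rank wins the story.

```python
from typing import Optional

def tally(results: list[dict[str, Optional[int]]],
          a: str, b: str) -> tuple[int, int]:
    """Count how many stories each of two categories wins.

    For each story, the category whose top result ranked higher
    (a smaller number) wins; ties and double-absences score nothing.
    """
    wins_a = wins_b = 0
    for story in results:
        ra, rb = story.get(a), story.get(b)
        if ra is not None and (rb is None or ra < rb):
            wins_a += 1
        elif rb is not None and (ra is None or rb < ra):
            wins_b += 1
    return wins_a, wins_b
```

Calling `tally(results, "blogs", "nytimes")` would give the “actual” score and `tally(results, "media", "citizen")` the “in-spirit” score.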

So how did the NY Times fare against blogs? Not very well. For eight top news stories of 2005, blogs were listed in Google search results before the Times six times, the Times only twice. The in-spirit winner was traditional media by a 6-2 score over citizen media. Here are the specific results:

My feeling is that Mr. Nisenholtz will likely lose his bet come 2007. Even though nytimes.com fares very well in getting linked to by the blogosphere, it does very poorly in Google. This isn’t exactly surprising given that most NY Times articles disappear behind a paywall after a week and some of their content (TimesSelect) isn’t publicly accessible at all. Also, I didn’t look too closely at the HTML markup of the NY Times, but it could be that it’s not as well optimized for Google as that of some weblogs and other media outlets.

“www.nytimes.com” has a PageRank of 10/10, higher than that of “www.cnn.com” (9/10), yet stories from CNN consistently appeared higher in the search results than those from the Times. The Times clearly has overall authority according to Google, but when it comes to specific instances, it falls short. In some cases, a NY Times story didn’t even appear in the first 100 search results for these keyword searches.

By 2007, it may be difficult to differentiate a blog from a traditional media source. All of the Gawker and Weblogs, Inc. sites are presented in a blog format and are referred to as blogs but otherwise how are they distinguishable from traditional media? Engadget paid to send 12 people to cover the CES technology conference, probably as many or more than the Times sent. The Sundance film festival was heavily covered by paid writers for both companies as well. In the spirit in which this bet was made, I’d have a hard time counting any of their sites as blogs. (And what about kottke.org? I get paid to write it. Am I still a member of the citizen media or have I crossed over?)

Choosing appropriate news stories and keywords for those stories was difficult in some cases. Katrina was a no-brainer, but was the Terri Schiavo story really one of the top eight news stories of 2005? Resolving the methodology for this bet in 2007 will be tricky. I wonder how the Long Bets Foundation will handle its determination of the victory.

Wikipedia does very well in Google results for topical search terms. Overall, traditional media still dominates (in first appearance as well as number of results), but blogs and Wikipedia do very well in some instances.

What do these results mean? Probably not a whole lot. Nisenholtz asserts that “[news] organizations like the Times can provide that far more consistently than private parties can” while Winer says that “in five years, the publishing world will have changed so thoroughly that informed people will look to amateurs they trust for the information they want”. It’s difficult to draw any conclusions on this matter based on these results. Contrary to what most people believe, PageRank has a bias, a point of view, based largely (but not entirely) on what people are linking to. As someone said in the discussion, this bet is about Google more than influence or reputation, so these results probably tell us more about how Google determines influence on a keyword basis than about how readers of online informational sources value or rate those sources. Do web users prefer the news coverage of blogs to that of the NY Times? I don’t think you can even come close to answering that question based on these results.

Methodology and caveats

The eight news stories were culled from various sources (Lexis-Nexis, Wikipedia, NY Times) and narrowed down to the top stories that would have been prominently covered in both the NY Times and blogs.

The keyword phrase for each of the eight stories was selected by the trial-and-error discovery of the shortest possible phrase that yielded targeted search results about the subject in question. In some cases, the keyword phrase chosen only returned results for a part of a larger news story. For instance, the phrase “pope john paul” was not specific enough to get targeted results, so “pope john paul ii death” was used, but that didn’t give results about the larger story of his death, the conclave to select a new pope, and the selection of Cardinal Joseph Ratzinger as Pope Benedict XVI. In the case of “katrina”, that single keyword was enough to produce hundreds of targeted search results for both Hurricane Katrina and its aftermath. Keyword phrases were not tinkered with to promote or demote particular types of search results (i.e. those for blogs or nytimes.com); they were only adjusted for the relevance of overall results.

The searches were all done on January 27, 2006 with Google’s main search engine, not their news specific search.

Since the spirit of the bet deals with the influence of traditional media versus that of citizen-produced media, I tracked the top traditional media (labeled just “media” above) results and the top citizen media results in addition to blog and nytimes.com results. For the purposes of this exercise, relevant results were those that linked to pages that an interested reader would use as a source of information about a news story. For citizen media, this meant pages on Wikipedia, Flickr (in some cases), weblogs, message boards, wikis, etc. were fair game. For traditional media, this meant articles, special news packages, photo essays, videos, etc.

In differentiating between “media” & citizen media and also between relevant and non-relevant results, in only one instance did this matter. Harriet Miers’s Blog!!!, a fictional satire written as if the author were Harriet Miers, was the third result for this keyword phrase, but since the blog was not an informational resource, I excluded it. In all other cases, it was pretty clear-cut.

Dave Winer comments on the weblogs.com/Verisign deal…odd omission of a link to my post even though he references it several times. Bad luck that I caught him traveling; if I’d realized that beforehand, I would have held off. Dave seems to trust Verisign to do the job; I think Verisign has shown itself to be an untrustworthy, terrible company.

Boy, the scent of money is in the air these days. The latest report is that Dave Winer has sold weblogs.com to Verisign ($2.3 million is the figure being bandied about). This is an interesting one because it seemed crazy (see below) when I first heard about it, but now that I’ve heard it from multiple sources, who knows?

Verisign is interested in blogs and RSS (another of their acquisitions in this space will be announced soon) and it’s not hard to see why Dave would sell weblogs.com (the site needs some firm financial backing to keep from buckling under the ever-increasing strain of all those pings), but to Verisign? To me, Verisign embodies the idiocy and ineptitude of the BigCos Dave often rails against…the BigCo to end all BigCos. If true, those are some odd bedfellows indeed.

We’re getting confirmation that the rumors about Verisign buying Dave Winer’s Weblogs.com are true. The price is $2 million. What Verisign wants with Weblogs is another matter. Weblogs was one of the first, if not the first, centralized ping servers that blogs could use to alert the world to new content.

I like how when a weblog has two independent sources on something, it’s a “rumor”…