Jobs later said, “If I had never dropped in on that single calligraphy course in college, the Mac would have never had multiple typefaces or proportionally spaced fonts.”[47]

In the fall of 1974, Jobs returned to California and began attending meetings of the Homebrew Computer Club with Wozniak. He took a job as a technician at Atari, a manufacturer of popular video games, with the primary intent of saving money for a spiritual retreat to India.

Jobs returned to his previous job at Atari and was given the task of creating a circuit board for the game Breakout. According to Atari founder Nolan Bushnell, Atari had offered $100 for each chip that was eliminated in the machine. Jobs had little interest in or knowledge of circuit board design and made a deal with Wozniak to split the bonus evenly between them if Wozniak could minimize the number of chips. Much to the amazement of Atari, Wozniak reduced the number of chips by 50, a design so tight that it was impossible to reproduce on an assembly line. According to Wozniak, Jobs told Wozniak that Atari had given them only $700 (instead of the actual $5,000) and that Wozniak’s share was thus $350.[52]

After leaving Apple, Jobs founded NeXT Computer in 1985 with $7 million. A year later, Jobs was running out of money, and with no product on the horizon, he appealed for venture capital. Eventually, he attracted the attention of billionaire Ross Perot who invested heavily in the company.[59]

In 1986, Jobs bought The Graphics Group (later renamed Pixar) from Lucasfilm’s computer graphics division for the price of $10 million, $5 million of which was given to the company as capital.[64]

Jobs became de facto chief after then-CEO Gil Amelio was ousted in July. He was formally named interim chief executive in September 1997.[68] In March 1998, to concentrate Apple’s efforts on returning to profitability, Jobs terminated a number of projects, such as Newton, Cyberdog, and OpenDoc. In the coming months, many employees developed a fear of encountering Jobs while riding in the elevator, “afraid that they might not have a job when the doors opened. The reality was that Jobs’ summary executions were rare, but a handful of victims was enough to terrorize a whole company.”[69]

The talk about “Bad Steve” reminds me of Gundotra telling the “Icon Ambulance” story on the day Steve stepped down, in which Steve calls him up on a Sunday because “we have an urgent issue” involving the color gradient on the Google iPhone icon.

For many people, that kind of anal attention to detail (and describing it as an “urgent” matter) would be the key sign of a “bad” boss. I think it’s acceptable in Steve’s case because if anyone has shown a good grasp of the cost-benefit analysis of such perfection, it’s him.

Pixar is beyond “industry-dominating”. They basically invented the (CG film) industry. When Steve first acquired Pixar, it was losing $1 million a year. It takes a true visionary to realize the potential in an idea and work through what to most would seem like an insurmountable challenge.

I think it’s more than just the Apple II and the iPhone. The way I see it, the man’s main hobby was disruption. He disrupted the PC industry with the Apple II and later the Mac line. He disrupted the music industry with the iPod and iTunes, and the phone industry with the iPhone. He changed the animation industry with Pixar, one of the most successful movie companies both financially and critically. And with the iPad he revived an industry most considered a dead horse, one where most companies are still playing catch-up.

I agree that if this had happened back in the ’90s, he wouldn’t be as praised as he is today, but what does that have to do with anything? If I were to judge anyone before their greatest moments, they wouldn’t pass the bar. And yeah, he was kind of an ass in his early days, but by most accounts he became a lot more mellow after returning to Apple.

I think you nailed it quite well: it is not about WHAT he did, but HOW he did it. Disruption was his way of doing things. And it matches his “Think Different” heroes quite well; all of them were disruptors.

I imagine you’re right, but so what? A lot of excellent people are unpleasant in person in some circumstances. There’s a lot more to being praiseworthy than simply being a nice person.

I’ve worked with dictators (kitchen chefs in very high-end restaurants) since I was 13. First I hated it; then it got to me. If you’re truly passionate about something, and it’s your life mission, not much else matters.

Sometimes, not always, he’d invite me in to see certain big products before he unveiled them to the world. He may have done the same with other journalists. We’d meet in a giant boardroom, with just a few of his aides present, and he’d insist — even in private — on covering the new gadgets with cloths and then uncovering them like the showman he was, a gleam in his eye and passion in his voice. We’d then often sit down for a long, long discussion of the present, the future, and general industry gossip.

One year, about an hour before his appearance, I was informed that he was backstage preparing dozens of slides, even though I had reminded him a week earlier of the no-slides policy. I asked two of his top aides to tell him he couldn’t use the slides, but they each said they couldn’t do it, that I had to. So, I went backstage and told him the slides were out. Famously prickly, he could have stormed out, refused to go on. And he did try to argue with me. But, when I insisted, he just said “Okay.” And he went on stage without them, and was, as usual, the audience’s favorite speaker.

He quipped: “It’s like giving a glass of ice water to someone in Hell.” When Gates later arrived and heard about the comment, he was, naturally, enraged, because my partner Kara Swisher and I had assured both men that we hoped to keep the joint session on a high plane.

In a pre-interview meeting, Gates said to Jobs: “So I guess I’m the representative from Hell.” Jobs merely handed Gates a cold bottle of water he was carrying. The tension was broken, and the interview was a triumph, with both men acting like statesmen. When it was over, the audience rose in a standing ovation, some of them in tears.

He certainly had a nasty, mercurial side to him, and I expect that, then and later, it emerged inside the company and in dealings with partners and vendors, who tell believable stories about how hard he was to deal with.

But I can honestly say that, in my many conversations with him, the dominant tone he struck was optimism and certainty, both for Apple and for the digital revolution as a whole.

He looked at me like I was crazy, said there’d be many, many stores, and that the company had spent a year tweaking the layout of the stores, using a mockup at a secret location. I teased him by asking if he, personally, despite his hard duties as CEO, had approved tiny details like the translucency of the glass and the color of the wood.

He said he had, of course.

I begged him to return to the house, noting that I didn’t know CPR and could visualize the headline: “Helpless Reporter Lets Steve Jobs Die on the Sidewalk.”

But he laughed, and refused, and, after a pause, kept heading for the park. We sat on a bench there, talking about life, our families, and our respective illnesses (I had had a heart attack some years earlier). He lectured me about staying healthy. And then we walked back.

If you wonder why the original iPhone had a 3.5-inch screen, you’ve got to go back to 2007, when everyone else was still trying to copy the BlackBerry. A 3.5-inch screen at that time was already unprecedented.

After 15 days with a Samsung Galaxy S II, the “best Android phone you can buy, anywhere,” I have realized another huge downside of larger screens: when holding the phone with one hand, I can’t reach the other side of the screen with my thumb.

On the iPhone’s 3.5-inch screen, by contrast, it’s easy to touch any area of the screen with your thumb while holding the phone in one hand. It is almost impossible to do this on the Galaxy S II.
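The geometry behind this complaint is easy to sketch. The following is a back-of-the-envelope check, not anything from the original posts, and it rests on two assumptions: the thumb pivots near the corner where the hand grips the phone, so the farthest point it must cover is the opposite corner, one full screen diagonal away; and a comfortable one-handed thumb reach of roughly 3.6 inches, a hypothetical figure chosen between the two diagonals (real reach varies per hand):

```python
import math

THUMB_REACH_IN = 3.6  # assumed comfortable thumb arc, in inches (not measured)

def screen_dims(diagonal_in, aspect_w, aspect_h):
    """Width and height (inches) of a rectangular screen, from its diagonal."""
    unit = diagonal_in / math.hypot(aspect_w, aspect_h)
    return aspect_w * unit, aspect_h * unit

phones = {
    "iPhone (2007), 3:2 aspect": (3.5, 3, 2),
    "Galaxy S II, 16:9 aspect": (4.3, 16, 9),
}

for name, (diag, aw, ah) in phones.items():
    w, h = screen_dims(diag, aw, ah)
    # The farthest point from the grip corner is the opposite corner,
    # exactly one diagonal away.
    ok = diag <= THUMB_REACH_IN
    print(f"{name}: {w:.2f} x {h:.2f} in, farthest corner {diag:.1f} in -> "
          f"{'reachable' if ok else 'out of reach'}")
```

Under these assumptions the 3.5-inch diagonal stays inside the thumb’s arc while the 4.3-inch one does not, which matches the commenter’s experience; the point is only that the difference is geometric, not a matter of habit.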

This is an example of one of those design decisions that you don’t usually notice until you see someone doing it wrong. It’s one of the things that makes Apple products Apple products.

What that chaos needed was curation — a way to get value out of the information flood. But the role of the curator has been a contentious one, and not everyone has been on board with the concept.

That development, not surprisingly, creates a new problem. “The problem is who gets heard,” Blau says. “The real issue that remains is access to an audience.”

Blau is right: Speech is easy. Being heard is hard and getting even harder.

“We don’t have an information shortage; we have an attention shortage,” Godin said. “There’s always someone who’s going to supply you with information that you’re going to curate. The Guggenheim doesn’t have a shortage of art. They don’t pay you to hang paintings for a show — in fact you have to pay for the insurance. Why? Because the Guggenheim is doing a service to the person who’s in the museum and the artist who’s being displayed.”

“The content aggregators are vampires!” said the always colorful Cuban. “Don’t let them suck your blood.” Cuban points to sites like Google News and The Huffington Post as the most aggressive content criminals. He tends to see no value in folks who gather, organize, summarize, or republish. He only finds value in content creation: “Vampires take but don’t give anything back.”

Not surprisingly, Godin wrinkles his nose at Cuban’s vampire metaphor. Simply put, he says it’s all wrong. “When a vampire sucks your blood, you make new blood,” Godin says. “The thing about information is that information is more valuable when people know it. There’s an exception for business information and super-timely information, but in all other cases, ideas that spread win. I’m not talking about plagiarism; I’m talking about the difference between obscurity and piracy. If the taking is so whole that the original is worth nothing … that’s a problem.”

Scoble has declared curation as the next “billion dollar” opportunity and wonders aloud as to whether he should “create or curate” as tech news breaks in Silicon Valley. Scoble says a curator is “an information chemist. He or she mixes atoms together in a way to build an info-molecule. Then adds value to that molecule.”

“I used to drink from the real-time fire hose, because on the social web, everything was about real time,” says Brian Solis, author of Engage. “Then I realized over the years that it’s actually more about right time than real time. In fact, when information comes through, it doesn’t necessarily mean that that’s the right time to engage, capture it, and share it. I’m more successful now creating a list of information, relevant information, and then repackaging, repurposing, and broadcasting that information at the right time.”

Curation taps the vast, agile, engaged human power of the web. It finds signal in the noise. And it’s most certainly going to unleash a new army of web editors armed with emerging curation tools.

But directories had clumsy interfaces, and they didn’t scale to the overwhelming growth in the number of websites. There were too many sites to catalog, and it was hard to determine the relative rank of one site against another, particularly in the context of what any one individual might find relevant. (This is notable, because where directories broke down was essentially their inflexibility in dealing with individuals’ specific discovery needs. Directories failed at personalization, and because they were human-created, they failed to scale. Ironically, the first human-created discovery product failed to feel… human.)

But once again, what we want to pay attention to is changing. Sure, we still want to find good sites (Yahoo’s original differentiation), and we want to find just the right content (Google’s original differentiation). But now we also want to find out “What’s Happening” and “Who’s Doing What”, as well as “Who Might I Connect With” in any number of ways.

Over the past 20 years what we’ve seen is the democratization of the first two principles above. Hypertext and Google made it possible for anyone to access any type of document. Platforms like WordPress, YouTube, Twitter and even Flickr have made it possible to disseminate documents. As Twitter co-founder Ev Williams said at last year’s Web 2.0 Summit, one of his overarching goals was to democratize publishing “by lowering the bar about as far as it could go”.

The graphic above illustrates this process of democratization. At the apex of the pyramid you tend to have people that use a technology as part of what they do for a living. With each successive shift in the user base a broader audience begins to adopt that technology until, at the bottom of the pyramid, the technology has permeated a society.

So what is the evidence that this is starting to happen? Along with the rapid growth of our community, the composition of our user base has shifted. At first it was people that curate as part of their daily work: researchers curating content for a project, journalists using our tools for their work. Now it’s broader and more representative of the early-adopter crowd.

They’re “incidental altruists” – people that are happy to share what they’ve curated but aren’t going out of their way to build value for a community. In other words, people that are mainly curating for themselves but are happy if what they’ve done benefits others.

So is curation here to stay? I think it’s safe to say it is. Curation has been part of the DNA of the Web from its very origin; we have strong evidence that curation is at the beginning of the democratization process; and investors are providing the capital to ensure that this process continues. Given these facts, I’m confident that curation is not just the flavor of the month but a rapidly growing trend that represents both a third phase of the Internet and a huge opportunity for entrepreneurs and investors.

In other words, the connections between objects are far more valuable than any object individually. Existing search engines and social networks are good at connecting us to individual pieces of information and content, often providing recommendations based on our explicit and implicit browsing history. However, no modern entity has truly connected the fragmented web.

Obviously there are holes in this, such as people gaming the up-vote mechanisms.

I think you put your finger on it there. There are folks who have a financial incentive for their pages to rank high. A “user-curated” system just opens up additional avenues for “spam and SEO content farms” to game the system.

Nope, sorry. There are users with a financial incentive to game the system. The “actual measures to prevent abuse” would have to be pretty damned sophisticated. Algorithmic, in fact. Which brings us full circle…

(Good) blogging requires time, talent, and inspiration, and it’s hard to have all three at once all the time.

While the above arguments suggest there could be more curators than bloggers, the number of readers and followers of curators and curation platforms should be even bigger, making the use of curation mainstream.

What we’ve learned during this ongoing exercise is that curators are a diverse group, but there seem to be several common denominators: most curators using Pearltrees are educated, highly intelligent people with a somewhat altruistic nature. They tend to curate first because they want a reliable archive of interesting things they’ve found, but at the same time they have a strong desire to share what they know or have learned with anyone else who has a similar interest or inclination.

There is one other compelling reason why I think this is likely: human beings are all natural curators. It would be difficult to find a person who doesn’t collect something and take pleasure in the activity. For some people it’s baseball cards, for others high-end automobiles, for others photos. The thing is, regardless of what a person collects and organizes, it’s still an act of curation. In other words, everyone curates, and what Pearltrees and other tools are doing is simply bringing to the digital world abilities which have long existed in our day-to-day lives.

But to add my two cents to this dialogue, I do not believe that anyone with the time, talent, and inspiration, as Guillaume Decugis points out, wishes to become just a content curator. Instead, the talented and inspired individual probably wants to do both: create original, compelling, exciting content AND serve as a curator a reader can trust.

Curation may seem like a simpler form of expression, but in order to become a trusted resource, the author/blogger would need to add his or her own thoughts to the article or conversation: “blogger lite,” if you will.

The average user wants to save and share content. Why push curation when this model can essentially be adapted to become curation?