That's 1.5 tricks more than you need to be considered a genius, Ballmer says.

Most companies fail, and successful companies are often one-trick ponies, but Microsoft is a two-and-a-half trick pony, according to former CEO Steve Ballmer, speaking at Oxford's Saïd Business School.

He was responding to a question about why Microsoft had failed to innovate in the mobile space, particularly given that it had invented the tablet computer way before it was popularized by Apple.

"Most tech companies fail," Ballmer replied. "They are zero-trick ponies. They never do anything well and they go away. You are a genius in the industry if you are a one-trick pony. You get some innovation right and then spin it. I am very proud of the fact that [Microsoft] has done at least two tricks. Tricks are worth billions and billions and billions of dollars."

He described the first "trick" as inventing the modern PC with Windows and Office. The second was "bringing microprocessor technology into the data center." He later referred to Xbox as being the additional half trick.

"I won't try to tell you that our record of innovation is perfect, but I'd say we've done more tricks than anybody else," he said. "Apple's done two, we've done two-and-a-half—half for Xbox."

When it comes to mobile, he admitted that "we got a bit behind," but said that instead of giving up, Microsoft has tried to work out where it went wrong and how to build on its assets. This led to Surface, Windows Phone, and the Nokia acquisition, which he described as "very important to us."

"With 20/20 hindsight, I regret that we didn't put the hardware and software together soon enough," he added, describing the "magical" way the PC came together with a Microsoft operating system and IBM hardware, or how Android and Samsung have benefited each other.

When asked what the toughest decision he had to make at Microsoft was, he said that the top five were all about "hiring or firing somebody." He said he had "more angst" about those decisions than about the decision to buy Nokia.

The Nokia acquisition was particularly significant as it is a major hardware play. "The name of the company is Micro-soft," he said, referring to the company's software roots. "Xbox, Surface, and phones mean a pretty fundamental change to the way we self-identify and express our value-add."

The notoriously effervescent executive also told the audience that he was "quite a shy kid." The "most transformative" thing that he did to address this shyness was to be the team manager for the Harvard football team, a thankless administrative task. "You had to get up in front of the team every day and tell guys what to do," he said. "Footballers are not nice to managers, so I had to get myself pumped up. Since Microsoft I've had a lot more practice."

One of the final questions he was asked by an audience member was: "What's the best perk of being immensely wealthy and powerful?" After a raucous laugh from Ballmer—who owns four percent of Microsoft—he said that it was the fact that he can "play about any golf course" he wants on the planet. "I get a real kick out of that," he said. "I'm a lousy golfer, but I really love it."

"You thought it would be something bigger and more cosmic," he said, "but noooooo!"

191 Reader Comments

"that it had invented the tablet computer way before it was popularized by Apple."

Bullshit, Microsoft didn't do shit.

Those tablets were just PCs without a keyboard and mouse, but with an awful stylus. If they had been the tablets we know now, they would have succeeded. Those things ran a PC operating system; they had terrible battery life (in fact, the battery life of the PCs of those days); they were big, heavy, clunky, and slow. They had nothing to do with today's tablets. Again: just PCs in a tablet format, not tablet computers.

And if you literally mean the form factor, rather than the usability or figuring out what tasks users would actively prefer a tablet to a traditional form factor for, then it seems to me the credit goes either to Apple, for the Newton line (which predates any MS tablets) or to Gene Roddenberry.

Ballmer has it all wrong. Apple's one and only trick was iTunes with its media deals. That alone begat the dominance of the iPod/iPod Touch. That dominance begat the iPhone when a cellphone was glued onto it. The dominance of both begat the iPad, which is simply a bigger iPod Touch. Go back in time and kill iTunes and the media deals, and none of the others (including Apple itself) would even exist today.

C'mon, man ... give Apple some credit. They made a usable touch-interface. MS tried for so long, but it always failed, b/c they looked at PDA's and Pads as "little computers". If Apple hadn't come along and brought a fully-fingerable (oh god...bad language) interface, MS would still be trying to push stylus-interfaces upon us.

edit:

I think what helped Apple was internet. Old MS PDA's (even Palm) had useless programs on them, like spreadsheets. Very few folks are going to use their PDA to look at a spreadsheet. But, execs used that to justify charging the PDA to the corporate account instead of their own wallet.

Apple came along and said "screw spreadsheets and office crap ... you want internet access, email, youtube, maps". Um...yeah. "Anything else, and let some 3rd party develop it, you pay for it, and we'll host the store you can get it at." Um... ok!

Before then, MS foisted stylus PDA's on folks, and folks had to go out and buy applications for their PDA. No consolidated app-store to make it easy. It's just ridiculous how MS dropped the ball on this by simply thinking folks would accept what was handed to them as "good enough". Then Apple came along and said "fuck that, if we make it easier, more folks will buy-in"... and they did.

I know everyone disses on the stylus, but in 2003 my Pocket PC could do something no Apple product can do, no Android product can do - not even Windows 8 can do: Handwriting recognition. I could write a sentence on that thing and it was translated into ASCII with almost zero errors. Not that chicken-scratch pseudo-printing that Palm made you learn, I'm talking actual cursive. And not in a limited window at the bottom of the screen, like on my Windows XP Tablet Edition, but anywhere on the screen. If anyone offered that today I'd dump my current phone and tablet and switch. The Galaxy Note comes close, and I may eventually get one, but even it doesn't do handwriting recognition on the whole screen, only in a window.

Oh yeah, you can do a lot on a phone... They are 'something', but workhorse computers they are not. Light email, basic maps, and internet use are about all they are good for; they are too small to do much 'real work' on. Separate uses and separate categories.

Quote:

43% of US smartphone buyers are cultists now?

At least, if not more.

Quote:

Why is it valuable to have to figure out how something works? If you're worth $40 an hour and it takes you three hours to figure something out, wasn't the money better spent getting something you can figure out in 10 minutes, so you can spend the next three hours, you know, making $120?

Most people are off the clock when they figure it out, and generally the longer it takes, the more ability the device actually has to do real-world stuff. There is a reason 3-year-olds can play with an iPhone: it's basically designed for kids, to do stuff kids can figure out. Steve Jobs' plan all along was to treat you like a 3-year-old, and guess what, it works as a business model.
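For reference, the opportunity-cost arithmetic in the quoted comment works out like this (a minimal sketch with a hypothetical helper and the quote's illustrative numbers):

```python
# Opportunity-cost sketch of the quoted arithmetic (illustrative numbers only).
HOURLY_RATE = 40.0  # dollars per hour, as in the quote

def net_earnings(hours_available, hours_spent_figuring_out):
    """Earnings over a fixed window after subtracting learning/setup time."""
    return HOURLY_RATE * max(0.0, hours_available - hours_spent_figuring_out)

window = 3 + 10 / 60  # a window of 3 hours and 10 minutes

# Hard-to-learn device: 3 hours lost, only 10 billable minutes remain.
hard = net_earnings(window, 3)
# Easy device: only 10 minutes lost, 3 full billable hours remain.
easy = net_earnings(window, 10 / 60)  # the $120 from the quote
```

Whether the comparison is fair is exactly what the reply above disputes: the calculation assumes figuring-out time displaces billable time.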

Quote:

The only people cheering Apple on are people too dumb to figure out an interface with more than 3 options.

And if you literally mean the form factor, rather than the usability or figuring out what tasks users would actively prefer a tablet to a traditional form factor for, then it seems to me the credit goes either to Apple, for the Newton line (which predates any MS tablets) or to Gene Roddenberry.

Tablets were in 2001 A Space Odyssey before they were in Star Trek, and someone else will probably post an even earlier reference.

Also, you forget the dozens of other things that Apple did in that period to cement their success:
1) Inventory control (days instead of weeks/months)
2) Buying out the majority of supply for things like NAND flash, or minidisc drives, or today chips from Samsung or screens from other suppliers
3) Actually designing CE hardware people would want to buy and pay hundreds of dollars more for!

The second was "bringing microprocessor technology into the data center."

What? Microsoft had nothing to do with that. All the credit goes to Intel/AMD.

If any software can claim credit, it's not Windows, which has never been a dominant force in the server market. Linux came out years ahead of Windows Server and already had 50% market share by the time Microsoft got to 20%. Today Linux is closer to 75% and Windows is still around 20%.

And if you literally mean the form factor, rather than the usability or figuring out what tasks users would actively prefer a tablet to a traditional form factor for, then it seems to me the credit goes either to Apple, for the Newton line (which predates any MS tablets) or to Gene Roddenberry.

Actually, on this one point, I have to mention that "Windows for Pen Computing" came out in 1991. I used to be a Newton developer (as well as dabbling with MagicCap, PalmOS, and a few other things), and I remember "Pen Computing" magazine back then having plenty of articles discussing "Windows for Pen Computing".

It was crap, sure, but it was earlier. (And it was genuinely useful for certain very specific markets, like field data collection.)

What Apple did with the Newton (and later Palm as well) was figure out that the UI conventions for a pen-driven handheld are completely different from the UI conventions for a mouse-driven desktop. For example, the business with the four corners of the screen having an "infinite hit box" (because you can't mouse past them) doesn't apply with a stylus. A menu bar at the top of the screen would be covered by your hand if used with a stylus. You needed completely different UI conventions.
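The "infinite hit box" point can be made concrete with a toy model (entirely hypothetical, not any real toolkit's API): the OS clamps mouse motion to the screen bounds, so overshooting toward a corner still lands on it, while a stylus tap has no clamping at all.

```python
# Why screen corners are "infinite" targets for a mouse but not a stylus.
SCREEN_W, SCREEN_H = 1920, 1080

def clamp_mouse(x, y):
    """Mouse motion is clamped to the screen, so overshoot still hits the edge."""
    return (min(max(x, 0), SCREEN_W - 1), min(max(y, 0), SCREEN_H - 1))

def stylus_tap(x, y):
    """A stylus lands exactly where it touches; overshoot simply misses."""
    return (x, y) if 0 <= x < SCREEN_W and 0 <= y < SCREEN_H else None

# Flinging the mouse far past the top-left corner still selects the corner:
print(clamp_mouse(-500, -500))   # (0, 0)
# The same overshoot with a stylus misses the screen entirely:
print(stylus_tap(-500, -500))    # None
```

This is why a corner-anchored menu bar is a great mouse target but a poor stylus target.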

(Something that's interesting to me is, touch needs different UI conventions than a stylus, or at least enables them. If you're interacting with the app by ephemeral touches, rather than by frequent menu operations and writing, covering the display with your hand becomes less of an issue.)

By the way, another bit of relevant history: know how Palm figured out what UI designs and conventions would work in a handheld? They used to sell Graffiti as add-on software for multiple handheld operating systems. I actually own two copies of it! (I've got the Newton and MagicCap versions. It was amazing on MagicCap, which had no built-in handwriting recognition at all.) So they were intimately familiar with what everyone else was doing in the space, and used that expertise to build PalmOS.

Also, you forget the dozens of other things that Apple did in that period to cement their success:
1) Inventory control (days instead of weeks/months)
2) Buying out the majority of supply for things like NAND flash, or minidisc drives, or today chips from Samsung or screens from other suppliers
3) Actually designing CE hardware people would want to buy and pay hundreds of dollars more for!

I know everyone disses on the stylus, but in 2003 my Pocket PC could do something no Apple product can do, no Android product can do - not even Windows 8 can do: Handwriting recognition. I could write a sentence on that thing and it was translated into ASCII with almost zero errors.

This is very much a "YMMV" thing. I actually had that experience on the Apple Newton (with OS 2.0, which had a better HWR engine than OS 1.X). It very much depends on the individual user's handwriting -- not only how it looks, but how it's executed.

Related note: I'm told that handwriting recognition got much better much more quickly in Japan (even when writing English) because of how handwriting was taught there. In the US, you're taught what the result should look like. In Japan, I'm told people were told precisely what strokes to make and how; it doesn't matter if the result looks like an "A", if you didn't do exactly the right strokes in the right order, it was "wrong". That's why the IBM PC110 (a palmtop 486 with a small HWR area between the keyboard and the display) was released in Japan and not the US. (And yes, this is another piece of old hardware that I own one of. The last time I booted it up, it ran Red Hat Linux very well... on a type 3 PCMCIA hard drive, since it had no internal hard drive of its own.)
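The stroke-order idea above can be illustrated with a toy sketch (entirely hypothetical; real recognizers are statistical, but this shows the principle that the prescribed stroke sequence, not the final ink, is what gets matched):

```python
# Toy stroke-order-sensitive recognition: a character matches only if the
# strokes were drawn in the prescribed order, regardless of how the result
# looks. Stroke names and templates are illustrative, not from any real system.
TEMPLATES = {
    "A": ("left-diagonal", "right-diagonal", "crossbar"),
    "T": ("horizontal", "vertical"),
}

def recognize(strokes):
    """Return the letter whose prescribed stroke order matches exactly, else None."""
    for letter, order in TEMPLATES.items():
        if tuple(strokes) == order:
            return letter
    return None

# Correct order is accepted:
print(recognize(["left-diagonal", "right-diagonal", "crossbar"]))  # A
# The same strokes in the wrong order are rejected, even if the ink looks identical:
print(recognize(["crossbar", "left-diagonal", "right-diagonal"]))  # None
```

Constraining input this way shrinks the search space enormously, which is one reason stroke-order-trained users got better recognition rates.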

Also, it still shows he doesn't get it by not including iTunes/iCloud, which is insanely ironic seeing as he comes from software. A massive chunk of iOS's success comes from the integrated experience of getting apps, music, and movies, and syncing the data. It is now just part of the expected experience of any modern computing device, and it's a large source of revenue for Apple. This is a huge shift from "I buy my hardware here, then I buy my software from another store, then I sign up for online services elsewhere." How does he miss that?

Which specific two tricks was Ballmer referring to in regards to Apple?

I don't think credit for the GUI can go to MS or Apple; it has to go to the inventors, Xerox PARC. I do think the invention of the PC as a mass-market object has to go to Apple, though, with the Apple I. Before that point there really wasn't a conception of "regular" people owning a PC; it was all mainframes and terminals with thousands of dollars' worth of equipment (back when that was expensive). After that point you get "Trash"-80s, Commodore PETs, the Apple ][, the IBM PC (kicking and screaming), etc.

If Xbox is half a trick, then would Apple TV also be half a trick? If not, why not?

The second was "bringing microprocessor technology into the data center."

What? Microsoft had nothing to do with that. All the credit goes to Intel/AMD.

If any software can claim credit, it's not Windows, which has never been a dominant force in the server market. Linux came out years ahead of Windows Server and already had 50% market share by the time Microsoft got to 20%. Today Linux is closer to 75% and Windows is still around 20%.

Oh, I wouldn't say that. The "data center" didn't start when Intel or AMD entered the market, and a lot of people seem to have forgotten Unix.

Of course the data center didn't start when Intel or AMD entered the market. Nobody said Intel and AMD created the data center, they said they brought microprocessor technology into the data center. And it was Linux, not Windows, that made it happen.

In the '90s, when I was with a Fortune 500 company and the data center was Sun and IBM and HP servers - and IBM Big Iron mainframes - you needed mondo permission (and budget) to do anything, and getting your "own" server was impossible. Then this Linux thing came along. It wasn't "standard", but IT had a hard time stopping it. Departments started taking an old PC, loading it with Linux, and making their own servers. Suddenly departments weren't willing to pay tens of thousands of dollars to share a server that they didn't control. Pretty soon IT had a choice - lose their jobs or embrace Linux. The rest is history.

What exactly do you think was inside those Suns and HPs anyway? Linux had nothing to do with microprocessors getting into the datacenter. Once they were already there, it had no problem getting itself installed on them though.

Linux wasn't a threat to MS until NT was well entrenched, itself having replaced the expensive minicomputers in many datacenters. Linux moved in with the promise of lowering that expense even more. It may have been different in your particular case, but broadly, MS definitely had a lot more to do with early datacenter microprocessor adoption than Linux.

Also, it still shows he doesn't get it by not including iTunes/iCloud, which is insanely ironic seeing as he comes from software. A massive chunk of iOS's success comes from the integrated experience of getting apps, music, and movies, and syncing the data. It is now just part of the expected experience of any modern computing device, and it's a large source of revenue for Apple. This is a huge shift from "I buy my hardware here, then I buy my software from another store, then I sign up for online services elsewhere." How does he miss that?

Other than #1 (which by the way is failing), it's all the same customer. It is the youthful trendy person who wants to be entertained on the go.

It's not the person trying to get their multinational business to share files, statistics, e-mail, and raw data across 3 countries to deliver millions of dollars worth of products to millions of customers world wide.

It's not the person trying to run a small at home business in their spare time.

It's not the person that is trying make invitation cards at home, or print out an essay for a school project.

Sure, those people can have "i" products, and many of them will, but those specific categories of needs are not met by "i" products. An iPhone/iPod/iPad isn't going to print your report and it isn't going to run your enterprise software. What it will do, is keep you entertained while you sit on the train, or in the waiting room, or on your couch eating popcorn. That's all "mobile multimedia," that's Apple's one trick. The enterprise, desktop, and console gaming are Microsoft's 2 1/2 tricks.

Microprocessor technology? What do you think they were using before? Fairy dust?

Processors. In my experience, older mini and mainframe computers typically used processors (central processing units) that were not microprocessors (central processing units contained entirely on a single chip). Workstations blurred the line.

(Consider the Sun 2. It had a 68010, but it also required a proprietary memory management unit as a separate thing, and you needed both of 'em.)

That's the thing about lines: they're so flexible. One could argue that if it doesn't come with a hardware FPU, then it's not a microprocessor. Anyway, someone already made the point that it wasn't MS's doing, and even during Intel's Pentium heyday there was the workhorse UltraSPARC.

The Apple fans are going to idolize their favorite and think more highly of its offerings than of anyone else's. Their "god" is obviously the best one.

Microsoft did some good stuff and impacted modern computing (i.e., think broad spectrum). Apple did too. Example: the iPod wasn't the first, but it broke new ground on ease of use. Most Ars readers are tech savvy, but for every one of us there are probably 5 people whose technological acumen extends only as far as swapping out a DVD in the tray and pushing play. Anything that improves accessibility is going to have a huge impact.

These things evolve. Again, in the case of the iPod.... once upon a time Sony came out with this awesome little thing called the Walkman. That came from the desire to listen to cassettes. That evolved. Eventually, CD's came out. Someone adapted the Walkman to CD's. From there, it was really only a matter of time before someone figured out how to store the music on another media.

At what point do we no longer use a portable electronic device, and we have the music stored in an implant of some type and the music is played to us by transmitting the vibrations directly into the jawbone or directly into the auditory nerve? Oh sh*t.... I better patent that idea first................

Oh, I wouldn't say that. The "data center" didn't start when Intel or AMD entered the market, and a lot of people seem to have forgotten Unix.

exactly!!!!

Have to credit the "previous RISC/workstation revolution" (Sun/SGI/DEC, etc., and Xerox) and the microcontroller/microprocessor revolution of the '70s and '80s, and before that all the earlier tech leaders of the '40s through the '70s (from government labs to private industry and startups). Modern computing may be far more advanced now, but it's built on the work of all those who came before, from the recent back to Ada and Babbage and earlier.

It's also a bit amusing to see how many times the wheel gets reinvented.

Look at how some of Intel's research chips resemble the Inmos Transputers (a lot of the resulting cores from the Transputer ended up in set-top boxes in chips from ST).

Apple and Microsoft have contributed a fair bit to modern computing, but also to the dumbing down/standardisation of PCs.

Microsoft's biggest trick: decoupling the OS from the hardware in the microcomputer market. Before MS, the OS was "married" to the hardware. Most people don't remember the markups Amiga and Apple charged for their hardware. Microsoft came up with the following idea: you can choose hardware from any manufacturer you want (competition, prices go down), and our software is just a part of the price.

I'd like to see that markup on the Amiga. Find a PC compatible from 1985 that could display 32 colors from a 12-bit palette, had 4 channels of digital sound, and had a GUI OS. Now compare the PC's cost vs. the Amiga, which had all of that built in.

Everyone seems to have forgotten some of Apple's older tricks:
- desktop publishing, revolutionary for the time
- multimedia publishing: creation, editing, and mastering of CDs and DVDs, plus desktop video editing (maybe another half trick for desktop audio)
- oh, and let's not forget that the Apple II was the first mass-produced personal computer ever

Then you can factor in the iPod and iOS.

Whether or not they won the majority of market share they were, and still are, highly influential and vastly copied.

I know everyone disses on the stylus, but in 2003 my Pocket PC could do something no Apple product can do, no Android product can do - not even Windows 8 can do: Handwriting recognition. I could write a sentence on that thing and it was translated into ASCII with almost zero errors. Not that chicken-scratch pseudo-printing that Palm made you learn, I'm talking actual cursive. And not in a limited window at the bottom of the screen, like on my Windows XP Tablet Edition, but anywhere on the screen. If anyone offered that today I'd dump my current phone and tablet and switch. The Galaxy Note comes close, and I may eventually get one, but even it doesn't do handwriting recognition on the whole screen, only in a window.

For the brief moment in time when I owned a low-end Galaxy Android phone, what impressed me the most was using voice to write IMs. I could push the button, say my IM, and it would transcribe it. Push "send".

I think hand-writing recognition may be neat, but voice-recognition seems to be a much bigger deal now.

1) Bringing the Graphical User Interface to the masses and changing the way people interact with PCs
2) iPod/iTunes store (this might even be two separate ones)
3) iPhone/apps
4) Tablet computing (Microsoft tried and failed so many times on this one)

And each has been worth billions and billions of dollars.

Really?

I will give Apple half for the GUI, for the simple fact that Microsoft brought it to billions of people.
I will give Apple 1 for iTunes, because none of the devices that work with it would have gone anywhere without it.
I will give Apple half for the iPhone, because they really did change the "smart phone"; what was considered a "smart phone" before the iPhone was complete junk by today's standards.
The iPad was a normal progression, a natural evolution from the iPod Touch and iPhone, so I can't give Apple anything for it.
I will, however, give Apple 1 for the Apple I.

All your points are debatable, but half for the iPhone is insane. The iPhone is probably worth two points on its own. Apple single-handedly created the hardware and OS/style for mobile computing. They literally created the modern-day smartphone and tablet industries on their own.

You're not understanding the point of his "pony". They get one point for mobile phones; that's it. You don't get one for the OS on it and one for the hardware. I also don't know why people care this much.

So people can assign half-trick ponies but not multiple-trick ponies? Especially in the case where an OS and the hardware it runs on are very different things and often not even made by the same company.

For all you know I am a brony and thus am an expert on ponies.

I don't care about this magical unit of measurement as much as the irrational idea that the iPhone was not worth a full unit. In fact, the Xbox was assigned half a trick pony. Equating the impact of the Xbox to that of the iPhone is crazy. The iPhone would be 500 trick horses if the Xbox is half a trick pony.

Technologically speaking the two biggest mainstream advancements in the last twenty years are the invention of the Internet by Al Gore (or the expansion of the Internet to the general public) and the introduction of the iPhone.

1) Bringing the Graphical User Interface to the masses and changing the way people interact with PCs
2) iPod/iTunes store (this might even be two separate ones)
3) iPhone/apps
4) Tablet computing (Microsoft tried and failed so many times on this one)

And each has been worth billions and billions of dollars.

It's not a list of what MS has done, it's a list of "tricks". Which doesn't mean doing it first, but doing it successfully and being able to continue to generate revenue and products well as a business on that same core idea.

Sure, Apple did the GUI. But it was a niche product until Windows brought GUI to the masses - so, for example, if the GUI was anyone's trick (I'd argue it isn't, as no one has managed to capitalize on it almost exclusively -- it'd be like counting a car engine as a trick), it'd be *Microsoft's* for bringing it to people.

Similarly, 2-4 I'd call "iOS + ecosystem". That is the trick. The devices are "iOS, iOS + phone, iOS on big screen". The actual "trick", the actual monetized product and mindshare / revenue generator, is iOS. So, I'd call Apple *probably* a one-trick pony, but I'd have to think on it a bit deeper than I'm inclined to at the moment to say so with conviction.

It's also why Ballmer called the datacenter an MS trick. Breaking the vendor-specific nature of things was HUGE. It's still huge. It was a gateway to the enterprise market, too, which MS dominates.

The second was "bringing microprocessor technology into the data center."

What? Microsoft had nothing to do with that. All the credit goes to Intel/AMD.

If any software can claim credit, it's not Windows, which has never been a dominant force in the server market. Linux came out years ahead of Windows Server and already had 50% market share by the time Microsoft got to 20%. Today Linux is closer to 75% and Windows is still around 20%.

Oh, I wouldn't say that. The "data center" didn't start when Intel or AMD entered the market, and a lot of people seem to have forgotten Unix.

I haven't forgotten Unix, I just didn't mention it. I was talking about Windows and simply pointing out that Linux is currently ahead of Windows in terms of server influence, and it has always been ahead of Windows; at no point was Windows ever a dominant force in servers. Yet Ballmer seems to think otherwise.

Ballmer kept hammering away at the PC side of things when he should have focused on mobile. Huge fumble there. MS has ground to make up. Can they do it? Sure they can, but it will take time.

His comment about "Micro-SOFT" buying up hardware ... if you're so focused on a brand name that it dictates what you do and don't do, then you need to seriously have your head examined as a CEO. Who cares if your name is Microsoft ... buy fucking hardware and leverage that shit to your advantage. It's a no-brainer.

Of course, it's easy to armchair-quarterback Ballmer. None of us were the head of a multi-billion-dollar company. But maybe it's easy because we're outside looking in. Ballmer was perhaps so mired in day-to-day ops that he couldn't look at things outside-in the way Jobs could.

I don't really have any significant issues with Ballmer the way many seem to. I think Microsoft's biggest issue is that they grew stale. They also had a massive amount of brain drain because of their own success: so many of their employees became millionaires in the early days that they either left or were less motivated/interested. Mix in the rise of the public Internet and the dot-com bust, and not only had Microsoft lost a massive number of good employees, they then had a lot of competition for the scraps and the up-and-comers.

Oh yeah, you can do a lot on a phone... They are 'something', but workhorse computers they are not. Light email, basic maps, and some internet use are about all they are good for; they are too small to do much 'real work' on. Separate uses and separate categories.

That depends entirely on your definition of 'real work'.

By my definition of 'real work' a smartphone absolutely is capable. The thousands of people using the enterprise iOS app I wrote spend hours every day, five days a week, doing computational work in the field with their smartphones. They used to spend half their day taking notes on a clipboard, then go back to the office and spend the other half working through those notes on a computer. Now they do all the computing in the field.

For a lot of workers, the primary function of a computer is communication, and smartphones do a perfectly good job of communication.

I'm never going to replace my desktop workstation with two or three 27" displays for a 4" phone, but that's just me; my work is different from other people's work. Not everybody needs a large screen to get work done. We did real work for many thousands of years without any computing devices at all.

Why is it valuable to have to figure out how something works? If you're worth $40 an hour and it takes you three hours to figure something out, wasn't the money better spent on something you can figure out in 10 minutes, so you can spend the next three hours, you know, making $120?
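To put rough numbers on that trade-off (a quick sketch; the $40/hour rate and the time figures come from the post above, everything else is just illustrative):

```python
# Opportunity cost of time spent "figuring out" a device instead of billing.
HOURLY_RATE = 40.0  # dollars per hour, taken from the post above

def opportunity_cost(hours_spent: float) -> float:
    """Dollars of billable time lost while fiddling with the device."""
    return hours_spent * HOURLY_RATE

hard_device = opportunity_cost(3.0)      # three hours lost figuring it out
easy_device = opportunity_cost(10 / 60)  # ten minutes lost

print(hard_device)                # 120.0
print(hard_device - easy_device)  # what the easier device saves you
```

So the "hard" device costs you the full $120 the post mentions, while the easy one costs under $7 of your time; the difference is the money left on the table.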

Most people are off the clock when they figure it out, and generally the longer it takes, the more ability the device actually has to do real-world stuff. There is a reason 3-year-olds can play with an iPhone: it's basically designed for kids, to do stuff kids can figure out. Steve Jobs' plan all along was to treat you like a 3-year-old, and guess what, it works as a business model.

Quote:

The only people cheering Apple on are people too dumb to figure out an interface with more than 3 options.

So, like, 90% of humanity?

Yep at least 90%

Really, so you think I'm a three-year-old who can barely figure out an interface with more than 3 options? I'm pretty sure the amount of time I spend at a Unix prompt or writing C code suggests otherwise.

Apple does well with OSX devices but the market is going nowhere. OSX is still peanuts compared to Windows/Office or iOS devices.

Mac sales are climbing despite a shrinking traditional PC market. In a few years the landscape could be very different from the OSX landscape of today. I recently purchased my first Mac ever after being a Windows user since 3.1, and I really can't see myself going back.

Apple does well with OSX devices but the market is going nowhere. OSX is still peanuts compared to Windows/Office or iOS devices.

Mac sales are climbing despite a shrinking traditional PC market. In a few years the landscape could be very different from the OSX landscape of today. I recently purchased my first Mac ever after being a Windows user since 3.1, and I really can't see myself going back.

I have a work colleague who did the same thing a few years ago.

He had some hardware problems and got burned by choosing the wrong Mac (against my advice, by the way...), so after two years as a Mac user he went back to Windows. From his complaints the last few weeks, though, I suspect his next purchase will be a Mac.

Ballmer has it all wrong. Apple's one and only trick was iTunes with its media deals. That alone begat the dominance of the iPod/iPod Touch. That dominance begat the iPhone when a cellphone was glued onto it. The dominance of both begat the iPad, which is simply a bigger iPod Touch. Go back in time and kill iTunes and the media deals, and none of the others (including Apple itself) would even exist today.

C'mon, man ... give Apple some credit. They made a usable touch interface. MS tried for so long, but it always failed, b/c they looked at PDAs and pads as "little computers". If Apple hadn't come along and brought a fully-fingerable (oh god...bad language) interface, MS would still be trying to push stylus interfaces upon us.

edit:

I think what helped Apple was the internet. Old MS PDAs (even Palms) had useless programs on them, like spreadsheets. Very few folks are going to use their PDA to look at a spreadsheet. But execs used that to justify charging the PDA to the corporate account instead of their own wallet.

Apple came along and said "screw spreadsheets and office crap ... you want internet access, email, youtube, maps". Um...yeah. "Anything else, and let some 3rd party develop it, you pay for it, and we'll host the store you can get it at." Um... ok!

Before then, MS foisted stylus PDAs on folks, and folks had to go out and buy applications for their PDA separately, with no consolidated app store to make it easy. It's just ridiculous how MS dropped the ball by simply assuming folks would accept what was handed to them as "good enough". Then Apple came along and said "fuck that, if we make it easier, more folks will buy in"... and they did.

I know everyone disses the stylus, but in 2003 my Pocket PC could do something no Apple product can do, no Android product can do - not even Windows 8 can do: handwriting recognition. I could write a sentence on that thing and it was translated into ASCII with almost zero errors. Not that chicken-scratch pseudo-printing Palm made you learn; I'm talking actual cursive. And not in a limited window at the bottom of the screen, like on my Windows XP Tablet Edition, but anywhere on the screen. If anyone offered that today I'd dump my current phone and tablet and switch. The Galaxy Note comes close, and I may eventually get one, but even it doesn't do handwriting recognition on the whole screen, only in a window.

I still think I could bang out five hundred words in Graffiti. The original Palm devices do deserve credit. They were the real predecessors of today's mobile device market, as opposed to the "smartphone" market of 2006.

I actually still have an original Palm Pilot in its original shrink wrap. I had bought one for a group of my employees when they first came out, and I bought one for one of my best friends. My friend was not into computers at all, so it ultimately just stayed at my house and has kicked around ever since.

I certainly would take a stylus over a little tiny hamster keyboard. I still remember the chorus of "fail" heard when it was shown the iPhone would not have a stupid hardware keyboard.

While I like the stylus, the touch screen simply offers too many advantages over the stylus.

By my definition of 'real work' a smartphone absolutely is capable. The thousands of people using the enterprise iOS app I wrote spend hours every day, five days a week, doing computational work in the field with their smartphones. They used to spend half their day taking notes on a clipboard, then go back to the office and spend the other half working through those notes on a computer. Now they do all the computing in the field.

For a lot of workers, the primary function of a computer is communication, and smartphones do a perfectly good job of communication.

I'm never going to replace my desktop workstation with two or three 27" displays for a 4" phone, but that's just me; my work is different from other people's work. Not everybody needs a large screen to get work done. We did real work for many thousands of years without any computing devices at all.

Again, that is taking notes or doing basic PHONE stuff, not really computing. Totally different purposes, and not really the same demographics or an 'apples to apples' comparison... At the end of the day nobody is going to write a massive report, a multi-million-dollar presentation, or a thesis on a phone. With Windows you have a whole host of applications designed for pretty much any purpose there is; with OSX you'd better hope they spent the extra to port it, which is not usually the case.

Quote:

Bullshit. There are some "cultists" but it's not everyone. Please don't brand me as a cultist just because I happen to like the same products as they do.

Calls 'em like I sees 'em; maybe you should have better taste, then.

Quote:

Really, so you think I'm a three year old who can barely figure out an interface more than 3 options? I'm pretty sure the amount of time I spend at a unix prompt or writing C code suggests otherwise.

Okay, so why wouldn't you go with a higher-powered device based on the Linux core that you can customize, root, and modify, versus a sandboxed toy for 3-year-olds? You pick the toy; why throw a fit when you get called on it?