Research firm IDC (which is part of my former employer, IDG) has released new numbers on PC sales, and there’s only one way to describe them: they’re uuuuuuuuuuuuuuugly.

How ugly? Worldwide shipments in the first quarter were down 13.9% from the first quarter of 2012. That’s not only worse than IDC’s already gloomy expectations — it’s the biggest drop since 1994, when the company started publishing these quarterly figures.

It’s tempting to blame the bad news on Windows 8, but it can’t shoulder all the guilt — IDC said Apple is shipping fewer Macs these days too, so it’s conventional personal computers in general that are suffering from the doldrums, not ones based on a particular operating system.

Are people not buying Windows PCs and Macs because they’re spending their dough on iPads and other tablets instead? Well, maybe. It’s worth noting that in December 2009, right before the iPad was announced, IDC said it expected PC sales in 2013 to be growing by double digits. (That’s yet more evidence that trying to predict long-term sales of technology products is not an inexact science, but a hopeless one.)

I can’t help but think, though, that the first signs that the PC market might be maxing out came in early 2007, before Windows 8, the iPad or even the iPhone had any influence on the business. That’s when Microsoft released Windows Vista, and an enormous number of consumers and businesses responded by saying, essentially, “No thanks, we’re perfectly happy with Windows XP.” Even today, almost a dozen years after XP’s release, the company is trying to convince a fair chunk of the PC-using world that it didn’t perfect the PC operating system back in 2001.

It’s not just that people are stubbornly refusing to see newer versions of Windows as superior to older versions of Windows. Back in the 1990s and early years of this century, PC hardware was getting better at such a rapid clip that new PCs were often far better than the machine you’d bought two or three years earlier. Today, even a four- or five-year-old PC may still have more processing power, RAM and disk space than you need. And the industry is having trouble coming up with new features that large numbers of people find irresistible.

I’m a fan of ultrabooks myself, but if they’ve failed to restore the PC business to vibrant growth, it may be because they’re about stripping out features rather than adding new ones — they have smaller screens, less storage, fewer ports and no DVD drive. That’s a tough sell to garden-variety PC buyers who have been conditioned to expect next-generation computers to offer tangible new stuff, not tasteful minimalism.

People, generally speaking, are sensible. The PC market grows when the industry gives them computers they see as meaningful improvements on what they already have. It stays stagnant, or shrinks, when new PCs don’t look that much better.

Folks aren’t going to stop purchasing conventional computers altogether, of course; the PC industry isn’t the PDA industry. But maybe it’s turning into the TV market — one in which a typical household buys the best product it can afford, then holds on to it until it breaks or a true great leap forward (like HDTV) comes along.

There is one meaningful caveat: IDC doesn’t include Windows tablets and tablet-laptop hybrids (like notebooks with detachable keyboards) in its study, which means that it’s not accounting for sales of many of the most interesting, forward-looking Windows 8 computers. Nor does it include iPads, Android tablets and other devices people are using to perform tasks that look like personal computing to me.

If IDC and other research firms defined a PC as “a computing device used for general-purpose personal and business productivity, capable of running user-installable software,” their numbers wouldn’t show the market in decline. Instead, the market would look like it was booming — and we pundits would all be writing about how enthusiastically consumers and companies were adopting the radically new sorts of PCs that the industry was introducing.

I pretty much agree with everything here, but I'll add that more people than ever are scratch-building PCs. There are 26 comments here, and at least 3 of us have one or more home-built PCs.

I think PCs will eventually settle down as a sector of a larger computing ecosystem. I mean, I love my Nexus tablet, but if I'm hanging out at home I would much rather surf and play on my 24-inch widescreen and type on my keyboard.

Pretty much. Hewlett-Packard is NOT a good indicator of the PC market's health. Neither is Dell. Why would I buy a Dell or HP when I can get better hardware at near-wholesale prices and build the PCs myself? It's not hard to do. IT training is widespread. I can do it. Millions of other people can do it. There's no need to pay a middleman hundreds of dollars to do what you can do yourself in an hour and a half or less. If your studies aren't covering sales of OEM kits and individual hardware, then chances are you're missing the big picture. If you see a sales combo of, at minimum, a motherboard, CPU and power supply, chances are it's part of a full system rebuild. Or a new PC. Especially if cases are selling along with it. And GPUs. And memory, hard drives, disc drives, monitors, keyboards and mice.

Then, you have tablets. Tablets cut into the laptop market in a direct sort of way. Laptops sell to the average Joe on their portability and utility more than on power. Tablets are more portable, pretty useful and, frankly, cheaper as well. Laptop sales are going to be limited to die-hard users of that medium, such as people who want souped-up gaming laptops.

What about components for custom builds and upgrades? The fact is, in 1991 very few people built their own PCs; in 2013 I personally have 4 PCs in the house, of varying spec, all built from scratch. Even taking only my main machine into account, it's a 3-year-old i7 quad, and I just dropped around £600 on upgrades for it. Since upgradeability is one of the key selling points of desktop PC architecture, not including the sale of components for those machines is a glaring omission, especially considering the increase in people building their own computers.

The other issue people are ignoring is the gaming market. One big problem for PC gaming was the dominance of consoles: many PC titles were "dumbed down" in order to allow console ports of the same titles, which also stifled the hardware market to some extent. So now, at the end of a console generation, even a mid-level PC can easily play the most demanding titles. With the advent of the PS4 and the new Xbox, and more importantly their shift to x86 architecture, we can expect to see an increase in high-end PC games. Additionally, the display market has slowed down, with most monitors settling at 1080p resolution. So over the next year or two we can expect to see a leap in hardware as super-high-res displays become more commonplace and developers start to bring out more demanding titles.

The reduction in the PC market is probably inevitable. But the area where market share is being lost unnecessarily is in providing user-friendly products for seniors and other not-tech-savvy persons. I am in my sixties and have used PCs for decades and e-mail regularly. But my ability to use PCs and other technology at all is not because of the design, the tutorials, or the tech support, all of which are abysmal. Windows 8 (which I use as I write this) I have learned to use, to a limited degree, not by online or offline instructions, but by trial and error, using time that is available to me only because I am retired. As I buy each piece of new technology, and try to learn how to use it, I cringe in frustration at the ambiguous, incomplete, and near-useless sources of instruction, comprehensible only to those who already know how that technology works.

More PCs and related products could be sold if major companies 1) tested their products using lay persons, not engineers (and if they do this already, it is being done incompetently), and 2) hired English majors to design their instructional materials.

I would extend Randall's point to computers in general at different price points; an entry-level $300 laptop doesn't behave that much differently on the internet from a $500 laptop or even a $3000 laptop. It's only when you're doing things well beyond mere word processing and internet browsing that you'll actually feel the difference between computers at different price points, and really, Photoshop still runs pretty well on my 4-year-old MacBook Pro when processing 16-megapixel files at 16 bits per pixel.

PCs have not improved much since XP. Nobody really needs 64-bit, or programs that you have to pay for every month either. The people who could use 64 bits are gamers, but manufacturers just don't make the expandable machines they would need, and the Xbox has taken a big part of that market.

For most people, internet access is all they need and they can get that on their overly expensive phones.

I think desktop PCs are going the way of the VHS tape. My current laptop, console, and portable devices all have more processing power than my old desktop. I'm only keeping that giant paperweight until I get a new scanner. Then I'm through with desktop PCs. BTW: Here's my website on technology if anyone is interested: http://technology4democracy.com/ PEACE! ^_^

The first thing that popped into my mind: great, another doomsayer praising tablets that are essentially useless for any serious work. Needless to say, I was pleasantly surprised. I agree that the PC market is turning into the TV market. After an era that saw the growth from expensive 60 MHz CPUs to affordable 2 GHz+ quad-core ones, we now live in an era of only slow and incremental performance changes.

I was arguing with a friend about Moore's Law a few years ago. I argued that there is a wall, that we were going to hit it soon, and that progress would slow WAY down. But I also argued that nobody would notice, since we already have so much surplus computing power lying around, even in cheap throwaway consumer devices, that it would take decades for software to really transform and take advantage of it.

The innovation space is in the interfaces. We now have so much cheap power that it has finally become more important HOW we interact with computers than that we can do various things at all. And we have the computing power to explore ways of interacting that nobody would ever have even considered 10 years ago. I continue to believe that "touch" is mostly a transitional fad. We will move straight on through it to truly organic and integrated interfaces that just mesh with our lives invisibly and effortlessly.

I really think this is because a lot of people have PCs, and performance has really stalled except in the gaming area and on the mainframe side. When something new comes out that pushes the boundaries past what we have, I think we will want it in our PCs as well as our tablets, not just tablets. I'm still coding on a 1.6 GHz machine with 2 GB of RAM. Yes, it's slow, but I need a laptop, and though I use my phone a lot for the basic things, it still ain't no laptop or PC. If anything, I think the PC might be gone for me personally, but a laptop will always be needed over a tablet. The combo of a tablet and keyboard could then take a laptop out of action. But I really think this will depend on the size of the next computer architecture and where it goes. I remember seeing, 10 years ago, those CPUs that would use atoms (living cells or something like that), but as far as I know it still hasn't happened. Talk to a gamer, and the PC won't go away. But talk to just your everyday user, and yes, it's gone. Then again, I look around and I know nearly everyone has a PC, but not a tablet, or in some cases even a decent phone.

Although the article says Win 8 is not the total reason for the slowing PC sales, I would have to say it is the "main" reason. The second "main" reason for slowing sales is Obama's economy.

On another note, one concern I have with PC manufacturers is that they totally convert to building nothing but tablets, netbooks, etc. As a systems designer and developer I need a powerful PC with expansion capabilities and lots of real estate on my monitors (2 X 27 inch).

I disagree. I think a 4-core CPU, 32 GB of RAM, a 5 TB HD, a 256 GB SSD, a $1000 graphics card, and two 30" monitors are the bare minimum to run Solitaire. Sure, an Atom-powered $350 laptop runs MS Office just as well, but who would want that?

Gartner said Apple's sales rose, by about the same percentage of growth that IDC reported as a loss?

Second, perhaps they have perfected the PC, or at least extended its life? They last longer than they did at one time. Last year we bought two new Macs; they replaced two Macs that were 6 years old and had held up really well. It was things like less weight and power for Photoshop that really drove the sale. In many situations our iPads and smartphones do great.

@rodaverich I grew up on PCs, starting with MS-DOS as a kid. It can take a lot of effort to learn the ropes. I'll agree with you there. I'll also agree that most of the time, tech support is beyond useless. Whenever I run into a wall, I find the fix by going onto internet forums related to it and getting solutions from other users.

I built my desktop in 2009, upgraded the cards in 2012, and am already 4 years in...

Not doing much gaming with it and it's functioning more as a work and media PC (hooked to 30" 1600p monitor and a projector to a 120" screen).

And I mostly agree with the article here. Frankly, I wouldn't be surprised if I got another 5 or more years out of mine with/without a second video card upgrade.

I'm looking for the next PC game or program that has me going: "jeez, the next desktop I build is going to be this and have this, this, this..." I'm probably going to be looking for a while; I don't think the business case is there for such games and programs.

Google Fiber, online gaming with a four-figure-a-year buy-in price (so all players are typically adults, and game designers can afford to make some design leaps), multi-monitor game-play using SHDTV 3D monitors, and several other things I can think of might lead me to build another desktop over the next decade or so. That's about it, though.