Intel says quarterly profits fell 25 percent as PC industry reels

Intel holds to revenue forecast; analyst says it "scares the hell out of me."

Remember how we told you last week that PC makers are hurting? Not surprisingly, Intel, the world’s largest chip manufacturer, has been hurting too.

On Tuesday the company released its latest earnings report, revealing that its profits for the first quarter of 2013 ($2.045 billion) are down 25 percent compared to the same period last year ($2.738 billion).

"Amidst market softness, Intel performed well in the first quarter and I'm excited about what lies ahead for the company," Paul Otellini, the company’s CEO, said in a statement Tuesday. "We shipped our next-generation PC microprocessors, introduced a new family of products for micro-servers, and will ship our new tablet and smartphone microprocessors this quarter."

The announcement comes one day after Morgan Stanley semiconductor industry analysts argued that “ARM-led commoditization” would likely hit the industry in the near future.

"Intel is in a battle of survival—not only do they need to penetrate massively into the computing spaces currently dominated by the ARM camp, but they also need to keep the ARM camp from burrowing their way upward into the PC space," wrote Stacy Rasgon, a Bernstein Research analyst, in a recent investors’ note. "This will only get harder (and more confusing) as the lines between PCs and tablets further blur."

In its Tuesday press release, Intel said it is reducing its annual 2013 capital spending from $13 billion to $12 billion, plus or minus $500 million, but it held to its forecast of a “single-digit percentage increase” in 2013 revenue.

"That scares the hell out of me. They are holding to the same ultra-bullish forecast they gave before," Rasgon told Reuters. "They are presumably pretty bullish on the new products they are planning."

124 Reader Comments

It's more like Intel is in a battle for their margins. They always could've stomped on ARM's turf, but they never had any desire to sell chips at such cheap prices.

Once they throw in the towel and give up on margins, then there's not much competing to do with them. They have far more engineering and fab capacity than any of their competitors. And there's no special sauce in ARM that makes it more power efficient; it's simply slower, and therefore uses less power. (The x86 front end is a shrinking piece of the die with every process generation; all chips decode to uops anyway.)
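The "slower, therefore less power" point lines up with the standard first-order dynamic-power model for CMOS chips, P ≈ C·V²·f. The operating points below are purely illustrative, not measurements of any real ARM or x86 part:

```python
# First-order CMOS dynamic power model: P ~ C * V^2 * f.
# C = switched capacitance, V = supply voltage, f = clock frequency.
# All operating points below are made up for illustration only.

def dynamic_power(c, voltage, freq):
    """Relative dynamic power (arbitrary units)."""
    return c * voltage**2 * freq

fast = dynamic_power(1.0, 1.2, 3.0)  # desktop-class point: 1.2 V, 3 GHz
slow = dynamic_power(1.0, 0.9, 1.0)  # mobile-class point: 0.9 V, 1 GHz

# Running 3x slower at a lower voltage cuts power by much more than 3x,
# because a lower frequency permits a lower voltage and power scales with V^2.
print(f"power ratio: {fast / slow:.1f}x for 3x the clock speed")
```

Nothing here depends on the instruction set, which is the commenter's point: the efficiency gap is mostly an operating-point choice, not ARM magic.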

Intel is trying to figure out where they can hold a margin advantage, nothing more, nothing less.

If Intel gives up on margins, how are they going to pay for their lead in design and fabrication?

Intel's future success is built on its success now, and if its profitability declines significantly, its current business model won't work.

Man, I hate accountants and whatnot who can look at a $2 BILLION quarterly profit and consider it a failure! You made $2 BILLION, for crying out loud! It's not like you're losing money!

RIM was making money hand over fist and had dominant market share when the iPhone was released - now look at them. It's called a leading indicator.

Also, the stock market rewards growth - not treading water or shrinking. So while yes, they are still making plenty of money now, pivoting from literally decades of ridiculous growth to back-to-back quarters of not just slowed growth but shrinkage is pretty disturbing and significant. And it's probably not an anomaly, but that whole post-PC thing that Steve Jobs was ridiculed for even bringing up. Virtualization can't be helping either, for servers OR desktops, as VDI takes off and thin clients powered by ARM or VIA x86 clones gain popularity for their small form factor and low power consumption. Intel seriously needs to kick Atom into high gear re: power use or they are going to be further marginalized.

EDIT: And seriously, this comment is an editor's pick? I hope it was more in a bid to get page clicks than an endorsement of the validity of the statements - otherwise I'm concerned about the competence of the Ars editor who promoted it...

Right now you have graphics, photo, presentation, video, and documents on iPads. Two years from now they will meet just about everyone's needs, and still cost $300 or less.

And more importantly, run for a full 10 hours (not just nap most of that time, but actually be used for 10 hours) while being smaller, lighter, and cooler. Intel's problem is they focused on desktop and server performance for far too long and completely missed the importance of mobile. That Apple ditched PowerPC for mobile chip performance (power consumption/heat) should have been the wake-up call.

Quote:

Why do you expect PCs to take back market share from tablets?

They don't know. The deniers just can't wrap their heads around the notion that traditional desktop computers (be they Windows, Mac, or anything else) are on the decline. Most have probably never had to go through a platform transition before. I was young, but geeky enough to be paying attention, when the '80s PC users rebelled against the '70s minicomputer proponents, who had rebelled against the '60s mainframes.

Mobile is in. Furthermore, there are still more people on the planet who have never used a PC than who have. Odds are, when they finally do get a primary computing device, it won't be a traditional PC. Nor will it be a tablet - but a phone.

Now don't get me wrong, desktops aren't going anywhere and neither are laptops - but the days of them being the logical or automatic first choice are long gone. Just like mainframes and minis (OK, minis to a much lesser extent), they'll still be around and serving useful roles. But the decades of year-over-year growth are over.

Based on my upgrade habits, I should have replaced my aging laptop last year, but I found I use my iPad enough (it does 95% of what I want when I'm mobile) that there was no reason to upgrade. I look at the MacBook Air or Retina MacBook Pros but haven't yet been able to justify them over what I can already do on my iPad. Heck, I'm typing this out on a nice keyboard on my iPad on a commuter bus right now...

Two years from now, desktops will still have superior performance for the price.

Desktops (and laptops) are ridiculously overpowered for the vast majority of people as it is today - let alone two years from now! Your argument isn't much of one, and if that's the best you have, then this article is probably understating things.

At least now I understand why you don't see the issue - you are completely out of touch with the market...

If Intel gives up on margins, how are they going to pay for their lead in design and fabrication?

Intel's future success is built on its success now, and if its profitability declines significantly, its current business model won't work.

The solution is now obvious: keep prices for PC/laptop chips the same ("to pay for their lead in design and fabrication"), but make ultra-mobile chips at very low margin until ARM is sufficiently reined in, then try to gradually move mobile prices up to get a meaningful profit. Intel hasn't done this yet because they were afraid ultra-mobile chips would cannibalize their PC chips. Of course that happened anyway; Intel's refusal to compete properly in ultra-mobile did nothing but give others a head start. I'm sure at this point Intel has realized their error and will be taking the appropriate measures.

I regularly run as many as 8 applications simultaneously spread across at least 2 displays so that there's sufficient space to actually see the multiple apps as I move between them. Until a tablet can support sufficient processing power to run multiple instances of Word, PowerPoint, Excel, browsers, screen capture utilities and video capture simultaneously, tablets will never replace desktops.

You are a special use case--and it's important to know what the majority of users work with, and desire. Even Microsoft, who authors most of the software you cite, understands there is a shift in the market happening.

Not to mention a lot of the world's population, who is coming into the digital world with little to no experience with desktop computers...

The chip business is extremely high-overhead. The cost to build a fab is extraordinary, and the cost to run one is also fairly insane. Fab companies like Intel have gigantic fixed costs that they cannot escape. If they can run their fabs at or near capacity, they can make legendary profit, because once they cover their overhead costs, their actual material input costs are a tiny fraction of the selling price of a chip. After the overhead's paid, every chip they make is about 95% profit.

But in slow times, this huge overhead bites them in the ass; if their breakeven on running a fab is, say, 75%, then running at 60% will lose vast sums. Chip companies like Intel can go from $2 billion profits to $10 billion losses practically overnight.

The big money in semiconductors is in that last 25% of production. If they're not projected to be running in that range, then the analysts are right to be a little panicky.
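The overhead argument above can be sketched as a toy model. Every figure below is invented to match the comment's shape of the curve (breakeven near 75% utilization); none are Intel's actual numbers:

```python
# Toy fab-economics model: large fixed overhead, tiny marginal cost.
# All figures are invented for illustration, not Intel's real numbers.

def quarterly_profit(utilization,
                     fixed_cost=7.1e9,   # fab overhead per quarter (assumed)
                     capacity=100e6,     # chips/quarter at full utilization
                     price=100.0,        # average selling price per chip
                     unit_cost=5.0):     # marginal cost per chip (~95% margin)
    """Profit in dollars as a function of fab utilization (0.0 to 1.0)."""
    chips = capacity * utilization
    return chips * (price - unit_cost) - fixed_cost

# Breakeven sits near 75% utilization in this sketch; swings above or
# below it move profit by billions, which is the "last 25%" point.
for u in (0.60, 0.75, 0.90):
    print(f"{u:.0%} utilization -> ${quarterly_profit(u) / 1e9:+.2f}B")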

This. A lot of people seem to forget this key point. Some industries (aerospace, optics, semiconductors) have high overhead; others (retail, marketing, software) do not. That overhead is also a significant barrier to entry.

I wouldn't write Intel off yet--they have some of the best people and processes in the industry. The question is how well they tackle the emerging markets...

I regularly run as many as 8 applications simultaneously spread across at least 2 displays so that there's sufficient space to actually see the multiple apps as I move between them. Until a tablet can support sufficient processing power to run multiple instances of Word, PowerPoint, Excel, browsers, screen capture utilities and video capture simultaneously, tablets will never replace desktops.

You are a special use case--and it's important to know what the majority of users work with, and desire. Even Microsoft, who authors most of the software you cite, understands there is a shift in the market happening.

Not to mention a lot of the world's population, who is coming into the digital world with little to no experience with desktop computers...

I'm a special use-case because I'm a content creator that actually needs the power of a PC to do the work I do. The majority of people are happy with a content consumption device, but there is still a demand for content creation. I'm not trying to suggest that there is no place for some content creation on tablets, just that the majority of serious content creation requires more processing and graphics power, and screen real estate than a tablet can provide. Until I finally get my isolinear processor and holographic display working, the PC is not dead.

The first is Windows 8: many people hate it, and that shows up in sales.

The second is the lack of real innovation in the PC world. Intel hasn't made a real innovation in the processor department; it's been a long time since processors have really jumped in power. And even if they did, today's processors already do a lot, and not much more power is needed.

I regularly run as many as 8 applications simultaneously spread across at least 2 displays so that there's sufficient space to actually see the multiple apps as I move between them. Until a tablet can support sufficient processing power to run multiple instances of Word, PowerPoint, Excel, browsers, screen capture utilities and video capture simultaneously, tablets will never replace desktops.

You are a special use case--and it's important to know what the majority of users work with, and desire. Even Microsoft, who authors most of the software you cite, understands there is a shift in the market happening.

Not to mention a lot of the world's population, who is coming into the digital world with little to no experience with desktop computers...

I'm a special use-case because I'm a content creator that actually needs the power of a PC to do the work I do. The majority of people are happy with a content consumption device, but there is still a demand for content creation. I'm not trying to suggest that there is no place for some content creation on tablets, just that the majority of serious content creation requires more processing and graphics power, and screen real estate than a tablet can provide. Until I finally get my isolinear processor and holographic display working, the PC is not dead.

There is going to be an ongoing need for 'serious' computing hardware for people doing pro video editing, CAD/CAE, coding, and the rest but this market is such a small part of the overall PC market that Intel cannot rely on it. If these markets were the only ones buying their processors, the company would fail.

I regularly run as many as 8 applications simultaneously spread across at least 2 displays so that there's sufficient space to actually see the multiple apps as I move between them. Until a tablet can support sufficient processing power to run multiple instances of Word, PowerPoint, Excel, browsers, screen capture utilities and video capture simultaneously, tablets will never replace desktops.

You are a special use case--and it's important to know what the majority of users work with, and desire. Even Microsoft, who authors most of the software you cite, understands there is a shift in the market happening.

Not to mention a lot of the world's population, who is coming into the digital world with little to no experience with desktop computers...

I'm a special use-case because I'm a content creator that actually needs the power of a PC to do the work I do. The majority of people are happy with a content consumption device, but there is still a demand for content creation. I'm not trying to suggest that there is no place for some content creation on tablets, just that the majority of serious content creation requires more processing and graphics power, and screen real estate than a tablet can provide. Until I finally get my isolinear processor and holographic display working, the PC is not dead.

So am I.

At one time, it was considered a given that you needed a high-end workstation (such as a Sun or SGI) to do any serious creative work.

Then later, especially for 3-D, video, and high-end photo work, it was more-or-less a given to have a video card that cost at least as much as the computer using it.

Today, a lot of that can be done on a well-equipped laptop. The primary limiting factors are user preferences (dealing with form factor and input methods), and data storage. High capacity processing can be dumped to a render farm, just as with desktops.

Moore's law is alive and well, and there are creatives who are willing to push the boundaries--sometimes driven by necessity, other times just because...

Sure, it's nice when I can be by my dual monitor setup with my Wacom tablet, fast broadband, and storage archives, but it's sometimes not practical, or even possible. A small number of creative professionals are even using smartphones to create content. With the computing capabilities continually increasing, it is becoming more an issue of preference than necessity on what platform to utilize.

If you go back over the last twenty years and look at the "Whatever Company's" quarterly reports, you'll see that every time earnings came in below [the brain-dead] analysts' predictions, the company referred to poor economic conditions somewhere in the world. And it might even be true. Depends on the company and the quarter.

Of course Intel, having made a $2B profit, knows that the PC isn't dead--which is why it is smiling and rubbing its hands together with relish--the company knows what is coming down the pike. Brain-dead analysts, however, are of course scared s***less by anything that moves. All I can say is that people who think the PC is dead (having a difficult time keeping my face straight here) are people who don't use PCs and are people, therefore, whose opinions on the subject are worthless. Viva la PC!

Intel hasn't done this yet because they were afraid ultra-mobile chips would cannibalize their PC chips. Of course that happened anyway; Intel's refusal to compete properly in ultra-mobile did nothing but give others a head start. I'm sure at this point Intel has realized their error and will be taking the appropriate measures.

I doubt it was an error. If Intel had taken the lead in low-power chips, I think there's a serious chance they'd own the space and their profits would be substantially lower.

If you've got a lucrative market, you don't help it to die. You hold on as long as you can, but no longer.

I think Intel has timed it pretty much exactly right. They'll enter the market as a fearsome competitor, not necessarily as market leader (yet), and they could easily be $10B richer than if they'd led the market and migrated a lot of their customer base to low-price chips.

The PC market is in a slump now. But it's still a market. Once there's a compelling reason for consumers, it'll bounce back readily. Currently, there is no compelling reason. Old computers today do everything consumers need. Most gaming went over to consoles. Most social went over to smartphones. Most businesses just need an old P4 to keep chugging away with Office & XP. As needed, things can upgrade to Win7, b/c Win7 is efficient enough to run on netbooks (I've got an old netbook running it), so you don't need a quad-core beast to run it.

There is no "killer feature/app" that requires newer computers today outside niche markets, like multimedia. And even companies that do major multimedia processing are starting to farm that stuff out to cloud companies that provide the horsepower for rendering as needed.

Intel needs to do what IBM did when it hit the wall: focus on more than PCs right now. Expand out into new markets. IBM is currently on top of its game with R&D and cloud/infrastructure development. Intel could expand their market. They have the cash to take a chance and still survive.

Meanwhile, look at AMD. They were hurting and struggling. The console makers moved in, and AMD cut a sweet deal to power their new consoles. Why didn't Intel get that deal? B/c Intel still has a fat head. Intel and MS are both acting like they're still living in the good old 2000s. Wake up and smell the 2010-2020. Others are starting to eat your lunch.

I don't think the economic environment has anything to do with it, people are still spending money on electronics, it's just that things have shifted. Everyone is realizing their smartphone/iPad is nearly as capable as their PC, and they just don't need to get a new desktop every couple of years.

Intel just needs to make products that fit into the new world.

I have been on a 5-6 year computer upgrade cycle for at least a decade now. CPUs have been fast enough for XP, and then Windows 7, for a long time. My 4-year-old desktop still runs Windows 7 fantastically, and with the SSD upgrade it's faster than ever, which shows it wasn't the CPU that was slow; it was the I/O!

Quite right. I also find myself in a position where I don't need to replace PCs for performance reasons. PCs are open and maintainable. Even an older poor performer can be perked up with very little trouble or expense. Meanwhile, the bulk of the PC still remains suitable despite being 5 years or more old.

You just can't say that about ARM devices. They're where PCs were in the 80s and early 90s.

My HTPCs have outlived multiple generations of streamer appliances and those streamer appliances still haven't managed to catch up to the PCs in terms of capabilities.

Two years from now, desktops will still have superior performance for the price.

Desktops (and laptops) are ridiculously overpowered for the vast majority of people as it is today - let alone two years from now! Your argument isn't much of one, and if that's the best you have, then this article is probably understating things.

At least now I understand why you don't see the issue - you are completely out of touch with the market...

An obvious counterexample is Plex.

Plex is something you run on a real PC so that mobile devices can access whatever media you happen to have lying around. Otherwise, you need content that has been specifically created to cater to the limitations of ARM hardware.

A weak device with special purpose silicon will limit your options sooner or later.

Garbage. Until you can run multiple apps for things like video compiling, presentation development, graphics capture and documentation all at once on your media consumption device, I'll keep my desktop, thanks.

Welcome to about 2 years from now.

Also why are opinions being expressed so aggressively? Doesn't anyone get socialized anymore?

I regularly run as many as 8 applications simultaneously spread across at least 2 displays so that there's sufficient space to actually see the multiple apps as I move between them. Until a tablet can support sufficient processing power to run multiple instances of Word, PowerPoint, Excel, browsers, screen capture utilities and video capture simultaneously, tablets will never replace desktops.

You can't speak for the rest of humanity that only uses 1 application fullscreen at any one time.

For those people, tablets can replace desktops.

I never claimed to speak for the rest of humanity. I do know quite a large number of content creators, though, and all of them use multiple applications simultaneously. I also claimed that the desktop is not dead because content creators who run multiple instances of applications can't make do with a solution that doesn't support that (hence, desktops aren't dead).

People see the shiny shiny and don't pay any attention to the industrial engineering of the situation. Tablets are toys. They are not tools. People that still need to use tools will continue to buy them. They will need a tool interface even if a tablet can deliver the necessary horsepower.

Man, I hate accountants and whatnot who can look at a $2 BILLION quarterly profit and consider it a failure! You made $2 BILLION, for crying out loud! It's not like you're losing money!

Hey man, that's not what matters here! $2 billion is already done and gone. What matters for money-fiddlers is how much Intel will earn in the NEXT FEW YEARS.

And the trend here is pretty scary. Actually, it's a revolution awaiting Intel. The old Wintel duopoly is dying hard and fast. Though they will likely survive, it certainly won't be with their current "cash cow" label.

It's more like Intel is in a battle for their margins. They always could've stomped on ARM's turf, but they never had any desire to sell chips at such cheap prices.

If Intel gives up on margins, how are they going to pay for their lead in design and fabrication?

Intel's future success is built on its success now, and if its profitability declines significantly, its current business model won't work.

By taking everyone else's market share away. Doesn't even require much extra engineering effort to reuse their current processes for cheap chips, and future processes for the expensive ones. And maybe only having a 20x profit per chip instead of 100x. And if stuff really tanks, farm out fab capability to QC or anyone else who needs it.

They're still the 800lb gorilla and can muscle the industry wherever they like. Give vendors an x86-capable chip at below ARM prices, and see how quickly there'd be a mass migration. Of course, they're not going to do that all at once, rather they will slowly move towards ARM to preserve the top-end market as long as possible.

Of course, once Intel decides to go whole-hog against ARM, that's really the end of AMD. They can only survive because Intel likes their high margins.

People see the shiny shiny and don't pay any attention to the industrial engineering of the situation. Tablets are toys. They are not tools. People that still need to use tools will continue to buy them. They will need a tool interface even if a tablet can deliver the necessary horsepower.

The terminal form factor isn't going anywhere.

People are still, in 2013, trotting out this tired and ridiculous analogy of "toys" and "tools?" Really? Back in my late teens or early twenties I would have thought it unbelievable. Now, in my jaded 40s, I guess it's par for the course.

Let's extend your analogy a bit.

We're at a coffee shop. Sitting at a table is Mary, who is developing a new iPad app in Xcode. She's naturally doing this work on her rMBP.

At the table next to her is John. He is watching an episode of "The Newsroom" on his iPad. He is laughing often, and occasionally admiring Olivia Munn.

A few tables over, Sam is getting too many Twitter notifications on his iPhone to be able to pay attention to finish reading the email he got from his financial advisor.

Finally, in the back, sitting alone with some kick-ass headphones, Elisabeth is researching a term paper on the history of comic book art. Using her iPad, she is studying various artists throughout comic book history and saving various images as she goes. She occasionally makes notes in, well, Notes, to remind her of things.

None of these are "toys" or "tools." Each of these people is doing what he or she needs or wants to do, on the device he or she wants or needs to work on.

It's more like Intel is in a battle for their margins. They always could've stomped on ARM's turf, but they never had any desire to sell chips at such cheap prices.

If Intel gives up on margins, how are they going to pay for their lead in design and fabrication?

Intel's future success is built on its success now, and if its profitability declines significantly, its current business model won't work.

By taking everyone else's market share away. Doesn't even require much extra engineering effort to reuse their current processes for cheap chips, and future processes for the expensive ones.

That's silly. Their current process for cheap low-power chips requires engineering effort either to transition to a new process (32nm to 22nm) or to redesign the chips to be more powerful (from in-order to out-of-order), and both are supposed to happen by 2014.

Quote:

And maybe only having a 20x profit per chip instead of 100x. And if stuff really tanks, farm out fab capability to QC or anyone else who needs it.

And so the original problem remains: how are they going to pay for their lead in design and fabrication with a $10 part when they are used to a $30 part (Atom) or $100 part (Core) or $1000 part (Xeon)?

Quote:

They're still the 800lb gorilla and can muscle the industry wherever they like. Give vendors an x86-capable chip at below ARM prices, and see how quickly there'd be a mass migration. Of course, they're not going to do that all at once, rather they will slowly move towards ARM to preserve the top-end market as long as possible.

Of course, once Intel decides to go whole-hog against ARM, that's really the end of AMD. They can only survive because Intel likes their high margins.

That's like saying "Once Apple decides to go whole-hog against Android, that's really the end of Android. They can only survive because Apple likes their high margins."

Intel could become the next AMD... but then they wouldn't be able to afford to be Intel.

It's more like Intel is in a battle for their margins. They always could've stomped on ARM's turf, but they never had any desire to sell chips at such cheap prices.

If Intel gives up on margins, how are they going to pay for their lead in design and fabrication?

Intel's future success is built on its success now, and if its profitability declines significantly, its current business model won't work.

By taking everyone else's market share away. Doesn't even require much extra engineering effort to reuse their current processes for cheap chips, and future processes for the expensive ones.

But there won't be any future processes without those high margins.

Intel's entire business model is based on preserving very high margins, which then fund the best design and process engineering in the market. If they followed your logic, they would own the DRAM market, but strangely enough, they abandoned that years ago because it doesn't pay the bills, even though they clearly have the know-how to compete.

That's silly. Their current process for cheap low-power chips requires engineering effort either to transition to a new process (32nm to 22nm) or to redesign the chips to be more powerful (from in-order to out-of-order), and both are supposed to happen by 2014.

They're not in a huge hurry. And they aren't broke.

Quote:

And so the original problem remains: how are they going to pay for their lead in design and fabrication with a $10 part when they are used to a $30 part (Atom) or $100 part (Core) or $1000 part (Xeon)?

I dunno, maybe by using their piles of cash... or leveraging assets if shit really hits the fan, or a million other ways. Intel has tangible assets that would allow them to fund whatever they'd need to for a long time. Hell, AMD hasn't died yet, and they had significantly fewer assets to leverage. And no cash.

That's like saying "Once Apple decides to go whole-hog against Android, that's really the end of Android. They can only survive because Apple likes their high margins."

Intel could become the next AMD... but then they wouldn't be able to afford to be Intel.

Apple doesn't compete on price. If there was a cheap Intel x86 SoC, every hardware manufacturer out there would put it in something. Every last one of them. But Intel severely restricts the devices that you are allowed to put Atom in.

Intel has the money and resources to have a *very* long price war. But that's the last thing they want. So they'll drag out their transition to cheap stuff as long as possible. Really, the problem wouldn't be running out of money, it'd be anti-trust issues. No ARM vendor has that sort of power in the market that they have to worry about pricing stuff too cheap.

Apple doesn't own much other than their software. And they are already getting crushed in the market by the cheap Android stuff. They could give away iPhones and I wouldn't get one. (Same for Samsung phones too, though.)

That's silly. Their current process for cheap low-power chips requires engineering effort either to transition to a new process (32nm to 22nm) or to redesign the chips to be more powerful (from in-order to out-of-order), and both are supposed to happen by 2014.

And so the original problem remains: how are they going to pay for their lead in design and fabrication with a $10 part when they are used to a $30 part (Atom) or $100 part (Core) or $1000 part (Xeon)?

Quote:

I dunno, maybe by using their piles of cash... or leveraging assets if shit really hits the fan, or a million other ways. Intel has tangible assets that would allow them to fund whatever they'd need to for a long time. Hell, AMD hasn't died yet, and they had significantly fewer assets to leverage. And no cash.

That's like saying "Once Apple decides to go whole-hog against Android, that's really the end of Android. They can only survive because Apple likes their high margins."

Intel could become the next AMD... but then they wouldn't be able to afford to be Intel.

Apple doesn't compete on price.

Neither does Intel.

Quote:

If there was a cheap Intel x86 SoC, every hardware manufacturer out there would put it in something.

If there was a cheap iPhone, every person would buy one.

Quote:

Every last one of them. But Intel severely restricts the devices that you are allowed to put Atom in.

But Apple restricts the devices they make to only specific price points and margins.

Quote:

Intel has the money and resources to have a *very* long price war. But that's the last thing they want. So they'll drag out their transition to cheap stuff as long as possible. Really, the problem wouldn't be running out of money, it'd be anti-trust issues. No ARM vendor has that sort of power in the market that they have to worry about pricing stuff too cheap.

Apple has the money and resources to have a *very* long price war. But that's the last thing they want. So they'll drag out their transition to cheap stuff as long as possible. Really, the problem wouldn't be running out of money, it'd be anti-trust issues. No Android vendor has that sort of power in the market that they have to worry about pricing stuff too cheap.

Quote:

Apple doesn't own much other than their software. And they are already getting crushed in the market by the cheap Android stuff. They could give away iPhones and I wouldn't get one. (Same for Samsung phones too, though.)

Intel doesn't own much other than their hardware. And they are already getting crushed in the market by the cheap ARM stuff. They could give away Atoms and I wouldn't get one. (Same for AMD SoCs too, though.)

Why is it you can't see the equivalence here?

Intel has high margins, just like Apple. Intel dominates in profits, just like Apple. Intel is dwarfed in market share by the low cost competition, just like Apple.

Yet why do you say that Intel, unlike Apple, can fight and win?

Apple's 45nm A5X was about 163 mm², yet it probably only cost them $15 (meaning Samsung made a couple dollars' profit on it). The 45nm Atom was about 25 mm² but cost $30; in terms of equivalent area and potential GPU and memory performance, that works out to about $180 if you assume that quadrupling the GPU and tripling the memory bus would add another 125 mm² of die area.

And you say that Intel can go from $10 per chip margin to $2 (and remember, instead of 6 Atoms at $30, it is now 1 Atom at $20) to compete against ARM? You're talking about wholesale destruction of Intel's profits, akin to saying Apple could sell the iPhone 5 for $210, the iPhone 4S for $180, and the iPhone 4 for $150, and dominate the smartphone market, or the iPad mini for $200, iPad for $250, and iPod touch for $150.
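A quick back-of-the-envelope sketch of the cost-per-area arithmetic above (all figures are this comment's estimates, not official numbers):

```python
# Rough die-cost arithmetic, using the estimates from the comment above.

a5x_area_mm2 = 163      # estimated Apple A5X die area at 45nm
a5x_cost = 15           # estimated price Apple pays Samsung per chip

atom_area_mm2 = 25      # estimated 45nm Atom die area
atom_price = 30         # estimated Intel selling price per chip

# Price per square millimetre of silicon
a5x_per_mm2 = a5x_cost / a5x_area_mm2       # roughly $0.09/mm2
atom_per_mm2 = atom_price / atom_area_mm2   # $1.20/mm2

# Scale the Atom up to A5X-class GPU/memory: adds an assumed ~125 mm2
scaled_area = atom_area_mm2 + 125           # 150 mm2
scaled_price = atom_per_mm2 * scaled_area   # $180 at Intel's current margins

print(round(a5x_per_mm2, 2), round(atom_per_mm2, 2), round(scaled_price))
```

At Intel's margins, an Atom scaled up to A5X-class area would sell for roughly $180, which is where the comment's $180 figure comes from.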

It's a good thing. My PC is a 3-year-old Dell, quad core, 4GB RAM, running Visual Studio, Office, and games on Windows 8 just as smoothly as, if not better than, back in 2010. No need to buy a new one for at least a couple more years. At work, most machines still run XP. At the expense of the outrageous profits of PC makers, we are getting some much-needed longevity.

Designed to fit in an ITX case, though the hardware is not W8-compatible (MS requires such idiosyncrasies as PCI(E) device discovery).

You can get something very close to what you're thinking about in the x86 world, now that both AMD and Intel are putting GPUs on the same silicon as the CPU. I assembled such a build a year ago using AMD and an ITX board.

Apple requires iOS. Intel sells hardware, so their stuff can run Windows, Linux, Android, Chrome OS, etc. See a difference? Much as Android outsells iOS, Intel could get tons of OEM support, because OEMs are cheap as hell (I deal with this daily) and would buy an Intel part if it was $.00001 cheaper than some other part. Samsung doesn't even use its own chips for the vast majority of Galaxy phones they're going to sell (though that may change once they have their own LTE on-chip for every market).

Apple isn't really a fair comparison, because the iPhone's high margins are shored up by the phone companies. Once the phone companies can shake loose (and they'd love to) from having to pay a premium to carry the iPhone, that will directly affect how much they can sell a phone for. I'd love to see all subsidies removed (or required to be consistent for all phones), as that would make the phone market a lot more competitive. The number of people who'd spend $700 every two years for a new phone is probably about the same as the number of people who'd spend $700 for a new laptop every two years.

Maybe you mean there is a difference because Intel sells mostly to OEMs, who in turn sell products to end users, and Apple sells mostly to... carriers... who sell to end users... no, still not seeing a difference.

The biggest difference is that Intel has the burden of actually manufacturing its own goods for sale, while Apple outsources just about all component manufacturing and assembly to outside companies...

Which reinforces my point. Intel cannot afford to lose their large profits since the large profits are what allow them to manufacture ridiculously powerful and complex CPUs that scant few can match.

It doesn't matter how Apple's high margins are shored up, that's not the point of the comparison. The point of the comparison is that both are high margin companies, that could compete with low margin companies by lowering their margin.

The difference being that Intel would be unable to afford bleeding-edge manufacturing capability, while Apple would just sell more things at lower prices. Apple's daily operations aren't dependent upon enormous amounts of capital (Cook has said as much!).

People don't buy a new stove or washer or fridge every year either; they last a decade or more. But I don't see companies giving up on home appliances just because they don't fly off the store shelves like iPads.

US firm Apple has confirmed its long-awaited venture into the kitchen appliance industry, unveiling the iToaster at the next Macworld Expo. As well as browning bread, users will be able to download music and videos onto the toaster's internal hard drive, demonstrated by Apple boss Steve Jobs at the annual event in San Francisco. Mr Jobs hailed the toaster's design, telling the large assembly of shareholders that "this new magical device" would "force people to rethink their whole approach to toasting bread." Music and toast fans will have to shell out between $899 and $999 to own one of the first batch of iToasters, due on the shelves in the US in June and Europe later this year. Apple say the product will reach Africa sometime close to 2014.

Intel hasn't done this yet because they were afraid ultra-mobile chips would cannibalize their PC chips. Of course that happened anyway- Intel's refusal to compete properly in ultra-mobile did nothing but give others a head start. I'm sure at this point Intel has realized their error and will be taking the appropriate measures.

I doubt it was an error. If Intel had taken the lead in low-power chips, I think there's a serious chance that they'd own the space and their profits would be substantially lower.

If you've got a lucrative market, you don't help it to die. You hold on as long as you can, but no longer.

I think Intel has timed it pretty much exactly right. They'll enter the market as a fearsome competitor, not necessarily as market leader (yet), and they could easily be $10B richer than if they'd led the market and migrated a lot of their customer base to low-price chips.

I'd believe you if Intel was ready with products in the wings to take over mobile right now. They aren't, so they clearly got caught with their pants around their ankles. They will take over mobile, but now it's going to be a long, costly war with well funded competitors.

I think they were, but they got sideswiped by their old partner in crime, Microsoft.

You can see this in the ups and downs of Moblin/Meego/Tizen.

Intel started Moblin because, when they started pushing a SoC-style Atom, they dropped the PCI bus to save on power. But MS didn't want anything to do with a chip that didn't do PCI hardware lookup. MS is even requiring PCI hardware lookup on Windows RT devices at present, making them oddballs in the ARM world.

And so Intel had to look for an alternative OS for their Atom SoC, and turned to Linux. That ended up in development hell. Then along comes Android, and now there is one phone on the market and another on its way that is running on Atom.

So all in all, Intel has had the hardware available for quite some time, but they lacked an OS to run on it.

Quote:

Designed to fit in an ITX case, though the hardware is not W8-compatible (MS requires such idiosyncrasies as PCI(E) device discovery).

If that's the case, how did they manage to get Windows 8 running on a Raspberry Pi?