I'd guess there's at least a few billion dollars' difference between a reasonably up-to-date fab (and the people and infrastructure it requires) and what it takes to cast and CNC chunks of metal (unless it's something like sub propellers). If Apple were throwing around that kind of cash, it wouldn't be a secret.

Not necessarily. These days, when you make a CPU for some type of embedded application (like a mobile device), you don't design an entirely new CPU core. You just take an off-the-shelf core, like an ARM or PPC, and then create a SoC (system on chip) using that. The core stays the same, whether it's an ARM9 or ARM11 or PPC e300 or PPC e500, but all the other stuff on the chip is different, and specific to its application. If it's for something like an iPhone, it probably has a logic block that controls t

It was actually revealed that the real motivation behind the Apple team's efforts is to build an uber-sophisticated intelligent computer system capable of downloading Steve Jobs' brain in case he becomes too ill to continue his role as RDF overlor...er...CEO.

The WSJ story talked about how Apple had designed a variant already, but were unhappy that so much design was being sold to other companies. It looks like they want to design their own extension of the ARM and gain a real competitive advantage. Certain aspects include better power consumption, network interface, handwriting recognition, and more horsepower. There is some speculation that it will also bleed over to the desktop design. Maybe they are getting tired of using commodity hardware and want to differentiate themselves from Dell.

No, they dropped PowerPC because IBM couldn't keep up with producing faster chips and lower power envelopes (for laptops). Remember, they were never able to stuff a G5 into a Powerbook. I doubt it had anything to do with whether the hardware was "commodity" or not.

SJ stood on a stage and promised a 3GHz G5 within one year (because IBM had promised it to Apple), and IBM let him down. That, together with IBM's decision not to develop a low-power G5 suitable for laptops, is what closed the book on Apple's PPC machines.

They switched to Intel instead of AMD because they had had quite enough of vendor disappointments. AMD was a far riskier prospect.

They switched to Intel instead of AMD because they had had quite enough of vendor disappointments. AMD was a far riskier prospect.

AMD was not a riskier prospect; AMD was a backup plan while Intel was the lowest bidder. Or so I would speculate. Remember, this was a time when it had become clear that AMD could be taken seriously. A friend of mine has an Athlon 64 laptop from back when that was a new idea; it's still one of her best machines. (I helped her pick it out, at Best Buy... on clearance. But I wouldn't expect many savvy customers there anyway. IIRC it was around $900 with 1GB memory and maybe 60 or 80GB disk and I think the po

AMD was not a riskier prospect; AMD was a backup plan while Intel was the lowest bidder. Or so I would speculate.

Your speculation is wrong. AMD had arguably better parts; Apple didn't choose them because they weren't confident in AMD's ability to maintain their performance edge over Intel, or to deliver sufficient quantities to meet Apple's schedules. They weren't about to go through all the pain of switching just to land right back in the same situation they'd had with IBM.

Wrong on both counts. Intel had been offering money to Apple for years to make the switch, that wasn't a deciding factor. As for x86 familiarity, Apple had no shortage of developers on the platform before the switch.

Apple dropped IBM's POWER series for Intel marketing dollars and developers familiar with x86. The POWER series is faster than the Intel or AMD chips, and has been for years. IBM released a 4.7GHz dual-core chip more than 2 years ago...

"... "PA Semi is going to do system-on-chips for iPhones and iPods," Apple CEO Steve Jobs said, according to The New York Times during Apple's June 2008 Worldwide Developers Conference...."

From the horse's mouth, 9 months ago, announced publicly at WWDC. I think I would be going with "... will manufacture its own chips ...", since that's what they said they would be doing, right out loud in front of God and everybody.

These days most chip companies don't really make all their own chips anyway. They do the design and get foundries like TSMC to handle the actual fabrication. Companies like TI, Intel, AMD, and IBM make some chips in-house but also farm out work to chip foundries.

The objective is likely to be more proprietary enhancements [engadget.com] to their product lines that require licensing and royalties from secondary vendors who wish to manufacture and sell peripherals and products that work with Apple products. It's all about building monopolies; U.S. businesses believe competition is a bad thing.

The auto industry shows why a lack of competition and high barriers to entry are bad. When presented with serious competition, like the Tucker, the US automakers resorted to bribing politicians and judges to shut down their smaller competitors. When they couldn't stay solvent themselves, they got sweet government loans and bailouts (not just recently, but in the 80s too) that other companies didn't have access to. Despite all this history, our stupid government continues to bail out these fumbling, faili

So the difference is that neither one of them really innovates? I don't see the difference. Sure, Apple is good at repackaging things to be pretty and easy-to-use, but that doesn't matter when it comes to chips. In this case, they will *have* to innovate to turn their investments into something useful.

I think this move has more to do with Apple's obsession with controlling everything - they'd like to be a vertical company. It's a risky move, because hardware is a costly industry to enter. Will their recent purchases be worth it? Very possibly, it's an interesting gamble.

It's a very entrepreneurial idea -- quit all the talking and hand-waving and actually ship something! There's not much value in developing great ideas that never get out of the lab. As for the claim that neither innovates? Hogwash. Taking an idea and integrating it into a viable product IS innovation by definition -- it is something that has not been done before that point. Both MS and Apple innovate, to different degrees, which we can squabble about ad infinitum. :) I would say MSFT is far better at marketing their ideas and capturing market share, while Apple is better at inventing. Others will have a different view.

But back to the original subject, I suspect Apple's desire for custom chips comes not from a desire to save power (there are already many viable low-power CPUs and chipsets available) but rather a desire to fight off Hackintosh clones (OS X running on non-Apple hardware, such as the Dell Mini 9 or generic desktop PCs). Technologically, there's no reason this can't happen, but one must consider that Apple's hardware sales are quite profitable and that share is worth protecting.

I think also, for their iPods/iPhones, Apple probably wants more customization than they have right now. They have to accept whatever chip they're buying, balancing processing power, power consumption, and functionality. Incidentally, this may have been driven by the iPhone. While the iPod is fine with an underpowered chip, as its functionality is limited, the iPod touch/iPhone require more computing power. There are rumors that Apple was not happy with the original chip in the iPhone. The problem is the chip was exactly what they specced out. Apple may have lost the chip expertise they had with the original Macs.

But back to the original subject, I suspect Apple's desire for custom chips comes not from a desire to save power (there are already many viable low-power CPUs and chipsets available) but rather a desire to fight off Hackintosh clones (OS X running on non-Apple hardware, such as the Dell Mini 9 or generic desktop PCs).

This doesn't make sense to me for their desktop machines. If that's their plan, they have to either 1) have all of the standard chipset for a commodity PC plus their proprietary chips and duplicate functionality, or 2) lose Windows compatibility because some subset of the operations are being pushed onto proprietary chips, and Windows doesn't speak their instruction set. Option 1 would increase the complexity on their motherboards and increase power usage, and I'd bet pretty strongly that they're not goi

I doubt that the Hackintoshes are a serious concern to Apple. The fact that they exist demonstrates that a certain market exists, but not one that's likely too profitable right now. It's something relatively few people are willing to do, or even have the knowledge (or at least the patience) to pull off. Not much of a problem for the Slashdot crowd, but certainly one for the general public. More importantly, in order to fight that off, Apple would have to transition back off of the x86 architecture - not a feat o

I would say MSFT is far better at marketing their ideas and capturing market share, while Apple is better at inventing.

Considering I've never met anyone who thought the MSN butterfly commercials were anything besides ghastly, I'd have to say that Apple is far better at marketing. Unfortunately, despite their clever ads, Apple hasn't been able to capture much market share from MSFT, simply because of Microsoft's monopoly power (and probably other factors, like the lower initial cost of a Dell/HP PC vs. an Apple one).

I think this move has more to do with Apple's obsession with controlling everything - they'd like to be a vertical company. It's a risky move, because hardware is a costly industry to enter. Will their recent purchases be worth it? Very possibly, it's an interesting gamble.

This worked really well for Commodore - and they kinda died over it. Up until their bankruptcy almost all Commodore machines used custom chips designed and manufactured by them.

I think the OP's point has been completely lost. Apple, Dell, and HP design/sell "real hardware", and Microsoft designs/sells peripherals.

I believe the original point was that Microsoft has never attempted any serious hardware development; so comparing Microsoft's supposed failure to design "simple hardware" to Apple's attempt to design "real hardware" is stupid.

Generally the hardware is designed well by every company; it's the software where things fall down. I have several Apple and Microsoft keyboards and mice.

Of my peripherals that are at least 2 years old and should still be supported:

1x USB MS mouse = support officially discontinued (3 out of 5 buttons work with the default driver).

1x USB Apple mouse = supported (but it only has 1 button)

2x USB Apple keyboards = supported (but new Macs/PCs no longer support using the keyboard's power button to power on a machine that's turned off)

All in all, a pretty pathetic amount of support. Microsoft drops support for their own USB mice (you can still find 3rd-party drivers to enable all 5 buttons). Apple didn't officially drop support, but no longer provides the needed circuitry on their motherboards to power up a computer via a USB keyboard's power button (I'm wondering if this is so they use less power when turned off).

And we all know how well the 360 ended up turning out. Let's see: drives that scratched disks, red rings of death, etc. Sure, they have fixed most of their problems now, but at the start of the 360's lifetime it was a total mess. On the other hand, the PS3 and Wii consoles had little to no issues (about the only one I can think of is that some Wii units could have a dirty optical lens, because of smoke, dust, etc., that made it hard to read some dual-layer disks, but that is mostly all fixed now).

Companies like Foxconn and ASUS build Apple's hardware to Apple's specifications, as they do for Dell and just like they used to do for Packard Bell.

One more time and we'll get it right:
Companies like Foxconn and ASUS build Apple's hardware to Apple's specifications. Companies like Foxconn and ASUS also build hardware for Dell and Packard Bell, most of which was (reference) designed by engineers at Intel. Hell, for the longest time, Dell's boards had "Intel" silkscreened on them, since they were literally carbon copies of Intel's reference board and nobody bothered to change it.

There's really not as much to it as you think. They're just layers of etched copper in a fiberglass substrate. Intel makes a lot of reference board designs with the complete layout of the traces and layers in the copper and fiberglass, and Dell is notorious for just taking those and putting their names on it. Formerly they didn't even bother to go that far, but now they actually do change the silkscreens (and sometimes the shape a bit to make them fit in their case, which really isn't hard for even a first

Apple design their hardware, inside and out, down to the motherboard and chip placement. They outsource the manufacturing, but the actual system builder isn't just sticking an Apple badge on a generic piece of hardware.

Nope, not even close. Apple *designs*, or works with manufacturers to create, custom-*designed* boards and hardware, but they build nothing. They use the same chips and chipsets as Dell, which actually does the same thing and custom-*designs* its gear just like Apple.

I don't know for sure, but just because they own a chip "manufacturer" doesn't mean they own a fab. I'd be willing to bet they don't. There are lots of semiconductor companies that don't have fabs at all; they're called "fabless". P.A. Semi was probably one of them. Here are some others you may have heard of: Qualcomm, Broadcom, NVIDIA, Marvell, MediaTek, ATI (before AMD acquired them), Xilinx. Here's an article [wikipedia.org] about them. These companies simply design chips; they get other companies, called "foundries", to make their chips for them. The largest and oldest of these is TSMC, a Taiwanese company.

Nope, not even close. Apple *designs*, or works with manufacturers to create, custom-*designed* boards and hardware, but they build nothing. They use the same chips and chipsets as Dell, which actually does the same thing and custom-*designs* its gear just like Apple.

So in the same way, all cars and buildings are exactly alike because they all use exactly the same materials, the design is just a bit different. Right?

Yeah, the CPU and chipset are the same. They even use copper and solder to make their boards, oh my! But there are millions of ways to lay out the same components, and they do yield different results. Take cities: they all have similar needs (hospitals, police stations... just play Sim City), but how you lay them out can be the entire difference between

Just think how much [Apple] could power reduce and cost reduce if they dictated the chip-specs!

(I am an electronics engineer, and make chips that Apple buys for many of their products)

They already do, to an extent.

Chip design companies are constantly battling it out to get design wins at big companies like Apple. If Apple tells them, "Hey, we want this chip to do X and Y while consuming Z mA," then those companies are going to try their best to meet those requirements so that they can get Apple as a customer.

Your assertion that chips are being overcomplicated for the purpose of driving up cost is incorrect. Semiconductor companies are constantly trying to simplify their chips' designs, in order to improve yield and reduce costs, while charging the same price to their customers. It's much, much easier to improve margin than it is to convince your customers to pay more.

I doubt that Apple will be able to substantially improve cost or power consumption. While they do have some experience in chip design, it's highly unlikely that they'll be able to go in and do a better job than all of the companies that do nothing else.

Apple is not going to waste money developing their own chips just for bragging rights.

That's right, they won't do it just for bragging rights. They'll do it for a compelling performance, power consumption, and/or cost advantage. Right now, they pay Intel, Nvidia, and AMD a hell of a lot of money for CPUs and GPUs, and I'm sure they'll do their homework before making the next build or buy decision.

The question is, would Apple seriously get into developing their own CPUs? I find that hard to believe, if only because- in terms of worldwide market share- the Mac is still probably quite small compared to Intel and even AMD based PCs.

Given how competitive the CPU market is and how hard AMD have to work to even compete with Intel, I find it hard to believe that Apple could afford to develop a chip that was competitive in terms of price *and* performance with either Intel or AMD solely for their own use.

The question is, would Apple seriously get into developing their own CPUs?

That would depend on what advantages they thought they could gain from it. They can certainly afford to do it; they've got about $37 billion in cash on hand and most of the talent they'd need for such a project, and they could easily recruit anyone else they might need. Building a whole new architecture isn't an opportunity that comes along that often for a hardware designer these days.

Given how competitive the CPU market is and how hard AMD have to work to even compete with Intel

That's the market for commodity parts. It doesn't apply to vertically integrated companies like IBM with their POWER CPUs or Sun with the SPARC. The question for Apple isn't whether a new CPU would fly with other users, it would be whether it works for Apple's needs. Outside OEM sales would be gravy.

Another thing to keep in mind here is that Apple's very big on recruiting the top talent in any area they go into, and you don't get the best chip designers by offering them run-of-the-mill projects to work on.

If they build everything in house, they don't have to deal with 'leaks' from other companies

That would be a major factor, but not as big as whatever advantages they could realize from having parts made entirely to their specs. Another major factor would be that if their products are built around their own parts, cloning becomes infeasible.

Apple participated in the design of the PowerPC. That worked out pretty well. I've had two people tell me within the past week that they went back and used a PowerPC Mac mini (both upgraded to 1GB of RAM) and remarked how zippy it was under Leopard. They were surprised, since the systems were something like 5 years old and max out at 1GB of RAM.

Apple also participated in the design of the initial ARM processors. That seems to be going pretty well. (Direct descendants of the design are in iPhone).

Apple is also a participant in LLVM, which is going to help Apple shorten the design-to-deployment cycle for new silicon.

Also, Apple has had several internal chip-building projects in the past. The story has it that Steve Jobs once met Seymour Cray and told him that he was using a Cray-1 to design his next computer. Seymour replied, "That's great, Steve. I'm using an Apple to design my next computer."

If you had said the Nokia 1100, I might not have said you were insane.

But the Jitterbug? Seriously? For starters, it has a color screen - something that the Slashdot favorite "basic phone" doesn't have at all. And, second, it costs more than some modern smartphones. Yes, I know, it's not on contract... but the prepaid prices are ridiculous, so if you actually use this "basic phone," you're paying OUT THE ASS.

Apple knows how they intend to use the chip, but that is not the same as knowing how the chip gets it done. Participating by writing various specifications and testing is a very long way from designing logic circuitry.

That's not to say they can't do it, or won't be good at it. But you're making a leap.

When Apple moves into a new area, they go and hire the people they need to do it right. They knew nothing about retailing, so they hired Ron Johnson. When they decided to make the iPod, they hired Tony Fadell, who had a lot of experience in portable devices.

Apple now employs Dan Dobberpuhl, who was the lead architect of the DEC Alpha and the StrongARM. He was the founder of PA Semi. One of their more recent hires was a GPU designer at ATI and AMD, who also happens to have worked on the Pixar Image Computer back before Pixar became a movie studio.

The way I read the writing on the wall is Apple's going to start making their own CPUs, and possibly their own GPUs as well. Whatever they come up with, I expect it to fit in very well with the work they're doing on LLVM and their software OpenGL implementation.

Apple also participated in the design of the initial ARM processors. That seems to be going pretty well. (Direct descendants of the design are in iPhone).

Nitpick: Acorn alone, not Apple, did the design of the initial ARM1, ARM2, and ARM3. They then spun the ARM CPU (which originally stood for Acorn RISC Machine) off into another company, Advanced RISC Machines, which was a joint venture between themselves (40%), VLSI (who did most manufacturing of ARM CPUs and chipsets at that point; 40%), and Apple (20%), as Apple had expressed interest in using the chip but didn't want to use a competitor's chip (Acorn directly competed with Apple in the personal computer market, especially in schools).

Only the ARM6 (there was no ARM4 or ARM5) and newer had any Apple involvement, and I doubt anything newer than the DEC StrongARM had much of any Apple influence. (The ARM6, ARM7, and StrongARM were all used in the Newton.)

And, the ARM6 and ARM7 are essentially tweaked versions of the ARM3 with 32-bit addressing (as opposed to 26-bit on the previous ARMs,) and more cache and a slightly faster clock in the case of the ARM7. As for the StrongARM, it wasn't even designed by ARM, it was designed by Digital, to meet the ARMv4 ISA.

They'll just hire some Taiwanese design team to take an ARM core and hang some extra bits on it, like functions for MP3 decoding, then get TSMC or some other Taiwanese fab to produce it.
AFAIK they didn't even design the iPod tech themselves; they just decided on the look of the thing and contracted all the rest out.

Apple is a hardware company that also makes software. Microsoft is a software company that also makes hardware. The MS hardware I can think of is their keyboards and mice, the Zune, and the Xbox 360. Considering that the entertainment division of Microsoft that builds the Zune and Xbox lost 31 million dollars [microsoft-watch.com] last quarter, I wouldn't hold Microsoft up as the paragon of what is possible to do in hardware.

Well, it worked when they did the ASC ("Apple Sound Chip"), the IWM ("Integrated Woz Machine"), the SWIM ("Super Integrated Woz Machine"), and some others. I don't know what sorts of custom Apple chips are used in the newfangled Intel Macs, though.

The problem with Cell is that in a general purpose computer, threads need to talk to each other; you can't just have non-barriered pipelines you keep fed. That's OK for some types of specialized processing, but most of that type of processing is going onto general purpose GPUs these days (e.g. OpenCL), rather than building specialized hardware for it.

I personally would have liked to work on a port of xnu to the Cell architecture, since I think it would have yielded a lot of useful information about the OS

Mr. Jobs rejected the idea, telling Mr. Kutaragi that he was disappointed with the Cell design, which he believes will be even less effective than the PowerPC.

And this was way back in 2005. The Cell is arguably good in its role as the CPU for a game console / Blu-ray playback device, but that doesn't mean it is the best choice for a general-purpose computing device.

Apple's domestic market share has doubled in the three years since moving to Intel. They moved to Intel just as Intel was introducing a great leap in technology. Going with Intel also allowed for virtualizing MS Windows or dual booting, eliminating most of the risk for switchers.
They bought PA Semi specifically for iPod/iPhone systems. It had nothing to do with Macs, so how is that a "flailing effort"? They haven't even introduced a PA Semi-designed product yet.

CUDA tech is basically what they are doing

Sticking with IBM for Cell would have made very little sense. The Cell processor is very similar to how NVIDIA's CUDA presents the graphics card to you: limited cache (shared memory), lots of very simple hardware threads, almost no branch prediction, etc. So both CUDA and Cell would crank out great numbers on things like a particle simulator, MPM, image processing, and the like, but are not equipped to do some useful things like running a scheduler, or a word processor. Basically anything that's very

Well, because before hipsters and "cool" kids flocked to Macs, they were amazing at... image processing and the like. Imagine a Cell CPU with, say, 8 cores or so, and CUDA with an additional ton of cores, all churning out amazing shit like Pixar's latest flick or handling massive RAW files and tons of filters.

See, then fine, have your commodity Intel crap for the in-the-know crowd. Ever notice how the "pro" lines are anything but since the switch to Intel? Well, probably not because actually doing the things that

Well, as a few others already pointed out, Apple did look at the Cell chip and rejected it. There are, in my opinion, two big reasons why Apple could never seriously consider the Cell:

First of all, it was a new architecture, which consisted more or less of a PowerPC chip (sharing some similarities with the G5) plus the SPUs, which the PPC part had to feed with data. Well, to truly take advantage of the Cell architecture, all programs would have to be optimized for the Cell architecture, which is not an easy ta