AMD: Apple Will Eventually Use Our Chips

AMD chief executive Hector Ruiz said Wednesday that Apple will eventually use its microprocessors alongside those from Intel. Ruiz made the comments during a dinner speech at the Commonwealth Club in San Francisco, according to Bloomberg. “Everybody wants choice,” he said, adding that rival Intel’s practices have stifled the PC industry’s growth. “Knowing Apple, why would they want to be held hostage like everyone else has been?”

…to join forces once again with the CEO who repeatedly screwed it over on Motorola chip deliveries.

My guess is that AMD’s lobbing not-so-subtle threats at Apple as far as ATI video card pricing is concerned; before The Switch, Mac minis had ATI video onboard, and now they’ve lost that market to Intel integrated graphics.

I believe Intel’s per-processor manufacturing costs are lower than AMD’s, and I am sure the pricing Apple gets is competitive, likely even cheaper than what AMD would offer.

Currently Core 2 looks like it is going to be a strong contender in performance and price for a while, and Intel’s mobile line has always been strong.

Apple’s other products are its Xserve servers and computing clusters, two more areas where Intel holds a dominant position.

I also wouldn’t be shocked if Intel played an important role in helping Apple design an x86 compiler and motherboards.

I think in shifting away from PowerPC, Apple was looking to move into something more mainstream to keep up without skyrocketing R&D costs.

The cost of additional R&D and testing associated with including AMD processors, as well as a reduction in their volume discounts with Intel, means that (at least in the short term) adding AMD is not worth it.

Maybe AMD has some kind of long term plan cooking, but who doesn’t? What if ATI tanks after they throw away the ATI name and put it under new management?

If AMD’s processors become such a powerful product that Apple has no other choice, then what difference does it make whether Apple buys them now or not, since AMD would have the rest of the industry at its doorstep anyway?

I think CEOs generate hype like this once in a while because it is good for short-term gains in the stock price.

I’m not sure where you found that config at that price (I didn’t see anything matching on Pricerunner), but you should do a side-by-side comparison of the two and also take into account OS cost, weight, etc.

Who cares if you can find a laptop at that performance/price if it weighs 6-7 kg, for example?

I’ve not found a 6-7 kg notebook yet, but I agree that Apple stuff is always more expensive. If you live in a country where VAT is 20%, an Apple PC turns out to be a mere status symbol. I personally don’t see any reason to buy one when you can build your own system for less, unless you really want to splash out big bucks and show it off. As for notebooks, well, why should anyone buy a tiny-screened MacBook? Would a Turion chip make it more palatable, cost-effective or performant? I don’t think so. They’ve shown they’re certainly capable of switching, but heaven knows why and when they’re going to transition again. At the moment there’s no point in going for AMD: what would be the strategy behind it at all? If they thought an environmentally friendly PC could bring in more money, they’d even be using Via C7-D processors…

VAT is added for Macs and PCs both, and if we’re gonna talk desktops (not laptops), then Apple has really good prices on its offerings when compared to other pre-built options.

Most comparisons of cost between pre-built and custom-built systems seem not to take into account the time spent on each before you can start using them. If you assign a cost to each hour spent, don’t be surprised if the custom-built system ends up more “expensive” for equal performance and capacity.
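
To make that concrete, here is a rough break-even sketch; every price and hour figure in it is hypothetical, just to illustrate the point:

```python
# Rough break-even sketch with made-up (hypothetical) figures:
# how much assembly/setup time erases a custom build's price advantage.
prebuilt_price = 1200.0   # pre-built system, ready to use (assumed)
parts_price = 1000.0      # equivalent parts for a custom build (assumed)
build_hours = 6.0         # assembling, installing OS, drivers, testing (assumed)
hourly_rate = 40.0        # value you place on your own time (assumed)

custom_total = parts_price + build_hours * hourly_rate
print(f"Pre-built: {prebuilt_price:.0f}, custom incl. time: {custom_total:.0f}")
# With these numbers the custom build comes to 1240, i.e. more than the pre-built.
```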

Lately I was able to configure two systems, one with a Core 2 Duo E6400 (@ 2.13 GHz) and one with an AMD Athlon 64 X2 4200+ (Socket 939).

I used an Intel D975XBX board for the Intel CPU and an Asus A8N-SLI Deluxe for the AMD one. I then set both to the lowest power consumption from the BIOS and from Control Panel/Power Options, and watched the heat generated by means of an IR/laser thermometer.

I found the AMD CPU to be cooler when idle than the Intel one. I was shocked to find this by experiment, and then I understood why: Intel’s SpeedStep clocks the CPU down from 2.13 GHz to 1.8 GHz, while the AMD CPU drops from 2.2 GHz to 1.0 GHz! That resulted in 34 degrees Celsius for the AMD CPU at idle, and not more than 45 degrees at full power. The Intel CPU was 38 degrees when idle and 43 degrees when working fully.

But as we all know, average CPU load during the day will be just 1-5%, as on my file server and my Windows and Fedora desktops. So I concluded that if you are not gonna use the system you build for gaming, it would be wise to buy an AMD solution rather than Intel, because in gaming the average load is always above 50%.
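
To illustrate that conclusion with the temperatures measured above, here is a very rough sketch that just time-weights the idle and full-load temperatures by average CPU load (a simplification, since real thermals don’t average linearly):

```python
# Time-weights the idle and full-load temperatures reported in the post
# by average CPU load. This is only an illustration of the reasoning,
# not a thermal model.
def weighted_temp(idle_temp, load_temp, avg_load):
    """Time-weighted temperature for a given average CPU load (0.0-1.0)."""
    return (1 - avg_load) * idle_temp + avg_load * load_temp

for name, idle_c, full_c in [("AMD X2 4200+", 34, 45), ("Core 2 Duo E6400", 38, 43)]:
    desktop = weighted_temp(idle_c, full_c, 0.05)  # ~5% load: file server / desktop use
    gaming = weighted_temp(idle_c, full_c, 0.60)   # gaming load, assumed 60%
    print(f"{name}: ~{desktop:.1f} C at 5% load, ~{gaming:.1f} C at 60% load")
# At ~5% load the AMD chip stays noticeably cooler; at gaming loads the two
# end up roughly even, which matches the conclusion above.
```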

I used the AMD and Intel boxed heatsinks/fans, and checked temperatures with both the motherboard/CPU sensors and the laser-guided infrared thermometer, just for confirmation.

By the way, the Intel chipset was unbelievably hotter than the AMD one, both northbridge and southbridge (52 vs. 32 degrees).

I have noticed that Fedora also stepped the Intel and AMD CPU speeds like Windows did, and thus I got these values on both OSes.
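
For anyone who wants to verify the frequency stepping on their own Linux box, here is a small sketch that reads the standard cpufreq sysfs interface; the exact paths are an assumption and depend on the kernel and distro:

```python
# Read the current CPU frequency scaling state from the cpufreq sysfs
# interface (assumes the kernel exposes it at this standard location).
from pathlib import Path

cpufreq = Path("/sys/devices/system/cpu/cpu0/cpufreq")
if cpufreq.exists():
    governor = (cpufreq / "scaling_governor").read_text().strip()
    cur_khz = int((cpufreq / "scaling_cur_freq").read_text())
    max_khz = int((cpufreq / "scaling_max_freq").read_text())
    print(f"governor={governor}, current={cur_khz/1e6:.2f} GHz, max={max_khz/1e6:.2f} GHz")
else:
    print("cpufreq sysfs interface not available on this system")
```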

The AMD board was using the Nvidia nForce 4 SLI chipset.

So, maybe Apple might consider AMD without any fear of heat problems.

Lately, I have checked the Apple laptops on show at Fry’s Electronics, and found the bottoms of the white/black 13″ laptops boiling (at the 10 o’clock position).

Desktops rarely have heat problems these days. Laptops are a different story, though. What’s needed is definitely a breakthrough, perhaps a Peltier element that absorbs heat from the CPU and feeds a percentage of the energy back to the battery as electrical current.

“Knowing Apple, why would they want to be held hostage like everyone else has been?”

Because nothing AMD has can take on Core 2 architecture?

If AMD comes out with something better, it might be a different story. At the same time, Intel is the only company that already has the ability to do 45 nm, with 32 nm to follow. So… AMD has its work cut out for it.

When the Opterons and AMD64 came out, it was Intel that was in this situation. I don’t think AMD is doing any worse now (hell, check the architecture, power consumption, thermals, and performance charts) than Intel was back then, given the situation.

Apple probably knew that Core 2 would be faster than the A64s, so it chose Intel. Intel’s managers were probably better salesmen in this case than AMD’s. And for spoiled Mac users, Intel is probably a word they want to see stuck on their box (or elsewhere).

It might be possible for servers IF the TDP is low enough and the pricing per chip is low enough to warrant Opteron use vs. the server version of Core 2 Duo (or quads of either when available). Of course it IS possible that some customers may be willing to take an initial price hit and somewhat lower performance for the small power savings in LARGE installations, although I don’t think that there are, or ever will be, too many large Xserve installations, as there are equivalent and much cheaper alternatives. GUIs are pretty worthless on servers…
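
To put the “small power savings in large installations” point in perspective, here is a back-of-the-envelope sketch; every figure in it is made up for illustration and is not taken from any real Opteron or Xeon pricing:

```python
# Back-of-the-envelope payback estimate for buying a lower-TDP chip in a
# large installation. All figures are hypothetical assumptions.
servers = 500                    # size of a "large" installation (assumed)
watts_saved_per_server = 20.0    # per-server saving from the lower-TDP part (assumed)
hours_per_year = 24 * 365
price_per_kwh = 0.10             # electricity price in $/kWh (assumed)
price_premium = 150.0            # extra cost per server for the lower-TDP part (assumed)

annual_savings = servers * watts_saved_per_server * hours_per_year / 1000 * price_per_kwh
payback_years = (servers * price_premium) / annual_savings
print(f"Annual power savings: ${annual_savings:,.0f}, payback: {payback_years:.1f} years")
# With these assumed numbers the payback is well over 8 years, which is why
# the savings only matter to customers willing to take the initial price hit.
```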

I can’t really see it on the desktop though, as the power difference is not nearly enough to justify the pricing and performance gaps.

Then we could also get into AMD’s manufacturing capacity, as I believe they have even less capacity than IBM does.

I like AMD and would use them again if they could at least match Intel’s performance and pricing, but right now they aren’t even on my list any longer, except for my older Socket 939 motherboards, which are already covered/maxed out.

Oh well, good luck to AMD, but I won’t be looking at a new AMD product again until they can produce something at least equivalent to Intel’s current architecture and at a comparable price.

Apple chose Intel because “Intel” is a brand that is very easy to sell, despite massive Intel screwups. Also, Intel is able to deliver almost everything within the box, making it somewhat of a controlled environment. That also has an effect on the price.

In short, Intel is a much better package deal than any other, and I haven’t even mentioned Core yet.