According to ZDNet's Paul Murphy, Apple's migration from PowerPC to Intel processors resulted in a massive spike in power consumption that has hurt the economic diversity of the US and devastated the environment. Even worse, he suggests, it involves scandal on the part of Apple board member Al Gore. He's wrong, and here's why.

Why the Myth Was Woven.

Few things make a better story than exposing an ironic contradiction of epic proportions. Luke: I am your father! Soylent Green is made of peeeeople! They couldn't hit an elephant at this distance! Microsoft pushed QuickTime's Final Cut! Religious advisor to the Bush administration paid a same-sex escort for meth-enhanced adultery sessions while advocating against gay marriage!

Such twists are shocking and engaging because they force a reexamination of what we assume to be the case.

So just imagine if one of the largest and most widely lauded microprocessor transitions in the tech industry were actually a miserable failure of poor decision making, resulting in inferior technology paired with higher power consumption. Oh the humanity!

Further, imagine that a company with a green-friendly reputation and a high profile board member who is known as one of the foremost leaders in calling attention to environmental issues--and in particular greenhouse gas emissions and global warming--was behind this contradictory twist.

Wouldn't that make a spectacular story?

The Myth Weaver.

Murphy, who writes for ZDNet, isn't your typical Apple basher by any stretch of the imagination. In fact, on the relatively few occasions he has written about Macs in his column, he has lined up facts and research to refute mythical ideas.

Murphy presented a factual comparison to take apart the myth that Macs were ever significantly more expensive than equally configured PCs. More recently, he called out problematic pundit George Ou for an irresponsible, sensationalist portrayal of Mac OS X as less secure than Windows XP, based on Ou's simplistic effort to conflate vulnerability report counts with exploitable security issues.

Murphy's recent comments and calculations describing the dramatically increased carbon footprint of Apple's products subsequent to its transition to Intel are therefore more deserving of critical consideration than the typical volleys of impassioned emotionalism that gush from the usual Apple antagonists.

The problem with using math and statistics to prove a point is that if you start with rubbish numbers, no amount of calculation will correct them. Start at zero facing the wrong direction, and the only course left is a downward spiral of exaggerated miscalculations.

Murphy started by citing Steve Jobs’ Green Apple comments, where the CEO wrote, “We are also beginning to explore the overall carbon 'footprint' of our products, and may have some interesting data and issues to share later this year.”

In order to beat Jobs to the punch at figuring the carbon footprint of Macs, Murphy put together his own calculations.

Murphy wrote that the new Core Duo processors range from “31 watts for the low end of the laptop line to well over 180 watts,” then compared these figures to those cited by Freescale (the former Motorola semiconductor division), arriving at an Intel premium of 16 to 47 watts per processor over PowerPC. According to his math, that makes the “average difference a minimum of about 31 watts per usage hour.”

Some Guesti-ti-mates.

“What this means,” Murphy wrote, “is that, at the very least the six million Apple computers being sold in this fiscal year would be burning 186,000KW per hour [sic] less power if Apple had not switched to Intel's x86 products.”
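For what it's worth, Murphy's multiplication is at least internally consistent. A quick sanity check, taking his 31-watt average premium and six million Macs at face value (both are his assumptions, not verified figures), does reproduce his 186,000 kW total:

```python
# Sanity check of Murphy's arithmetic.
# Both inputs are Murphy's own claimed figures, not verified measurements.
watts_premium_per_mac = 31     # his claimed average Intel-over-PowerPC premium
macs_sold = 6_000_000          # his figure for Apple's fiscal-year unit sales

total_watts = watts_premium_per_mac * macs_sold
total_kw = total_watts / 1000

print(total_kw)  # 186000.0 -- matches his "186,000KW" figure
```

So the multiplication checks out; the problem, as the rest of this article shows, is that the inputs fed into it are wrong.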

Citing the amount of coal needed to extract all that extra energy, Murphy states that the “additional greenhouse gas burden imposed on the planet by Apple's decision to prefer x86 over PPC is about 357,000 tons per year.”

He compares this to the striking image of “a column of SUVs 100 feet apart and 38 miles long run 24/7 at 60MPH.” These numbers, he warns, are only “rock bottom guesti-ti-mates with more realistic ones easily reaching a million tons a year.”

The first ironic twist is complete; to ice his cake, Murphy then begins to assail individuals at Apple. “Given his political posture you'd think he'd [board member Al Gore] have raised the greenhouse gas issue with Jobs and perhaps even resigned in protest over the Intel decision, but he didn't.”

What Did He Know, And When Did He Know It?

Raising the rhetoric from technical to political, Murphy continued, “Gore not only voted for the MacTel switch, but actively campaigned on Intel's behalf prior to the vote.” With what results?

According to Murphy, Apple's transition to Intel “not only hurt U.S. economic diversity, but is directly responsible for pouring at least another three hundred and fifty thousand tons of green house [sic] gases into our atmosphere each year.”

Oh dear! Is the world's most famous publicist of global warming actually guilty of campaigning for changes that resulted in the unnecessary burning of tons of fossil fuels and the dismantling of the foundations of the US economy?

And is Apple really just hiding its disastrously incompetent technical decisions behind a pleasant front of cheery marketing in order to melt the polar ice caps and destroy the world? If so, this is news.

Before looking at the math, let's look at the Google search that supplied Murphy's initial numbers. Murphy cites Intel processor power consumption figures of 31 to “well over 180 watts,” but Intel's most power-hungry Core Duo processors ever were only 130 watts. That's a fairly significant error to be multiplying by 6 million.

The latest Woodcrest Xeon processors used in the top-of-the-line Mac Pro workstations use 65 to 80 watts (depending on their speed) when cranking away at full bore. When idle, the processors cycle down to a lower clock rate and consume less power, and of course the entire machine can go to sleep in power saving mode.

But Apple doesn't sell a large portion of its millions of Macs in the form of Mac Pros. More than half of all Macs sold are laptops, which not only use more power-efficient chips running in the range of ~35 watts, but also spend more time sleeping and in other power saving modes.

I Don’t Think That Means What You Think It Means.

Now take a look at the PowerPC options Apple abandoned. The examples Murphy cited are 'system on a chip' embedded designs, not laptop or desktop processors. Murphy doesn’t seem to understand what this means, despite linking to a Freescale page that pointedly describes it as being designed to run a phone system.

The general purpose PowerPC chips Apple was using in 2005 consumed more power than today's significantly faster Core Duo chips, not less. No PowerPC partner has delivered anything comparable to Intel’s Core Duo.

A major reason why Apple moved to Intel was not because Intel chips were outrageously faster, but because they offered more performance in relation to the power they consumed. This was public information.

That efficiency enabled Apple to build new devices that PowerPC partners IBM and Freescale did not have on their radar. PowerPC was migrating away from desktop and laptop processors and toward embedded devices that were not suitable for the new products Apple wanted to introduce, including the Apple TV and a new range of significantly more powerful MacBooks with better battery life.

Rather than demanding an additional 31 watts of power, the new Intel Macs use less power overall. Backing up yet another step, it's important to consider that the central processor in a PC is not the most demanding power consumer anyway. The graphics processor, RAM and other devices can use far more power combined than the CPU itself, and all that power can be dwarfed yet again by the energy used for the display.

CRT displays can easily burn up 150 watts; flat panel displays commonly consume less power when operating than a CRT uses when asleep: around 30 watts. The real power consumption of a PC is therefore not centered on the processor, leaving Murphy's entire argument without a foundation.
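To illustrate how small a slice of a machine's power budget the CPU actually is, here is a rough sketch using the ballpark wattages above; the combined figure for the graphics card, RAM, and drives is a hypothetical illustration, not a measurement:

```python
# Rough power budget for a circa-2006 desktop PC with a CRT.
# These are ballpark illustrations drawn from the article, not measured values.
components_watts = {
    "CRT display": 150,          # "CRT displays can easily burn up 150 watts"
    "CPU": 35,                   # a laptop-class Core Duo at full load
    "GPU, RAM, and drives": 60,  # hypothetical combined figure for other parts
}

total = sum(components_watts.values())
cpu_share = components_watts["CPU"] / total
biggest = max(components_watts, key=components_watts.get)

print(biggest)                             # CRT display
print(f"CPU share of total: {cpu_share:.0%}")  # CPU share of total: 14%
```

Even with generous assumptions, the CPU accounts for only a fraction of the total draw, so a per-processor wattage premium cannot simply be treated as the machine's overall power story.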

Apple's pioneering discontinuation of CRTs did far more to drop Mac power consumption than the move to Intel Macs, but both were positive improvements. The new Mac Pro's total power consumption is less than that of the Power Mac G5 it replaced, despite being significantly faster and using higher-performance parts throughout.

There are no magic G5+ chips hidden away in the bowels of IBM that offered less power consumption at equal performance to what Intel was offering, and nothing at all in the world of PowerPC that could pretend to approach the power and performance combination of the mobile-optimized Core Duos used in Apple's Intel-based laptops.

A properly formed Google search would reveal that in one click. It's certainly not a controversial topic.

Murphy's comparison of the premium power consumption of the entire world's Intel Macs to a fleet of SUVs stretching far beyond the horizon makes for a shocking picture, but it is an absolute fiction.

The migration to Intel wasn't a step toward more power usage, but rather significantly less. There is no growth to calculate, apart from the fact that there are significantly more Macs being sold now.

However, the doubling of Mac unit sales is not the environmental burden that finger-counting math would suggest. Instead, Mac sales are displacing new PC growth and replacing older, far less efficient PCs; new Mac sales are growing at around three times the rate of the rest of the industry.

Cheaper PCs have long relied on power-chewing CRT displays and inefficient boxes full of fans running full tilt. Pentium 4 systems of the former generation of Intel PCs could easily consume over 300 watts; that incited rival AMD to tout its chips' power savings in its advertising campaigns.

Hot heatsinks require spinning fans, which are notoriously inefficient themselves. Add a big cheap CRT and a fat GPU, and your PC is consuming a lot of power. Replacing it with an iMac results in a significant power savings.

That said, using your iMac to work from home--rather than driving your vehicle to work every day--has an astronomically larger impact on energy consumption than does adjusting your computer's efficiency.

This fact makes it particularly disgusting that Murphy compared electrical computer power usage to the consumption of fuel used to move around tons of metal.

Don't waste air trying to tell me that electrons racing along copper pathways inside my computer are at all similar to the power inefficiencies of burning gasoline to move an enormous vehicle around.

Press your face against the hottest part of your MacBook, and you still won't be prepared for the heat you'll encounter by touching any portion of your engine after a short drive. Was all that heat created for free?

Faulty Math vs Irresponsible Conclusions.

I am careful not to present numbers without some extensive double-checking because I have plenty of experience in making mistakes. That allows me to easily forgive Murphy for taking some erroneous numbers, multiplying them out into the millions, and jumping to an absurd conclusion in error.

What is harder to understand is his casual attack upon the integrity and honesty of somebody who has spent a considerable amount of his own resources to direct attention to environmental issues. Gore is not serving himself by presenting global warming as an issue to the world.

People can say what they want about another's political ideas, but inventing a false scenario, then portraying Gore as a merciless hypocrite and fraud based upon that false information, is really just inexcusable.

Using the same Internet that Gore pioneered funding for as a Senator to attack him and his reputation with poorly conceived, specious claims just adds a further ironic twist to the story of the “evil conspiracy that wasn't really there.”

PowerPC Died, Move On.

Murphy also used the opportunity to take a wild conjectural stab at how Apple's abandonment of PowerPC--many years after Motorola, Microsoft, IBM and other PowerPC participants effectively gave up on the project as it was originally intended--somehow undermines the diversity of the American economy.

Would the world be better off with Apple trumpeting PowerPC on its own, delivering lower-powered computers that consumed more energy, and offering fewer alternatives to the power-hungry and inefficient generic PCs available, all just to preserve artificial ‘diversity’ in the number of PC processor families available?

Does it help to point out that--despite Apple's transition--PowerPC hasn't gone anywhere, and that there has been no loss in diversity? Just ask any next generation game console maker!

There is a big difference between being factually in error and delivering a dramatically incendiary explosion of absurdly false information.

If Murphy were 17 years old and writing in a school newspaper, his article would be easier to laugh off as an experiment in misguided youthful rage. Murphy's bio says he has been paid to write about technology for 20 years. How shocking that he would sully his reputation with such irresponsible and reckless wording.

The Mother of All Twists.

Perhaps the oddest aspect of Murphy's erroneous, misguided exposé is that the basic scenario he unfolds actually already happened, but the culprit wasn't Apple. Back in the 90s, there was a diversity of significant CPU designs, each offering different advantages in performance, price, efficiency, and simplicity:

• the newly emerging PowerPC

• DEC's amazing Alpha

• Silicon Graphics' MIPS

• Sun's Sparc

• HP's PA-RISC

Compaq's purchase of DEC, followed by its merger with HP, left three of those processor families in HP's hands. Under the disastrous management of Carly Fiorina, HP decided to scrap all of its processor technologies to join SGI and most of the rest of the industry in supporting Intel's promises for a new generation of high-end processors: Itanium.

Intel's Itanium ended up a miserable failure that sucked up billions of dollars to deliver nothing, while effectively destroying billions more dollars of value locked up in the competing technologies that were sacrificed on an altar of credulity. To salt the wound, Itanium also consumed horrific amounts of power while it helped set back the state of the art in processor design.

That's a story to tell, and it's true. Nobody seems to like to tell it, because it reminds the world that know-it-all tech giants have made epic blunders before, and that traditional media outlets--like Murphy's ZDNet and every other pundit podium and soapbox on the web--were nearly unanimous in cheering on the monoculture that Itanium promised to usher in.

With Itanium from Intel and a new Itanium-optimized version of Windows from Microsoft, flacks of all suits jeered at proponents of Sun's Solaris on Sparc and Apple's PowerPC Macs as befuddled nincompoops who had stupidly failed to drink their portion of the FlavorAid allotted by the never-to-be-questioned fearless leader of WinTel.

However, as the dotcom bubble inflated, Itanium rotted on the vine, leaving Apple and Sun as major beneficiaries of the extra money being spent. Had Sun and Apple scrapped their own plans to join the Itanium cult as every conformist pundit had insisted they should, there would only have been extra carnage associated with its sinking.

Apple's successful independence helped to fund the development of Mac OS X, while Sun's independence eventually resulted in the open sourcing of Sparc, as well as Sun's investment in Intel rival AMD and its Opteron processors.

It was tough competition from AMD that shook Intel awake from its snoozing at the wheel. Embarrassment over the spectacular failures of Itanium and the Pentium 4 reinvigorated Intel to return to the drawing board and deliver new technology, including the Core processors that tempted Apple away from PowerPC.

Murphy has previously defended Sun's intentional independence from the Itanium fiasco, and seems to maintain a sharp memory of where the industry has been and why.

He should know better than to invent absurd techno-political conspiracy theories when there are so many deliciously true stories to tell in the tech world.