Posted by Soulskill on Friday October 31, 2008 @04:54PM
from the single-core-multi-core-or-apple-core dept.

Preedit writes "Apple plans to get into the business of designing microprocessors for handheld devices, according to legal papers that are part of a dispute between IBM and one of its top technology executives. IBM is suing Power chip expert Mark Papermaster for allegedly violating a non-compete agreement and accepting a job at Apple. In court papers, IBM claims Apple wants Papermaster 'to design microprocessors for incorporation in a variety of electronic devices, including handheld devices.' The suit, according to Infoweek, also notes that Apple earlier this year bought out P.A. Semi. IBM thinks it knows why."

Chip design these days is like the children's game of connect-the-dots.

People use the term SoC (system on chip) to describe them. It's actually quite modular. Basically, you can license an ARM core or a MIPS core and put all your other blocks (PCI, USB, Ethernet) on the same chip, so if Apple were to license the PPC architecture from IBM, I'm sure IBM would be happy. I doubt that's what they're doing, though, since the iPhone is based on ARM.
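A toy sketch of that "connect-the-dots" idea in Python (the `SoC` class and block names here are made up purely for illustration; real SoC integration is done in an HDL, not Python):

```python
# Toy model of "connect-the-dots" SoC design: license a CPU core,
# then wire in off-the-shelf peripheral blocks on the same die.
# Everything here is hypothetical -- just modeling the composition idea.

from dataclasses import dataclass, field


@dataclass
class SoC:
    licensed_core: str                        # e.g. an ARM or MIPS core
    blocks: list = field(default_factory=list)

    def add_block(self, name: str) -> "SoC":
        """Attach a pre-designed IP block (PCI, USB, Ethernet, ...)."""
        self.blocks.append(name)
        return self


handheld = SoC(licensed_core="ARM")
handheld.add_block("USB").add_block("Ethernet").add_block("PCI")

print(handheld.licensed_core, handheld.blocks)
# -> ARM ['USB', 'Ethernet', 'PCI']
```

The point of the analogy: the differentiating work is mostly in choosing and integrating the blocks, not in designing each one from scratch.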

Not a lot of people design processors from scratch anymore.

Unless he designs the processor from scratch, he's really not competing. I can't imagine Apple doing something that stupid.

That article alludes to his experience with low power. He probably knows a few tricks for reducing power load. That is the expertise they are drawing on. He isn't competing with IBM; MIPS, ARM, and Intel are.

There are several types of licenses one can buy from ARM. The most expensive type, the type Apple is rumored to have acquired, is an architectural license, which allows one to design one's own CPU core. Why would Apple buy such an expensive license if all they were going to do was "connect the dots"?

Well, I was talking about optomechanical mice there (shining LEDs through a rotating grating, instead of video-processing a picture of your desk), which might be a little out of date now, so if you have a new PC your mileage may vary. Winbond are/were a major manufacturer of mouse chips based on 6502 cores, let's say five years ago (no patents and licensing fees there), but this year they have been spun off into Nuvoton, which apparently uses the "CompactRISC 16" CPU from National Semiconductor. I guess 16 bit

There's no such thing as PPC or ARM expertise. Once you get past the instruction decoder, you are designing an execution engine. The kind of workload you expect influences this design a lot more than the ISA.

Huh? My first Mac keyboard (not quite three years old) has page up and page down keys. Never noticed my MacBook doesn't have them because the two-finger-scroll is so easy. They're normally buried in "Fn" hell on PC notebooks anyway.

I disabled two-finger tap and two-finger scroll on my MacBook Pro, and went with SideTrack to give me a much less annoying virtual second button that doesn't keep getting accidentally misinterpreted as unwanted mouse pointer movements.

Yes, I'm sure my fingers are defective, but it's a lot cheaper to replace a driver than my fingers.

I'll accept some shitty media buttons that I'll never use as long as they're not in the way, don't add 5 pounds of ugly, and don't do stupid shit like put the system into standby if I accidentally hit it.

That is honestly my preferred keyboard layout too, although I don't mind having media buttons. I will tolerate other keyboards, but there are some things I will not accept.

I will not accept power management buttons that can be accidentally pressed. (The sleep button on my current keyboard is level with the plastic face, making it hard to accidentally push. It is also at the extreme top right corner, in the area where the status lights would normally be. That works OK for me.)

As for other things, the indented Windows keys Microsoft now mandates (since the release of Vista) should really be flat, although that is not a deal breaker.

What is this? The Dell keyboard I'm on right now (which I think is actually of Logitech pedigree) has the Windows key just like the Alt key, except it's got a circular indentation with a raised Windows logo in a circle. Similar to the top 2 here: http://static.flickr.com/36/113313821_90211ff7ec.jpg [flickr.com].

Yes, that is what I was talking about. It is definitely not a deal breaker by any stretch, but the key really ought to be flat. There is no good reason for the key to be anything other than flat. It's not like it actually gets used so frequently that it needs an aid for quickly finding it with your little finger.

Everyone did business with the Nazis. Germany was (and is today) an economic power. I assume that before the US entered the war, most knew that the Nazis were a totalitarian regime, but is there historical evidence that the general public knew what was going on in the concentration camps? There is a big difference between imprisoning an ethnic group and torturing and murdering them (including children). I am skeptical that BMW, IBM, and hundreds of other major businesses were keyed in to what was going on until

OS X is portable. They were still supporting Power-based Macs last time I checked. The next iPod touch could be moved to Power if the chips are low-power enough. The next Apple TV could use a Power CPU. A netbook could use Power as well. That might be a big win for Apple, since it wouldn't take away any sales from MacBooks.

Now I am just waiting for Apple to buy AMD and Foxconn. :) I think they have the cash on hand for AMD, for sure.

But they're not using any now. You don't need to *replace* chips you're not using anyway.

I think IBM's smart enough to be able to check the Apple Store and notice the complete absence of any obviously PowerPC-based products, from the iPod shuffle up through the 8-core Xeon-based Xserves. So who am I to doubt IBM's word that they're making and selling systems using PowerPC? Clearly I'm doing a bad job in my search, and figured someone here could point me to the missing Power Mac or PowerBook they're still shipping.

IBM also claims that Apple considered replacing the IBM Power chips used in some of its computers with chips made by P.A. Semi.

Apple was in talks with PA Semi long before they bought them and before they stopped making PPC computers. It was rumored that Apple was going to switch to their chips for their laptops but instead decided on Intel.

AMD is going to roll out 45nm CPUs soon. A dual-core 45nm version of their latest CPU would make a good laptop CPU. Add in that their 780G blows away what Intel offers. Then throw in the ATI line, so Apple can have access to the latest and greatest GPUs. It might not be a bad buy for Apple. But hey, I don't write the checks. The thing is, if Apple did buy AMD and Foxconn, they would have complete control from the ground up. I can see Jobs going for that idea.

OS X is portable. They were still supporting Power-based Macs last time I checked.

As I recall, Snow Leopard, or whatever the next version is called, is dropping support for Power-based Macs. If they were planning to switch back, or to support the chips in handhelds or something, why drop support on the G5s, etc.?

Dropping support is not the same as not being portable. Do they claim support for ARM? I don't think so, but the iPhone uses one. Power on the desktop? No, I don't think so. Power on the iPhone, iPod, and maybe a netbook? That I can see. But we are all just guessing. Unless I am right. Then I am brilliant and insightful.

OS X ran on Intel the entire time it was in development. They didn't mention or release an Intel version until 10.4. I wouldn't put it past Apple Inc. to have an internal version of OS X for PPC, or any other architecture, ready for the right moment.

Chip supply is a major weakness/obstacle for Apple. Smart business practice will have options should the current supplier have trouble with yields or other issues, not to mention forward looking technology ideas. Apple is not just smart about tech, they're smart about business. They won't risk their whole business on the fortunes of Intel. Let me repeat that, they won't risk their whole business on the fortunes of Intel. And t

I don't know about OS X specifically, but OPENSTEP ran on MIPS and SPARC as well. Both of these architectures have low-power versions aimed at embedded devices. Apple got a team of people from NeXT who were very experienced in shipping software that ran on half a dozen architectures. This puts them in a really good position now - they can easily ship three or four architectures, depending on which manufacturer produces the cheapest chips for any market segment they're interested in. ARM wasn't one of th

There aren't any good SPARC64 chips in this power range, but there are quite a few SPARC32 implementations. The only SPARC laptops I know of are SPARC64s, which is a bit strange considering that my SPARC workstation spends most of its time running 32-bit code (there's no point building things as 64-bit on it since it just makes them use more i-cache and run slower).

Hold on, this is what IBM has put in this lawsuit. This NEW lawsuit. This lawsuit started AFTER the whole Power PC transition was complete. I know what was rumored THREE YEARS AGO, but why is IBM referring to it NOW?

Why? If you spend 5 years at a company and learn 'the trade' on their dime, they should be safe from you running to the next company and spilling everything they worked hard to make, at least for a short time. It would be massively unfair for me to take your designs for "insert tech here" and run to "insert corp/country of choice" and beat you to market, or follow very closely behind.

Stealing designs is already illegal in the first place. Non-compete agreements prevent you from taking a similar job after your current one has been terminated, even if you have no intention of stealing your former employer's trade secrets.

The real aim of non-compete agreements is to lower your negotiating power. Take this salary cut, and no, you can't go to the competition, because of the non-compete.

That may be true in some cases, but given that "IBM offered to pay Mr. Papermaster one year's salary in exchange for Mr. Papermaster to respect his contractual obligation to refrain from working for an IBM competitor for one year," I don't think that's the case here.

It seems to me like they just don't want to lose their trade secrets to their competitor. And in a high-tech field like chip design, a year's lead on the competition would be very significant (or at the very least enough for the trade secrets held by a

A year out of the industry is not going to be looked on favourably by future employers, so even receiving full pay for the duration is not enough compensation for such a restrictive term in my contract. HR normally backs off when you point things out in these terms.

Sorry, but it's not fair for one company to have better employees than another company just because they hire smarter people or give them better training. We must redistribute smart, knowledgeable employees to companies that aren't as well off.

Why? If you spend 5 years at a company and learn 'the trade' on their dime, they should be safe from you running to the next company and spilling everything they worked hard to make, at least for a short time. It would be massively unfair for me to take your designs for "insert tech here" and run to "insert corp/country of choice" and beat you to market, or follow very closely behind.

Yeah... the keyword there is *if*. If you do that, then you should suffer the legal consequences (if there are any), but you shouldn't be punished simply because you *could* do that. In any event, treat your valued professionals like they are valued, otherwise somebody else will. Like it or not, the labor market succumbs to the same market forces that every other market does...

If they can't keep you there by treating you well, providing you opportunity to grow, or paying you well, then why does that company deserve to hold a monopoly on your employment?

The other problem with non-competes is that there have been numerous cases where employees are laid off but their non-competes are enforced anyway, preventing them from getting jobs in the industry.

Also a company should not be defined by an individual contributor. A company's success depends greatly on the culture and teamwork within that company. Something that is not easy to export (or import, as many merged companies have found out).

Also, "trade secrets" and patents are outside the scope of a non-compete clause. And you are liable for civil damages if you distribute trade secrets, even if you no longer work for that company.

I've been in a situation where I have moved jobs and been asked by my boss at the new employer for technical details of something I developed at my previous company, and refused - my position was that I would happily develop something different but equally effective for my new employer, but that the particular formulation belonged exclusively to my previous employer. I was prepared to use my expertise, but not any knowledge of the other compan

"Why? If you spend 5 years at a company and learn 'the trade' on their dime, they should be safe from you running to the next company and spilling everything they worked hard to make, at least for a short time."

That would be true if they just paid you to hang out and learn. Their "dime" goes to pay you for the work you did to help their company prosper.

You can't take any trade secrets with you, but the general knowledge you gained belongs to you.

Traditionally, trade secrets are held to a different standard than a non-compete. Even after a non-compete expires, you're still not allowed to disclose trade secrets. The sole reason for a non-compete is to render the signer completely useless for a year or two, to prevent companies in the same line of work from benefiting from another company's stupidity. It completely goes against the point of a capitalist society, which is why many states just tear it up as overstepping the legal bounds of a contract. There are

God forbid people show some personal responsibility and not work for companies that force such agreements. Of course, since it seems people will put up with anything as long as the numbers next to the $ are slightly bigger than at another company, I'm not surprised.

"Mark Papermaster"? What is it with all the oddball last names in the technology business? There's Faith Popcorn, but wikipedia says her birthname was Faith Plotkin. But "Papermaster" sounds like someone who should be running either a D&D game or Dunder Mifflin (or Wernham Hogg, I guess).

Probably more like this person [animenewsnetwork.com]. Why point to the sequel when the original is available? But who am I to talk; this was the first time I'd heard of this. No, I'm not a huge anime fan, but I occasionally watch it on CN, Spike, and G4. I'll keep my eye open for it.

You might want to look into that a bit further. First, from the article that blog quoted, I think the blog is wrong about it being unconstitutional. It's just against state law -- CA law. Second, even if by "state law" the article meant "constitution", the decision only applies within CA, which has stronger pro-worker laws than just about anywhere else in the US. IBM is suing in NY. So this decision probably means almost jack squat for this case.

I prefer French law on non-competes. If you have one of your employees sign a non-compete, three conditions must be respected: limited scope in geography, limited scope in time, and (the best one) while your former employee is unemployable due to the non-compete, you must pay him compensation for his unemployability. I don't remember how much, but it's a certain percentage of the salary.

I don't know about the first one, but IBM seems to be meeting the last two requirements: they're only asking that he refrain from working for Apple or another direct competitor for one year, and they offered to pay him a year's salary (on top of his default compensation package) in exchange for his abiding by the non-compete clause.

While I think that non-compete clauses definitely have some potential for abuse by employers, I don't think IBM is being that unreasonable in this instance.

Power.org [power.org] is the standards body that controls the POWER(PC) ISA specifications, among other things. Its members include IBM, *Apple*, Freescale and many others. If you want to build a custom designed chip based on one of the ISAs "owned" by Power.org, then all you need to do is become a member and license the ISA of your choice. You are then free to design any kind of custom *micro*-architecture your heart desires as long as the ISA presented by your chip/micro-architecture is compatible with the ISA you licensed from Power.org.

I am NOT an "Apple fanboy" (in fact, I can't stand them, and I've never owned any Apple hardware or software in my life -- I'm running Kubuntu 8.04). My comment concerning IBM's competency was in the context of their supporting, actually giving away, their processor ISAs and encouraging everyone to join the Power.org custom ASIC bandwagon, yet at the same time crying about some idiotic non-compete agreement when they created the competition in the first place. Oh, and my "harangue about Apple's licensing rig

Seems to me that if you work in a fairly specialized field, like microprocessor design, then pretty much your only choice to stay in the field when you leave your current employment is to go to a competitor; everyone else in the field is a competitor. It's pretty unusual for a court to enforce a clause which boils down to "if you leave the company you can't work in your field".

And what was the last company that thought designing their own chips was a good idea? Nvidia, right?

Um, nVidia has no option but to design its own chips; it's the top GPU designer in the world. A better question would be why Intel bothers designing their own GPUs instead of partnering with nVidia.

A better question would be why Intel bothers designing their own GPUs instead of partnering with nVidia.

Might be because nVidia's chips are heavily based on licensed technology, which would restrict what Intel could do with them.
Wasn't there a problem with Microsoft being annoyed about nVidia's license terms for the Xbox GPU, pushing them to ATI for the 360?
Something like that; I don't remember where I read it, though.