Posted
by
Hemos
on Wednesday May 10, 2000 @08:24AM
from the open-the-spec-brudda dept.

Quite a number of people have recognized the power of open source development - Intel definitely has by opening the specs for Itanium. They've got major blueprints up on the Web. Good sign of the success of open development - but I wonder if AMD's resurgence has anything to do with this.

What else could they do? "Oh sure, we have this new kickass architecture and we would like to get people to develop for it. Due to our paranoid "don't tell them how it works" policy we won't be releasing any specs though, so you'll just have to guess the right opcodes."

Saying Intel is "open sourcing" Itanium because they are releasing architecture and programming info is like saying Black and Decker is "open sourcing" their toasters because they posted instructions on how to put the bread in.

I agree. I've noticed that Slashdot editors are falling over themselves to post stories which contain any sort of reference (however erroneous) to existing companies jumping on the open source bandwagon. Another example of this is in the Windows Source Code Proposal Confirmed [slashdot.org] story.

Editorial quality on Slashdot needs to improve, because if this trend continues, it'll end up not being much different from any other tech news site, with stories written by people who don't actually have a clue what they're writing about.

At that point, someone else will get frustrated and set up a new site to fulfil the purpose which Slashdot used to, and Slashdot will wither on the vine, frequented by wannabe geeks, Katz-fans, VA/Andover groupies and AC first posters. The last visitors will be the Four Horsemen of the Infocalypse as we^H^Hthey pass through on their journey of destruction... The demise of the 'Net will precipitate a stock market crash of incredible proportions as thousands of billions are wiped off the stock exchanges and massively over-valued dot-coms go bust. A global economic downturn will result, with hundreds of millions dropping below the poverty line...

I'd vote for stupidity. And, um, I hate to say it, but it's not the mainstream media at fault -- it's the Slashdot headline that is wrong. (Yes, the article says "blueprint", but it could very well be using that metaphorically -- especially considering their audience.)

The only thing new here is that the processor manuals are being released prior to release of the chip and they are not under NDA restrictions. With previous chips, the release of the manuals would have waited until the chip was released. (Full Disclosure: I worked for Intel for 15 years and quit last December.)

The only reason this page was called a blueprint is because the borders on the left and top were blue.

This contains no useful information for hardware designers. People making compilers may find something to look at, but to say that the architecture has been open sourced is simply a complete lie. Once again the mainstream media has been used by one of the big companies to spread disinformation and hype either through their stupidity or by having simply been bought off.

Well, nobody in his right mind would write complete software in assembler. But the idea that most 3D action games or multimedia software don't use a single line of assembler, not even for time-critical loops, makes me really sick. Because the coders are so lazy and ignorant, I have to buy a bigger CPU and a faster graphics card and waste money (and pollute the environment) on something that could be done better in software.

One could argue that this is Intel just trying to show Microsoft just how painless it is to open up your APIs:-)

Anyway, I have to agree with those who have pointed out that Intel is being Open Source friendly. It's a rare company in Intel's position that actually gets around to figuring out on its own that trade secrets aren't all they're cracked up to be. Truthfully, one can't look at Intel as doing anything more than responding to market demands. They know that the installed base for Linux is huge, and it's going to grow. In order to maintain their position in the Linux market so everyone doesn't switch over to PowerPC or Alpha, they have to play the game the same as IBM or SGI are doing. That means swallowing your pride and opening up.

It's becoming a common thing, actually -- Lego brought out their Mindstorms sets for Windows only, but they at least don't crack down on third-party development environments. Texas Instruments found out that so many people were hacking their calculators and programming them in assembly code that when they shipped the TI-83 and TI-86 they broke down and told people how to do it their way. Why? Not because Stallman propaganda is finally getting to them. They just realized that it's suicidal to blow off potential customers just because they aren't doing it through channels.

You're confusing speed with MHz again. There's no reason for Motorola to crank the MHz much above 500 MHz at the moment, since at that level (combined with Altivec and the rest) it's already competing nicely with the 1000 MHz offerings of AMD and Intel. If they did, then they'd just be leaving themselves less room to expand in the future. IBM demonstrated some 1000 MHz PPC chips almost two years ago, so it's not an engineering matter that's keeping them back.

And if you want to discuss thermal breakdowns, then why bring up AMD? Their processors are a semi-order of magnitude hotter than Motorola's.

Number 2 is probably exactly what they are shooting for. By having the design public, they can say to the courts: 'yes, obviously, our competitor just looked at our design -- which by the way is public -- and copied it.'

What I'm hoping, though, is that peer-review will reveal that Intel has infringed other people's patents in the Itanium design. That would be a nice unintended side-effect since Intel likes to sue everyone else all the time over chip designs.

Well, if you had some reliable way to simulate the chips (not emulate: we want more than just functional equivalence here), then you could test out mods virtually before submitting them. Intel would, of course, have to choose which submissions it wanted to test in hardware (most likely after running their own simulations on the submissions), and it would have to have the humility to acknowledge that some submissions by non-Intel engineers could possibly be worth looking at. But with those issues acknowledged, this could work...

In the Linux kernel, for instance, people routinely align their data structures by cache line. That's the level of assembly programming I feel comfortable with. But I wouldn't trust a kernel that rewrote one of the core loops (the scan for free memory, for instance) in assembly. Too many mistakes have shown up in far simpler assembly-language statements.

Besides, the real tight inner loops (stuff like memcpy() and friends) usually are already inlined behind your back, by the compiler and libc, in any decent C development environment.

Had the 386 been a complete secret as to the internals (memory management/protection, etc.), would Linux have even been started? (Perhaps, but it would have been much more difficult.)

Actually, I do believe that for the longest time major portions of the 386 instruction set were either maintained as a secret or under some other form of restriction. I recall Insignia's SoftWindows for the MacOS being unable to run Windows 3.1x in Enhanced mode due to some issues with the 386 instruction set.

Up until now, Intel was keeping even the databooks on these chips under lock and key. By giving these out now, they're letting anyone who wants to port an OS to their chips a fair chance.

Now, instead of just Windows NT, Solaris and Linux (the three OSes they were supporting), anyone can go do an IA-64 port.

They were being secretive before, claiming that the data contained in those .pdfs were trade secrets and it would be revealing too much information to let the whole world see it. This really is a good step on their part. Had the 386 been a complete secret as to the internals (memory management/protection, etc.), would Linux have even been started? (Perhaps, but it would have been much more difficult.)

No, they're not "Open Sourcing" their chip, and I don't really see where *Intel* said that. But they are being Open Source Friendly. Don't flame someone for making a good effort, even if it's not as much as you want.

The MIPS is still an academic project. My final project for a class here at UIUC was to design, in a group of 2-4, a fully pipelined processor for a MIPS instruction subset (and, sub, or, beq, bne, addi, andi, j, jr, jal, and a few others). The only problem with this is the architecture is too simple. An x86 chip can do add [di],ax to add a register to a memory location and store the result back in 1 clock cycle. In MIPS it takes three: a lw, add, and sw instruction sequence. Changing this would take a major redesign; by the time you're done it isn't a RISC chip anymore. This is why most successful chips are more complex than RISCs.

Sure, just buy yourself a copy of Renoir to enter the diagram and some VHDL code, Leonardo Spectrum or some similar software to compile the VHDL for you, and ModelSim to simulate the VHDL code when you're all done. Should cost you about 30K or so. Oh, and be prepared to spend some time: simulating anything as complex as a processor is slow. It took 25 minutes last time I did a simple pipelined one, with no superscalar features. Have fun.

Of course, I agree that it's different from open source software. Intel really doesn't need to fear anything, since no one except them has the know-how and hardware to actually make the chip.

Actually, it's patents. Intel is (1) offering you the ability to interface, increasing the market and (2) offering you the ability to infringe their patents, so they can take your company:-). Of course, I don't think they're really expecting to get infringements, but it's a potential side benefit.

Another issue that hamstrung the Motorola 68k family was the fact that Motorola didn't seem to do very well improving the speed of the CPU.

I do know that Intel (and more recently AMD) are able to crank up the CPU core speed pretty easily. This shows that Intel and AMD engineers had a pretty good idea how much faster the clock speed could go without causing thermal meltdown problems. After all, how come Motorola has not gotten the PowerPC G4 CPU past 500 MHz? Shouldn't they be able to bump it up to 1,000 MHz or more?

They are just providing all the specs for their CPU well before the release date. But we have always had specs for Intel CPUs at some point - or else how are we supposed to make our assembler code work? (OK, I know not many Linux people like assembler, but that's still the programming language of the real die-hard geek, the only way to squeeze as much juice as possible out of your hardware.)

Intel did not post the masks for the chip, all they have done is publish the "databook" for the chip on the web. Now, over the years, people have spent good money buying Intel databooks, but most of the time, if you went to the offices of Intel in your city, and many major cities have them, and simply asked for them, chances are, you could get them for free.

Why? It's simple... the more people with the databook, the more people who can design stuff with it simply by pulling it off their shelf. That is part of how they built themselves up to where they are this decade.

Actually they're not based on Intel's design. The K6 series is based upon a core developed by NexGen, a company AMD purchased. The core is, at heart, a RISC processor. There is custom logic and/or uCode that converts X86 instructions to a series of RISC micro-ops on-the-fly. As for Athlons, I'm not as sure about the basis for their core, but I know it's not Intel.

I think it takes away one of the biggest advantages closed-source OSes have over open source ones. If Linux and FreeBSD aren't working as well with the new chips as Windows, it suggests there is something wrong with the open source model itself. Compare with something like SMP, where Windows had already finished supporting it when open source OSes were just getting started.

There's no reason for Motorola to crank the MHz much above 500 MHz at the moment

So with Altivec the Motorola chips whup the x86's brain dead fp. Big deal. Why don't they crank up the MHz and whup everyone's ass (Alpha, MIPS, etc) in integer and fp? There are only two reasons I can think of: Milk every last penny out of customers on a slow upgrade cycle or Motorola engineers are incompetent. Since IBM seems to get better results out of the chips I think it's a bit of both. Too bad IBM isn't looking for the mass market.

Most of Connectix's Mac products (RAM Doubler, Speed Doubler, Virtual PC, and Virtual PlayStation) have relied quite heavily on assembly language to eke every bit of performance out of their products... It's not that no one does it because of the insanity of it; it's just becoming a lost art, as the need for it becomes less and less with every new generation of CPUs.

It's funny that the best thing Intel ever did for AMD was to stop licensing them their chip and bus designs. AMD are now running circles around Intel.

Intel's Itanium (aka Merced) is such an awful design that HP (their partner) refused to use it and completely redesigned it - Intel then licensed the resulting McKinley (still IA-64) back from HP!

The latest Intel screw-ups, after all their recent chipset failures, are having to revert to Katmai (off-chip L2) slot-based PIIIs, since the socketed Coppermines are overheating and cracking the packaging. And now today, Intel are recalling about 1 million 820-chipset-based mobos because of spontaneous lockup and reboot problems...

Maybe it's time for Intel to try to license some decent designs from AMD.

Why is it that, even if it is market pressure from AMD, Intel's choice to open up its specs needs another stupid comment from the Slashdot editors? Can you all please report the news, and nothing more? Really, your personal opinions were fine back in the day, but this is a news site for the users, not your personal stomping ground. I could be wrong, and there are alternatives that I read every day. But while Slashdot is king of the hill, you could at least try to maintain your user base and be slightly more professional.

But didn't Intel usually wait before fully releasing datasheets for their CPUs? They were worried about Cyrix and AMD taking their precious instruction set and pinouts and building Intel clones. But Intel also wanted to ensure that by the time their chips hit the market, people would have motherboards and other products ready to accept them. So, they only offered the full datasheets during the pre-release period to developers under some form of NDA.

So the fact that you now don't need to sign the NDA to peek at the datasheets may imply that Intel is getting pretty close to shipping out the Itaniums en masse.

I just wish more developers released such information without the NDA, especially for devices (i.e., sound, video, and network cards, among others). What these companies don't seem to realize is that just having the information on how to interface to a device is only a drop in the bucket toward actually developing and building a functionally equivalent device. And yet, if some company's goal was to clone the part in the first place, they would reverse-engineer the hell out of the device, and the datasheets would only make their jobs slightly easier. Plus, by the time the identical device came out, the original device could be pretty much outdated.

There was a study on a Japanese company that spent all their time reverse-engineering another company's GPS receiver. By the time they figured out how it worked, and built their functional equivalent, several years had passed. Nobody would use their product, because it was too inaccurate, and the original GPS company moved onwards to a better and more accurate part.

PDF documents really suck for online reading: PDF readers generally have crappy text search, if any at all; they're not internally hyperlinked; and you're restricted in your choice of readers.

The one good thing about PDF is it looks pretty when you print it out. But I for one never intend to print these out; if Intel's past policy is anything to go by, they will give you the official bound books for free.

Hasn't Intel heard of DocBook? Second question: has Intel got a clue who the target market for these docs is, and what their needs and desires are?--

What they *have* done, which is unusual, is to release performance data at the individual instruction level *before* the actual chip has shipped.

Normally this information *would* be available at this point, but only under non-disclosure, and therefore generally would not be available for use in open source projects like gcc, etc.

Therefore the "open source" connection to this latest Intel information release is the suggestion that they are making this information *public* now at least in part because they recognize the importance of "open source" products, and they don't want the open source community to be at a disadvantage relative to those willing to sign non-disclosures.

Saying Intel is "open sourcing" Itanium because they are releasing architecture and programming info is like saying Black and Decker is "open sourcing" their toasters because they posted instructions on how to put the bread in.

At MOST this makes Intel "Open Source friendly", but I would argue that it just makes them pragmatic. How else am I going to create devices and compilers for a platform except if I have the specs? Duh.
-- Have Exchange users? Want to run Linux? Can't afford OpenMail?

"Here, I improved your chip design. There was a fundamental flaw which I immediately spotted, right there where 100 lines cross those 131 other lines and near that thing over there, so here you go, take my improvements."

I don't think so. It's like spaghetti on LSD out of hell and I can look at the blueprints but probably won't get anything more out of them than I do from abstract art.

Wow.... I can't believe it! No more mindless bandwagon-jumping from Intel, but some real dedication to the Open Source cause!

I'll tell you what I'm going to do: I've got a few ideas on how to really improve the floating point speed on these guys, so I'm gonna put together a few patches and then whip out a few chips and test 'em and see if they work. I'll submit those to Intel, for next week's CVS snapshot, and then I've got a few tweaks I want to make to the speculative execution. After I submit those patches and get them tested for a while, I'll whip up another batch of chips and put them on my website free for download!

It's the developer's guides - not the blueprints for the chip. They would be mad to release the blueprints to the chip! "Hey guys, this chip we've spent the last 2 years and $4 billion developing - here's the design - please, fab your own" - right! However, it's nice to see that they aren't charging for the guides.

AMD would be stupid to open source their top chips. They've got more to lose. Intel, getting strung along by the technical advantages of the Athlon, can use this as a great marketing ploy and say "See, we've got nothing to hide. But AMD must.."

The loser can afford to take more chances. Don't get me wrong, I like Intel chips. They're pretty darn good. But when it comes to innovation and raw performance, AMD has the top slot.

How is Intel "losing"? I'm a Mac user myself, but I know that Intel is far from losing. Doing this can only help push Itanium into the mainstream server market. As far as AMD is concerned, I'm sure they make great chips, but when the day is over, they are still based on Intel's designs.

That said, anyone planning on developing cutting-edge software should care. This information, which is normally protected by non-disclosure agreements, is usually only available to approved software and hardware vendors/manufacturers. This really levels the playing field by giving every programmer equal access to it.

Go look at the Intel website link and it will take you about 5 seconds to see that this has nothing to do with open source. No designs have been published, no schematics, not even a decent-scale picture of the insides.

In fact what has happened is Intel has published its usual array of developer documentation online. This is the same information that has been freely available for all its other processors. Yes, Intel has made these freely available online (a good thing), but even this is not new.

What we have here is a case of a journalist slapping the term 'open source' on a news item to get a bit of attention, or else a journalist who never reads the material: somehow 'technical specifications and programmer docs' has become 'blueprints', a bit of a leap of imagination if ever I saw one.

This reminds me a lot of that AltaVista 'open sourcing' which just turned out to be the HTML code for a search box.

It's already been posted multiple times how releasing specs is not open source, so I won't repeat it. (I also won't go into open source software vs. hardware.)

Is this the fate of the open source movement? Thousands of notices of victory and congratulations whenever someone releases documentation? As if having ESR on board weren't bad enough for the movement...

This behavior reminds me of natives of the American southeast (also known as "hicks", "rednecks", or "inbreds"). With a dearth of native cultural artifacts, they claim anything even remotely related to the South as a direct result of their culture. (I believe it's currently up to claiming all Internet traffic through the Atlanta switches as having benefited from them.)

Hmm... Maybe I didn't look closely enough, but all I see are programming manuals and overviews of the chip's architecture. This is much less information than they released for past chips [intel.com]. I wouldn't trump it up as "opening their hardware" unless they did something totally unheard of, like posting its complete ASIC design.

Slashdot had an article not too far back about a company doing REAL open source architecture: CPU blueprints that were available for anyone to look at and change. What Intel is doing certainly doesn't qualify as open sourcing; better to call it what it is: DOCUMENTING.

It's one of the major reasons the IA-32 platform won out over the Motorola M68k family: Motorola charged for their documentation, while Intel made it readily available to anyone who wanted it.

They've also always said they'd release the specs for IA-64, but they also said they intended to keep them under NDA until it was ready. That didn't stop them from giving access to VA etc. and funding Linux and GCC development for it, with the intent of releasing that work once the specs went public.

It's nice to see their specs are available, but there's nothing more to it than business as usual.

No, you go down to your basement foundry, fire up all the GNU-brand fab equipment you downloaded from the FSF, and you pump out your own Itanium.

The whole idea is ridiculous! This is almost the same kind of idealism that led the Chinese government during the 'Great Leap Forward' to convince people all across China to build backyard steel foundries. For the more dense idealists in the crowd itching to reply to this comment: no, it didn't work in China.

It's so ludicrous it can't help but make this website look like a bunch of damn fools for even putting up an article about it.

This used to be the case, but with modern CPU designs, it is very difficult to get the instructions in the right order to allow for out-of-order execution, keep multiple pipelines full, etc.

You can do this by hand, but your code won't be very maintainable. In assembly, there is a tradeoff between readability/maintainability and speed of execution. You're better off writing in a high-level language and letting the compiler mangle your code for you.

Of course, there are some exceptions, but compilers are getting better, not worse.