
Editorial: Rambus

The Rambus debate is a subject which truly needs no introduction. Never have there been so many crying out against a single technology, but is it all really fair? Does Rambus really deserve all the resistance it's met with?

Van Smith writes, going through another test that shows RDRAM to be severely deficient.. so much so that, unlike Mr. Mepham ('scuse the spelling if it's wrong), he says he'd have a hard time recommending anyone use it even if it were EQUIVALENT in price to SDRAM.

Also.. he severely criticizes sites that have been posting what he calls "Rambus fluff." HardwareCentral is mentioned by name, and Sander's two columns are severely criticized as being dubious on the facts and nothing more than "pure snake oil." AnandTech also gets taken to task - though to a much lesser extent.

Rambus is probably the most hated company on earth right now, for the same reason that engineers generally make wisecracks about marketing and sales staff. When Rambus started several years ago, the new designs were intended to have only one advantage over normal SDRAM: fewer bus wires.
This would be achieved because most of the time a request for data from an SDRAM chip is active is spent internally to the RAM chip, not on the bus. Someone at Rambus got the bright idea of using this dead time to "multiplex" the 64-bit bus onto a 16-bit bus, thus reducing the pin count for the devices. The idea is sound, and there is enough dead space in SDRAM access times that it is a relatively easy thing to do. The problem, however, is that this design has nothing to do with improving the performance of the memory, only with trying to make it a little cheaper. They failed at that one, and to make matters worse, their marketing decided to push it as a superior solution (which, when it was introduced, was true), but only in a very twisted sort of way. What they said at the time was "RDRAM will be better than what you're using now." What they didn't say, and what wasn't necessarily true, was "RDRAM will be better than SDRAM will be."

In any case, Intel, Sony, and a few others are now in a tight spot. They have started development using RDRAM and have to make a decision about what to do, and you'd better believe that the decision is entirely monetary. They have to decide either to continue with RDRAM or to scrap it and redo the development with SDRAM. For some of the more complete designs, starting over would cost more than just living with RDRAM. Trying to say anything about the merits of the technology based on who's releasing hardware with it today is moot, since odds are that, whether or not they wanted to, they couldn't switch anyway - not without delaying their product another six months. Ten months from now, after the next design cycle, we can make accurate inferences about RDRAM's future based on who is releasing products with it.
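The trade-off described above - a narrow, fast bus versus a wide, slow one - is easy to put numbers on. Here's a quick sketch using the commonly quoted PC100 and PC800 figures; these are theoretical peak numbers for illustration only, not benchmarks:

```python
# Rough peak-bandwidth comparison: wide slow bus vs. narrow fast bus.
# Clock and width figures are the commonly quoted specs of the era,
# used purely for illustration.

def peak_bandwidth_mb_s(bus_bits, clock_mhz, transfers_per_clock=1):
    """Theoretical peak bandwidth in MB/s for a given bus width and clock."""
    return (bus_bits / 8) * clock_mhz * transfers_per_clock

# PC100 SDRAM: 64-bit bus at 100 MHz, one transfer per clock.
sdram = peak_bandwidth_mb_s(64, 100)                          # 800 MB/s
# PC800 RDRAM: 16-bit bus at 400 MHz, two transfers per clock (800 MT/s).
rdram = peak_bandwidth_mb_s(16, 400, transfers_per_clock=2)   # 1600 MB/s

print(f"PC100 SDRAM: {sdram:.0f} MB/s over 64 data pins")
print(f"PC800 RDRAM: {rdram:.0f} MB/s over 16 data pins")
```

The pin-count savings is real - a quarter of the data pins - but note that these peak figures say nothing about latency, which is where RDRAM takes its hit.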
So as company X, what do you do if you suddenly discover that you've paid umpteen billion dollars on a development effort, and that the entire time and money was wasted because of an inferior part? Well, if you're engineering, you get started on fixing it in the next rev (see Intel's and Nintendo's roadmaps). If you're marketing, you do the only thing you can do to recover some losses: you gloss over the negatives and try to sell it anyway.

Where is the beef? I see lots of bread in this article, but no useful info other than the links. Surely there could be a graph or two. At a bare minimum I expect some kind of technical discussion. This is after all a technology debate.

Perhaps the person that wrote this is not competent to write a technical review. If this is the case, it is a sad day for HWC. Are we now host to fiction reviews? Don't get me wrong, I like fiction, just not bundled as a technical review.

Technical discussion would have been a nice addition to a technical article.

How about this for a financial incentive: after the ink is dry, Intel realizes Rambus isn't "future tech". What do you do to cover your butt and get your "warrants"? Push Rambus as hard as you can, meet your requirement of 20% Rambus chipset sales for two quarters, and then cash in anywhere from $100 million to $500 million, depending on the stock. And let's not forget this is all profit, or as accountants might say, net profit. Don't try to tell me $100 million in profit isn't incentive, even to Intel. P.S. What was Intel's net last year?
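For what it's worth, the arithmetic behind that warrant range is simple. In the sketch below, the share count and the strike/market prices are pure assumptions chosen only to bracket the $100-500 million figure quoted in the post; nothing here comes from Intel's actual contract terms:

```python
# Rough value of stock warrants: profit = shares * (market - strike).
# Share count and prices below are ASSUMPTIONS for illustration only -
# the post itself quotes just a $100M-$500M range.

def warrant_profit(shares, market_price, strike_price):
    """Profit from exercising warrants; zero if they're underwater."""
    return shares * max(market_price - strike_price, 0.0)

shares = 1_000_000        # "1 million stock options" mentioned later in the thread
strike = 10.0             # hypothetical strike price

for market in (110.0, 510.0):   # hypothetical per-share market prices
    profit = warrant_profit(shares, market, strike)
    print(f"market ${market:.0f}: profit ${profit:,.0f}")
```

The point stands regardless of the exact inputs: the payoff scales linearly with the stock price, so pushing the technology that props up the stock is its own incentive.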

It is quite clear that Rambus is not adequate as it stands; it may be in the future. What I would like to understand is why you feel it is necessary to "defend" it in any way. A product which is not suitable for the market will fail, plain and simple.

Are you defending it because you want to get pally with Intel in the hope that they will give you a Willamette for testing?

What your latest argument seems to come down to is "Intel has always done well. They must have got it right and anyway they don't just want your money and huge market share".

Hey guys. First off, thanks for your feedback! If you're willing to take the time to comment on my article, I should be willing to take the time to give you some feedback in return, so I'll try to address what you guys have said and questioned. Here goes:

As for lack of technical data and benchmarks, that wasn't my focus with the article. It's an editorial, not a review, which I did attempt to make clear. HardwareCentral has already posted numerous technical articles, as have tons of other sites (Anand, Tom's, etc.), so I didn't see the point. Of course I could reiterate their findings, but that would be just that - reiterating. Instead, as I said in the first paragraph, I wanted to take a common-sense, editorial-style approach - I never claimed the article to be a technical piece. If that was the impression I gave, then it's my mistake, and I apologize.

As to what Intel does now - I don't know. Many of you are exactly right: whether Intel likes Rambus now or not, they're stuck with it. They're in too deep to get out now; it would throw off all their plans. However, my point was this - the initial reason for Intel's involvement was NOT monetary, in my opinion. Intel has forked out way more than they're going to get back from Rambus - sticking with SDRAM would definitely have been less costly for them. When they got involved with Rambus, I do believe they honestly thought it was the most sound technological decision. Again, whether that decision was correct, and whether they still feel that way now, is an entirely different issue. Maybe they do, maybe they don't, but they're stuck for at least the next little while anyway - you're quite correct.

Further, I clearly stated that I am in no way implying that we should blindly follow Intel because it's Intel. As well, I clearly stated that, yes, of COURSE Intel cares about money and market share. What I also stated was: at least CONSIDER the idea that that isn't the whole reason behind their working with Rambus. Maybe they really believe Rambus is the future. I'm not saying that decision is wrong or right, simply pointing out that Intel may be acting out of interest for the technology, rather than just for money. I don't know if that's the case - no one does, save for a few higher-ups at Intel - but it's a real possibility that many seem to completely ignore. All I'm asking is that it be considered.

As to 'defending' Rambus - I would defend any technology which I feel is receiving an unfair rep. As I have stated (several times), I'm absolutely NOT implying Rambus is without fault. Much of the predicament they're in now is no one's fault but their own, and to that end, they deserve it, and hopefully will learn from their mistakes. However, I do feel that some of it is undeserved. And as I clearly stated in the opening paragraph, I receive no kickbacks, no payment, no processors, not even a Rambus T-shirt. I have presented what I believe to be an unbiased article, pointing out positives and negatives, and I do not believe questioning someone's integrity is an appropriate course to take simply because their views do not coincide with others'. To future posters: by all means, debate, constructively criticize, comment, etc. - but please let's not embark on the whole 'kickbacks' journey. I don't feel it's appropriate, and I'd like to put an end to it here.

At any rate, the focus of the article was in essence this: Rambus and Intel have made some mistakes, some rather large ones. I'll be the first to admit that. However, that said, I don't think they deserve what they're getting in return. As I said, they're not these evil monsters intent on robbing everyone of their money. My point was this - at all times, do your absolute best to remain as neutral as possible. It's SO easy to hate Rambus (again, partially Rambus' fault - they've given good reason to hate them at times), but we should always, always do our absolute best to consider all the information first, and make our own decisions. My decision is that, despite popular opinion, it's a decent technology; not one without fault, but not one without promise. I simply believe it's been mishandled. If your decision doesn't parallel mine, that's perfectly alright. I'm not out to make everyone see everything the way I see it - rather, hopefully, to prompt people to make sure they consider all sides before making their own decision.

That was longer than I expected. Again, thanks to those who left comments. I will, of course, continue to defend my writings, but I do appreciate hearing others' views. If there are more concerns, please do post them, and I'll be sure to check up.

Thanks,
Dan Mepham

[This message has been edited by XkALiBRe (edited 05-30-2000).]

"In the computer industry, there are three kinds of lies;
lies, damn lies, and benchmarks."

I think it's an excellent article. Unlike the previous articles, it is objective.

I have one question, though. I don't understand the part about "Coupled with newer chipsets, and 400 MHz RDRAM, the performance is increasing."
I did not know there was a new chipset for Rambus. Is there?

Personally, I'm not convinced that Rambus will be the memory of the future. It has no advantage over current solutions (SDRAM). How will it be able to fend off future solutions, I mean beyond DDR SDRAM? I guess we'll see.

BTW, one could see all this as an attempt by Intel to control yet another part of the computer. They more or less control processors and motherboard chipsets, so why not memory too?

Well, this was definitely better than the "Rambus: facts and fantasy" dreck that preceded it-- in fact, it was quite good. Of course, overclockers.com covered all of this much, much better than anyone else has, IMO.

I would say that there are only a couple of areas where I would disagree, and I'll lay those out here.

For one, I don't hate Rambus. I don't hate Intel. I don't hate RDRAM. I do hate the possibility that control of the memory market will be in the hands of two companies, instead of open standards. Intel, in the past, has used its control of chipsets and licenses as a club against other companies (as they're doing with VIA now). I become nervous at the thought that they'll have yet another club to wield, and one that will almost certainly be used against AMD, but can also be used against any other company that doesn't toe the line. Remember how it was when Intel was the only game in town? Slow progress, and high prices. No thanks!

I am also in disagreement that Intel would consider $200 million (or more) as chump change, which is something I've seen several people say. Let me clue you in to corporate thinking, especially where shareholders are concerned: Even $1 million is not considered a small deal. And the $100 million to $500 million that the Rambus warrants may be worth is of considerable significance to Intel. You do not pull in $6 billion a year in profits by disregarding even small amounts of money.

In light of that, I do believe that Intel was driven, to a large degree, by factors other than plain old technology. I believe that they thought Rambus was a great technology, sure. But I also think that their timing coincides with AMD's emergence as a legitimate power in the desktop CPU industry. I think that Intel was driven by the desire to regain the stranglehold they have lost in the last year or so (after all, they first signed contracts with Rambus back in 1995!).

And frankly, I think any other company in their shoes would do the same thing. For all of the corporate talk of idealism, corporations are ultimately motivated by two things- power and money.

I don't think that the backlash against Rambus is due to hate as much as it is due to the aforementioned fear of Intel regaining control, as well as the prospect of paying more money for no real performance benefit. Remember, when the transition to SDRAM was being made, SDRAM was an open standard, and Intel did not force it on us; they eased the way in (their SDRAM-capable chipsets at the time, the TX and VX, also natively supported SIMMs). RDRAM was an overly aggressive attempt to take back control of the market by force. And I would resent that from any company.

AH! And I guess it should be pointed out as a clarification. Intel is not necessarily stuck with Rambus. The contracts that they signed give Intel considerable leeway and quite a few back doors in case they decide that Rambus is too much of an albatross. This should not surprise anyone- Intel is a massive company, and they'd be foolish to tie their hands where a small company like Rambus is concerned. And they won't be quick to duck out of the deal, since it would look bad PR-wise.

If anything, it seems to me as though Rambus technology was simply thrust upon us a while before it was ready. Chipset problems, problems with getting manufacturers up to speed, and the initial lack of a performance differential have all led to the same thing: people think RDRAM is overpriced and underperforms, which is absolutely true with today's systems.

Furthermore, as you said, it's going to stay that way for at least another year, until and unless Rambus finds a way to improve it. That's the exact point Tom's Hardware and others have been trying to make.

The AMD debate still remains one of the most vehemently argued in the industry, but never have so many hated a single company with such passion as Rambus.

I really don't hate Rambus or Intel. What I dislike is their way of promoting an overpriced and inferior product onto the whole market and the consumers.
If RDRAM currently had a real performance advantage over current SDRAM, even if it were still overpriced (though they could still offer a chipset for the low end), I'm sure you wouldn't see all those attacks. And I'm sure Tom's Hardware would recommend it for certain high-end systems.

So it's really not about those companies, but about a - currently - overpriced and inferior product.

I must admit that I'm a bit impressed with the brilliant way Rambus managed to force its way into the PC memory industry. It's like selling ice to the Eskimos... unfortunately for Rambus, the Eskimos just discovered that their own ice is just as good and much cheaper.

I believe that even if Dan Mepham was being honest in his editorial, he is either biased or refuses to acknowledge key points.

Think back to the P55C, the AMD K6, the P2 and Socket 7, and socket versus slot...

I remember that debate well, and participated in it on other boards.

Intel claimed that the slot architecture cleaned up electrical characteristics such as trace capacitance, inductance, and crosstalk, which on the Socket 7 platform supposedly made 100MHz operation unfeasible.

It was pure "snake oil", to quote THG (although I have choicer words). AMD had a 100MHz Socket 7 board developed and mature before Intel had BX boards out.

There was a lag - the L2 implementation on Socket 7 was limited at 100MHz, and Intel was (at the time) superior to the others in chipset implementation - so it did take a while to implement Socket 7 at 100MHz, and it was painful.

Could Intel have done it? Of course they could. Was the decision for Slot 1 merely motivated by technology? OF COURSE IT WASN'T!

The K6 surpassed Intel's sales in the OEM sector for a couple of months... Slot 1 was a technology designed to hurt Intel's competitors (AMD, Cyrix, VIA, SiS, ALi) and prevent them from stealing market share.

Now Intel has reverted back to sockets to cut costs. They at one point stated that the socket imposes too much inductance and capacitance to facilitate 100MHz operation, and that the slot was electrically cleaner. Yet we have Socket 370 boards that run P3s at a 150MHz bus... I have a socket adapter running a P3 500 at up to a 155MHz bus, through a slot...

Yes, the integrated L2 cache in the P3/Celeron and Thunderbird/Duron eliminates the need for the slot, and allows the cost savings of the socket.

But the slot move was never intended to be merely a technological improvement (i.e., moving L2 onto the CPU cartridge and allowing CPUs to reap the benefits of faster L2 cache advances); it was designed to kill competition in a none-too-subtle manner.

The Rambus introduction in the i820/840 is the same concept, all over again: a new standard designed to neutralize chipset manufacturers' ability to produce competing chipsets.

Dan... sorry, but Rambus is not a revolutionary technological advance.

If you study memory subsystems and processor advances, you'll see that new caching implementations in the P3 and other CPUs are going to 4- and 8-way associativity... as this and caching algorithms become more effective, the randomness of memory accesses also increases, and latency becomes more and more important. Van showed this in his reviews at THG, and it's NOT surprising. This is completely to be expected. And if L3 caches become the norm, and Athlon cores with integrated L2 cache start exceeding 1MB and 2MB and approach the 8MB of L2 cache AMD plans to offer, this phenomenon will become even more prominent, and memory access will become "generally" more random.
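That latency argument can be illustrated with the standard average-memory-access-time model. Every number below (cache latency, DRAM first-word latencies, hit rates) is hypothetical, picked only to show how a higher-latency memory hurts proportionally more as accesses get more random, i.e., as the effective hit rate drops:

```python
# Average memory access time (AMAT): as the cache hit rate falls
# (accesses get more random), the DRAM's first-word latency dominates
# and raw bandwidth matters less.
# All latencies and hit rates below are HYPOTHETICAL, for illustration.

def amat(hit_rate, cache_latency_ns, dram_latency_ns):
    """Expected access time given a cache hit rate and a miss penalty."""
    return hit_rate * cache_latency_ns + (1 - hit_rate) * dram_latency_ns

cache_ns = 5             # hypothetical on-chip cache latency
low_latency_dram = 45    # ns to first word, SDRAM-like (hypothetical)
high_latency_dram = 70   # ns to first word, RDRAM-like (hypothetical)

for hit_rate in (0.99, 0.95, 0.90):
    a = amat(hit_rate, cache_ns, low_latency_dram)
    b = amat(hit_rate, cache_ns, high_latency_dram)
    print(f"hit rate {hit_rate:.0%}: {a:.1f} ns vs {b:.1f} ns "
          f"({b / a - 1:+.0%} penalty)")
```

The trend is the point: each drop in hit rate widens the gap between the two memories, which is exactly why bigger caches and more random access patterns make first-word latency, not peak bandwidth, the figure that matters.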

So Rambus is not a good choice for future CPUs either. It's not a good choice for AMD chips... It has some potential in multi-CPU configurations where the system uses unified memory (as most do) and can utilize the extra bandwidth.

But DDR is coming out... PC133 can run over 150MHz in some cases, and new advances are allowing SDRAM to hit PC166 and DDR PC333 specifications. Thus SDRAM bandwidth will surpass that of Rambus in the short term, given the tremendously poor yields on PC800 RDRAM.

RDRAM offers the benefit of reduced traces on the board, just as Slot 1 reduced board clutter. But there is no performance gain to be had.

I wonder how fast SDRAM could go if the chipsets properly interleaved the memory banks, or if SDRAM worked in pairs and the RAM bus width were increased by a factor of two...
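Those "what ifs" are easy to quantify on paper. The sketch below compares theoretical peak bandwidths only, using the standard published clock figures of the day; real-world throughput would be lower across the board, and the paired-channel SDRAM configuration is hypothetical:

```python
# Theoretical peak bandwidths: single-channel PC133, a hypothetical
# paired (dual-channel) PC133 setup, DDR-266, and PC800 RDRAM.
# Standard published clock figures; illustration only.

def peak_mb_s(bus_bits, mhz, transfers_per_clock=1, channels=1):
    """Theoretical peak bandwidth in MB/s."""
    return (bus_bits / 8) * mhz * transfers_per_clock * channels

pc133        = peak_mb_s(64, 133)                          # single channel
pc133_paired = peak_mb_s(64, 133, channels=2)              # hypothetical pair
ddr266       = peak_mb_s(64, 133, transfers_per_clock=2)   # DDR at 133 MHz
pc800_rdram  = peak_mb_s(16, 400, transfers_per_clock=2)   # 16-bit, 800 MT/s

for name, bw in [("PC133", pc133), ("PC133 x2 channels", pc133_paired),
                 ("DDR-266", ddr266), ("PC800 RDRAM", pc800_rdram)]:
    print(f"{name:>18}: {bw:.0f} MB/s")
```

On paper, either doubling trick pushes SDRAM past a single PC800 RDRAM channel, which is the poster's point: the bandwidth gap is closable without abandoning the SDRAM signaling model.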

SDRAM is far from dead.

Memory performance does bite, and a better alternative to SDRAM "must" be found. But RDRAM offers very little in performance gains.

It is an absolute shame that users have fragmented into groups to argue this point. As other users have stated, it's about money and power. And other companies in Intel's shoes would do the same, including AMD. It's up to us as the buying public to support the best alternatives.

And right now, RDRAM and the i820 offer very little in return for exorbitant amounts of money. The price of OEM PC133 SDRAM is $160 CDN via resellers; that is about $110 US. Where the $200-300 figure came from, I have no idea, but it's not accurate. The cost differential is huge.

This is a moot argument...

Want something valid? List 10 examples of Intel and Microsoft utilizing their monopolies to ensure the success of technologically inferior products, and abusing their power to force distributors to back them and not the competition.

I think Saruman TWC hits the bull's-eye with his interpretation of Intel's motives. Many of Intel's decisions since AMD became a serious player have been motivated by the need to segment the market. By separating the platforms, Intel can use its greater mass and marketing muscle to its advantage and maybe regain some of its lost monopoly power. The competitor loses the ability to "free-ride" on (for example) quality chipsets developed for Intel CPUs. Intel must have hoped that by betting on RDRAM it could over time choke off SDRAM and leave AMD a hostage to an obsolete technology. AMD would then have to make a panic turnaround to a technology where Intel has a good head start. The fact that AMD holds a licence to RDRAM shows that they see at least a marginal chance of that happening.

As it seems now, Intel's strategy has not worked as well as they hoped. AMD has gathered enough mass to keep up the development of chipsets for its products, and SDRAM seems to be doing better than ever.

Dan, you say you think that the reasons for Intel choosing RDRAM were not financial. Then what might those reasons have been? Environmental? Moral? Ethical?

Of course all decisions of just about any enterprise are made for financial reasons. Technologies play a role only to the extent that they serve the bottom line. Firms that choose technologies for technology's sake go out of business.

But I take it you mean that the financial reason for Intel choosing RDRAM was not those 1 million stock options, and there I have to agree. Of course $500 million (or whatever the prize may end up being) is a lot of money, and if Intel were to stumble on it, they would pick it up. But is it enough to make Intel gamble the future of its CPU business on? Not by a long shot.

The real money is elsewhere. It is in the long-term strategic goal to wrongfoot AMD back to realms of low-end market with small margins. That's worth gambling for. Whether that leaves us with the better or worse memory technology is secondary.

"The real money is elsewhere. It is in the long-term strategic goal to wrongfoot AMD back to realms of low-end market with small margins. That's worth gambling for. Whether that leaves us with the better or worse memory technology is secondary."

Many will recall Intel's insistence on pushing SDRAM back in the days of EDO. At the time, SDRAM didn't appear much faster than EDO RAM, but look at it now. Likewise, back in the days of the 486 DX33, Intel planned to start using a bus multiplier and took flak from analysts saying the performance increase would be negligible, but clearly Intel was right, and the analysts were wrong.

Hmm ... I must be the only moron then to have thought both were good ideas, hence the fact that I am still running a VX motherboard (with SDRAM) which has ALWAYS since its introduction been THE FASTEST FRIGGIN WAY TO RUN A PENTIUM.

And when was a significantly faster CPU considered to provide "negligible performance increases"? Are you saying that analysts said a 486 DX2/50 was no better than a 486/25? (Or maybe you are referring to the 486 DX/50 compared to the 486 DX2/66? At the same bus speeds, the 486 DX2/66 was always faster.)