Posted
by
kdawson
on Friday October 09, 2009 @11:20AM
from the told-you-so dept.

The rumor that we discussed a few months back is looking more real. Vigile writes "Once the darling of the enthusiast chipset market, NVIDIA has apparently decided to quit development of future chipsets for all platforms. This 'state of NVIDIA' editorial at PC Perspective first highlighted the fact that the company was backing away from its plans to develop a DMI-based chipset for Intel's Lynnfield processors due to legal pressure from Intel and debates over licensing restrictions. That effectively left NVIDIA out in the cold in terms of high-end chipsets, but even more interesting is the later revelation that NVIDIA has only one remaining chipset product to release, what we know as ION 2, and that it was mainly built for Apple's upcoming products. NVIDIA still plans to sell its current offerings, like MCP61 for AMD platforms and current generation ION for netbooks and nettops, but will focus solely on discrete graphics options after this final release."

I'm not looking forward to that day. Everything done with JavaScript so far has sucked filthy penises.

Take the stupid comment slider here at Slashdot, for example. The old non-AJAX approach worked just fine. You didn't have to click "More" and then wait, click "More" and then wait, etc. hundreds of times just to see all of the comments.

And you could view the -1 comments more easily, as well. Even now I still don't know how to show the hidden comments. The piece of shit sidebar panel says "# Hidden", but I pull on the dragger thing and it refuses to move! The other one works fine, though.

My point is that AJAX front-ends like that of Gmail became "fairly typical" only after Google had released Gmail into a limited beta. Imagine Google Maps Street View using polygonal models of nearby buildings rather than still skyboxes. It'd be like the step from Myst (1993) to Real Myst (2000) [wikipedia.org].

My laptop with its built-in ATI PCIe chip recently died & I had to swap the drive out to another Dell laptop I had lying around, one whose HD had died on a friend and said friend basically didn't want it anymore. It had one of those Intel IGP chips in it. I was pleasantly surprised when it would still play NWN & Dungeon Keeper II. I was freaking shocked that it played DDO & both ran faster & looked even better than they had on the ATI PCIe chip.

I would argue Intel's strength rests partly on U.S. intellectual property laws and procedures. If the country loosened intellectual property law, Nvidia might actually stand a chance.

But this is also about a global market where 80% of product comes from maybe 10% of all possible manufacturers and there are few laws preventing Intel from doing all kinds of market shenanigans in places like China.

I know the loosening of intellectual property laws would help Nvidia's case, but I don't think it would bring about a semi-competitive marketplace because this market (global OEM) has few legal constraints.

Then there's also the whole thing of nVidia producing utter crap chipsets... That might have a teeny weeny little something to do with it.

It has nothing to do with intel's "market dominance" and everything to do with nVidia's inability to be competitive in a market segment they know little about, and the shoddy crap they try to pass off as a chipset. Once you put the koolaid down and have an objective look at their product, it simply sucks.

I've had 3 of them and all three were utter garbage. DFI, Gigabyte, ASUS, it didn't matter. Every time it turned out to be the MCP in the chipset or some other part of it failing or not working correctly to begin with. In one case the interrupt controller didn't work at all with a dual-core CPU, and on both Linux and MS Windows they had to put a "software" interrupt controller in the kernel to make it work. As you might guess, this made the multi-CPU performance _worse_ than a single CPU. And this was a chipset designed for multi-core CPUs.

nVidia was playing the same game in terms of wanting an excessive amount to license SLI to Intel. That was why Intel boards used to only support Crossfire, not SLI. ATi licensed it for a minimal fee, nVidia didn't because they wanted to push their own chipset products.

I think this is a clever ploy to make Intel play nice with Nvidia. By "letting go" of the market, true or not, Nvidia sends a message that Intel is a monopoly, which puts Intel in a much worse position (remember the EU) than it had when competing with Nvidia in the chipset arena. Obviously, it's impossible to know what's going to happen. But if I were at the top @ Intel, I'd be freaking out a little, because this tiny little company "we have crushed" (that's how Nvidia makes it look) will put us in the regulators' spotlight. I'm gonna go get some popcorn.

AMD = ATI
VIA = mostly down the small form factor road (to complement their Epia and Nano CPUs) these days
SiS = are they even in the chipset business anymore? I can't remember seeing a motherboard with a SiS chipset in the last few years

AMD doesn't make Intel chipsets, VIA's modern offerings are laughable (hit up newegg, ONE LGA775 VIA mobo), SiS seems to have gone OEM only and I haven't seen one of their chipsets in ages, so presumably the marketshare is miniscule. ATI stopped making new Intel chipsets when they got bought out (still have one old one available), and ALI seems to have dropped out of the chipset business years ago.

Yeah, they made Nvidia look bad by putting out chipsets that met spec, survived average use, and then had the gall to not hide the fact! (see http://support.apple.com/kb/TS2377 [apple.com]) I mean really, how can Intel do business like that? And people wonder why Nvidia is bailing, then trying to hide it before Wall Street notices and downgrades them more.

The 'denial' they are throwing around now states that they are not going to develop AMD chipsets anymore, not going to develop Intel chipsets anymore, and only going to continue selling the ones they have made. Until Intel stops making FSB chips in a few months, then it WILL be Intel's fault somehow.

Back to the original question, can you explain how Nvidia voluntarily stopping design of AMD chipsets is Intel's fault? :)

I saw this a year ago when I saw them stop most if not all future chipset products. I wrote it up. Nvidia denied it. A year later, they announce a stoppage for a few hours until the implications sink in. Then they deny it.

Yup. Intel. Those bastards!

I agree about the competition part, but this isn't sad, it was planned.

And pretty much every manufacturer has had screwups. That being said, nVidia has made some nice performance chipsets in the past, and it's a shame to see them go. Really, in my experience, and in terms of reliability, they have been the only company to produce chipsets that could compete with Intel.

Yup, you are right, but the same thing happened with their chipsets, same problem. Look up the recent Sony admission on the same topic, and Dell and HP along with many others. I won't keep spamming my own links/stories here; you can find them and a lot more with a little searching.

I would not say their chipsets are reliable, nor bug free, but they did have speed at times. This may be OK for a home user, but looking at the data corruption problems for their RAID setups, drive controller issues in general, networ

I'm still not clear. Are you trying to say that all of Nvidia's problems are their own, due to poor quality, and none of it is due to their legal inability to produce chipsets supporting current Intel chips?

Yes and no. Their excuse is the legal inability, but they have known that for ~2 years. Why it suddenly becomes an issue AFTER they realized they needed to publicly have a scapegoat is something you will have to ask them.

The basic problem is that there will not be any chipsets in about a year, with memory controllers, graphics and PCIe moving on-package or on-die, depending on the exact chip, but all on-die shortly thereafter. What is a chipset? SATA controller, boot ROM and USB ports? And why do I need

For the longest time, they were the only company that had decent SATA controllers. I can remember getting 300MB/sec sustained read speeds in RAID-0 with burst speeds close to 700MB/sec off their nForce 5 boards. At the time Intel's controllers (ICH9R?) were choking on SSDs and couldn't manage more than 300MB/sec burst. Sustained read was significantly lower.

No. More like their chipsets being utter crap for some years now. Always hailed as the greatest in the tests, but when you actually buy them, you notice weird things. Like the main bus not being big enough, so that an average RAID 0 can make professional sound cards crackle beyond usability. Or like the built-in NIC being so bad that you actually have to buy another one and disable the on-board one in the BIOS to avoid it crashing your OS on the first transferred packet. Things like that.

True for me. The last nVidia chipset I was happy with was the nForce 1: perfect reliability, outstanding sound. Since then I've always had small problems, like RAM sticks working everywhere except with their chipsets, heavy HD loads causing OS crashes, heavy USB loads causing OS unresponsiveness...

Funny, I've been using Macs with NVIDIA chipsets for a while and haven't noticed any of those problems. Maybe it's not the chipset so much as poor BIOS and Windows support for ACPI interrupt steering, poor chipset drivers for Windows, poor Windows drivers that spend way too much time in interrupt handlers... hmm... I think I see a common theme here... Windows....:-)

Well, as Apple made public knowledge when they switched to Intel (not an exact quote): "we develop, compile, and test OS X on multiple hardware platforms, always have since the very first day of development, including new processor platforms as they become available, and can change to an alternate platform at any time."

IBM appears to be working on a low-power P6/P7 architecture, AMD has some nice new stuff, they have their own fab now for low-power CPUs, I'm sure they're compiling against Atom and likely even Ce

Which means what the GP said. Nvidia's integrated graphics solutions come in the form of Nvidia chipsets (of which the nForce is the most common). If they're no longer making chipsets, then they're no longer making integrated graphics. There's still the possibility of a maker taking a discrete chip and adding it separately to the motherboard PCB, but with virtually every modern northbridge chip having built in graphics already I don't see that happening. The people who are satisfied with integrated will

Their only chance of getting into the CPU business is ARM. x86 is a licensing dead-end. Luckily companies like TI, Apple, Nokia, and Google are driving a wedge in there, so they might be able to get their foot in the door with those high-performance 2GHz ARM quad-cores that are supposed to come out in 2010 or 2011.

They are stopping their nForce line of chipsets (as in, northbridge/southbridge). I couldn't be the only one to see this coming a mile away, could I? Before AMD acquired ATI, they and Nvidia were perfect partners. After that they became a lot less relevant. With Intel and AMD producing their own well regarded "gamer-grade" products for some time now, I can see why Nvidia sees little point in fighting.

All of the above, plus Intel is going to put the GPU on the CPU soon. Intel is going to kill the integrated graphics market with that move, and AMD/ATI is planning on doing the same thing. So since Intel's GPUs are terrible, we will just have to wait and see what comes of this. The big impact I see is on Apple. They are really tied to Intel but have been using nVidia GPUs.

Intel will be putting graphics on the CPU, according to their roadmap.
AMD will be putting graphics on the CPU, according to their roadmap.

At that point the GPU is already a "sunk cost"; no one will buy an integrated GPU that's only slightly better than another integrated GPU. It's also not only legal reasons, but also about pricing, timing, access to resources and so on. Intel can increase license costs, do accounting so more profits go on processors, delay launches of competing chipsets, deny access to resources for working out incompatibilities or instabilities, and so on. Intel is doing extremely well and is ready to do that landgrab, one way or the other. I think nVidia is making a better play as the victim of Intel's legal department rather than being gently pushed out the door as the GPU joins the CPU.

I stopped using discrete modems and went for winmodems (softmodems) almost immediately because of the latency getting the data through the serial port [linmodems.org]. Sadly, it is the same for graphics cards, which is why you will never catch me dead with one in my machine. I will pown (sic) you all every day of the week.

Due to many problems. Reports of data corruption at the design level (not build or parts but *design* faults). Their ethernet drivers were horribly reverse engineered and never came close to the stability of the eepro1000, for example. At least on Linux.

There were issues with SATA and compatibility.

In short, they were in over their heads. Glad they finally admitted it (sort of).

Give credit where it is due. During the Athlon64 days (socket 939?), Nvidia were in a class of their own.

They were only in a class of their own because no one else was attending that school. Via was always a joke, Nvidia just provided the punchline. AMD was pulling out of chipsets at that point, and Intel had no interest in chipsets for AMD CPUs. Who then now?

AMD bought ATI, between the two of them they managed to synthesize half a decent chipset, and et voilà, Nvidia is irrelevant. Since no one on the Intel side ever had much love for NV, they managed to put THEMSELVES out in the cold.

They'd better have a compelling product with the upcoming Fermi then, but from what I hear they're trying to design their GPUs for more general-purpose computing, specifically scientific computations. It's a really big gamble and I can't see that it will be a huge market. Their upcoming products are supposed to have 3 billion transistors, which is way more than 4x the amount in an i7 CPU. It's probably going to cost a ton too.

Sure it will, but it's meant as a replacement for a clusterf*ck of metal that costs in the millions. If it can compete with small supercomputers, they have a good chance IMO. They're also attacking from below with Tegra, and with ChromeOS running on ARM, so I think Nvidia is a company to watch.

You have to look at the target market, though. Sure, they might be the deal of the century for that occasional scientist looking for supercomputer power on a budget, but in reality few regular users - hell, few extreme power users - need anything resembling a supercomputer (not just raw speed, but supercomputers are designed much more for parallel processing, and a ton of what users do is more suited to serialized processing).

Overall, I think they do indeed have a target market - I just don't see that target ma

And yet the 3 billion transistors is only 50% more than the AMD Radeon HD5800 series.

Considering that they're adding general-purpose functionality and direct C++ programming onto the chip, an extra billion transistors might not be an entirely unreasonable result. But time will tell.
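The transistor-count comparisons being thrown around here are easy to sanity-check. A quick sketch, using the commonly cited (approximate) figures for these chips; the exact numbers vary a bit by source:

```python
# Approximate, commonly cited transistor counts (not official figures).
FERMI = 3_000_000_000    # NVIDIA Fermi (GF100), per the announcement
I7 = 731_000_000         # Intel Core i7 "Bloomfield"
CYPRESS = 2_150_000_000  # AMD Radeon HD 5870 "Cypress"

# "way more than 4x the amount in an i7 CPU"
print(f"Fermi vs. i7:      {FERMI / I7:.1f}x")

# "only 50% more than the AMD Radeon HD5800 series" (closer to ~40%)
print(f"Fermi vs. Cypress: {FERMI / CYPRESS:.2f}x")
```

So both posters are roughly right: about 4.1x an i7, and about 1.4x a Cypress.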

I don't see why they couldn't go for the GPGPU and scientific computation market. They've acquired a lot of SGI and Cray IP. The x86 has been done to death. Except for more cores and a faster bus there isn't much more R&D there. And I'm not really sure why they got into the chipset business in the first place. Intel and AMD had it helmed up leaving very little for a third competitor.

Their core competences are in GPUs, they have a lot of IP there. This is valuable for negotiating licenses against the

And I'm not really sure why they got into the chipset business in the first place. Intel and AMD had it helmed up

You must have a very warped memory of when nVidia entered the chipset business. The first chipsets were before AMD bought ATI and nForce mostly killed off a terrible line of VIA chips. They were really good at their best, they're just being squeezed out of the market.

...that nVidia are at least going to make a stab at providing graphics-enabled southbridges or something... as for things like HTPCs, an Intel CPU + nVidia integrated graphics is brilliant. If I'm in the market that's looking for integrated graphics (in the case of HTPCs, power usage and space considerations), then the GPU is more important than the CPU... and I find myself being pushed to AMD for the whole platform.

Intel is really shooting themselves in the foot with all the bus licensing stuff, IMHO. By scaring off nVidia IGPs, they're left with their own mediocre offerings which, in my experience, are vastly inferior even in graphics tasks that don't involve 3D.

If nVidia can supply us with minuscule IGPs-on-a-PCIe-stick for a tenner then great, but their recent developments seem to be pushing them into niche applications (bigger and bigger GPU dies, primarily) and I'm worried an Intel platform will make me choose between an Intel IGP or a power-guzzling graphics card. Heck, pretty much every machine I've built for others in the last five years has come with an ATI or nVidia IGP because I don't know anyone that games.

Disclaimer: I have every type of GPU in my house; I use nVidia IGPs for all my HTPCs since they're the only ones that are consistently good for HD content under both Windows and Linux. Intel IGPs suck for video (my X3100 can't keep up with SD x264 scaled over a 1900x1200 screen without tearing and lag) but are fine for my laptops (low power usage preferred), and a mix of ATI and nVidia graphics cards on the machines that need 3D. I was annoyed enough when nVidia IGPs stopped appearing for AMD boards, but not having them at all will be a serious pain in the arse.

Unless you run Linux. Check the MythTV mailing list sometime, nearly every post referencing an ATI gfx chip can't get even basic stuff working. NVidia gave us VDPAU, ATI has yet to answer that one. I have no idea how ATI does in Windows for HTPCs as I don't run Windows on my HTPCs. The license alone would be 30% of the cost for the machine even if I wanted to use it. Too much for too little.

Interesting. Something on the internet that isn't true. Worse, it's on Slashdot... oh, too bad it's not the glory days when everything on the internet was true and you didn't have to worry about hoaxes or fake news stories.

So, they are stopping development of AMD chipsets, and stopping development of the Intel chipsets, leaving.... what again?

And their triumphant "no we are not" leaving statement amounts to, "We are going to sell the ones we have designed". Great. As long as Intel makes FSB chips, they can continue to trickle out older chipsets. But no new ones. And they aren't leaving. And there are no American tanks in Baghdad.

Come on, the only reason they are countering this is because the financial community is noticing,

Keeping up a sham for the stock analysts to see while the insiders bleed off shares perhaps? I don't know if this is the case, or have any evidence to back it up, but if I cared enough to look, that is where I would start.

That's great. Nvidia is outselling ATI chipsets by dumping stock of their nForce4 (that is what the MCP61 is; you'd know these things if you read the PCPer article linked in the summary), a chipset from 2006 that doesn't even support PCIe 2.0. If that's not a sign of things to come, I don't know what is.

And Nvidia is developing ONE new chipset - ION2, for Apple. Since the rest of the world is moving on to mobile i7/i5/i3, and even Atom is getting on-die graphics, I can't foresee Nvidia really investing an

Or did they? I know from experience that companies state all the time that they have no intention of doing something *ever*... until the day when they actually do what they had long planned and just wanted to keep secret.

Of course that makes such a company look like complete untrustworthy idiots. But hey, managers are managers for a reason (= huge ego. Everything that makes them look bad "does not exist"). ^^

x86 would have gone nowhere if only IBM could make PCs; only the open OEM market achieved dominance over competitors like Apple or Commodore. If Intel is not letting other people release chipsets/motherboards for its own processors while AMD is a free-for-all, any technical advantages of Core/Xeon would not be enough to stop market share slowly eroding in favor of the more open product.

NVIDIA's Ken Brown wanted to give us NVIDIA's thoughts on the current state of its chipset business. So here it is in its full text.

Hi,

We've received a number of inquiries recently about NVIDIA's chipset (MCP) business. We'd like to set the record straight on current and future NVIDIA chipset activity.

On Intel platforms, the NVIDIA GeForce 9400M/ION brands have enjoyed significant sales, as well as critical success. Customers including Apple, Dell, HP, Lenovo, Samsung, Acer, ASUS and others are continuing to incorporate GeForce 9400M and ION products in their current designs. There are many customers that have plans to use ION or GeForce 9400M chipsets for upcoming designs, as well.

On AMD platforms, we continue to sell a higher quantity of chipsets than AMD itself. MCP61-based platforms continue to be extremely well positioned in the entry CPU segments where AMD CPUs are most competitive vs. Intel.

We will continue to innovate integrated solutions for Intel’s FSB architecture. We firmly believe that this market has a long healthy life ahead. But because of Intel’s improper claims to customers and the market that we aren’t licensed to the new DMI bus and its unfair business tactics, it is effectively impossible for us to market chipsets for future CPUs. So, until we resolve this matter in court next year, we’ll postpone further chipset investments for Intel DMI CPUs.

Despite Intel's actions, we have innovative products that we are excited to introduce to the market in the months ahead. We know these products will bring with them some amazing breakthroughs that will surprise the industry, just as GeForce 9400M and ION have shaken up the industry this year.

We expect our MCP business for both Intel and AMD to be strong well into the future.

Charlie - you claim that Nvidia will be dropping their midrange graphics chipsets, but offer no explanation why. While I tend to agree with your insight, I can't see why Nvidia would be willing to give-up marketshare just to staunch the bleeding a little. I mean, what the hell else does Nvidia make money off of, aside from midrange graphics (Tegra? too early to tell. Chipsets? They're gone. HPC? Small market.)? It would be foolish to allow their one remaining profitable enterprise to languish.

Short story #1, the G200b based cards are huge and need expensive PCBs. They cost more to make than the upcoming and likely faster ATI Juniper parts, so NV will have to wrap a $20 bill around each card to make them sell. Not a long term good business plan. I can't say more because I was prebriefed on the ATI cards and agreed not to talk about them. When you read this, keep in mind that I gave Nvidia a very generous benefit of the doubt. You will

Thanks Charlie. You don't have to say any more about ATI, because the cat's already out of the bag (some site broke the Tuesday NDA). They'll be moving exclusively to GDDR5 on 128-bit bus for their midrange parts. This means that right now, they could sell a cheap 512MB 5850 with 4 memory chips for next to nothing. And once the 2Gbit GDDR5 parts ship next year, those 1GB 5770 parts can be paired with just 4 memory devices, and could probably be sold for the same cheapo $100.
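The memory-configuration math above checks out. A rough sketch, assuming the standard 32-bit-wide GDDR5 device interface (the density figures are the ones from the post, not from any spec sheet):

```python
# A 128-bit memory bus built from 32-bit-wide GDDR5 devices needs 4 chips.
BUS_WIDTH_BITS = 128
DEVICE_WIDTH_BITS = 32
chips = BUS_WIDTH_BITS // DEVICE_WIDTH_BITS  # 4 devices on the bus

def total_capacity_mb(density_gbit: int) -> int:
    """Total frame-buffer size in MB for `chips` devices of a given
    per-chip density (1 Gbit = 1024 Mbit = 128 MB)."""
    return chips * density_gbit * 1024 // 8

# 4 x 1Gbit chips -> 512 MB; 4 x 2Gbit chips -> 1024 MB (1 GB)
print(chips, total_capacity_mb(1), total_capacity_mb(2))
```

So with today's 1Gbit parts, four chips give the 512MB card mentioned, and once 2Gbit parts ship, the same four-chip layout yields 1GB.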

It's quite simple. Nvidia has graphics chips that cost $x to create. If they can't sell them for more than that due to competition with another company, then what is the point of creating said chips? Building something and selling it for a loss is a losing strategy. It's better to regroup and try again with something that makes you money.

What Nvidia needs are cheap powerful chips. What they have are expensive not-so-powerful chips that soon no one will want to buy. It's quite simple, their engineering dept.

Nope, not fired. Only one person that I know of was, and it wasn't me. Luckily, it is easy enough to search out the posters who say that, should I be bored and want to sue. It is a pretty clear case of libel, eh?

They have a huge contract with Apple as they've adopted NVidia chipsets for pretty much the entire Mac product line. Given that Jobs would preemptively shift to another chipset platform in the last round of announcements if this were even remotely true, I seriously doubt that NVidia would even think of limiting further R&D in their chipsets to Ion 2.

Unfortunately I'm used to the editors slipping at least twice a day...

You know Apple switched to the x86 architecture a while ago and uses Intel processors exclusively, right?

If Nvidia can't produce chipsets for the new Intel processors, that deal is only going to last as long as the FSBs remain marketable. As soon as DMI is the norm from high end to low end Nvidia won't be selling chipsets to anybody.

Sure, it will be a while, but that deal was doomed as soon as it was written - it is not a long term contract.

Maybe this is a sign that NVIDIA is going more towards ARM, which has always been a system-on-a-chip architecture. The Tegra lineup is a very nice product already, and with ARM going Cortex-A9 and multicore this year, maybe Nvidia just has a more important space to play in than tinkering around with x86 chipsets?

The summary and the official response say the same damn thing. Furthermore, if you had RTFA, you would know that it quotes the official statement that everyone is posting, giving a paragraph-by-paragraph critique of how it does not refute anything, just tries to spin it nicely for the stockholders.

NVIDIA currently has no plans to create any new AMD or Intel chipsets after the ION2. Period.

It looks like long-term, Intel and AMD/ATI are going to be the only games in town. That wouldn't worry me a whole lot, because I think their stuff looks good on paper, and they'll compete. And both of them are slowly advancing their open source drivers. But the key word is "slowly." If, say, you want to buy a machine to use as a MythTV box or something like that, right now NVidia is the only one it makes sense to buy. Anybody else, and you're going to have to decode your video with the CPU and read promises about how some day you might not have to. I hate reading promises.

I am not looking forward to the day when these two windows of acceptability don't overlap. What happens when you want to build a box and neither Nvidia, Intel, nor AMD has a product that can actually be used, either because they're gone (Nvidia) or their drivers aren't yet working (Intel and AMD)? That is going to suck.

I think you're confused. nVidia isn't leaving the graphics card business. Just the mainboard chipset market (allegedly). I suppose this will mean fewer integrated video solutions based on nVidia, but you'll always be able to go buy a discrete PCI Express 2.0 card for your MythTV box. And on top of that, Intel has really good open drivers for their mainboard chipsets, so the combination of the two could actually make good sense for your situation.

Apple should move to ATI/AMD and dump low-end Intel systems. The ATI 780G/790GX video systems kick Intel laptop CPU + Intel GMA video, and the Mini and iMacs need to have desktop CPUs and much better video cards. Apple can keep Intel in the laptops + ATI video and the high-end Mac Pro, and make the xMac.