Posted
by
Zonk
on Friday July 21, 2006 @06:36PM
from the curiouser dept.

Krugerlive writes "The rumor of an ATI/AMD merger/buyout has been out now for some time. However, this morning an Inquirer article has said that a merger deal has been struck and that the two companies will seek shareholder approval on Monday of next week. In the market, AMD is down as a result of lackluster earnings announced last evening, and ATI is up about 4% on unusually high volume." This is nothing but a rumour at the moment, a point that C|Net makes in examining the issue. From the article: "AMD has always boasted that it only wants to make processors, leaving networking and chipsets to others. AMD does produce some chipsets, but mostly just to get the market started. Neutrality has helped the company garner strong allies."

If it's ATi trying to buy out AMD (which is perfectly possible), then they might not have enough money left to stop nVidia doing a hostile takeover of them both. That would eliminate one of nVidia's competitors -and- give them control over the CPU that looks set to take over.

You need to bear in mind that the GPU is the critical component in most systems, but makes almost no money for the vendor and has a relatively low volume. There is precisely no reason whatsoever for AMD to want to merge with ATi or to buy them up. That would be expensive and earn them little. In fact, given how much they've made from their component-neutrality, sacrificing that might mean they'd actually lose money overall.

On the other hand, CPUs are high volume, high profit, and AMD is gaining market-share. It is an ideal target for a buy-out, particularly as ATi can't be doing that well in the GPU market. Buying AMD would be like buying a money-printing-machine, as far as ATi were concerned. Better still, AMD is a key player in bus specifications such as HyperTransport, which means that if ATi owned AMD, ATi could heavily influence the busses to suit graphics in general and their chips in particular.

(Mergers are never equal, as you don't have two CEOs, two CFOs, etc. One of them will be ultimately in charge of the other.)

If the rumour is correct, then don't assume AMD is the one instigating things - they have the most to lose and the least to gain - and don't assume either of them will be around when the mergers and buyouts finish.

There is precisely no reason whatsoever for AMD to want to merge with ATi or to buy them up.

What about (I hate that I am going to type this word) synergies? Maybe AMD thinks that they have enough in common with ATI that they could reduce redundancies after the merger (i.e. fire people and possibly sell off a fab plant) and make both companies more profitable. Just a thought.

Hmmm. The only possible overlap is in the fabrication. Designing a good graphics processor is going to be very different from designing a good CPU, so they can't overlap the development teams (which will likely be small anyway). It's very doubtful the chips would be of similar enough size and have similar enough characteristics to do much about packaging or testing. Unless they're planning on unifying the scale at which they're making the chips, it's not clear they could do much about the etching. They coul

Half the directors could be fired, but it's doubtful either CEO is going to consider their choice of senior management to be the inferior choice. Which means that one board would win and the other board will lose. On the whole, that is. The CEO of the winning board might cherry-pick a few who are really exceptional or who have given him lots of money.

What makes you think the CEO gets to choose the board? You clearly don't know shit about corporations.

It's very doubtful the chips would be of similar enough size and have similar enough characteristics to do much about packaging or testing.

Not at the moment. But with a little more miniaturisation and time, both CPU and GPU will be merged into the one chip package. This is a situation where the combined company will have more than a small edge over their rivals: by avoiding the use of (relatively) long transmission wires to communicate across the motherboard bus, speeds will increase beyond anything current

Once, CPUs did integer computation. Floating point computation was performed by an external chip or emulated with (lots of) integer operations. Now, most CPUs have a floating point unit on-die.

Once, CPUs didn't do vector computations. They were either converted to scalar operations, or performed on a dedicated (expensive) coprocessor. Now, lots of CPUs have vector units.

Once, CPUs didn't do stream processing. Now, a few CPUs (mainly in the embedded space) have on-die stream processors.

A GPU is not much more than an n-way superscalar streaming vector processor. I wouldn't be surprised if AMD wants to create almost-general coprocessors with similar characteristics that connect to the same HT bus as the CPU; plug them directly into a CPU slot and perform all of the graphics operations there. Relegate the graphics hardware to, once more, being little more than a frame buffer. This would be popular in HPC circles, since it would be a general-purpose streaming vector processor with an OpenGL / DirectX implementation running on it, rather than a graphics processor that you could shoehorn general-purpose tasks onto. The next step would be to put the core on the same die as the CPU cores.
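The scalar-vs-vector distinction behind that argument is easy to see in code. A minimal NumPy sketch, purely as an illustration of the idea (not of any actual AMD design): the same multiply-add done one element at a time versus issued as a single wide operation, which is conceptually what a vector/stream unit executes in hardware.

```python
import numpy as np

a = np.arange(8, dtype=np.float32)
b = np.arange(8, dtype=np.float32)

# Scalar style: one multiply-add per iteration, the way a plain
# integer/FP core without vector units would have to work.
scalar = np.empty(8, dtype=np.float32)
for i in range(8):
    scalar[i] = a[i] * b[i] + 1.0

# Vector style: the whole 8-wide multiply-add expressed as one
# operation -- conceptually what an n-way vector unit does.
vector = a * b + 1.0

assert np.array_equal(scalar, vector)
```

Same answer either way; the point is that the vector form maps one-to-one onto the wide execution units a GPU is built out of.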

The CPU industry knows that they can keep doubling the number of transistors on a die every 18 months for 10-15 years. They think they can do it for even longer than this. They also know that in a much smaller amount of time, they are going to run out of sensible things to do with those transistors. Is a 128-core x86 CPU really useful? Not to many people. There are still problems that could use that much processing power, but most of them benefit more from specialised silicon.

Within the next decade, I think we will start to see a shift towards heterogeneous cores. The Cell is the first step along this path.

You need to bear in mind that the GPU is the critical component in most systems, but makes almost no money for the vendor and has a relatively low volume... On the other hand, CPUs are high volume, high profit...

I beg to differ. GPUs have higher volumes than CPUs, assuming you count GPUs embedded in chipsets, along with the discrete GPUs. Just think about how often people upgrade their CPUs as opposed to their GPUs.

As for profit margins, then you have a point there, although for the wrong reasons, I

Not at all. Every computer sold has at least one CPU, but may not have a GPU at all (what use does a server have for a GPU if it's sitting in a rack and never has a screen attached?), or it might have one integrated into the chipset. The most a single system will have is two GPUs, whereas high-end machines could have many CPUs, and are unlikely to need a GPU at all.

Wow. That scenario would make for an interesting marketplace... ATi merges with AMD, gets swallowed by nVidia. Now we have one super-company, with two world-class GPU design teams, a world-class CPU maker, and a world-class chipset maker. No need to hide sources from what are now themselves, so we'd have a much better chance at proper docs or even (gasp) shared source, and the design teams can be re-organized, so one dept focuses on gaming, and another on multimedia devices (HTPC, hardware video decoding/scal

No, we'd have one entity that assumes control of most of the market (let's face it, Matrox and VIA are as important to the GPU market as ASRock is to high-end gaming) and puts out shitty but utterly competition-free products until the public is aggravated enough to risk incompatibility by defecting to a fringe product. It'd be Internet Explorer all over again, only this time even more people would have to suffer - for example, how high would accelerated Linux drivers be on the new company's priority list? As for n

I think you have a slightly skewed view of the GPU marketplace. Here are the figures from Q1 of this year:

Intel: 39.1%

ATi: 28.7%

nVidia: 18.7%

VIA: 9%

SIS: 3%

Where it really matters these days is in the laptop space. Laptop sales are set to pass desktops in the next year or two. They did for Apple last year, and they're at about 50% of desktop sales for the rest of the industry. While desktop GPU sales grew by about 25%, laptop GPU sales grew by over 30%; particularly noteworthy since most laptops d

Yes, matrox.com [matrox.com] - their niche market is video post-production, so they were never affected by the 3D wars. A good number of the 25+ original 3D chip vendors [isdale.com] are still around - most have abandoned the high-performance workstation/desktop market, if not chip manufacture, and gone into board manufacturing only and/or embedded systems - 3DLabs recently announced they were concentrating on embedded systems and laid off 100 people.

For a while, I tried maintaining a timeline of all the different 3D chip vendo

XGI is the spun-off graphics division of SIS, which bought the assets of Trident and sold off its assets to ATI a couple of months ago.

The original corporate entity S3 ended up changing its name to SonicBlue, filing for bankruptcy, and selling most of its assets to Denon and Best Data. The graphics division, however, was acquired by Via, is now operated as S3 Graphics, and is used as the basis for Via's IGP solutions. So they're still around, sort of.

I think NVidia needs to get into the processor market themselves. Maybe not for general computing, but I bet their designers have some great ideas for a processor that would be at home in a console! With GPUs being so powerful these days, I can't imagine that they lack the expertise to do it.

Well they do have the GPU going into the Playstation 3 - but IBM seems to have a lock on at least this series of next gen consoles. All 3 CPUs (Xbox 360, PS3 and Wii) have been developed in conjunction with IBM.

I could be wrong but I thought their first card to have a GPU was the GeForce 3.

I was close. It seems the first one was the GeForce 256, which did not do well because of cost and lack of performance in non-gaming applications. It seems like the idea took off once they introduced it again in the GeForce 2. If you want to read more about it, some information can be found here. [wikipedia.org]

I think NVidia needs to get into the processor market themselves. Maybe not for general computing, but I bet their designers have some great ideas for a processor that would be at home in a console! With GPUs being so powerful these days, I can't imagine that they lack the expertise to do it.

Hardly. CPU and GPU design are very different tasks at so many levels.

At the highest level, the architectures are radically different - a GPU is basically a bunch of instantiations of a minimally-programmable, cus

interesting... incidentally, i happen to work for a gpu company (one mentioned in this article even...), and we have a large number of engineers doing full-custom circuit-design work. they may not be working on custom adders (we don't need them), which is perhaps the point you were trying to make, but they are often doing some very complicated circuits nonetheless...

Circuit design is not something GPU companies do - they're given a library of gates by the fab company they use, and use those gates. In CPUs, lots of circuits are custom-designed and many things are laid out by hand.

(disclaimer: I work for one of the two big GPU companies)

Man, your information is very outdated. I would estimate that at least 25% of a current GPU is laid out by hand. CPUs definitely have more custom parts, but not more than 50-60% of the chip. The rest is synthesized u

I always thought that AMD and Nvidia were the better combo. Besides, the ATI drivers suck for Linux, where a large percent of the enthusiast market's interests lie. Isn't AMD still more of an enthusiasts' processor until it can get into one of the top vendors' machines?

Actually, the X.org drivers for ATi cards are probably the best out there. The problem is they lack support for recent ATi hardware (lacking good 3D support for vaguely recent hardware, e.g. R300 and up, though it's getting there apparently, and completely lacking any support, 2D or 3D, for the most recent R500 hardware), as ATi haven't made documentation available in a *long* time.

If you meant ATi's own drivers, yeah, they suck. But really, if ATi just made docs available, the much better X.org drivers would be able to support far more of their hardware.

The X.org drivers do support 3D, and quite well, on the older R100 and R200 cards. R300/400 are also supported for 3D, but those have needed extensive reverse engineering, and hence are not quite as mature (though getting there, apparently). Also, they have only really reverse-engineered the equivalent of the R200 feature set, so they're not getting the most out of the cards - all thanks to ATi's silly attitude about supplying documentation.

The problem is they lack support for recent ATi hardware (lacking good 3D support for vaguely recent, e.g. R300 and up

Funny way to define recent. You don't happen to be a Debian developer, do you?

I just threw away an R300-series card (ATi 9800 XT) for an nVidia SLI setup. I bought the ATi back in mid '05, and it had sat on the store shelves for half a year before I picked it up for the "free" Half-Life 2 and the then-"stable" accelerated proprietary drivers.

I game under Linux. But with an ATi card, nothing worked well or for very long. Wine, the commercial Cedega, even native games would kill the driver. I had to install nVidia dependencies for my team's 3D software - software which in the end wouldn't work without the nVidia drivers.

If you meant ATi's own drivers, yeah, they suck. But really, if ATi just made docs available, the much better X.org drivers would be able to support far more of their hardware.

I don't see that improving quickly unless somebody with a big itch to scratch builds a community like the one around nVidia. A lot of people doing games in Linux only develop and test with nVidia hardware. Not everyone can afford two $600-800 rigs with recent cards.

Once I switched to nVidia 3D, a ton of games that only worked on Windows now install and play as well as, if not better than, native on Windows. Older 3D games like Diablo 2, Warcraft 3 and Startopia fly at high frames per second (60-100+). Current-generation games like Tron 2.0, Guild Wars and Half-Life 2 get respectable fps (~30) where the ATi drivers would struggle to get 2-3 fps and often crash if anything changed the drawing state.

I hope AMD cares about open drivers...

This assumes that AMD comes out on top. Or that the ATi proprietary mindset doesn't infect AMD. On one side you have two companies that are basically chip fabbers, spewing out GPUs, CPUs and chipset engineering specs as fast and cheaply as possible. On the other you have ATi, buried deep in a race with nVidia, and AMD, who won the last round of CPU wars with x86_64. As has been mentioned by others, mergers are little more than one company eating another. I for one would not be surprised if, after any such ATi/AMD merger, the next (last?) AMD nVidia motherboard chipsets are at least 6 months to a year behind the next ATi releases.

At best, it would be interesting to see a dual-core CPU with one core a GPU and a metric ton of cache. It'd be almost like the old 486SX vs. 486DX days.

AMD, after buying out ATI, opens up the architecture or supports Linux as a 1st-tier platform.

I bet if ATI was putting out first rate drivers it might influence quite a few purchases in that direction... of course it might also push nVidia to do the same... arms races can be fun for the spectators (and consumers:) )

I bet if ATI was putting out first rate drivers it might influence quite a few purchases in that direction

Sigh. This detrimentally short-sighted acceptance of binary-only drivers that users like you have is precisely why there are no good drivers for recent ATi hardware, or most recent graphics hardware besides Intel's. And until users like yourself start demanding that vendors provide documentation, not binary blobs, graphics support will continue to suck.

Binary drivers kill kittens (thanks airlied for that one). They don't help if you run other free Unixen, they don't help if you use a non-mainstream platform (e.g. PPC, AMD64 up until recently, it doesn't help the Radeon in the Alpha I have here).

Demand DOCUMENTATION - even if it's gibberish to you personally, it will benefit you far more than binary blobs eventually...

YOU demand documentation for your other free *nixen and your mainstream platforms.

Right - cause they don't make any difference to your chosen platform, do they? Except that 99% of graphics work on Unix platforms is done in userspace, in Mesa and Xorg code, so work done by FreeBSD, Sun, etc.. engineers also tends to apply to your Linux machines (and vice versa).

You're simply an ignoramus: you're using a system (only parts of which are either Linux or Linux specific) which tens of thousands of people have don

You are so out of the loop, it's not even funny. This tiresome argument is so fucking late 1990s!!! Seriously, ever since ATI moved to a unified driver architecture known as Catalyst (equivalent to nVidia's ForceWare), problems of instability have long since vanished.

So for the love of all that's holy, please stop spreading OUTDATED INFORMATION!!!

My AMD64 desktop machine has an NVidia graphics card which works much better than the ATI rubbish built into the motherboard. But I'm not using that machine to write this. In fact, other than for occasional gaming, that machine rarely gets switched on.

Why ATI? I think there are two major reasons... First, ATI dominates the mobile market, and AMD is very weak in it. Creating a solution to compete with Intel's mobile offerings requires you to offer all the parts at a good price, and it's much harder to do that as two companies instead of one. ("Buy our CPU, we'll toss in a cheaper ATI chipset/card" doesn't work if you don't own ATI :) ). Second, nVidia, even with its recent dismal stock performance, is worth over $6B, making it a lot more expensive than A

I always thought that AMD and Nvidia were the better combo. Besides, the ATI drivers suck for Linux, where a large percent of the enthusiast market's interests lie. Isn't AMD still more of an enthusiasts' processor until it can get into one of the top vendors' machines?

I think, then, what you're looking for could come from this merger. AMD being the less expensive of the major CPU producers is a first choice for the free Unix group, and they know it. Maybe joining with ATI will cause the joined company to beco

OK, so let me first declare that I am a stockholder in both NVDA and AMD, and I've been mulling this over all day. Clearly by that, I agree with the NVDA/AMD combo. I've seen plenty of articles that tout how NVDA has benefited from "neutrality" between the two, but where's the neutrality? Green Grid? Certified desktops? AMD and NVDA have recognized each other's strengths and been building on that for a while.

First of all, this arrangement benefits NVDA as much as AMD. How? It eliminates their main compet

As much as I like AMD, I have to say that if Intel and nVidia teamed up they would probably beat the crap out of AMD + ATi.

And if AMD and ATi merge.. It sort of seems like a punch in the face to nVidia. Leaving them wanting to talk to Intel. Leading to... what?

For a long time there have been two beasts in the CPU market and two beauties in the GPU market. AMD and Intel in CPUs, and ATi and nVidia in GPUs. If they marry respectively, the offspring might have the good qualities of neither and the bad qualities of both. I think overall the consumer would probably (more than likely) lose out.

nVidia does just as well with both Intel and AMD processors. Even if AMD and ATI merged, it would be in nVidia's best interest to stay on their own, unaligned. It's not like the ATI + AMD combo would actually make something better than nVidia could for chipsets. And assuming they could, so what? nVidia would just turn it up and prove that they can compete. nVidia can always crank up the heat when they need to. They're good at that. The only concern nVidia should have is if the AMD processor line started c

Nobody will ever get to align with Intel. Intel already makes their own graphics, and, sucky as they are, they sell more of them than either ATI or nVidia. Their chipsets are already rock-solid and well-enough performing. They have all the pieces of the puzzle to sell a good "all-in-one" platform -- they've already taken over the mobile market with the Centrino platform, because they had the best mobile CPUs on top of the other things. Now they seem to be ready to attempt the same on the desktop, with the n

Even if there is some sort of a merger, it's not like that means AMD will make their processors only work with ATi cards, or make ATi cards only work with AMD processors. Well, I guess they could do that, but I'm not sure what the point would be.

As much as I like AMD, I have to say that if Intel and nVidia teamed up they would probably beat the crap out of AMD + ATi.

People say that, but I have to wonder what Intel has to gain. I mean, they're already the biggest player [reghardware.co.uk] in the graphics industry when it comes to market share, so they clearly have the know-how to build graphics chips. Sure, they don't currently go after the enthusiast market, but there might be reasons for that: 1. Lower margins - Nvidia's gross margin [yahoo.com] is under 40%, and Intel's [yahoo.com] is cl

The market's view of this is visible from the fact that ATI is up and AMD is waaaay down.

Wrong. The company doing the takeover (AMD) almost always declines -- rather noticeably, too -- and the company being taken over almost always increases -- usually because the takeover bid is at a higher stock price.

AMD is just reporting bad earnings news in a volatile, short-heavy, news-sensitive market. With companies reporting good earnings still trading downward, it's no surprise that reporting bad earnings will

Amen! Forget even Linux; nForce is seriously crippled under Windows XP also.

It was very painful trying to get a new system built with nForce4, a RAID 1 SATA array, and a SATA optical drive (the system is AMD also, but that wasn't a problem :) )

Finally I traced the problem to an incompatibility in the nForce chipset. It can EITHER support a SATA RAID array, or it can support a SATA optical drive. Doing both unfortunately causes the system to bluescreen.

I'm wondering, why are people jumping to these kinds of assumptions? Intel makes its own motherboards, chipsets, graphic chipsets, etc., but that doesn't prevent them from functioning with other manufacturers' parts. What business sense would there be in AMD making their processors incompatible with nVidia chipsets? If either were Microsoft, then maybe they could get away with it, but generally hardware/software benefits from compatibility.

There is a company out there that has an FPGA in a 940 pin socket. What about putting a GPU in it? Dual channel memory, HT link to the main processor, HT link to a DAC from the GPU [make mobos with fixed DACs on the board].

They could create a market for mainboards with upgradeable video memory... GDDR4 slots or whatever. They'd still be profitable from the market that buys the latest and greatest every six months, but they'd just be dropping in the GPU and using their existing system's video memory, reducing manufacturing costs.

I think this is bad for AMD because ATI has crappy support, crappy customer experience, and crappy drivers. Either this would vastly improve ATI or it could drag AMD down into mediocrity. If the merger does happen, I truly hope that it is the former (ATI cleaning up its act across the board), but all too often with these sorts of mergers it's the latter that happens. ATI has a lot of great technology with fast GPUs, but when the drivers suck, customer service and support are nonexistent, and they absolutely re

I am a hard-core AMD and nVidia fan. I don't have any Intel PCs in my house except those that I got as freebies, and I've never had good luck with *any* ATI card. I cringe in fear at what would (or at least could) happen to my gaming systems of the future if ATI and AMD merge. Yes, I can see some type of exclusivity where ATI cards are going to somehow be more advantageous than nVidia when it comes to gaming hardware for reasons other than plain, old competition.

Doesn't this story look like a Dilbert-ish situation - the companies themselves don't even consider merging, but because "the word is out" and "everybody knows they'll do it" it somehow becomes a reality?

Consider the recent announcement by Apple that they are going to be partnering with Nvidia for future iPod use. This could be the first step in them getting ready to switch over to Nvidia for their graphics processors, since they use Intel chipsets and ATI graphics cards currently. I'm sure AMD is bitter over Intel being picked instead of AMD for the new "Mactels" too, so I could easily see them withdrawing ATI support if the merger takes place.

At first glance, this is a stupid idea for AMD, but upon reflection, it isn't that bad. We've got to look at the 5-year picture for a deal of this size. What will AMD need to do to be more successful in 5 years than they are today? Well, despite what the teenage gamers will say, it actually doesn't mean having the highest FPS in Quake 5. The stable, highest-volume, and generally profitable sales are in corporate servers and workstations. That's Dell, HP, and to a lesser extent Gateway, Lenovo, et al. So, what do they need from AMD or Intel? They want cheap, fast, reliable supply, few defects, and ease of integration into the individual computers. After several years of the Athlon and Opteron, AMD is only now starting to get a toehold in workstations and a reasonable share of server CPUs.

IMHO, AMD would be well advised to start shipping its own chipsets, just like Intel. It just makes things easier for their most important customers, the big OEMs. They have one less vendor to worry about. There's less testing required, since presumably AMD would test the CPU and chipset together. And it's less risky for both customers and AMD, since AMD has a very strong incentive to make sure that chipsets will be available for their platform on time, whereas third parties have different priorities.

Then there's the whole GPU angle. Why shouldn't GPUs be produced in company owned, i.e. tweaked for performance, fabs? They're every bit as complex and big and expensive as CPUs. Bringing that in house should give a nice bump to performance. And what is a GPU going to be in five years anyway? On the AMD platform, all the tools are in place to allow the GPU to work much more like a cheap DSP/co-processor than we've ever seen before. If the Opteron wasn't an Itanium killer, maybe a couple Opterons and a couple "GPU-DSPs" will do the trick. Even for regular workstations, imagine just plugging a GPU into a free socket on the MB? That would fit very nicely in the middle of the graphics market... way better than integrated, but way cheaper than an add-on card.

Lastly, AMD needs a way to use the last-generation fab equipment a little longer. Making chipsets would let them use the fab equipment for an extra few years. They lost that cost efficiency when they spun off the flash business. Fab gear is expensive, so it's kind of a waste for them to be yanking it out every time the minimum for a marketable CPU moves higher.

Five years ago AMD needed partners and an ecosystem to support their own platform and survive as a company. The next five years are about turning the CPU market into a duopoly.

I have a few shares of AMD. And I'd like to see this deal happen, but only at a decent price (from AMD's point of view). Hmm... this post turned rather long...

True dat. I've always loved AMD's own line of chipsets - IME they're never the fastest out of the bunch, but they're always rock-solid stable and (naturally) are open-spec, and so work perfectly in Linux, often before release. Much like Intel and their chipsets, in fact. Contrast with ATI and nVidia chipsets (now that VIA, SiS and ULi are pretty much out of the market) - drivers are always binary blobs. True, you can generally run Linux on an nVidia chipset with open source drivers, even up to the reverse engin

Just an interesting side note: Intel has been filling its low-end motherboard lineup with ATI chipset-based systems. Check out the D101GGC: http://www.intel.com/products/motherboard/d101ggc/index.htm [intel.com] I find it odd for Intel to use a third party's chipset in their mobos, but it would be double-weird if that third party was AMD.

AMD has Centrino envy. More specifically, they need a platform strategy. Let's face it: CPUs are commodities. You buy price/performance. Recently, Intel has been using the platform to differentiate itself. Centrino is one example in the notebook world. You can see other examples with "advanced I/O" in the newer server platforms. Intel dictates the platform and can define it to suit their needs.

AMD has no platform strategy. It's at the mercy of various 3rd party chipset makers.

Stock trading volume on ATI spiked today and price went up. Volume tells you traders are looking to make some quick cash on the spread between today and the announced merger price. Increase in ATI price says people buying stock think it's a good deal for ATI.

The AGP slot has been getting faster and faster. The GPU has been getting bigger and has been doing more. There is an obvious need for a physics core and multicore CPUs. Clearly this is leading to adding the GPU to the CPU on the same chip, or at least very close to it, like the L2 cache on the Slot 1 Intel CPUs. After a certain AGP/PCIe bus speed, the AGP or PCIe slot will become less feasible, and it will be important to put the GPU as close to the CPU as possible.

Now think of the PS3. It's a revolution. It's not here yet, and its release is not being managed very well, but the ball on multicore CPUs (not just dual-core) has gotten rolling. The UltraSPARC T1 has shown the world that multicores can be real and actually work. Not to mention the fact that most computers bought today have at least a mediocre GPU somewhere in them. This means AMD needs a GPU to add to its multicore CPUs as another core. They've already added the northbridge to it, haven't they? And that has saved us money, hasn't it?

Intel has one-upped AMD recently with its Core chips, and AMD sounds like it's really gonna one-up Intel with chips that should take the market away.

We supported AMD during its long fight with Intel. We gave it its power. We can take that away in one mass consumer action.

It's been very clear for a long time that ATI are rubbish outside the fanboy wars and that you get the best bang for your buck using AMD + nForce + nVidia GPUs. That is the combination I've bought for the last few years now, and I've never regretted any of those purchases. If that were to change I guess my grassroots support for AMD may have to be realigned, although very painfully, to

For me, working in Windows is painful, and it's only usable with a dozen tools like Cygwin, Adobe, etc...

In Linux I can "just work". Booting Windows to play games isn't an option, as multiple people use this box. Whereas in Linux I can totally dominate one of my four cores with a video game, if I boot Windows they get 0 of the 4 cores to use.

Linux users are perhaps a few percent of the general gamer market, but on the other hand, they make up a substantial percentage of the professional market. If you're using your GPU to do 3D modeling, scientific visualization, etc., there is a not-insubstantial chance you're on Linux.

Linux users are only a few percent of the market. But you've got to admit, nVidia lets you know they'll support any hardware you buy from them, any way you wish to use it. They have (good) drivers for all major operating systems, and their drivers work on their entire line of cards. Even an old GF4 Ti works with the latest drivers, on all 3 operating systems, with full 3D acceleration. When a new card comes out, within a couple of months it has support for all OSes. When new Linux kernel config options bre

As you can see, you didn't need to go further; we already know you use X.Org drivers and not fglrx. Yes, those drivers are nice. Not fully featured, but open and nice. I prefer open too.

Now try using some X1800 and tell us how you like that for a difference (no magic without the crappiest driver ever, named fglrx). Then pop in an nVidia card and their drivers and tell us how that feels. Believe me, using nVidia drivers will suddenly

Intel will continue to lose the bang-for-the-buck war until they get less greedy. There is no reason to pay 20% more for the same performance or 30% more for a very small increase. Most games will not show the difference anyway, and since games are what drives the PC industry now, that's really all that counts.

Buyers of integrated graphics aren't the informed type. In the current market, nobody of a technical inclination would buy integrated graphics based on principle (even if it was halfway decent), and the uninformed people wouldn't care if it was any good.

Not everyone buying computers gives a flying fig about graphics performance. There are some damn nice boards available on the cheap with integrated graphics that are perfectly adequate for desktop use.

Buyers of integrated graphics aren't likely to be buying a large volume of video games anyhow, and no gamer that I've ever come across has stopped buying video games because copy protection is annoying. Here's a thought for the ever-whiny video game industry: I'd be much more inclined to go plop down $50 on a new game if I could run it at high resolution with all the detail on with my *AVERAGE* PC. That would impress me a hell of a lot more than awesome new graphics that only people with brand new hardwar

What is it about the Intel integrated graphics chips that makes them so crappy? I had a GeForce 4 MX 440 that was BETTER than even the highest-spec integrated Intel graphics chipset you can get today. If Intel wanted to, they could easily make (or licence from someone else) a better chipset for even the lowest-end systems, with hardware T&L and other 21st-century graphics card features, all without affecting the functionality of the chipsets or motherboards or making them cost significantly more.