I'm not suggesting, nor did I allude to, the point that he is unfair towards AMD. All I am saying is that his statement was unnecessarily strong and misleading IMO (as in: it's NOT useless, and AMD WASN'T cheating - though neither he nor I can prove otherwise; AMD said they didn't test for it like that, and now that they can and do, they are fixing it; and lastly, the claim that the second card in CrossFire does NOTHING). NONE of that is true! Plenty of people are just fine using CFX, and scaling is solid IMO outside of a few titles (and both parties have scaling issues in some titles).

I have a TON of respect for Dave and the knowledge he brings to the table; he and I have discussed this already in another thread too. I just feel the tone and words used were not helpful in bringing the reality of the situation to the people here.

It certainly isn't true that a second card does nothing. I can understand people's strong reaction when they see things like this, though.

They're not universal issues, but they are alarming for such expensive and popular hardware. (Please don't take me as suggesting that SLI is issue-free, btw - the 690 doesn't exactly flatter itself in these same graphs.)

I hear ya blibba... and it is a REAL issue. Just not everyone is affected. Some games don't show those severe frame time problems. Yes, they are AAA titles and the problem should be fixed, no doubt. The kicker is this... can you SEE the problem even though it's there? It depends on the title really.

And I don't blame you for being concerned. What I took exception to was the tone/wording of the thread here and of several articles just blowing it out of proportion.

This is my message to you guys: frametime measurements have been blown out of proportion. What you see in these charts sometimes looks alarming, with massive spikes, while your average end-user will never ever even see it on screen.
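
To put a rough number on that point, here's a minimal sketch of filtering a frame-time trace down to only the spikes a viewer might plausibly perceive. The 50 ms cutoff and the sample trace are my own illustrative assumptions, not figures from any review:

```python
# Flag only the frame-time spikes a viewer might plausibly perceive.
# The 50 ms threshold is an illustrative assumption, not a published figure.

PERCEPTIBLE_MS = 50.0

def perceptible_spikes(frame_times_ms):
    """Return (index, ms) pairs for frames slower than the threshold."""
    return [(i, t) for i, t in enumerate(frame_times_ms) if t > PERCEPTIBLE_MS]

# A trace that looks spiky on a chart: mostly ~16 ms with a few jumps.
trace = [16, 17, 16, 28, 16, 16, 35, 16, 55, 16, 16, 24, 16]
print(perceptible_spikes(trace))  # [(8, 55)] - only one spike crosses the line
```

The chart would show four visible bumps, but under this (assumed) threshold only one of them is likely to register on screen.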

I really hope AMD nail this card and have release drivers to match it. That would spur Nvidia to more action - they have reacted to AMD driver leaps in the past with improvements of their own.

I'd be hard pressed, though, to go back to dual GPU for gaming. Even with the latency issues being addressed, some titles don't work in CrossFire, and that can only be fixed at the developer level. Maybe with AMD's control over the GPUs in the next-gen consoles, though, this will be the start of a software revolution for them. They have the hardware, and if the power optimisations pay off then they'll have done an awesome job.

But just to add, I stopped using 7970 CrossFire because of the things being 'debated' above. As pretty as this card might be, it needs to match Nvidia in smoothness - not beat it in FPS.

Perhaps AMD's focus with CrossFire this time around hasn't been on improving the frame times for each game individually, but rather on the mechanism. They could've implemented something Nvidia is also using; namely frame buffering. This would probably be a lot easier to implement, hence the fairly quick release, and would work for all games/apps currently suffering from spiky frametimes.
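
As a toy sketch of what such frame metering could look like (this is purely my guess at the general mechanism, not AMD's or Nvidia's actual logic): delay each frame's presentation so the on-screen interval tracks a smoothed average, instead of presenting the instant a frame is ready:

```python
# Toy frame-pacing sketch (a guess at the general idea, not any vendor's
# actual implementation): frames leave the render queue at uneven times,
# and the pacer delays presentation so on-screen intervals track a
# smoothed average interval.

def pace_frames(ready_ms, smoothing=0.9):
    n = len(ready_ms)
    avg = (ready_ms[-1] - ready_ms[0]) / (n - 1)   # seed with the mean interval
    presented = [ready_ms[0]]
    for ready in ready_ms[1:]:
        # Present no earlier than the paced target, but never before ready.
        present = max(ready, presented[-1] + avg)
        avg = smoothing * avg + (1 - smoothing) * (present - presented[-1])
        presented.append(present)
    return presented

ready = [0, 5, 33, 38, 66, 71, 99]   # classic micro-stutter: short/long pairs
print([b - a for a, b in zip(ready, ready[1:])])           # [5, 28, 5, 28, 5, 28]
paced = pace_frames(ready)
print([round(b - a, 1) for a, b in zip(paced, paced[1:])])  # ~16.5 ms each
```

The alternating 5/28 ms intervals (which average out to a "fine" FPS number) come out as an even ~16.5 ms cadence - which is the whole appeal of metering as a general fix rather than per-game tuning.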

We all have different ideas on what we think IS better. Price/performance, performance/watt, pure performance, games bundled. Recommend what you honestly think is better. If I have a friend who has deep pockets and doesn't care about price, I'd recommend he get a Titan and be done with it. If I had the money I would too.

Maybe patents prevent this. That would make a ton of sense, actually. If Nvidia patented the tech they used (and it is very likely they did), AMD would have to develop a whole new method of doing their frame-time balancing, and that would easily, in my books, explain why it has taken so long for them to produce proper CrossFire support.

That would be odd, since the data coming out of the game engine never lines up properly between the two vendors; then you introduce mischief from both hardware and software implementations to rectify an issue in the pipeline that might not even be your problem to begin with.

Both should stick to synchronizing additional cards more efficiently, but as for imposing solutions to mask an inherent problem in the PC gaming pipeline, I'd say no. If anything, the FCAT tests show that that method scales poorly at higher resolutions.

Blah, blah, blah. You haven't read much of the info that has come out recently on the subject, have you? There is no such situation. Game devs do not make drivers. It's all the hardware's fault. Really. DirectX is "supposed" to offer an open rendering environment that makes this not an issue. That's why they got rid of "Cap Bits". Oh, what's that you say? CUDA? Nah, it doesn't limit that in any way, at all. Why did you ask?

Either way, there must have been a specific reason that AMD has these issues, and NVidia does not. I don't accept the answer they have given of "we are too stupid to look at that". I've been bitching to them directly about their issues for years. They made a choice to ignore it, and all issues now are 1000% the fault of hardware vendors. There is no denying that. The past 10 years have proven it.

But I guess AMD now seems to think that the "7990" is a viable enough option to want to release this card. I'm very interested myself, but I remember the 5970 launch, and 5870 CrossFire not getting proper Eyefinity CrossFire drivers like the 5970 had for months. Not surprisingly, back then AMD had cursor corruption issues, and they do now too. AMD has already said July for that, and that's about the same time frame as with the 5870 CrossFire issues.

I really care about products like this, because I'm an Eyefinity user, and if this card will fix the issues I have, I'll sell my current cards and get one.

I think adulaamin provided a pertinent statement in the case of this dual-chip product, because it essentially needs CrossFire working well to be outstanding. If you've followed recent reviews, CrossFire on the GCN architecture hasn't been a super strong suit for AMD lately, and latency has been brought up recently. I'm still not sure such Fraps measurements early in the pipeline are a true indicator of the actual smoothness you eyeball on the screen. Hopefully we'll see very soon.

What I was saying is that if Nvidia introduced a hardware solution like you were implying, the FCAT tests at PCPer have shown that its scaling above 1080p is horrid. So I wouldn't call that a solution.

AMD could provide a similar solution, but then you'd have both AMD and Nvidia multi-GPU configurations being equally awful beyond 1080p.
I'm pretty sure it's a niche of a niche that buys the higher-end GPUs and goes multi-GPU just to stick to 1080p.

AMD would be wise to look at how badly Nvidia is scaling in that department when the implication is that they are doing it "properly", so to speak.
They should take note, as should Nvidia themselves, and improve rather than match a less-than-preferable performance outcome for those using multi-GPU solutions or thinking about going multi-GPU in the future.

The discrepancy between Fraps measurements and the frame delivery to the screen itself was indeed there, shown by both AnandTech and The Tech Report. They also showed that Nvidia used some "secret sauce" (read: frame buffering) that caused frames delivered unevenly by the game engine to be smoothed out by the driver/hardware layer in order to deliver smooth frames on screen. This isn't THE solution to the problem, as it will cause other things like input lag to increase, because frames will sit in the buffer for a longer time, and it will probably not remove all of the juddery movement on screen, but at least the frame delivery will be a bit smoother.
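
The input-lag trade-off is simple arithmetic (all numbers below are made up for illustration): any frame the buffer holds back sits there between "ready" and "presented", and that hold time is added latency:

```python
# Made-up example: frames become ready at uneven times but are presented
# on an evenly paced schedule. Each frame's added input lag is the time it
# waits in the buffer between "ready" and "presented".

ready     = [0, 5,    33, 38,   66, 71]    # when the GPU finished each frame (ms)
presented = [0, 16.5, 33, 49.5, 66, 82.5]  # evenly paced presentation times (ms)

waits = [p - r for r, p in zip(ready, presented)]
print(waits)                    # [0, 11.5, 0, 11.5, 0, 11.5]
print(sum(waits) / len(waits))  # 5.75 ms of latency added on average
```

Smoothing out a short/long micro-stutter pattern like this means every "fast" frame of each pair gets held back, so the smoother the output, the more latency you pay on those frames.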

Some kind of buffering would, in my opinion, be the easiest kind of "cheap fix" they could do that would improve things, especially for CrossFire. However, as cadaveca also said, they might have to find a solution that is not exactly the same as Nvidia's, as a result of the patent issues that would otherwise arise (I don't have a lot of knowledge about patents myself, so I won't argue about that).

In any case, I do believe that they at least made some progress with the drivers for this card, for a lot of reasons, some stated in a previous post of mine.

I wouldn't say it's AMD cheating so much as everyone relying on Fraps' FPS numbers without looking at the video output. Fraps pulls its FPS numbers before the frame gets to the GPU, so if a frame is slow to render or gets dropped for whatever reason, both cards still get credit for it. Nvidia has hardware in their cards that prevents it. Which card is faster? It's pretty much a dead heat when it's one card vs one card, not in SLI/CF or on a dual-GPU card. It's pretty much up in the air.
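
The gap between the two counting points can be sketched like this. The 21-scanline runt cutoff mimics the one some FCAT write-ups used, but treat all numbers here as illustrative assumptions:

```python
# Fraps counts a frame when the game submits it (before the GPU/display),
# while FCAT counts what actually reaches the screen, discarding "runt"
# frames that occupy only a sliver of scanlines. All numbers illustrative.

RUNT_SCANLINES = 21  # cutoff used in some FCAT write-ups; an assumption here

frames = [  # (frame id, scanlines that frame actually occupied on screen)
    (1, 540), (2, 540), (3, 8), (4, 540), (5, 0), (6, 540),
]

fraps_count = len(frames)  # every submitted frame counts toward FPS
fcat_count = sum(1 for _, lines in frames if lines >= RUNT_SCANLINES)

print(fraps_count)  # 6 - what a Fraps-style counter reports
print(fcat_count)   # 4 - the runt (8 lines) and the drop (0 lines) excluded
```

That's how two cards can post the same FPS number while delivering a very different number of real frames to your eyes.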

Here is the thing. If you test with FCAT, a lot of the frame times and spikes you see, you won't notice as a user. It's when the spikes hit above 50ms or so that a user can notice. FRAPS can catch this...

You will notice it as low as 30ms; a 33ms frame is only 30fps territory. You might not notice it if it happens, say, once every 10+ seconds, but if it happens every other second you will notice a slight bit of stutter.
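
For reference, the arithmetic behind those numbers is just the reciprocal of the frame time:

```python
# Convert a single frame's render time (ms) to its instantaneous frame rate.
def instantaneous_fps(frame_ms):
    return 1000.0 / frame_ms

print(round(instantaneous_fps(33), 1))  # 30.3 - a 33 ms frame is ~30 fps
print(instantaneous_fps(50))            # 20.0 - the ~50 ms spikes feel like 20 fps
```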