If you guys remember, a while back I bought a 3DTV. I had tested it with the PS3, and while it was "neat" at first, it ultimately was just not viable with a PS3.

But now I have a GTX 580, and nVidia has a 3D thing, so I'm gonna hook up my gaming PC to my 3DTV and see how that works out.

I'm gonna test out Skyrim and Borderlands in 3D. If anyone has any specific requests of games to try in 3D, post here and I might try them.

Question...what do I have to do to get nVidia 3D working? Do I have to download something, or what's the deal with that? :what:

Noct

12-09-2011, 06:39 PM

If you guys remember, a while back I bought a 3DTV. I had tested it with the PS3, and while it was "neat" at first, it ultimately was just not viable with a PS3.

Yeah man, gotta go ahead and disagree with you there...

I was rather unimpressed with 3D after getting my set, but as a pretty big fan of the format in general, I kept at it, and let me assure you, there are definitely games that do it very well on Ps3.

Outside of Uncharted 3, I think I've played just about every 3D game available on Ps3, and while there are surely some lackluster uses of it, there are some outstanding ones too.

I'll wholeheartedly agree that Killzone and Resistance are both absolute balls in 3D, but on the other hand:

MotorStorm Pacific Rift 3D is fan-freaking-tastic. There is a great sense of depth, some stuff actually comes out of the screen, and there is barely any noticeable ghosting or loss of PQ.

I would say absolutely the same thing for Crysis (and surely C2 since it uses the same engine). It works and looks gorgeous in 3D, and has a fantastic sense of depth.

Wipeout HD has nice 3D, but personally I freaking hate that game, so it doesn't do much for me, and similarly, Ridge Racer 7 has some pretty good effects too, but is just as big of a snoozefest for me.

Shadow of the Colossus is spectacular, Batman AC is very good...

I'm sure there's some other stuff I'm forgetting too, but the point is, you really shouldn't dismiss it on Ps3 man.

It took me a week or so of tweaking and calibrating my TV before I got it to work right, but once I did, I've been thrilled with the results. One thing I learned later in the game was to go in and edit my 3D perspective settings on a game-by-game basis (mine goes from -5 to +5), and that made a HUGE difference in ghosting and screen-pop.

Some of the stuff that I thought had horrible 3D effects at first (Motorstorm for instance) looks absolutely fantastic once I set my TV right. MotorStorm:PC3d (and the Deep Sea 3DBD) is actually my goto to show people 3D on the TV, and so far, most people have been pretty freaking impressed.

The one thing I have to ask is, why is it you're expecting a different result from doing it through a PC? I mean, outside of possibly getting a frame-rate or resolution boost (depending on your hardware), what's the difference? :what:

TwoPlusTwo

12-09-2011, 08:22 PM

Yeah man, gotta go ahead and disagree with you there...

I was rather unimpressed with 3D after getting my set, but as a pretty big fan of the format in general, I kept at it, and let me assure you, there are definitely games that do it very well on Ps3.

When I hooked up my PS3 to my 3DTV the framerates were absolute shit. Games were pretty unplayable.

I mean, it might have gotten better by now, but the launch titles/demos I played, the framerate was really bad.

The one thing I have to ask is, why is it you're expecting a different result from doing it through a PC? I mean, outside of possibly getting a frame-rate or resolution boost (depending on your hardware), what's the difference? :what:

Well, I wouldn't say it's gotten better as much as a couple of games seem to have actually given a crap about it during dev and not just tacked it on at the end to put it on the box. (I'm looking at you Resistance)

Thing is, frame rate has not been a problem for me in anything... Granted, I'd love to be playing everything in true 1080p, but the main issue I have is ghosting, not rates/rez.

Poor frame rates usually give me really bad motion sickness, and I don't recall that happening to me a single time while playing in 3D. The issue with stuff like R3 is that the ghosting/double-image is so freaking heinous that you literally have to close one eye to line up a shot from any more than 2 feet away from the target, which is obviously absurd...

But hell, if you're this impressed at the difference, maybe I should really look into improving my rig, as I am happy with Ps3 as it is. If it can really get that much better, I'm game...

TwoPlusTwo

12-09-2011, 10:00 PM

But hell, if you're this impressed at the difference, maybe I should really look into improving my rig, as I am happy with Ps3 as it is. If it can really get that much better, I'm game...

The difference is beyond astounding. Comparing 3D gaming on PS3 to 3D gaming on a high-end PC rig is a joke.

It would be like comparing Super Nintendo graphics to Playstation 3 graphics. It's a whole new ballgame.

FWIW keep in mind that my GPU is a GTX 580 w/ 3GB VRAM. Pretty much the most expensive GPU you can buy right now. Over $500. So YMMV.

But yeah, these 3D PC games are blowing my mind. :yippee:

railven

12-09-2011, 10:03 PM

The difference is beyond astounding. Comparing 3D gaming on PS3 to 3D gaming on a high-end PC rig is a joke.

It would be like comparing Super Nintendo graphics to Playstation 3 graphics. It's a whole new ballgame.

FWIW keep in mind that my GPU is a GTX 580 w/ 3GB VRAM. Pretty much the most expensive GPU you can buy right now. Over $500. So YMMV.

But yeah, these 3D PC games are blowing my mind. :yippee:

You'd want this for 1080p 3D gaming, THE most expensive graphics card you can buy ;)

* Borderlands - At first I was wondering what the fuck was wrong with this game in 3D. I had some kind of strange issue...it seemed like ghosting/crosstalk, but it was "different." It was like the game was trying to render shadows in 3D.

So I went into options and found something called "Dynamic Shadows" (or something like that) and turned that shit off. Now Borderlands runs great in 3D. It's very cool.

* Crysis 2 - This is by far the most impressive 3D PC game I've played so far. Words cannot even express how badass this game looks with nVidia 3D. It blew my mind; it was incredible. You'd just have to see it to believe it.

Since I'm off work for like 3 weeks, I guess I'll just have to test more games lol. :thumbsup:

TwoPlusTwo

12-09-2011, 10:08 PM

You'd want this for 1080p 3D gaming, THE most expensive graphics card you can buy ;)

http://www.newegg.com/Product/Product.aspx?Item=N82E16814130693

$750 :eek:

Doesn't the GTX 590 have some kind of severe problem where it literally catches on fire?

I've seen YouTube videos of 590s going up in smoke. Literally.

railven

12-09-2011, 10:11 PM

Doesn't the GTX 590 have some kind of severe problem where it literally catches on fire?

I've seen YouTube videos of 590s going up in smoke. Literally.

Launch units, like the GTX 480 in the past - revisions addressed all the issues (I'd hope.)

But the GTX 590 has its own slew of issues, like every other videocard. Still, that or 2x 580 3GBs is the only way to get 1080p in 3D for the games that matter. You tap out of VRAM/GPU juice with a single card.

TwoPlusTwo

12-09-2011, 10:17 PM

Launch units, like the GTX 480 in the past - revisions addressed all the issues (I'd hope.)

But the GTX 590 has its own slew of issues, like every other videocard. Still, that or 2x 580 3GBs is the only way to get 1080p in 3D for the games that matter. You tap out of VRAM/GPU juice with a single card.

Well tbh at this point I'm kinda considering a dual-monitor setup...

...where both monitors are 50" plasma HDTVs. :bowdown:

I have no idea how it would work or what I'd need to do, I'm just tossing ideas around in my brain... :p

kamspy

12-09-2011, 10:18 PM

Screw the 590. The -only- way to go is an ASUS GTX 580x2 MARS (http://www.newegg.com/Product/Product.aspx?Item=N82E16814121470&Tpk=asus%20mars)

Two full powered 580s. 3 GB vRAM. No smoke. No limiters. Basically what the 590 should have been.

TwoPlusTwo

12-09-2011, 10:22 PM

Screw the 590. The -only- way to go is an ASUS GTX 580x2 MARS (http://www.newegg.com/Product/Product.aspx?Item=N82E16814121470&Tpk=asus%20mars)

Two full powered 580s. 3 GB vRAM. No smoke. No limiters. Basically what the 590 should have been.

Yeah kams, I've been looking at that link since you sent it to me on Steam.

I could run the 3-monitor setup with it?

Even if the "monitors" are plasma HDTVs?

kamspy

12-09-2011, 10:24 PM

Yep. Only 1,000 were made, so you might want to hop on it.

railven

12-09-2011, 10:42 PM

Haha, if you're going to drop $1500 on a Mars, just get tri-sli GTX 580's and really be done with it.

Guess kammie is getting another upgrade soon ;) haha.

kamspy

12-10-2011, 12:39 AM

I'm sure 2+2 can do the math on how much a certain number of 580 GTXs costs. The ASUS Mars has a lot of creature features that stacking regular GPUs doesn't, without sacrificing clock speeds or voltage.

No upgrade here. If he ordered that, I'd help him run it with his current GTX 580 3GB in Tri SLI. I have no need for 580 SLI on 1080p mang. Just having one is the best experience I've ever had with a video card.

It's the third "high end" card I've bought close-ish to launch, the others being the 7800 GTX and an HD 4870. Both were pretty top of the line cards when they came out. I remember paying over 5 bills for the 7800 at least. I got the 4870 for cash and trade with some PS3 accessories from my buddy, who instantly wanted a 4870x2. Got a good deal there. He's had a hard run the past couple years like me and is still using the 4870x2, and I'm passing the 460 along to him since 2+2 passed it along to me. That GPU is gonna set some type of record for happy homes.

But those cards had weak spots from the get go. The 4870 was a real bitch with Crysis at first. The 7800 had some stinkers iirc. But man, the 580 just cuts through anything. 4x4 Sparse Grid Super Sampling on MW3? No prob. 60fps. BF3 Ultra FXAA and HBAO MP at 60fps? No problem.

I'm floored by this card. It doesn't have that traditional Nvidia oven inside it either. At maximum stress, when overclocked and overvolted, it doesn't hit 85. Most games don't even bother it enough to kick the fans on, and it stays in the high 50s.

kharaa

12-10-2011, 12:43 AM

I do the same thing kams.. I got that 6870 from 2+2 and I'll pass my 5850 down to someone in need of it, just like I did my 460 and my 275. Both of them have been given away for free to people in need. :)

kamspy

12-10-2011, 12:48 AM

I do the same thing kams.. I got that 6870 from 2+2 and I'll pass my 5850 down to someone in need of it, just like I did my 460 and my 275. Both of them have been given away for free to people in need. :)

Yup. I always try to pass around the love when I get it. I'm pretty much making this guy's christmas.

The legend of Rambo's 460 lives on. It's a beast too. Freak of a chip. I ran it at 950mhz without a sniff of instability and it never reached 70c. Fucking wonderful card. I love it, but I'm not gonna use it as a PhysX card for the dozen games that support it when it would change my friend's entire gaming experience.

kharaa

12-10-2011, 12:57 AM

Yup. I always try to pass around the love when I get it. I'm pretty much making this guy's christmas.

The legend of Rambo's 460 lives on. It's a beast too. Freak of a chip. I ran it at 950mhz without a sniff of instability and it never reached 70c. Fucking wonderful card. I love it, but I'm not gonna use it as a PhysX card for the dozen games that support it when it would change my friend's entire gaming experience.

What are you upgrading to?

I'm not sure who is gonna get my card at this moment. I usually try to keep it local.. but I'd change that rule for one of the guys here as well.

In hindsight.. my friend's 260 Superclocked 216-core would make an excellent PhysX card for me.. if I knew what the heck he did with it. (Me and him freely trade computer parts, as he upgrades like twice a year, because he has money to blow. Sucks with computers though LOL)

kamspy

12-10-2011, 01:07 AM

I bought 2+2's refurbed 580.

railven

12-10-2011, 06:01 AM

I'm sure 2+2 can do that math how much a certain number of 580 GTXs cost. The ASUS Mars has a lot of creature features that stacking regular GPUs do not. Without sacrificing clock speeds or voltage.

Do tell, what are these creature features you speak of?

TwoPlusTwo

12-10-2011, 06:14 AM

is the only way to get 1080p in 3D for the games that matter

Wait...AFAIK you can't actually play 3D PC games in 1080p, you have to use 720p.

When I tried running a game in 1080p, it says "this resolution is not supported for 3D, please change to 720p" or something like that.

From what I've been tinkering with, you *might* be able to run 1920x1080@24 hz...but what the hell kind of TV has 24 hz? :what:

The 3DTV I'm using has 48 hz, 60 hz, and 96 hz options.

My Kuro Elite uses 72 hz for 24p playback.

Basically 24 hz doesn't actually do 24p very well from what I understand, so TV manufacturers use frame packing and stuff for 24p movies.

So, the way I understand it, you can't play 3D PC games in 1080p, you have to use 720p.
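The frame packing mentioned above can be made concrete. HDMI 1.4's frame-packed 3D modes stack the left- and right-eye images into one tall frame with a blank "active space" gap between them. The sketch below uses the gap sizes from the HDMI 1.4a spec as I recall them (45 lines for 1080p, 30 for 720p); treat the constants as reference values rather than something the thread verifies:

```python
# Frame packing stacks both eye views into a single tall frame:
# [left eye] + [active-space gap] + [right eye].
def frame_packed_height(active_lines: int, gap_lines: int) -> int:
    """Total active height of a frame-packed 3D frame."""
    return 2 * active_lines + gap_lines

# 1080p frame packing produces a 1920x2205 frame per refresh.
print(frame_packed_height(1080, 45))  # 2205
# 720p frame packing produces a 1280x1470 frame per refresh.
print(frame_packed_height(720, 30))   # 1470
```

Each refresh therefore carries roughly twice the pixels of a 2D frame, which is why the frame-packed 3D modes sit at lower refresh rates than their 2D counterparts.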

railven

12-10-2011, 08:18 AM

Wait...AFAIK you can't actually play 3D PC games in 1080p, you have to use 720p.

When I tried running a game in 1080p, it says "this resolution is not supported for 3D, please change to 720p" or something like that.

From what I've been tinkering with, you *might* be able to run 1920x1080@24 hz...but what the hell kind of TV has 24 hz? :what:

The 3DTV I'm using has 48 hz, 60 hz, and 96 hz options.

My Kuro Elite uses 72 hz for 24p playback.

Basically 24 hz doesn't actually do 24p very well from what I understand, so TV manufacturers use frame packing and stuff for 24p movies.

So, the way I understand it, you can't play 3D PC games in 1080p, you have to use 720p.

Care to make a wager on that? ;)

3D rendering requires doubling the frame rate for equal refreshes per eye - so for 60 frames per second 3D gaming you need a 120hz monitor minimum, which the new 3D Vision kits support.

Because each frame is being rendered twice, you need a bucketload of memory AND GPU processing, which is why you need SLI to hit 1920x1080 for 3D rendering.

Just because you can't do it or don't have the appropriate hardware, doesn't mean it isn't doable.

EDIT: Hint: to get the best performance, nVidia wants you to use their kits (i.e. 3D Vision approved glasses, 3D Vision approved display, and 3D Vision approved hardware) - you'll run into limitations using products outside that scope without registry tweaking.
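The doubled-refresh rule railven describes is simple arithmetic; a minimal sketch (the helper name is mine, not anything from the 3D Vision software):

```python
def min_display_hz(per_eye_fps: int) -> int:
    # Active-shutter 3D alternates left/right frames, so each eye only
    # sees every other refresh; the panel must run at twice the
    # per-eye frame rate.
    return 2 * per_eye_fps

print(min_display_hz(60))  # 120 - smooth 60 fps 3D needs a 120hz display
print(min_display_hz(24))  # 48  - a 48hz mode yields only 24 fps per eye
```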

TwoPlusTwo

12-10-2011, 08:21 AM

Care to make a wager on that? ;)

3D rendering requires doubling the frame rate for equal refreshes per eye - so for 60 frames per second 3D gaming you need a 120hz monitor minimum, which the new 3D Vision kits support.

Because each frame is being rendered twice, you need a bucketload of memory AND GPU processing, which is why you need SLI to hit 1920x1080 for 3D rendering.

Just because you can't do it or don't have the appropriate hardware, doesn't mean it isn't doable.

EDIT: Hint: to get the best performance, nVidia wants you to use their kits (i.e. 3D Vision approved glasses, 3D Vision approved display, and 3D Vision approved hardware) - you'll run into limitations using products outside that scope without registry tweaking.

I think I understand what you are saying...to do 1080p 3D I'd need a different 3DTV and another 580 via SLI?

I don't understand...what if I bought a 3DTV with 120 hz, why would that not work? I don't get it.

Also, I just figured out that I can force 1920x1080@24 hz via nVidia Control Panel, but then every game plays at a locked 24 fps. :p :lol:

railven

12-10-2011, 08:37 AM

27 inches?!? Am I supposed to view it with a telescope? :confused:

I wouldn't want to go below 50 inches.

I don't understand...what if I bought a 3DTV with 120 hz, why would that not work? I don't get it.

Also, I just figured out that I can force 1920x1080@24 hz via nVidia Control Panel, but then every game plays at a locked 24 fps. :p :lol:

Because that is how nVidia makes money. You have to pay a premium for SLI options on a motherboard, they lock out PhysX from working with competitor's cards so you have to buy two GeForce cards (if you want Dedicated PhysX.) It's just how they roll.

There is 3rd party software you can use, but in the end the driver will throttle down if the IDED tags don't match.

So you got three options:
a) suck it up and use 24hz forced
b) hack the drivers
c) buy a 3d Vision monitor

Welcome to team Green :). (note: you need their 3D glasses which won't work with your TV.)

TwoPlusTwo

12-10-2011, 08:53 AM

Because that is how nVidia makes money.

I thought they made their money on this 3D thing by making me purchase the $40 3D drivers after my 14-day free trial?

You're saying that they also want me to buy their monitors as well?

I mean it seems like they could make plenty of money since I'd have to buy another GTX 580...

So you got three options:
a) suck it up and use 24hz forced

That's not a real option because doing this locks your framerate at 24 fps which is totally unacceptable.

b) hack the drivers

That sounds fine but unfortunately I don't have a Ph.D. in Computer Science. :p

c) buy a 3d Vision monitor

No way am I playing on some tiny-ass 27" screen.

~~~~~

I guess that leaves me with option "d"

d) Keep playing 3D games like I'm doing right now to decide whether or not the 3D effect is worth the drop to 720p.

My free trial lasts 14 days so I should have plenty of time to decide that.

railven

12-10-2011, 09:12 AM

I thought they made their money on this 3D thing by making me purchase the $40 3D drivers after my 14-day free trial?

You're saying that they also want me to buy their monitors as well?

I mean it seems like they could make plenty of money since I'd have to buy another GTX 580...

They make money from selling their products and royalties. In the end - it's just software (registry keys) restricting you from using their products in a manner that would benefit you, the end user.

You have to pay $40 for software to allow your card to pass through 3D rendering to your TV? The card isn't doing anything it doesn't normally do. Haha. It's just passing the doubled-up video to your TV, yet - you have to pay for that? It'd be the same as if Sony charged you a fee for the 3D playback option.

That's not a real option because doing this locks your framerate at 24 fps which is totally unacceptable.

Hack the driver ;)

That sounds fine but unfortunately I don't have a Ph.D. in Computer Science. :p

It's called the internet; people already did it. You just apply a registry hack. What, you think I re-wrote the GeForce driver so I could use it with my Radeon? All I did was click "Apply." Done.

No way am I playing on some tiny-ass 27" screen.

But, but, but, it's the only way. ;)

~~~~~

I guess that leaves me with option "d"

d) Keep playing 3D games like I'm doing right now to decide whether or not the 3D effect is worth the drop to 720p.

My free trial lasts 14 days so I should have plenty of time to decide that.

:eek: Or, you can hack the driver, get 3rd party software, and do it better - FOR FREE!!!! Haha.

Course, not knowing your setup - unless you got 120hz, you might be able to do 96hz, so 48FPS in 3D.
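To make that last point explicit, per-eye frame rate in alternating-eye 3D is just half the display refresh; applied to the refresh options the OP listed earlier (48hz, 60hz, 96hz):

```python
# Per-eye frame rate is half the display refresh in alternating 3D.
tv_modes_hz = [48, 60, 96]  # refresh options the OP's 3DTV reports
per_eye = {hz: hz // 2 for hz in tv_modes_hz}
print(per_eye)  # {48: 24, 60: 30, 96: 48}
```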

TwoPlusTwo

12-10-2011, 09:22 AM

It's called the internet; people already did it. You just apply a registry hack. What, you think I re-wrote the GeForce driver so I could use it with my Radeon? All I did was click "Apply." Done.

Or, you can hack the driver, get 3rd party software, and do it better - FOR FREE!!!! Haha.

Well that sounds pretty good. :)

Any hints/ideas on what I should be Google-ing to find these hacks?

railven

12-10-2011, 09:30 AM

Well that sounds pretty good. :)

Any hints/ideas on what I should be Google-ing to find these hacks?

Bit-torrents. :)

John Rambo

12-10-2011, 10:41 AM

Hmm... I don't even own a 3D TV yet. Although I will soon... The TV I get for my bar will be a 50" 3D Samsung Plasma. Maybe then I'll try this out just for fun

And if I like it, pick up a Sharp Elite PRO-70X5FD for eight grand :D

TwoPlusTwo

12-10-2011, 11:43 AM

Tried out Metro in 3D. Incredible.

The particles are amazing. Seems like you can reach out and touch them...

TwoPlusTwo

12-10-2011, 11:47 AM

And if I like it, pick up a Sharp Elite PRO-70X5FD for eight grand :D

I was just thinking, I might eventually end up with the 60" version of that if I don't get tired of the 3D thing and also figure out this 1080p hack that railven says I can find on bit-torrents...

kamspy

12-10-2011, 12:23 PM

Care to make a wager on that? ;)

*mostly FUD*

HDMI 1.4 is only capable of 1920x1080/24 in 3D ;) It's just the HDMI chipset holding us back.

No hacked drivers, unless you can hack the HDMI chipsets in the GPU and TV.:lol::lol::lol:

Another sad Radeon owner living with substandard features. Even the vaunted 7000 series won't have much 3D support. :(

Third party drivers throw out all the hard work Nvidia engineers did making non-native 3D games awesome in 3D.

John Rambo

12-10-2011, 12:28 PM

I was just thinking, I might eventually end up with the 60" version of that if I don't get tired of the 3D thing and also figure out this 1080p hack that railven says I can find on bit-torrents...

Can someone explain to me WTF is the story with the 720p thing? Why can't you run 1080p for 3D gaming?

kamspy

12-10-2011, 12:30 PM

Can someone explain to me WTF is the story with the 720p thing? Why can't you run 1080p for 3D gaming?

HDMI 1.4 only supports 1080p/24 in 3D.

Only way to get 1080p/60 3D right now is with a dual-link DVI display (PC monitor, very few DLPs).
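kamspy's numbers line up with the standard video-timing math, sketched below. The constants (2200x1125 total for 1080p including blanking, per the CEA-861 timings; a 340 MHz TMDS ceiling for HDMI 1.4) are spec figures as I recall them, not something the thread verifies. The takeaway: frame-packed 1080p60 needs roughly a 297 MHz pixel clock, a mode HDMI's 3D format list didn't define until HDMI 2.0.

```python
# CEA-861 1080p timing: 2200x1125 total pixels per frame, incl. blanking.
# Frame packing doubles the vertical total for the second eye view.
def pixel_clock_mhz(w_total: int, v_total: int, hz: int,
                    frame_packed: bool = False) -> float:
    v = 2 * v_total if frame_packed else v_total
    return w_total * v * hz / 1e6

print(pixel_clock_mhz(2200, 1125, 60))        # 148.5 - plain 1080p60 2D
print(pixel_clock_mhz(2200, 1125, 24, True))  # 118.8 - 1080p24 frame packed
print(pixel_clock_mhz(2200, 1125, 60, True))  # 297.0 - 1080p60 frame packed
```

Note that 1080p24 3D actually needs less bandwidth than ordinary 1080p60 2D; the real ceiling is the mandated 3D format list, not raw wire speed, which is why a dual-link DVI monitor could do 1080p60 3D while an HDMI 1.4 TV could not.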

John Rambo

12-10-2011, 12:35 PM

HDMI 1.4 only supports 1080p/24 in 3D.

Only way to get 1080p/60 3D right now is with a dual-link DVI display (PC monitor, very few DLPs).

1080/24? That's fuckin gay.

So even though it's PC gaming, you can't just go to another resolution, like 1600x900, and play at 60FPS? So it must be a display/HDCP issue?

But it does support 720p/60?

railven

12-10-2011, 12:37 PM

HDMI 1.4 is only capable of 1920x1080/24 in 3D ;) It's just the HDMI chipset holding us back.

No hacked drivers, unless you can hack the HDMI chipsets in the GPU and TV.:lol::lol::lol:

Another sad Radeon owner living with substandard features. Even the vaunted 7000 series won't have much 3D support. :(

What does HDMI have to do with anything I said? He needs a 120hz TV for 60hz 3D (which he doesn't have) - I already told him that.

Nope. Doesn't matter what TV he has. The only way to enable 3D Vision in games that don't natively support it in the options is by setting the resolution to 1280x720. There is no check box in the Control Panel etc.

Doesn't matter if he has a 240Hz TV if he's hooking it up with HDMI.

HDMI 1.4 only supports 1080p/24 in 3D. No matter the refresh rate of your TV. So tell us about these hacked drivers again Rail:lol:

But it does support 720p/60?

Yep.

railven

12-10-2011, 12:41 PM

Nope. Doesn't matter what TV he has. The only way to enable 3D Vision in games that don't natively support it in the options is by setting the resolution to 1280x720. There is no check box in the Control Panel etc.

Doesn't matter if he has a 240Hz TV if he's hooking it up with HDMI.

HDMI 1.4 only supports 1080p/24 in 3D. No matter the refresh rate of your TV.

Again, in my posts - point out where I said he had to use HDMI? He needs dual-link DVI, which I assumed all 3DTVs have, just as all 120hz monitors do. If his TV doesn't have that, then woof, but he can definitely do 1080p @ 48hz if his TV supports 96hz over HDMI.

kamspy

12-10-2011, 12:44 PM

They don't make dual link DVI TVs outside of a small handful of DLPs that aren't even in production anymore. You know 5 and Rambo aren't gaming on anything under 50".

And I never said HDMI specifically. I didn't know his TV didn't have Dual-DVI. I thought that was a standard in 3D-TVs. My fault. Maybe I should pull a kams and make racial remarks or worse :eek: haha.

kamspy

12-10-2011, 12:52 PM

You're missing another fundamental thing here Rail.

When you are playing a game in 3D, you have to set the in-game resolution to 1280x720 to enable 3D. That's the only way to turn the 3D on. If the game supports it in the GFX options menu, the game automatically goes to 1280x720, or it prompts you to set it to 1280x720.

So now you're saying by hacked for 1080p, you really meant cracked for free use (but no updates).

IZ3D:roflmao: Would you really recommend using that instead of 3D Vision? Either way, it's not getting past HDMI 1.4. It's a hardware limitation - not of the TV or the GPU. 3DTVs and the HDMI 1.4 spec just weren't designed with PC gaming in mind.

I can see how and why you're uneducated about this. Instead of waiting to play games The Way They Were Meant, you bought a $500+ turd, thereby limiting your knowledge of one of the most killer apps in PC gaming: full 3D.

Just accept your defeat this time buddy. Ol Kams was right.

http://i.imgur.com/sQ78H.gif (http://imgur.com/sQ78H)

railven

12-10-2011, 01:02 PM

You're missing another fundamental thing here Rail.

When you are playing a game in 3D, you have to set the in-game resolution to 1280x720 to enable 3D. That's the only way to turn the 3D on. If the game supports it in the GFX options menu, the game automatically goes to 1280x720, or it prompts you to set it to 1280x720.

You can force your resolutions through third parties (you named it) and whether you like it or not doesn't change that fact. IZ3D is used by a lot of people who don't want to use the nVidia 3D driver due to limitations set by nVidia.

So now you're saying by hacked for 1080p, you really meant cracked for free use (but no updates).

IZ3D:roflmao: Would you really recommend using that instead of 3D Vision? Either way, it's not getting past HDMI 1.4. It's a hardware limitation - not of the TV or the GPU. 3DTVs just weren't designed with PC gaming in mind.

What's wrong with IZ3D? It works. It works well and it is hardware agnostic. You can use any 3D glasses, any monitor that is 3D capable, and any video card. Wait, so you mean you have to buy from one maker and use one brand specifically? Gotcha.

I can see how and why you're uneducated about this. Instead of waiting to play games The Way They Were Meant, you bought a $500+ turd, thereby limiting your knowledge of one of the most killer apps in PC gaming: full 3D.

But I didn't make personal attacks. :) I take the high road, occasionally, and I know with you - facts don't mean a thing. I don't own a 3DTV; I do own a 120hz monitor. Hey - I enjoy 1080p 3D with the IZ3D software on ALL games. Is the 3D Vision version better? Perhaps, I wouldn't know, but I'd rather not wait for a profile.

And not all games look good with 3D on via IZ3D. I can be honest about that, but I still have the option and if I switch over to the GTX 460 for testing - guess what, it still works! :eek: (just not as good, lack of GPU power.)

3D gaming to me is still gimmicky. Some games, it looks good, others, not so much.

kamspy

12-10-2011, 01:08 PM

What limitations does Nvidia impose with 3D vision?

Waiting on a profile? pfft. Nvidia drivers already got the Max Payne 3 3D Vision update and two performance improvements. The game doesn't even come out until March!

See, that's how you do it. Make the drivers before the game comes out.

http://i.imgur.com/sQ78H.gif (http://imgur.com/sQ78H)

railven

12-10-2011, 01:18 PM

What limitations does Nvidia impose with 3D vision?

Waiting on a profile? pfft. Nvidia drivers already got the Max Payne 3 3D Vision update and two performance improvements. It doesn't even come until March!

See, that's how you do it. Make the drivers before the game comes out.

http://i.imgur.com/sQ78H.gif (http://imgur.com/sQ78H)

Haha, I love Tim and Eric.

Anyways:

When you are playing a game in 3D, you have to set the in-game resolution to 1280x720 to enable 3D. That's the only way to turn the 3D on. If the game supports it in the GFX options menu, the game automatically goes to 1280x720, or it prompts you to set it to 1280x720.

I can play my games @ 1080p 60hz in 3D at medium settings. IZ3D can do that - but nVidia can't?

TwoPlusTwo

12-10-2011, 02:01 PM

I'm typing this at a Red Lobster a few blocks away from a home theater store in Atlanta.

I just came from the HT place. I asked them if the Sharp Elites had dual-link DVI.

They told me that, as far as they know, no HDTV has such a thing. Dude told me that there might be adapters/converters/etc but they don't sell anything like that.

FWIW

railven

12-10-2011, 02:38 PM

I'm typing this at a Red Lobster a few blocks away from a home theater store in Atlanta.

I just came from the HT place. I asked them if the Sharp Elites had dual-link DVI.

They told me that, as far as they know, no HDTV has such a thing. Dude told me that there might be adapters/converters/etc but they don't sell anything like that.

FWIW

Well then - see I didn't know that about 3DTVs. So, with such a big limitation - how does Sony/Microsoft expect 3D-gaming to blossom for their consoles?

My fault for thinking all TVs supported a standard that openly supports high bandwidth video.

kamspy

12-10-2011, 02:49 PM

Well then - see I didn't know that about 3DTVs. So, with such a big limitation - how does Sony/Microsoft expect 3D-gaming to blossom for their consoles?

My fault for thinking all TVs supported a standard that openly supports high bandwidth video.

Sony and MS really don't know what they're doing with it on consoles. They're basically ticking off a checkbox for a comparison sheet.

Maybe HDMI 1.5+ will support it.

John Rambo

12-10-2011, 02:56 PM

This thread rules

railven

12-10-2011, 04:57 PM

Sony and MS really don't know what they're doing with it on consoles. They're basically ticking off a checkbox for a comparison sheet.

Maybe HDMI 1.5+ will support it.

See, to me that's total bullshit.

TwoPlusTwo

12-10-2011, 11:57 PM

I can play my games @ 1080p 60hz in 3D at medium settings. IZ3D can do that - but nVidia can't?

I looked at the IZ3D website. It seems more or less like a ripoff of nVidia 3D Vision, without all the really nice features.

But I can play in 1080p 3D without being locked at 24 fps? Using my HDMI connection?

If that's the case, cool, I'd try it out.

However, from what I understand, it's HDMI 1.4 that is preventing me from playing games at 1080p in 3D. Like, it seems that I need some dual-link DVI or some such nonsense.

So here are two very basic questions I have...I'll keep these questions simple.

(1) If I use this IZ3D thing, can I play in 3D at 1080p? With an HDMI cable?

(2) Why the hell does this shit not work on HDMI 1.4 anyway? I remember when 3DTV first came out, we were all promised that HDMI 1.4 was the future. I was an early adopter. Here I am a year or so later, I can't even play PC games on a GTX 580 at 1080p. I'd even be willing to buy a second 580 for SLI. But from what I gather, it just doesn't work with HDMI. What the hell is going on here? Am I supposed to be waiting for HDMI 1.5? What kind of fucking nonsense is this? :confused:

kharaa

12-11-2011, 12:04 AM

I looked at the IZ3D website. It seems more or less like a ripoff of nVidia 3D Vision, without all the really nice features.

But I can play in 1080p 3D without being locked at 24 fps? Using my HDMI connection?

If that's the case, cool, I'd try it out.

However, from what I understand, it's HDMI 1.4 that is preventing me from playing games at 1080p in 3D. Like, it seems that I need some dual-link DVI or some such nonsense.

So here are two very basic questions I have...I'll keep these questions simple.

(1) If I use this IZ3D thing, can I play in 3D at 1080p? With an HDMI cable?

(2) Why the hell does this shit not work on HDMI 1.4 anyway? I remember when 3DTV first came out, we were all promised that it was the future. I was an early adopter. Here I am a year or so later, I can't even play PC games on a GTX 580 at 1080p. I'd even be willing to buy a second 580 for SLI. But from what I gather, it just doesn't work with HDMI. What the hell is going on here? Am I supposed to be waiting for HDMI 1.5? What kind of fucking nonsense is this? :confused:

Try flipping that around... IZ3D was first. Nvidia just ripped them off. ;)

TwoPlusTwo

12-11-2011, 12:15 AM

Try flipping that around... IZ3D was first. Nvidia just ripped them off. ;)

OK, sure, that may be accurate. The nVidia 3D Vision has some killer features, but hey, if I can get 1080p via HDMI with IZ3D, I'll try that.

Or does the IZ3D thing also require "dual-link DVI" or whatever?

I'd just like to play my 3D games on a 3DTV, preferably via HDMI, in 1080p without being locked at 24 fps.

Does IZ3D solve this issue?

kamspy

12-11-2011, 12:46 AM

You are the monkey we are launching into space.

Download it and try it out.

kharaa

12-11-2011, 01:32 AM

OK, sure, that may be accurate. The nVidia 3D Vision has some killer features, but hey, if I can get 1080p via HDMI with IZ3D, I'll try that.

Or does the IZ3D thing also require "dual-link DVI" or whatever?

I'd just like to play my 3D games on a 3DTV, preferably via HDMI, in 1080p without being locked at 24 fps.

Does IZ3D solve this issue?

Not one hundred percent positive; I'll check with my friend who worked for them for like 4 years.

Mase

12-11-2011, 10:48 AM

See, here's the difference between me and Rail. When I'm wrong, I'll come out and say it after the info I thought was right was debunked. Rail just wants to spin and spin and spin.

I stopped reading after this, because WOW. Really Kam, when you are wrong you either don't go back to that thread or you post lame one-liners and weak-ass GIFs. Anyone who has been here for more than a month knows this already.

Also, you really recommended that 5 should get a MARS card; that is ridiculous and you know it. You have always bashed dual-GPU cards because they come with their own issues and driver support is not as fast or good as just going SLI or XF. Oh, and I did not see your post telling of the "creature comforts" as you called it. Talk about dodging questions. :lol:

The bullshit is thick in this thread Kams..

TwoPlusTwo

12-11-2011, 11:16 AM

Alright, I've DLed IZ3D and I'm trying to set it up, let's see how this does. :)

TwoPlusTwo

12-11-2011, 12:17 PM

I couldn't even get IZ3D to actually work on my 580+3DTV.

However, when I was Googling info on IZ3D, I found out...you can't do 1080p unless you're OK with a locked 24 fps.

Conclusion: nVidia 3D Vision is cool as hell. The 3D is amazing and the 3D Vision interface/features are great. But if you use HDMI, you are limited to 720p. :(

As for now, I'm gonna move the gaming PC back up to the Kuro Elite.
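For what it's worth, the 720p-over-HDMI conclusion lines up with HDMI 1.4's pixel-clock budget. A back-of-the-envelope sketch (the raster totals below are the standard CEA-861 timings, active plus blanking; treat this as a sketch, not a spec quote):

```python
# Pixel clock needed for a given video mode: total raster (active +
# blanking) times refresh rate. Frame-packed 3D stacks both eye views
# vertically, which roughly doubles the vertical total.

def pixel_clock_mhz(h_total, v_total, refresh_hz):
    """Pixel clock in MHz for a total raster at a given refresh rate."""
    return h_total * v_total * refresh_hz / 1e6

# 2D baseline: 1080p60 uses a 2200x1125 total raster.
clk_1080p60 = pixel_clock_mhz(2200, 1125, 60)   # 148.5 MHz

# The two mandatory HDMI 1.4a 3D formats, frame-packed:
fp_1080p24 = pixel_clock_mhz(2750, 2250, 24)    # 148.5 MHz
fp_720p60  = pixel_clock_mhz(1650, 1500, 60)    # 148.5 MHz

# Frame-packed 1080p60, the mode everyone actually wants:
fp_1080p60 = pixel_clock_mhz(2200, 2250, 60)    # 297 MHz
```

Both mandatory 3D modes land at exactly 148.5 MHz, the same clock as plain 1080p60, so any HDMI 1.4 receiver can carry them. Frame-packed 1080p60 needs 297 MHz, which the 1.4a spec never made mandatory and which many TV-side receiver chips of that era topped out below; hence 1080p24 or 720p60 over HDMI.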

kamspy

12-11-2011, 12:29 PM

I stopped reading after this, because WOW. Really Kam, when you are wrong you either don't go back to that thread or you post lame one-liners and weak-ass GIFs. Anyone who has been here for more than a month knows this already.

Also, you really recommended that 5 should get a MARS card; that is ridiculous and you know it. You have always bashed dual-GPU cards because they come with their own issues and driver support is not as fast or good as just going SLI or XF. Oh, and I did not see your post telling of the "creature comforts" as you called it. Talk about dodging questions. :lol:

The bullshit is thick in this thread Kams..

Uh, he wanted to play in 3D, he's an Apple guy (happy to pay for convenience and build quality). If there was ever a candidate for the MARS, it would have been him (had he stayed on his 3D kick).

I would never recommend SLI to someone building a computer, but he was talking about doing multi-display 3D. A single 580 wouldn't have cut it, and I could see him paying the premium for a single-card solution that doesn't have the inherent physical headaches that multi-GPU comes with.

I like to think a math prof can do math. He knows the price on the MARS, knows what 2 580s cost, and is capable of doing the math himself. ;)

Mase

12-11-2011, 12:37 PM

Bravo on continuing to dodge the questions. :golf clap:

The MARS card is a multi-GPU card and has more inherent issues than going true SLI; that is a fact and you have taken that stance before. He might as well get a 590, as he doesn't OC anything as is, and that's all the MARS card is good for: better heat management and power distribution when OCing. Aside from that, a stock MARS vs a 590 is virtually the same, just one costs half as much as the other.

Nice recommendation, Kam, bravo!

kamspy

12-11-2011, 05:22 PM

The MARS is not a 590. The 590 has gimped clock speeds, voltages, and built-in power limiters. The MARS is just two 580s in one PCI slot. The 590 is two underclocked 580s, and again, there are built-in limiters to regulate voltage so the TDP didn't make front-page news on every blog. ASUS didn't care; they just wanted to make the baddest piece of GPU that can fit into one PCI slot. They didn't have to worry about PR or anything with the power consumption.

It won't have any more instability than a regular 2x580 setup. That's why it's preferable over the 590. It avoids the issues of the 590 while retaining the footprint. It's also going to run a shit ton cooler than 2 580s sandwiched against each other.

But I think he's holding off on the multi display route for now, so it's a moot point.

Now, back up your words. What are the inherent issues of the MARS card vs traditional SLI?

Mase

12-11-2011, 06:09 PM

The MARS is not a 590. The 590 has gimped clock speeds, voltages, and built-in power limiters. The MARS is just two 580s in one PCI slot. The 590 is two underclocked 580s, and again, there are built-in limiters to regulate voltage so the TDP didn't make front-page news on every blog. ASUS didn't care; they just wanted to make the baddest piece of GPU that can fit into one PCI slot. They didn't have to worry about PR or anything with the power consumption.

It won't have any more instability than a regular 2x580 setup. That's why it's preferable over the 590. It avoids the issues of the 590 while retaining the footprint. It's also going to run a shit ton cooler than 2 580s sandwiched against each other.

But I think he's holding off on the multi display route for now, so it's a moot point.

Now, back up your words. What are the inherent issues of the MARS card vs traditional SLI?

Way to dodge the question once again and reiterate what I stated above. :lol: Also, I never said the MARS card in particular; I said dual-GPU cards vs SLI/XF. Why bother with the space, heat, and power issues of dual-GPU cards, not to mention driver support? The MARS card is a beast, but SLI (dual or tri) is a better idea and cheaper.

You know what the MARS card is good for: heating your house! That thing takes up THREE PCI slots (so no, it does not retain the footprint of a 590 :lol: ) and requires a PSU with over 1,000 watts! Yeah, it's a good recommendation for 5: go grab a new mobo, case, and PSU for something he won't even OC anyways. That's what the MARS card is good for, OCing; as said before, its built-in features allow it to OC like crazy! Maybe if 5 took to OCing it would be a good recommendation for his next build, but that seems unlikely.

Link (http://www.guru3d.com/article/asus-mars-ii-review/)
But yeah, ASUS took the GTX 590 concept and pretty much only took the two GPUs and left the rest "as-is". ASUS merged GTX 580s onto a single PCB and each of them GPUs is tick-tocking away at 782MHz, making the Mars II much faster than the (already discontinued) GTX 590.

So I googled for like 10 seconds and I already found that you are full of shit. It's not two separate PCBs or 580s (true SLI); it's the same concept as the 590. The difference is that the two dies on that single PCB for the MARS are cherry-picked and clocked higher than even a 580. Good card, but not worth the price and headache of changing your entire rig just to fit one in.

kamspy

12-11-2011, 06:27 PM

*okay* we're going to have to go back further here, Masey.

Having two separate 580s has no advantage over a single card. They both have the same SLI pros and cons.

Also, I never said it was on 2 PCBs. :confused: That would be awful. They did that with the 295 GTX first edition and those cards were a mess. I said 2 GPUs in a single PCI slot. Maybe you misunderstood me. The 590 and 6990 are both single PCB cards. AFAIK, the first runs of the 295s were the only card to actually just cram two GPU boards into one shroud, SLI bridge and all.

Other than having the option to sell one down the line and build a second PC with it, there is no advantage to regular 580 SLI compared to the MARS. The pluses for the MARS would be:

- Build quality. Only 999 were made. They're hand numbered.
- One PCIe slot. One pair of wires coming from the GPU. Less spaghetti in the case. No SLI bridge. No wondering if the custom cooler on the card is going to extend just past that second PCIe slot and make the whole thing a bust.
- Cooling. See how much room there is between GPUs on most SLI set-ups? Yeah, not much. That top card is gonna be an oven every time. Sure, you can buy reference with the exhaust, but then it's gonna be loud as hell. The MARS has that nice 2-3 fan CPU-style cooler that a lot of GPUs are using now. The fans run at lower RPM and keep the cards cooler. Though, he'd need to make sure his exhaust game is up to par on his case. It's basically the MacBook Pro of 580 SLI. You're paying a lot just for little creature features. 2+2 has a tendency of doing that. I dunno. Plus, it's an awesome big card.

And the PSU would be the same as regular 580 SLI.

Now how is regular 580 SLI better again?

Mase

12-11-2011, 06:56 PM

*okay* we're going to have to go back further here, Masey.

Having two separate 580s has no advantage over a single card. They both have the same SLI pros and cons.

Also, I never said it was on 2 PCBs. :confused: That would be awful. They did that with the 295 GTX first edition and those cards were a mess. I said 2 GPUs in a single PCI slot. Maybe you misunderstood me. The 590 and 6990 are both single PCB cards. AFAIK, the first runs of the 295s were the only card to actually just cram two GPU boards into one shroud, SLI bridge and all.

Other than having the option to sell one down the line and build a second PC with it, there is no advantage to regular 580 SLI compared to the MARS. The pluses for the MARS would be:

- Build quality. Only 999 were made. They're hand numbered.
- One PCIe slot. One pair of wires coming from the GPU. Less spaghetti in the case. No SLI bridge. No wondering if the custom cooler on the card is going to extend just past that second PCIe slot and make the whole thing a bust.
- Cooling. See how much room there is between GPUs on most SLI set-ups? Yeah, not much. That top card is gonna be an oven every time. Sure, you can buy reference with the exhaust, but then it's gonna be loud as hell. The MARS has that nice 2-3 fan CPU-style cooler that a lot of GPUs are using now. The fans run at lower RPM and keep the cards cooler. Though, he'd need to make sure his exhaust game is up to par on his case. It's basically the MacBook Pro of 580 SLI. You're paying a lot just for little creature features. 2+2 has a tendency of doing that. I dunno. Plus, it's an awesome big card.

And the PSU would be the same as regular 580 SLI.

Now how is regular 580 SLI better again?

Link (http://www.guru3d.com/article/asus-mars-ii-review/)
You'll need a beefy power supply as well as not one, not two.. noper-di-nope, three 8-pin PCIe connectors are required.

Link (http://www.maximumpc.com/article/hardware/asus_mars_ii_review)
Imagine a graphics card weighing 5.25 pounds with three (yes, three) 8-pin PCI Express power connectors. Now imagine this card taking up three PCI Express slots and almost sucking the life out of an 850W power supply.

the 785W that the Mars II consumed under load set a Lab record for a single graphics card.

Yeah, a single PCI slot? How about taking up three. One pair of wires from the GPU? How about three. What else you got?

As to the pros of SLI 580s vs the MARS, how about benchmarks? Virtually the same in all games, except the cost is less on SLI. Again, great recommendation: a gargantuan card with a gargantuan appetite, with a price tag to match, and with what performance gains? Marginal at best. Gotcha! This card is for collectors; I have yet to see a single professional reviewer suggest one of these over a 590 or SLI 580s. Here you are though, suggesting it like it's not a big deal.

kamspy

12-11-2011, 07:34 PM

For a guy who is happy to spend $2800 on a laptop with $900 performance, this is much less of a leap.

Nothing wrong with having that kind of value proposition. You like what you like. The MARS is the MacBook Pro of 580 SLI.

Mase

12-11-2011, 07:47 PM

For a guy who is happy to spend $2800 on a laptop with $900 performance, this is much less of a leap.

Nothing wrong with having that kind of value proposition. You like what you like. The MARS is the MacBook Pro of 580 SLI.

In true kamspy fashion, bravo sir!

kharaa

12-11-2011, 08:12 PM

I wouldn't waste a dime on the MARS.

He already has a single 580, adding another one would provide him a better cost to performance ratio.

As the performance between the two would be nearly identical, one just wouldn't cost him more than his computer did. :P

kamspy

12-11-2011, 08:33 PM

I wouldn't waste a dime on the MARS.

He already has a single 580, adding another one would provide him a better cost to performance ratio.

As the performance between the two would be nearly identical, one just wouldn't cost him more than his computer did. :P

We've been through this. I like to think 2+2 can do the math on 2 580 vs 1 MARS.

Again, this is a guy who buys MacBooks for convenience. There aren't many other people in the world I'd recommend it to, and when I was recommending it, he was thinking about a multi-display 3D surround setup. So it would have been the MARS plus his 580 as either tri-SLI or beast-mode PhysX.

kharaa

12-11-2011, 08:55 PM

We've been through this. I like to think 2+2 can do the math on 2 580 vs 1 MARS.

Again, this is a guy who buys MacBooks for convenience. There aren't many other people in the world I'd recommend it to, and when I was recommending it, he was thinking about a multi-display 3D surround setup. So it would have been the MARS plus his 580 as either tri-SLI or beast-mode PhysX.

Indeed: one 580, 500 dollars.

Or a 580 + MARS SLI, 1500 dollars.

Both the same performance for this fine individual.

TwoPlusTwo also has a brain. :P

kamspy

12-11-2011, 09:32 PM

Well, with a multi-display 3D setup, one 580 wouldn't cut it; two might even have trouble. Therefore I linked the God Card (almost in jest), but he made a follow-up post about it. I'm sure he saw the price and didn't forget how much a regular 580 cost. Therefore, like his MacBook interest, I assumed he liked the MARS for the creature features, i.e. not having to deal with two physical GPUs, better cooling, etc.

jmwatkins

12-22-2011, 07:51 PM

I did some research on AMD's website and apparently I have everything I need to do 3D with my 6870 except I need software. Either DDD or iZ3D. Both have free trials, so I'm going to give them a shot sometime during my Christmas vacation. I'll let you guys know how it works out.

railven

12-22-2011, 10:01 PM

I did some research on AMD's website and apparently I have everything I need to do 3D with my 6870 except I need software. Either DDD or iZ3D. Both have free trials, so I'm going to give them a shot sometime during my Christmas vacation. I'll let you guys know how it works out.

I'll tell you now - get ready for some disappointment. Haha. I use IZ3D on my ASUS 27" 3D Monitor but my HD 5870 can't produce 120 FPS fast enough on max settings. I have to drop to medium/high with no AA/AF for most games to get decent frame rates. Your HD 6870 is about 20% slower than mine. Try 720p before you try 1080p.

The IZ3D software is a little annoying in the UI department, but you can customize your own profiles for angles and make objects pop more or less. Get ready to hate shadows haha.
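The performance hit has a simple shape: stereo renders the scene once per eye, so a 60 FPS-per-eye target means the GPU is doing the work of a 120 FPS game. A tiny sketch of the frame-time budgets involved (the numbers are just arithmetic, not measurements):

```python
# Frame-time budget per rendered frame. Stereo 3D draws the scene
# twice per displayed frame (once per eye), so targeting 60 FPS per
# eye leaves the same per-frame budget as running 2D at 120 FPS.

def frame_budget_ms(fps):
    """Milliseconds available to render one frame at the given rate."""
    return 1000.0 / fps

budget_2d_60 = frame_budget_ms(60)    # ~16.7 ms per frame
budget_3d_60 = frame_budget_ms(120)   # ~8.3 ms per eye view
```

Halving the per-frame budget is why settings that hold 60 FPS in 2D tend to drop to medium/high once 3D is on.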

jmwatkins

12-22-2011, 11:40 PM

I'll tell you now - get ready for some disappointment. Haha. I use IZ3D on my ASUS 27" 3D Monitor but my HD 5870 can't produce 120 FPS fast enough on max settings. I have to drop to medium/high with no AA/AF for most games to get decent frame rates. Your HD 6870 is about 20% slower than mine. Try 720p before you try 1080p.

The IZ3D software is a little annoying in the UI department, but you can customize your own profiles for angles and make objects pop more or less. Get ready to hate shadows haha.

I don't think I'll be disappointed as anything will be better than PS3 3D. I tried a lot of games in 3D on PS3 and the best I played were Super Stardust HD and the Motorstorm PR demo. :P

I'll drop to 720p and start from there.

railven

12-23-2011, 06:57 AM

I don't think I'll be disappointed as anything will be better than PS3 3D. I tried a lot of games in 3D on PS3 and the best I played were Super Stardust HD and the Motorstorm PR demo. :P

I'll drop to 720p and start from there.

I never bothered to hook up the PS3 to my monitor. Maybe I should try it. Maybe it's just me; 3D still feels gimmicky. It is a huge performance hit, but then again, trying to maintain 120 FPS with any current single card is a lot to ask for haha.

On that note, even 120 FPS to me feels gimmicky. The few games I can get there on high don't feel any smoother, and personally, in the games where I have to trade IQ to get from 60 to 120 FPS, the IQ lost isn't worth the frames gained.

Meh, maybe I'm just being bitter since my card can't handle it, and once I do get 120 FPS I'll pull a Kammie and sing its praises from the mountain tops.

awol

12-23-2011, 08:03 AM

I'll tell you now - get ready for some disappointment. Haha. I use IZ3D on my ASUS 27" 3D Monitor but my HD 5870 can't produce 120 FPS fast enough on max settings. I have to drop to medium/high with no AA/AF for most games to get decent frame rates. Your HD 6870 is about 20% slower than mine. Try 720p before you try 1080p.

The IZ3D software is a little annoying in the UI department, but you can customize your own profiles for angles and make objects pop more or less. Get ready to hate shadows haha.

How is it possible that your 5870 is faster than a newer generation card? :confused:

Mase

12-23-2011, 09:00 AM

How is it possible that your 5870 is faster than a newer generation card? :confused:

The 5870 was the elite AMD card during its generation. The 68xx series was never the elite series; the 69xx series is, so a 6970 (top-tier card) is faster than his 5870 (top-tier card last gen). The 68xx series is what I would consider their mainstream line; it brings a lot of value to the table, but it was never meant to compete with other high-end GPUs.

jmwatkins

12-23-2011, 09:12 AM

How is it possible that your 5870 is faster than a newer generation card? :confused:

I didn't understand it myself, but then I read this article the other day.
http://www.tomshardware.com/reviews/radeon-hd-6870-radeon-hd-6850-barts,2776.html

jmwatkins

12-23-2011, 09:14 AM

I never bothered to hook up the PS3 to my monitor. Maybe I should try it. Maybe it's just me; 3D still feels gimmicky. It is a huge performance hit, but then again, trying to maintain 120 FPS with any current single card is a lot to ask for haha.

On that note, even 120 FPS to me feels gimmicky. The few games I can get there on high don't feel any smoother, and personally, in the games where I have to trade IQ to get from 60 to 120 FPS, the IQ lost isn't worth the frames gained.

Meh, maybe I'm just being bitter since my card can't handle it, and once I do get 120 FPS I'll pull a Kammie and sing its praises from the mountain tops.

Don't bother with PS3 3D. I also think 3D is gimmicky, but I think it's awesome when it is done right. The problem is that about 95% of the 3D content I've viewed is crap.

awol

12-23-2011, 09:17 AM

I didn't understand it myself, but then I read this article the other day.
http://www.tomshardware.com/reviews/radeon-hd-6870-radeon-hd-6850-barts,2776.html

So if I'm understanding correctly, the 5870 had more "stuff" (shaders, etc)... but the 6870 was clocked higher making lesser "stuff" run faster albeit still not performing as well as the 5870 in real world application.

Mase

12-23-2011, 09:46 AM

So if I'm understanding correctly, the 5870 had more "stuff" (shaders, etc)... but the 6870 was clocked higher making lesser "stuff" run faster albeit still not performing as well as the 5870 in real world application.

The 6870 was never meant to compete with the 5870. The 5870 has something like 1600 stream processors where the 6870 is in the 1100s, and while the 6870 has a higher core clock, it has a slower mem clock. The only reason it has a higher core clock is that it is a more efficient card, but other than that it will lose to the 5870 in practically every bench you can think of.
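To put rough numbers on the comparison: using the reference specs (1600 shaders at 850 MHz for the HD 5870, 1120 shaders at 900 MHz for the HD 6870, both on a 256-bit GDDR5 bus), a quick sketch of the theoretical throughput; the 2 FLOPs per shader per clock figure is the usual simplified model, so treat these as ballpark numbers:

```python
# Theoretical shader throughput: shaders x core clock x FLOPs/clock.
# AMD's VLIW shaders issue a fused multiply-add, counted as 2 FLOPs.

def gflops(shaders, core_mhz, flops_per_clock=2):
    """Peak single-precision GFLOPS under the simple 2-FLOPs model."""
    return shaders * core_mhz * flops_per_clock / 1000.0

hd5870 = gflops(1600, 850)   # 2720.0 GFLOPS
hd6870 = gflops(1120, 900)   # 2016.0 GFLOPS

# Memory bandwidth: effective GDDR5 rate (Gbps per pin) x bus width / 8.
def mem_bandwidth_gbs(effective_gbps, bus_bits=256):
    """Peak memory bandwidth in GB/s."""
    return effective_gbps * bus_bits / 8

bw_5870 = mem_bandwidth_gbs(4.8)  # 153.6 GB/s
bw_6870 = mem_bandwidth_gbs(4.2)  # 134.4 GB/s
```

So despite the higher core clock, the 6870 gives up roughly a quarter of the 5870's shader throughput and about 12% of its memory bandwidth, which matches the benches.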

railven

12-23-2011, 11:26 AM

I didn't understand it myself, but then I read this article the other day.
http://www.tomshardware.com/reviews/radeon-hd-6870-radeon-hd-6850-barts,2776.html

Don't bother with PS3 3D. I also think 3D is gimmicky, but I think it's awesome when it is done right. The problem is that about 95% of the 3D content I've viewed is crap.

So if I'm understanding correctly, the 5870 had more "stuff" (shaders, etc)... but the 6870 was clocked higher making lesser "stuff" run faster albeit still not performing as well as the 5870 in real world application.

The 6870 was never meant to compete with the 5870. The 5870 has something like 1600 stream processors where the 6870 is in the 1100s, and while the 6870 has a higher core clock, it has a slower mem clock. The only reason it has a higher core clock is that it is a more efficient card, but other than that it will lose to the 5870 in practically every bench you can think of.

Basically as Mase said. They tweaked the architecture so each SP cluster counted as 4, versus 5 during the HD 5 series (VLIW4 vs. VLIW5). The claim was that in VLIW5, one slot in the cluster never saw much load, so it was wasting space and power and creating needless heat, so they removed it.

Add to that, AMD painted themselves into a corner and had two options to get out of it: 1) shift the names or 2) eat a loss. They chose to shift the names. If everything were normal, the current HD 6850/6870 would have been the HD 6750/6770 and sold for $120 and $150 respectively. That would have been a huge gain for consumers, huge loss for AMD. So they shifted the names over to 6850/6870 and pumped the higher cards up to 6950/6970.

A total dick move, but I can understand why they did it. Just wish they'd have planned ahead a little better.

EDIT: Not that I'm AMD or anything, but if I were them, I'd have recycled Cypress and Juniper for the lower tier and given Barts a heavier OC. This could have created a more congested lineup, but would have offered better performance at better price points.

i.e.:
Juniper (HD 5750/5770) to Juniper (HD 6650/6670)
Cypress (HD 5850/5870) to (down the clocks or remove a ROP cluster) Cypress (HD 6750/6770)
Barts would be introduced as the HD 6830/6850 and Cayman would be the 6870.
Antilles would be the 6970.
Presto.

Personally, I think they lost a lot of money with the Cayman 6950 that could unlock to the Cayman 6970.