How far from this are we?

Not sure how accurate this chart is, but it says a 6970 gives 2.7 teraflops. I know that is probably a theoretical number, not a real-world one, but it tells me a pair of 6870s or the equivalent Nvidia card should be able to easily handle the Unreal 4 engine. Or am I missing something?

Hmmm, well, flops (floating-point operations per second) are the right unit when we are talking about graphics. I'd venture to guess it is a little more complicated than one number, depending on the core, vertex shaders, pixel pipeline, rasterizer, or whether the card has stream processors that can handle more than one task.

You'd also have to assume they are only generating graphical data and there is no interaction with the environment or gameplay. When you add gameplay/physics, you need to simultaneously run "gametime," which controls input, the physics engine refresh, database pulls, and such. Then there is the database that is constantly being updated with player information, the AI needs to run, etc.

For ONLY the graphics portion, perhaps 2.7 teraflops is enough, but a full game needs much more than just that.

It's really difficult to say whether or not something can "run" based on a single metric. IMO that Tom's article is incredibly poorly written.
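The "gametime" loop described above can be sketched as a minimal fixed-timestep loop, where rendering is only one consumer of the frame budget. This is a rough illustration, not any real engine's API; every function name here is a hypothetical placeholder.

```python
# Minimal fixed-timestep game loop sketch: input, physics, and AI all
# tick on the same frame budget as rendering. All callbacks are
# hypothetical placeholders supplied by the caller.
import time

PHYSICS_DT = 1.0 / 120.0  # physics engine refreshes at a fixed 120 Hz

def run_game_loop(frames, poll_input, step_physics, update_ai, render):
    accumulator = 0.0
    previous = time.perf_counter()
    for _ in range(frames):
        now = time.perf_counter()
        accumulator += now - previous
        previous = now
        poll_input()                      # "gametime": input handling
        while accumulator >= PHYSICS_DT:  # fixed-rate physics refresh
            step_physics(PHYSICS_DT)
            accumulator -= PHYSICS_DT
        update_ai()                       # AI runs alongside physics
        render()                          # graphics is just one slice
```

The point of the accumulator is that physics steps at a fixed rate regardless of how fast frames render, which is exactly why raw graphics flops alone don't tell you whether the whole game "runs."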

I know it has to do with multiple factors, but using the figure of 2.5 teraflops as a ballpark, current PC graphics cards should have absolutely no problem running the engine. 2x 6970s give ~5.4 teraflops and 2x 7970s give ~7.4 teraflops. Sure, that comes at a premium, but that is 2x and 3x the stated required performance. Dialing down to a "low" 1680x1050 resolution would give you even more headroom. The Xbox 360 is stuck at 250 gigaflops. This is an example of the PC getting dragged down by console gaming. We are still stuck in the Xbox 360 era, which is 5 or 6 years old now? I think I was using a 4800SE 6 years ago.
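Those headline teraflops numbers are just theoretical peaks: stream processor count x clock x 2 (one fused multiply-add, i.e. 2 FLOPs, per ALU per cycle). A quick sanity check against AMD's published specs for the cards above:

```python
def peak_tflops(stream_processors: int, clock_ghz: float) -> float:
    """Theoretical single-precision peak: 2 FLOPs (one fused
    multiply-add) per stream processor per clock cycle."""
    return stream_processors * clock_ghz * 2 / 1000.0

# Published specs: HD 6970 has 1536 SPs at 880 MHz,
# HD 7970 has 2048 SPs at 925 MHz.
hd6970 = peak_tflops(1536, 0.880)   # ~2.70 TFLOPS, matching the chart
hd7970 = peak_tflops(2048, 0.925)   # ~3.79 TFLOPS
print(round(2 * hd6970, 2))  # ~5.41 for a pair of 6970s
print(round(2 * hd7970, 2))  # ~7.58 for a pair of 7970s, a bit above the 7.4 quoted
```

These are best-case figures with every ALU issuing an FMA every cycle; real workloads land well below them, which is why a single flops number only goes so far.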

From a purely math perspective, sure. From a software perspective, not quite. So much of what makes a game look good today is art direction and textures. At the end of the day the graphics are only as good as the artists. The 360 is getting pretty old, but new games still look great.

That's my point. Sure, DX9c games still look good. I'm playing the S.T.A.L.K.E.R. series with the Complete mods installed, and nothing I've seen on the Xbox 360 compares to the graphics. I am sure they would look even better with DX10. Since the PC gaming market is not nearly as big as the Xbox 360/PS3 market, we will have to wait for the next generation of consoles before someone spends the money to develop a game that will use the new engine. The console industry has no interest in the next generation of hardware because current graphics are good enough to maintain sales.

I don't think consumers really have an interest in next-gen graphics either. I know I don't. Gears of War and Skyrim are some of the best-looking games out there. At the average TV viewing distance, they are sharp as hell.

Crysis looks good and all, but somehow it doesn't look as good as Half-Life 2. That is, the graphics are hyper-realistic, but the art design is so bland and uninspiring that it's wasted.

AGREE

I honestly don't think any modern gaming PC is being taxed at all, since everything these days is a simple console port. My relatively low-to-mid-end i3 CPU and mid-range 6870 play EVERY game out right now at 1080p with everything cranked to near max. The same game on a PS3/360 runs at 500-720p with the equivalent of low-to-medium settings. Most console ports are not even optimized all that well and should run even better on a low-end PC with better code.

I believe that article is talking mainly about consoles and next-gen consoles. I believe the next-gen consoles (PS4/Xbox 720) will not be much more powerful than a standard gaming PC we have today (I'm thinking the equivalent of a quad-core i5 and a GeForce 560 Ti). Even the preliminary Wii U specs suggest it will have the equivalent of a mid-range Radeon 6770 or so.

With that in mind, I believe a high-end gaming rig that you can build today will run the Samaritan demo fine: something like a quad-core i7 and a single 7970 (or CrossFired 7850s).

Originally Posted by ImaNihilist

I don't think consumers really have an interest in next-gen graphics either. I know I don't. Gears of War and Skyrim are some of the best-looking games out there. At the average TV viewing distance, they are sharp as hell.

Crysis looks good and all, but somehow it doesn't look as good as Half-Life 2. That is, the graphics are hyper-realistic, but the art design is so bland and uninspiring that it's wasted.

Yeah, I do believe we are reaching a drastic point of diminishing returns on graphical fidelity until we get some new graphic design techniques. Some PS2 games (FFX, Silent Hill 3) look practically the same as what you can find on a modern console when run on an emulator at high resolution with anisotropic filtering; look at Xenoblade on the Dolphin emulator. All next gen really needs is a GPU bump to maintain 60fps and FSAA at 1080p.
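That "60fps and FSAA at 1080p" target is largely a pixel-throughput problem, and the arithmetic is simple. A rough comparison against a typical sub-HD console target (this counts samples only and ignores per-sample shader cost, so it's a crude proxy):

```python
def samples_per_second(width, height, fps, aa_samples=1):
    """Pixels shaded per second, counting each antialiasing sample.
    A rough fill-rate proxy; ignores shader cost per sample."""
    return width * height * fps * aa_samples

console = samples_per_second(1280, 720, 30)        # a typical 720p/30 console title
next_gen = samples_per_second(1920, 1080, 60, 4)   # 1080p/60 with 4x FSAA
print(next_gen / console)  # 18.0 -- roughly an 18x raw throughput bump
```

So even with zero improvement in per-pixel effects, that resolution/framerate/AA jump alone asks for well over an order of magnitude more raw throughput than a 720p/30 console target.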

I've said it before and I'll say it one more time, the next frontier is not higher graphical quality, but greater emphasis on simulation engines and material properties. We can only go so far with how a game environment looks before we want to start interacting with it more realistically. Think complex weather patterns, real material properties, dynamic fires based on those properties, materials changing organically when exposed to breaking/freezing/melting points, biological modeling of all organs/bones/tissue of each soldier during shooters, etc etc.

There is SO much more on the horizon, but people get stuck only looking at the graphical bits when the true game changers are how we can interact with the environment.

Everyone loves to destroy stuff when there aren't any consequences. Think about how fun it is to smash an old kitchen with a sledgehammer when remodeling and then not having to clean it up.
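The material-property idea above can be sketched as a tiny state machine: each material carries real thresholds, and environmental exposure drives its state changes. Everything here is a hypothetical illustration of the concept, not any engine's actual model, and the class and field names are made up.

```python
from dataclasses import dataclass

@dataclass
class Material:
    """A material with real physical thresholds, so fire, melting,
    and freezing emerge from properties rather than scripting."""
    name: str
    ignition_c: float   # catches fire at or above this temperature
    melting_c: float    # melts at or above this temperature
    freezing_c: float   # freezes at or below this temperature
    state: str = "solid"

    def expose(self, temp_c: float) -> str:
        """Update state from an environmental temperature."""
        if temp_c >= self.ignition_c:
            self.state = "burning"   # dynamic fire driven by properties
        elif temp_c >= self.melting_c:
            self.state = "molten"
        elif temp_c <= self.freezing_c:
            self.state = "frozen"
        return self.state

# Wood burns long before it could melt; the numbers are illustrative.
wood = Material("wood", ignition_c=300.0, melting_c=float("inf"), freezing_c=-40.0)
print(wood.expose(20.0))    # stays solid at room temperature
print(wood.expose(450.0))   # ignites when the environment gets hot enough
```

Scale that up to every object in a scene, add heat propagation between neighbors, and you get the organic, unscripted behavior described above, which is a simulation workload, not a rendering one.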

It's also just becoming too expensive to create graphics that actually take advantage of advanced hardware.

If you wanted to re-do a game like World of Warcraft with Crysis graphics the budget would surpass the most expensive movies ever made and the game would require a terabyte of space just to store textures.

Unless we figure out a new way of actually creating art assets I'm not sure the hardware is really going to matter. If resolutions suddenly jump to 4K, now you've got to create 4K assets. Games like Crysis get to cheat because they can just look at a photograph and say "match that", but that just leads to really boring, unimaginative games that look pretty.

Yeah. It would be cool to play a game in a world that is actually deformable, not the "shoot this thing and the polygon disappears" garbage we have today. It would be awesome if you could shoot a building with a tank shell and it always gets destroyed differently based on real physics.

I agree. Of course, that gets us into more than just pure graphics performance. I think there is a lot of untapped computing power in modern GPUs, and it would certainly be better used for things other than texture quality. Current reviews are now benching at 2560-wide resolutions; I still usually use 1680x1050 and don't really have any interest in anything over 1920x1080. Eyefinity? Maybe when I have the money and space for two 60" plasmas, which is not anytime soon. So I would be way happier if they used all that extra performance for physics effects, etc.