I dunno. These new gigahertz GPUs pack quite a wallop, though, so while I don't think that much VRAM is impossible, I don't find it very plausible. (I think GPU manufacturers need to move beyond the GDDR standard.)

How far along will it be before we start wishing for 120Hz? Or is there a limit?
I've seen many tests on random gamers who failed to notice the difference between 60 and 120. 120 vs 240 will be even more difficult to notice, if not impossible entirely.

Near the end of next gen, when devs need to try really hard to improve graphics, 60fps will be the first thing they drop.

I wonder: when a nuclear warhead goes off, does the frame rate of real life drop?

No, "60fps" will never be the standard unless Sony\MS explicitly mandate that this be the case in order for a game to appear on their console- which will never happen.

Developers will continue making trade offs they deem important in order to push the envelope ever further. Sure, Killzone 2 could have been a 60fps game- but it wouldn't have looked half as good at 720p now would it?

No, "60fps" will never be the standard unless Sony\MS explicitly mandate that this be the case in order for a game to appear on their console- which will never happen.

Developers will continue making trade offs they deem important in order to push the envelope ever further. Sure, Killzone 2 could have been a 60fps game- but it wouldn't have looked half as good at 720p now would it?

Definitely. Devs currently working on next-gen titles say there is pressure, both internally and from MS and Sony, to have native 1080p, and some are aiming for 1080p60 as well. However, 60fps isn't being stressed as much.

Consoles sticking at 30fps is not progress, is it?
Didn't video games used to be 60fps when they were 2D? You know, like Mario and all those?

Just to put it into perspective, BF3 uses nearly all the VRAM on my 1GB card at 1024x768 if I use Ultra settings.
Would it use more if I had a 2GB card?
But if I use Low, it only uses just over half.

It won't use more if you had 2GB, because when a game runs out of video memory the fps tanks, and I'm talking like from 60fps to 6fps. Since your screenshots show it running at 66fps, that means that's all the game requires at that resolution/settings. If you increase the resolution, then it may need more VRAM.
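For a rough sense of scale, here's the standard back-of-envelope framebuffer math in Python (a sketch only: the 32-bit color and triple-buffering figures are my assumptions, and render targets are just one slice of VRAM next to textures and geometry):

```python
# Bytes for the color render targets alone:
# width * height * bytes per pixel * number of buffers.
# Assumes 32-bit color and triple buffering (illustrative, not BF3's actual setup).
def framebuffer_mb(width, height, bytes_per_pixel=4, buffers=3):
    return width * height * bytes_per_pixel * buffers / 2**20

print(f"1024x768:  {framebuffer_mb(1024, 768):.1f} MB")   # ~9.0 MB
print(f"1920x1080: {framebuffer_mb(1920, 1080):.1f} MB")  # ~23.7 MB
```

Even so, the jump from 1024x768 to 1080p is roughly 2.6x the render-target footprint, which is why bumping resolution can push a 1GB card over the edge.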

Can someone explain GHz to me? Or rather, the significance of what a GPU or CPU clocks at that makes it important for console specs?

I'm not as tech-savvy as you guys. I base a lot of my assumptions about impressive tech on the number that follows the GPU series (X600 or whatever) and the amount of RAM; 4GB of DDR3 is said to be future-proof.

A GHz is merely a unit of measurement for CPUs and, now, GPUs: it's the number of cycles per second at which the device operates. A hertz is one cycle per second, a kilohertz is 1,000 cycles, a megahertz is 1,000,000 cycles, and a gigahertz is 1,000,000,000 cycles. The higher the clock (also dependent on the actual chip architecture), the faster the chip operates and thus the more computations it can do per second.

Far more important than CPU and GPU clocks, I'd say, is the speed and architecture of the RAM used in conjunction with those chips. It's fine to have a 1GHz CPU or GPU, and it's fine to have decent RAM, but if the architectures don't complement each other and the speeds don't match so that these components work well together, then you're left with a very inefficient hardware setup that programmers and other software developers have to work with.
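To make that concrete, here's a back-of-envelope throughput calculation in Python (the ops-per-cycle figure is a made-up illustration, not any real chip's spec):

```python
# A clock rate is cycles per second; multiply by the work a chip can
# retire each cycle to get a raw (theoretical peak) throughput figure.
HZ_PER_GHZ = 1_000_000_000       # 1 GHz = one billion cycles per second

clock_ghz = 1.0                  # a 1 GHz chip
ops_per_cycle = 4                # hypothetical: 4 operations per cycle
ops_per_second = clock_ghz * HZ_PER_GHZ * ops_per_cycle
print(f"{ops_per_second:.2e} ops/sec")   # 4.00e+09
```

Which is also why GHz alone doesn't tell the whole story: a chip that does more per cycle can beat a higher-clocked one.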

Personally, I think we need less RAM but at higher clock rates, to match the frequencies the chips run at, and we need different RAM and/or chip architectures so that the flow of information between RAM and chips is more efficient. Of course, once this is done, software developers can then work on being efficient with that hardware to provide better experiences for the end user.
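As a rough illustration of why memory speed matters alongside chip clocks, the usual peak-bandwidth formula looks like this (the clock and bus-width figures below are illustrative, not any particular console's specs):

```python
# Peak memory bandwidth: effective memory clock (transfers/sec)
# times bus width (bits), divided by 8 to get bytes.
effective_clock_hz = 5.5e9    # e.g. GDDR5 at a 5.5 GHz effective data rate
bus_width_bits = 256          # width of the memory interface
bandwidth_gb_s = effective_clock_hz * bus_width_bits / 8 / 1e9
print(f"peak: {bandwidth_gb_s:.0f} GB/s")   # 176 GB/s
```

If the chips can consume data faster than the RAM can feed it, the extra clock speed on the chip side is wasted, which is exactly the mismatch described above.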

Can someone explain GHz to me? Or rather, the significance of what a GPU or CPU clocks at that makes it important for console specs?

I'm not as tech-savvy as you guys. I base a lot of my assumptions about impressive tech on the number that follows the GPU series (X600 or whatever) and the amount of RAM; 4GB of DDR4 is said to be future-proof.

Fixed.
That will be future-proof if they want another 10-year lifecycle. But that would be EXTREMELY expensive for a $400 console, so they are probably going with 6-8GB of DDR3.

I think the reason it requires so little RAM to run that demo is that it is nothing but a controlled cutscene, meaning that once anything goes offscreen, it can be removed from the GPU until it is needed again. A good example from the Agni's demo is when they stick the hairy beasts in that cage with some serum. When the camera is up close on the beasts during the serum injection, the detailed hair is rendered, but immediately following that one small scene, the beautifully rendered hair goes missing from then on. It's sleight of hand in a sense, and it is one of the reasons cutscenes always look better than the actual gameplay.
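A toy sketch of that trick, using hypothetical names (AssetCache, prepare_shot) rather than any real engine's API: keep only the assets the current shot needs resident, and evict the rest until they're needed again.

```python
# Toy "VRAM" cache: load what the upcoming shot needs, evict what it doesn't.
class AssetCache:
    def __init__(self):
        self.resident = {}  # asset name -> size in MB, stands in for VRAM

    def prepare_shot(self, needed):
        for name in list(self.resident):      # evict assets the shot won't use
            if name not in needed:
                del self.resident[name]
        for name, size_mb in needed.items():  # load what it will use
            self.resident.setdefault(name, size_mb)

cache = AssetCache()
cache.prepare_shot({"beast_fur_closeup": 300, "cage": 40})  # close-up: fur resident
cache.prepare_shot({"cage": 40, "wide_set": 120})           # next shot: fur evicted
print(sorted(cache.resident))   # ['cage', 'wide_set']
```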

It's a realtime tech demo.

And anyway, that's how game engines work: they often do not render what cannot be seen by the camera/player. There's no point rendering the whole world if parts of it cannot be seen; it's wasted memory and processing time. There will be tools to tell the engine what to render and in what circumstances.
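A minimal sketch of that visibility test, assuming made-up names and a single view-cone check against bounding spheres (a real engine tests all six frustum planes, plus occlusion):

```python
import math

def is_visible(cam_pos, cam_dir, fov_deg, obj_pos, obj_radius):
    """Crude cone test: is the object's bounding sphere within the camera's FOV?"""
    to_obj = [o - c for o, c in zip(obj_pos, cam_pos)]
    dist = math.sqrt(sum(v * v for v in to_obj))
    if dist <= obj_radius:           # camera is inside the object's bounds
        return True
    cos_angle = sum(a * b for a, b in zip(cam_dir, to_obj)) / dist
    angle = math.acos(max(-1.0, min(1.0, cos_angle)))
    half_fov = math.radians(fov_deg) / 2
    # Widen the cone so large objects near the edge aren't wrongly culled.
    return angle <= half_fov + math.asin(min(1.0, obj_radius / dist))

# Only objects that pass the test get submitted to the renderer.
cam = ((0, 0, 0), (0, 0, 1))                     # position, normalized view direction
scene = [((0, 0, 10), 1.0), ((0, 0, -10), 1.0)]  # (position, bounding radius)
visible = [o for o in scene if is_visible(*cam, 90, *o)]
print(len(visible))   # 1 -- the object behind the camera is culled
```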
