Man... makes me hungry for a PS3. I just hope they can keep the speed up on SotC. That game was seriously not meant for PS2 hardware. As for framerate... technically, the human eye has a persistence of vision cap at around 20fps. Movies are all 24fps, and they look great. The reason behind 60fps (fields per second), is because it can use AC power as a regulator, which cycles at 60 times per second. Early CRTs couldn't keep a pixel lit for a whole cycle without looking flickery, which is why they invented interlacing. So 60fps is kind of unnecessary, just one of those standards like QWERTY keyboards, that is there for no other reason than issues with outdated hardware.

The new GT5 trailers are just... unreal. There are no words; the amount of content is hard to believe, and the graphics are insane. I'll only believe this is real once I see it on my own TV. These guys must know something other developers don't. Most games are sub-HD and struggle to maintain a consistent 30fps. Then this game comes along with an unparalleled level of detail, weather and lighting effects, not to mention physics-based damage, and runs at 1080p/60fps like it's nothing. I've lost track of all the features that are in the game. I don't even care anymore; it will take months to get a grasp of it all anyway.

Yamauchi says that the things they weren't able to include will be in GT6. Like what?! What else could they possibly put into this game? Every request I've seen in thousands of posts discussing the game over all these years has been fulfilled, and in a way that completely exceeds expectations.

There's dynamic weather, and there's even a weather forecast that, like in real life, may not be entirely accurate (?!). It will affect temperature, pressure and humidity, which in turn affect the track surface and grip! At one point in the trailer you could see Kaz racing at night, in the rain. You could see the wipers working, the incredible light effects from the headlights, and the insane level of detail inside the cockpit, and then came the snow. Oh my God, the snow!

Everything that was missing in the demo and the previous trailers is here: skid marks, realistic damage, a course generator, dynamic weather, NASCAR. It's unreal.

Both games will be a solid 30fps. Ueda said the games were designed for 30fps and 60 just didn't work, but we can expect that 30 to be rock solid.

Just 20? Are you sure it's that low? In MMOs with the frame rate counter on, it's pretty easy to tell the difference between 20 and 30+ fps (which happens a lot in Lagaran).

As for framerate... technically, the human eye has a persistence of vision cap at around 20fps. Movies are all 24fps, and they look great. The reason behind 60fps (fields per second), is because it can use AC power as a regulator, which cycles at 60 times per second. Early CRTs couldn't keep a pixel lit for a whole cycle without looking flickery, which is why they invented interlacing. So 60fps is kind of unnecessary, just one of those standards like QWERTY keyboards, that is there for no other reason than issues with outdated hardware.

I was doing my best to just ignore this comment because I knew responding to it could not possibly be brief, but then somebody actually went and took it seriously. There is so much wrong in these few sentences that I don't even know where to begin. Heck, sometimes there are multiple factual errors in one sentence!

But I'll give it a shot.

"technically, the human eye has a persistence of vision cap at around 20fps."

Utter nonsense. The human eye does not have a 'persistence of vision cap' at all. It doesn't even see in terms of frames per second (not 'fields per second'... whatever the heck that is) in the first place. However, there does come a point where the brain is not able to distinguish between framerates. That point is nowhere near 20fps; it's roughly 60fps, and for some people it's even higher than that.

"Movies are all 24fps, and they look great."

Movies are 24fps, I'll give you that. This is the result of a standard developed in the 1920s, and frankly there is room for improvement. The only reason movies look good at 24fps is motion blur: when filming something in motion, each frame is not a crisp snapshot of a still scene spliced in with the rest, but a smear of everything that moved during the exposure, and your brain fills in the detail to understand what it's looking at. This happens naturally with film because of the exposure time.

However, motion blur doesn't come for free. You are literally looking at blurry images, and there is an associated loss of detail. For instance, there are times when it's impossible to read text on something moving in a movie, when from the corresponding position in real life you would have been able to. But these effects are subtle and most people don't notice them. There have been actual studies showing people motion-heavy footage recorded at higher framerates, and yes, they absolutely can tell the difference in a side-by-side comparison. An interesting point, though, is that people have become so used to looking at 24fps movies that when shown something at a higher framerate they will often comment that it looks 'fake' or 'cheap', despite the fact that they're seeing more detail. 24fps is how people think movies are supposed to look.

BTW, it is possible to do motion blur in video games, but the cheapest approach is to interpolate between frames, which is still extra work and doesn't look as good as just running at a higher framerate in the first place. That's why nobody bothers.
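To make the film point above concrete, here's a toy sketch of the accumulation idea behind exposure-time motion blur: render several samples spread across one "exposure" and average them. The 1-D scanline and the moving square are purely illustrative, not anything from a real engine.

```python
def render_square(x, width=64, size=8):
    """Render a white square at position x on a black 1-D scanline (values 0.0-1.0)."""
    return [1.0 if x <= i < x + size else 0.0 for i in range(width)]

def motion_blurred(x_start, x_end, samples=8):
    """Film-style blur: average several renders taken during one 'exposure'."""
    frames = []
    for s in range(samples):
        # sample positions spaced evenly across the exposure window
        x = x_start + (x_end - x_start) * s / (samples - 1)
        frames.append(render_square(int(x)))
    # per-pixel average of all sub-frame samples
    return [sum(col) / samples for col in zip(*frames)]

sharp = render_square(10)          # instantaneous sample: hard 0/1 edges
blurred = motion_blurred(10, 26)   # square swept 16px during the exposure
# pixels the square only passed through get partial intensity, not 0 or 1
```

The blurred result has soft edges exactly where the square was only present for part of the exposure, which is the detail loss (and the smoothness) the post is describing.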

"The reason behind 60fps (fields per second), is because it can use AC power as a regulator, which cycles at 60 times per second."

Complete and utter nonsense. AC power has nothing to do with framerates. Most electronics convert to DC power anyway, and even setting that aside, framerate != refresh rate.

"Early CRTs couldn't keep a pixel lit for a whole cycle without looking flickery, which is why they invented interlacing."

First of all, interlacing is a total red herring here. Second, you are again confusing framerate and refresh rate. A CRT with a refresh rate of 60Hz absolutely does look flickery. How much it bothers you depends on the person, but I can't even look at a 60Hz screen; it hurts my eyes. That's why most (if not all...) CRTs run at more than 60Hz. The refresh rate refers only to how often the electron beam rescans the screen. A CRT running at 120Hz displaying 60fps content would refresh twice for each frame. That kind of thing is completely normal for a CRT.
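The refresh-rate vs. framerate distinction above is just arithmetic; a minimal sketch (Python, numbers purely illustrative):

```python
def refreshes_per_frame(refresh_hz: float, fps: float) -> float:
    """How many times the display re-draws each rendered frame."""
    return refresh_hz / fps

# A 120Hz CRT fed 60fps content redraws every frame twice -- the frame
# rate never changed, only how often the phosphors get re-excited:
print(refreshes_per_frame(120, 60))   # 2.0

# 24fps film on a 60Hz display divides unevenly (2.5 refreshes per frame),
# which is why NTSC television historically used 3:2 pulldown to fit it:
print(refreshes_per_frame(60, 24))    # 2.5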

"So 60fps is kind of unnecessary, just one of those standards like QWERTY keyboards, that is there for no other reason than issues with outdated hardware."

Given that everything you said to support this statement is nonsense, I guess it shouldn't be a surprise that your conclusion is nonsense.

I love Final Fantasy VI more than my dog, and it has a 3-frame walking animation and unanimated enemies... never mind, I still appreciate a good-looking game from time to time. More importantly, it'll be nice to play a smooth (as opposed to laggy; it's true, SotC was not meant for PS2) game.

A consistent framerate is more important than simply going for a high one. Two Worlds is choppy as fuck and the demo gave me a headache, while I wouldn't get that from N64 games that typically ran at 15 FPS. But no, my point there is that it's only useless to go to 60 if you cannot see a difference, and that is outright wrong. You can more capably argue that going above 60 is useless beyond insurance for a stable framerate; you're not going to see a difference, and it only leads to abuse (somehow) in games like Quake III Arena.

Not only can you see the difference, you can also feel it, because of control response. Since most games are 30fps, when I play something that runs at 60, like Wipeout HD or Flower, I can tell the difference right away. Those games are so damn smooth and responsive that I really can't imagine them without that framerate; they need it to be the games that they are. Latency is reduced and the game looks smoother. In an ideal world every game would run at that framerate, but it's understandable: 30fps works well enough for most games, and twice the time to render each frame means better graphics. Recently Insomniac said they were going to stop making games at 60fps because framerate doesn't sell a game, graphics do. That's why it blows my mind that GT5 manages to run at 60fps, and in full HD no less. They managed to create the best-looking console game without sacrificing either framerate or resolution. It's insane.
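The responsiveness point above is easy to quantify: the time between frames is the reciprocal of the framerate, and that interval is (roughly) how long a new input can wait before it shows up on screen. A minimal illustration (Python; real input latency also includes display and input-pipeline delays on top of this):

```python
def frame_time_ms(fps: float) -> float:
    """Time budget per frame; also the granularity at which input is reflected."""
    return 1000.0 / fps

print(round(frame_time_ms(30), 1))  # 33.3 -- at 30fps your input can wait
                                    # up to ~33ms for the next frame
print(round(frame_time_ms(60), 1))  # 16.7 -- half the wait, hence 'snappier'
```

The flip side, as the post notes, is that 30fps gives the renderer twice the time budget per frame, which is where the better graphics come from.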

Unlike most of you, I have zero interest in an updated Ico/SotC. I loved those games, but have already moved on. There are way too many games that I have yet to play this gen (Bayonetta, Uncharted 2, Darksiders, etc. etc. etc.). No way I'm playing something a second time around. Plus we have next month, with Lords of Shadow and Vanquish coming out. Ugh. I never got the Gran Turismo love. I have no interest in realistic racing games. I'm all about the Wipeout and F-Zero type racers.