The flash video is not 60fps; you have to download the AVI file in order to see the differences. If you judge it off the flash video, you are not getting an accurate representation of 60fps vs 24fps. As a matter of fact, I would argue that even the 60fps AVI is not truly a good representation, because there is a huge difference between 24fps and 60fps in a game in real life. 60fps looks silky smooth and usually fast, whereas 24fps looks jerky and slower. Now I'll admit that I don't know about motion blur, and I do tend to like the effect. But the fact remains that 60fps is better than 24fps. Please guys, this is not a 360/PS3 thing. This is a reality thing, and fact is fact.

...I don't care what the frame rate is as long as there are no frame rate issues and it looks smooth. If that means 30, 40 or 60 frames per second, then I don't care. To be honest, they should stop listing frame rates as a selling point unless it's a sustained frame rate at that level for the entire game... not just when things aren't too busy on screen.

There's even a debate going on between devs about whether 30fps is more cinematic and actually better suited to cinematic games, since films are shot at 24fps. The new Sony TVs will also support a 24Hz refresh mode, so it's more cinematic.

I feel that 60fps has its place in racing games and the like, but if you're making a cinematic game... I actually think that 30fps may be preferable.

Has probably got at least 10 comments in their post history saying the same thing, or close to it, in regards to what #10 mentioned. It's nice to see people understanding that this subject is used as a fanboy tool.

I got up excited about the Elite, waiting for the game shop to open, and I was going to burn some time playing some vids when my 360's drive stopped working, lol. So I went and bought the Elite and then called up MS, b*tching, and said: look, you sold me this POS after I was defending your console on forums/game sites. I told them, you better fix this sh*t and I better not have any more problems. I requested a new one altogether, and the dude said "I'll do what I can." I guess I won't be playing games if my Elite breaks along with my new/fixed Premium when I get it back, 'cause I dislike Sony.

Nah! The 360 line-up and future are too bright. The Lakers lost and my 360 broke, and I thought today would be a good day. No ring of death, though! lol

The eject/play bar says "Unplayable disc" with every game I own, and with CDs too, and I don't hear the hum of the drive spinning because it's not moving.

When I see this, I see lots of variables, like different kinds of TVs and monitors, or what kind of hardware they're using, such as graphics cards, memory, processors, or even consoles. Any number of things can affect framerate, and PCs are the hardest of them all to tweak. Also take into account what kind of tools are available to devs, how those tools allow shortcuts, and whether those devs rely on those shortcuts too much. Personally I don't care for motion blur; it gives me a headache. I find that the higher the framerate, the more crisp and clear the image is (especially in games that have large outdoor maps).

The best part is that the whole point of this video is to try and show you that 24fps looks like crap "WITH NO MOTION BLUR"...but if you watch the first part of the vid, 24fps with motion blur absolutely shames 60fps without. Well done, guys. Your video's conclusion missed its most important point.

With motion blur, most people can barely tell the difference at anything above 20 or so fps (even down to 12 or so, depending on the person, if I remember my research correctly). Without motion blur, the human eye can easily see differences past 120fps and even beyond 200fps.

Look at the first part of the video, pretend it's saying "30fps" and "60fps" instead of 24 and 60, and you have the whole console FPS argument in a nutshell. 30fps done right, by conserving the system's power for better motion effects and so on, blows 60fps out of the water (if going for 60fps means sacrificing those effects).

OK, first of all, how many frames per second is this video clip running at? Because if it's running at 24fps (which I would expect), then you couldn't possibly see a difference between 60fps and 24fps, because the 60fps footage gets resampled down to 24fps anyway. That's why the only difference you can notice is when motion blur is added.
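The point above can be sketched numerically. This is a minimal, assumed model (the function `surviving_frames` is hypothetical, not from any real video tool): when 60fps footage is put into a 24fps container, each output frame simply grabs the nearest source frame in time, so only 24 of the original 60 frames per second ever reach the viewer.

```python
# Hypothetical sketch of nearest-frame resampling from 60fps to 24fps.
def surviving_frames(src_fps=60, dst_fps=24, seconds=1):
    """For each output frame, pick the nearest source frame in time."""
    shown = []
    for j in range(int(dst_fps * seconds)):
        t = j / dst_fps                   # display time of output frame j
        shown.append(round(t * src_fps))  # nearest 60fps source frame index
    return shown

frames = surviving_frames()
print(len(frames))       # 24 output frames per second...
print(len(set(frames)))  # ...drawn from only 24 distinct source frames
```

So 36 of every 60 source frames are discarded outright, which is why the 60fps and 24fps clips look identical once both are played back inside a 24fps video.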

You have to see it on a fast-refresh display, like a traditional SDTV (CRT) or an OLED... even the fastest-refreshing LCD is not good enough compared to an SDTV; that's the LCD's weakness. Plasma is better in refresh rate.

Of course higher frame rates are better, and it has nothing to do with 24fps for a movie; that comparison is misleading. One is an interactive experience that is changing constantly, and these changes need to be updated, the faster and smoother the better. A movie is just a series of frames run one after the other with no interactive content, so 24fps is adequate. You all must have played a game at 60fps and another at 20 or so, and was there a difference? Finally, real life doesn't have any frames per second. That may seem a silly point to make, but it's similar to the idea of digital sound being superior: there is no such thing as digital sound, since all sound is analogue in nature. That's my take on it.