Yukon Cornelius:Why am I suddenly reminded of complaints about HDTV from several years ago, and how the pixels are too small for your eyes to even differentiate when you're sitting at a reasonable distance from the TV?

I don't know, is it because no one ever said that?

HD was a natural progression. The first time you see an HD signal, it's simply amazing. You can't believe you ever got along without it. This HFR isn't like that at all. It just makes film look like video. That's it. No big magic behind it.

...so why is there even a debate? Wouldn't it be a non-issue if people couldn't tell the difference? The debate wouldn't be "wait everyone, no one can tell the difference", it would be "Peter Jackson tried to use this 48fps thing, but it looks the same".

Medic Zero:Lydia_C: Fish in a Barrel: I saw it this morning. It was awesome. In 24FPS it would have been a blurry mess.

I understand that some people like the stately feel of 24FPS. You'll get over it.

I went to a midnight showing on Thursday of 24FPS. It was not at all a "blurry mess." What it was was too damn long for the amount of storyline covered.

The main reason I chose not to see 48FPS is the fact that it is only showing in 3D, and I'd heard enough reports of people feeling motion sick to know that hyperrealism + 3D was liable to make me sick too. I'd be willing to at least try 48FPS if they got rid of the damned 3D.

/ last movie I saw in 3D was "Allan Quatermain and the Lost City of Gold"// you'll have a hard time convincing me that any movie in 3D is worth seeing

gwowen:But the only people whose brains process FPS input as if it were reality are psychopaths. All FPS look fundamentally unrealistic, so the brain doesn't give a flying freak about the disjunction between the smoothness of the motion, the artificiality of the perspective, and the fact that everything is simultaneously in focus (because there's no genuine depth, so no depth-of-field). So yes, FPS are different - they already look fake, so high frame rates don't make them look any faker.

It's not just that -- videogames are fundamentally different from movies here. In a videogame -- especially a fast-paced, competitive videogame like an FPS -- the player wants the game world to respond to controller inputs as quickly as possible. But the vast majority of videogame engines work by saying "Check controller inputs, move things around, draw a frame, check controller inputs, move things around, draw a frame, etc." That means that the player's control over the action is constrained by the framerate; if a game is only rendering at 20fps, then it's going to take a 20th of a second for your controller inputs to take effect. So someone who's playing a game at 60fps has a significantly different gameplay experience -- and a competitive advantage -- over someone playing the same game at 20fps.
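The loop described above can be sketched in a few lines. This is purely illustrative Python; the function names (`poll_input`, `update_world`, `draw_frame`) are made up for the sketch, not from any real engine:

```python
import time

def run_game_loop(target_fps, num_frames, poll_input, update_world, draw_frame):
    """Minimal fixed-rate game loop of the kind described above.

    Input is only sampled once per frame, so worst-case input latency
    is roughly one frame interval: 1/20 s at 20fps, 1/60 s at 60fps.
    """
    frame_interval = 1.0 / target_fps
    for _ in range(num_frames):
        start = time.perf_counter()
        inputs = poll_input()        # controller state is read here...
        update_world(inputs)         # ...only takes effect here,
        draw_frame()                 # ...and only becomes visible here.
        # Sleep off whatever time is left in this frame's budget.
        elapsed = time.perf_counter() - start
        if elapsed < frame_interval:
            time.sleep(frame_interval - elapsed)

# At 20fps a button press can sit unread for up to 50 ms before the loop
# polls it again; at 60fps that window shrinks to about 16.7 ms.
print(f"20fps worst-case input delay: {1000/20:.1f} ms")
print(f"60fps worst-case input delay: {1000/60:.1f} ms")
```

That single poll-update-draw cycle is why framerate and responsiveness are coupled in games in a way they simply aren't for a passive movie audience.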

Movies are very different; you're just watching an experience, not trying to line up a headshot.

BumpInTheNight:So is this why the consoles tend to only render at sub 30 FPS but PCs consistently display at 60 or higher?

No, it's because your console is a mid-range computer from 2005, and your PC is a [low-end/mid-range/high-end] computer from [year >2005]. Even the Black Friday refugee computers still have more processing power than the average console these days.

The whole "backlash" is a bunch of bullshiat. I saw it at 48fps and wouldn't have known there was anything unique about it if it hadn't been for all the HFR hype. It certainly didn't detract from the moviegoing experience. I never had the feeling that it was "too real". How could that even be a genuine complaint? Isn't that the whole point of a movie? To lose yourself in the story? "I liked it but it would have been better if the image jumped and jittered more."

Glitchwerks:All that said, I'd be curious if the people who enjoy 48 FPS are gamers or otherwise tech-savvy and if the people who find it bad are otherwise.

48 FPS would be fine for a lot of stuff. The problem for PC is that most computer monitors want to display at 30 or 60, and tend to tear and stutter outside of those numbers. Even 55 is noticeably jerky compared to 60.
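A rough way to see why 55fps reads as jerky on a 60Hz panel (a hypothetical sketch, assuming simple vsync with no tearing): the rates don't divide evenly, so the display has to hold some frames for an extra refresh, producing an uneven cadence.

```python
from collections import Counter

def display_cadence(content_fps, display_hz):
    """Map one second of content frames onto display refreshes under
    vsync: each refresh shows whichever content frame is current.
    Returns how many frames were held for 1 refresh, 2 refreshes, etc."""
    shown = [int(r * content_fps / display_hz) for r in range(display_hz)]
    return Counter(Counter(shown).values())

# 55fps on a 60Hz panel: 50 frames get one refresh, but 5 frames per
# second are held for two -- an uneven, visible stutter.
print(display_cadence(55, 60))
# 30fps divides 60Hz evenly: every frame is held for exactly 2 refreshes.
print(display_cadence(30, 60))
```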

"Dr. Hameroff's theory has to do with the synchrony of the gamma waves in the brain - it's called gamma synchrony - the brain wave cycle of 40 hertz. There's a very strong theory that that is why we perceive 40 moments per second, but regardless of the reason, most researchers agree we perceive 40 conscious moments per second. In other words: our eyes see more than that but we're only aware of 40. So if a frame rate hits or exceeds 40 fps, it looks to us like reality. Whereas if it's significantly below that, like 24 fps or even 30 fps, there's a separation, there's a difference - and we know immediately that what we're watching is not real."

This has nothing to do with the uncanny valley, and is not the only factor in motion blur. The shutter angle used when shooting most of Skyfall corresponds to an exposure time of 1/50 s, but nobody complained about reduced motion blur or looking "cheap".
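The arithmetic behind that 1/50 s figure is the standard rotary-shutter formula, sketched here:

```python
def exposure_time(fps, shutter_angle_deg):
    """Exposure time per frame for a rotary shutter.

    The shutter is open for shutter_angle/360 of each frame interval,
    so exposure = (shutter_angle / 360) / fps.
    """
    return (shutter_angle_deg / 360.0) / fps

# The classic cinema look: a 180-degree shutter at 24fps -> 1/48 s.
print(f"24fps @ 180 deg: 1/{1 / exposure_time(24, 180):.0f} s")
# A 172.8-degree shutter at 24fps gives exactly 1/50 s, consistent with
# the Skyfall figure quoted above.
print(f"24fps @ 172.8 deg: 1/{1 / exposure_time(24, 172.8):.0f} s")
```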

If nobody had known about the change in frame rate, most people wouldn't have noticed anything, whereas a rapid pan across a picket fence is always jarring at 24fps.

Lydia_C:The main reason I chose not to see 48FPS is the fact that it is only showing in 3D, and I'd heard enough reports of people feeling motion sick to know that hyperrealism + 3D was liable to make me sick too. I'd be willing to at least try 48FPS if they got rid of the damned 3D.

I don't know who these people are, but they should probably see a doctor.

If anything the HFR is easier to watch, and 3D is how you see life every day. Saw the Hobbit last night, and if nobody had told me it was HFR I just would have been surprised at how nice it looked and how I could finally see what's going on when the scene is dark in a 3D movie.

TDBoedy:Medic Zero: Lydia_C: Fish in a Barrel: I saw it this morning. It was awesome. In 24FPS it would have been a blurry mess.

I understand that some people like the stately feel of 24FPS. You'll get over it.

I went to a midnight showing on Thursday of 24FPS. It was not at all a "blurry mess." What it was was too damn long for the amount of storyline covered.

The main reason I chose not to see 48FPS is the fact that it is only showing in 3D, and I'd heard enough reports of people feeling motion sick to know that hyperrealism + 3D was liable to make me sick too. I'd be willing to at least try 48FPS if they got rid of the damned 3D.

/ last movie I saw in 3D was "Allan Quatermain and the Lost City of Gold"// you'll have a hard time convincing me that any movie in 3D is worth seeing

I enjoyed Prometheus in 3D, but it was IMAX too.

Dredd 3D was actually glorious in 3-D.

True! For some reason I forgot about that. Quite good all around actually.

2/3rds of the critics and 4/5ths of the audience on rottentomatoes.com liked it. I'm seeing it tomorrow, so I can't speak first-hand, but I don't think a minority of people not liking the movie says anything one way or another about Peter Jackson.

If anybody you know has a problem differentiating movies, video games or TV shows from reality you should keep an eye on that person; they're possibly dangerous to themselves or others.

If anything over 40fps is thrown out by your nervous system, then the extra frames are simply thrown out; they can't be what's bothering people. There may well be other problems, such as odd depth of field, or the false 3D effect, that are causing viewers trouble.

Came here to say this. When I stream video from my Xbox to my HDTV it is at 60 Hz progressive scan, and shows with great cinematography like Game of Thrones, Breaking Bad, Boardwalk Empire, and Mad Men look just as good as a theatrical feature film. They in no way look like a soap opera, despite Mad Men basically being a soap for middle-aged men. I don't know if they film at 24p and then convert it to 60p or 60i for broadcast, and that's why it looks like it does, but I never get that unreal feeling from content that likely has been shot for 60p. Hell, even season 5 onwards of Doctor Who is starting to look like a feature film in terms of the cinematography with their new HD cameras.

I don't care about FPS. I am not interested in a real or a fake experience when I go to the movies. SURPRISE I ALREADY KNOW IT IS FAKE.

What they need to do is stop making stuff in 3D. It is irrelevant for 99% of the movie, and the 1% is forced stuff because hey it is 3D, where suddenly you are looking at it from a 1st person point of view when the rest of the movie is 3rd person. Here have one or two cool effects of something being thrown at the camera that completely disrupts the flow of the movie.

At this point 3D is too much work to do well through a whole movie so it ends up detracting from the movie. The best movie I have seen 3D in? Harry Potter. I hate those farking movies. But Harry Potter did it right, they decided since they couldn't do 3D well through the whole movie, they would just make the final fight scene 3D, but do it well. As it turns out 2-5 minutes of well done 3D is vastly superior to 2+ hours of crappy 3D.

It takes too much time and costs too much money to do 3D in a manner that looks good through an entire movie, though. I don't care if a movie feels fake or feels real; I would rather have a good story, good acting, good general shooting, and no farking 3D unless they take the time to do it right. 24fps, 48fps, who cares? You know how many of the really good movies I have seen where I thought, "Man, if only this was shot at a higher frame rate"? None.

StoPPeRmobile:BumpInTheNight: So is this why the consoles tend to only render at sub 30 FPS but PCs consistently display at 60 or higher?

A solid 60 is like glass to me. 30 sucks and 45ish works.

/Will pwn you.

The difference is that film is a constant frame rate, while PC game frame rates are averages of rendered frames. So if you're seeing 30fps in an FPS, you're likely dipping to sub-24fps under heavy rendering, which is noticeable. At a 60fps average, you're far enough above that threshold that you won't see any choppiness.
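The average-vs-dips point can be made concrete with a toy calculation (the frame times here are hypothetical, not from any real capture):

```python
def fps_stats(frame_times_ms):
    """Average fps vs. the fps implied by the slowest frame.

    A game can average above 30fps overall while individual frames dip
    far lower -- and it's the dips you notice, not the average.
    """
    avg_ms = sum(frame_times_ms) / len(frame_times_ms)
    worst_ms = max(frame_times_ms)
    return 1000.0 / avg_ms, 1000.0 / worst_ms

# Hypothetical capture: mostly ~30 ms frames with a few heavy 60 ms spikes.
frames = [30.0] * 95 + [60.0] * 5
avg_fps, worst_fps = fps_stats(frames)
print(f"average: {avg_fps:.1f} fps, worst frame: {worst_fps:.1f} fps")
# -> average: 31.7 fps, worst frame: 16.7 fps
```

A projector running film at a locked 24fps never has those spikes, which is one reason a constant 24 and an "average 30" feel so different.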

So much stupid in here it burns. It's like listening to a bunch of 3rd grade boys explain the intricacies of how women's parts work. You folks have so much technical knowledge it's like someone combined an Apple Store Genius™ with a Best Buy sales associate.

There is so much science to perception, but have no fear the neckbeards are here to help!!! See, they can turn their vsync on and off on their video card, they study frame rates when playing WoW and also have a high def lcd TV so... they obviously know wtf is up.

Great article, I've wondered what specifically it was that bothered me about it but I just couldn't put my finger on it. It definitely breaks that suspension of disbelief, the comfort zone.

Looks great for the Olympics, sports, news, anything that is supposed to be real. But just looks oddly asinine for something that is supposed to be fantastic and abstract.

kingoomieiii:The problem for PC is that most computer monitors want to display at 30 or 60

That's not really true anymore. The actual LCD pixel refresh rate might be as low as maybe 30-some Hz on the low end and rarely is higher than 60 Hz on computer-oriented systems. Faster LCD displays exist but mostly are reserved for video-oriented systems (i.e. TVs) where they're trying to do frame rate matching with recorded content instead of with generated content.

/ Don't confuse the pixel refresh rate on the panel with the backlight flash rate; they are uncorrelated// Also don't confuse the image refresh rate on the computer->monitor video link with the actual panel refresh rate

Came here to say this. When I stream video from my Xbox to my HDTV it is at 60 Hz progressive scan, and shows with great cinematography like Game of Thrones, Breaking Bad, Boardwalk Empire, and Mad Men look just as good as a theatrical feature film. They in no way look like a soap opera, despite Mad Men basically being a soap for middle-aged men. I don't know if they film at 24p and then convert it to 60p or 60i for broadcast, and that's why it looks like it does, but I never get that unreal feeling from content that likely has been shot for 60p. Hell, even season 5 onwards of Doctor Who is starting to look like a feature film in terms of the cinematography with their new HD cameras.

Actually, doing a little more digging, it looks like most of the shows were shot with 24p digital cameras and then converted to 60p or 60i for broadcast. Recent seasons of Doctor Who, for example, have been using a Sony F35 (which does go up to 50 fps, but I'm betting they saved that mode for slo-mo scenes), so I'm betting they were probably shooting 24/25p.

Does anyone know for sure what they do for high-end TV productions? Now that I think about it, my Mad Men Blu-Rays are in 24p, so I'm betting it is probably pretty common to use that format natively and then convert.
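For the 24p-to-60Hz conversion mentioned above, the standard technique is 3:2 pulldown: alternate source frames are held for three and then two output ticks, so 24 film frames fill 60 ticks per second. A minimal sketch:

```python
def pulldown_32(source_frames):
    """Expand 24p frames to a 60Hz stream with a 3:2 repeat cadence.

    Alternate source frames are shown for 3 and then 2 output ticks:
    24 input frames * (3 + 2) / 2 = 60 output ticks per second.
    """
    out = []
    for i, frame in enumerate(source_frames):
        out.extend([frame] * (3 if i % 2 == 0 else 2))
    return out

# Four film frames A,B,C,D become ten 60Hz ticks: AAABBCCCDD.
print("".join(pulldown_32("ABCD")))  # -> AAABBCCCDD
```

The uneven hold times are what give pulled-down 24p its slightly lurching motion on a 60Hz set, distinct from material actually shot at 60 fields or frames per second.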

MurphyMurphy:Great article, I've wondered what specifically it was that bothered me about it but I just couldn't put my finger on it. It definitely breaks that suspension of disbelief, the comfort zone.

Which is a perfectly valid point, but that's about training, not the actual limitations of perception. Once you get used to the higher frame rate you can ignore the non-reality of it just like you did before.

Johnson:Douglas Trumbull developed SHOWSCAN (65mm film projected at 60fps) after doing lots of testing. I managed to see a demo of it at his company headquarters by accident. I was in town to visit someone in summer 1987 and looked it up on a lark and simply showed up to see about "the demo". Turns out there was a PRIVATE demonstration that day for a dozen people and they thought I was one of them.

It was really amazing to watch, but it was less than 10 minutes long and has really only been used for interactive rides. Trumbull envisioned it being used for feature-length films as well; perhaps we can tolerate it for a short time, but over 2 hours it gets annoying?

still waiting on that address there Jon Lovitz..

wasn't able to Google it were you?

here's how i know you are makin this up:

"looked it up on a lark and simply showed up to see about "the demo".

and the dooooozie

"and has really only been used for interactive rides."

did you use the yellow pages?

also, the Showscan technique was in development well before '87.

- so where did you "look it up"?
- where and when did you hear about the private and very proprietary Showscan system? In Starlog? Years after it was already in use?
- what else was Showscan used for before it was used on rides?

Without the "blur" of movement, you end up with a rapid series of stop-frame images, where the action will look very non-fluid. Yes, when thrown, a football travels at over 60 feet per second, but if you image it perfectly at 60 fps, you freeze it at each point in its travel. When you then show the series in real time, the football will appear to jump from one position of perfect focus to the next, and it will look unnatural and very artificial.

Like it or not, your brain WANTS the image to imitate your normal continuous stream of data input, and that includes the inability of your visual system to focus clearly on BOTH the moving football and the stationary receiver. One of them will be blurred.

Um, whether you see a sequence of frozen images or motion blur has nothing to do with framerate and everything to do with shutter speed. If the shutter is on 1/1000, anything less than 1000fps will give you a sequence of discontinuous images instead of proper blur.
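One way to see the shutter-speed point: with a fast shutter, most of each frame interval is simply never recorded, and that unrecorded gap is what strobes. A toy calculation, reusing the football numbers from the post above:

```python
def motion_gap(fps, shutter_s, speed_units_per_s):
    """How far an object moves while the shutter is CLOSED each frame.

    The shutter is open for shutter_s out of each 1/fps interval, so
    the remaining (1/fps - shutter_s) of motion is never captured --
    that gap is what reads as discontinuous 'strobing'.
    """
    frame_interval = 1.0 / fps
    gap_time = frame_interval - shutter_s
    return speed_units_per_s * gap_time

# A football moving 60 ft/s, filmed at 60fps:
# with a 1/1000 s shutter, ~0.94 ft of travel per frame goes unrecorded;
fast = motion_gap(60, 1 / 1000, 60.0)
# with a 1/120 s shutter, only 0.5 ft is skipped and blur fills the rest.
slow = motion_gap(60, 1 / 120, 60.0)
print(f"1/1000 shutter gap: {fast:.2f} ft; 1/120 shutter gap: {slow:.2f} ft")
```

Same framerate, very different look, which is exactly why blaming the framerate alone for "strobing" gets the mechanism wrong.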

/Bring on the high framerates//Only two movies should be 3D: Tron and Tron Legacy.

Let me guess: another person who thinks that because our neurological system can only recognize ~60 individual images per second, we can't "see" or process higher frame rates? That completely ignores that our brain is really, really good at filling in and smoothing visual movement, which makes the number of individual images we can process per second a meaningless figure when it comes to artificially created moving images.