An anonymous reader writes "Last year, when we discussed news that The Hobbit would be filmed at 48 frames per second, instead of the standard 24, many were skeptical that the format would take hold. Now that the film has been released, an article at Slate concedes that it's a bit awkward and takes a while to get used to, but ends up being to the benefit of the film and the entire industry as well. 'The 48 fps version of The Hobbit is weird, that's true. It's distracting as hell, yes yes yes. Yet it's also something that you've never seen before, and is, in its way, amazing. Taken all together, and without the prejudice of film-buffery, Jackson's experiment is not a flop. It's a strange, unsettling success. ... It does not mark the imposition from on high of a newer, better standard — one frame rate to rule them all (and in the darkness bind them). It's more like a shift away from standards altogether. With the digital projection systems now in place, filmmakers can choose the frame rate that makes most sense for them, from one project to the next.'"

I was just wondering if anyone else would mention ShowScan, amid all the claims of "first time such a high frame rate film has been produced... blah blah blah..." when the claim really should be "finally, something almost as good as what was available 40+ years ago."

24fps has always bothered me whenever an object or person moves across the screen quickly. Even the small increase to 30fps is a significant improvement to my eyes. 72fps seems like a good goal, though I probably won't complain about 48.

I think those in the "24fps is magic" camp have a lot in common with the "vinyl is better" and "tubes are better" bunch. They either like their content distorted by their medium of choice or just like the idea of using archaic technology. There's certainly nothing wrong with either of those things, but the old ways are not "better" for everyone else.

24 fps from a high-speed shutter camera (usually digital these days) can be disturbing. 24 fps with a low-speed shutter (older analog cameras), where there is motion blur, is OK; the motion blur approaches what we see with the naked eye.

24 fps from a video game, which is a sequence of stills, typically without motion blur as it requires more CPU time, is awful.

Assuming Jackson used digital cameras, 48 fps should be an improvement.

24 fps from a high-speed shutter camera (usually digital these days) can be disturbing.

What camera man in his right mind would shoot faster than 1/24s shutter speed and then display it at 24 FPS? This only happens when taking a live broadcast at a high frame rate and insta-converting to 24 FPS like they do (or did) with some awards shows. It would never happen in a movie.

Have you seen the movie yet? Reserve judgment until you do. My wife went into the viewing not really comprehending what "HFR" meant. About 30 seconds into the movie she leaned over and whispered, "Is the entire movie going to be like this???" and later, "It looks like a video game."

I was pretty well mentally prepared for the frame rate difference, so I was able to enjoy it as a spectacle if nothing else. But it added nothing of value to the movie itself, and speaking honestly the movie did lose something in the transition. It ceased to feel like a movie. It felt like an extremely high definition live broadcast.

I would like someone to explain to me, where is the inherent benefit with 48 FPS? Sure it's nice for directors who would love to shoot faster-moving pans, but how exactly does it make things nicer for the viewer? Do people complain of headaches when they watch movies? Seriously, where is the improvement?

I'm not aware of broadcasts in 50 FPS. AFAIK, they're being evaluated, but basically material is broadcast at 25 or 30 fps, depending on the standard used. These conform to the old PAL/NTSC/SECAM framerates. Interlaced formats, however, can be 50 or 60, but that's because each frame is essentially split into two frames of alternating horizontal lines, "fields".

I guess that depends on how you define it. On interlaced broadcast (like all old TV), you get a half-frame every 1/50 of a second, where half-frame means either the even or the odd lines, alternating. However, in true interlaced broadcast (i.e. where the material was already recorded in that format, not transformed into it as when putting a movie to TV), it's not that you get the even and odd lines of the same image; each half-frame is recorded at its own time. So say you've got 50 half-frames per second: you'll get e.g. the even lines of the image at 0ms, then the odd lines of the image at 20ms, then the even lines of the image at 40ms, then the odd lines of the image at 60ms, and so on. Only with converted material will the even and odd lines be from the same image.

You can see that quite nicely when capturing a true interlaced-recorded TV program on the computer, where two half-frames are combined into a frame. If there's fast movement in the scene, you'll get striped frames because your "frame" is actually the combination of two images at different times, with the even- and odd-line images separated by 20 ms (50Hz) or 16.7 ms (60Hz). Given that those images are recorded at different times, I'd say it makes sense to consider them different frames which are recorded at half the vertical resolution with a displacement of one line every second frame.
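For anyone who hasn't captured interlaced material themselves, here is a minimal sketch of the "weave" combination described above (the field sizes and the 8-pixel motion are invented for illustration; a real capture card does this in hardware):

    # Hedged sketch: weaving two fields recorded 20 ms apart, the way a naive
    # capture program combines them into one frame.
    import numpy as np

    def weave(field_even, field_odd):
        """Interleave two half-height fields into one full-height frame.

        If the fields were recorded at different instants and the scene moved
        in between, the result shows the striped "combing" described above.
        """
        h, w = field_even.shape
        frame = np.empty((2 * h, w), dtype=field_even.dtype)
        frame[0::2, :] = field_even   # even scan lines: 0, 2, 4, ...
        frame[1::2, :] = field_odd    # odd scan lines: 1, 3, 5, ...
        return frame

    # Toy example: a bright vertical bar that moves 8 pixels between fields.
    even = np.zeros((288, 720), dtype=np.uint8)
    odd = np.zeros((288, 720), dtype=np.uint8)
    even[:, 100:110] = 255     # bar position at t = 0 ms
    odd[:, 108:118] = 255      # bar position at t = 20 ms (50 Hz field rate)
    combed = weave(even, odd)  # alternating lines disagree where the bar moved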

Interlacing was a wonderful thing in the analog days. TV would have looked (literally) half as good without it. But those days are past: it is time to let interlacing die. It just gets in the way now and complicates things needlessly.

But the reason for using fields is so that the effective frame rate perceived is 60 frames per second, just at half the vertical resolution. Each field contains a different image than the field before it. So we're used to relatively fast frame rates. If people find 48 FPS distracting, the reason is probably either a psychological "uncanny valley" thing—because it feels almost like TV but not quite—or perhaps it just happens to be a magic speed that makes people uncomfortable for some reason.

I'm not aware of broadcasts in 50 FPS. AFAIK, they're being evaluated, but basically material is broadcast at 25 or 30 fps, depending on the standard used. These conform to the old PAL/NTSC/SECAM framerates. Interlaced formats, however, can be 50 or 60, but that's because each frame is essentially split into two frames of alternating horizontal lines, "fields".

720p60 is a common broadcast format and a few European broadcasters do 720p50 (presumably to ease upscaling 25 FPS SD content). It seems 1080i50 is more popular over there though (annoyingly, I despise interlacing and would much rather have seen 1080p30 and 1080p25 become the broadcast standards rather than their crappy interlaced counterparts).

I have designed and produced many corporate events in my life, most of them involving video animations on screens as large as 60 feet wide. As soon as technology allowed me, I produced and projected video animations in 60fps to make pans more fluid. Were I producing a movie today, I would try to shoot in 60fps for the same reason: much more fluid motion on big screens.

As soon as technology allowed me, I produced and projected video animations in 60fps to make pans more fluid. Were I producing a movie today, I would try to shoot in 60fps for the same reason: much more fluid motion on big screens.

If Jackson had chosen 60fps for The Hobbit, it would have been a much better choice, at least as far as home video is concerned.

With a choice of 48fps as the source, we are going to get stuck with a much lower quality home video release, because there is no current format that allows at least 48fps at 1920x1080 resolution. So, to get a 24fps release, either every other frame will have to be dropped, or else some sort of digital blending or interpolation will have to be done. Neither will give the same quality that we have come to expect from current media: instead of 24 frames per second where scenes with little motion have very sharp frames, pretty much every frame will show some sort of motion.
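To make the trade-off concrete, here is a toy sketch (my own illustration, not anything a real mastering pipeline is known to do) of two simple ways to derive 24fps from 48fps material, assuming each frame is a numpy array:

    import numpy as np

    def drop_alternate(frames_48):
        """Keep every other frame. Each kept frame retains its original short
        exposure, so low-motion scenes stay sharp but fast motion can strobe."""
        return frames_48[::2]

    def blend_pairs(frames_48):
        """Average each consecutive pair of frames. Motion looks smoother, but
        every frame in a moving scene now carries some synthetic blur."""
        out = []
        for a, b in zip(frames_48[::2], frames_48[1::2]):
            blended = (a.astype(np.float32) + b.astype(np.float32)) / 2
            out.append(blended.astype(a.dtype))
        return out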

The frame rate (e.g. one new frame every 1/24 of a second) determines how much judder you get. The shutter speed determines the motion blur.

The Hobbit does not use interpolation. It is shot at 48fps. Interpolation is used by 120Hz TVs to guess the intermediate frames and is a worthless technology in my mind. I only want to see the original framerate, whatever that may be, and of course motion blur is removed on the interpolating TVs, against the director's wishes.

Oh yes, like wagon wheels going backwards. I also pine for the days of scratches, dust spots and pubic hairs on the big screen. And nothing but nothing beats the exhilaration of watching the celluloid melt because the projector stalled.

Even if the human eye can't distinguish > 60fps (it definitely can), the human retina is not v-synced with the television/screen. So you still need more temporal resolution than the eye can handle for it to appear smooth.

Personally, I'd like to see the end of the concept of framerate altogether - that changes to the display image are bundled into timestamped packets which can be of any infinitesimal interval. It'd impose a slight bandwidth overhead, but a pretty insignificant one if more than a few pixels were contained in each bundle.
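To make the idea concrete, here is a purely hypothetical sketch (none of these names correspond to a real codec or protocol) of what timestamped, frame-less display updates might look like:

    # Hypothetical sketch of "no fixed frame rate" playback: each update carries
    # its own timestamp and a small changed region, and is applied when the
    # playback clock reaches that time.
    from dataclasses import dataclass

    @dataclass
    class Update:
        timestamp: float   # seconds from the start of the clip
        x: int             # top-left corner of the changed region
        y: int
        pixels: list       # rows of new pixel values for that region

    def play(updates, framebuffer, clock):
        """Apply each update when the playback clock reaches its timestamp.
        `clock` is an assumed object with a sleep_until(t) method."""
        for u in sorted(updates, key=lambda u: u.timestamp):
            clock.sleep_until(u.timestamp)
            for dy, row in enumerate(u.pixels):
                for dx, value in enumerate(row):
                    framebuffer[u.y + dy][u.x + dx] = value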

You'd be detaching the uptake mechanism from the playback mechanism. Each device does the best it can with the hardware it has on-hand. The simpler the task it's presented with, likewise, the s

It's no more a problem with digital than film. They can both accept the same shutter speeds and aperture settings and even the same lenses. How can digital exhibit anything different?

Stuttering happens on pans that are too fast. It's because there aren't enough frames. But sometimes we want sweeping panoramic shots at a decent speed. In order to deliver on that, you MUST have a higher frame rate.

> Most people can't see a difference in rates above 30fps and pretty much nobody can distinguish fps over 60 fps.

Bullshit. You most certainly CAN see a difference, particularly when there's high-resolution, high-contrast detail with fast movement across the screen. In fact, high-framerate video has its own "uncanny valley" problem (above a certain framerate, generally in the neighborhood of ~300fps, hyperfluid 2-dimensional video becomes disorienting and vertigo-inducing, because your brain can't reconcile the seemingly-lifelike motion with its lack of depth).

Maybe if you're a 20-year-old with perfect eyesight, but how many of those will be willing to buy $1000 TVs? The whole reason Blu-ray has been a flop is that the average viewer can't tell the difference between an upscaled DVD and a BD enough to make it worth the money, and the last figures I saw had 3D TV also ending up in the "People don't buy this shit" column, so really, who cares?

We ALL know why they are doing this; it's the same reason they have tried to push 3D on us every so many years since the 1950s: it lets them charge more per butt in the seat, and I have no problem with that, I really don't. But when you look at what people are actually buying for their homes, you see a shitload of 720P and 1080P bottom-of-the-line sets; the 120Hz sets ain't selling for shit because frankly most people don't care, and their DVDs aren't HD anyway, so why should they spend the money?

Until they are selling 120Hz 60FPS sets for $199 at the Best Buy AND all the programming is also in that format? Give it up, chuck; it's another teeny tiny niche that won't sell for squat. I mean, look at how many gave up their HD sats and cable for compressed-all-to-hell Netflix; at the end of the day it's "good enough" and that is all the masses give a shit about.

I don't know if you'd describe it as a flop, exactly, but it certainly lacked the massive uptake of DVDs/CDs over VHS/Cassettes. In my experience, most people didn't rush out and buy a Bluray player; they got one the next time they were going to upgrade anyway - with their console, or built into their TV, or occasionally replacing their standalone DVD player. I still know many people (including myself) who just use DVDs.

The high-res transition was very much an iterative update. People had too much invested

The reason why DVDs took over from VHS tapes (which was hardly an overnight thing either) had more to do with the creature comforts of not having to rewind the tape, deal with tape stretching and media fallout, and other physical problems of the VHS medium. That DVD discs took up much less space was also a huge bonus, and none of those advantages applied to Blu-ray as a format. For the most part Blu-ray is simply DVD on steroids and seen as just that.

Most people can't see a difference in rates above 30fps and pretty much nobody can distinguish fps over 60 fps. There are plenty of people (especially gamers) who think they can but they are imagining it.

You really don't have a clue, but you've bought into some techno-babble explanation and have convinced yourself that you do. It's sad, really.

There's a point at which a flickering light source stops being perceived as flickering, and starts being perceived as continuously lit. That threshold is somewhere south of 60 fps for the vast majority of people, true, but that isn't the same as not being able to perceive more than 60 fps (much less 30!).

The reason film (@24 fps) and TV (@30 fps) look smooth, where video games (@30 fps, or even 60 fps+) don't, is that the human eye is fooled by (or possibly trained to be fooled by) motion blur. When a camera takes a picture, it doesn't actually capture a moment in time; it captures a span of time. The more something moves during that span, the more motion blur exists. This is due to the shutter system, which is required to keep the film from being exposed while it is out of position within the camera. (Note: Motion blur is an effect that can be seen with the naked eye, even with no camera in the mix, but the speeds involved for that are *much* faster than required to see it on film.) Video games (short of ultra-high-end games coupled with extremely powerful graphics cards) don't produce motion blur. Instead, they work to produce more than 30 fps, which is the *minimum* required to feel 'smooth' in the absence of motion blur; frame rates faster than 30 fps (up to at least 75 fps for most people) have been shown in experiments to be distinguishably smoother.
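As a rough illustration of the "span of time" point, here is one way a renderer could fake camera-style blur by averaging several sub-frame samples across the exposure window (render_scene is a stand-in for whatever the engine provides; real engines typically use cheaper screen-space approximations):

    import numpy as np

    def blurred_frame(render_scene, frame_start, frame_duration, samples=8):
        """Average `samples` renders spread across the exposure window.

        A film camera integrates light continuously over the shutter time;
        averaging discrete sub-frames is a crude approximation of that.
        """
        acc = None
        for i in range(samples):
            t = frame_start + frame_duration * (i + 0.5) / samples
            img = render_scene(t).astype(np.float32)
            acc = img if acc is None else acc + img
        return (acc / samples).astype(np.uint8)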

Like because it is cheaper and easier to make movies that way? If something looks fake when there is not enough blur, it is because movie makers have not bothered figuring out how to make their scenes look more realistic at a more realistic frame rate. It has nothing to do with the video technology. It is like complaining that color TV looks too realistic and they should stay with black and white. These days people only watch black and white TV whenever they are feeling nostalgic. I applaud Jackson for blazing the trail to higher frame rates in movies.

I am not sure that I buy your argument -- I think that the brain perceives more detail from more frames, much like you can composite several images in Photoshop to create a higher-resolution image (very common in astrophotography).
However, if the extra frames reveal issues with props, makeup or digital animation, then they are self-defeating.

In the first case, the camera is not panning, but just filming the scenario as it is, and projector playing it back at the filmed rate. Thus viewing the projection is the same experience as looking at the scene in real life, to within the fidelity of the playback. Notably, there is no depth (or a poor simulation of depth with forced focus), but apart from that higher fidelity should be more realistic. The viewer's eyes will be jumping around the big screen and blinking just like normal so there is absolutely no reason to try to "simulate" that; you have the real-life effect already occurring. Same with motion blur; the eye will supply the same amount of blur that it does in real life, so there is no reason to simulate it, beyond compensating for too *low* of a frame rate, which requires a longer integration time to avoid appearing choppy.

And yet it is exactly this sort of scene that was causing people to deride 48fps as being "soap opera like". They talked about how watching the Hobbits slowly walk down the hill towards them looked epic in 24fps, and looked like a documentary in 48fps. It destroyed the suspension of disbelief for them, and made them think they were looking at actors not Hobbits. That has nothing to do with faking the limitations of human vision. It is completely psychological; whether that psychological effect is inherent in the medium or the result of prior conditioning is debatable, though.

The second scenario is where the camera is panning, and thus forcing visual motion on the user even though they didn't initiate it. This is identical to being smoothly flown around a scene, and how "realistic" it is will depend on whether that would actually happen in real life. In situations where it is realistic my argument above would apply; the eye will be looking around the moving scene just like it would be when looking out a train window.

On the other hand, in situations where panning is being used to simulate human motion, I would argue that 48fps could allow the filmmaker to have more realistic view changes if they want them. Low-rate 24fps forces the director to have slow gradual pans lest they create a choppy or blurry mess as a result of the limitations of the rate. However, as you pointed out, the eye doesn't work that way. It jumps around, taking time to settle and focus each time. If you tried to do that at 24fps the viewer would get lost, unable to follow the transitions. In large part this is because in real life they are controlling the transitions so they know in advance where the view is changing to, but to a lesser extent this is due to the limitations of the frame rate. Faster frame rates will allow for more abrupt transitions that are still possible to follow.

Whilst I do generally like higher frame rates, the above is actually the trouble... It looks _too_ realistic. For a high fantasy movie like The Hobbit, sometimes putting a little 24Hz vaseline on the lens helps let your brain fill in the gaps with fantasy. At 48Hz, the projector fills in the gaps with reality.

It sounds reasonable, but it isn't. There are fewer frames, but the filling-in your brain does is more or less equal to what just presenting it with more frames does. It doesn't synthesize detail like it does with low spatial resolution imagery. Consider the following: if you were given a low resolution drawing, you would not be able to draw the equivalent of the high resolution version of it. If you were given two subsequent (drawn) frames, you _would_ be able to fairly accurately draw the frame in between.

The reason you think 720p looks better is because of frame rate. That's why ESPN and Fox Sports both use 720p for broadcast. In the US-ATSC system, 1080i is interlaced at 59.94 fields per second, or 29.97 frames per second. 720p is progressive scan at 59.94 FRAMES per second.

There is also a lesser quality version of 720p at 29.97, but broadcast 720p is 59.94 FRAMES per second. That's why it is better for fast-action sports, and looks much better than 1080i.

720p-60 (as it is called) uses the same amount of broadcast bandwidth as 1080i-30.
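Rough raw-pixel arithmetic behind that claim (compression is what actually sets broadcast bandwidth, and ATSC gives both formats the same ~19 Mbit/s channel, but the uncompressed pixel rates are already in the same ballpark):

    # Uncompressed pixel throughput, before any MPEG compression.
    p720_60  = 1280 * 720 * 60    # 55,296,000 pixels/s, 60 full progressive frames
    i1080_30 = 1920 * 1080 * 30   # 62,208,000 pixels/s, delivered as 60 half-height fields
    print(p720_60, i1080_30, i1080_30 / p720_60)   # ratio is about 1.125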

Why didn't the industry standardize around 48FPS? Well, for most broadcasters, bandwidth is money, and the higher frame rate would have massively increased the cost of distribution. Secondly, at the birth of digital HD, encoding and decoding at this rate was much more expensive, and LCD TVs had very poor response times, making them (at the time) a poor match for higher rates.

The reasons are simpler than that. At the time, VHS was still king. LCD TVs were expensive, small, and fickle novelties. CRT tele

If you have a modern medium to high end HDTV, turn on frame interpolation [wikipedia.org] processing (by whatever silly trademarked name your TV has for it) and watch an HD movie (especially one with sweeping pans and action, etc). It's hard to quantify exactly why it's distracting, but is sometimes described as a "soap opera" effect.

It bugs me too, but it really is hard to objectively say why. I'd like to think it's about a subconscious feeling of "expansiveness" and uncertainty (since your brain has to interpolate instead of the TV, and maybe your brain interpolating engages you with the content differently, etc) that you want with a more "epic" movie experience.

But there is also a strong argument that it's mostly your brain adjusting to something it has not experienced in this setting, and you will get used to it if exposed enough. Sort of like getting a new pair of glasses with a different shape/refractive index...

maybe Jackson should just try actually shooting the whole story this time. Hey Merry - where'd you get that cool magic blade that killed the Witch King? "Errr.... well err ummm. See there were these barrows, but we had to cut that from the story, but - hey, Liv Tyler is hot, right??"

At the same time I love the movies (and just got the extended BR edition), I'm sad that they weren't as 'faithful' as they could have been to the books, and worse is that it was such a huge endeavor that it's not likely to be tried again for at least a very long time, if ever. I more than understand cutting out a part like Tom Bombadil (and changing and cutting various other things) for the sake of brevity, but that's not what they did - they cut it for the sake of stuff that wasn't in the books at all.

Hopefully, pirates will get their hands on a digital copy of the original video, resulting in a high quality h.264 file that can be played on virtually any media player or HTPC, instead of those silly obsolete Blu-ray disks.

Actually, now that I think of it - I wonder if this will be offered on things like Netflix or AppleTV in the full resolution/refresh rate? Or if it will be crippled to avoid making the Blu-ray version look bad?

And like many people that have commented it seems, I found that the Tom Bombadil thing was horrendous in the book, and cheered a little inside when it was skipped in the movie.

I honestly can't even slightly understand why some people have such a hardon for that part of the book. It was terrible. TERRIBLE!

One of the reasons people like the Tom Bombadil section is because of the character development.

Remember, the book was about little, ordinary people that can do great things, even while big, great people are doing great things all around them. The book was not about little people outshining big people, nor was it about great people overshadowing the efforts of little people. One complaint about the movie was that it was more about Aragorn and Legolas, with Gimli as the comic relief, than it was about the Hobbits.

As for the character development, the Tom Bombadil episode was one of the first things that said, "This is not a simple trip across the forest. This is a dangerous journey and you had better be ready." In the book the Ringwraiths drove them into the dark forest, and they almost got killed because they did not take the journey seriously enough. When they got to Bree, they tried to fall back into the easy ways of the Shire, only to be almost killed again by Ringwraiths because they weren't paying attention. Only this time, they "found" a guide to help them in their character development. By the time they had dealt with Weathertop and finally made it to Rivendell, they were ready to start the journey to Mordor.

The Scouring of the Shire, another section left out by the movie, was the final step that the Hobbits had to take to realize that they were no longer children or ordinary people, but had become great people with large responsibilities. They no longer needed to rely on their guides or other races to take care of their own troubles. Their accomplishments did not belittle the other races; rather, they finally became equals with them. And as equals, they were expected to take care of their own troubles. With great power comes great responsibility. (The words are from Spider-Man, but the theme is ancient.)

From the reviews I've seen so far, no one seems to enjoy the 48fps. Even mainstream reviewers have referred to it as 1970s video smooth, "old Dr. Who at best." (Paraphrasing from the CNN review this morning). Maybe The Hobbit is the sacrificial movie which needed to be made and receive this kind of backlash, in order to never have such an awful-looking "feature" used in film again.

You say that, but games are actually going the other way -- well, console games anyway. As the hardware ages and game developers want stuff to look better, they keep lowering the framerate to allow the consoles to cope. A lot of games are now 30fps on consoles.

I really don't get why people are so attached to 24fps. Can you imagine this with computer games?

Because 24fps in a movie has no relation whatsoever to 24fps in computer games. In a movie, 24fps is shot with cameras and you get motion blur (just as you would if you took photos at a shutter speed of 1/24th of a second). Your brain is an amazing thing, and happily interpolates the motion blur to give a sense of smoothness. What I'd like to know is whether 48fps looks "soap opera" simply because we've conditioned ourselves to equate high-fps video with the crap shows that always used it on TV, or whether there really is something magical about 24fps. I can't really see any inherent reason why 48fps should look bad per se, even if it probably doesn't add anything much.

I do know, however, that there is no way I want to go anywhere near The Hobbit. Forget the whole 24 vs 48fps thing -- Jackson sold out big time in making three stodgy films out of one tiny, light-hearted children's book, presumably for no other reason than to rake in the extra cash. He ought to be ashamed of himself.

It is difficult to describe without seeing it. You become much more conscious that you are looking at actors wearing costumes standing on a set. I've had a similar sense with some movies in blu-ray, although this is different.
Now, 48fps is really cool for scenes with perspective motion, as you feel tricked into thinking you are part of the scene.

When playing a game, I can easily tell if it's running at 30 fps or 60 fps, and I *much* prefer the higher framerate, for obvious reasons. It'll definitely take a bit of getting used to when it comes to movies, but it is no doubt a good thing.

Agreed. The kvetching over the transition from 24 fps to 48 fps reminds me of the transition from incandescent bulbs to compact fluorescent or the transition from records to CDs. It strikes me as nostalgia for a (mostly) inferior product.

Everyone tells me I'm crazy when I say that windowed games and videos play really sluggishly in Windows 7 compared to XP. The reason is that the new window manager uses 3D hardware to do compositing, and for some reason updates seem to be locked at 30 FPS (my guess is 50% of the monitor refresh rate). XP updated at full blast and provided much, much smoother games and video. I've noticed a rather huge loss in overall performance on my new Win7 machine, despite it being massively more powerful than

I'm all for video and motion being at 48fps, and maybe even 100fps+ for super smoothness which will also help cure motion blur (without the use of black flickery interspersed sub-frames). Heck why stop there, 240 or 300fps will help for compatibility, and allow us even smoother motion.

HOWEVER..., critics argue that the Hobbit feels less 'dream-like' and 'too real'. Even though I disagree with them to an extent, I recently played a game called Nitronic Rush [nitronic-rush.com] (fast free Wipeout clone, with tron-esque graphics, great fun btw). I set it to 60fps, but the graphics are 'enhanced' by motion blur, which 60fps normally doesn't 'need'. We're talking at least a couple of frames worth, and maybe up to 5 frames worth of artificial motion blur. However, I find this actually gets the best of both worlds. You get the smoother motion so that your eyes don't ache, and any fast panning looks convincing. But you also get the cinematic 'blurry' look that 24fps films provide (24fps film techniques employ motion blur naturally, or at least something similar to motion blur).

I think 60fps with this kind of motion blur may have a big future for it.

You got the science wrong. This has very little to do with FPS and everything to do with the stroboscopic effect film camera shutters introduce.
Bear with me here for a moment while I explain.

It matters very little whether you capture rapid motion at 30fps or 1000fps - the motion still occurs at its natural speed; it's the amount of motion blur per frame that changes. The eye sends a continuous stream of signals to the brain and the brain "sees". Most people have difficulty registering details about an object that moves faster than 36 degrees per second. So for a roughly 180-degree field of view, anything that crosses your sight in under 5 seconds is blurred. That's not much frame rate, right there. I only give this example to demonstrate that a high frame rate is not that important for action.

When you shoot film @ 24 fps, the photographic shutter does not stay open for the whole 1/24 sec., because that would be too much motion blur and also too much exposure at, say, F2.8 for the film. Normally film is shot at shutter speeds of about 1/50 sec. This means that half of the 1/24 sec. of motion is NOT CAPTURED AT ALL. Film creates a stroboscopic effect, and when played back through a projector that displays 1/50 sec. worth of action for 1/24 sec., it looks eerie, artsy.

For the soap opera look, cheap TV shows are shot with cheap video cameras which do not have light shutters. Shutter is open for the duration of the frame - 1/60 for interlaced NTSC TV. The whole action is captured with motion blur similar to film (film at 1/60 sec. shutter). The playback is absolutely realistic, cheaply realistic.

So, there you have it: TV @ PAL/50i and film with a "normal" 1/48 sec. shutter @ 24fps have a similar amount of motion blur. Film has a stroboscopic effect; TV does not.
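The numbers behind that comparison, worked out (using the shutter values quoted in the post above):

    film_frame   = 1 / 24   # frame period for 24 fps film
    film_shutter = 1 / 48   # "normal" 180-degree shutter, roughly the 1/50 s mentioned above
    tv_field     = 1 / 50   # PAL 50i field period
    tv_shutter   = 1 / 50   # shutter effectively open for the whole field

    print(film_shutter, tv_shutter)    # similar blur per captured image (~21 ms vs 20 ms)
    print(film_shutter / film_frame)   # film records only 50% of the motion: the stroboscopic gap
    print(tv_shutter / tv_field)       # TV records 100% of it: no gap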

How nonsensical is it, and how resistant to change do you have to be, to worship these artifacts? They are no more beneficial than ticks/pops were on vinyl. There is a certain nostalgia value to listening to something with ticks/pops sometimes, but it isn't something we put everywhere because we can't do without it.

So these change-resistant Luddites, in love with quite irritating artifacts, have taken to calling superior motion video, with its reduced blur, judder and jerking, "The Soap Opera Effect".

Do a freeze frame on a soap opera and a good movie. You can still tell which is which when frozen. Soaps look like crap because they have crap production values: poor sets, poor lighting, poor cameras, shot without any flair.

Shoot 48fps (or 60 fps or 120 fps for that matter) with great sets, great lighting, great cameras and great flair and it will be amazing and have nothing in common with soap operas.

To be fair there is more to it than just "24fps has unwanted artefacts". Most people will probably remember when they first saw Saving Private Ryan because the film stock and shutter speed used gave it a very realistic, un-blurry and gritty look. One of the biggest reasons it has taken so long for digital cinema cameras to become popular is that the early ones were unable to replicate the effect of using particular well known film stocks and camera techniques.

Star Wars is another good example to look at. Part of the charm of the original movies is that they were a bit rough around the edges. The film had a fair bit of grain that made the Star Wars universe look a bit grubby and used, rather than sleek and clean like Star Trek. The later trilogy was crisp and clean, and ended up looking more like generic sci-fi than the Star Wars we loved.

48fps is still in its infancy and it will take some time for cinematographers and directors to figure out how to get the effect they want from it. In the end the result will be better than 24fps, but that doesn't automatically mean that the early examples will be particularly good. 3D was the same; everything looked terrible until Avatar finally figured out how to use it and still look like a movie and not give you a brain aneurysm.

Hi there. Technical director here. Just need to step in and clarify the relationship between frame rate and motion blur. I'm seeing a lot of posts that are calling for higher frame rates with more motion blur, as if they are two completely independent things. They're actually closely linked. Let me explain:

Motion blur is the effect of a moving object in the frame while the shutter is open. In photography, the time the shutter is open is called the shutter speed, and is used along with ISO and aperture to control the overall exposure. If you know anything about photography, this is pretty basic stuff.

In the film world, the equivalent of shutter speed is what's known as shutter angle. This is because the shutter on a film camera is a spinning disk, of which a portion lets light through and a portion blocks it as it spins. The portion, measured in degrees, that lets the light in is the shutter angle. Typically, the shutter angle used in film is 180 degrees, meaning the film is exposed during half of each 1/24-second frame. In photographic shutter speed terms, that would be the same as 1/48. Again, not too complicated.

Here's the catch though: because your film stock is rolling by at 24 frames per second, each frame can only be exposed for 1/24 of a second or less. If you use a smaller shutter angle, or faster frame rate, you get less motion blur. What this means is there's no practical (the film industry definition of practical) way of getting more motion blur than your frame rate and shutter angle allows. The faster you go, the crisper the action will be.
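For reference, that relationship as a tiny calculation (the numbers are the ones from this post, nothing new):

    def exposure_time(frame_rate, shutter_angle):
        """Seconds of exposure per frame for a rotary shutter:
        (shutter_angle / 360) * (1 / frame_rate)."""
        return (shutter_angle / 360.0) / frame_rate

    print(exposure_time(24, 180))   # 1/48 s -- the traditional film look
    print(exposure_time(48, 180))   # 1/96 s -- same angle at 48 fps gives half the blur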

So at this point you're probably wondering: who cares about the amount of motion blur in a movie? The answer is: the audience. The industry has shot film at 24fps with a 180-degree shutter angle for so long that's what everyone is used to. The last thing you want is to distract your audience away from enjoying the movie because they know there's something different about the picture quality but can't figure out what.

Finally, I'd like to point out that this choice of frame rate, like many other subjective decisions that are made during a movie production, is made at the director's discretion. Peter Jackson is going out on a limb by shooting a movie at this frame rate, and doubtless he has his reasons for doing so (mostly due to it being shot in 3D as I recall), but it's still his call. The industry talk I hear views it as an experiment, and everyone's curious as to how it will work (or won't). If audiences do get used to it and like it, expect to see more movies shot like this, and in enough time it will be the new standard.

That's irrelevant. The CMOS sensor on the RED works the same way as a frame of film - it is activated and exposed to light in exactly the same way, for a fraction of a second depending on the desired shutter speed. The only difference is that the shutter is electronic (turning the sensor on and off) vs mechanical, and even new digital cameras like the Sony F65 have mechanical shutters exactly like film cameras.

> The 48 fps version of The Hobbit is weird, that's true. It's distracting as hell, yes yes yes.

...because Lord knows, that's what I always look for in a movie. A presentation that is weird and distracting as hell.

That said, like 3D, you do get a choice, so no harm, no foul. We will be seeing the film in 2D, 24 FPS, because 3D gives my wife migraines and because of reports from New Zealand of motion sickness-like symptoms amongst viewers there.

Parenthetically, I predict that the people who love 48 FPS will include a high percentage of people who can play first-person shooters for hours without motion sickness, and conversely, the people who don't like it will be those who can't. Although I don't think anyone is collecting this metric, sadly.

I will see the film at the faster frame rate (and in 3D because I believe that's your only choice at 48 FPS) but I want to see it "normal" first.

I don't feel qualified to judge the technology not having viewed it yet, but the most interesting criticisms that have come out of advance showings so far are that the sets look more like sets, which disrupts one's ability to suspend disbelief, and that the depth of field tends to be very deep, with everything in focus, which makes things look weird (because the human eye doesn't see that way). Speed Racer did the same thing, intentionally. (Speaking of which, it appears that I'm the only one who liked Speed Racer.)

...which brings me to my point. This doesn't make the film any less artistic. In actual fact, Jackson's use of the technology is an artistic choice. It may not be a choice that everyone likes, and it may disrupt what we've come to believe are common artistic choices (directing audience attention through depth of field, motion blur to indicate movement, softer focus for effect), but that doesn't make it any less artistic. Now, whether it's a *commercial* success, that remains to be seen. I strongly suspect that the 48 FPS showings will be crowded because it's a new thing. Whether people will flock to the next film in that format remains to be seen.

I've been doing computer animation for 35 years, as long as it has existed. Back in the early 80's, I worked on some early 60 field-per-second animation, and I have been a convert to high-frame-rate footage ever since. (The opening to the PBS show NOVA was perhaps the first 60fps animation ever done.) When we started doing broadcast graphics (show openings, things like that) for TV, we naturally did them at 60fps, and that looked right as it worked with the rest of video. Finally, though, we moved into advertising, and TV advertising was (and still is) typically 24fps. And it bothered me!

But then, something changed my mind completely. We were doing an ad for Snacky, a Japanese snack food company. There was the required silly animated spokespuppet, and we modeled it and made it perform. Part of doing animation is doing the lip-sync, and the company gave us the dialogue in English to animate to. We did this, although it didn't seem right -- expecting them to give us the Japanese soundtrack eventually.

But no, it got to a couple of days before delivery, and the character was still speaking English, and we asked the customer when he came to review the work. "This is only going to be shown in Japan, right?" "Oh, yes, yes!", "And you're going to dub it into Japanese, right?" "Of course! Yes!" "But the lip sync is to an English sound track, the lips are not going to match the dialogue!" "YES! JUST LIKE ALL GOOD ANIMATION!"

Because in that day, lip-sync that was correct in Japanese meant it was low-quality domestic animation, whereas if the lip-sync didn't match, it was high-quality American animation. Nobody can tell me that wrong lip-sync is in any way superior -- except that there were 150 million people in Japan who would see it that way instinctively and immediately.

So, I became a happy convert to 24 fps animation. I applaud Peter Jackson for his incredibly audacious experiment, and I hope he succeeds, but he has to fight the near-instinctive reaction from a lot of people who see 48 fps as video.

I think that part of the problem with The Hobbit at 48fps is that the screens are so terribly dark that you just can't appreciate the high frame rate. Your eye integrates dark scenes over a long period of time, and at 48 fps with the very very dark 3D screens, I believe that your eye smears the frames together. On Transformers III, I removed all the motion blur from the very dark scenes, because even at 24 fps they got smeary.

1) If you get the least bit motion sick, don't go see it at the high frame rate in 3D. Normally I don't, even when seeing IMAX/OMNIMAX, but this film I did.

2) The 48 frames per second and 3D make certain parts of the film feel like watching a live stage production. The problem then lies with the post-production. There were a lot of scenes where you could tell the background was composited, and with so much CGI, some of it was like going back and watching CGI from 15 years ago.

That was one of the things I liked about the LOTR movies: especially by the third movie, the CGI had gotten so good that it was largely seamless. You didn't notice it; it was just part of the story. In this one I noticed it and often found myself cringing.

At that time film was very expensive, so producers preferred a minimal frame rate to save cost. Some nickelodeon films ran at 10 fps. The guy in Hugo used 16 fps. Since Edison was one of the inventors of motion pictures, he may have wanted to sell more film stock.

NTSC television to the bitter end was 29.97. Many hidef television broadcasts are actually at 23.976 fps, and most feature films shot on digital equipment are at this rate as well, because it converts to 29.97 with fewer artifacts.

Sometimes we do, but that's definitely a visible artifact. You can't just delete every other frame, you have to add blur and interpolation to get the same level of motion blur the 2x picture had -- the cameras can't shoot overlapping frames.

Another factor with shooting "fast" is that it halves your available light, so if you have an ISO 800-equivalent gain factor at 24 fps, it becomes ISO 400 at 48; so then either your f/stop (and thus depth of field) has to give, your shutter angle (and thus motion blur)

Actually, I was thinking more along the lines of the Hogfather (specifically from the movie).

While I enjoyed this first Hobbit movie, I found the Radagast scenes awkward (like an old family photo with too-large glasses and sisters with poofy bangs). Radagast and his bunny sled seemed too much like something right out of Discworld, which would be delightful except that combining Discworld and Middle Earth yields a very large impedance mismatch.

60 fps would have been a good thing. 48 is just as dumb as 24. You'll still have to do a pulldown on most consumer displays with 48 fps. If you're reading this Hollywood, update the DCI spec. to support 60 fps!

48 is either dumber or smarter than 24, but not just as dumb, depending on whose idea it was. If it was the display makers' idea, then it's goddamned genius because all the videophiles are going to buy new displays all over again. My TV's panel has a native film mode so it doesn't have to do anything wacky to display a film-mode signal. But it doesn't have a 48hz mode... Not that I'll be buying another TV. Problem is, even if a firmware update would let my TV display that content, it's not going to be an al

We really need to move beyond 24fps though. Take any single frame of that scene and just try to make out what is in the house. Is that a lamp? Or a table? Or wait, maybe it's a vase with a funny flower coming out of it. You can't tell. It's a blurry mess. All you can tell is that it was a sweep of the inside of a house. No detail. [...]

That of course assumes that viewing all the detail is important. In many cases "viewing all the detail" is not what you want. It can be distracting from the message that the writer and director is trying to convey. At times the blur in the background can help support the in focus stuff in the foreground and the elements that are actually important to the story.

Yes. Most people can see the difference between 240 fps and speeds below that. Some people can perceive differences up to 360 fps. Note that these values are way above the frequencies that we can detect flicker -- around 75 fps.