So on Christmas Day I went to my parents', and while we were watching a Blu-ray I noticed that it looked awful on their new LCD TV. Too smooth, too clean, like a soap opera.

A quick Google search later and I discovered that lots of new TVs have a refresh rate of 120Hz or higher in order to provide "frame interpolation".

However, I think it looks horrible, so I ask you: what is the point of this technology that makes films look so... fake? And how do you turn it off?

_____________________________

"I put no stock in religion. By the word 'religion', I have seen the lunacy of fanatics of every denomination be called 'The Will of God'. Holiness is in right action and courage on behalf of those who cannot defend themselves."

It's to do with the elimination of so-called 'motion blurring', something quite evident during sports events like football and racing. It's bollocks. While it may artificially improve the quality of Formula One, I have no idea why anyone would use it for normal TV programming like soaps. As you say, it looks crap: too smooth.

Some people even use it for watching Blu-rays because they are over-sensitive to the natural 'judder' of 24fps. These people are fools.

Not sure how you turn it off. I think some TVs don't even have the option of doing so.

EDIT: I'm sure others can give a more precise explanation as to how it works, but its purpose is definitely to reduce or eliminate motion blur and high-def judder. But it just looks wrong.

The reason frame interpolation looks so jarring is that it makes motion look more realistic, not less.

Film, since the advent of synchronised sound, has had a standard frame rate of 24fps (some European films and almost all European TV use 25fps for film based productions, but as that is only a 4% difference it is all but irrelevant in this discussion). This rate was settled on as it provided a good compromise between the illusion of motion (which can be achieved at lower frame rates; many silent films used 18fps) and the need to move the film fast enough to allow for reasonable sound fidelity.

However, 24fps is still too slow to remove perceivable flicker when projected. To avoid this, each frame is shown twice by a film projector (or even three times in some). This doubles the effective refresh rate and reduces flicker to a less obtrusive level.

24fps is also too slow to eliminate motion artefacts if the individual frames are too sharp. It introduces strobing, which presents itself as a slightly stilted, stuttering motion that looks unnatural. Most stop-motion animation has this strobing, as do the battle scenes in Saving Private Ryan, due to a non-standard shutter speed.

To mitigate this, movie cameras shoot with a standard shutter speed of 1/48 of a second. This creates significant blur on fast-moving objects, which helps the frames blend together to create a cohesive whole with fluid motion.
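If you want the arithmetic: exposure time is just the fraction of the frame interval the rotary shutter is open. A quick sketch (the 45-degree figure for the Saving Private Ryan battle scenes is the commonly quoted one, used here purely as an illustration):

```python
def shutter_speed(fps: float, shutter_angle: float = 180.0) -> float:
    """Exposure time per frame for a rotary-shutter film camera.

    A shutter angle of 180 degrees means the film is exposed for
    half of each frame interval.
    """
    return (shutter_angle / 360.0) / fps

# 24fps with the standard 180-degree shutter gives a 1/48s exposure
assert abs(shutter_speed(24) - 1 / 48) < 1e-12

# A narrower shutter (e.g. 45 degrees) exposes for far less of the
# frame interval, producing sharper, more stuttery-looking frames.
print(shutter_speed(24, 45))  # 1/192s
```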

This is why Phil Tippett and co. created Go-Motion in the early 1980s. It added motion blur to each frame, which gave an illusion of more realistic movement in the shot. If you compare the walkers in The Empire Strikes Back to the walkers in Return of the Jedi you'll see how well it works.

However fluid motion is not the same thing as realistic motion.

TV works at a refresh rate of 50Hz in PAL regions or 60Hz in NTSC land. Some shows were shot on film at either 24 or 25fps, especially bigger-budget dramas or sitcoms. Smaller-budget shows, those with a rapid turnaround (such as soap operas) and live events (like sports) used TV cameras, which shot 50 or 60 frames per second. This means that, due to the shorter exposure time (and also due to how TV cameras work), there is far less motion blur, but because the frame rate is much higher you don't get stuttering motion.

Ironically, the motion in these shows is more natural than in film-based fare. However, as we have become so used to seeing film as a high-quality medium, and TV, especially soap operas, as low-quality fare, we have become indoctrinated into regarding the look of film as superior to the "soap opera look". I am as guilty of this as anyone. I hate the soap opera look.

What frame interpolation does is create new frames to go in between the 24 real frames. It then displays them at whatever the native frame rate of the screen is.
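To make that concrete, here is a toy sketch of the crudest possible interpolator: a straight cross-fade between neighbouring frames. Real TVs use motion-compensated interpolation (estimating where objects move between frames), which is far cleverer; frames here are just lists of pixel values for illustration.

```python
def interpolate(frame_a, frame_b, t):
    """A naive in-between frame: a weighted blend of two real frames."""
    return [a * (1 - t) + b * t for a, b in zip(frame_a, frame_b)]

def upconvert(frames, factor):
    """Turn each source interval into `factor` evenly spaced frames,
    e.g. factor=5 takes 24fps up to roughly 120 frames per second."""
    out = []
    for a, b in zip(frames, frames[1:]):
        for i in range(factor):
            out.append(interpolate(a, b, i / factor))
    out.append(frames[-1])
    return out

# Two one-pixel "frames", black then white, upconverted 5x:
print(upconvert([[0.0], [1.0]], 5))
# [[0.0], [0.2], [0.4], [0.6], [0.8], [1.0]]
```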

I think it looks awful, but there are others who seem to like it. But then again, there are those who like the over-saturated, overly bright, over-sharpened factory preset mode on their TVs too.

The frame interpolation can be turned off; it is usually given a name like Motionflow, TruMotion or Motion Plus and is found in the settings menu.

My Samsung TV has the same thing. It's called Motion Plus. To turn it off I have to go to picture options in the menu, find Motion Plus and just turn it off. There are about four or five settings, I think: standard, clear, smooth and one or two others. Hope this helps.

_____________________________

Exactly six miles north of Skagg Mountain in the Valley of Pain, there lives an evil devilmonster. His name is Bingo Gas Station Motel Cheeseburger With A Side Of Aircraft Noise And You'll Be Gary Indiana.

Is this similar to deinterlacing? I've been reading up on this (although I don't understand half of it) as my VLC player has it enabled, set to Blend. When I play video files it looks fine (my PC is hooked up to my TV via HDMI), but when I play DVDs I do get some motion blur. I'm not sure if the deinterlacing has something to do with it or not, as VLC has some pretty advanced settings that I don't want to mess around with. As for my TV, most of the picture effects settings are off except the Warm setting.

_____________________________

And I heard a voice in the midst of the four beasts And I looked and behold, a pale horse And his name that sat on him was Death And Hell followed with him.


Interlaced video is a throwback to the days when your average TV was as deep as it was wide.

CRT TVs work by using an electron gun to excite phosphors which makes them glow. The electron gun lights the screen, line by line, from top left to bottom right to create a picture. There are two ways to do this: progressively, which means drawing every line on the screen from top to bottom before refreshing; or by interlacing, whereby the screen is refreshed in two passes (the first pass draws the odd lines and the second pass draws the even lines).

Interlacing was chosen for a number of reasons. Phosphor decay meant that it was difficult, at least in the early days, to light the whole screen before the first lines started to fade, which would lead to a rolling bar on the screen. Interlacing also reduced flicker, as it doubled the refresh rate of the screen. And it allows twice the vertical resolution to be crammed into the same bandwidth (albeit at half the frame rate).

When I wrote in my earlier post that TV cameras shot at 50 or 60 frames a second I wasn't being entirely accurate. They shoot 50 or 60 half-frames per second (called fields), which are interleaved to create the final frame. The motion characteristics mentioned are still apparent, however. In fact, interlacing creates additional issues.

When you shoot a fast-moving object using an interlaced video camera it can move appreciably between fields. This leads to an artefact called combing, whereby the vertical edges of the object look serrated like a comb. The faster the object, the more exaggerated the effect.

On a CRT TV, which is interlaced in nature and produces a much softer picture, this effect is easy to miss. On modern progressive displays, such as computer monitors or HDTVs, which are far sharper, it sticks out like a sore thumb.

To counteract this de-interlacing techniques have been developed. There are three main methods: weave, bob, and averaging.

Weave stitches the odd and even lines back together and displays them progressively. It is probably the best choice for PAL DVDs from a progressive source (such as film). This is because a PAL film-to-video transfer is simply sped up by 4% to 25fps rather than having a complex telecine operation performed. Video stored like this at 50i is functionally identical to video at 25p. In VLC this option is confusingly called "Disable". If used on video from an interlaced source it will display the combing.

For film-to-NTSC transfers things get a little more complex. To get 24 frames to display over 30 frames (or 60 fields) a system called telecine was developed. Basically, every 4 frames of film are stored on 5 frames (more precisely, over 10 fields) of video using an irregular cadence. On DVDs taken from a film source the NTSC video was typically stored as a 24fps stream to save space, and the cadence was added by the player.
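The cadence is easier to see in code than in prose. A toy sketch of the 2:3 pulldown pattern, treating film frames as letters and ignoring top/bottom field ordering for simplicity:

```python
def pulldown_32(frames):
    """2:3 pulldown: spread every 4 film frames over 10 video fields.

    Alternate frames donate 2 then 3 fields each, so 24 film frames
    per second become 60 fields (30 interlaced frames) of video.
    """
    cadence = [2, 3, 2, 3]  # fields contributed by each film frame
    fields = []
    for frame, n in zip(frames, cadence * (len(frames) // 4)):
        fields.extend([frame] * n)
    return fields

# Four film frames A B C D become ten fields:
print(pulldown_32(["A", "B", "C", "D"]))
# ['A', 'A', 'B', 'B', 'B', 'C', 'C', 'D', 'D', 'D']
```

The irregular rhythm (2, 3, 2, 3...) is exactly what causes NTSC telecine judder, and it is what a good deinterlacer or "inverse telecine" mode tries to detect and undo.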

If the source is native interlaced PAL or NTSC one of the other methods will mitigate the combing.

Bob takes the odd and even fields and displays them discretely, one after the other, without any interlacing. It doubles each line to maintain image height. It will get rid of the combing but effectively halves the vertical resolution.

Averaging calculates what the frame should look like by comparing the odd and even lines. This can create ghosting on fast-moving images; better than combing, but not by much. In VLC this is called Blend. For some reason this seems to be the default setting.
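For illustration, here is a toy sketch of the three methods, treating a field as a list of scan lines (real deinterlacers work on full pixel data and are considerably smarter, but the principles are these):

```python
def weave(top, bottom):
    """Weave: interleave the two fields back into one full frame.
    Ideal for progressive sources; shows combing on true interlaced video."""
    frame = []
    for t, b in zip(top, bottom):
        frame += [t, b]
    return frame

def bob(field):
    """Bob: show one field on its own, doubling each line to keep the
    image height. No combing, but half the vertical resolution."""
    frame = []
    for line in field:
        frame += [line, line]
    return frame

def blend(top, bottom):
    """Blend/average: mix corresponding lines from both fields.
    Hides combing but ghosts fast motion."""
    return [[(a + b) / 2 for a, b in zip(la, lb)]
            for la, lb in zip(top, bottom)]

# A 4-line frame split into fields; each line is a single-pixel row here.
top, bottom = [[10], [30]], [[20], [40]]   # odd lines, even lines
print(weave(top, bottom))   # [[10], [20], [30], [40]]
print(bob(top))             # [[10], [10], [30], [30]]
print(blend(top, bottom))   # [[15.0], [35.0]]
```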

The other settings are different versions of these methods. A full description can be found at:

Thanks for the advice, Dpp. I tried the various options and found little difference between Discard and Blend. As Discard is an interpolator I thought it best to stick with Blend. Tbh I found there's very little "ghosting" on PAL video; only the ones transferred from NTSC had more blur, and of those mostly stuff shot on video. I haven't watched any Japanese or telecined examples yet, so I'm not sure how that will work out.

_____________________________


I was going to post a question for dpp (or anyone else knowledgeable), but I guess I may have found the answer with this thread. Basically, I was going to ask:

When I got my TV with its factory settings, before I calibrated it (with a DVE DVD disc and some internet scouring for good settings), everything had a "strangely real" feel to it. When you watched a film, it was hard not to notice that you were watching real people in a studio, if that makes sense. In a way I kind of liked it, cos it felt more realistic, but something about the image quality detracted from the artifice of the movie and made you think of actors stood there reciting lines. The words "soap opera effect" drew me to this thread, because the few TV shows and films I saw during that brief period kinda looked like they were filmed on the Neighbours set but broadcast upscaled or in HD!

Recently, I connected a hard drive to my TV via USB. Connecting in this manner uses Samsung's built-in Media Play software. Lo and behold, the image had a similarly "weirdly real" feel to the one the TV had by default. It wasn't quite as distracting as it had been when I first bought my TV, but it was noticeable. People stood around in studios saying rehearsed lines.

I put the Battlestar Galactica mini-series on and found that some special effects looked like odd/real objects too - like, a grounded spacecraft looked like a plasticky fibreglass construct; it didn't look like a metallic ship with the requisite weight to withstand space flight! I've watched most of the full series of BSG broadcast on Sky Atlantic (in SD), and I guess you make a leap of faith that such effects are real objects and stop considering it, so when watching the mini-series via the TV's software it was perceptibly different. It more obviously looked like an effect. When the ship was in flight the CG took over, and it all looked a bit more "normal", more similar to the series as broadcast on TV.

I also noted that although the show appeared to be playing at normal speed (people's voices sounded fine and were in sync), when they moved, they seemed to be in a slightly unnatural rush! People walking looked ever so slightly as though they'd been sped up, maybe slightly jerky.

So is the likely cause of these effects that frame interpolation is turned on by default? If so, I wonder if I can switch it off for Media Play, or if I need to do so in the TV's settings menu. I'd assumed that the calibration settings you entered into the TV would be standard across all devices connected to it (as is the case with DVD), the only exception being my PS3, which has the same calibration but uses Game Mode, which obviously modifies some settings. But it seems Media Play has its own PQ settings, unless it's just in some weird mode by default...

You have to calibrate all inputs separately as each one should have its own memory. This is because no two pieces of equipment will output video in an exactly identical manner. Every time you change your player or buy a new device you should calibrate or re-calibrate its input.

The "Game" mode switches off all post processing to reduce lag. For a DVD or Blu-ray player, "Movie" "Film" or "Cinema" (depending on brand) modes are usually best as they tend to be set closer to spec and have more tuning options. Some higher spec TVs have a Pro mode where everything can be fine tuned to as close to perfect as possible.

While it sounds like you've switched off interpolation for some of your devices it may still be active for others. You should be able to switch it off in the TV's menu.

quote:

ORIGINAL: Dpp1978 You have to calibrate all inputs separately as each one should have its own memory. This is because no two pieces of equipment will output video in an exactly identical manner. Every time you change your player or buy a new device you should calibrate or re-calibrate its input.

Ok, now I'm unsure as to what I've actually done. I don't recall re-calibrating the Sky box, the DVD player and the PS3 (for Blu-ray) separately. If I remember correctly, they all looked "wrong" to start with, and I re-calibrated using a DVE DVD disc & lenses, and now none of them look "wrong". Could my single calibration have globally affected all inputs?! Maybe I just copied the values I'd used for DVD to the Sky and PS3 inputs, and it's just that I've forgotten that I did it... Dunno. I'll check 'em tonight.

Assuming that, one way or another, I've just duplicated the settings for all inputs (excluding the hard drive connected via USB), I know I have the DVD settings pretty spot on. What source would you use to calibrate the Sky+ box? Is there a test channel? For the PS3, I guess I could get hold of a Blu-ray calibration disc, though I'd rather not stump up the cash again if I can help it. The pic quality for Blu-ray is very good, but maybe I'd find it was slightly off...

When initially trying to get the TV settings right, I remember there was some judder when watching SD tv broadcasts, and I set an option in the menu to reduce this, which I think was part of Auto Motion Plus. But that sounds like I was turning the frame interpolation on...

quote:

ORIGINAL: Dpp1978 The "Game" mode switches off all post processing to reduce lag. For a DVD or Blu-ray player, "Movie" "Film" or "Cinema" (depending on brand) modes are usually best as they tend to be set closer to spec and have more tuning options. Some higher spec TVs have a Pro mode where everything can be fine tuned to as close to perfect as possible.

I was definitely wrong about this - Game Mode cannot be switched on for my PS3. I know I did implement it originally (before calibration), but my PS3 is used for Blu-rays and iPlayer etc far more than it is for gaming. So I must have turned Game Mode back off, either with the intention of turning it on every time I played a game, or having found that I didn't need it after calibrating (?). I don't notice any lag when playing games now, anyway...

quote:

ORIGINAL: Dpp1978 While it sounds like you've switched off interpolation for some of your devices it may still be active for others. You should be able to switch it off in the TV's menu.

I'll have to have a look. I know Media Play has a very limited set of Tools, which include the TV mode (Game, Film etc), but it definitely doesn't have all the PQ settings, and I don't recall Auto Motion Plus being listed in there. I'll play about with the TV menu proper whilst in Media Play and see what happens...

I had a look last night, and it seems I have applied the same settings to the DVD player and PS3. The Sky box calibration is the same, except for the Motion Plus setting. The former two have Blur Reduction set at 8 (out of 10) and Judder Reduction at 0, whereas the Sky box has Judder at 2. Works for me; Sky looks fine and doesn't judder anymore. But I will check out that link in case the PS3 Blu-ray calibration needs adjusting.

I also looked at the settings for Media Play (for USB devices). I was right to say that not all the PQ settings are available - the TV's Menu button doesn't work whilst you're in this mode, and there's no option for changing individual colours etc, just basics like brightness and contrast. However, I did find Motion Plus and entered the same values as above, and it did the trick in reducing the soap opera effect. Perhaps if I copy the BD files from your link to the hard drive I can play them there and adjust whatever settings I can.

"I put no stock in religion. By the word 'religion', I have seen the lunacy of fanatics of every denomination be called 'The Will of God'. Holiness is in right action and courage on behalf of those who cannot defend themselves."

Like I said in the other thread, it depends on the context of what footage was shown, which in this case was ten minutes of unfinished shots. As most audiences are completely unfamiliar with the new frame rate it will obviously look jarring to begin with, but I'd rather make up my own mind when I see the film in full, which will give my eyes time to get used to it.

This could go either way: it could be a very expensive folly, or it could revolutionise the way films are shot in this brave new digital age. Frankly, Jackson ought to be commended for having the stones to do something like this; whether it works or not, he's pushing the boundaries.

_____________________________



I, like you, will wait to make up my own mind; that is assuming I get to a 48fps screening.

They have been clever in that they shot it in such a way that they could also release it at 24fps with minimal motion artefacts. The trailer at 24fps certainly doesn't look like a daytime TV show.

The worst-case scenario as far as I'm concerned is forgoing the 48fps version and seeing the 24fps version instead.

What I want to know is how you will know which version you're seeing. I've seen reference to a 4k 2D version (not sure at which frame rate, or if it's both); 2k 3D at 48fps and 24fps; 2k 2D presumably at both frame rates; and a 35mm film version at 24fps. No doubt there will also be an IMAX release.

Does anyone else look back with fondness at the days when all you had to decide was which film you wanted to see? Pretty soon buying a cinema ticket will be as bad as buying coffee from Starbucks.

I'm assuming it's 24 FPS - I wouldn't have thought there are any projectors currently in cinemas that could even do 48 FPS at 4k.

I'm pretty sure I read that Sony's 4k projectors will be able to with a firmware upgrade, which makes sense, as the way they do 2k 3D is to have two 2k images stacked within a 4k frame and a dual lens to put one over the other on the screen. If it can do 48fps 2k 3D, I'd imagine it should be able to do 48fps 4k 2D.


Some HFR-related announcements are expected at CinemaCon, as digital cinema equipment manufacturers are working to be able to support whatever the demand might be from exhibitors and studios.

Series 2 projectors from Barco, Christie and NEC -- 40,000 to 50,000 are installed worldwide, according to Christie -- would be able to show The Hobbit at HFR with a currently available software upgrade and a piece of hardware called an integrated media block (IMB) equipped to play 48 fps, vendors explained.

Between Cinemark and Rave, there are nearly 4,000 screens in North America that have Barco Series 2 projectors with the required software and a Doremi IMB with beta software to make it capable of playing HFR, Barco vp digital cinema entertainment Patrick Lee told The Hollywood Reporter.

Sony expects the majority of its 13,000 installed 4K digital cinema projectors to support high frame rates by the time The Hobbit is released Dec. 14, though the film also will be available in 24 fps.


_____________________________


quote:

Digitally shot, high-framerate, high-resolution pictures LOOK better; it's a question of getting used to the future.

High-framerate images look different from 24fps images; whether they look better is a matter of personal taste. It is irrelevant whether they were shot digitally or on film.

High-frame-rate photography is not new. Doug Trumbull's Showscan system shot 60fps on 65mm film. In the '70s. That is higher resolution (both spatially and temporally) than The Hobbit is being shot at.

quote:

Digital technology is FAR superior to chemical film.

Massive generalisation there.

There are some things digital is really good at: it tends to do better in low-light conditions, and storage media are reusable. The ability to instantly review your shots is a boon.

Film still has an advantage in bright conditions and in overall dynamic range (at least for now). It is also, potentially, higher resolution than any digital sensor, and you can upgrade your camera easily and relatively cheaply by changing the film stock you are shooting. Buy a digital camera and the sensor it came with is, in the vast majority of cases, the one you are stuck with.

As far as image quality is concerned both are capable of greatness so why worry?

quote:

They actually NEED better lenses than chemical films because the artifacts show up.

That is because the sensors' imaging areas tended to be smaller than the frame of a 35mm film camera, especially in the early generations. If you cram more pixels into a smaller space you need a sharper lens to get the most out of it. It comes down to an optical principle called the circle of confusion.

Use the same sharp lens on suitably fine grained film and it will perform just as well.

quote:

It's the inherent conservatism in the industry that shows idiots that complain of the Hobbit footage in 5K! 48 fps looks bad.

It could be they just didn't like it. It may be something that they will get used to but was jarring at first. I wouldn't call them idiots for that.

I reserve judgment on how it looks until I've seen actual footage; I might love it, I might hate it. I've seen some 48fps footage with a similar shutter speed to that used on The Hobbit and it certainly looks different. I wouldn't say it looks bad, but it might take some getting used to.

In any case, even if the Red cameras are shooting at 5k (which is a debatable figure) no cinema will show it at that resolution. All 3D showings will be limited to 2k, and the highest-resolution 2D standard is 4k. Resolution is only one factor in good image quality.

quote:

Of course old movies shot with the old technology should look as they are, but profiles on Digital TVs can actually make them look natural most tv have settings to turn off the up hz'ing and so forth.

The frame interpolation on these TVs looks horrible, at least to my eyes. I'd rather watch a film at its native frame rate wherever possible, be it 12fps, 18fps, 24fps or greater. Narrative film-making is not about creating natural images; it is about creating the images that best tell the story. The images may be naturalistic, but usually aren't.

quote:

We don't have this stupid discussion in videogames where more framerates is actually better even so high we physically can't see it about 60 fps for most people, but some can feel more.

No, you get plenty of other stupid discussions, though. Games and movies are different art forms. What is good for one is not necessarily good for the other.

quote:

For instance BBC Nature series LOOKS better than almost any movie today shot entirely digital and native tv framerates.

Which for the BBC is 25fps.

quote:

National Geographic photos all digital.

I'm pretty sure it is up to the photographer what medium they use.

In any case, no digital sensor will compete with a good large-format film camera for sheer image detail. A 5"x4" plate camera can produce the equivalent of a 20k image. That is over 300 megapixels.

All of which is meaningless if the photographer doesn't know what he is doing.

I'd rather just sit back and look at the pretty pictures than worry about the technical niceties. It is only if something looks bad or is jarring or out of place that I'll worry about it. And even then not for too long.

I have something called 'Real Cinema' on my LG 32" LCD TV. I can't really see this soap opera effect either way with Blu-rays or DVDs, although I think it's only 50/60Hz. Is it these 100Hz+ TVs that have the soap opera thing?