Is there any program capable of creating true 59.94 FRAME per second progressive video from a 59.94 FIELD per second interlaced MPEG-2 source? As far as I can tell, all the common tools out there (TMPGenc, etc) assume that the user is either dealing with film-source (which is 23.98/24fps, end of story, and nothing short of actual motion-vector temporal rate conversion is going to change that) or is trying to put 59.94 field/second video onto a video CD at 29.97fps.

What I *really* want to do is to be able to take a normal MPEG-2 file ripped from a DVD (704x480, 29.97fps/59.94hz) that came from a true interlaced source to begin with (so each field literally captures a sequential moment in time 1/59.94th of a second after the previous) and use it to extrapolate a nice, clean 59.94 frame/second progressive video.

Yeah, I know that the only way I can actually watch it is by using PowerStrip to force the laptop's refresh to 59.94Hz and connecting it to an inherently progressive display (like, say, an Infocus X1 DLP projector), since most CRT-based HDTVs insist on mangling 480p to 960i anyway.

I can. So I want to. It's absolutely killing me that I finally have a display capable of rendering wall-sized 60fps video, but nothing suitable to feed it besides raytraced eye candy. The X1's Faroudja chips are nice, but I suspect the results they're forced to produce in realtime are nothing compared to what might be achieved if a PC were given a few days to chew on the file and do it right -- trying multiple strategies per screen region and scene and playing Bayesian games with them until it determines the best strategy for every last moment of the video... something that's simply not an option when realtime output is required (if only because such a strategy simply can't realize after the fact that it made a bad strategy decision and go back to fix it... it's already been displayed by then).

Well, actually I was expecting the deinterlacing algorithm to extrapolate the missing scanlines to fill in the half that are missing from each field (using fields that come before and after any particular field to discern the missing info). That's one other reason why a non-realtime algorithm would ultimately be better... it could hold off on filling in the blanks on, say, field #28,703 until it's had time to study the next 408 fields and compare them against earlier ones.

For example... suppose there's a 12 second scene. At some point during the scene, the camera might be relatively motionless for two consecutive fields, so the full vertical detail in the scene can be captured. Suppose 3 seconds into the scene, there's rapid vertical motion (the worst-case scenario). Looking forward and back, the algorithm might recognize that certain regions are stationary background, and that it can safely rip the missing scanline data from the same region of some other fields, then handle the remaining volatile regions (surrounding moving objects, or morphing/changing objects that simply can't be predicted) by bobbing.
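A rough sketch of that look-ahead idea, in plain Python. All names and thresholds here are my own invention (no existing tool is assumed): a region is compared against candidate same-parity fields from elsewhere in the scene; if one matches closely enough, its lines are woven in, otherwise we fall back to bobbing. Fields are just lists of pixel rows.

```python
# Hypothetical offline weave-vs-bob decision for one region of one field.
# `threshold` is an arbitrary illustrative value, not from any real encoder.

def region_is_static(field_a, field_b, threshold=2):
    """True when two same-parity field regions differ by less than
    `threshold` per pixel on average (i.e. nothing moved between them)."""
    diffs = [abs(a - b) for ra, rb in zip(field_a, field_b) for a, b in zip(ra, rb)]
    return sum(diffs) / len(diffs) < threshold

def fill_missing_lines(current, candidates):
    """Take missing-line data from the first static candidate field found
    (weave: real detail), else duplicate the current field's own lines (bob)."""
    for cand in candidates:
        if region_is_static(current, cand):
            return cand                   # weave from another moment in time
    return [row[:] for row in current]    # bob fallback for volatile regions
```

A non-realtime pass could run this per region with candidates drawn from hundreds of fields forward and back, which is exactly what a realtime chip can't afford to do.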

The main problem with deinterlacing to 29.97 and showing each resulting frame twice in a row is that the temporal information present in the original interlaced video is irreparably mangled. It's a form of digital noise (like aliasing and quantization errors in CD audio), not entirely unlike 3:2 pulldown artifacts, that most people don't really notice until they learn to see it (though they might vaguely perceive that something is wrong). But once you're unfortunate enough to learn how to recognize it, you can't help seeing it everywhere it exists and being driven crazy by it.

Apps like TMPGenc have the right general idea, but their algorithms are weighted more towards eliminating combing at the possible cost of phantom images (say, two footballs created from the widely-separated scanlines of a single fast-moving ball). Since they're combining two frames into one, the algorithm needs to realize that the two footballs are actually the same moving object and decide which one is the "real" one. By keeping the temporal rate at 60fps and treating each field as the final authority on positioning onscreen images (using adjacent fields to fill in missing info, but NEVER in such a way that would contradict the current field) the phantom problem can be eliminated.
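The "current field is the final authority" rule could look something like this per missing line. This is purely my own illustration of the idea (the function name, tolerance value, and plausibility test are all assumptions, not anything TMPGenc does): a line woven in from the adjacent field is accepted only when it agrees with its neighbours in the current field, so a fast-moving football can never be duplicated.

```python
# Hedged sketch: weave a line from the other field only when it doesn't
# contradict the current field; otherwise interpolate from the current
# field alone. `tolerance` is an arbitrary illustrative value.

def deinterlace_line(above, below, woven_candidate, tolerance=20):
    """Return the candidate line from the adjacent field if it is close to
    the average of its current-field neighbours, else that average itself."""
    interpolated = [(a + b) // 2 for a, b in zip(above, below)]
    disagreement = sum(abs(w - i) for w, i in zip(woven_candidate, interpolated))
    if disagreement / len(interpolated) <= tolerance:
        return woven_candidate            # extra real detail, no conflict
    return interpolated                   # the current field wins
```

Since the current field always wins any disagreement, a phantom image would require the phantom to already be consistent with the current field's own pixels, which by construction it isn't.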

If it's really film, the entire frame is from the same instant. They shoot 24 per second and that is the limit, unless you want to interpolate (not extrapolate) between two frames to get 48fps. In that case you are inventing data.

In the case of television cameras, they get 60 fields/sec, and they are all from different points in time, and putting 2 fields together makes a bad frame.

Can I ask the Rhetorical Question: What good is a 59.94fps progressive display when there are absolutely no 59.94fps sources out there? Nor will there be in the foreseeable future?

You can indeed convert fields to frames, but that always looks terrible (1/2 the data is missing; look at water or trees... ugh). I think you're hosed until you can find progressive sources (not progressive players).

The only one I know of is Terminator 2 Special edition, which has HDTV WMV files on it. Try downloading the LOTR3 or MATRIX3 samples in HDTV WMV and project those, let us know how they look.

Quite an interesting question. Although it seems like a nice idea, and keeping in mind what you want to do, I wonder if perhaps the only solution is using AVISynth to double up the frame outputs, use a filter that separates, and/or matches the fields, and perhaps reweaves them back again. Unfortunately, this might require reencoding your MPG2 stream. Since this requires lots of horsepower, I'm hard pressed to find any software that will be able to do what you want successfully.

If what you're asking for can be achieved, then I will totally reinvest my time and equipment to do so, since my footage is essentially pure interlaced. I can't believe that it hasn't warranted more questioning in the past.

But the closest advice (sorry, that's all I've got) is to check out some of AVISynth's functions. By just opening up the .AVS file, you can go to fast-moving scenes on a frame-by-frame basis and analyze and tweak some filter functions. At this point in time, I can only think of Decomb (although this wasn't its original intent), and live with a bit of the FieldDeinterlace that comes about (I believe it's motion adaptive)...
I think there are some brainiacs in this forum, or in the Doom9 forum, who have successfully doubled up and rewoven fields to create true progressive frames... In the end, it's gonna still require some deinterlacing.

Good luck, and if you find out how outside this forum, could you please reply back here? The MiniDV crowd would appreciate the advice.

Radical Deinterlacing™ (and its companion, hardcore temporal rate correction) have been video fetishes of mine for years

Every 6 months or so I try to remind people that it's a viable concept... especially now that a thousand bucks will buy an Infocus X1 to throw a progressive 800x600 image (or letterboxed 16:9 800 x 450 one) on the wall, complete with VGA connector to make it easy to leverage the output of the most sophisticated and versatile piece of video gear ever created: their PC.

It never ceases to amaze me how utterly fixated the TV industry is on things like interlacing and least-common-denominator standards. In a lot of ways, it reminds me of what the American (IBM) PC video game market was like back in high school, before the Amiga's death. American video game companies like Sierra were so obsessed with not alienating a single one of the millions of ancient PCs with sub-VGA/pre-386 hardware that they produced game after game that just sucked on high-end hardware. It wasn't until the Amiga finally withered away that Amiga-centric developers (who used to regard the sale of 50,000 copies as direct evidence that God existed, personally owned a copy of their game, and enjoyed it immensely) came to realize (around early 1992) that even if you wrote off every last PC in existence with less than a 486SX/33, 4 megs, and an ET4000-based video card, there were still twice as many potential customers as the entire Amiga market ever was. So they made games like Comanche that demanded top-notch hardware but blew everyone away, and started the arms race that now leads AMD and Intel to target their fastest and 64-bit CPUs at gamers rather than server users.

Sigh. Anyway, every now and then I just wish people would try a little harder to push the boundaries a bit and try things just because they can. I still hope that soon, some renegade DVD manufacturer will quietly (or loudly) extend the unofficial capabilities of their progressive-scan DVD players just a wee bit to enable the burning of 4.7-gig "xSVCDs" with a few new profiles (all progressive), like 16:9 720x480@60fps, 4:3 640x480@60fps, and 4:3 480x480 @ 60fps.

A better MPEG-2 decoder chip, a few weeks of firmware authoring by a few employees, and the first company to pull it off will have the crowd at CES worshipping them, with a DVD player that maybe costs $3 more to build but commands a $100 price premium over otherwise comparable proscan models and enjoys cult status like the Apex 600a did (and still does, on eBay). Their competitive advantage would likely last only a year or so before everyone else added the same capabilities, but considering that DVD players have basically become Wal-Mart commodities, it amazes me that nobody has actually done it yet.

Even now, it saddens me to think that the most powerful individuals in the TV industry honestly can't even imagine why consumers would want the ability to display 480p72 (3:1 film), just to give an example. It's something that would cost practically nothing to add to even a mid-range HDTV, but isn't likely to happen anytime soon. To their mindset, it's pointless because it's not one of the official ATSC standards. The fact that ATSC means nothing unless you actually want to broadcast the video over public airwaves (vs. distributing it optically, via satellite, or over privately-licensed radio spectrum) is just beyond their grasp.

And people wonder why I watch certain movies in front of my 21" monitor and not on my 25" TV. The Matrix3 Trailer in 1024x768 (albeit 29.97) WMV made me cry. The fact that it was telecined made me cry more :P

What's the point of killer graphics if you resort to a framerate a PII user wouldn't tolerate for playing Unreal Tournament? I can tell a 23.976 and a 29.97 video apart at a glance. I see interlacing artifacts all the time because I check for them in my encoding. Feh. Even HDTV on my digital cable is nothing more than a higher-res version of the main feed. Interlaced, to boot. There must be a lot of old fat white systems engineers at the broadcast networks who are afraid of anything new........

No rush... less than 3 years to go before all analog broadcasts in the US cease to exist. Hard to bump it when the frequencies are already resold.

Digital TV and HDTV aren't the same thing. Few people know that. But, I'd be willing to pay extra for real HDTV feeds (once you go HDTV you never go back :P ). In fact I can now, but the quality isn't there.

So after all that, the whole problem is that you notice flicker in 24fps video?

Judder. Or worse, the lurching 3:2 cadence most people call "the film look," which a few crazy people actually want to EMULATE (eek). Flicker is nicely taken care of by DLP or a 72Hz refresh.

Originally Posted by FOO

We don't trust people who use words like "leverage"
here. Marketing people and Jack Valenti use words like that

Confession: I keep hoping that one of these postings might someday fall into the lap of the Right Person who's in a position to influence the feature set of future players. If "leverage" gets their attention, it did its job.

Personally, I suspect that when/if the feature finally DOES arrive, it'll happen when some engineer notices that a future chipset supports it, thinks it would be a neat feature to have available but undocumented, and quietly leaves it enabled (but unsupported, and actively concealed from management lest it be ordered disabled as a kneejerk reaction) for some future pioneer to discover and take advantage of (like UEI's All for One Remote controls that can be reprogrammed with a hacked cable and PC via jp5, or some GM car radios that let you assign stations to the "cracks" between the buttons).

A little late on this one, I know, heh, but I just came across it and found the idea interesting.

I think the concept, in and of itself, is fairly simple. Although I don't personally have any experience doing this sort of thing, I can't imagine that no software out there can do this, or at the very least, that it wouldn't be trivial to implement. (Gagh. After writing this, I of course remembered the "bob deinterlace", which is essentially what you want. Check out http://www.lukesvideo.com/interlacing.html and scroll down until they explain the bob deinterlace. I'm pretty sure there is a Bob filter (possibly several) for VirtualDub.)

Essentially you want to take 30fps interlaced (yes i'm rounding :P) and turn it into 60fps progressive.

This shouldn't be too hard. In fact, it seems quite similar to an age-old deinterlacing technique (I forget the name, and am too lazy to look it up, heh). Essentially it's the "drop one field and interpolate" method, where you basically ignore one field per frame, take the remaining field, and interpolate it back up to the original frame's size. A fairly down-and-dirty method (i.e. straightforward) that is less than ideal, since it's a wasteful loss of quality.

So all one would need to do is tweak the above method, to NOT drop the 2nd field per frame, and merely interpolate both fields up to full frame size, and pack them both into the stream, doubling the frame rate.
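That tweak amounts to the classic bob: every field becomes its own full-height frame, and the frame rate doubles. A minimal sketch of the idea in Python follows (function names and the simple averaging interpolation are my own choices for illustration; real filters use smarter kernels). Fields are lists of pixel rows.

```python
# Bob deinterlace sketch: expand each half-height field to a full frame by
# averaging adjacent field lines, emitting one frame per field (2x rate).

def bob_field(field):
    """Expand one field to full height: keep each real line, then insert
    an interpolated line averaged with the next field line below it."""
    frame = []
    for i, row in enumerate(field):
        frame.append(row[:])
        nxt = field[i + 1] if i + 1 < len(field) else row  # repeat at bottom
        frame.append([(a + b) // 2 for a, b in zip(row, nxt)])
    return frame

def bob_deinterlace(frames_as_fields):
    """`frames_as_fields` is a list of (even_field, odd_field) pairs from a
    30fps interlaced stream; the result has twice as many frames."""
    out = []
    for even, odd in frames_as_fields:
        out.append(bob_field(even))
        out.append(bob_field(odd))
    return out
```

The quality loss the next paragraphs complain about is visible right in `bob_field`: half of every output frame is synthesized by averaging rather than taken from the source.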

The reason I am posting, of course, is because I've often toyed with another idea. I conceived it when trying to figure out the "ideal" solution for deinterlacing.

The above idea (dual field interpolating) is less than ideal. Interpolating, means you lose quality. And it's kinda silly to lose this quality, when you have the ability to preserve ALL info from the stream.

60fps, in theory, should mean that deinterlacing can be done flawlessly. The previous idea sacrifices image quality for gaining speed (smoothness). Most other forms of 30fps deinterlacing, sacrifice smoothness for image quality.

So my idea was something along the lines of "Why not simply mimic the tv's display of interlaced video?". 60fps, where each new frame contains only 1/2 frame of new info. Or conversely, each new frame contains 1/2 old info.

So assume we have a frame in this new stream, a 60fps stream. We are constructing it from a 30fps interlaced stream (60 FIELDS per second).
To create the new frame, we take the old frame and the next field (assume it's the EVEN field), and simply copy the field's lines onto all the EVEN lines of the frame. So our new frame has all the ODD lines of our previous frame, and all the EVEN lines are from the new field.

To do the next frame, we take the previous frame and the next field (which would be the ODD field now), and simply copy the field's lines onto all the ODD lines of the frame. So our new frame has all the EVEN lines from our previous frame, and all the ODD lines from the new field.

So there we have the simple construction of a 60fps progressive stream, from a 30fps interlaced stream. The net effect is pretty much exactly the result you would see on the tv. Essentially we use the 60fps stream to "fake" the results of a 60hz interlaced display. You retain smooth motion, and also image quality.
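The construction above is compact enough to sketch directly. This is only my own illustration of the idea as described (the function name, the blank starting frame, and even-field-first order are assumptions): each output frame is the previous one with the newest field's lines overwritten, alternating parity.

```python
# "Mimic the TV" weave playback: one output frame per input field, where
# each frame = previous frame + the new field's lines at its parity.

def weave_playback(fields, height, width):
    """`fields` is a flat list of half-height fields alternating
    even/odd (even first, by assumption). Returns full frames, one per field."""
    frame = [[0] * width for _ in range(height)]   # blank until both parities seen
    frames = []
    for n, field in enumerate(fields):
        parity = n % 2                  # 0 -> overwrite even lines, 1 -> odd
        for i, row in enumerate(field):
            frame[2 * i + parity] = row[:]
        frames.append([r[:] for r in frame])       # snapshot current state
    return frames
```

Note that after the first two fields, every frame is half brand-new information and half one-field-old information, which is exactly the persistence behavior of a 60Hz interlaced display.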

What's also interesting to note is that we really don't need to create a separate 60fps stream to get this result. It could easily be done by a "smart" player. All the info required to do this is stored in the 30fps interlaced stream, and it could very well be less intensive to play back the 30fps stream in 60fps form than a true 60fps stream. Using fancy DirectX/DirectDraw techniques, I'd imagine that this would be quite trivial.

I'm not sure if any software dvd players (which provide realtime deinterlacing) use the above technique. It's not too practical as a conversion technique (true 60fps playback requires a beefy machine), but as a playback "effect", I can't see why not.

Anything that requires me to replace all my video hardware and viewing devices (tv's etc) is garbage in my book.

I'm not changing anything. Take all your new-fangled crap and go away.

I don't see anything wrong with modern television. My eyes are somewhat fuzzy anyway, even with contacts.

I still can't believe people stare and look for errors. Whining about 24fps and all. I think it's more in their head than reality.

I like pushing boundaries on technology. But only when it has value.

I'd venture to say 99% of the world doesn't care about higher-definition TV. Hell, most countries still don't have TV. Even places in Canada and the USA don't have TV or terrestrial stations (and those people cannot afford satellite; no cable available either).

I think the worst thing about TV is the commercials that get 10 times louder than the show. That pisses me off, and everybody that I know too.

"If you build it, they will come" only works in "Field of Dreams". There's plenty of crap built in this world that now sits at the bottom of a landfill.

No tv in 3 years? Yeah, right. I can see the NAACP and ACLU all over that one. They'll have records and statistics showing minorities cannot afford new stuff, and that this is merely a way to exclude blacks from viewing entertainment. You just watch!

Actually, that's the way all deinterlacers for rendering on a PC work.
The bad ones only use bob (simple scaling), but any decent graphics card, such as an ATI Radeon, does have adaptive deinterlacing and will display 59.94Hz full-vertical-resolution progressive video from an interlaced NTSC source.

Most DVDs are progressive (24Hz FILM), so it's irrelevant in this case, as you want to display 24 frames per second.

I get really annoyed at IMAX films now, as they have the ability to shoot at 48fps but they all shoot at 24fps. As a result, the best visual experience available includes such flicker that I can see individual frames.

100fps.com is a good guide for DivX, but we want HD MPEG2 at 50/60 fps!