I have been shooting with a Sony HDR-FX1 1080i underwater video setup. I see some cameras like the Sony HDR FX1000 which will shoot at 1080p, as well as the PMW-EX3 which will also shoot 1080p.

My question is this: whether the footage is shown on a computer screen, via an HD projector on a flat screen in a seminar hall, or on a 1080p HDTV, is there really going to be any visual difference between 1080i that is deinterlaced in post and video shot natively at 1080p?

One thing to consider is that there's a lot of variation in quality among monitors, projectors, and HDTVs. The poorer ones will mask differences that the more expensive ones might reveal. My $0.02 is that with smart (motion-adaptive) deinterlacing and low- to mid-range display equipment, there will be a negligible difference.
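For what it's worth, the "motion-adaptive" idea can be sketched in a few lines. This is only a toy illustration (a NumPy array stands in for one interlaced frame, and the function name and `threshold` value are made up for the example), not how any shipping deinterlacer is implemented: where the two fields agree, the lines are woven together untouched; where they disagree, the odd lines are replaced with interpolated guesses.

```python
import numpy as np

def motion_adaptive_deinterlace(frame, threshold=12.0):
    """Toy motion-adaptive deinterlace of one interlaced frame (rows x cols).

    Static areas: keep the original lines ('weave', full vertical detail).
    Moving areas (where the two fields disagree): replace each interior odd
    line with the average of its even neighbors, which removes combing at
    the cost of vertical resolution.
    """
    f = frame.astype(float)
    out = f.copy()
    above = f[0:-2:2]        # even line above each interior odd line
    below = f[2::2]          # even line below it
    odd = f[1:-1:2]          # the interior odd lines themselves
    interp = 0.5 * (above + below)
    moving = np.abs(odd - interp) > threshold   # fields disagree => motion
    out[1:-1:2] = np.where(moving, interp, odd)
    return out
```

A real filter works on color video, uses neighboring frames as well, and makes a much smarter motion decision, but the weave-vs-interpolate trade-off is the same.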

Deinterlaced 1080i60 is not the same as native 1080p30 and will not look the same. The FX1000 can shoot 30p but records it as 60i. The Z5 does record natively in 30p. Neither shoots 1080/60p; in fact, no HDV cam does.

My understanding of the process is that with the smartest deinterlacing, the two fields are blended to smooth out the difference between them, so the result appears somewhat smoother than footage shot at only 30 (albeit fully progressive) frames per second. My further understanding is that even the most brilliant deinterlacers can't truly "create" 60p from 60i. I think that at best deinterlacers are just "guessing" at the missing data that would be there if any of these cams could actually shoot 1080p/60.
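To illustrate the "guessing" point: a "bob" deinterlacer makes pseudo-60p by expanding each 540-line field into a full 1080-line frame, interpolating the lines that were never captured. A minimal sketch, assuming grayscale NumPy arrays (the `bob_field` name and the simple line-averaging scheme are just assumptions for illustration):

```python
import numpy as np

def bob_field(field):
    """Expand one field (h lines) to a full frame (2h lines).

    The even output rows are the real captured lines; the odd rows are
    interpolated guesses, averaged from the field lines above and below.
    Applied to all 60 fields per second, this fakes 60 full frames, but
    half the vertical data in every frame is invented, not recorded.
    """
    h, w = field.shape
    out = np.empty((2 * h, w), dtype=float)
    out[0::2] = field                               # real captured lines
    out[1:-1:2] = 0.5 * (field[:-1] + field[1:])    # interpolated lines
    out[-1] = field[-1]                             # duplicate last line
    return out
```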

But I could certainly be wrong. My actual point is that when people start talking about "1080p" we need to be careful about specifying frame rates (as you mentioned in your post) because many people hear this and are seduced by it without realizing that "1080p" by itself means nothing. I think the TV manufacturers have done such a good job of marketing the phrase that people think it always means 1080/60p or 1080p/60, as it seems the OP might have.

Adam, so do you believe 1080p shot at 60i will not look as good as 1080p shot at 60p?

Sorry, you lost me. There is no 1080p shot at 60p. There's 1080/60i or 1080/30p. From what Graham says you may be able to get nice pseudo-60p out of 60i with the right deinterlacer, but it's something you'll probably have to try yourself to see how you like it.

True 1080p/60p is the holy grail, but so far it doesn't exist as an acquisition format in HDV tape. Not that I know of, at least. And even if it did, to my knowledge no NLEs could handle it.

I'm afraid we're overcomplicating the matter: what Allen was originally asking is if there is visual difference between the two acquisition formats when played on different displays.

My subjective opinion, based on what I see on the screen: Yes, there is a difference. Video shot originally interlaced will look smoother on the screen (regardless of the screen type and size) even after deinterlacing; progressive video is significantly sharper. The difference is less visible when there is no or little movement (camera or subject).

It also depends on how your eye is trained, and what you prefer to watch. Some people prefer smooth, I like it crisp. Bottom line: interlaced does not necessarily look worse than progressive.

What I am trying to figure out is my next videocam acquisition, hence the questioning.

I currently shoot with a Sony HDR FX1 1080i videocam and am eyeballing an upgrade. Sony has their HDR-FX1000 coming out shortly. This shoots at 1080p rather than i. I am trying to figure out if in the display venues I would use, such as an HD projector on a screen in a seminar hall, a flat panel TV or a computer, whether there would be an improvement in footage appearance quality. I would suspect on a computer, probably not.

Sony also has their PMW-EX3 which will be supported by a particular underwater housing manufacturer. I don't really need all the features of that camera, such as cine modes, 24p, etc.

But for that price, I can almost get a Scarlet 5K, which I suspect would be a QUANTUM leap in footage quality.

The problem I have with my FX1 1080i videocam is even when I deinterlace and have camera image stabilization turned off, the footage freaks out when either the camera is moving or the subject is. There is only so much I can do to stop the forces of nature, lol.

Anyway, all that is the impetus for determining whether I would get a sufficient benefit out of an upgraded videocam system.

Sony has their HDR-FX1000 coming out shortly. This shoots at 1080p rather than i.

No, it doesn't. That's the point I am trying to make. They *both* shoot _1080/60i_. The FX1000 also has an *additional* mode that shoots 30p. There is no 60p mode.

Quote:

Originally Posted by Ervin Farkas

I'm afraid we're overcomplicating the matter: what Allen was originally asking is if there is visual difference between the two acquisition formats when played on different displays.

Right, which is what I said in my first reply: I agree with you; there will be a difference. But the original question and two responses make it sound like Dave thinks there is such a thing as 1080/60p -- especially because he says "1080p shot at 60p" -- and that there is a camera that will shoot that, and I just wanted to be clear so he doesn't buy a camera that doesn't do what he thinks it does.

But you're right -- 30p doesn't look like 60i, whether the 60i is deinterlaced or not.

No, it doesn't. That's the point I am trying to make. They *both* shoot _1080/60i_. The FX1000 also has an *additional* mode that shoots 30p. There is no 60p mode.

Very well stated, thanks...

Any HDV camcorder recording 1080p is going to give you 1080 progressive either at 30 frames per second or 24 frames per second, but not 60. As Adam correctly points out above, the 1080p modes for an HDV camcorder will be 1080p30 or 1080p24.

I blame the HDTV set makers for this for slapping "1080p" stickers all over everything and making people think there is a 1080p/60 acquisition format. Hordes of people stumbling, chanting zombie-like "1080p.... 1080p.... " without knowing that by itself, without specifying a frame rate, this is meaningless.

I blame the HDTV set makers for this for slapping "1080p" stickers all over everything and making people think there is a 1080p/60 acquisition format. Hordes of people stumbling, chanting zombie-like "1080p.... 1080p.... " without knowing that by itself, without specifying a frame rate, this is meaningless.

1080p is different from 1080i, so the 1080p sticker is well placed.

60 frames per second (played at 60 frames per second) looks really smooth, but it's a little overkill if you ask me.

There is a difference between 1080p and 1080i; frame rate is a different story.
1080p captures a whole frame at a time, while 1080i captures half of the total lines (one field) at a time.
Even though the difference can be hard for some people to see, it is real.
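A tiny sketch of that last point, treating a frame as a NumPy array of scan lines (just an illustration, not any camera's actual pipeline): an interlaced camera records the even lines as one field and the odd lines as the other, so each field carries only half the vertical information.

```python
import numpy as np

def to_fields(frame):
    """Split a progressive frame into the two fields an interlaced camera
    would record: even scan lines in one field, odd lines in the other.
    Each field is full width but has only half the vertical lines."""
    return frame[0::2], frame[1::2]

frame = np.arange(1080 * 4).reshape(1080, 4)   # toy 1080-line "image"
top, bottom = to_fields(frame)
print(top.shape, bottom.shape)                 # each carries 540 of 1080 lines
```

On a static scene the two fields weave back into the full frame; the trouble starts when the subject moves between fields, which is exactly the combing the thread is describing.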