The other question to ask is why the current 1080p video frame looks so infinitely crappier than the equivalent 2MP still. If they could give me broadcast quality or better 1080p, people would not be waiting for 4k as much as they are.....

Have you taken a 2MP still on your camera to compare it with a 1080p video grab? (and I mean taken the image at 2MP, not resized in PS)

That's the only fair way to compare because of pixel binning etc. Oh and at JPEG as well. And at 1/50th or 1/60th shutter too.

And for complete parity can you also enlarge the 2MP print of your still to the size of your TV screen? Remember to keep everything sRGB for equivalence.

As I alluded to earlier, we put up with 400k resolution tv pictures for decades because the illusion of motion and the motion blur caused by the relatively slow shutter (not to mention the interlacing) was all too much for our lowly brain power to handle and so it looked all crisp and sharp and detailed and that. We only see each image for 1/25th or 1/30th of a second, so our brain is filling in a lot of the gaps at quite a rate. A bit like temporal compression in reverse.

Why isn't my tractor as fast as my coupe on the motorway? Why can't I plough a field with my bike? All similar questions.

And if it's not broadcast quality how come I've been getting stuff on telly shot on my 7D, 550D and 600D for the last few years?

If the 1Dx had an option for a 1920x1280 still, I'm sure it would look substantially better (even at the highest JPEG compression level) than a single video frame of the same static scene shot on a good, heavy tripod. The reason I say that: 4:2:0 encoding.

I would venture to say a Canon D30 (2160 x 1440) or Nikon D1/D1H (2000 x 1312) using the same tripod placement and lens in Large-JPEG-Normal (or even Small-JPEG-Normal at 1440 x 960 for the D30) would probably meet or exceed the resolution of any modern DSLR shooting 1080p of a static subject in good light (obviously the modern cameras would win hands-down in high ISO situations). The reason I say that is softening and loss of detail caused by 4:2:0 encoding and H.264 compression.
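The 4:2:0 point above is easy to quantify. A minimal Python sketch (the frame size and the subsampling definitions are the standard ones; nothing here is specific to any Canon model) counts how many color samples survive each scheme:

```python
# Sketch: sample counts for a 1920x1080 frame under different chroma
# subsampling schemes. Uses the common definitions: 4:2:0 halves chroma
# resolution both horizontally and vertically; 4:2:2 halves it
# horizontally only.

def chroma_samples(width, height, scheme):
    """Return (luma_samples, chroma_samples_per_plane) for one frame."""
    luma = width * height
    if scheme == "4:4:4":
        return luma, width * height
    if scheme == "4:2:2":
        return luma, (width // 2) * height
    if scheme == "4:2:0":
        return luma, (width // 2) * (height // 2)
    raise ValueError(scheme)

luma, c420 = chroma_samples(1920, 1080, "4:2:0")
_, c444 = chroma_samples(1920, 1080, "4:4:4")
print(luma)          # 2073600 luma samples
print(c420)          # 518400 chroma samples per plane, 1/4 of luma
print(c444 // c420)  # 4x less color detail than 4:4:4
```

So a 4:2:0 video frame carries only a quarter of the color information of a same-sized still, which is one reason the video grab looks softer.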

I think if someone has a modern camera and an old D30 laying around, they should compare them and see if 1080p can stand up to a 13-year-old DSLR. I would be very interested in the results.

We don't need higher resolution, we just need something other than the YouTube codec. Magic Lantern discovered yesterday that the Canon cameras store RAW 14-bit 4:2:2 DNG files in the buffer until they are converted to H.264; you can see the difference.

Ideally it would be nice to have a compressed RAW DNG file like the one found on both the new Blackmagic Cinema and Pocket cameras, or to have compressed RAW as one option and H.264 or AVCHD as a second option.

We don't need higher resolution, we just need something other than the Youtube codec.

The 4K spec tells us something about that. It means we can get 4K-sized video, but finding a camera that can also meet the 4:4:4 requirement is...troublesome. If you stick with Bayer-pattern sensors you'd need 39.3MP (wouldn't that be a nice resolution for a 1Dxs?)
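A rough Python check of that figure, assuming the common ~2x linear oversampling rule of thumb for getting full-color output from demosaiced Bayer data (the 39.3MP in the post implies a slightly larger factor, so treat all of these as ballpark numbers):

```python
# Rough estimate: Bayer photosites needed to deliver "true" 4:4:4 4K.
# Assumption: ~2x linear oversampling, i.e. 4 photosites per output
# pixel -- a rule of thumb, not a standard. The post's 39.3MP figure
# implies a somewhat larger factor.

def bayer_mp_needed(out_w, out_h, linear_oversample=2.0):
    return out_w * out_h * linear_oversample**2 / 1e6

print(round(bayer_mp_needed(4096, 2160), 1))  # DCI 4K -> 35.4 MP
print(round(bayer_mp_needed(3840, 2160), 1))  # UHD    -> 33.2 MP
```

Either way, the sensor resolution needed lands in the mid-30s of megapixels, well beyond current video-capable DSLR sensors.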

- They want to keep that as a high-end feature, to keep selling the 1DC at $12,000.
- Most people don't need it yet, and very few really want it too badly (keeping in mind most DSLR owners don't even care if their camera has a video mode at all).
- It requires more expensive hardware = lower profit margins or higher cost for the consumer.
- They are primarily still cameras - they are not built from the ground up with 4K video in mind.
- The negative feedback would be a PR disaster - look at GoPro's situation as an example. They've delivered quasi-4K video, 2.7K video, and 120fps video in a consumer device, but have copped all sorts of complaints from people who do not own fast enough micro-SD cards, or whose computers are too slow to even play back the footage. The same would happen if Canon started delivering 4K video to customers who do not yet understand the demands of 4K video.
- Very few CF cards can handle the demands of 4K video (see previous point).

And, most importantly:

- They are selling more cameras than anybody else right now. Why change what's already working? If sales drop, or they lose sales to a competitor who is offering 4K video in DSLRs, then perhaps they will offer it too.

Personally, I would rather see the video improved so that they offer proper 1920x1080 video (or perhaps even 2K, for that little bit of extra res for slight framing adjustments). The C100 footage is a whole lot sharper, as is the GH2's - especially hacked - and I wish Canon would at least attempt to get their DSLR footage up to this level. It is just frustrating that, aside from the moire-free (albeit softer) video of the mkIII, Canon has done absolutely nothing to actually improve the processing and image quality of their DSLR video since the mkII came out all those years ago.

Personally I would love to see proper, clean 1080p at 50/60fps and proper, clean 720p at 100/120fps in H.264 format (with increased bitrates to accommodate the extra frames), and an option for 24/25p 2K recorded to a better codec like CineForm RAW or CinemaDNG. That would offer a significant increase in IQ, while still keeping it well within the realistic confines of the average person's recording/editing/playback workflow.
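On "increased bitrates to accommodate the extra frames": a crude sketch, assuming bitrate scales roughly linearly with frame rate at constant per-frame quality (inter-frame prediction makes real encoders cheaper than this, and the 45 Mb/s base figure is hypothetical, not a Canon spec):

```python
# Back-of-envelope: scaling an H.264 bitrate with frame rate.
# Assumes bitrate grows roughly linearly with fps at constant per-frame
# quality -- a simplification, since inter-frame compression gains vary.
# The 45 Mb/s base is a hypothetical figure, not a camera spec.

def scaled_bitrate(base_mbps, base_fps, target_fps):
    return base_mbps * target_fps / base_fps

print(scaled_bitrate(45, 25, 50))   # 25p -> 50p: 90.0 Mb/s
print(scaled_bitrate(45, 25, 100))  # 25p -> 100fps: 180.0 Mb/s
```

Under that assumption, doubling or quadrupling the frame rate needs a matching bitrate increase just to hold per-frame quality steady, which is exactly why the high-frame-rate modes would need more than the current bit budget.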

...The "consumers don't have 4K TVs" argument is a little dodgier. On the one hand, yes, it's true, if no one has the equipment to watch 4K content, then it's silly for Canon users (excluding certain professionals) to clamor for the feature. On the other hand, 4K TVs have become semi-affordable...

It's not just the TVs, it's the lack of widely-adopted distribution format. Networks can't broadcast 4k, you can't get 4k via cable or satellite, you can't burn a 4k Blu-Ray disc, etc. For enthusiast-shot 4k material, how would you distribute it to your audience?

If the viewer has a 4K TV connected to a home media PC, you could hand them a portable hard drive with 4K material and maybe they could play it after updating all their codecs and software. That would be required for *each* viewer of your 4K material.
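The logistics become obvious with a bit of arithmetic, assuming a hypothetical 100 Mb/s delivery bitrate (real 4K bitrates vary widely depending on codec):

```python
# How big does 4K material get? Assumes a hypothetical 100 Mb/s
# delivery bitrate; real 4K codecs and bitrates vary widely.

def gigabytes(bitrate_mbps, minutes):
    bits = bitrate_mbps * 1e6 * minutes * 60
    return bits / 8 / 1e9  # decimal GB

print(round(gigabytes(100, 60), 1))  # one hour -> 45.0 GB
```

At tens of gigabytes per hour, a portable hard drive per viewer really is about the only practical hand-off.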

For professional use there's a better argument for producing in 4k so the material will have longer shelf life, much like filming color TV shows in the mid-1960s when few people had color TV.

paul13walnut5

[quote author=joema link=topic=14478.msg264664#msg264664]For professional use there's a better argument for producing in 4k so the material will have longer shelf life, much like filming color TV shows in the mid-1960s when few people had color TV.[/quote]

Which is also actually a good argument for shooting on film! Get a 4k scan today, then a 16k scan in five years when folks are having the same old argument about why their Rebel SL8 can't record 16k at 120fps.

Hmm, I don't think many people participating in this thread do much cinematography. Everyone who has talked about 4K not being broadcast, or 4K TVs not being mainstream, etc. as reasons why we don't have it in our everyday and even high-end DSLRs...I think you are generally missing the point of high-resolution video capture. It really isn't about the way you stream the video to your customers. It's about capturing as much detail as possible initially, for a number of reasons.

For one, 4k video, even 8k video, and 16k video if/when it ever arrives, is usually DOWNSCALED in post processing. Just like taking a high-resolution still photo and scaling it down 2x or 4x, you mitigate problems with the original video: you reduce noise, you improve sharpness, you eliminate artifacts (hot pixels, frame tearing, etc.)

Second, having more pixels gives you more "room" to work with, provided you frame adequately. With 4k video, or even better, future 8k video, you can frame a bit wide, adding a buffer for a variety of post-process corrections. This might be smoothing hand-held panning, stabilizing jittery hand-held video, or just plain old simple cropping to cut out something that ended up in the corner or edge of a scene that shouldn't have been there.
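The reframing headroom is simple to quantify: delivering a 1080p window from a 4K UHD capture leaves half of each dimension spare for stabilization and cropping.

```python
# Fraction of the source dimension left over when cropping a 1080p
# delivery window out of a larger capture frame.

def crop_headroom(src_px, out_px):
    return (src_px - out_px) / src_px

print(crop_headroom(3840, 1920))  # 0.5 -- half the width spare
print(crop_headroom(2160, 1080))  # 0.5 -- and half the height
```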

In the end, the ultimate goal is still to produce a 1080p final video product...regardless of whether you have 2k, 4k, or 8k RAW video source. In addition to that goal, though, is to have crisper, clearer, less noisy, stabilized, extremely smoothly panning video of unparalleled detail and sharpness...AT 1080p.
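The noise-reduction claim above can be sketched with NumPy: averaging each 2x2 block of a synthetic noisy "4K" frame down to "1080p" cuts independent Gaussian noise by a factor of sqrt(4) = 2. This is synthetic data, not real sensor noise, so treat it as an illustration of the principle only.

```python
import numpy as np

# Sketch: 2x2 block-averaging a noisy "4K" frame down to "1080p" halves
# the standard deviation of independent Gaussian noise (sqrt(4) = 2).

rng = np.random.default_rng(0)
frame = rng.normal(loc=128.0, scale=8.0, size=(2160, 3840))  # noise std 8

# Downscale by averaging each 2x2 block of pixels.
small = frame.reshape(1080, 2, 1920, 2).mean(axis=(1, 3))

print(round(frame.std(), 1))  # ~8.0
print(round(small.std(), 1))  # ~4.0 -- noise halved by the downscale
```

Real downscalers use better filters than plain block averaging, but the square-root noise benefit is the same mechanism.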

To be honest, I am rather certain that, when 4k becomes commonplace and mainstream, little broadcast 4k TV content will actually be shot at 4k, even if the camera bins 8, 16, or 32 megapixels to produce it. I suspect that quality 4k programming will ultimately be shot with high-end 8k cinematography equipment, for the same reasons we all want 4k video in our DSLRs now.

I think there are two fundamental reasons why we don't have 4k video in our DSLRs. For one, it is kind of a high-end, prestigious thing, and it makes sense for companies competing in that arena to protect it. If we are really complaining about a $7000 camera not having 4k, then it isn't too much of a stretch to think someone could pick up a CinEOS that does 4k for $15k...one has to figure that if you're spending seven grand in the first place, you aren't just fooling around unless you are independently wealthy...so...$7k, $15k...what's the diff?

Second, it DOES take fairly high-speed equipment to process 4K video frames at 30fps, let alone any higher speed. A pair of DIGIC5+ chips could handle the input, but you would have to REQUIRE high-speed writeout as well. That complicates the issue and creates a tech-support nightmare for those who don't read manuals and don't understand, nor care, that the camera wasn't designed to support 4K video with a cheap 200x CF card from five years ago.
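The card-speed point in numbers: a back-of-envelope for uncompressed 14-bit 4K RAW (an upper bound, since any real codec needs far less; DCI 4K frame size and zero compression are assumptions for the sake of the estimate):

```python
# Back-of-envelope: sustained write speed for uncompressed 14-bit 4K RAW.
# Assumes DCI 4K frames, 14 bits per photosite, no compression -- an
# upper bound; real codecs would need far less than this.

def raw_mb_per_s(width, height, bits_per_sample, fps):
    return width * height * bits_per_sample * fps / 8 / 1e6

print(round(raw_mb_per_s(4096, 2160, 14, 30)))  # ~464 MB/s
```

That is far beyond what the CF cards of the day could sustain, which is the point: even heavily compressed 4K pushes ordinary cards hard.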

One also has to figure that continuous high-speed processing is going to produce high heat. That has a whole host of implications: the need for better passive cooling, or even active cooling, of most electronic components; the potential for additional noise to creep in over time at all ISO settings unless the sensor is actively cooled; conforming to the various regulations around the world regarding battery design and power consumption, even limitations on allowed features in products of certain classes that lead to additional import or export taxes when those limitations are ignored; etc.

I would put "The ability of TV broadcast stations to deliver 4K content" DEAD LAST on my list of reasons why we haven't seen 4k 30fps video in our DSLRs yet!

LOL, have you actually seen the 4K from the GoPro? Or even the 2.7K? While it looks good for a GoPro, neither looks better than even the cheapest DSLR. Data rates, sensor size - there are so many variables. I mean, an Arri Alexa is 4K also, but that doesn't mean the GoPro is just as good. The 15fps 4K on the GoPro is pretty much worthless; they only added it so they could put "4K" on the box and hope that people think "oh, it's 4K, it must be the greatest camera ever" and buy it.

Why are you talking about 4k? Let's talk about HD. Canon DSLRs do not even shoot true 1080p while their competition does. In short, the answer to the original poster's question is that Canon managers are greedy old men who do not care about the customer at all. Prove me wrong.

This thread is hilarious, watching everyone make mountains out of anthills (not even big enough to be a molehill).

Consider this. In South Australia, we just switched off our analogue TV signals for good. No more. Digital only. New TV or a set-top box only. So I went with my mum to a shop to buy a new TV for the kitchen, which she listens to while she cooks. The old one was so old it didn't even have a composite video input - aerial only - so it was set-top box plus VCR or something else to modulate the video to RF, or a new TV, and a new TV was cheaper.

Anyway, we get to the shop. We start looking at the cheapest in a decent size. We see a nice Teac, 32", half price for only $300. So I read the specs. "Full HD 1080i" it claims. I ask the salesman how it can be both "Full HD" and "1080i" at the same time. He explains that's how people market it: "full hd" just means 1080 lines, p or i.

Anyway, further down the spec list I read "1344 x 768 Pixel Screen". Again, I ask the salesman how it can be "Full HD 1080, i or p" and only have "1344x768 pixels". He did look a bit sheepish for a minute, but came back with "well, the digital receiver can tune in to 1080i signals, but downscales to 768 to put onto the screen. If you wanted to, you could use an HDMI out to another screen for true 1080i display."

You know what? We bought it anyway. It was cheaper than anything else, beat her old TV by miles, and she wouldn't notice the difference anyway.

So who cares if Canon's $15k camera can do 4k video, but their $500 one can't, or even their $3k one? Could you play it anyway? If you could, do you have the editing power to edit it into something watchable? And then, can you distribute it on anything other than huge USB sticks or portable HDDs? And I'm not sure what's meant by "canon dslrs cannot even shoot true 1080p" - is that because they use 442242 compression instead of 442444 or 444224? People can hardly tell the difference between 768 and 1080i and 1080p. If you ask them, they'll say that 1080p is better than 1080i; the ads have conditioned them to know that. Ask them to explain why or what it means, or even pick between the two side-by-side, and they won't know. I couldn't pick the 1344x768 screen from a 'real' 1080p screen next to it.

Here's a tip: Joe Public can't tell the difference either. Joe Public doesn't care. Joe Public just wants some pretty pictures to flash on a shiny box to distract him while he shovels nachos into his face. And the company that can deliver that to him easiest is the company that wins. Canon is that company, and Canon is winning - 10 years in a row it has been winning. If you're already winning a race, why stop and change your shoes?

This thread is hilarious, watching everyone make mountains out of anthills (not even big enough to be a molehill). And I'm not sure what's meant by "canon dslrs cannot even shoot true 1080p", is that because they use 442242 compression instead of 442444 or 444224?

No, it is about the bad downscaling. The 1080p video files have a true sharpness that is a lot lower - closer to 700 lines of resolution, or roughly what proper 720p delivers (and less than what that HDTV's downscaled Full HD can display). Improved processing could no doubt get these cameras to deliver some extremely sharp 1080p video, and I think that is far more of a priority than 4K.

Look at 1080p from a 5dmkIII side by side with 1080p from a Canon C100 and you will see what is meant by "canon dslrs cannot even shoot true 1080p."

You say that it all doesn't really matter because consumers just want to see flashy images. While I agree that audiences are often easily impressed, and that they have incredibly short attention spans, they are not the only people you have to impress. Clients, marketing managers, producers, broadcasters - all sorts of people along the production pipeline scrutinise your image quality to the highest degree, and if it doesn't pass their test, then your easily impressed audience will never get to see it anyway, which equates to lost income.

What if I said to you that most people only view photos on the web at about 1200x800 pixels, and therefore your DSLRs only need to shoot 1MP photos? Would you agree with that?

We can always use more resolution, for stills and video; it is just a matter of finding the balance point between what is possible and what is necessary. At this stage, for most working professionals, proper 1080p is necessary, and for many consumers who have bought the best TV they can afford, proper 1080p in these cameras would deliver a noticeable IQ difference at the ideal viewing distance. However, in the past 5 years Canon have not made any improvements to the soft video in their DSLRs.

This thread is hilarious, watching everyone make mountains out of anthills (not even big enough to be a molehill).

This post isn't any more intelligent or knowledgeable than the others.

Your notion that the public cannot tell the difference between 720p and 1080p, or between interlaced and progressive, is just flat-out wrong. People can tell the difference. The average TV show is 720p, with a few channels broadcast in 1080i. The difference between 1080i and 720p is quite visible. Flip between both versions of the same sports channel (usually sent on different sub-channel blocks), and the improvement with 1080i will be clear. Progressive scan is even better, and that is usually only realized with Blu-ray these days (although some in lucky areas might be able to get 1080p TV, not sure).
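For what it's worth, the raw pixel rates of the two common US broadcast formats are close, which is part of why this comparison is contentious. A quick sketch (1080i is counted here as 30 full frames per second assembled from 60 fields; deinterlacing quality is ignored):

```python
# Pixels per second delivered by the common US broadcast formats.
# 1080i sends two 1920x540 fields per frame, i.e. 30 full 1920x1080
# frames per second; deinterlacing artifacts are ignored here.

def pixels_per_second(width, height, frames_per_second):
    return width * height * frames_per_second

p720 = pixels_per_second(1280, 720, 60)    # 720p60
i1080 = pixels_per_second(1920, 1080, 30)  # 1080i as 30 full frames/s
print(p720)   # 55296000
print(i1080)  # 62208000 -- about 12.5% more pixels/s than 720p60
```

So 1080i carries somewhat more spatial detail per second, while 720p60 trades that for smoother motion - which is why the two look different on sports in particular.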

The quality of picture that you get out of a Blu-ray is unparalleled. That is also the primary reason why millions and millions of people every year spend big bucks to buy top-end Blu-ray players and high-end Full HD (1920x1080 with progressive-scan capability) TVs to the tune of thousands of dollars. People aren't just chasing a big TV...they are chasing crystal-clear, razor-sharp PICTURE. People know this, they talk about it on forums dedicated to it, and they constantly spend money upgrading TVs or other equipment year after year to maximize that quality. It isn't every one of the 140 million homes in the US doing this every year, but tens of millions of people do.

It's a JOKE to think people don't care about getting the kind of quality they expect out of the expensive gear they pay for. A 4k-capable video camera, paired with some 4k-capable video editing software, goes a long way towards making better videos. The "average" person who just wants to shoot home videos will pick up a camcorder. The guy who wants to make awesome, professional-quality sports videos of his buddies doing awesome tricks on their snowboards would LOVE to have 4k video for an affordable price!

Last, I've already said this in my previous post, but I'll say it again. The point of having 4k video is not so you can BROADCAST 4k TV!! The point is just the same as the reason you want a high-resolution 18-36mp camera to downscale your photos to .5mp web size: image quality. Downscaling normalizes noise, sharpens detail, eliminates small artifacts, hides cinematography "tricks" or chop...it enhances quality. It also gives you additional editing latitude, and the ability to use more advanced tools like Adobe Premiere to perform post-process image stabilization, panning smoothing, etc. You don't buy 4k to broadcast it at 4k. You buy 4k for downscaling. You buy 4k to maximize video IQ, and to improve your editing capabilities if you have the post-processing tools.

Might not want to shoot your mouth off until you really know what you're talking about.