
Higher frame rates are possible now. Under the current HDMI spec, 4K (4096x2160) is limited to 24fps, or 30fps with quad HD (3840x2160). 1080p/120 would fit within the current HDMI bandwidth. If we are looking at still pictures, the added resolution would be preferred, but with moving images the increased frame rate gives the appearance of a more detailed picture. It is more lifelike, as stated. 4K/24 and 1080p/120 would both take about the same bandwidth, and for movies/video I would choose the latter. Now, 4K @ 240fps I'm all for :D We'll just need another HDMI standard :rolleyes:
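As a rough sanity check on that bandwidth claim, here is a small Python sketch comparing uncompressed pixel rates for the modes mentioned. It deliberately ignores blanking intervals, bit depth and chroma subsampling, so it is only an illustration of the relative numbers, not actual HDMI link rates:

```python
# Rough comparison of uncompressed pixel rates for the video modes
# discussed above (blanking, bit depth and chroma subsampling ignored).

def pixel_rate(width, height, fps):
    """Pixels per second for a given video mode."""
    return width * height * fps

modes = {
    "4K DCI @ 24fps":  pixel_rate(4096, 2160, 24),
    "Quad HD @ 30fps": pixel_rate(3840, 2160, 30),
    "1080p @ 120fps":  pixel_rate(1920, 1080, 120),
}

for name, rate in modes.items():
    print(f"{name}: {rate / 1e6:.0f} Mpixels/s")
```

Notably, 1080p/120 works out to exactly the same pixel rate as quad HD at 30fps, which is the post's point: the link can trade spatial resolution for temporal resolution.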

According to the specs Sony posted, this 84" set is an edge-lit set (which is probably why it got a "900" and not a "950" designation). Especially at such a large screen size, can anyone in the know comment on how you can possibly have even backlighting on an edge-lit set of that size?

Yes. In the demo loop there was a 1-2 second black segment. I looked as hard and as quickly as I could, and uniformity on both sets on display was perfect. Again, this was for a brief moment and in a lit exhibit space, so, fingers crossed, this may not be an issue here. I also looked for other defects such as dead pixels, seams, etc., but could not find any. Usually I can find them, such as in the OLED displays at CES, but not here.

Sony did a stellar job of picking good content. As the OP posted, the set looked stunning, especially since, unlike with a projector, you could get your nose 2 inches from it and examine individual pixels.

Quote:

As for the price, it's an early adopter price.

I would not be surprised if these sets go through a post-manufacturing repair process to fix dead pixels. This was the process when the first 1080p sets were introduced, until yields improved. The expected quantities of these, and the number of dealers who sell them, will be small.

Skyfall was shot in 2K with the Arri Alexa, so that trailer must have been up-converted too, whatever the Sony guys at CEDIA said.
High Efficiency Video Coding (HEVC)/H.265 is about 50% more efficient than H.264. 4K raw material can also be compressed much more than 2K with less loss of quality.
The aim is to have 4K running at 20-25 Mb/s for home theater material. A 4K movie fits fine on a 50GB BD disc with the new color space standard for UHD. But you won't see 4K BD for the first couple of years at least.
RED has its own RedRay codec that runs at the same bitrates for 4K with a 10-bit colorspace.
The BBC shoots a lot with RED cameras at 4K and 5K, particularly nature docs. How much they edit and render out in 4K is unknown.
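The disc-capacity claim is easy to check with back-of-the-envelope arithmetic. The sketch below assumes a 150-minute feature (an example runtime, not from the thread) and counts video only, ignoring audio tracks and disc overhead:

```python
# Does a feature film at 20-25 Mb/s video fit on a 50 GB Blu-ray disc?
# Assumes a 150-minute runtime; audio and container overhead ignored.

def movie_size_gb(bitrate_mbps, runtime_min):
    """Video payload size in decimal gigabytes."""
    bits = bitrate_mbps * 1e6 * runtime_min * 60
    return bits / 8 / 1e9

for rate in (20, 25):
    print(f"{rate} Mb/s over 150 min: {movie_size_gb(rate, 150):.1f} GB")
```

At 25 Mb/s a 150-minute film is about 28 GB of video, so it does indeed fit on a 50 GB disc with headroom for lossless audio.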

However, there are still compression artifacts and other unwanted anomalies showing up at lower bitrates with H.264 and 1080p at only 8 bits... especially if they have to filter the master file to within an inch of its life to compress the data to that extreme level. The better Blu-ray transfers usually have higher average bitrates and far less pre-filtering. 20-25 Mb/s sure seems extremely low for 4K material. That can sometimes be too low even for 1080p currently, depending on the complexity of the image.

I want the 4K format to look outstanding on a BIG screen, not a small telecine monitor. That will necessitate higher-than-Blu-ray bitrates and 10-bit, 4:2:2 or better video, even with H.265.

If UHD doesn't look noticeably better than 1080p @ 8 bits with a packaged consumer medium it will never sell.
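One way to quantify why 20-25 Mb/s "seems low" is bits per pixel. The sketch below assumes 35 Mb/s as a typical average for a high-quality Blu-ray transfer (my assumption, not a figure from the thread) and compares it against the proposed UHD rate:

```python
# Bits-per-pixel comparison: a good H.264 Blu-ray vs the proposed
# 25 Mb/s H.265 rate for 4K. The 35 Mb/s Blu-ray average is an
# assumed example figure for a high-quality transfer.

def bits_per_pixel(bitrate_mbps, width, height, fps):
    return bitrate_mbps * 1e6 / (width * height * fps)

bd_1080p = bits_per_pixel(35, 1920, 1080, 24)   # Blu-ray, H.264
uhd_4k   = bits_per_pixel(25, 3840, 2160, 24)   # proposed UHD, H.265

print(f"Blu-ray 1080p: {bd_1080p:.3f} bpp")
print(f"UHD 4K:        {uhd_4k:.3f} bpp")
# Even crediting HEVC with roughly 2x H.264 efficiency, the 4K stream
# gets the equivalent of ~0.25 bpp vs ~0.70 bpp for a good Blu-ray.
print(f"HEVC-adjusted 4K equivalent: {uhd_4k * 2:.3f} bpp")
```

So even after HEVC's claimed 50% gain, the per-pixel bit budget is well under half of what a good Blu-ray gets today, which is the skeptic's point above.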

However, there are still compression artifacts and other unwanted anomalies showing up at lower bitrates with H.264 and 1080p at only 8 bits... especially if they have to filter the master file to within an inch of its life to compress the data to that extreme level. The better Blu-ray transfers usually have higher average bitrates and far less pre-filtering.

The mistake here is that people in general think the artifacts are solely a result of overly aggressive compression and low bitrates. The fact is that most of those artifacts are already present in the raw material coming out of the camera or the scanner.

The reason is that most films shot digitally are shot on cameras with 2K sensors, or come from film scans done at 2K for the DI. These camera sensors and scanners give barely 1.5 megapixels of real resolution, output as 2K with the artifacts baked in, which are then amplified by further compression.
In addition, these images are rather soft and need electronic sharpening, which is why you see all the ugly edge enhancement on 1080p material. This would be avoided if the source for 1080p was originally 4K.
We generally assume that post-production techniques are at a high professional level and that knowledge is up to date with the latest processing techniques. But that is very far from the truth; it is much more the opposite.

Much of what is shot with 4K cameras has often been transcoded directly from 4K to 2K Apple ProRes, discarding the 4K RAW, and then edited, graded and rendered from the 2K ProRes, because of the Apple/FCP-centric "standard" in post houses and/or pure stupidity. This has happened to most movies shot on RED 4K cameras, and even though films are now scanned at 4K, 6K and even 8K, they are processed with bad methods.

The first digital 2K camera that gives a full 2K file after debayering is the Arri Alexa, because it has a 3K sensor that is sub-sampled in-camera to 2K. The downside is that the camera has an extra-strong optical low-pass filter that gives a very soft picture.
So only movies that are shot on RED 4K and 5K cameras (or the new Sony F65) or the Alexa, or films scanned at higher than 2K, can give an artifact-free 2K source, if the files are treated right.

Only the RED and Sony cameras can provide a 4K "future-proof" source, which is why one has to wonder, with some head-scratching, why so many high-profile films and TV series are shot on the Alexa, like "Game of Thrones", "Downton Abbey", "Avengers", and the James Bond film "Skyfall"!
Which means all these productions will only exist in a future "4K world" as up-converted versions.

Why don't the studios do some "forward thinking" and "future-proof" their material by demanding that their productions be made for 4K delivery, even if that is somewhat in the future?

The post processing situation is slowly improving at those post houses that care to update themselves, but the choice of the right camera for the job is almost more in decline.

Quote:

20-25 Mb/s sure seems extremely low for 4K material. That can sometimes be too low even for 1080p currently, depending on the complexity of the image.
I want the 4K format to look outstanding on a BIG screen, not a small telecine monitor. That will necessitate higher-than-Blu-ray bitrates and 10-bit, 4:2:2 or better video, even with H.265.

The claim is that the quality of 4K projected on a 20-foot-wide screen at those low bitrates is almost indistinguishable from a DCI version, provided that the material is shot with a camera of higher-than-4K resolution, is artifact-free, and is treated properly in post.

Quote:

If UHD doesn't look noticeably better than 1080p @ 8 bits with a packaged consumer medium it will never sell.

It is possible that the difference between 1080p and 4K is not a big enough resolution jump to sell to the general public, who are not very quality conscious and have too-small screens. That is also why NHK decided to skip 4K for their future broadcast system and settle on 8K.

For us enthusiasts with large screens, I think we will appreciate the quality of a 4K upgrade. But it will also depend on a combination of several factors that will contribute to how impressed we will be, like the aforementioned post-production treatment; high-contrast (and color-saturated) images impressing more than low-contrast (desaturated) ones; laser projectors giving a wider color gamut than lamp-based projectors; etc.

Higher frame rates are possible now. Under the current HDMI spec, 4K (4096x2160) is limited to 24fps, or 30fps with quad HD (3840x2160). 1080p/120 would fit within the current HDMI bandwidth. If we are looking at still pictures, the added resolution would be preferred, but with moving images the increased frame rate gives the appearance of a more detailed picture. It is more lifelike, as stated. 4K/24 and 1080p/120 would both take about the same bandwidth, and for movies/video I would choose the latter. Now, 4K @ 240fps I'm all for :D We'll just need another HDMI standard :rolleyes:

Or give up on cables entirely.

Cogito ergo sum makes a fundamental mistake because it ignores the implied existence of the narrator. Descartes might as well have said "A rose is red, therefore I am".

Higher frame rates are possible now. Under the current HDMI spec, 4K (4096x2160) is limited to 24fps, or 30fps with quad HD (3840x2160). 1080p/120 would fit within the current HDMI bandwidth. If we are looking at still pictures, the added resolution would be preferred, but with moving images the increased frame rate gives the appearance of a more detailed picture. It is more lifelike, as stated. 4K/24 and 1080p/120 would both take about the same bandwidth, and for movies/video I would choose the latter. Now, 4K @ 240fps I'm all for it. We'll just need another HDMI standard.

It is possible that the difference between 1080p and 4K is not high enough resolution to sell to the general public, who are not very quality conscious, and have too small screens.

Even with a small screen and/or 1080p display, or a 4K2K/8K4K display placed too far away for the full resolution to be visible, there are advantages to having the highest possible source resolution, because "reframing with zoom" can make full use of any "otherwise invisible" signal content... as I suspect both the broadcast sports and adult film industries already understand.

Much of what is shot with 4K cameras has often been transcoded directly from 4K to 2K Apple ProRes, discarding the 4K RAW, and then edited, graded and rendered from the 2K ProRes, because of the Apple/FCP-centric "standard" in post houses and/or pure stupidity. This has happened to most movies shot on RED 4K cameras, and even though films are now scanned at 4K, 6K and even 8K, they are processed with bad methods.

That is not entirely accurate. In just about all current 4K captures, the OCN is archived to LTO tape for preservation. In addition, all the metadata generated in post-production is also archived. So for a re-release, it's a largely automated process to build a 4K version from what was initially a 2K release version.

The studios are not stupid. However, they are a business. Until the market for 4K distribution is stronger, it makes no sense to release in 4K at roughly 3x the cost of 2K. Yet they still archive the 4K OCN and metadata for future re-versioning.

That's hardly "stupid" thinking.

And Apple ProRes is mostly used as a proxy on A titles. Independent, lower-budget films are different; many money-saving shortcuts are taken there. The master is cut and color-corrected in uncompressed 2K, and the only time it hits a compression stage is for the DCP or home/broadcast versions.

OCN = Original Camera Negative. Yes it's still largely referred to as a "negative" even though it's no longer film.

I fully agree, but the frame rate needs to match the spatial resolution. Gary Demos showed years ago that with SD you need close to 300 frames per second to be able to use clean shots while still producing smooth motion portrayal; BBC Research showed a few years ago that 400 frames per second are required to get smooth motion and limited blurring in the individual frames at XGA resolution.

I have been complaining about Super Hi-Vision being limited to a mere 60fps; you need 600, I used to say. NHK is now showing 60 vs. 120fps, a remarkable difference. I couldn't resist asking when they would start work on 1200 frames per second ;-).
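The scaling argument can be made concrete with a little arithmetic: a pan covering the full picture width in a fixed time displaces the image by more pixels per frame as horizontal resolution rises, so holding per-frame displacement constant requires proportionally more frames. The 5-second full-width pan is an arbitrary example speed, not a figure from the cited research:

```python
# Per-frame displacement of a horizontal pan that crosses the full
# picture width in 5 seconds (an arbitrary example pan speed). The
# displacement -- and hence blur or judder per frame -- scales with
# horizontal resolution, so higher resolutions need higher frame
# rates to keep per-frame motion at the same pixel count.

def pixels_per_frame(width, pan_seconds, fps):
    return width / pan_seconds / fps

for width, label in [(720, "SD"), (1920, "HD"), (3840, "4K"), (7680, "8K")]:
    for fps in (24, 60, 120):
        ppf = pixels_per_frame(width, 5, fps)
        print(f"{label} @ {fps}fps: {ppf:.1f} px/frame")
```

Note how 8K at 120fps still moves as many pixels per frame as HD at 30fps would; that is the intuition behind "frame rate must match spatial resolution."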

I fully agree, but the frame rate needs to match the spatial resolution. Gary Demos showed years ago that with SD you need close to 300 frames per second to be able to use clean shots while still producing smooth motion portrayal; BBC Research showed a few years ago that 400 frames per second are required to get smooth motion and limited blurring in the individual frames at XGA resolution.
I have been complaining about Super Hi-Vision being limited to a mere 60fps; you need 600, I used to say. NHK is now showing 60 vs. 120fps, a remarkable difference. I couldn't resist asking when they would start work on 1200 frames per second ;-).

Do you have a link to this study and conclusion? It seems to be at least part of a "need for motion blur" argument that also circles the computer animation world from time to time. I'd like to see what he's using for testing and metrics.

You get a built-in anti-aliasing effect as a real-world smooth object moves from what ends up being pixel to pixel. Regardless of the resolution involved, the more native frames that exist (from the source recording instrument), the smoother and more real the effect.


No link at hand; I did find a brief report on one of the BBC Research websites a few years ago, a modified version of a paper presented at IBC at the time. It was also demonstrated at the Tech Retreat. As for the Demos research, I never saw a paper on it, but Mark Schubin kept bringing it up in discussions back in the day. He also programs the Tech Retreat and brought in the BBC Research demo, so he might be the man to ask. He is probably still at IBC or travelling, though, so he might not get back to you immediately.

Man, I would just LOVE to be able to see one of these 4K sets! Wish I could have gone to CEDIA this year. I'm curious to see high-frame-rate material too, as long as it doesn't look like the soap opera effect, because that makes me wanna puke.

Oh I don't doubt that there would be a particular fps where things got smooth.

What I'm curious and half dubious about is the claim that there needs to be a particular matching of fps to resolution. Even at coarse resolutions, an increase in fps would be just as relatively impressive: as a sharply delineated object moves into a pixel, the pixel takes on more and more of that color until it is completely "filled" with it. The number of steps in that filling is what determines the smoothness.

Even though persistence of vision is far longer than 1/300th of a second, your eyes are still affected by smoother color changes.

And, of course, the pixel/second speed of the object is paramount.
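That pixel-filling idea can be quantified: the number of frames captured while an edge crosses a single pixel, each one recording an intermediate coverage shade, is just fps divided by the edge speed in pixels per second. The 10 px/s edge speed below is an arbitrary example:

```python
# How many intermediate "shades" does a display get as a moving edge
# crosses one pixel? Each captured frame samples one partial-coverage
# value, so the count is simply fps / edge speed (in px/s).

def frames_per_pixel(edge_speed_px_per_s, fps):
    """Frames captured while a moving edge crosses a single pixel."""
    return fps / edge_speed_px_per_s

for fps in (24, 60, 120, 300):
    shades = frames_per_pixel(10, fps)
    print(f"{fps}fps: {shades:.1f} shades per pixel crossed")
```

At 24fps a 10 px/s edge yields only 2-3 intermediate shades per pixel; at 120fps it yields 12, which is the smoother ramp the post describes. And, as noted, a faster-moving object gets proportionally fewer.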


Coolscan and Glimmie have some valid points. As always on AVSForum, the discussions get very lengthy :-)

Any weak link reduces "quality", and there are many, many links in the chain: the Arri Alexa's OLP filter, Apple Color's incompatibility with 5K media (which requires transcoding), cable companies' crappy low-bitrate recompression, LCD pixel decay times slower than the refresh rate... The fact that the best technology exists doesn't mean the world can uniformly and quickly adopt it. It's a bit like us having the technological ability to build a moonbase right now - and yet we don't.

Furthermore, making TV and films is a business. (Financially it's riskier than ever, which means that the smartest people stay away from it ;-) The prevailing consideration for workflow decisions (which determine quality) is neither what's best nor what's future-proof - it's what's "good enough" and acceptable at the time. Movies were shot on 35mm film for decades because that was the prevailing and commonly available system - 65mm never really caught on. In a way, I'm a bit surprised that Seinfeld was shot on 35mm and not U-Matic SP... would it really have had lower ratings because of the quality? Look at reality shows to see what happened to production values and picture quality over the past decade - increasingly greedy producers figured out there was no point to finessing drama on 35mm when you could stash a dozen (non-SAG) people in a room and shoot them with an HVX200.

Most content by volume is cable episodics, soap operas, reality TV crap, movies that barely make it to DVD, corporate videos... those workflows really aren't optimised for the highest possible quality. It doesn't help that engineers from Sony Japan develop slick 4K TVs while my producer sends me a 500x400-pixel photo because it was $15 cheaper at GettyImages and he doesn't know better.

For the few "A titles" where there's enough budget to care about artistic and visual quality, you seem to be forgetting something that's much dearer to the content-makers: the story. Movies are not supposed to be a window, but a fantasy; an alternate, stylized, somewhat fantastic version of reality. Directors and DPs who choose the Alexa don't care much about its OLP filter - they love its appealing, naturalistic color rendition as well as its ease of handling and familiar operating interface. Likewise, 24fps sets movies apart from reality (and sports, news, etc.) because of its degraded, distinctive look. It tells you it's an interpretation of reality, "something else", sometimes likened to a dreamlike sense. Today's "smooth-motion" algorithms turn 24p into 60p, and IMHO they destroy the content, making everything look like a soap opera.

So while no one wants solarization and banding, there's a ton of other factors in play. I'd much rather watch an amazing story on SD DVD than watch Transformers 3 in any resolution...

It seems destined that 4K is the next step in the resolution at which we will get our entertainment, though I don't think it will be as widely adopted as 1080p is. At this point, the delivery method is not in place for the common consumer to get 4K media. The groups behind HEVC will start to deploy that format soon, but it is still years away from wide adoption and from the data compression needed to work with our infrastructure.

I am excited that 4k devices are being made and my future plans for a home theater will include a 4k projector, but I doubt I am in the mainstream.

1. Does anybody know the parameters of the demo material shown at CEDIA? Manufacturers often fool people by showing, e.g., uncompressed or lightly compressed clips. Part of the wow impression thus comes from the source material and not from the display.

2. While there is some case for 4K with huge displays, it is valid only when the displays and source material are of impeccable quality. Otherwise 4K becomes an ideal tool for detecting imperfections and artefacts. There are huge doubts whether the extreme quality requirements can be satisfied with edge-lit LCDs (e.g., having clouding at 4K sounds ridiculous) and highly compressed sources like H.265. Thus, 4K would only make real sense with displays of the 4K OLED type and sources which are virtually lossless, at high frame rates and 10-bit color, meaning ~200 Mb/s rates. Until then we will see only a 4K prosthesis.

3. Most content watched will be 2K upconverted. While sophisticated algorithms can make a 4K upconvert indistinguishable from, or perhaps subjectively even better-looking than, 2K, whether it is worth doing is a big question. One can apply equally sophisticated algorithms to improve 2K picture quality directly.

4. Where 4K is needed now, with good reason and economy, is in 4K computer monitors. There is some push for them, but it is very limited compared to 4K TV. That looks very strange, since there is a market for such monitors if only the price is realistic, and the price can be made realistic, as there are already 2K laptop displays with the same pixel density. Hopefully manufacturers will be sobered by economic reality and make the real start of 4K with computer monitors, as it should be.

I must say that there is DEFINITELY a difference in humans' ability to see the varying qualities of video. I have been running a 206" projector for 3 years now and have witnessed exactly what happens to NORMAL people when exposed to varying resolutions and qualities (hundreds of different people's opinions). I cannot tell you how many times I have been in a room full of people watching a variety of content and, even though "I" might notice less-than-perfect quality, NO ONE ELSE DOES... EVER! I believe that once you reach screen sizes this big it makes less of a difference (to most people), not more. This is just my personal opinion based on what I see regularly in my own place. Heck, even the terrible-looking Wii video game system is still a ton of fun for everyone at this screen size.

When we have Arri and "Arri fanboy" Roger Deakins working so hard at convincing us that "there is no need for 4K cameras at this point because 2K Alexa footage up-resed to 4K and even IMAX looks superb," it might be hard for the industry to be convinced that native 4K material has any merit.

These guys work as "brake pedals" for technological and quality development in the film industry. These are of course excuses for their failure to develop 4K cameras and their inability to see past the quality issues of "1080p is good enough".

Hopefully some powerful people in the industry will counter these "bullsh*t arguments."

It is the relative merits of 2K vs. 4K resolutions in both acquisition and digital cinema distribution and projection that gives this debate its context here at IBC.
On the opposite side of the ring sit manufacturers like Sony, with its F65 camera and CineAlta 4K digital cinema projector, and RED, with its boundary-pushing 4K and 5K cameras and developing 4K laser projector.
Those companies maintain, as many others developing the hardware and software to support 4K workflows in general here at IBC also believe, that images acquired at the highest resolutions possible translate into beautiful, pristine images on the big, and even bigger screen.
Add to that equation visionaries like Christopher Nolan, who shoots both digitally and on film while aiming for the largest and richest resolutions possible to bring his vision, and DP Wally Pfister's beautiful images, to clear, richly saturated life.

Quote:

ARRI engineers and product managers have heard this all before but see nothing wrong with up-resing Alexa's 2K images onto traditional and 4K and/or IMAX screens.
When asked during a Q+A session when there will be a 4K version of the camera, however, Shipman hardly blinked before delivering his answer to those of us in attendance. "There is just not an urgent need for it," he said. "When you have Roger Deakins, adored for his film work, shooting Skyfall and he is comfortable up-resing to 4K and showing on IMAX, I think it speaks for itself."

Quote:

It is hard, however, to draw comparisons between up-resed 4K and true 4K resolutions when caught up in the expertly shot, edited, graded and finished clips, further enhanced by a Dolby 7.1 surround sound track.

Many practiced eyes here at IBC will tell you they can spot the difference between 2K and 4K immediately, and Deakins himself admitted he feared up-resing ARRI Raw to IMAX would not look good enough. But once he saw the results, he put his fears to rest. "The images I have seen in the IMAX theater are simply superb," he told the audience.

To put some buckets of cold water on hot heads:
1. Does anybody know the parameters of the demo material shown at CEDIA? Manufacturers often fool people by showing, e.g., uncompressed or lightly compressed clips. Part of the wow impression thus comes from the source material and not from the display.

There was no context under which to get such answers. It is a show floor run by a big company, and the person answering only knew top-level answers. On that front, he did say it was using a video server; I don't know if that was a DCI server or an editing workstation. Either way, I looked closely, and at least in the quickly changing demo content I could not find many compression artifacts. So whatever compression was used was quite mild.

There is no question that proper material was chosen. This was the key to Sony's success here, unlike their projector demos. LCDs shine with very bright content, especially on a show floor, and they produced exactly that (beach scenes, etc.). They had done everything properly to show this unit in the best light possible, as they should have.

Quote:

2. While there is some case for 4K with huge displays, it is valid only when the displays and source material are of impeccable quality. Otherwise 4K becomes an ideal tool for detecting imperfections and artefacts. There are huge doubts whether the extreme quality requirements can be satisfied with edge-lit LCDs (e.g., having clouding at 4K sounds ridiculous) and highly compressed sources like H.265. Thus, 4K would only make real sense with displays of the 4K OLED type and sources which are virtually lossless, at high frame rates and 10-bit color, meaning ~200 Mb/s rates. Until then we will see only a 4K prosthesis.

OLED has its own problems getting to market at 1080p; 4K should not be on the wish list yet. There are a lot of problems to be sorted out there (such as poor blue reproduction, maintaining gamma as picture brightness changes, uniformity, aging, etc.). For now, as LCDs go though, this demo worked. As I noted in my earlier response, I did not see clouding and such. If it was there, it certainly did not interfere with the demo.

BTW, the viewing distance for everyone there varied from 1-2 inches (me) to just 3-5 feet. This is why I think this demo was so successful. The material was produced for demo purposes and, per the above, was very carefully made to show well. The grains of sand around a girl's foot in a beach scene, for example, were impressive to look at from just a few feet away. These were not movie scans and such. Think of what you get if you take a 4K camera, shoot bright and colorful scenes, edit them, and then play them back as-is. That is what was there.

While many talk about 4K being useful for projection, seeing this demo shows that as a marketing tool it is much more successful in flat panels. I gave the reason above: you can demo the picture at a very close viewing distance, something you don't do with a projector. The brilliance of a flat panel also helps showcase the increased resolution with bright images.

Quote:

3. Most content watched will be 2K upconverted. While sophisticated algorithms can make a 4K upconvert indistinguishable from, or perhaps subjectively even better-looking than, 2K, whether it is worth doing is a big question. One can apply equally sophisticated algorithms to improve 2K picture quality directly.

I don't think anything about this demo should rekindle the wishful thinking that real 4K content is coming anytime soon. It just isn't the case. I am personally thinking of using the display for demonstrations of art and such, to take advantage of its extra resolution. Maybe we get some better-than-1080p content in the early release window, but that hope is very faint right now.

Quote:

4. Where 4K is needed now, with good reason and economy, is in 4K computer monitors. There is some push for them, but it is very limited compared to 4K TV. That looks very strange, since there is a market for such monitors if only the price is realistic, and the price can be made realistic, as there are already 2K laptop displays with the same pixel density. Hopefully manufacturers will be sobered by economic reality and make the real start of 4K with computer monitors, as it should be.

There are 4K monitors coming in smaller sizes. Panasonic was showing a prototype of a 20-or-so-inch one at CES, for example. I hear others are working on it too.

My guess is that the first 4K PC monitor on the market will be the Sharp 31.5" IGZO 4K monitor they showed at IFA, and that it will first be released by Apple as the new Apple 4K monitor. Mac Pro towers just got a Quadro K5000 GPU.

There is simply no reason to believe that a mainstream-priced LCD will magically solve the problems of edge-lit uniformity simply because some $25,000 demo piece may have done so.

While I find that observation somewhat helpful, I wouldn't extrapolate anything from it.

There is no difference in HDMI cables. If you can see the picture without visible dropouts or sparklies, the cable is working at 100%. No other cable will display a better version of that picture. You're simply wrong if you think there is a better digital cable than one that is already working. (Oh, and plasma didn't die because of logistics problems, nor does OLED ship in big boxes because it comes from Korea.)

There was no context under which to get such answers. It is a show floor run by a big company, and the person answering only knew top-level answers. On that front, he did say it was using a video server; I don't know if that was a DCI server or an editing workstation. Either way, I looked closely, and at least in the quickly changing demo content I could not find many compression artifacts. So whatever compression was used was quite mild.
There is no question that proper material was chosen. This was the key to Sony's success here, unlike their projector demos. LCDs shine with very bright content, especially on a show floor, and they produced exactly that (beach scenes, etc.). They had done everything properly to show this unit in the best light possible, as they should have.

Yeah, but this is far from the normal reality of highly compressed material and darker content. So it's better to keep heads cool about PQ.

Quote:

Originally Posted by amirm

OLED has its own problems getting to market at 1080p; 4K should not be on the wish list yet. There are a lot of problems to be sorted out there (such as poor blue reproduction, maintaining gamma as picture brightness changes, uniformity, aging, etc.). For now, as LCDs go though, this demo worked. As I noted in my earlier response, I did not see clouding and such. If it was there, it certainly did not interfere with the demo.

Yeah, but with displays that have inherent artefacts, the outlook for 4K is bleak. It is hard to believe a 4K edge-lit display has none of the typical artefacts. Imagining people discussing, e.g., non-uniformity or limited black levels on a 4K display is an absurdity.

Quote:

Originally Posted by amirm

BTW, the viewing distance for everyone there varied from 1-2 inches (me) to just 3-5 feet. This is why I think this demo was so successful. The material was produced for demo purposes and, per the above, was very carefully made to show well. The grains of sand around a girl's foot in a beach scene, for example, were impressive to look at from just a few feet away. These were not movie scans and such. Think of what you get if you take a 4K camera, shoot bright and colorful scenes, edit them, and then play them back as-is. That is what was there.

OK, but these distances have nothing to do with TV viewing scenarios.

Quote:

Originally Posted by coolscan

My guess is that the first 4K PC monitor on the market will be the Sharp 31.5" IGZO 4K monitor they showed at IFA, and that it will first be released by Apple as the new Apple 4K monitor. Mac Pro towers just got a Quadro K5000 GPU.

Hopefully such a monitor is coming soon and in a reasonable price range. There is a large market ready and waiting for it.

When we have Arri and "Arri fanboy" Roger Deakins working so hard at convincing us that "there is no need for 4K cameras at this point because 2K Alexa footage up-resed to 4K and even IMAX looks superb," it might be hard for the industry to be convinced that native 4K material has any merit. These guys work as "brake pedals" for technological and quality development in the film industry. These are of course excuses for their failure to develop 4K cameras and their inability to see past the quality issues of "1080p is good enough". Hopefully some powerful people in the industry will counter these "bullsh*t arguments." http://www.studiodaily.com/2012/09/roger-deakins-digital-odyssey-not-in-4k/

Thanks for the link to the article and then to the interview with Roger Deakins. I think Roger Deakins has a pretty good handle on what's important for image quality.

I'm very interested in 21:9 and 4K. It strikes me as the ultimate in "suspension of disbelief" when watching a movie.
The increased depth is probably the main reason. 21:9 is a different topic, but it gives you the feeling that you are there, due to the large field of view.
Whoever says we can't tell the difference at 8 or 9 feet is just flat wrong.

Washington, D.C. (September 4, 2012) -- Some tech pundits are proclaiming the 4K TV as the next big thing in home video. But don't tell that to HBO Chief Technology Officer Robert Zitter.

4K TVs, which are expected to be available for sale late this year, purport to offer a resolution four times greater than current HDTVs. But asked about the new technology at an industry trade show in Berlin, PC World writes that Zitter noted that HBO and other programmers would have to build a new production infrastructure to accommodate a switch to 4K broadcasts.

"That makes us look at 4K somewhat skeptically," Zitter said, according to PC World. "From my perspective, I have looked at 4K and we are prepare to, if it really comes to pass, maybe offer it on an on-demand basis."

Sony last week announced that it would begin selling an 84-inch, 4K TV late this year and other TV makers are expected to follow suit. While prices are unknown at this point, they could exceed $20,000 at launch, which would create yet another immediate obstacle in 4K's path.

Zitter also echoed CNET's claim that a large screen is required to see any difference between a 4K TV and a 1080p HDTV.

"They need a screen size that is greater than 60 or 70 inches," he said. "You can't see the difference on a screen that is smaller than 60 inches. But how many people in the United States or anywhere are going to have TV sets that are bigger than 60 or 70 inches? 20 or 25 percent?"

And we've been talking about this before. It seems that those with really great vision can't differentiate between images with an angular resolution of 200 pixels per degree and images with a higher angular resolution. For those with average vision, that limit seems to be somewhere around 110-120 pixels per degree.
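To put rough numbers on those limits, here is a back-of-the-envelope sketch (my own, not from the thread): angular resolution in pixels per degree for an 84" 16:9 panel viewed from 9 feet. The screen size and distance are illustrative assumptions.

```python
import math

def pixels_per_degree(h_pixels, screen_width_in, distance_in):
    """Horizontal pixels divided by the horizontal field of view in degrees."""
    fov_deg = 2 * math.degrees(math.atan(screen_width_in / (2 * distance_in)))
    return h_pixels / fov_deg

# An 84" 16:9 diagonal is ~73.2" wide; compare 1080p vs 4K at 9 feet (108").
width = 84 * 16 / math.sqrt(16**2 + 9**2)
print(round(pixels_per_degree(1920, width, 108)))  # → 51
print(round(pixels_per_degree(3840, width, 108)))  # → 103
```

Under these assumptions, 4K on an 84" panel at 9 feet lands at roughly 103 pixels per degree, just below the ~110-120 average-vision limit quoted above, while 1080p sits at about half that, which is consistent with the claim that the difference is visible at 8 or 9 feet.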

I did see it. And it was nothing short of amazing. I did a lot of Q&A, and while some have already answered these questions/comments, I'd like to condense them here.

Yes, it works today with HDMI 1.4
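For context on why HDMI 1.4 handles 4K only at lower frame rates, here is a rough uncompressed-bandwidth sketch (my own figures; 10.2 Gbps is HDMI 1.4's nominal maximum TMDS throughput, and blanking intervals are ignored for simplicity):

```python
def raw_gbps(width, height, fps, bits_per_pixel=24):
    """Uncompressed video data rate in gigabits per second (ignores blanking)."""
    return width * height * fps * bits_per_pixel / 1e9

HDMI_14_GBPS = 10.2  # nominal HDMI 1.4 maximum TMDS throughput

print(raw_gbps(3840, 2160, 30))   # → 5.971968 (fits)
print(raw_gbps(3840, 2160, 60))   # → 11.943936 (exceeds the link)
print(raw_gbps(1920, 1080, 120))  # → 5.971968 (same payload as 4K/30)
```

Note that 1080p at 120 fps carries the same raw payload as 4K at 30 fps, which is why the two are often described as trading resolution for frame rate within the same HDMI 1.4 budget.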

The purpose of showing both an 84" panel and a projector was to demonstrate that a bigger screen could be used in smaller environments, which is why it was stunning at 10 inches. But it was equally stunning at 12 feet, especially when you did a side-by-side comparison with the competitors and looked at minute detail.

Many movies shown in digital cinemas in America are 4K. Some are 2K, but many are 4K. And many theaters have upgraded to Sony's professional 4K projector as their weapon of choice. But because of money, many smaller theater owners and chains are forced to use 2K. The alternative is to go out of business, since Hollywood is forcing digital on the theater owners.

As far as the human eye goes, Sony started with a special training session discussing the "arcminute" and how it applies in real-world scenarios. It was interesting, and made some very valid points.

I cannot believe I liked the upscaling as much as I did. And yes, it was truly apparent. In the old days, one would buy a $25,000 projector and add a $25,000 processor to eke out a few extra lines of resolution. Out of the box, Sony's upscaling makes a far greater difference, no additional devices required.

So many people on this site and others state, "While I have not seen these products, I have seen on a piece of paper factual evidence that supports my opinion." Maybe it's because we are in an election year, but it is not about the paper, it's about the truth. And the truth, in this case, is visible. If only it were as apparent in politics... Want more? Get two engineers in a room with their own white papers on the same subject, and each will prove his contradicting paper is accurate while the other's is total nonsense.

It does a better job of anamorphic display without a lens. Do the resolution math, and it will be clear as to why.
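The resolution math alluded to, sketched out (my own illustration, assuming a 2.39:1 "scope" film letterboxed on a 16:9 panel):

```python
def scope_pixels(panel_width, film_ar=2.39):
    """Pixels devoted to a letterboxed 2.39:1 image on a 16:9 panel."""
    image_rows = round(panel_width / film_ar)  # active rows between the bars
    return panel_width * image_rows

p1080 = scope_pixels(1920)  # 1920 x 803 active pixels
p2160 = scope_pixels(3840)  # 3840 x 1607 active pixels
print(round(p2160 / p1080, 2))  # → 4.0
```

So a 4K panel devotes roughly four times as many pixels to the letterboxed image as a 1080p panel, without needing an anamorphic lens to reclaim the bars.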

Build quality? I have seen far too many Samsungs with major issues, and far fewer Sonys (and LGs, for that matter). I cannot tell you how many "reviews" I have seen comparing the flagship Samsung with a mid-level Sony; you would certainly hope it would outperform. What we notice is that whatever TV you or a family member or friend has had issues with is "junk", each and every one of them with no exceptions, the entire product line. As a dealer, I can only tell you who gets the most service calls. As I deliver Sony 5 to 1 over Samsung, I should NOT see more Samsungs in service, which unfortunately I do. Far too many. And in the last two years, LG has been very good in build quality as well as performance. To be fair, it was the Japanese who suffered most from the tsunami, leading to very difficult shortages, and this made a lot of dealers push Samsung (and LG) on their clientele.

So to everyone who is looking at a piece of paper and saying "it shouldn't, couldn't, won't, and can't": when a local dealer gets them in, treat yourself to a viewing. And if they are willing to compare to what is out there, side by side, jump at the chance. If you have doubts about how the comparison is set up or skewed, that's not the dealer for you. If anything, grab the remote and do your own tweaking (within reason, please); just remember to tweak ALL the sets you are looking at, not just the one you want to win.

I believe Sony has a home run on their hands with both the projector and the 84" panel, even though not everyone can afford them.