We have two Seiki 50-inch ones at the office and I have a 65-inch Seiki at home. The 65 is glossier than the 50, but all told they're relatively solid. It's a nice, relatively affordable way to see 4K in action. The real issue is there aren't many options for feeding 4K to these TVs. iMacs don't output 4K, so you're really looking at either a rMBP or a nMP.

In terms of near-4K monitors, we also have two of the 34-inch LG ultra-wide monitors with our two nMPs, and those are really, really great. $999 at Frys.

I think disc media and traditional broadcast are going to be the late adopters of 4K; online distribution will get there first, where a company like Netflix can serve you at whatever resolution it sniffs out that you need.

I know Netflix has some 4K programming right now, but playback is limited to a couple of TVs with the right internals.

I was really hoping that we'd have seen wider adoption of h.265 by this point in the year; it's going to be critical for more widespread 4K distribution.

All the major streamers (except the iTunes Store) have either started limited 4K streaming or announced it is in the works. Comcast and Time Warner Cable have announced 4K streaming is in the works. Sony has launched its own 4K streaming movie service, and all the major TV makers (Samsung, LG, etc.) have announced 4K streaming partnerships.

Like Marcus said, I think traditional TV broadcasting will remain HD for the foreseeable future because of all the technical hurdles of switching from HD to 4K. Streaming services, on the other hand, face far fewer limitations. YouTube, for example, went from 240p to 4K in about 7 years, IIRC, and it was pretty invisible to the end user (unlike the switch from analog to digital broadcasting).

[Walter Soyka]"Replacing every single piece of HD video gear in the pipeline with a piece of 4K video gear, from storage and playout to transmission."

Basically this ^.

It will be the analog-to-digital move all over again (including updated digital b'cast signals not being compatible with current HDTVs, thus requiring either an all-new TV or a new receiver box to plug into your existing HDTV).

EDIT:
To expand slightly on this, below is a link that talks about the ATSC 3.0 standard that is in the very early phases of development. It will include 4K and other cool stuff, but it will also be incompatible with ATSC 1.0 (the current HD b'cast standard) and ATSC 2.0 (a revised version of the current ATSC 1.0 b'cast standard that has yet to be implemented). Basically, new hardware for everyone!

[Lance Bachelder]"So then by "technical" we mean lack of or unwillingness to spend the cash?"

No, I meant technology-related. The ATSC 3.0 spec is still in its infancy, so the tech required to implement it is either in a lab or on a drawing board too. By contrast, the hurdles for streaming services going from SD to 720 to 1080 to 4K to whatever are basically just buying bigger, better, faster versions of their existing hardware.

Money, of course, plays a part in everything and, IMO, by the time ATSC 3.0 is ratified and the hardware can be created, it will be cost-prohibitive, as IPTV (even of OTA b'casts) will be even more popular than it is now. Why spend years and another billion dollars (if not more) upgrading the OTA delivery systems to 4K when most Americans will be streaming a live simulcast of NBC instead of 'tuning in' to NBC's b'cast?

If b'casters could upgrade as easily as the streamers do, then I think we'd see announcements of impending 4K OTA broadcasts, but it's a much more difficult problem to solve.

[Marcus Moore]"Netflix can serve you at whatever resolution it sniffs out that you need."

Sure, but Netflix might be sniffing out 720p for around a decade.

Larry Jordan has been pretty full-on about this though, right? The computational decode needs for h.265 are pure savage, and it's roughly a triple-time encode relative to h.264?

And is it true that no one can see the difference between 4K and 2K (whatever that is) unless they're sitting bizarrely close to a 40" screen in their home?
4K Sony images on showroom slabs are enjoyable, but I'd rather have full-pipeline high dynamic range and crazy brightness in nits myself, I think.

You'd think the panel manufacturers are pushing themselves down a cul-de-sac, engaging in destructive commoditisation of pixels, when they could perform Apple-level quality up-sells on certified film-grade, acetate-plate 12-bit panels measuring forty-five inches, with 12,000 nits, faithfully reproducing the 35mm of Apocalypse Now.

You'd think the panel makers could do worse than create an internal, value-driven competition for true home celluloid.

Yes, h.265 is more processor-intensive to both encode and decode, but I'd wager not as bad as when we first started encoding h.264 files way back when. And it's not just resolution: a broader colour gamut is part of the h.265 spec, thankfully.

As to the 4K vs 1080 "no visible difference" argument, that could be made about any resolution, depending on the size of the screen and the distance you're sitting from it. Many people today aren't benefiting from 1080p over 720p, because they're watching those 65" screens they bought from a couch 15 feet away across their basement rec rooms.
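The screen-size vs. viewing-distance point can be put to rough numbers with the common 1-arcminute visual-acuity rule of thumb. This is just a back-of-the-envelope sketch (the acuity rule and the function below are my own illustration, not anything from this thread):

```python
import math

def max_benefit_distance_ft(diag_in, vert_px, aspect=(16, 9)):
    """Farthest viewing distance (in feet) at which individual pixels
    are still resolvable, using the 1-arcminute acuity rule of thumb.
    Beyond this distance, extra resolution is wasted on the eye."""
    w, h = aspect
    height_in = diag_in * h / math.hypot(w, h)   # physical panel height
    pixel_in = height_in / vert_px               # physical pixel pitch
    one_arcmin = math.radians(1 / 60)            # ~0.00029 rad
    return pixel_in / math.tan(one_arcmin) / 12  # inches -> feet

# On a 65" screen, 1080p pixels blur together beyond roughly 8.5 feet,
# and 4K's extra detail disappears beyond roughly 4.2 feet.
print(round(max_benefit_distance_ft(65, 1080), 1))   # ~8.5
print(round(max_benefit_distance_ft(65, 2160), 1))   # ~4.2
```

So a 15-foot couch is well past the point where even 1080p pays off on a 65" set, which is exactly the scenario described above.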

As resolutions get higher and higher, yes, absolutely fewer and fewer people benefit. For instance, I don't think I'd benefit from 8K on any of my screens. That's why digital distribution is perfect for this: providers can keep 20 versions of the same file and feed you the right one depending on whether you're watching on your iPhone on the train or on your 110" home theatre, or depending on the quality of your internet connection.
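That multi-version ("bitrate ladder") delivery can be sketched in a few lines. The renditions and bitrates below are made-up illustrative values, not any real service's numbers:

```python
# Hypothetical bitrate ladder: several encodes of the same title,
# served according to the bandwidth the player measures.
LADDER = [            # (label, required Mbps) -- illustrative only
    ("240p", 0.4),
    ("480p", 1.5),
    ("720p", 3.0),
    ("1080p", 5.0),
    ("2160p", 15.0),
]

def pick_rendition(measured_mbps, headroom=0.8):
    """Return the highest rung whose bitrate fits within the measured
    bandwidth, keeping some headroom so playback doesn't stall."""
    usable = measured_mbps * headroom
    best = LADDER[0][0]          # worst case: serve the lowest rung
    for label, mbps in LADDER:
        if mbps <= usable:
            best = label
    return best

print(pick_rendition(25.0))   # 2160p on a beefy home connection
print(pick_rendition(2.0))    # 480p on a train
```

Real players re-measure continuously and switch rungs mid-stream, but the core idea is this simple selection loop.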

Those who want 4K are going to be the same self-selecting group that bought into higher-end video formats before. They'll have the larger screens, be sitting closer for a more "theatrical" experience, and have beefy internet connections to get the highest-bandwidth version of files possible.

Sure, it's going to sell to people who probably won't benefit, but people get oversold all the time, so that's nothing new.

It still feels mad. Aren't we having a stupid argument that has nothing to do with the attributes of the film held in US salt mines?
If cinemas are being marginalised to spectacle:
exactly how stupid is the person pushing false pixel density with crap LCD screens?
why are the panel suppliers stuck in such a destructively reductive megapixel argument?

How is it that the 12-bit, VistaVision-classic-film-celluloid, 13,000-nit, blow-your-mind household panels are only available in our imaginations?

Ultimately I think that's exactly what it is: people want VistaVision in their homes, and one aspect of that is certainly clarity. Depth and quality of colour is a much harder metric to quantify.

And I think "digital" has a lot to do with it. Analogue scaling is much more forgiving: if you take an 8mm film and project it in a theatre, the image will be soft, but it will still look "natural." Digital images blown up past their original resolution, though, degrade in a very unnatural way. I think that's why resolution IS an important part of the picture.

If you went back 20 years, when all those shows were shot on videotape at 480i: if those people had known that their shows would be technically sidelined in just over a decade, do you think they would have thought harder about shooting on film (since no HD alternative existed)?

[Marcus Moore]"If you went back 20 years, when all those shows were shot on videotape at 480i: if those people had known that their shows would be technically sidelined in just over a decade, do you think they would have thought harder about shooting on film (since no HD alternative existed)?"

I believe most comedies and dramas were still being shot on film 20 years ago... prime-time shows, at least. Shows with decent budgets like ER were never shot in SD, instead going from film to HD (I think they shot on Red cameras when they went digital).

I think I'm mostly referring to sitcoms, like FAMILY TIES or THE COSBY SHOW, which were shot on video. In the 90s it seems like they went back to film with shows like CHEERS, SEINFELD and FRIENDS. I'm not sure when these kinds of shows moved to digital.

I mean, it's hard to say what the market for that 80s material is in the long run. But it is always amazing to think that shows from the 50s and 60s like LEAVE IT TO BEAVER or GILLIGAN'S ISLAND could conceivably be given a 4K treatment, while shows made decades later will forever be trapped in 480i hell.

[Marcus Moore]"I think I'm mostly referring to sitcoms, like FAMILY TIES or THE COSBY SHOW, which were shot on video. In the 90s it seems like they went back to film with shows like CHEERS, SEINFELD and FRIENDS. I'm not sure when these kinds of shows moved to digital.

I mean, it's hard to say what the market for that 80s material is in the long run. But it is always amazing to think that shows from the 50s and 60s like LEAVE IT TO BEAVER or GILLIGAN'S ISLAND could conceivably be given a 4K treatment, while shows made decades later will forever be trapped in 480i hell."

Ah, I see your point. Though I think it's always been a question of budget, even for popular shows like All in the Family or Barney Miller... video was just cheaper to shoot. Then again, who could have predicted (then) that there would be a market for high(ish)-quality on-demand video in the future?

Considering the lack of flexibility in the FCP X UI on a single screen, I'm still leaning towards a second Cinema Display. Working in Motion on two screens is especially great, with one for JUST the timeline (an option FCP X needs).

There's been lots of scuttlebutt about a 4K ACD coming in the fall, but I'm honestly thinking I may just get another current TB Display. The way I figure it, a 4K ACD is definitely going to come at a price premium, maybe $2,000-$2,500? For that money, I think I'd rather just get another current-gen display, which still looks great, and then spend my 4K money on a larger TV display for the studio instead, where the resolution will be more noticeable.

The only way I'll be cheesed off is if Apple releases a 4K display at the same or only marginally more money, but I just can't see that happening.

[Marcus Moore]"Considering the lack of flexibility in the FCP X UI on a single screen, I'm still leaning towards a second Cinema Display. Working in Motion on two screens is especially great, with one for JUST the timeline (an option FCP X needs)."

I would never do that now, especially because the current Cinema Displays are glossier than the current, thinner iMacs. They don't even have USB 3.

I don't perceive this as a limitation of a single screen. Literally everything can be open. The 34 isn't just screen size, it's resolution.

It looks very nice, and I've heard great reviews. The only disadvantage I see here is the lack of available vertical space for the Project timeline, which is always at a premium. I'd love to be able to throw just the timeline out to the second display, giving me oodles of vertical height for complex Projects with no need for scrolling up and down.

It's basically the same height as the current 27", it's just so damn wide it makes it look short.

I've drooled over this thing since Bob Zelin posted about it at NAB. Odds are I'm getting one for my office when the next rMBP refresh comes, unless of course Apple releases a variant of this with a FaceTime camera. In our experience, the lack of a camera is its only negative.

I worked on it for a few days straight the other weekend and it's just good fun. The Mac Pro was also obviously more fun to work with than an iMac.

I guess I don't understand your point, then. You're saying that you lose vertical resolution for timelines, but it has every bit as much vertical resolution as WQHD monitors, which now represent most higher-resolution 27" monitors. True, there are still a few 30" monitors with 1600 lines instead of 1440; is that what you're referring to?

Moving to this display would give me more horizontal space (not resolution) for the Project timeline, but vertical space is what I need more of in complex Projects.

That's why I say I'd probably go with two displays over this extra-wide one, since being able to put the whole timeline on the second display (which we can't even do at the moment; feature enhancement request sent) would give me OODLES of vertical height for the timeline (and a whole 27" screen's width too!).

But this ultra-wide screen would definitely be a better single-display option than the current ACD: same height, extra width for the timeline and Event Browser.

Gotcha. I've always been a two-screen (three including broadcast monitor) guy, but I'm really enjoying this. If I get a Mac Pro down the line, I might go for two of these on it. It's funny that I've moved to "if." I find the new MacBook Pros so versatile that, given my need for mobility, I'm more inclined to go with them. I'm still not a big fan of the 15" screen, but I can set up multiple sites and move CPUs between them, which is kind of delightful. TBolt really was a game changer for me.

But it looks like it sold out like crazy. This monitor was originally released in January in South Korea, and then released in June in the USA. Maybe the rest of the world wanted it too. When I ordered it early on Amazon, I got a notice that shipping would be delayed. I got lucky and went to Frys the day it was released, and they had two in stock.

If I were a betting man, I'd say these could end up being the guts of a new Thunderbolt Cinema Display.

John, have you played back 'Scope-format films (roughly 2.39:1 aspect ratio) on that 3440x1440 screen? It must be a pleasure to see. I don't know why this is marketed as 21:9, since that would be a 2.333:1 aspect ratio and your screen is 2.389:1. Very nice, I imagine.
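For the curious, the arithmetic behind those ratios checks out:

```python
# Quick check of the aspect-ratio maths in the post above.
print(round(3440 / 1440, 3))   # 2.389 -- the panel's true ratio
print(round(21 / 9, 3))        # 2.333 -- what "21:9" marketing implies
print(round(2.39 * 1440))      # 3442 -- pixels a full-height 'Scope frame would need
```

So a 2.39:1 film at full panel height is only a couple of pixels wider than the screen, which is why 'Scope material fills it almost edge to edge.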

Does anyone know if these ultra-wide monitors can be used as a virtual dual screen? I'm concerned about how I'd use Final Cut Pro X on one of these, since it doesn't have windows, yet you can put the Event Browser or Viewers on a second display. I'm really concerned about whether I'd lose any capabilities in FCP X by having a single ultra-wide monitor replace a dual-screen setup.

While we're at it: is anyone using a curved monitor for editorial, and how's that going?