New submitter tvf_trp writes "Fox Sports VP Jerry Steinberg has just announced that the broadcaster is not looking to implement 4K broadcasting (which offers four times the resolution of today's HD), stating that 4K Ultra HD is a 'monumental task with not a lot of return.' Digital and broadcasting specialists have raised concerns about the future of 4K technology, drawing parallels with 3D's trajectory, which despite its initial hype has failed to establish a significant market share due to high prices and a lack of 3D content. While 4K offers some advantages over 3D (no need for specs, a considerable improvement in video quality, etc.), its prospects will remain precarious until it can get broadcasters and movie makers on board."

I bought their original 50-inch model in May of this year to use as a monitor. I paid $1,099 at the time, with Amazon Prime shipping.

There were a few little annoyances I had to work out immediately, and the Seiki support people were great; they got me new firmware to fix a few things.

The only functional issue I have left is that it won't auto-wake from the HDMI signal on my video card (it's actually the video card's fault, not the monitor's), so I have to hit the power button.

Overall I'm happy with it. Here are a couple of my quick comments:
- The screen is a little glossy for my taste, but not horrible (personal preference).
- The colors are a little oversaturated; I should probably do a color calibration on it.
- The monitor is a little too big; I actually have to turn my head and pick up my mouse more than I'd like for stuff on the far edge. I've been telling people a 42" would be about perfect, so the 39" looks nice, especially for the price.
- In a couple of games I've thought I've seen a little ghosting, but nothing horrible.
- At 4K the HDMI input is limited to 30 Hz, but the actual screen refresh is still normal.

I originally said I would try it for 60 days and, worst case, it would become just another TV. That time expired in July and I'm still using it.

Maybe I'm just a simpleton, but I recently went out to get a new monitor.

I ended up getting a 1080p 23 inch LED TV instead and just plug in my PC via HDMI.

Now, like I said, I'm a simpleton, and I'm sure other people can make use of much higher resolutions or other characteristics that my simple eyes and brain cannot process.

But for me, I sat there staring at the monitors and then the TVs. Then I looked at the prices; they're about the same, and it just made sense to get the TV. It comes with built-in sound and a remote control (good for volume control too).

Most people who replied to you didn't answer you and most of those people gave you the wrong answer. A number of people said that the Seiki will only run at 1080p with a computer attached, which is just flat wrong.

The 4K Seiki will run at full resolution in both the 39-inch and 50-inch models. The limiting factor on the Seikis is the connector, which is standard HDMI: at 4K it cannot push more than 30 Hz, which is a very low refresh rate for monitors these days. Indeed, the panel itself supports 120 Hz, but because the set only comes with an HDMI jack that allows 30 Hz at 4K, you're stuck at 30 Hz.

In the next year, hopefully other companies or Seiki itself will come out with displays with HDMI 2.0 or Thunderbolt ports at similar price points. This will allow higher refresh rates, prevent screen tearing in 3D work and gaming, and improve fast-motion scenes.
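To put rough numbers on the HDMI bottleneck: the link payload rates and the blanking-overhead factor below are approximations (not exact CEA-861 timings), but they show why 4K at 60 Hz doesn't fit in HDMI 1.4 while 30 Hz does.

```python
# Rough numbers behind the HDMI refresh limitation. Link payload rates and
# the blanking-overhead factor are approximations, not exact CEA-861 timings.

def video_data_rate_gbps(width, height, bits_per_pixel, refresh_hz, blanking=1.2):
    """Raw pixel data rate in Gbit/s, padded by a rough blanking-interval factor."""
    return width * height * bits_per_pixel * refresh_hz * blanking / 1e9

HDMI_1_4_GBPS = 8.16   # usable payload after 8b/10b encoding (10.2 Gbps raw)
HDMI_2_0_GBPS = 14.4   # usable payload after encoding (18 Gbps raw)

for hz in (30, 60):
    rate = video_data_rate_gbps(3840, 2160, 24, hz)
    fits_14 = "fits" if rate <= HDMI_1_4_GBPS else "does not fit"
    fits_20 = "fits" if rate <= HDMI_2_0_GBPS else "does not fit"
    print(f"4K @ {hz} Hz needs ~{rate:.1f} Gbps: {fits_14} in HDMI 1.4, {fits_20} in HDMI 2.0")
```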

The MacBook Air is a specialist laptop specifically designed to be smaller, thinner and lighter. Apple has lots of laptops with 2560x1600 resolution; you just chose one designed for a different purpose.

Why do so many tablets have a higher resolution (and probably higher quality) display than the Air then? Even the iPad Mini has a higher resolution.

Beyond a certain point, higher resolution is no longer about displaying more data, but about displaying it better. The font remains the same physical size, but more pixels are devoted to it, leading to much crisper, clearer text without resorting to tricks like anti-aliasing and sub-pixel rendering.
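To put numbers on the tablet-vs-Air comparison, here's a quick pixel-density calculation. The panel figures are quoted from memory and worth double-checking; the high-resolution iPad mini assumed here is the Retina model.

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch, from panel resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

# Panel specs quoted from memory; double-check against the spec sheets.
displays = {
    "13-in MacBook Air (1440x900)":    (1440, 900, 13.3),
    "iPad mini w/ Retina (2048x1536)": (2048, 1536, 7.9),
    "42-in 1080p TV":                  (1920, 1080, 42.0),
    "42-in 4K TV":                     (3840, 2160, 42.0),
}
for name, (w, h, d) in displays.items():
    print(f"{name}: ~{ppi(w, h, d):.0f} ppi")
```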

It will be like HD and 3D. In a few years it will become standard on mid range and even cheap TVs.

The key difference with 3D is not the cost of the TVs, it's the cost of the broadcast equipment and cameras. 3D was actually quite a cheap upgrade from HD, and most of the same equipment and software could be used with a few modifications. 4K is another ball game though.

Even worse there is 8K on the horizon as well which will require yet more brand new equipment. NHK, the Japanese national broadcaster that invented 8K, has stated that they will not support 4K at all and are instead going to look at going directly to 8K around 2020 (in time for the Olympics). I have a feeling they may not be alone in wanting to wait, but of course TV manufacturers all want to push 4K as a reason for the consumer to upgrade or pay a premium.

4K is horizontal resolution. That's not a marketing trick; they're using digital theater projection lingo. It makes more sense for theaters, since all movies are the same width but not all the same height, due to aspect ratio differences.

But for some material out there 480p is as good as it will ever get (old 80s tv shows).

Which was ironically shot on 35mm film and would just need to be re-edited to be released in 4K. Just look at Star Trek or Seinfeld in HD. On the other hand, shows from the '90s and 2000s were shot on digital video at a much lower resolution.

The only reason to move to 4K in the home is larger screens, as in larger than 60". That, and computer monitors. HDTVs came around before the content too.

I'll start to be interested in 4K when there are cheap devices, displays and content worthy of driving a 4K display.

Until home consoles are rendering 4K at 60 frames per second comfortably across all games, or super-mega-ultra-duper-Blu-ray becomes mainstream, I doubt your average joe will really care. Current generation consoles can't even do 1080p at decent framerates across all games. Though Blu-ray is pretty nice.

Well that's kind of in-line with my point. Samsung (or insert panel manufacturer here) can have a production run of 1920x1200 panels destined just for monitors. OR they can have a larger run of 1920x1080 destined for both TV's and monitors. Guess which will be cheaper?

1080p TV drove the adoption of 1920x1080 as the standard for PC monitors more than marketing.

Economies of scale don't have much to do with it, at least not in the TV > PC realm. The panels which go into TVs are very different than those that go into monitors. Combine that with the incredible size difference between the standard TV and the standard monitor and there's not much they share in common.

Yes, and not only for size. Different technologies too. Most cheap monitors are TN film panels. Many cheap TVs are IPS. The pixel pitch on a TV panel is typically larger (i.e. the physical pixels are bigger), because even a small TV is likely to be viewed from a couch rather than from a chair at a desk, and because panels are easier to manufacture that way.

What may amaze you is the sheer number of different panels in a single manufacturer's product line, even within the same model. As far as I can tell for instan

HDTV is shot at 16:9 because that's what the TVs are. But movies are usually wider, at 1.85:1 or 2.40:1.

16:9 was chosen because it was more or less a compromise between the common widescreen film ratios and the narrower 4:3 SDTV and 1.375:1 Academy ratios.
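The compromise can be made concrete. One often-cited account (Kerns Powers' SMPTE work) has 16:9 chosen because it sits near the geometric mean of the extreme aspect ratios then in use; treat this sketch as illustration rather than official history.

```python
import math

# Kerns Powers' often-cited SMPTE argument: 16:9 is close to the geometric
# mean of the extreme aspect ratios in common use. Illustration, not history.
narrowest = 4 / 3   # SDTV
widest = 2.39       # CinemaScope
compromise = math.sqrt(narrowest * widest)

print(f"geometric mean of {narrowest:.2f} and {widest:.2f} = {compromise:.3f}")
print(f"16:9 = {16 / 9:.3f}")
```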

There's a good youtube video about this sort of thing here [youtube.com], and the wiki article on the 16:9 ratio [wikipedia.org] is also handy.

Now it may well come to pass that movies will be shot in a native 16:9 ratio, but so far the trend is simply to make sure that all the action fits into that area when they crop the image for transfer to home video.

Of course, the moderate popularity of IMAX weirds things a bit. I remember seeing the Bluray release of The Dark Knight, parts of which were filmed for IMAX, which has a 1.44:1 ratio. The rest of the movie was in a more typical 2.40:1 ratio. Their solution was to present the conventionally filmed parts of the movie letterboxed, but to show the IMAX sections in 16:9, filling the frame of the TV, but still cropping the top and bottom of the original image.

My answer is:
C is demonstrably false, as I'm about two feet away from the screen I'm using at this very moment.
D is demonstrably false, as many sane people buy larger screens.

I suggest you rethink your position, replacing distance and size with field of vision. Your previous statement would turn into "a field of vision over n degrees is useless." To which I'd answer: "Anything less than my entire FoV is not enough."

I've never understood these people who never get close to their monitor to see more detail. For me, it's the most natural thing to want to do instead of "zooming." Just because I can zoom doesn't mean that sometimes I won't want to actually get closer and look.

There is such a thing as the "Resting Point of Vergence" which is the shortest distance at which people's eyes can focus effortlessly and indefinitely. The average is 45" looking straight ahead and 35" looking on a 30 degrees down-angle. Sitting closer to your TV/monitor than your RPV will cause eye fatigue over time. In my case, that distance is around 30" looking straight ahead. For some people, it can be as short as 15". But the average is 45".

Somehow, I do not think pause-to-inspect-minute-detail is common enough to justify the billions of dollars it would cost the industry to make 4K the new standard any time soon, when it barely just got done upgrading to HD.

Are you stupid? Does your neck hurt yet? Are you tired of having to lean over to get a good head-on look at the third of the screen on either side of the middle, or do you just ignore two-thirds of your screen?

Sitting 2 feet away from a 42" display makes you a moron unqualified to continue this conversation.

I take it you've never played a first-person game on a 40" screen. Granted, I'm usually a little closer to 3 feet away (arm's length is the recommended distance to sit from a monitor). Filling a larger portion of your FOV is a great way to boost immersion. And yes, I do have to move my eyes a lot to see the full detail in the corners of the screen, but I have to do that out in the real world too.

Works great for office work as well (though that 4K resolution would be a huge bonus), in which case I'm generally only looking at a portion of the screen at a time, but can switch between tasks/monitor different things simply by moving my eyes, almost like working on a physical desk. And it's a big boost over multiple monitors in that you can size windows to whatever size and aspect ratio makes sense for the tasks at hand.

I don't think that's what GP was implying at all. In fact, quite the opposite.

A. 1080p on 42" at 10 feet away is more than most people can discern.
B. More than 1080p on 42" at 10 feet away has no value. More than 1080p may have value on a screen far bigger than 42" at 10 feet, or on a 42" far closer than 10 feet.

So he seems to imply:
C: 42" is less than 'insanely huge'.

Also, your assertion of D may or may not be correct, since D is undefined.
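Claims A and B can be sanity-checked with the usual one-arcminute rule of thumb for 20/20 acuity. The rule itself is an approximation, so take the distances below as rough figures.

```python
import math

# One arcminute per pixel is the usual 20/20 rule of thumb; real acuity varies.
ARCMIN = math.radians(1 / 60)

def max_useful_distance_ft(diagonal_in, horizontal_px, aspect=16 / 9):
    """Distance beyond which adjacent pixels subtend less than one arcminute."""
    width_in = diagonal_in * aspect / math.hypot(aspect, 1)
    pitch_in = width_in / horizontal_px
    return pitch_in / math.tan(ARCMIN) / 12  # inches -> feet

for px, label in ((1920, "1080p"), (3840, "4K")):
    print(f'42-in {label}: individual pixels blend beyond ~{max_useful_distance_ft(42, px):.1f} ft')
```

At 10 feet, a 42" 1080p panel is already past the point where individual pixels blend (roughly 5.5 feet), which is consistent with claim A; at two or three feet, it is not.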

4k may not make much sense on a 42" TV, but on 55" the difference is clearly visible. And screens are getting bigger all the time, with sizes around 65" being common and even a few screens of over 100" hitting the market.

Also the comparison to 3D is flawed. 3D requires 3D content, but viewing stuff on a 4k screen carries a benefit even for content not in that resolution. Compare an ordinary blu-ray on a HD screen and a 4K one (both 55" or over); you'll see a marked difference in quality thanks to the upscaler. The same way DVDs look way better on my upscaling HD screen than they do on a lower res one of the same size.

The ability to see individual pixels is not the limit of perceptible improvement, though. Even on 'retina' displays there is visible aliasing on diagonal lines. Think about it like this: a 12 nm chip fab produces individual elements at 12 nm, but places them with much, much better than 12 nm accuracy.

This, like the "you can only see so many colors" argument, is misleading.

You absolutely can tell the difference between 4K and 1080p at average viewing sizes and distances, but not because you can pick out the individual pixels. Lower pixel density creates visual artifacts: aliasing, uneven gradients, pixel pop (where small elements or points of light like stars get lost between large pixels), etc. If you see a 4K and a 1080p display side by side, the difference is shocking.

I've seen 4K on a not-yet-released 20-inch Panasonic tablet [engadget.com] - it's jaw-dropping. You might not be making "full use" of it, but... oh my, it's beautiful. This from a guy who doesn't care much for TV or video.

OK, you're asking "why a 20" tablet? WTF?" - one vertical market for this is radiologists, who definitely need all the resolution they can get, high dynamic range, and a big screen. Saw it at a medical convention.

Cable companies have a hard enough time providing enough bandwidth for more than a couple HD channels; where are they going to find the bandwidth for 4K Ultra HD? Does Blu-ray even have the ability to take advantage of this technology? How about gaming platforms? What, exactly, would let someone justify their investment?

Yes, but 4K content can be rendered in a video game given the right hardware/software. 4K video requires that the content be recorded and sent at that resolution. Content providers like cable channels are not even producing all of their content in 1080p, much less 4K.

Exactly - if content providers aren't even willing to send enough bitrate through the pipe to deliver a satisfactory experience by today's HD standards, who on Earth would imagine they'd do justice to 16x the bandwidth requirement just a few years from now? Some broadcasts are still MPEG-2; some others are MPEG-2 but get passed through a last-leg AVC transcoder to save bandwidth; and while AVC's enjoying healthy adoption, there's no way to expect most companies will pay the hefty fees to adopt HEVC equipme

Not to mention... if you want to stream a 4K show over Hulu or Netflix, you'd hit your AT&T or cable provider's MONTHLY bandwidth cap in ~5 MINUTES (variable; depends on compression, cap, buffering, throughput, etc.).
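A quick check on the minutes figure, where the cap size and bitrates are assumptions for illustration: at typical compressed-stream bitrates a monthly cap lasts many hours, and the ~5-minute number only works out if you assume something close to uncompressed 4K.

```python
# Time to burn through a monthly data cap at a given stream bitrate.
# The 250 GB cap and the bitrates are assumptions for illustration.

def minutes_to_cap(cap_gb, bitrate_mbps):
    return cap_gb * 8e9 / (bitrate_mbps * 1e6) / 60

CAP_GB = 250  # a common cable cap of the era (assumption)

for label, mbps in (("compressed 4K stream (~25 Mbps)", 25),
                    ("lightly compressed 4K (~300 Mbps)", 300),
                    ("uncompressed 4K 60p (~6 Gbps)", 6000)):
    print(f"{label}: cap exhausted in ~{minutes_to_cap(CAP_GB, mbps):,.0f} min")
```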

Reminds me of LTE: Verizon Wireless is quick to point out that you can download from them at over 50 Mb/sec. They won't tell you that after 60 seconds your phone bill is now $600.

Cable companies have a hard enough time providing enough bandwidth for more than a couple HD channels, where are they going to find the bandwidth for 4K Ultra HD?

They can start by charging for analog channels commensurately with the bandwidth they use, rather than giving away the analog stuff they modulate in-house for "free" while charging "extra" for digital content they've nothing to do but encrypt.

Why the heck would I want UHD when most HD content is so compressed that the artifacts are easily discernible from across the room? At least that is my experience with every HD medium I have seen: OTA, cable, satellite, and to a much lesser degree Blu-ray.

I came here to post this. I'm in the minority, but to my eye it is more pleasant to watch the old grainy picture than it is to watch compressed high resolution video. In particular, my eye gets drawn to grass. Every time I watch a game played on grass (baseball, football, the other football, etc), the digital compression just hijacks my eyes. I can learn to ignore it over time, like watching a movie with subtitles, but it still is not my preference.

Best way to turn almost all compression artifacts into regular noise; your brain is great at perceiving that as higher-quality imagery. Using post-resize noise or post-resize sharpening (MPC-HC or MPC-BE's Sharpen Complex 2) also works great to turn 720p content into '1080p'.

I agree. Compression is the primary issue here. Make the resolution 10k and it'll still look like crap because of the heavy compression. But if you're claiming to see compression artifacts on a blu-ray disc I think you need your eyes checked. Those usually don't use anywhere near the compression of cable TV.

Why the heck would I want UHD when most HD content is so compressed that the artifacts are easily discernible from across the room? At least that is my experience with every HD medium I have seen: OTA, cable, satellite, and to a much lesser degree Blu-ray.

You have a point, but you lost credibility when you included OTA in that list. OTA is uncompressed 18.2mbit MPEG. There is no point in compressing an OTA broadcast because the bandwidth is functionally unlimited, and I don't even think that the ATSC standard supports compression beyond normal MPEG2. When you see artifacts on an OTA broadcast it is most emphatically *not* from compression, it's usually from interference or a badly tuned/aligned antenna.

MPEG-2 is compressed by definition; an uncompressed HD picture is something like 1 Gbps. Confetti, for example, looks awful no matter what the source, because it's hard to compress.

The only reason MPEG-4 isn't supported in ATSC is because it didn't exist when the standard was written! MPEG-4 is actually now in ATSC, but is not a required part, so no receivers support it and no broadcasters use it except in rare corner cases.

And it's only 18.2 Mbps if there are no other services on the OTA channel; some stations in smaller markets now cram 3 HD services into the 19.393 Mbps channel, which is an average of about 6 Mbps per video channel when you take into account audio and overhead. Most other stations run at least one SD channel in addition to the HD channel, many run more than one. Others are doing Mobile DTV which eats into the bandwidth available. The bitrate of a single HD feed averaged across all OTA stations in the US and Canada is something in the neighborhood of 13 Mbps in MPEG-2.
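The channel-splitting arithmetic above can be sketched as follows. The per-service audio/PSIP overhead figure is a rough assumption; only the 19.393 Mbps channel payload comes from the post itself.

```python
# Splitting an ATSC channel between HD services, using the figures above.
CHANNEL_MBPS = 19.393   # total ATSC payload per 6 MHz channel
OVERHEAD_MBPS = 0.5     # per-service audio + PSIP overhead (rough assumption)

def per_service_video_mbps(num_services):
    """Average video bitrate per service after subtracting overhead."""
    return (CHANNEL_MBPS - num_services * OVERHEAD_MBPS) / num_services

for n in (1, 2, 3):
    print(f"{n} HD service(s): ~{per_service_video_mbps(n):.1f} Mbps of video each")
```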

Obligatory disclaimer: I used to work for a broadcast TV company heading up our broadcast TV engineering projects. I now work for the FCC on over-the-air digital TV matters. In my spare time, I run digital TV website RabbitEars.Info.

OTA broadcast is, as you say, MPEG. Or, more precisely, MPEG-2. To say it is uncompressed is completely false. It may not be over-compressed, but you still see artifacts in scenes that the compression can't handle well, particularly scenes with rain or fire--anything chaotic where there are massive changes between frames.

There are so many facts wrong in your post that I sincerely hope you don't work in the technical field of broadcasting. OTA uses MPEG-2 (the same codec as on DVDs), which is a lossy compression technique. ABC, NBC, and CBS stations all take an MPEG-4 feed from their network and re-encode it to MPEG-2; FOX stations get MPEG-2 video into which they then "splice" their network bug and local commercials and promos.
Getting a lousy picture on digital TV from a poor or unaligned antenna is a lie that salesmen use. If th

What consumers want is a stable technology, not to be on a constant upgrade treadmill.

There are different kinds of consumers. What you say is probably true of the consumers buying their sets at Walmart and Target, and it's probably true of me as well (at least to a degree). But I know plenty of people who are always on the bleeding edge. This 4k stuff is blatantly targeted at those consumers, and it may or may not trickle down to the rest of us... sometimes these high end things succeed (hi-fi VHS, HDTV) and sometimes they fail (videodisk, DVD audio), but the high-end, bleeding edge folks ge

As it is right now, the only true 1080p content is high bitrate blu-ray disks, and PC games. There is nothing else.

None of the currently released consoles can render 1920x1080 at 60 fps: they use a lower frame rate (30 fps) and a lower rendering resolution (not even 720p internally for most games). The next gen can maybe do it, but I suspect that some games will use lower frame rates or internal resolutions so that they can put more detail into other things.

Broadcast channels, satellite channels, and HD cable channels are all generally full of lower bit-rate tradeoffs. You need about 30-50 Mbps to do 1080p without compromises or visible encoding errors.

Maybe in another 10 years, when the technology is actually fully utilizing the 1080p displays we already have, will an upgrade make sense.

Note that this is for video content. For your computer or tablet PC, higher resolutions are useful, and shipping tablets are already at higher resolutions.

There is way too much current content that is still not transmitted in 1080p. Buying a new (expensive) TV just to display most shows in standard resolution makes no sense at all. Yes, I know live broadcasts are usually in high def, but one can only watch so much sports on TV.
To be fair, I think it is actually a legacy problem. There is so much good legacy content recorded in standard definition that it is tough for new content to compete, at least from a percentage perspective. Best excuse for a good

Actual 1080p isn't even here yet for a lot of media. Most games and TV stations still only use 720p, and there are quite a few movies in that mode as well. It's no surprise that no major content provider is considering 4K at this point.

The timing for 4K is just too soon and wrong. Firstly, we're in a delicate financial situation around the world and the biggest consumer nation is on the edge of collapse. It seems like only a few days ago we went to digital TV. People are STILL getting rid of the CRT TVs. And the marketers are trying to sell us 4K TVs??! I'm sorry but no. Just no.

3D TV's failure was most certainly not a 'lack of content', and if it's perceived that way by the media mavens, then the same mistakes will be repeated.

3D failed because:
- It was technologically not ready for prime time; wearing uncomfortable specs etc. wasn't popular in theaters the FIRST go-around with 3D.
- People recognized it for what it was: a money grab by hardware producers trying to re-milk a public that had already been forced to go out and buy all-new digital TVs.

Just give it up. Broadcast TV standards don't change overnight, and 4k is going to take huge effort, to provide a small improvement.

You're talking about making all those receivers people just went out and bought completely useless. The government would have to PAY to replace them, just like it did with digital converter boxes a few years ago.

And don't tell me about satellite/cable companies! They lag BEHIND broadcasters, they do not take the LEAD... And internet service looks to be more bandwidth cons

When I visit the local Sony store and see the 4K sets (with true 4K content) side-by-side with their best regular HDTVs, the improvement is quite stunning. They get pretty close to the "appearing like a real window rather than just a TV" threshold.

It is not about having four times as many things on the screen as a 1080p monitor.
It is about having a 2:1 pixel ratio (like all the Apple Retina displays), or somewhere in between.
Web content, thanks partly to Apple pushing high-DPI displays, is now often tuned for this, showing you twice as much detail in the same space while keeping the dimensions it would have on a normal-DPI display.

Read what AnandTech had to say about testing a 4K monitor, and about how nice it is to look at fonts that aren't just anti-aliased, but hardly have aliasing to begin with, thanks to the DPI.

I run a 1440p monitor, as it was the most pixels I could reasonably afford, (4K is just too much $) and I scale everything up so it's roughly 1080p sized. I love it for the clarity and sharpness, not for the number of things I can cram on the screen. (Although I do run my font just a little small in my text editor/ide)

There are of course downsides besides the price. Most of the 1440p monitors have poor input latency, meaning your mouse might feel a tiny bit laggy or put you at a slight disadvantage if you're a gamer, compared to lower latency 1080p monitors. That's totally ignoring whether your video card can render smoothly at that resolution. With 4K I'm not sure but I suspect it's the same or worse.

You just need to look at the higher-resolution phones to realize what you're saying is bullshit (and those are ridiculously small 5" screens, although you do look at them closer than a television). The so-called "Retina" display by Apple is still far short of the maximum resolution we can see. Have you actually gone and looked at a 1080p display before deciding on your 720p monitor, or did you trust your flawed math and go with it? Here's the actual math [clarkvision.com], with references to the visual acuity numbers.

AVC encoding may optimally be able to cut the bit rate in half, so say 50 Mbps for AVC 4K 60p. I'm not sure live AVC encoders are actually at that point yet, but I believe they will make it in a year or two.

Then assume HEVC cuts the AVC bit rate in half. So 25 Mbps for HEVC 4K 60p. I know there are not
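The halving chain works out like this, assuming a hypothetical 100 Mbps MPEG-2 baseline for 4K 60p; the posts above only state the AVC and HEVC steps.

```python
# Each codec generation is assumed to roughly halve the bitrate at similar
# quality. The 100 Mbps MPEG-2 starting point is a hypothetical baseline.
rate = 100.0  # Mbps
for codec in ("MPEG-2", "AVC", "HEVC"):
    print(f"4K 60p in {codec}: ~{rate:.0f} Mbps")
    rate /= 2  # next generation halves the bitrate
```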