Posted
by
by timothy on Friday January 03, 2014 @07:52PM
from the think-of-the-poor-pipes dept.

sfcrazy writes "YouTube will demonstrate 4K videos at the upcoming CES. That's not even the best part of the story: Google will do it using its own open-sourced VP9 technology. Google acquired the technology when it bought On2 and open sourced it, then began offering the codec to vendors on a royalty-free basis to boost adoption. Google has also learned the hardware partnership game and has already roped in hardware partners to use and showcase VP9 at CES. According to reports, LG (the latest Nexus maker), Panasonic and Sony will be demonstrating 4K YouTube using VP9 at the event. Google today announced that all leading hardware vendors will start supporting the royalty-free VP9 codec. These hardware vendors include major names like ARM, Broadcom, Intel, LG, Marvell, MediaTek, Nvidia, Panasonic, Philips, Qualcomm, Realtek, Samsung, Sigma, Sharp, Sony and Toshiba."

"4K" refers to the horizontal resolution, not the number of pixels. For most "4K" TVs it is actually 3840 × 2160 pixels, or about 8.3 megapixels. Some models are much less than $3K. Here is one [amazon.com] for $500.

Yes, but the problem with DisplayPort is that it's a royalty-free standard, so to implement it the manufacturer has to pay royalties to no one, making it expensive. In contrast, HDMI requires implementers to pay $10,000 per year plus a royalty rate of $0.15 per unit, reduced to $0.05 if the HDMI logo is used, and further reduced to $0.04 if HDCP is also implemented, making it cheaper. Or something.

Well if we're gonna get technical, you just calculated pixels per dollar whereas he was calculating dollars per pixel. Switch your numbers around: 3000 dollars / 8,294,400 pixels ≈ 0.00036 dollars per pixel. So technically he is still right. It is less than a dollar per pixel.
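For anyone who wants to check the thread's arithmetic, here is the same calculation spelled out (a trivial sketch; the $3000 price and UHD pixel count are just the figures quoted in the comments above, not current market numbers):

```python
# Redo the dollars-per-pixel arithmetic from the comments above.
pixels = 3840 * 2160            # a UHD "4K" panel
price_dollars = 3000            # the $3K TV from the parent comment
dollars_per_pixel = price_dollars / pixels
print(f"{pixels} pixels -> {dollars_per_pixel:.5f} dollars per pixel")
```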

The Seiki SE50UY04 [cnet.com] shows up at less than a thousand pretty frequently.

The one major downside is that the cheapies almost certainly have neither DisplayPort nor HDMI 2.0. HDMI 1.4 will drive a 4K panel, but maxes out at something like 30 Hz. Given that pre-canned 4K video is practically nonexistent (but would be the material that might have been shot at under 30 FPS originally, and has plenty of detail in the original film if somebody feels like doing a good transfer), the only real use case is hooking it up to a computer, where the refresh rate will promptly unimpress you.

It won't flicker or anything, this isn't the CRT days; but 30FPS is Not Good.
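The 30 Hz ceiling falls out of HDMI 1.4's 340 MHz TMDS pixel-clock limit. A rough sketch of the math (the 20% blanking overhead is an approximation I'm assuming, not a figure from the spec):

```python
# Approximate pixel clock needed to drive UHD at a given refresh rate.
# HDMI 1.4 caps the TMDS pixel clock at 340 MHz; the 1.2 factor is a
# rough allowance for horizontal/vertical blanking intervals.
HDMI14_PIXEL_CLOCK_HZ = 340e6

def approx_pixel_clock(width, height, fps, blanking_overhead=1.2):
    return width * height * fps * blanking_overhead

for fps in (30, 60):
    clock = approx_pixel_clock(3840, 2160, fps)
    verdict = "fits" if clock <= HDMI14_PIXEL_CLOCK_HZ else "exceeds"
    print(f"3840x2160@{fps} Hz needs ~{clock/1e6:.0f} MHz -> {verdict} HDMI 1.4")
```

By this estimate UHD at 30 Hz squeaks under the limit and 60 Hz is roughly double it, which is why 4K@60 had to wait for HDMI 2.0 or DisplayPort.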

Unless the DCI changes course, fast, the point will be largely moot. DCI 4K is higher resolution than 4K UHD; but 4K UHD hardware is available right now (with the cheap seats down to under $1000, and the 'walk into a Sony store and look rich and clueless' option still likely to set you back only ~$5k). DCI 4K gear is... niche. Not only is it almost entirely high end projectors designed for commercial movie theaters (and priced accordingly), the DCI writes their specs with all the paranoia, loathing, and paternalism you'd expect.

The DCI spec won't change because it's a cinema standard meant to protect cinema content, but there's no reason for consumer level gear to follow it. Whether you have a 16:9 (HDTV), 17:9 (DCI) or 21:9 (ultra wide) ratio screen, it should play consumer movies just fine. The only question is whether the delivery system will have a 17:9 version of the movie; that depends on how they've reframed it. If you could take the 3840x2160 stream and add a 256x2160 slice left/right, you could have "dual format" discs that'd give you both framings.

I've actually got a Sony X9005A as a desktop display for my PC and no, the 29Hz refresh rate does not make it "unimpressive". If you're looking to be impressed, the resolution will vastly overpower the refresh rate. When you have a window-like view into your games, photos etc., you just instinctively ignore the slow refresh.

The worst thing is probably the input lag introduced by the low refresh rate. The thing has one of the lowest input lag scores on the market, but the slow refresh still makes cursor input really laggy. It's not the kind of lag you see but the kind you feel. It's gone if you switch to 1080p, but you won't if you have a 4K panel, will you.

FWIW the Sony supports hdmi 2.0 and thus 4k@60fps, but good luck finding a GPU that outputs it. I'm stuck waiting for the eventual NV GTX 800 series which probably will. NVIDIA haven't even confirmed it.

On the topic of YouTube, I thought they'd supported 4K since 2010 [blogspot.fi]. In fact, 4K vids on YouTube were among the first material I tested my panel on. They stream fine over 24 Mbps ADSL2, but the bitrate is not great (the vids are noisy).

Try $500 [amazon.com] plus shipping for a 39". Of course it's only capable of 30 Hz at UltraHD resolution because it only has HDMI, but it's available. I can't imagine why they didn't include a DisplayPort connector, since DisplayPort is royalty-free, but they didn't. I'm hoping they'll release another model this year that includes DisplayPort.

Reviews say Seiki customer service is nothing great, and there are more than their fair share of DOA units, but it's a start.

I'm actually genuinely curious what type of card you need to run non-gaming (maybe some blender) at 4K. I don't have much interest in high-quality gaming, but would LOVE to put websites, code, monitoring software, feeds, etc on a single display. I haven't bought a video card in years and really don't know what it would take.

Unfortunately the current crop of monitors is aimed mainly at the photography professional, since being able to see 8 MP of a still instead of 2 MP is immediately useful to anyone with a modern camera. The downside - for those of us who want affordable monitors, at least - is that they also put great emphasis on color accuracy and coverage, so that $3k buys you a Dell UltraSharp 32 with 99% AdobeRGB coverage and factory calibration to a delta-E of 2. In short, they promise you can plug it in and the colors are pretty much perfect out of the box.

Doesn't matter about entering the consumer arena with display technology. I think this is very limited by bandwidth technology and realistic costs of deployment in the US. It's going to be amazeballs, but in South Korea, Japan, and several EU countries.

YouTube said it was demonstrating streaming of 4k. I would assume that to be true since YouTube is a website...

Doesn't that put the required bandwidth for streaming at 20-30 Mbps in ideal networking conditions with moderate compression? And that's for a single stream.

We're talking about YouTube here, well known for mutilating 1080p content with an H.264 encoder until it fits in 4 Mb/s. I'd be rather surprised if the 2160p stream is more than 10 Mb/s average.
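To put a 10 Mb/s 2160p stream in perspective, here is a back-of-the-envelope comparison against the raw source (the 30 fps and 24-bit RGB figures are assumptions for the sketch, not from the thread):

```python
# Raw bitrate of uncompressed 2160p30 versus a 10 Mb/s stream.
width, height, fps, bits_per_pixel = 3840, 2160, 30, 24
raw_bps = width * height * fps * bits_per_pixel
stream_bps = 10e6
print(f"raw: {raw_bps/1e9:.2f} Gb/s")
print(f"compression ratio at 10 Mb/s: {raw_bps/stream_bps:.0f}:1")
print(f"stream budget: {stream_bps/(width*height*fps):.3f} bits per pixel")
```

Roughly a 600:1 reduction, or a budget of about 0.04 bits per pixel, which is why heavy-handed encoding artifacts are a real concern at these rates.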

So basically, look horrible, look like it was streamed.

Kind of defeats the point of the 4K investment, though. Only local content is going to be played; on-demand will need one hell of a buffer for full quality. Piracy is going to be difficult (a plus for them), since the files can only be larger than Blu-ray's, and those can be upwards of 35 gigs. Most people aren't patient enough to max out their connection for several hours to get a movie.

Now we have a whole other format for 4k that we can pay extra for to Netflix or B
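The "several hours" estimate is easy to sanity-check (35 GB is the Blu-ray figure from the comment above; the link speeds are arbitrary examples):

```python
# Time to download a 35 GB movie at various sustained link speeds.
size_bits = 35e9 * 8
for mbps in (10, 25, 100):
    hours = size_bits / (mbps * 1e6) / 3600
    print(f"{mbps:>3} Mb/s: {hours:.1f} hours")
```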

I think the point is so Google gets VP9 support out there, mainstream.

See, we all know it'll be so highly compressed it'll look crap. I'm sure that's intentional partly to save bandwidth, but also to encourage you to buy stuff instead of just youtube streaming it all the time.

But the 4K-over-VP9-only move, that's a message to producers that they must start making their stuff available in VP9 format if they don't want to be "left behind" by future video technology, and they will. And once that's the only format…

4K resolution is a generic term for display devices or content having horizontal resolution on the order of 4,000 pixels. Several 4K resolutions exist in the fields of digital television and digital cinematography

And with that resolution you can see the layers of pancake makeup on your favourite actors and actresses, plus all that spitting during sports events in astounding clarity.

Well, I'd like the vertical resolution just for photo editing; it would be nice to see the full resolution without artifacts from scaling algorithms.

Yeah, but it will deliver 1080p and 720p video to you with lower bandwidth requirements. Less buffering and fewer artifacts (because of lowered data requirements and a corresponding lower rate of dropped packets).

And with that resolution you can see the layers of pancake makeup on your favourite actors and actresses, plus all that spitting during sports events in astounding clarity.

You're like an echo from 15 years ago, when 1920x1080 was to replace NTSC 640x480; both HD porn and HD sports look great despite the naysaying. Movies and TV too: if the costumes, props, backdrops or special effects no longer looked real, they simply had to improve until they did. Why should UHD be any different? It might be that many people meet it with a yawn, like Super Audio CD vs CD, where for the vast majority a regular CD was more than good enough already, but the "too much detail" objection should be thoroughly debunked by now.

That's a complete waste of money. You'd need a gigantic screen and to be sitting so close that you can't see all of it in order to need 4k. 4k is for movie theaters.

I'm surprised that the idiots on Slashdot aren't aware that the pixel density the eye can perceive is the limiting factor here. Even with the current crop of HDTVs being 1920x1080, you need a rather massive TV, and to be sitting quite close, for the pixels to be a problem.

I've spent the last 7 years with a 37" 1080p screen as a primary monitor. I sit about 4 feet from it and can't see pixels, but almost (I see them at 3'). Games look absolutely awesome. Since it's showing signs of aging, if I can get a cheap 60" 4K screen in 2-3 years, I'll sit 5-6 feet from it, except when I have to display three datasheets side-by-side. Games will look awesome.
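The pixel-visibility argument can be made concrete with the usual 1-arcminute (roughly 20/20) acuity rule of thumb. A sketch, assuming flat 16:9 panels; sharper-than-average eyes will see pixels somewhat farther away:

```python
import math

# Distance beyond which a single pixel subtends less than 1 arcminute,
# i.e. where individual pixels blend together for typical 20/20 vision.
def pixel_invisible_distance_ft(diag_in, h_px, v_px):
    width_in = diag_in * h_px / math.hypot(h_px, v_px)  # panel width from diagonal
    pitch_in = width_in / h_px                          # pixel pitch
    one_arcmin = math.radians(1 / 60)
    return pitch_in / math.tan(one_arcmin) / 12         # inches -> feet

print(f"37in 1080p: {pixel_invisible_distance_ft(37, 1920, 1080):.1f} ft")
print(f"60in 4K:    {pixel_invisible_distance_ft(60, 3840, 2160):.1f} ft")
```

The numbers line up with the anecdote above: a 37" 1080p panel crosses the threshold at roughly five feet, and a 60" 4K panel at roughly four, so sitting 5-6 feet from the latter should indeed hide the pixels.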

Here in the States, we solve that a different way. The NFL (professional handegg league) runs a channel called RedZone [wikipedia.org] that compiles a real-time highlight reel of all matches in progress on any Sunday afternoon. Whenever a team gets the ball within 20 yards of the goal, RedZone cuts to that match until the team scores or otherwise loses possession.

Is there any word on how this '4K' actually looks at bitrates Youtube can push enough ads to pay for, and your ISP will permit?

I have the greatest respect for people who actually handle the challenges of paying the computational costs of video compression and decompression (and scaling if necessary) as efficiently as possible; but once their work is done, a nominal resolution (even an actual X pixels by Y pixels value, not some marketing bullshit) is nearly meaningless unless you are in the (rare for video, probably less rare for audio) situation of having such a high bitrate that the quality of your reproduction is being constrained by your resolution.

Barring an increase in bitrate, will it even be possible to distinguish between X Mb/s nominally-1080p video scaled up to 4K and X Mb/s nominally-4K video?

That’s not even the best part of the story: Google will do it using its own open-sourced VP9 technology. Google acquired the technology when it bought On2 and open sourced it, then began offering the codec to vendors on a royalty-free basis to boost adoption.

Google has also learned the hardware partnership game and has already roped in hardware partners to use and showcase VP9 at CES. According to reports LG (the latest Nexus maker), Panasonic and Sony will be demonstrating 4K YouTube using VP9 at the event.

VP9 is beneficial for everyone, as it makes the codec available to vendors free of cost, thus boosting its adoption compared to the non-free H.264/H.265. At the same time, being an open standard and open source, it also ensures that users won’t require proprietary (and insecure) technologies like Flash to view content. The third benefit of VP9 is that it can deliver high resolutions at low bitrates, thus using less bandwidth. It means that those on slower connections will not have to wait for buffering or settle for low-resolution videos, and those on faster connections won’t waste as much of their expensive bandwidth on video.

And not a single Apple device will play VP9. Every Apple device will require transcoding, or using whatever format they find optimizes their [battery life|thermal envelope|PROFIT], which will nudge every well heeled, non-technical user to gravitate away from VP9.

Jobs is gone. Android marketshare is up. Apple may not be as wedded to h265 as they were to h264. Things change.

True. Jony has destroyed much of the UI Jobs toiled over; maybe he will sell out and join this open standard. My money is on the no side of this though. I see Apple devolving into all the bad things with none of the elegance now that Jobs is feeding the worms.

And your evidence to support this assertion? So what if Apple is not among the chipmakers supporting VP9 out of the gate? Apple isn't in the business of making chips; it buys them and puts them in consumer electronics.

That's utterly untrue. You might mean that Apple devices won't include VP9 support out-of-the-box (unlike Android), but that's quite different, and won't necessarily hamper adoption. You might as well say: "And not a single Apple device will have Google Maps."

All that's needed is for a popular iOS multimedia app to include VP9, or perhaps even for someone to simply implement a VP9 decoder in JavaScript.

Don't get your hopes up too high, because I've heard the same all the way back to the VP3-based Theora and every version since. It's the escape hatch/emergency brake that keeps H.264/HEVC from abusing their near monopoly, but in reality nobody seems to want to abandon them, not even Google. Kind of like how VC-1 is in the Blu-ray standard and in every player but 99% of recent discs use H.264. Even Firefox finally bit the bullet in October this year and said they'd use Cisco's H.264 blob on platforms that don't have native support.

The difference with VP9 is that there isn't already an entrenched standard that's better. When work on Theora started, MPEG-1 was still pretty common for web video, but by the time it was released everyone had moved to MPEG-4. Theora was definitely a big step up from MPEG-1 (and MPEG-2), but not as good as MPEG-4. When VP8 was open sourced, it was better than MPEG-4 (ASP), but most of the rest of the world had moved on to H.264. Now VP9 and H.265 are appearing at the same time. No one is considering switching yet, so for once the free codec isn't arriving a generation late.

Fraunhofer say that VP9 has 8.4% worse bitrate (at the same PSNR) than H.264/MPEG-AVC, and encoding rates that are 100x slower. See page 3 here: http://iphome.hhi.de/marpe/download/Performance_HEVC_VP9_X264_PCS_2013_preprint.pdf I see no incentive to move in the direction of VP9. It's Google very persuasively shoving their proprietary format on everyone, that's all. We criticised Microsoft for doing that in the past; we shouldn't pretend that Google is anything apart from an enormous multinational that wants the same kind of control.
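For readers unfamiliar with the metric: PSNR is just mean squared error on a log scale, so "worse bitrate at the same PSNR" means more bits spent for the same average pixel error. A minimal sketch for 8-bit video (the MSE values are arbitrary examples, not figures from the paper):

```python
import math

# PSNR for 8-bit samples: higher is better; equal PSNR = equal average error.
def psnr_db(mse, max_val=255):
    return 10 * math.log10(max_val ** 2 / mse)

print(f"MSE 40 -> {psnr_db(40):.1f} dB")
print(f"MSE 10 -> {psnr_db(10):.1f} dB  (lower error, higher PSNR)")
```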

While YouTube's preference is VP9, [YouTube's Francisco] Varela left open the possibility that the site might use HEVC in the future. "We are not announcing that we will not support HEVC," said Varela, adding that YouTube supports 17 different codecs currently.

According to YouTube, the first partner TVs and other devices that incorporate VP9 will start hitting the market in 2015. In 2014, YouTube will start transcoding HD video into VP9.

I am not convinced that the transcode to YouTube will be enough to derail HEVC.

On May 9, 2013, NHK and Mitsubishi Electric announced that they had jointly developed the first HEVC encoder for 8K Ultra HD TV, which is also called Super Hi-Vision (SHV). The HEVC encoder supports the Main 10 profile at Level 6.1 allowing it to encode 10-bit video with a resolution of 7680x4320 at 60 fps.

On October 16, 2013, the OpenHEVC decoder was added to FFmpeg.

On October 29, 2013, Elemental Technologies announced support for real-time 4K HEVC video processing. Elemental provided live video streaming of the 2013 Osaka Marathon on October 27, 2013, in a workflow designed by K-Opticom, a telecommunications operator in Japan. Live coverage of the race in 4K HEVC was available to viewers at the International Exhibition Center in Osaka. This transmission of 4K HEVC video in real-time was an industry-first.

On November 14, 2013, DivX developers released information on HEVC decoding performance using an Intel i7 CPU at 3.5 GHz which had 4 cores and 8 threads. The DivX 10.1 Beta decoder was capable of 210.9 fps at 720p, 101.5 fps at 1080p, and 29.6 fps at 4K.

An inbuilt HEVC decoder is not entirely new of course, as LG's LA970 series of UHDTVs released last year also offered the same feature. However, the company's latest 4K Ultra HD TVs due to be unveiled at CES 2014 will use a ViXS XCode 6400 SoC (system on chip) that can decode HEVC-based content at 3840x2160 resolution with support for 60p frame rate and 10-bit colour depth, a world's first.
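A quick way to read the DivX decoder figures above is as pixel throughput, which is roughly what a CPU-bound software decoder should be limited by (the frame sizes are assumed to be the standard 720p/1080p/2160p resolutions):

```python
# Pixels decoded per second implied by the DivX 10.1 beta benchmarks.
benchmarks = [
    (1280, 720, 210.9),   # 720p
    (1920, 1080, 101.5),  # 1080p
    (3840, 2160, 29.6),   # 4K
]
for width, height, fps in benchmarks:
    mpix_per_s = width * height * fps / 1e6
    print(f"{width}x{height}: {mpix_per_s:.0f} Mpixel/s")
```

Throughput sits in the same ~200-250 Mpixel/s ballpark across resolutions, which is consistent with the decoder being compute-bound rather than hitting some 4K-specific wall.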

Google has also learned the hardware partnership game and has already roped in hardware partners to use and showcase VP9 at CES. According to reports LG (the latest Nexus maker), Panasonic and Sony will be demonstrating 4K YouTube using VP9 at the event.

I work in film post-production in Hollywood and I'm not sure we've had any consultations on VP9, MPEG always gets SMPTE and the ASC involved in screenings and quality shootouts. Of course Google might just be trying to buffalo filmmakers, which would be nothing new, I suppose. "Content providers," as a term, rarely describes the people working the camera or doing the color (let alone syncing the sound). If you're a professional the licensing of the codec is completely irrelevant, it's a poor economy if the quality is even remotely compromised.

Panasonic and Sony were demonstrating Google TV STBs a few years ago, and we all know how that turned out. It's basically no-cost for these shops to turn out this gear for whatever marketing event Google cares to throw. What you want to hear is Sony Consumer Electronics saying they won't support the next MPEG standard, or Sony Pictures Entertainment announcing they'll standardize their delivery format on VP9. SPE is one of my employers, and the codecs that, say, Crackle.com uses are decided by a group of people completely independent of the consumer electronics folks; Crackle will support whatever codec is optimal on the target STB/mobile/desktop platform.

Why would a provider want to go single-track with a codec which is "Open" in the way Android is, which is to say, you can download the source code, but the reference implementation that's distributed to millions of clients is controlled by a single vendor?

And this is a problem how? It's called competition, you know, a free market. And guess what, I am mucho happier that Google is competing without clubbing everybody over the head with patents. You know, like that Fruit company?

Which part of free licensing didn't you get? Vimeo, College Humour, Ustream, etc. are all free to implement it whenever they want. About the only thing stopping them at this point is bandwidth, and it's about fucking time something came out that would put pressure on the ISPs to build proper pipes.

You people are sick. Google spends hundreds of millions of dollars to give us a free and open universal video codec, to crush the MPEG-LA monopoly that actually sued ordinary people for posting their own cat videos over patent licensing, and improves it enough that it can do 4K video even over LTE, and actually gathers enough industry support to ensure universal hardware adoption. And you have to bitch about it like you have ever in your life achieved something so globally useful and helpful and difficult.

Nobody in the late nineties shot on HD cams; they all used film for cinema and SD for TV.

Many TV shows also shot on film: Star Trek TNG and DS9 were shot on 16mm film (Super 16 aperture), and several prime time sitcoms and dramatic shows shot on 35 mm -- Frasier, Law and Order, and this had been a standard practice for high-quality pre-recorded content since the 70s.

Phantom Menace (1999) shot one scene on a high-def camera -- the midichlorians scene, God help us all. That was the first theatrically-distributed footage shot on high-def.

I've read that Theora, based on On2's VP3, is roughly equivalent in rate/distortion to MPEG-4 ASP video (DivX, Xvid), which itself is primarily tweaks to H.263 (Sorenson video, used by early FLV). When Google added a free format to YouTube, it skipped Theora because Theora wasn't competitive in rate/distortion terms to what was already in use. H.264 and VP8 are a generation past that.

Netflix has tossed their hat in the 4K ring with the announcement of 4K streaming starting next month.

The jump from streaming 1920x1080 to 3840x2160 is not something that can be done by just flipping a switch. First of all, viewers need a 4K TV, which practically no one has yet. PCMag's Chloe Albanesius has reported that Netflix's 4K content will require "somewhere between 12 and 15 Mbps" to stream properly. That's a pretty serious connection which, again, not many people have.

By using H.265 HEVC (High Efficiency Video Coding) moving forward instead of the currently popular AVC H.264, Netflix thinks they will be able to stream the same quality they currently transmit at half the bitrate. Not only does this mean there's room for higher quality 4K streams, but the current HD content will be transmitted more efficiently.

It's unclear whether we'll see 4K streaming in standalone set-top boxes any time soon, or whether it will require new hardware to handle the increased resolution, but for now it looks like the TV itself is the home for 4K streaming.
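In data-cap terms, the quoted 12-15 Mbps works out to roughly the following per movie (the two-hour runtime is an arbitrary example, not a Netflix figure):

```python
# Data consumed streaming at Netflix's quoted 4K bitrates.
runtime_hours = 2
for mbps in (12, 15):
    gigabytes = mbps * 1e6 * runtime_hours * 3600 / 8 / 1e9
    print(f"{mbps} Mb/s for {runtime_hours} h: {gigabytes:.1f} GB")
```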

http://en.wikipedia.org/wiki/Dynamic_Adaptive_Streaming_over_HTTP [wikipedia.org]
YouTube has split up all video/audio over 720p into separate streams, which makes downloading much more difficult. Some downloaders use ffmpeg to mux the streams back together, but other than that, you're SOL for downloading anything better than 720p MP4.

Consoles prior to the Dreamcast output 240p, except for a few PS1 and Nintendo 64 games that output 480i. The Dreamcast, PS2, GameCube, original Xbox, and original Wii output 480i (or 480p if you were lucky). How would requiring 720p improve reviews of games for those platforms?

Google can't even serve YouTube using codecs currently supported by its own browser. There is a mode to switch to HTML5 video, but some videos require Flash anyway. Besides, Google does not indemnify potential users of VP9 against patent infringement, so it's only slightly better than just grabbing an H.265 codec and linking against it without first obtaining a license. It's a patent minefield of epic proportions, and if you're going to pay for patents, you might as well get the real deal.

When will I actually be able to watch a movie without stutter every 20 seconds because their buffering is shot to hell and back? And don't think that you can simply pause and let it load, because it simply friggin' doesn't! Instead, if you dare to pause the movie for more than a few minutes it will most likely not play at all anymore.

Quite frankly, the recent year has seen the worst decline in quality YouTube has ever had. Now, I don't care too much about their "tie YouTube to your G+ account" spiel; I never cared for G+ anyway.

I, too, have been having a terrible experience with YouTube, with it oftentimes just freezing at the halfway or two-thirds mark. It also seems that when you select a specific resolution, they take it as merely a suggestion. As you noticed, you can no longer let it buffer; they try to do some type of adaptive resolution switching, which more often than not results in the stream dropping down to 240p just because I chose to skip ahead. Very annoying.

Yes, but can you *see* 1920x1200 on a 7" tablet? I happen to (still) have 20/12 vision, and I can't. It's like hearing the difference between 320 kbps MP3s and uncompressed rips. Or, more accurately, 48 kHz vs 96 kHz music. Unless you are at the very tippy top of the population, you can't do it in the best conditions, and if you take your laptop anywhere but the perfect room with perfect lighting, you can't tell even if you have the rare physical senses to do so. I have 2880 res on my 15" laptop; I run at 150% scaling…

Well, I have to agree with that, but look at the presentation from Google I/O last May [google.com] and you'll see that VP9 uses less bandwidth for a given quality, based on the demonstrations. That's the main reason for switching to it, plus it does deliver some great video. So if VP9 plays out, that means less bandwidth than competitive codecs. Unfortunately VP9, to me at least, is half baked, because I've been watching the project since first seeing the I/O presentation and I have to say that it encodes very slowly.