I think if you aren't OK with frequent upgrades, you need to watch the tech and pick your spots. I bought an 8G Kuro when I felt the tech was solid and 1080p was done right. I never cared about 3D or 4K, and OLED felt immature, so I didn't jump back in until the 2016 OLED displays. It was the lure of even better blacks plus HDR that got me, and I'm OK with the current state of OLED and LG's processing for the $1800 I paid. In a perfect world perhaps I would have been better off with Sony's processing married to LG's panel, but I didn't have a crystal ball and my Kuro was failing. I will probably wind up with an outboard video processor at some point.

I don't get in often either. Before the Kuro, the last display I bought for critical viewing was a Pioneer Elite PRO-720 HD. That's an old rear-projection display that was amazing in its day (so good that I still use it). The whole thing is a reminder of how great we have it today. I think I paid roughly $7000+ (2002), $3500 (2008), and $1800 (2016) for these, in order of purchase. The RP set was larger (64" vs 55"), but it shows how pricing has come down while the technology has gotten so much better.

Member

My first HD display was a Panasonic 42PX50U plasma that I bought in 2005 (still have it in my bedroom). Got seriously into home theater after that so I couldn't pass up buying the Pioneer 6020 in 2008 that I'm still using today. The LG OLEDs are the first displays that have seriously caught my eye since then. I was stunned the first time I laid my eyes on an E6.

Member

Upgrading from some random 40" to a 65" KS8000 that was $1,079.99. Unfortunately Samsung.com sucks: despite the vast majority of people ordering within the same 4-day period, some have already received theirs while others like me are still stuck on "preparing order for shipment", so that's neat.

As soon as it gets here, I've gotta swing by like Best Buy or something and pick up a TV stand as well. Living room's probably far too small for this TV too.

Member

I don't think it matters too much from a gaming perspective, as HDR10 will get scene-by-scene and per-frame metadata with the next HDMI revision, which will push Dolby Vision out of the picture as an expensive option.

Banned

Why is the "warm2" option always the go-to when calibrating the OLEDs? For some reason I can't stand watching it on that setting; it just looks like a really dim yellowish filter. Am I the only one who uses the "cool" setting? Looks much more vibrant to me.

People are used to default settings that are way too cool. On most TVs, Warm or Warm 2 most closely matches the D6500 calibration standard for movies, which is what your own color temperature goal should be. Once you have watched a properly calibrated TV for a while, other TVs will look too blue and metallic in color, because they are.

Member

LG used a different technology with OLED beginning with the 2016 sets. On older OLED panels, the blue subpixels would degrade faster; their 2015 sets used that type.

In 2016, they shifted to all-white subpixels with colored filters, which makes color shift and degradation nearly impossible before 100,000 hours, and they used new tools to make retention and burn-in things of the past.

My C6 is completely perfect: no mura effect, vignetting, streaks, or retention. When the screen is displaying black, I literally cannot tell the TV is on, even in a pitch-dark room.

I think you are wasting your time with this. People still believe that the limitations of OLED from ten years ago apply to the 2016 range, for whatever reason. These sets are not perfect, but some of the ignorance surrounding the current tech astounds me.

Member

The real world contains an even wider range of lighting conditions. Does that make reality "not a good thing?" The goal with HDR is to be able to reproduce everything from subtle shadow detail to uncomfortably bright content to give content producers the complete range to draw from.

I find the experience of allowing my eyes to adjust to different content ranges more compelling than the approximation used by games and movies wherein the camera, real or virtual, makes these adjustments for me.

Member

LG used a different technology with OLED beginning with the 2016 sets. On older OLED panels, the blue subpixels would degrade faster; their 2015 sets used that type.

In 2016, they shifted to all-white subpixels with colored filters, which makes color shift and degradation nearly impossible before 100,000 hours, and they used new tools to make retention and burn-in things of the past.

All of their OLEDs have been white with color filters.
The difference is that it was previously an RGB stack while it is now a Blue+Yellow stack in the 2016 models.
People are still reporting image retention on 2016 models, particularly with HDR gaming, and it is most noticeable with bright yellow objects.
An example where this is a problem would be The Witness.

They still burn just like any other plasma.
The difference with Pioneer plasmas is that it is most obvious on black scenes instead of mid-tone grays/blues.
All plasma displays are susceptible to image retention/burn-in.
Not all plasma owners are bothered by or aware of it when it happens.
The main variable is the viewer, not the displays.

None of these models are true HDR displays.
They are really just "HDR compatible" displays.
They will accept and display an HDR signal, but their peak brightness is too low.

LG UH6150: 183 nits
Sony X800D: 375 nits
Samsung KU6300: 425 nits

A true HDR display should be capable of at least 1000 nits peak brightness and a sustained brightness of at least 400 nits for a 100% window.

They should probably start rating HDR displays in "stops" to make this easier for consumers to understand, as nits do not translate well to real-world performance.
A "stop" is a doubling or halving of brightness.
So a 400-nit "HDR" display is only 2 stops brighter than an SDR display (100 nits).
3 stops would be 800 nits. 4 stops would be 1600 nits and so on.

The current best displays from Sony and Samsung are roughly 4 stops brighter than SDR.
Most other displays are 3 stops or less.
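The stops arithmetic above is just a base-2 logarithm of the ratio to SDR reference white. A quick sketch, using the 100-nit SDR baseline and the peak-brightness figures quoted above:

```python
import math

SDR_REFERENCE_NITS = 100  # SDR reference white, the baseline used above

def stops_above_sdr(peak_nits):
    """Photographic stops above SDR; each stop is a doubling of luminance."""
    return math.log2(peak_nits / SDR_REFERENCE_NITS)

# Peak-brightness figures quoted earlier in the thread:
for model, nits in [("LG UH6150", 183), ("Sony X800D", 375),
                    ("Samsung KU6300", 425), ("1000-nit target", 1000)]:
    print(f"{model}: {stops_above_sdr(nits):.1f} stops above SDR")
```

Running the numbers, the three "HDR compatible" sets land between roughly 0.9 and 2.1 stops above SDR, and even the 1000-nit target is only about 3.3 stops up, which lines up with the claim that today's best sets manage roughly 4.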

People are used to default settings that are way too cool. On most TVs, Warm or Warm 2 most closely matches the D6500 calibration standard for movies, which is what your own color temperature goal should be. Once you have watched a properly calibrated TV for a while, other TVs will look too blue and metallic in color, because they are.

The problem with TV presets is that Warm 2 is often 5500-6000K while Warm 1 is often 7000-7500K.
When that is the case, the recommendation should be Warm 1 not Warm 2.
Being slightly cooler than D65 is more acceptable to most people than being warmer than D65.

Of course some displays are accurately calibrated now, so that Warm 2 actually does measure D65. You really need to check reviews.
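The "which preset to recommend" question above is really just a distance-to-D65 comparison. A tiny sketch, with hypothetical preset measurements invented to match the ranges mentioned in the post (real values vary per set, which is why you check reviews):

```python
# CIE D65 white point, approximately 6504 K.
D65_K = 6504

# Hypothetical measured color temperatures for one TV's presets
# (illustrative numbers only, in the ranges described above).
presets = {"Cool": 9300, "Standard": 8000, "Warm 1": 7200, "Warm 2": 5800}

# Recommend the preset whose measured temperature is closest to D65.
best = min(presets, key=lambda name: abs(presets[name] - D65_K))
print(best)  # with these sample numbers, Warm 1 wins by a hair
```

With these made-up figures Warm 1 edges out Warm 2 by only 8 K, illustrating the point that when Warm 2 measures well below D65, the slightly-too-cool preset is the better recommendation.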

Member

The problem with TV presets is that Warm 2 is often 5500-6000K while Warm 1 is often 7000-7500K.
When that is the case, the recommendation should be Warm 1 not Warm 2.
Being slightly cooler than D65 is more acceptable to most people than being warmer than D65.

Of course some displays are accurately calibrated now, so that Warm 2 actually does measure D65. You really need to check reviews.

Member

I've always complained about the Warm 2 color setting, but ever since I got my KS8000 I'm only using that. It actually DOES make things have more accurate colors. Cold is ridiculously blue and even Standard is too blueish now... haha

Member

Why is the "warm2" option always the go-to when calibrating the OLEDs? For some reason I can't stand watching it on that setting; it just looks like a really dim yellowish filter. Am I the only one who uses the "cool" setting? Looks much more vibrant to me.

Banned

I've always complained about the Warm 2 color setting, but ever since I got my KS8000 I'm only using that. It actually DOES make things have more accurate colors. Cold is ridiculously blue and even Standard is too blueish now... haha

Member

Why is the "warm2" option always the go-to when calibrating the OLEDs? For some reason I can't stand watching it on that setting; it just looks like a really dim yellowish filter. Am I the only one who uses the "cool" setting? Looks much more vibrant to me.

Warm 2 sucks. Cool 4 life 😎. I don't care what the "pros" say, I refuse to use the piss filter on my TV. That isn't to say the power of calibrating isn't real, it is, but use what looks good to you, it's personal.

Calibration is not overly difficult. If you spend $1k on a TV, for ~10% additional cost you can purchase the equipment. The software can be free, and the knowledge is well documented. You certainly do not end up with a piss filter, but a balanced, accurate image that is pleasing to the eye.

Member

Calibration is not overly difficult. If you spend $1k on a TV, for ~10% additional cost you can purchase the equipment. The software can be free, and the knowledge is well documented. You certainly do not end up with a piss filter, but a balanced, accurate image that is pleasing to the eye.

Member

Below is all you need to calibrate your set to a very high level (color temperature, luminance, black level, white level, grayscale, and color brightness and saturation) at the very lowest price: just the equipment. You will need to spend a bit of time in 'study mode', but you don't need a degree for this.
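For a sense of what the 'study mode' covers: the grayscale part mostly means reading a few stimulus windows with the meter and nudging the RGB gain/cut controls until all three channels sit at the D65 target. A minimal sketch of the check, with invented meter readings for illustration:

```python
# Grayscale-balance check: after normalization, each channel should read
# 1.0 at every stimulus level if the set tracks D65 perfectly.
# Measurement values below are made up for illustration.
measurements = {  # stimulus % -> normalized (R, G, B) meter readings
    20: (0.97, 1.00, 1.06),
    50: (0.99, 1.00, 1.03),
    80: (1.00, 1.00, 1.01),
}

for stim, rgb in sorted(measurements.items()):
    # Worst channel deviation from target, as a percentage.
    err = max(abs(c - 1.0) for c in rgb) * 100
    print(f"{stim}% window: max channel error {err:.0f}%")
```

In practice the calibration software does this bookkeeping for you; the point is just that the "hard part" is iterating reads and adjustments, not the math.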

Do these devices still have a limited lifespan? I bought the old Eye-One for my plasma, used it once, then left it sitting in a box for a couple of years; when I went to use it again, it was wrecked. I didn't realise they deteriorated even if you don't use them!

Member

Do these devices still have a limited lifespan? I bought the old Eye-One for my plasma, used it once, then left it sitting in a box for a couple of years; when I went to use it again, it was wrecked. I didn't realise they deteriorated even if you don't use them!

From what I understand, no, as the filters are sealed to prevent this. If you are doing this professionally or semi-professionally, I would send it to Calman for certification every few years, but for the average user, nah.

From CurtPalme:

"The X-Rite Display 3 (the official name is 'i1 Display PRO III') is the newest meter from X-Rite, first introduced on June 21, 2011 where it surprised consumers and professional calibrators alike with its high performance at a low price. The Display 3 reads about three times faster and to lower light levels than the Chroma 5, and costs less! In fact, the read speed above 10 cd/m2 is similar to the approximately 10 times more expensive Hubble!

The main advantage of the Display 3 is that it reads much more accurately, consistently, and lower (closer to black) than the lower-end meters. It even reads better than some of the other higher-end, more expensive meters. For example, the more expensive Chroma 5 is rated to read down to 0.01 cd/m2 light output while the Display 3 is able to read down to 0.003 cd/m2 (0.001 fL), or about one-third as bright. This is due to the Display 3 lens system, which is able to capture and focus more light onto the filter diodes, providing much better low-light sensitivity and much faster readings (similar to the Hubble).

Colour accuracy is also excellent: The stock Display 3 shows errors no higher than xy0.006 for both color and white relative to a $10,000 reference spectroradiometer. This is performance comparable to a stock Chroma 5.

The Display 3 does not require temperature compensation as it is a non-contact meter (like the Hubble) so the temperature of the display does not come into play. It does however come with a counterweight so that it can indeed be used in contact mode if desired or when ambient light cannot be controlled easily.

The non-contact design means that the readings will not be affected by the heat a display might emit (older plasmas are especially prone to this), nor is there any danger of damaging the delicate surface of a flat panel. There is no laser aiming device built in to the unit however, such as with the Hubble. We're sure some intrepid DIY'ers will manage to somehow attach a small laser pointer of some sort.

Because the filters on the Display 3 are installed in a sealed environment, they are not subject to the same type of degradation in performance over time that is typical of less expensive contact meters. This means that the Display 3 needs to be recalibrated less often than a typical colorimeter.

The Display 3 is also one of the easiest devices to use because, like the Chroma 5, it requires no dark reading calibration. Some meters require that they be covered up and a reading taken either when they're first connected or every 10-20 minutes as you calibrate. It only takes a second but it annoys a lot of people! For comparison's sake, the Display 2/LT, EyeOne Pro / EyeOne Pro 2, and Hubble require a dark reading when the meter is initially connected. The EyeOne Pro / EyeOne Pro 2 and the Hubble then also require periodic dark readings throughout the calibration session.

A tripod mount is not required with the Display 3 as one is built in, which helps reduce upfront costs."

MrArseFace

Why is the "warm2" option always the go-to when calibrating the OLEDs? For some reason I can't stand watching it on that setting; it just looks like a really dim yellowish filter. Am I the only one who uses the "cool" setting? Looks much more vibrant to me.

It's usually a more accurate base to start tweaking from. You don't like it because you aren't used to it. Likewise, if you calibrate properly, the image will probably look flat and dim compared to what you like. Best to give it a couple of days to bed in. You'll adjust to it, and then when switching back you'll think 'god, this looks garish and horrible' (well, hopefully).

I think on my Sony (55XD930) the most accurate baseline is Expert 1/2. I haven't done a calibration yet, just copied some basic settings from elsewhere, and I'm adjusting the backlight to a comfortable level (it seems on this set you can just leave contrast on max and it doesn't crush whites, and colours are pretty good out of the box, so that makes it a bit easier).

Sailor Stevenson

Warm 2 sucks. Cool 4 life 😎. I don't care what the "pros" say, I refuse to use the piss filter on my TV. That isn't to say the power of calibrating isn't real, it is, but use what looks good to you, it's personal.

Banned

LG used a different technology with OLED beginning with the 2016 sets. On older OLED panels, the blue subpixels would degrade faster; their 2015 sets used that type.

In 2016, they shifted to all-white subpixels with colored filters, which makes color shift and degradation nearly impossible before 100,000 hours, and they used new tools to make retention and burn-in things of the past.

My C6 is completely perfect: no mura effect, vignetting, streaks, or retention. When the screen is displaying black, I literally cannot tell the TV is on, even in a pitch-dark room.

I'm aware that they switched to white OLEDs with color filters, but I haven't been paying attention to reports of pixel lifetime and wear since then. I do recall that this had an effect on the color gamut of the panels, so I'm not sure how the 2016 OLEDs stand in terms of gamut compared to DCI P3.

I'm wondering if Samsung ever plans to re-enter the market; they also pulled out because of the issue with the blue pixel degrading much faster than the red and green ones. They are still the world's only supplier of Super AMOLED panels for phones. The panels on phones of course get noticeable burn-in after only a year of typical use, but this hardly matters for phone displays.

Either way, I pissed away enough money on that Panasonic plasma. Once bitten, twice shy as they say. There are many well-understood limitations to LCD that everyone has beaten to death a million times but at least I can display my gaming PC and play all my games on it and look at game HUDs and Windows desktop for hundreds of hours and the LCD will not burn-in. I'll put up with all the other stuff as long as normal usage of my 4K LCD TV doesn't destroy it.

Member

I recommend anyone looking to get a 4K TV by Christmas to just hold out a little longer and see what CES has to offer in January. Newer and better TVs of all price ranges should be available in the next 3-6 months.

Warm 2 sucks. Cool 4 life 😎. I don't care what the "pros" say, I refuse to use the piss filter on my TV. That isn't to say the power of calibrating isn't real, it is, but use what looks good to you, it's personal.