“If we just start with What’s in a dwarf’s head?, there’s 50-something personality facets we pulled from a psychology textbook, 30 intellectual values, a bunch of specific needs they have, like, if they like extravagant clothing,” Tarn says, “And then, based on all of that, there’s 120 different emotions.” It’s a remarkable amount of depth for a dwarf, a creature that, in the game, is represented by nothing more than a little smiley face.

And blogging about it. Check it out at CRPGAddict.blogspot.com. He's basing this on Wikipedia's list of RPGs, and his Google spreadsheet can be found here. He also seems committed: he's currently around 1990, tackling things like Dungeon of Nadrod and Magic Candle (huh?). This is truly a life's work if ever there was such a thing.

That said, I just received my Freewrite the other day, and while some people are panning it as a $500 hipster keyboard, hipsters are waxing effusive over the product's anachronistic simplicity. I think it falls somewhere between those poles, and I hope to write a review soon. Whatever your opinion, while the Freewrite was delayed, it did come to life as a fairly accurate representation of its initial vision, so I'd call that a definite success.

So 4K is all the rage, especially with the new PS4K coming (which, horn-toot, I totally called). But 4K apparently isn't enough; we also need to ensure our new sets have HDR. This sounds like total platitudinal synergistic marketing-speak. But is it?

To my uneducated brain, the way we present HDR video on LCD screens sounds like a combination of software emulation and brighter backlighting (to allow for greater contrast), rather than a truly new display technology. Of course, emulation can be done a ton of different ways, which naturally leads to standards issues. Wikipedia tells me that on August 27th of 2015 a standard was adopted, but that's not quite right: there are actually quite a few standards, including "Dolby Vision" (which is proprietary) and "HDR10" (which is open). They'll probably, eventually, both be adopted by TV makers.

OLED screens may help here, since they can rely on true blacks to obtain high contrast, but perhaps they can't get as bright as an LED-backlit LCD screen? I hadn't found an answer to this, though I did know that OLED HDR screens are already here. Then I found the answer in the Wired article, so I'll just quote it here:

For an LCD, a qualifying TV must have a peak brightness level higher than 1,000 nits and a black level less than 0.05 nits. For an OLED to qualify, it must have a peak brightness of at least 540 nits (remember, OLEDs cannot get super bright) and a black level less than 0.0005 nits (remember, OLEDs can get super dark).
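Those thresholds imply wildly different contrast ratios for the two display types. Here's my own back-of-the-envelope arithmetic (just peak brightness divided by black level, not an official spec calculation):

```python
# Rough contrast ratios implied by the quoted HDR qualifying thresholds.
# My own arithmetic sketch, not an official certification formula.

def contrast_ratio(peak_nits, black_nits):
    """Simple peak-to-black luminance ratio."""
    return peak_nits / black_nits

lcd_ratio = contrast_ratio(1000, 0.05)     # LCD: 1,000-nit peak, 0.05-nit black
oled_ratio = contrast_ratio(540, 0.0005)   # OLED: 540-nit peak, 0.0005-nit black

print(f"LCD:  {lcd_ratio:,.0f}:1")   # 20,000:1
print(f"OLED: {oled_ratio:,.0f}:1")  # 1,080,000:1
```

So even with roughly half the peak brightness, an OLED's near-perfect blacks give it a contrast ratio about fifty times higher than the LCD threshold.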

One more aspect of HDR we should call out: the color bit depth is increasing from the old standard of 8 bits per channel to 10 or 12 bits. This increases the number of colors a screen can display. I'm guessing this will make a true technological difference.
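To put numbers on that jump: each channel (red, green, blue) gets 2^bits levels, and the total palette is that count cubed. A quick sketch of the arithmetic:

```python
# Number of representable colors at a given per-channel bit depth.

def color_count(bits_per_channel):
    levels = 2 ** bits_per_channel   # shades available per R/G/B channel
    return levels ** 3               # every RGB combination

for bits in (8, 10, 12):
    print(f"{bits}-bit: {color_count(bits):,} colors")
# 8-bit:  16,777,216 colors
# 10-bit: 1,073,741,824 colors
# 12-bit: 68,719,476,736 colors
```

In other words, going from 8-bit to 10-bit is a 64x jump, from about 16.7 million colors to over a billion, which is mainly what kills visible banding in smooth gradients.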

So what's the verdict? Is HDR a marketing gimmick or actual new technology? Per the former, of course it's marketing; per the latter, I'd say it's somewhere between "sort of" and "yes". New, better technology lets us capture higher-luminosity video, and new technology increases our displayable color range dramatically, but we're still emulating the luminosity on the TV. I'm guessing HDR video will look much better, but also, I do not think HDR means what TV marketers think it means.