DRM is good, blocking hardware bad

There’s no denying that DRM is good and needed. But how did it become a cash cow for Microsoft and Intel?

Digital rights management (DRM) is a generic term for technologies that control one’s access to content. After years of hard fighting, it looks like the pirate side has won. They have the content and they have the means to share it. Something needs to be done so that the content creators can put bread on the table, and it looks like Microsoft and Intel (and AMD) have it figured out.

They have decided that there will be a hardware and software wall between us, the viewers, and the content. Take Netflix 4K as an example: to watch 4K video from Netflix on a PC, you need Windows 10, the Edge browser, and a 7th-generation Intel Kaby Lake CPU. Absolutely ridiculous.

They claim that it’s because of the H.265 10-bit hardware decoder built into the Kaby Lake processors. Skylake only has an 8-bit hardware decoder, which supposedly lacks the performance. That’s horse s**t. Why does Skylake have it at all, then, if it can’t be used?

My PC runs games at 4K 60 fps. I have a 4K display and a GTX 1080 with an H.265 10-bit decoder, so why isn’t that compatible? I’ll tell you why.

Money.

Monopoly companies like Intel and Microsoft are just cashing in without thinking about the future. Peeing into the customer’s morning cereal is never a good thing.

Intel even changed the chipset between Skylake and Kaby Lake to force customers to buy the new Z270 motherboards and, along the way, a new Windows 10 licence from Microsoft. Luckily, most Z170 boards can be updated to support Kaby Lake.

That still leaves the cost of the CPU. The Kaby Lake chips have a better thermal solution than Skylake, and because of that they can clock higher. That’s the whole difference. Why would anyone with an Ivy Bridge or newer upgrade? The performance gains are minimal, and the resale value of the old chip has collapsed.

Why didn’t Intel just draw the line between Sandy Bridge and Ivy Bridge? They say it’s the hardware decoder. Then why is there a software wall with Windows 10 and Edge as well? Why not just have Windows check whether you have an Ivy Bridge or newer? Why can’t Windows simply verify that the user isn’t running any recording software and just show the content?
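To illustrate how trivial a generation check like that would be, here is a hypothetical Python sketch. It assumes only that Intel Core brand strings follow the usual convention where the leading digits of the model number encode the generation (e.g. i7-7700K is 7th generation, i7-6700K is 6th); the function names are made up for this example.

```python
import re
from typing import Optional

def intel_core_generation(brand: str) -> Optional[int]:
    """Extract the CPU generation from an Intel Core brand string.

    For model numbers like i7-7700K, the digits before the last three
    encode the generation (7700 -> 7th gen, 10700 -> 10th gen).
    Returns None for strings that don't match the Core i3/i5/i7 pattern.
    """
    match = re.search(r"i[357]-(\d{4,5})", brand)
    if not match:
        return None
    model = match.group(1)
    return int(model[:-3])  # strip the last three digits, keep the generation

def is_kaby_lake_or_newer(brand: str) -> bool:
    """True if the brand string indicates a 7th-generation (Kaby Lake) or newer Core CPU."""
    gen = intel_core_generation(brand)
    return gen is not None and gen >= 7
```

A check like this (or its equivalent against the CPUID vendor and family fields) is a few lines of code, which is the author's point: if the decoder really were the issue, software could enforce it without a whole browser-and-OS wall.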

Microsoft’s and Intel’s answer is hardware: a hardware decoder that decodes the encrypted stream so that the content can’t be captured. And how does this prevent pirates from capturing the content from the video cable? And why is the Edge browser needed, then?

Money is the real answer.

Taking a commonly known issue and turning it into short-term profit: that’s what Microsoft and Intel are doing here.

Don’t forget that the same restrictions apply to UHD Blu-ray, the very discs that the new version of the Xbox One can play with an ancient AMD CPU. It’s not a hardware problem; it’s a problem of selfishness.