The standards of tomorrow: What are HDR and HDR+?

When we buy a TV set or a monitor today, we are bombarded with the wildest terms and the latest technologies, which the average buyer has heard of only in passing at best. HLG, HDR, HDR+, Dolby Vision: these are the standards of today and tomorrow that promise the most realistic images we have ever seen. They all define so-called contrast or dynamic ranges within which images and videos are displayed. It is therefore not a matter of resolution, but of the range of the colour space: the wider it is, the richer in contrast and the more appealing the image.

From SDR to HDR – more contrast, more depth

Until now, classic TVs have usually displayed content in SDR (standard dynamic range), which provides a contrast range of 6 to 8 bits, i.e. a maximum of 256 brightness levels per colour channel. For some years now, more precisely since CES 2015, HDR has been on everyone’s lips. HDR stands for high dynamic range and uses a colour depth of 10 bits, which is why it is also called HDR10.
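The relationship between bit depth and brightness levels mentioned above is simple arithmetic: each channel can take 2 to the power of the bit depth distinct values. A minimal illustration in Python:

```python
# Illustrative calculation (not from the article): the number of
# distinct brightness levels per colour channel is 2 ** bit_depth.
def brightness_levels(bit_depth: int) -> int:
    return 2 ** bit_depth

for name, bits in [("SDR (8-bit)", 8), ("HDR10 (10-bit)", 10)]:
    print(f"{name}: {brightness_levels(bits)} levels per channel")
# SDR (8-bit): 256 levels per channel
# HDR10 (10-bit): 1024 levels per channel
```

So the jump from 8 to 10 bits quadruples the number of brightness steps per channel, which is where the finer gradations of HDR come from.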

As an open and licence-free standard, HDR has already established itself in many areas, especially on streaming and video portals such as Amazon Prime Video, Netflix and YouTube. When buying a television, however, you should check whether the device actually supports the HDR10 standard or is merely marketed as “HDR-compatible”. The latter is often a sales trick used with older or cheaper devices that have only an 8-bit panel instead of a 10-bit one; they cannot achieve the quality of a real HDR device.

However, the technology has one big disadvantage: it only works with static metadata. To explain: each video sends metadata to the TV set or monitor on which it is played in order to give it the optimal display values. If this metadata is static, it applies equally to all scenes in the video, so no distinction is made between darker and lighter scenes. HDR+ was created to overcome this disadvantage.
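The difference between static and dynamic metadata can be sketched with a small, purely hypothetical data structure (the names and nit values below are illustrative, not an actual HDR API): static metadata supplies one set of display values for the whole video, while dynamic metadata can supply values per scene.

```python
from dataclasses import dataclass

@dataclass
class DisplayValues:
    max_nits: int  # peak brightness the content was mastered for
    avg_nits: int  # average brightness of the content

# Static metadata (HDR10 style): one entry covers every scene.
static_metadata = DisplayValues(max_nits=1000, avg_nits=400)

# Dynamic metadata (HDR10+ style): one entry per scene.
dynamic_metadata = {
    "dark_cellar_scene": DisplayValues(max_nits=200, avg_nits=50),
    "sunny_beach_scene": DisplayValues(max_nits=1000, avg_nits=600),
}

def values_for(scene: str, dynamic: bool) -> DisplayValues:
    # With static metadata, the dark cellar gets the same display
    # values as the sunny beach; with dynamic metadata it does not.
    return dynamic_metadata[scene] if dynamic else static_metadata
```

With static metadata, a dark cellar scene is treated exactly like a sunny beach scene; with dynamic metadata, the display can adapt to each.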

HDR+ is one of the high-end standards of the future

HDR+, also known as HDR10+, relies on dynamic metadata and thus delivers even higher image quality. In this way, the colour space can be adjusted optimally to the brightness of individual scenes, even from frame to frame. The following picture shows a comparison between the two technologies:

Difference between HDR and HDR+ / Source: Panasonic

HDR+ certainly has potential for the future, but at the moment it is hard to find. The technology, developed by Samsung, was first announced in 2017 and is so far available only on some Blu-rays and Amazon Video content. However, since it is also royalty-free and is already supported by film studios such as Warner Bros. and 20th Century Fox, it is likely to play a significant role in the near future. This could put it one step ahead of its biggest competitor, Dolby Vision.

Dolby Vision also works frame by frame and even boasts an impressive colour depth of 12 bits (that’s 4096 brightness levels per channel!), which makes it the best-performing technology available today. So why hasn’t Dolby Vision been able to establish itself so far? Many see two main reasons: first, the licence costs for using Dolby Vision are extremely high; second, there are currently virtually no end devices with 12-bit panels. It will be exciting to see how the two high-end standards develop over the next few years. Until then, HDR will probably keep us busy enough.
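The scale of the 12-bit advantage becomes clearer when you cube the per-channel levels to get the total number of displayable colours (one factor each for the R, G and B channels). A quick illustrative calculation, not taken from the article:

```python
# Per-channel brightness levels are 2 ** bits; the total number of
# displayable colours is that value cubed (R x G x B).
def levels(bits: int) -> int:
    return 2 ** bits

def total_colours(bits: int) -> int:
    return levels(bits) ** 3

print(levels(12))         # 4096 brightness levels per channel
print(total_colours(10))  # 1073741824 (~1.07 billion colours)
print(total_colours(12))  # 68719476736 (~68.7 billion colours)
```

In other words, two extra bits per channel multiply the total colour count by a factor of 64, which is why 12-bit panels are considered the performance ceiling for now.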


Simon Lüthje

I am co-founder of this blog and am very interested in everything that has to do with technology, but I also like to play games. I was born in Hamburg, but now I live in Berlin.
