Discover the difference between Dolby Vision, HDR10, HDR10+, HLG, and HDR Advanced
HDR has become a selling point for consumer electronics, especially TVs. But have you noticed how many HDR variants there are? What is the difference between them? Let's find out!
When you are shopping for a TV, one of the terms you will hear or read most often is the acronym HDR, usually presented as an advantage over previous-generation technology. On top of that, some television models boast variations of HDR known as Dolby Vision, HDR10, HDR10+, HLG, and HDR Advanced. But what is the difference between them?
In this guide, we will look at what makes each HDR format special and what the advantages and disadvantages of each one are. But first, to understand the differences between them, you need to know what HDR itself is.
What is HDR?
HDR stands for High Dynamic Range. The feature reached the consumer market around 2016 and, in short, lets a television display images with richer colors and excellent brightness and contrast.
A typical TV uses a technology known as SDR (Standard Dynamic Range), and many image details simply cannot be perceived on that kind of screen. HDR TVs reach much higher brightness, measured in nits (one nit equals one candela per square meter), than ordinary TVs, which lets them reproduce bright highlights far more accurately.
Major television manufacturers such as Sony, Samsung, and LG have adopted the technology. Nowadays it is mostly 4K TVs that offer HDR support, but the feature can also be found in newer monitors and consoles, such as the Xbox Series X/S and the PlayStation 5. On these devices, HDR displays images with more vivid colors, brighter highlights, and deeper blacks, delivering an experience in which no detail goes unnoticed.
HDR is not exclusive to 4K TVs: there are simpler Full HD sets on the market that also support it, which helps them stand out. To use the feature, the video shown on screen must of course be mastered in HDR, which is rarely the case with over-the-air broadcasts. Most films and series on Netflix, Amazon Prime Video, Hulu, and Disney+, however, do support the technology.
Now that we understand what HDR really is, let's look at the differences between Dolby Vision, HDR10, HDR10+, HLG, and HDR Advanced, which are basically variations of the same feature. Shall we?
Dolby Vision
As its name suggests, Dolby Vision is the HDR standard owned by Dolby Laboratories and was one of the first HDR formats introduced on televisions. Its differential lies in its specification, which supports a theoretical peak brightness of 10,000 nits and a color depth of up to 12 bits. Thanks to these numbers, Dolby Vision is widely regarded as the highest-quality HDR format on the market today, and as a result it is supported by most major video platforms.
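To put those bit-depth numbers in perspective, a quick back-of-the-envelope calculation shows how many distinct shades each color depth can encode. The arithmetic below is purely illustrative; real video pipelines also involve color spaces and transfer functions:

```python
# Illustrative arithmetic only: how many shades each color depth can encode.

def shades_per_channel(bits: int) -> int:
    """Number of distinct levels one color channel can represent."""
    return 2 ** bits

def total_colors(bits: int) -> int:
    """Distinct RGB colors when all three channels use the same depth."""
    return shades_per_channel(bits) ** 3

# 10-bit (HDR10): 1,024 levels per channel, ~1.07 billion colors
print(shades_per_channel(10), total_colors(10))
# 12-bit (Dolby Vision): 4,096 levels per channel, ~68.7 billion colors
print(shades_per_channel(12), total_colors(12))
```

Each extra bit doubles the levels per channel, so the jump from 10 to 12 bits multiplies the total number of representable colors by 64, which is why 12-bit gradients show far less banding.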
Dolby Vision is used in more than 200 movie theaters around the world, as well as in LG smartphones, Apple iPhones, video games, and streaming platforms such as Netflix, which offers titles like Captain Marvel, Avengers, and Star Wars in the format.
The big disadvantage for manufacturers is that Dolby charges a licensing fee on every unit sold with Dolby Vision, covering everything mentioned above: TVs, media players, phones, tablets, consoles, and also films, series, videos, and games.
HDR10
HDR10 was developed in a partnership between Sony and Samsung, two leading companies in the television market, as an attempt to go head-to-head with Dolby Vision. The format specifies a peak brightness of 1,000 nits and a color depth of 10 bits, well below the Dolby format.
Another disadvantage is that HDR10 is not dynamic: it uses static metadata to convey brightness information, which makes the image result less accurate. In darker scenes, for example, colors can become oversaturated, a shortcoming compared with Dolby Vision that is mostly noticed by more demanding users with some knowledge of the subject.
Unlike the Dolby format, HDR10 is an open, royalty-free standard, which is an advantage for manufacturers who adopt it in their products.
Sony uses HDR10 in most of its products, including 4K Blu-ray discs and the company's last two console generations, the PS4 Pro and PS5. The format is also present in Microsoft's consoles, such as the Xbox One S and Xbox One X, and the newer Xbox Series S and Series X.
HDR10+
Frustrated with HDR10, Sony and Samsung took another run at the Dolby format, and for that they created HDR10+. In practice, it differs little from the previous version.
The figures remain the same as the previous generation, 1,000 nits of peak brightness and 10-bit color depth, but the lack of dynamic mapping of darker tones was corrected: HDR10+ carries dynamic metadata, so the tone mapping can be adjusted automatically from one scene to the next, which represents a real advance.
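To see why per-scene metadata matters, here is a deliberately simplified tone-mapping sketch. The function and numbers are hypothetical and not taken from any real standard: the point is only that a single static peak value forces dark scenes to be scaled by the same factor as the brightest scene in the whole film, while per-scene metadata leaves them untouched.

```python
# Hypothetical illustration (not any standard's actual algorithm):
# static (per-title) vs dynamic (per-scene) metadata for tone mapping.

def tone_map(pixel_nits: float, scene_max_nits: float,
             display_max_nits: float = 1000.0) -> float:
    """Naive linear tone mapping: squeeze the scene's range into the display's."""
    scale = min(1.0, display_max_nits / scene_max_nits)
    return pixel_nits * scale

scenes = [            # (brightest pixel in scene, a mid-tone pixel we care about)
    (4000.0, 50.0),   # bright outdoor scene
    (120.0, 50.0),    # dark indoor scene
]

# Static metadata: one peak value describes the entire title.
title_max = max(peak for peak, _ in scenes)

for peak, pixel in scenes:
    static_result = tone_map(pixel, title_max)   # HDR10-style: same scale everywhere
    dynamic_result = tone_map(pixel, peak)       # HDR10+-style: scale chosen per scene
    print(f"scene peak {peak:6.0f} nits -> "
          f"static: {static_result:.1f}, dynamic: {dynamic_result:.1f}")
```

With the static approach the dark indoor scene is dimmed by the same factor as the 4,000-nit outdoor scene (its 50-nit mid-tone drops to 12.5 nits), while the dynamic approach leaves it at 50 nits. That is, in caricature, the crushed dark scenes the previous paragraphs describe.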
The advantage of the previous version was also kept: like HDR10, HDR10+ can be used by manufacturers free of charge.
HLG
HLG is an acronym for Hybrid Log-Gamma, and the format was created by the well-known broadcasters BBC and NHK. Its use is limited to live broadcasts, which sets it apart from the formats already mentioned, which are generally used on streaming platforms, phones, game consoles, and so on.
Since the format is focused on real-time viewing, HLG dispenses with brightness metadata, which means the displayed result differs from one television to another. In a way this is an advantage: the image quality improves with the TV you watch on, and even ordinary SDR sets can display a result reasonably close to HDR from the same signal.
Of course, an old TV will display a more ordinary picture, while newer models will show the broadcast in HDR. Because of this limitation, it is fair to say that HLG's quality is better than SDR but well below the HDR10+ and Dolby Vision standards.
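For the curious, the "hybrid log-gamma" name comes from the shape of HLG's transfer curve, which ITU-R BT.2100 defines piecewise: a square-root (gamma-like) segment for dark tones and a logarithmic segment for highlights. Below is a small Python sketch of that opto-electrical transfer function (OETF) using the published constants; it is a reference illustration, not a production implementation:

```python
import math

# HLG opto-electrical transfer function (OETF) as defined in ITU-R BT.2100.
# Dark tones follow a square-root (gamma-like) curve; highlights follow a
# logarithmic curve -- hence the name "hybrid log-gamma".
A = 0.17883277
B = 1.0 - 4.0 * A                 # = 0.28466892
C = 0.5 - A * math.log(4.0 * A)   # = 0.55991073...

def hlg_oetf(e: float) -> float:
    """Map normalized scene light e in [0, 1] to a normalized signal in [0, 1]."""
    if e <= 1.0 / 12.0:
        return math.sqrt(3.0 * e)          # gamma-like segment for dark tones
    return A * math.log(12.0 * e - B) + C  # log segment for highlights

# The two segments join smoothly at e = 1/12, where the signal value is 0.5:
print(round(hlg_oetf(1.0 / 12.0), 4))  # 0.5
print(round(hlg_oetf(1.0), 4))         # 1.0
```

Because the lower segment is close to a conventional gamma curve, an SDR display can show an HLG signal without any metadata and still produce a watchable picture, which is exactly the backward compatibility the broadcasters were after.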
HDR Advanced
Developed by Technicolor, a company that has worked on photographic color techniques since the era of roll film, HDR Advanced encompasses three HDR sub-standards, each with its own specific purpose:
- SL-HDR1: put simply, a "basic" HDR that allows any SDR TV to deliver better image quality, even if not on par with the other formats.
- SL-HDR2: an image standard very similar to the HDR10+ and Dolby Vision formats, since it includes dynamic metadata.
- SL-HDR3: an image standard still in the testing phase that basically tries to combine the two above. Technicolor is committed to this format, and if it works it will be the first dynamic-metadata HDR standard that is backward compatible with SDR TVs. That does not mean every SDR TV will reach HDR10+ image quality, but the goal is to avoid the distortions mentioned earlier.
So, did you already know all these HDR formats? Which one do you think is best? Leave a comment below and share your opinion with us!