It’s no secret that 4K TVs are becoming increasingly popular, and with that increase, the demand for high dynamic range (HDR) content is skyrocketing. But what is HDR, and what are the different types? This article will break down the popular HDR formats: HDR10, HDR10+, and Dolby Vision.
HDR, or high dynamic range, is a term used to describe a video signal with a wider range of luminance than what’s considered standard. This means that HDR video can display a wider range of colours and brightness levels, resulting in a more realistic and lifelike image.
- HDR10 is the most popular format supported by all major 4K HDR TVs, and it uses static metadata, meaning the same HDR information is applied to the entire video.
- HDR10+ is an improved version of HDR10 that uses dynamic metadata, which allows the HDR information to be applied on a scene-by-scene or even a frame-by-frame basis.
- Dolby Vision is the premium HDR format that uses dynamic metadata and is supported by select 4K HDR TVs.
While all three types of HDR offer increased contrast and colour range, Dolby Vision is the clear winner when it comes to image quality, with HDR10+ a close second thanks to its dynamic metadata, which allows for more accurate image reproduction.
Is HDR10 better than HDR10+ or Dolby Vision?
Regarding HDR or high dynamic range, there are three major HDR formats: HDR10, HDR10+, and Dolby Vision. Each has its advantages and disadvantages, so it’s important to understand their differences before deciding.
- HDR10: 10-bit colour, static metadata, basic tone mapping, typically mastered for a 1,000-nit peak, and near-universal TV support.
- HDR10+: 10-bit colour, dynamic metadata, better tone mapping, peaks up to 4,000 nits, with more limited TV support.
- Dolby Vision: up to 12-bit colour, dynamic metadata, the best tone mapping, peaks up to 10,000 nits, with support limited to certified TVs.
What is HDR10?
HDR10, or High Dynamic Range 10, is an HDR standard introduced in 2015. HDR10 is an open format that any manufacturer can use without paying licensing fees, and it serves as the baseline that the other HDR formats build on.
HDR10+, introduced in 2017, is an improved version of HDR10 that adds dynamic metadata. The format is royalty-free for content, but displays must be certified through the HDR10+ licensing programme to carry its logo.
HDR10 and HDR10+ are both designed to improve the picture quality of HDR content. HDR10+ is the more advanced standard but is not compatible with all TVs.
What is HDR10+?
HDR10+ is a high dynamic range (HDR) video format that builds on HDR10 and extends it with dynamic metadata. This allows for more accurate image reproduction on displays that support HDR10+.
While HDR10 is a static format, meaning that the HDR information is set once for the whole video and does not change during playback, HDR10+ is a dynamic format. This allows the display to adjust its HDR settings on a scene-by-scene or even frame-by-frame basis. HDR10+ can therefore offer an improved HDR experience over HDR10.
A display needs to be HDR10+ certified to benefit from the technology. This certification ensures that the display can reproduce the wide range of colours and brightness levels required for HDR10+.
HDR10+ is an important new standard for HDR video. It offers improved image reproduction over HDR10 and is increasingly available on HDR10+-certified displays.
What is Dolby Vision?
Dolby Vision, a cutting-edge imaging technology, results in a more realistic, lifelike viewing experience. It offers higher contrast, richer colour, and more detail than baseline HDR10.
Dolby Vision goes beyond standard HDR by increasing the contrast ratio and expanding the colour gamut. This results in a more realistic image closer to what the human eye can see. Dolby Vision also uses a higher bit depth, allowing more detail and smoother transitions between colours.
While regular HDR technology is becoming more common, Dolby Vision is still somewhat rare. Only a subset of TVs and projectors support the technology. But as more manufacturers adopt Dolby Vision, it may well become the standard for home entertainment.
The bit depth is the first thing to consider when comparing HDR10 vs HDR10+ vs Dolby Vision. Bit depth is the number of bits used to encode each colour channel of a pixel—the more bits, the more colours that can be represented. HDR10 and HDR10+ both use 10 bits per channel, while Dolby Vision supports up to 12 bits per channel.
- HDR10 uses 10 bits per colour channel. It is an open standard that is supported by virtually all HDR TVs.
- HDR10+ also uses 10 bits per colour channel but adds dynamic metadata. It is supported by some HDR TVs.
- Dolby Vision is a proprietary HDR format that supports up to 12 bits per colour channel. Select HDR TVs support it.
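The difference a couple of bits makes is easy to see with a little arithmetic: each extra bit doubles the number of levels per channel, and the total colour count is that number cubed. A quick sketch:

```python
# Number of tonal levels per colour channel for a given bit depth.
def levels(bits: int) -> int:
    return 2 ** bits

# Total colours: three channels (R, G, B), each with 2**bits levels.
def total_colours(bits: int) -> int:
    return levels(bits) ** 3

for bits in (8, 10, 12):
    print(f"{bits}-bit: {levels(bits):,} levels/channel, "
          f"{total_colours(bits):,} total colours")
```

Running this shows the jump from roughly 1.07 billion colours at 10-bit to roughly 68.7 billion at 12-bit, which is why 12-bit Dolby Vision can produce smoother gradients.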
Peak brightness supported by each format:
- HDR10: 1,000 nits
- HDR10+: 4,000 nits
- Dolby Vision: 10,000 nits
HDR10 vs HDR10+ vs Dolby Vision
Regarding high dynamic range (HDR), there are three main competing formats: HDR10, HDR10+, and Dolby Vision. All three formats offer increased brightness and contrast compared to the standard dynamic range (SDR), but some important differences remain.
Peak brightness is one of the most important specs when comparing HDR formats. HDR10 content is typically mastered for a peak of 1,000 nits, HDR10+ supports peaks up to 4,000 nits, and Dolby Vision supports peaks up to 10,000 nits. This means Dolby Vision can preserve more detail in highlights and shadows than either HDR10 or HDR10+.
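Those nit figures come from the PQ transfer function (SMPTE ST 2084) that all three formats share, which maps a normalised code value (0 to 1) onto an absolute luminance scale topping out at 10,000 nits. A minimal sketch of the PQ EOTF:

```python
# PQ (SMPTE ST 2084) EOTF constants.
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_eotf(code: float) -> float:
    """Map a normalised PQ code value (0..1) to luminance in nits."""
    e = code ** (1 / M2)
    num = max(e - C1, 0.0)
    den = C2 - C3 * e
    return 10_000 * (num / den) ** (1 / M1)

print(pq_eotf(0.0))  # black: 0 nits
print(pq_eotf(1.0))  # full code value: 10,000 nits
```

The curve is heavily weighted toward dark values, matching how human vision perceives brightness, which is why 10 bits on a PQ scale avoid the banding that 10 bits on an SDR gamma curve would show.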
In addition to increased peak brightness, Dolby Vision supports a wider colour gamut than either HDR10 or HDR10+. This means that Dolby Vision-enabled TVs can display a wider range of colours, resulting in more accurate and realistic images.
Which HDR format is right for you? Dolby Vision is a clear choice if you want the best picture quality. However, if you’re looking for a more affordable option, either HDR10 or HDR10+ will likely be a good fit.
One key difference between HDR10, HDR10+, and Dolby Vision is the use of metadata. Metadata is information embedded in the video signal that provides instructions to the TV on how to display the image. HDR10 uses static metadata, meaning it is the same for the entire video. HDR10+ and Dolby Vision both use dynamic metadata, which can be changed on a scene-by-scene or even a frame-by-frame basis. This allows for a more accurate image display, as the TV can adjust the settings to match the content.
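The static-versus-dynamic distinction can be illustrated with a sketch (the field names below are simplified stand-ins, not the exact SMPTE ST 2086 / ST 2094 syntax): static metadata is one record for the whole title, while dynamic metadata is a record per scene.

```python
from dataclasses import dataclass

@dataclass
class StaticMetadata:      # HDR10-style: one record for the whole title
    max_cll: int           # brightest pixel anywhere in the video (nits)
    max_fall: int          # highest frame-average brightness (nits)
    mastering_peak: int    # peak of the mastering display (nits)

@dataclass
class SceneMetadata:       # HDR10+/Dolby Vision-style: one record per scene
    scene_peak: int        # brightest pixel in this scene (nits)
    scene_avg: int         # average brightness of this scene (nits)

# With static metadata, the TV tone-maps every scene against the
# title-wide peak; with dynamic metadata, a dark scene keeps its
# shadow detail instead of being squeezed by one global curve.
title = StaticMetadata(max_cll=3500, max_fall=400, mastering_peak=4000)
scenes = [SceneMetadata(scene_peak=120, scene_avg=30),
          SceneMetadata(scene_peak=3500, scene_avg=900)]
```

Note how the first scene in the example peaks at only 120 nits: a dynamic format can tell the TV so, while a static format leaves the TV assuming the 3,500-nit title-wide maximum.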
HDR10 vs HDR10+
HDR10+ is a royalty-free standard that Samsung and Amazon developed. It uses dynamic metadata, like Dolby Vision, and is supported by brands such as Samsung, Panasonic, and Philips. HDR10, on the other hand, is a static metadata format standardised by the Consumer Technology Association.
Dolby Vision is a proprietary format that uses dynamic metadata. It was developed by Dolby Laboratories and is used by major brands such as LG, Sony, and Vizio.
To understand the differences between HDR10, HDR10+, and Dolby Vision, it’s important to first understand tone mapping. Tone mapping is the process of mapping the luminance values in a high-dynamic-range (HDR) image onto the more limited range of a display device. This is necessary because displays can only output a limited range of values. HDR formats use a luminance scale that runs from 0 (black) up to 10,000 nits (white), while standard dynamic range (SDR) displays are referenced to roughly 100 nits.
Tone mapping is a lossy process, which means some information is lost when an HDR image is mapped to an SDR display. The goal of tone mapping is to preserve as much of the original image data as possible while creating an image that looks natural on an SDR display.
There are three main types of tone mapping: global, local, and dynamic. Global tone mapping scales the entire image to fit the display’s dynamic range. Local tone mapping preserves more of the original image data by applying different scales to different parts of the image. Dynamic tone mapping adjusts the tone mapping over time in response to changes in the image.
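A global operator can be sketched in a few lines. The classic Reinhard curve below is an illustrative stand-in (real TVs use their own proprietary curves), compressing every pixel with the same function so that bright values approach the display's maximum asymptotically:

```python
# Minimal global tone-mapping sketch using the Reinhard operator.
# `peak` is the assumed brightest scene value; `display_max` is the
# display's peak output. Both are illustrative numbers.
def reinhard(l_in: float, peak: float = 1000.0,
             display_max: float = 100.0) -> float:
    x = l_in / peak                      # normalise to the scene peak
    return display_max * (x / (1.0 + x))  # compress toward display_max

for nits in (10, 100, 1000, 4000):
    print(f"{nits:>5} nits in -> {reinhard(nits):6.1f} nits out")
```

Because the same curve is applied everywhere, a global operator is cheap but can crush shadow detail; local and dynamic approaches trade extra computation for better preservation.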
All three HDR formats rely on different metadata schemes, which means they are not directly interchangeable. HDR10 uses static metadata, while HDR10+ and Dolby Vision use dynamic metadata. Dolby Vision TVs can accept and process HDR10 signals, and they can fall back to the HDR10 base layer of HDR10+ streams, but a TV without Dolby Vision support cannot read Dolby Vision’s dynamic metadata.
What does this mean for backward compatibility?
In practice, you rarely need extra hardware for this. HDR10+ content carries an HDR10-compatible base layer, so a TV without HDR10+ support simply plays it as standard HDR10 and ignores the dynamic metadata. The reverse direction works, too: streaming services serve an HDR10 stream to TVs that lack Dolby Vision, and Dolby Vision Ultra HD Blu-rays include an HDR10 base layer, so an HDR10 TV still gets an HDR picture, just without the scene-by-scene adjustments.
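In code, the negotiation a player performs amounts to picking the richest format both the stream and the TV understand. A sketch, assuming (as is true of HDR10+ and most Dolby Vision streams) that an HDR10-compatible fallback is always present — `pick_format` is a hypothetical helper, not a real API:

```python
# Hypothetical helper: choose the best HDR format supported by both
# the stream and the TV, preferring the richer formats first.
def pick_format(stream_formats: set, tv_formats: set) -> str:
    for fmt in ("dolby_vision", "hdr10plus", "hdr10"):
        if fmt in stream_formats and fmt in tv_formats:
            return fmt
    return "sdr"  # worst case: tone-map everything down to SDR

# A Dolby Vision stream with an HDR10 base layer, played on an
# HDR10+ TV, falls back to plain HDR10.
print(pick_format({"dolby_vision", "hdr10"}, {"hdr10plus", "hdr10"}))
```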
Only some TVs currently support every high dynamic range (HDR) format. In particular, HDR10+ support is still fairly uncommon, even though the format has been around since 2017. If you’re looking for a new TV and have your heart set on HDR10+, you might have to do additional research to find a model that supports it.
The same is true of Dolby Vision. While it’s become more common in the last couple of years, many TVs still don’t support it. If you’re interested in Dolby Vision, you’ll need to ensure that the TV you’re considering is compatible.
A lot of gamers are wondering which HDR format is better for gaming. The short answer is that it depends on your console. PlayStation consoles, including the PS4 Pro and PS5, output HDR10 only. The Xbox Series X/S support Dolby Vision for games, and the Xbox One S/X support it in streaming apps. No current console outputs HDR10+.
- PS4/PS4 Pro
- Xbox One
- Xbox Series X/S
- Nintendo Switch
While all the current PlayStation and Xbox consoles support HDR10, none of them currently supports HDR10+. Dolby Vision is available on the Xbox One S/X (for streaming apps) and the Xbox Series X/S (for games as well), while no PlayStation console supports it. As for the Nintendo Switch, it doesn’t support any HDR format. Console makers continue to expand HDR support over time, so it’s worth checking each console’s current specifications.
On PC, the picture is more fragmented. Windows supports HDR10 natively and Dolby Vision through Dolby’s playback extensions, macOS supports HDR10 and Dolby Vision on recent hardware, and HDR support on Linux is still limited. HDR10+ playback on PC depends on the individual app and display, so check compatibility before counting on it.
So, what’s the verdict? Which of these HDR formats is the best?
There is no single answer, as each format has its advantages and disadvantages. HDR10 is the universal baseline: every HDR TV and service supports it. HDR10+ is royalty-free and adds dynamic metadata, but fewer TVs and streaming platforms back it.
Dolby Vision, on the other hand, is supported by most major streaming platforms and generally delivers the best picture, but Dolby’s licensing fees mean it isn’t available on every TV. Ultimately, the best HDR format for you will depend on your TV and the services you watch.