High Dynamic Range (HDR) has become one of the most significant advances in television picture quality over the past decade. Yet many consumers remain confused about what HDR actually means and which format they should prioritise when buying a new TV. With terms like Dolby Vision, HDR10, HDR10+, and HLG appearing in specifications, it's easy to feel overwhelmed. This guide will demystify HDR technology and help you understand what matters most for your viewing experience.

What Is HDR and Why Does It Matter?

High Dynamic Range refers to a set of technologies that expand the range of both contrast and colour that a TV can display. In practical terms, HDR content shows brighter highlights, deeper blacks, and a wider spectrum of colours compared to standard dynamic range (SDR) content. The result is an image that looks more lifelike, with greater depth and detail in both shadows and highlights.

Consider a scene with bright sunlight streaming through a window into a dimly lit room. With SDR, either the window would appear blown out (pure white with no detail) or the room would look too dark. HDR allows both the bright window and the shadowy interior to be displayed with full detail simultaneously, much closer to how our eyes perceive the real world.

📊 HDR by the Numbers

SDR content typically uses 8-bit colour depth (16.7 million colours) and limited brightness range. HDR content uses 10-bit or higher colour depth (over 1 billion colours) with significantly expanded brightness capabilities, measured in nits.
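Those colour counts follow directly from the bit depth: each of the three channels (red, green, blue) gets 2^bits distinct levels. A quick back-of-the-envelope check in Python (the function name is just for illustration):

```python
def colour_count(bits_per_channel: int) -> int:
    """Total displayable colours for a given per-channel bit depth."""
    # Each of the three channels (R, G, B) has 2**bits distinct levels.
    return (2 ** bits_per_channel) ** 3

print(colour_count(8))   # 16,777,216 -> the "16.7 million" SDR figure
print(colour_count(10))  # 1,073,741,824 -> the "over 1 billion" HDR figure
```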

HDR10: The Universal Standard

HDR10 is the most widely supported HDR format and is considered the baseline standard. It's an open format with no licensing fees, which is why virtually every HDR-capable TV supports it. HDR10 uses static metadata, meaning the brightness and colour information is set once for the entire movie or show and doesn't change scene by scene.

HDR10 content can be mastered at up to 10,000 nits of brightness, though in practice most titles are mastered at 1,000 or 4,000 nits, and no consumer TV reaches the format's ceiling. The TV interprets the metadata and maps it to its own capabilities. This works well in many cases, but can result in some scenes appearing too dark or too bright if the TV's tone mapping isn't optimal.
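To see what that mapping involves, here's a deliberately simplified tone-mapping sketch in Python. The knee position, roll-off shape, and 800-nit panel figure are all assumptions for illustration; real TVs use proprietary curves informed by the metadata.

```python
def static_tone_map(nits: float, panel_peak: float = 800.0) -> float:
    """Map a mastered luminance value onto a panel's real capability.

    Toy model only: the knee position and roll-off shape are invented
    for illustration, not taken from any TV's actual tone curve.
    """
    knee = panel_peak * 0.75  # follow the source linearly up to here
    if nits <= knee:
        return nits
    # Compress everything above the knee so it approaches, but never
    # exceeds, the panel's peak brightness.
    excess = nits - knee
    headroom = panel_peak - knee
    return knee + headroom * (excess / (excess + headroom))

print(static_tone_map(300.0))    # 300.0: dim content passes through unchanged
print(static_tone_map(10000.0))  # ~795.8: a 10,000-nit highlight is compressed
```

Because one compromise curve like this applies to the entire film under static metadata, individual scenes can still land too dark or too bright.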

The advantage of HDR10 is universal compatibility. Every streaming service, Blu-ray disc, and gaming console that supports HDR uses HDR10 at minimum. You'll never have compatibility issues with HDR10 content.

Dolby Vision: The Premium Experience

Dolby Vision represents the current gold standard in HDR technology. Unlike HDR10's static metadata, Dolby Vision uses dynamic metadata that can adjust brightness and colour information on a scene-by-scene or even frame-by-frame basis. This allows for more precise optimisation of the image regardless of the scene's content.

Dolby Vision also supports 12-bit colour depth (current TVs use 10-bit panels, but the extra headroom future-proofs content) and brightness levels up to 10,000 nits. The format includes sophisticated tone mapping that adapts content to your specific TV's capabilities more effectively than static HDR10.

💡 Key Takeaway

Dolby Vision's dynamic metadata means a dark, atmospheric scene followed by a bright outdoor shot will each be optimised individually, rather than using a single compromise setting for both. This results in consistently excellent image quality throughout.
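That difference can be illustrated with a toy model in Python. The scene luminances and the simple linear scaling below are assumptions for illustration; real formats use perceptual tone curves, not straight division.

```python
# Hypothetical peak luminance (in nits) of three consecutive scenes
scenes = [120.0, 9000.0, 450.0]
panel_peak = 800.0

# Static metadata: one compromise scale for the whole film, set by
# its single brightest scene, so dim scenes get dimmed as well.
film_peak = max(scenes)
static = [round(s * min(1.0, panel_peak / film_peak), 1) for s in scenes]

# Dynamic metadata: each scene is scaled on its own, so scenes that
# already fit within the panel's range pass through untouched.
dynamic = [round(s * min(1.0, panel_peak / s), 1) for s in scenes]

print(static)   # [10.7, 800.0, 40.0]: every scene is crushed
print(dynamic)  # [120.0, 800.0, 450.0]: only the bright scene is compressed
```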

The trade-off is that Dolby Vision is a proprietary format requiring licensing. Content must be specifically mastered in Dolby Vision, and TVs must be certified. Netflix, Disney+, Stan, and Apple TV+ all offer extensive Dolby Vision libraries. LG, Sony, TCL, and Hisense TVs widely support Dolby Vision, while Samsung TVs notably do not (Samsung uses HDR10+ instead).

HDR10+: Samsung's Dynamic Alternative

HDR10+ was developed by Samsung and Amazon as a royalty-free alternative to Dolby Vision. Like Dolby Vision, it uses dynamic metadata for scene-by-scene optimisation. However, HDR10+ typically maxes out at 10-bit colour and 4,000 nits (compared to Dolby Vision's 12-bit and 10,000 nits theoretical limits).

In practical terms, on comparable TVs, the difference between HDR10+ and Dolby Vision is subtle. Both deliver excellent dynamic HDR performance that surpasses static HDR10. The main consideration is content availability—while Amazon Prime Video and some Blu-rays support HDR10+, Netflix and many other major streaming services focus exclusively on Dolby Vision for their premium HDR content.

Samsung TVs support HDR10+ but not Dolby Vision. If you're considering a Samsung TV, you'll still get excellent HDR from HDR10 and HDR10+, but you won't have access to Dolby Vision content as mastered—it will play in HDR10 instead.

HLG: Broadcast HDR

Hybrid Log-Gamma (HLG) is an HDR format developed by the BBC and NHK specifically for broadcast television. Its unique characteristic is backwards compatibility—HLG content can be displayed on both HDR and non-HDR TVs without requiring separate streams. For broadcasters, this simplifies HDR adoption significantly.

In Australia, HLG is relevant for future free-to-air HDR broadcasts and some YouTube HDR content. Most modern TVs support HLG, and it's generally considered a "set it and forget it" format—you don't need to configure anything, and compatible content will display in HDR automatically.

Which HDR Format Matters Most?

For most Australian viewers, the priority should be:

  • HDR10: Essential and universally supported. Any TV marketed as "HDR" will support this.
  • Dolby Vision: Highly desirable if you use Netflix, Disney+, Stan, or Apple TV+. Provides the best streaming HDR experience.
  • HDR10+: Nice to have, primarily relevant for Amazon Prime Video users and Samsung TV owners.
  • HLG: Useful for broadcast and YouTube HDR content. Most TVs support it automatically.

✅ Practical Advice

If you stream a lot of content from Netflix, Disney+, or Stan, prioritise a TV with Dolby Vision support. If you're committed to Samsung TVs, you'll still get excellent HDR through HDR10 and HDR10+, just without access to Dolby Vision specifically.

Your TV's HDR Capability Matters More Than Formats

Here's the truth that often gets overlooked: your TV's actual HDR performance matters far more than which formats it supports. A cheap TV with Dolby Vision support but low peak brightness and poor contrast will look worse than a quality HDR10-only TV with excellent brightness and colour.

For truly impactful HDR, look for TVs with at least 600 nits of peak brightness (1,000+ nits is ideal), wide colour gamut coverage (90%+ of DCI-P3), and good contrast ratios (OLED excels here, followed by Mini LED/QLED). A budget TV labelled "HDR" with 300 nits of brightness won't deliver the HDR experience you're expecting, regardless of format support.

Checking and Enabling HDR

To verify HDR is working correctly, check your streaming apps—Netflix and Disney+ show "Dolby Vision" or "HDR" badges on supported content. On PS5 or Xbox, enable HDR in system settings. Ensure your TV's HDMI Enhanced/Input Signal Plus setting is enabled for the port you're using—this is often disabled by default and prevents HDR signals from being received.

When HDR content is playing, most TVs display an indicator in the info panel (accessible via your remote's info button) showing the active HDR format. This confirms your content is playing in HDR rather than falling back to SDR.

Sarah Mitchell

Technical Editor

Sarah is an electrical engineer specialising in display technology. She helps readers understand complex specifications and make informed decisions about their TV purchases.