What is HDR?

High Dynamic Range (HDR) enables a more realistic visualization of images that more closely represents real-life scenes by taking advantage of the greater range of luminance (i.e., the range from bright white to dark black) that we as humans can see. Before HDR, only Standard Dynamic Range (SDR) was available. The problem with SDR displays is that photographs displayed on the screen do not look like the original scene: the large range of colors, bright highlights, and deep shadows that were visible in the original scene are not fully represented on an SDR display. By comparison, HDR can maintain the same average brightness yet significantly increase the range of colors and improve contrast by a factor of more than 100x.

What version of the Windows operating system supports HDR?

For HDR output to external displays, Windows 10 version 1803 (Redstone 4) or later is recommended. For HDR output to an integrated display (e.g., a laptop or AIO PC), Windows 10 version 1903 (19H1) or later is recommended. You can determine your current version of Windows by typing winver into the Start menu search box or at a command prompt. If you have an earlier version of Windows 10, you can easily update it to the newest version through Windows Update. Each new version of Windows has improved HDR capabilities, so updating to the current version is recommended.
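
As a rough illustration, the build number reported by Windows can be checked against these version thresholds programmatically; a minimal Python sketch, assuming the published build numbers 17134 (version 1803) and 18362 (version 1903):

```python
import sys

# Windows 10 build numbers for the versions mentioned above:
# 17134 = version 1803 (Redstone 4), 18362 = version 1903 (19H1).
BUILD_1803 = 17134
BUILD_1903 = 18362

def hdr_version_hint() -> str:
    """Rough hint of HDR readiness based on the Windows build number."""
    if not sys.platform.startswith("win"):
        return "Not running on Windows."
    build = sys.getwindowsversion().build  # only available on Windows
    if build >= BUILD_1903:
        return f"Build {build}: OK for HDR on integrated and external displays."
    if build >= BUILD_1803:
        return f"Build {build}: OK for HDR on external displays."
    return f"Build {build}: consider updating Windows for better HDR support."

print(hdr_version_hint())
```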

Can I add HDR to my PC?

You can add HDR to your PC, but depending on the hardware that you currently have, HDR may require some new hardware and/or new software.

What hardware do I need to be able to run HDR on my Windows PC?

You need a suitable Graphics Processing Unit (GPU), an HDR display, and relevant cables to connect your PC to the display.

Which GPUs support HDR?

The integrated graphics of Intel’s 7th Gen Core processors and later with at least HD 620 graphics, NVIDIA GeForce GTX 10-series/16-series and RTX 20-series graphics cards, and AMD Radeon RX 5-series and Radeon RX Vega graphics all support the HDR10 protocol.

Which external displays support HDR?

There are typically two things to look for. The minimum requirement for PC HDR support is a display with HDR10 input; most 4K TVs and many PC monitors now support HDR10. The second thing to look for is the DisplayHDR logo, which indicates the HDR display’s certified performance tier. Displays that do not include the DisplayHDR logo may support HDR; however, the performance may not be noticeably different from SDR. See the Certified Products page for a list of DisplayHDR certified displays.

Which cable interfaces can be used for HDR?

DisplayPort v1.2 and later, USB-C with DisplayPort Alt Mode, Thunderbolt 3, and HDMI v2.0a or v2.1. On a laptop or AIO PC, the integrated panel can be HDR-enabled, and in this scenario you don’t need to be concerned about interfaces and cables.

Can dongles be used to convert one interface on my PC to a different interface on my display?

Yes. There are dongles that convert Thunderbolt into either DisplayPort or HDMI, dongles that convert USB-C with DisplayPort Alt Mode to regular full-size and Mini DisplayPort, and dongles that convert DisplayPort to HDMI. When shopping for a dongle, be sure to select one that also supports HDCP 2.2 if you want to watch protected movie content.

Beyond a Windows PC, what other sources of HDR content are there that I can use with a DisplayHDR certified display?

Typically 4K Blu-ray players, recent game consoles, most 4K streaming boxes and dongles such as Roku and Amazon Fire, and a few TV set-top boxes support HDR10. These devices typically use HDMI 2.0a for their output, so if you plan to use them you will need to ensure your HDR display includes an HDMI 2.0a (or later) input.

What software supports HDR?

Multiple web browsers support HDR, and within those browsers multiple video streaming sources, such as Netflix, YouTube, and VUDU, support HDR. The Windows default video player, the Movies & TV app, supports HDR playback, as does VLC media player. Video-editing applications such as DaVinci Resolve Studio and Sony Catalyst Production Suite support content creation of HDR videos. Video transcoding for HDR is available in tools such as FFmpeg. Krita, a free, open-source raster graphics editor, supports HDR drawing and HDR animated video. The Windows Photos app can display HDR images. Additionally, the “HDR+WCG Image Viewer” is a free app available from the Windows store. The Affinity Photo app supports HDR photographic editing of still images. More than 100 top-tier games support HDR, and most new games are launched with HDR support. For software not listed here, consult the software’s specifications to determine whether it supports HDR.

I have an HDR PC. How do I check that it’s operating in HDR mode?

As a very simple check, right-click on the desktop to launch the “Display Settings” page. There should be an option called “Play HDR games and apps,” which should be set to “on.” You can then go into the sub-menu called “Windows HD Color Settings”; if you scroll down to the bottom of that page, you can check for a visible difference between the SDR and HDR video images.

See our HDR setup guide for more detailed troubleshooting steps.

What is the difference between HDR10 and DisplayHDR?

HDR10 is a protocol that is a fundamental requirement of HDR communication between GPU sources from Intel, NVIDIA, and AMD and HDR displays. HDR10 is the predominant means of HDR information transfer in the PC industry, the TV industry, and most other sources of HDR content. DisplayHDR builds on the foundational requirement that the hardware use the HDR10 protocol and then further specifies the display’s performance, such as luminance, black level, and color. Full details can be found in our technical specs.

In summary, HDR10 governs digital data sent from the device to the display, and DisplayHDR governs how digital data is converted to visible imagery inside the display.
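
To make that division of labor concrete, here is a minimal Python sketch of the static metadata HDR10 carries in the signal (the SMPTE ST 2086 mastering-display color volume plus MaxCLL/MaxFALL); the class and field names are illustrative, not a formal API:

```python
from dataclasses import dataclass

@dataclass
class HDR10StaticMetadata:
    """Sketch of HDR10 static metadata (SMPTE ST 2086 + content light levels).

    Chromaticities are CIE 1931 xy coordinates; luminances are in cd/m2.
    Field names are descriptive, not a formal API.
    """
    red_primary: tuple       # (x, y) of the mastering display's red primary
    green_primary: tuple
    blue_primary: tuple
    white_point: tuple
    max_mastering_luminance: float  # peak of the mastering display
    min_mastering_luminance: float  # black level of the mastering display
    max_cll: float    # Maximum Content Light Level (brightest pixel)
    max_fall: float   # Maximum Frame-Average Light Level

# Example: content mastered on a 1000 cd/m2 display with BT.2020 primaries.
meta = HDR10StaticMetadata(
    red_primary=(0.708, 0.292),
    green_primary=(0.170, 0.797),
    blue_primary=(0.131, 0.046),
    white_point=(0.3127, 0.3290),
    max_mastering_luminance=1000.0,
    min_mastering_luminance=0.005,
    max_cll=1000.0,
    max_fall=400.0,
)
print(meta)
```

How the display turns that signal into visible luminance and color is exactly the part that DisplayHDR certifies.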

Is HDR only available with 4K?

No, HDR and resolution are independent of one another. Within the list of DisplayHDR certified devices, you will find many FHD, QHD, and UHD displays and laptops. You will also find displays that go beyond the traditional 16:9 aspect ratio, using 21:9 and 32:9 ratios. The GPUs that support HDR support all of these resolutions, and will dynamically scale content from any resolution to the display’s resolution.

How much more color does HDR have than SDR?

Typical PC monitors and other SDR displays are only able to present about 33% of the color range that humans can see, which is why a photograph on a display never quite looks real. By increasing beyond the standard PC sRGB color gamut to DCI-P3-D65, we approach 50% of the color range. Ultimately, the HDR10 standard supports the full BT.2020 color gamut, which reaches just over 70% of the range of colors that humans can see.
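
The exact coverage percentages depend on the chromaticity space used to measure them, but the relative sizes of the gamuts can be checked with simple arithmetic: each gamut is a triangle in the CIE 1931 xy chromaticity diagram, and the shoelace formula gives its area. A quick Python sketch using the published primaries:

```python
# Relative gamut areas in the CIE 1931 xy chromaticity diagram,
# using the published primaries of each standard.
def triangle_area(p1, p2, p3):
    """Shoelace formula for the area of a chromaticity triangle."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

GAMUTS = {
    "sRGB":    [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)],
    "DCI-P3":  [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)],
    "BT.2020": [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)],
}

srgb_area = triangle_area(*GAMUTS["sRGB"])
for name, primaries in GAMUTS.items():
    area = triangle_area(*primaries)
    print(f"{name}: area {area:.4f} in xy, {area / srgb_area:.2f}x sRGB")
# DCI-P3 is ~1.36x and BT.2020 ~1.89x the area of sRGB in xy;
# percentage-of-visible figures additionally depend on whether the
# comparison is done in xy or u'v' coordinates.
```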

What is the difference between CTS r1.0 and CTS r1.1?

The CTS is our Compliance Test Specification for DisplayHDR, which can be downloaded free from VESA. We launched CTS r1.0 in December 2017, and it remains valid for new device certification through the end of May 2020. CTS r1.1 was launched in September 2019 and currently has no expiry date. CTS r1.1 entails a more rigorous testing procedure and thus helps promote further technology innovation for new products. Note that many devices originally certified with CTS r1.0 may also meet the CTS r1.1 specs; however, due to the cost of retesting, recertification is optional. Starting June 2020, new device certification will only be permitted using CTS r1.1. On our Certified Devices page, we indicate which CTS was used for the certification of each device.

What’s the purpose of the new CTS 1.1 Delta-ITP test?

The white point accuracy of a display has always been important for accurately displaying photos and videos, and of course for enabling content creators to create content that is accurately rendered on their display. With HDR, luminance accuracy also becomes critically important, because EOTF-2084 uses absolute luminance values instead of SDR’s relative luminance values. Delta-ITP is the tolerance measurement system VESA has adopted for the DisplayHDR CTS 1.1 certification program. It measures both the luminance and color accuracy of the test patches and is ideal for HDR. In CTS 1.1 we test grey patches from 5 cd/m2 up through 50% of the logo level, so for a DisplayHDR-1000 display we test the luminance and white point accuracy through the range of 5-500 cd/m2.
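
EOTF-2084 refers to the SMPTE ST 2084 “PQ” transfer function, which maps a normalized code value directly to an absolute luminance of up to 10,000 cd/m2. A minimal Python sketch of the standard formula, used here to show which code values the 5-500 cd/m2 test range corresponds to:

```python
# SMPTE ST 2084 (PQ) constants, as defined in the standard.
M1 = 2610 / 16384        # 0.1593017578125
M2 = 2523 / 4096 * 128   # 78.84375
C1 = 3424 / 4096         # 0.8359375
C2 = 2413 / 4096 * 32    # 18.8515625
C3 = 2392 / 4096 * 32    # 18.6875

def pq_eotf(code: float) -> float:
    """PQ code value in [0, 1] -> absolute luminance in cd/m2."""
    e = code ** (1 / M2)
    return 10000.0 * (max(e - C1, 0.0) / (C2 - C3 * e)) ** (1 / M1)

def pq_inverse_eotf(luminance: float) -> float:
    """Absolute luminance in cd/m2 -> PQ code value in [0, 1]."""
    y = (luminance / 10000.0) ** M1
    return ((C1 + C2 * y) / (1 + C3 * y)) ** M2

# Code values spanning the 5-500 cd/m2 range tested on a DisplayHDR-1000 panel:
for lum in (5, 100, 500, 1000):
    print(f"{lum:>4} cd/m2 -> PQ code {pq_inverse_eotf(lum):.3f}")
```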

Why was Delta-ITP chosen instead of the more commonly used Delta-E?

Delta-E is a useful and familiar measurement system for testing at standard SDR luminance levels of 100 cd/m2; however, as a measurement system it does not work well for the extreme range of luminance testing that is necessary for HDR systems. We wanted a testing methodology that would enable us to test white point and luminance accuracy across the extended range of HDR, and thus chose Delta-ITP. Delta-ITP is effectively “the Delta-E for HDR”; however, it should be noted that Delta-ITP values are calculated quite differently from Delta-E values and thus cannot be compared.
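
For the curious, ITU-R BT.2124 defines Delta-ITP in the ICtCp space of ITU-R BT.2100, which is built on the PQ curve shown above, so equal numeric differences remain roughly equally visible across the full HDR luminance range. A simplified Python sketch (matrices and the 720 scaling factor are from BT.2100/BT.2124; input colors are assumed to be linear BT.2020 RGB in cd/m2):

```python
# Delta-ITP (ITU-R BT.2124) sketch; input colors are linear BT.2020 RGB
# in cd/m2. PQ constants as in the previous sketch (SMPTE ST 2084).
M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_encode(lum: float) -> float:
    y = (lum / 10000.0) ** M1
    return ((C1 + C2 * y) / (1 + C3 * y)) ** M2

def rgb_to_itp(rgb):
    """Linear BT.2020 RGB (cd/m2) -> (I, T, P), with T = 0.5 * Ct."""
    r, g, b = rgb
    # RGB -> LMS, integer-coefficient matrix from ITU-R BT.2100
    l = (1688 * r + 2146 * g + 262 * b) / 4096
    m = (683 * r + 2951 * g + 462 * b) / 4096
    s = (99 * r + 309 * g + 3688 * b) / 4096
    lp, mp, sp = pq_encode(l), pq_encode(m), pq_encode(s)
    # L'M'S' -> ICtCp
    i = 0.5 * lp + 0.5 * mp
    ct = (6610 * lp - 13613 * mp + 7003 * sp) / 4096
    cp = (17933 * lp - 17390 * mp - 543 * sp) / 4096
    return i, 0.5 * ct, cp

def delta_itp(rgb1, rgb2) -> float:
    """Delta-ITP between two colors; 720 is the BT.2124 scaling factor."""
    (i1, t1, p1), (i2, t2, p2) = rgb_to_itp(rgb1), rgb_to_itp(rgb2)
    return 720 * ((i1 - i2) ** 2 + (t1 - t2) ** 2 + (p1 - p2) ** 2) ** 0.5

# Example: a nominally 100 cd/m2 grey patch that measures 5% too bright.
print(f"{delta_itp((100, 100, 100), (105, 105, 105)):.2f}")
```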

Why does DisplayHDR use 10% patches for the test cases?

We chose 10% because we wanted a single test case that best represented various highlight sizes, from the smallest pinpoint “star field” kind of HDR (which is a tiny fraction of 1%) to the near-full-screen explosion that may briefly encompass 70% or more of the screen. We also wanted to develop a test that was fair across both LCD and OLED, as well as any future display technology.

What we have found is that OLED displays can become brighter as the test patch size gets smaller, and achieve exceptional brightness for test patches smaller than 1%. However, there are few movie scenes where the highlight is less than 1%. LCD displays with local dimming do not achieve peak luminance with just one of the local dimming zones illuminated, but rather with several adjacent zones illuminated such that the scattering of light from multiple zones boosts the central luminance. However, at some point LCD does become power limited (often with test patches beyond 50%), so peak luminance for LCD is generally in the 20-40% range. Therefore, with LCD and OLED having different peak luminance levels and with movie highlights ranging from twinkling stars to full screen explosions, we chose 10% as our official singular test patch size to be the best compromise of all of these considerations. Office applications and web browsers, which often have a white background, are never run at full HDR luminance for the white background and were thus irrelevant to this HDR test consideration.
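
For reference, a 10% patch in this context is a white rectangle covering 10% of the screen area, centered on a black background; here is a small Python sketch of generating one (keeping the patch at the screen’s aspect ratio, so each dimension scales by the square root of the area fraction, is an assumption about the patch shape):

```python
import numpy as np

def center_patch(width, height, area_fraction=0.10, level=1.0):
    """Black frame with a centered patch covering `area_fraction` of the
    screen; the patch keeps the screen's aspect ratio, so each dimension
    scales by sqrt(area_fraction)."""
    frame = np.zeros((height, width), dtype=np.float32)
    scale = area_fraction ** 0.5
    pw, ph = round(width * scale), round(height * scale)
    x0, y0 = (width - pw) // 2, (height - ph) // 2
    frame[y0:y0 + ph, x0:x0 + pw] = level
    return frame

patch = center_patch(3840, 2160)   # a 10% patch on a UHD frame
print(f"{patch.mean():.3f}")       # ~0.100 of the screen is lit
```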

I’m a manufacturer. If I’ve already certified to CTS 1.0, should I recertify with CTS 1.1? Also, for products currently in development, which certification should I use? Is there any benefit in certifying for both 1.0 and 1.1?

CTS 1.1 is a superset of the certification used in CTS 1.0, so there is no benefit in performing both certifications; if your hardware can pass CTS 1.1, that is the only certification you should perform. If you have already certified with CTS 1.0, you are welcome to recertify if you would like; however, there is no program requirement to do so. Once a device is certified, it remains certified, as the certification does not expire. For products currently in development, either certification program can be used through the end of May 2020; however, starting June 2020 only CTS 1.1 will be available for new product certification.

Is it valid to use the DisplayHDR Logo without the Tier Level number in the logo?

Not on a product, no. Use of the logo on a product should always include the corresponding numerical tier level performance indicator as part of the logo; using the logo on a product without the tier level indicator is a violation of the logo agreement. The only place where the logo-without-number is permitted is in a generalized manner on a website or other marketing materials indicating support for DisplayHDR across a range of products.

What do the terms global-dimming, local-dimming, full-array dimming, 1D-dimming, 1.5D-dimming, 2D-dimming, and active-dimming mean? Also, LCD displays typically only have a contrast ratio of 1000:1 to 3000:1. How can an LCD achieve a 100,000:1 or greater contrast ratio?

LCDs don’t emit light on their own and must instead have a backlight that shines through the LCD material to display an image. Today’s LCD-based displays use a number of LEDs for the backlighting. To achieve a greater contrast ratio than a standard dynamic range display, the LEDs in the backlight change their brightness level, allowing the display to dim the backlight for darker blacks and brighten it for brighter whites, which creates a wider contrast ratio. There are several different dimming designs that can be used in the backlight to accomplish this.

Global Dimming: The backlight, which consists of a string of LEDs on one edge of the LCD panel, is treated as a single “zone” and is dimmed for dark scenes and brightened for bright scenes. This is the least expensive type of dimming and can be accomplished with a standard LCD panel. This approach works well for scenes with a limited dynamic range. This type of dimming is typically found on notebooks as it has the lowest power consumption of any dimming technique and generates the least amount of heat. The disadvantage of this design is that the simultaneous contrast ratio is never greater than the contrast ratio of the LCD panel, usually around 1000:1.

Local Dimming: This represents a wide variety of different sub-designs, each detailed below. What differentiates all of the local-dimming designs from global dimming is the number of zones: in global dimming the entire screen’s backlight is adjusted as one control, whereas in local dimming the screen’s backlight is split into segments that can be independently adjusted.

1D Local Dimming: This design also uses an “edge-lit” string of LEDs, but in this case groups of LEDs on the string can be independently controlled. For most displays, the string of LEDs is located at the bottom of the panel, resulting in a number of vertical zones, equally spaced across the horizontal edge of the display. An edge-lit LED string typically contains between eight and sixteen LED groups, resulting in eight to sixteen dimming zones. This design allows for simultaneous contrast ratios of 6,000:1 to 100,000:1. 1D local dimming is currently the most common design found in HDR televisions and displays.

1.5D Local Dimming: Similar to 1D local dimming, using edge lighting; however, in this design an LED lighting string exists on two sides of the panel, typically top and bottom, although left-and-right designs also exist. The advantage of this design is that it typically has 2×16 zones, twice as many as 1D. More importantly, the top and the bottom of the screen are independently controlled, versus the 1D design where each zone is typically the full vertical height of the screen.

2D or Full array local dimming (FALD): In this design the backlight LEDs are moved from the edge of the panel to the rear of the panel and are arranged in a two-dimensional matrix of LEDs. Each LED is independently controlled and adjusts the brightness of just one “square of a checkerboard” on the display, although typically they are rectangles rather than perfect squares. Today’s HDR displays and televisions typically have between 384 and 1152 zones. These designs are the most expensive, due to the complexity of the circuitry and the processing demands required. The design can also generate a large amount of heat, and often requires cooling fans and/or heat sinks to be placed behind the LCD panel to draw heat away from the display electronics. Full array local dimming produces the best image quality of all of these designs and can achieve simultaneous contrast ratios of 20,000:1 to 500,000:1. Due to the high cost of this design, these displays command the highest prices and typically cost thousands of dollars.
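
As a toy illustration of how zone dimming raises contrast, the Python sketch below drives each backlight zone to the peak level requested within it and derives the per-zone black level from the panel’s native leakage; the zone count and the 1000:1 leakage figure are illustrative assumptions:

```python
import numpy as np

def fald_backlight(frame, zones=(16, 24), leakage=1 / 1000):
    """Sketch of full-array local dimming: each backlight zone is driven
    to the peak luminance requested inside it (normalized 0-1), and the
    zone's black level is that drive level times the LCD panel's leakage
    (here 1/1000, i.e. a native 1000:1 panel)."""
    zy, zx = zones
    h, w = frame.shape
    drive = frame.reshape(zy, h // zy, zx, w // zx).max(axis=(1, 3))
    black = drive * leakage   # residual light in black pixels, per zone
    return drive, black

# A dark frame with one small bright highlight: only the highlight's zone
# is driven hard, so blacks elsewhere stay near zero.
frame = np.zeros((384, 384), dtype=np.float32)
frame[10:20, 10:20] = 1.0
drive, black = fald_backlight(frame)
print(drive.max(), black.max(), black.min())   # 1.0, 0.001, 0.0
```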

Active Dimming: This is the term VESA adopted for one of the new tests in our Certification Test Spec v1.1 (CTS v1.1), where we added a new kind of validation procedure to ensure that displays actually dim the backlight based on real-time analysis of the video content, rather than only dimming when metadata changes occur in the video stream. Typically, the HDR10 metadata does not change during a movie or game, yet each frame may have a different peak luminance than the prior frame, so the display could adjust the backlight accordingly. This yields better power saving and better HDR blacks. The new tests in CTS v1.1 present, without changing the metadata of the signal, a dramatic reduction in peak luminance from a full-white checkerboard to a checkerboard where the white boxes are only 5 cd/m2, providing ample opportunity for the dimming algorithm to reduce the backlight power. When the backlight power is reduced, the black level of the black segments of the checkerboard is also reduced, and this is what is measured and used in our calculation of active-dimming stops. (For the more technical: “stops,” originally used in photography, are a power-of-2 logarithmic function.)
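
Because stops are a power-of-2 measure, the active-dimming result reduces to a base-2 logarithm of the two measured black levels; a short Python sketch of the arithmetic, with hypothetical measurement values:

```python
import math

def active_dimming_stops(black_bright_frame: float,
                         black_dim_frame: float) -> float:
    """Black-level improvement in stops (power-of-2 steps) between the
    black squares of a full-white checkerboard and the same squares
    after the white boxes drop to 5 cd/m2 with unchanged metadata."""
    return math.log2(black_bright_frame / black_dim_frame)

# Hypothetical example: black squares measure 0.40 cd/m2 next to full-white
# boxes, and 0.05 cd/m2 once the backlight dims -> 3 stops of active dimming.
print(active_dimming_stops(0.40, 0.05))   # 3.0
```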

What about OLED, does it use a similar backlight technology?

OLED (Organic Light Emitting Diode) displays do not require a backlight because each OLED pixel generates its own light and can be considered its own zone. Because the illumination of each pixel is independently controlled – and there are over eight million pixels, or zones, in a UHD display – OLED displays typically have contrast ratios of 1,000,000:1 and can provide an excellent HDR experience. OLED displays, though, cannot currently get as bright as LCD displays. (A typical OLED cannot get much brighter than 500 cd/m², while current LCD HDR displays can achieve 1000 cd/m² and greater peak brightness.)

I’ve heard that 1000cd/m2 would be too bright and uncomfortable to use.

This is a common misunderstanding of HDR. Even with a display of 1000 cd/m2 or brighter, most content, including typical SDR applications such as email, Word, and Excel, will run at the same average brightness levels that appear on an SDR display, so it is comfortable for normal use. This standard luminance level remains adjustable by the user to allow optimization for ambient lighting and preference, as before. The greater luminance capability of HDR displays is not normally used for full-screen steady-state luminance, but rather for more realistic representations of small highlights (lights or reflections common in real-life scenes), or brief flashes of light such as an explosion within a game or movie.

As a content creator, is there anything particularly special about the DisplayHDR-1000 performance tier?

Yes, particularly for video content creation. The two most common maximum mastering luminance and Maximum Content Light Level (MaxCLL) values used for video content are 4000 and 1000. Currently, 4000 cd/m2 displays are out of range for consumers, costing well in excess of $10K. This leaves 1000 cd/m2 as the practical maximum content mastering level for consumers and many professionals. It is uncommon to master HDR video content for peak levels other than 4000 and 1000, and it would be impossible to visually grade and master content at the 1000 level on a display that can’t achieve 1000 cd/m2.

As a content creator, what is special about the DisplayHDR-1400 performance tier, and how is it differentiated versus the DisplayHDR-1000 performance tier?

As noted above, for practical purposes the 1000 spec is the minimum spec required for grading of HDR video content.

However, at the 1000 level, the full-screen performance requirement under long-duration testing is only 600 cd/m2. Thus, if editing pauses on a large flash of light that drives the display beyond the level it can sustain for longer than the 2-second flash requirement, the display may dim below the video signal being sent to it and thus no longer accurately represent that signal. For a video content editor grading the content, adjustments to increase the luminance at this point would not be shown on the screen, because a DisplayHDR-1000 display could be power-limited. At the DisplayHDR-1400 performance tier, however, the minimum requirement for long-duration display is 900 cd/m2 full screen. For practical purposes, 900 cd/m2 full screen is sufficient to display any flash of light on the screen, even when paused indefinitely while grading. This ensures that any editing changes made by the content creator are always accurately represented on the screen, even in the most demanding scenarios, such as when editing HDR video content to the MaxCLL 1000 level with a MaxFALL approaching 1000. Additionally, the DisplayHDR-1400 contrast requirements, black-level requirements, and color gamut are all improved beyond DisplayHDR-1000.

For gaming, which performance levels work best?

Unlike content creation, where 1000 is a critically important threshold performance level, for gaming the higher performance tiers are simply always better. Because games dynamically create content, unlike movies, which are pre-encoded for specific luminance ranges, games can always take advantage of better displays with higher HDR ranges. However, one distinction is worth noting: between the DisplayHDR-400 and DisplayHDR-500 performance tiers, the color gamut is greatly increased, and based on today’s technology the DisplayHDR-500 performance tier is the first tier that requires local dimming, which dramatically increases contrast and display performance. A few DisplayHDR-400 systems have also implemented local dimming; however, this is typically not the case and is not required by the spec at the 400 level. Furthermore, the DisplayHDR True Black 400 and 500 performance tiers both implement the wider color gamut, further-increased contrast levels, detailed blacks, and stunningly fast rise time for improved gaming performance.