If you followed the news out of CES closely you probably heard the word HDR tossed around a lot. This coming year we’ll see TVs for under $500 with the feature, and fancy monitors for nearly $1000. But what does HDR even mean?
HDR stands for high dynamic range. Originally the term applied exclusively to a style of still photography that compressed the difference between shadows and highlights in a photo. That made it useful for architects and real estate agents, people who wanted to show the insides of their buildings without the nasty glare of sunlight or the darkness of shadowy corners. But HDR also found fans among really bad photographers with access to Photoshop, and consequently an unattractive HDR aesthetic emerged in still photography.
That aesthetic, thankfully, can’t translate to moving pictures. But HDR in movies and TVs is still about revealing details in areas of extreme brightness and darkness.
A display accomplishes the feat by having a truly exceptional contrast ratio. The UHD Alliance, a consortium of TV makers, content creators and distributors, actually defines the peak brightness and darkness a TV needs to produce to be HDR-compliant.
Specifically, the UHD Alliance says a TV has to be able to put out 1,000 nits of peak brightness (twice as bright as a Samsung Galaxy S7 screen in sunlight) and get as dark as 0.05 nits, something a lot of LED TVs can do. Or it has to get as bright as 540 nits and as dark as 0.0005 nits, something only an OLED display is really capable of. It also must display content at a minimum 4K resolution and produce a wider color gamut than Rec. 709, the standard used for the last 30 years.
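Those two sets of numbers imply very different contrast ratios, which is the real point of the spec. A quick back-of-the-envelope calculation (using the article's figures; the tier labels here are just for illustration) shows why the OLED tier gets away with a dimmer peak:

```python
# Sanity check on the contrast ratios implied by the UHD Alliance's two
# HDR tiers. Figures come from the article; the tier names are informal.

def contrast_ratio(peak_nits, black_nits):
    """Ratio of the brightest to the darkest luminance a display can produce."""
    return peak_nits / black_nits

# LED/LCD tier: 1,000-nit peak brightness, 0.05-nit black level.
led_tier = contrast_ratio(1000, 0.05)

# OLED tier: 540-nit peak brightness, 0.0005-nit black level.
oled_tier = contrast_ratio(540, 0.0005)

print(f"LED tier:  {led_tier:,.0f}:1")   # 20,000:1
print(f"OLED tier: {oled_tier:,.0f}:1")  # 1,080,000:1
```

Even at roughly half the peak brightness, the OLED tier's near-total blacks yield a contrast ratio about 50 times higher than the LED tier's.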
Read more at Gizmodo