IMPORTANT: HDR explained by a genius ( me )

HDR means “high dynamic range” - simple, right ?

WRONG

you really think if it was simple it would have taken a genius like me so long to understand it ?

the problem is “dynamic range” or even “HDR” means different things in different contexts …

let’s start with the basics. SDR or “standard dynamic range” ( not to be confused with SD or “standard definition” ) is 8 bits. HDR is 10 bits and Dolby Vision is 12 bits.

Professional Digital Cinema Cameras used to film your movies ( such as the Arri Alexa ) have about 15 F-stops of dynamic range.

F-stops are basically equivalent to bits but refer to light levels and apply equally to analog celluloid film and digital. one F-stop is one doubling of light level ( classically set with the lens aperture ). one bit is also one doubling of level, but determined by binary math rather than optics.
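
if you like seeing the math, here is a tiny python sketch of that doubling idea ( the contrast ratios are just round illustrative numbers, not measurements ):

```python
import math

# dynamic range in F-stops (or bits) is just log2 of the contrast ratio,
# since each stop / bit is one doubling of light level
def stops(contrast_ratio: float) -> float:
    return math.log2(contrast_ratio)

print(stops(2 ** 15))    # 15.0 stops  -> a 32768:1 contrast, roughly an Arri Alexa
print(stops(1_000_000))  # ~19.9 stops -> a 1,000,000:1 contrast, roughly the adapted eye
```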

the human eye has a dynamic range of 20 F-Stops.

so to summarize.

human eye - 20 bit
arri alexa - 15 bit
dolby vision - 12 bit
HDR - 10 bit
SDR - 8 bit

also most LCD panels are natively 8 bit, even ones used in Dolby Vision capable TVs - but TVs have other ways of increasing precision, such as modulating the backlight, using patterns where some pixels are dimmed while others are brightened, or pulsing some pixels instead of keeping them always on. when implemented well these tricks can effectively get 12 bits ( or more ) of dynamic range from an 8 bit panel.
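
here is a toy model of the backlight trick ( made-up numbers, not how any specific TV does it ) - the point is just that an 8 bit panel code multiplied by a finely controlled per-zone backlight can land on far more distinct brightness values than 256:

```python
# toy model: final luminance ~ panel transmittance (8 bit) x per-zone backlight level
panel_code = 200          # 0..255, what the 8 bit panel is asked to display
backlight_level = 1500    # 0..4095, hypothetical 12 bit per-zone backlight step

luminance = (panel_code / 255) * (backlight_level / 4095)   # relative, 0.0 .. 1.0
print(f"relative luminance: {luminance:.6f}")

# upper bound on distinct output levels vs the panel alone
print(256 * 4096, "possible combinations vs", 256, "panel codes by themselves")
```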

this is one reason why an advanced backlight AND advanced processing are important - and Sony is known to be the best at both. with the Bravia 9 Sony claims 22 bits of control using their new Mini-LED backlight master drive, but of course this is marketing math - still, the point remains that the LCD panel is only a small part of the equation - the processor and backlight play their parts too.

8K TVs actually have a small advantage here, because having 4X as many pixels as you need effectively gives you 2 extra bits of dynamic range via dither processing. more dimming zones in the backlight are also better, and so are more bits of control for each dimming zone. but above all, having the right algorithm controlling it all is what makes or breaks a TV. Sony has always had the best algorithms and processing, while my Asus laptop has the worst i have ever seen on its Mini-LED screen. in fact it is probably fair to say it has none LOL.
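
the dither claim is easy to sanity-check with a toy example - spread one 10 bit value over a 2x2 block of 8 bit pixels and the block average hits levels a single 8 bit pixel cannot ( log2( 4 ) = 2 extra bits ):

```python
def dither_block(target_10bit: int) -> list[int]:
    """Spread a 0..1023 target over four 8 bit (0..255) pixels."""
    base, remainder = divmod(target_10bit, 4)
    # 'remainder' of the four pixels get bumped up by one code value
    return [base + (1 if i < remainder else 0) for i in range(4)]

block = dither_block(513)     # a 10 bit level that 8 bits alone cannot represent
print(block)                  # [129, 128, 128, 128]
print(sum(block) / 4)         # 128.25 -> i.e. the 10 bit level 513, seen from a distance
```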

anyway back to HDR …

you’re probably thinking - 8 bit ( SDR ) is rather poopy innit ?

indeed it is, VERY poopy.

the reason is that the kind of math used in SDR comes from the era of PAPER PRINTS and was very much adequate for printing out pictures of a person or pet to get framed.

but this math absolutely falls on its face when trying to display, for example, a view of the setting sun - and for exactly the same reason why you will never see a beautiful printed image of a sunset: the dynamic range of a sunset is simply too great for either a paper print or 8 bit math. in fact it easily exceeds even the 20 bits of the human eye - which is why you will go blind if you stare at the sun.
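
a quick back-of-envelope calculation ( rough ballpark figures i am assuming, not measurements ) shows why a sunset scene blows past 20 stops:

```python
import math

sun_disc_nits = 1.6e9      # rough luminance of the solar disc
deep_shadow_nits = 0.1     # a dark shadow in the same frame

print(f"{math.log2(sun_disc_nits / deep_shadow_nits):.1f} stops")   # ~33.9 stops, way past ~20
```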

my Sony Z9J has the most realistic looking sunsets i have ever seen ( outside of a real sunset ). the sun is rendered so bright that when it moves across the horizon on the screen, shadows move across the room just like with a real sun - and like the real sun it is painful to look at and you can almost feel the heat, even though that part is just psychological.

to get this kind of effect takes powerful hardware … but it also takes HDR.

early monitors were dim and office lighting is bright, so the graphics industry initially treated monitors as if they were paper, adopting a reference 100% brightness level similar to that of white paper in a well lit office or graphic design studio … i believe the standard was 100 nits ( white paper ) and monitors were capable of 250 nits max … ( the sun is over a billion nits ).

to this day by default the background on web pages is white because boomers still use 20 year old monitors on which you can actually open a white page without going blind …

but when you use a screen like my 3,500 nits Sony Z9J and you open a page with a white background you fall out of your chair, because it looks like somebody firing a photo strobe in your face … and that problem is why HDR had to be invented …

essentially HDR says that we will no longer treat monitors as paper, because paper has a fixed brightness limit while monitors do not.

currently OLEDs can hit about 1,500 nits, Mini-LED about 3,000 nits and Micro-LED 10,000 nits.

furthermore, Micro-LED is only really limited to 10,000 nits because our highest end HDR standard ( Dolby Vision ) tops out at 10,000 nits. in theory Micro-LED could go even higher, but considering the cheapest Micro-LED TV is still in the six-figure range it is too early to worry about Dolby Vision becoming obsolete.

more relevantly even the dimmest tech we have, OLED, is an order of magnitude brighter than what the original 8 bit SDR color math was designed for, which means that SDR is deader than a doornail.

yes you can still buy a monitor with 250 nits brightness but if you do you’re a Boomer. such monitors are designed for Word Documents and Excel Spreadsheets. if you watch a movie like Dune on a monitor like that you should be shot in the nuts.

all videophiles understand that there is no substitute for sheer screen size. well there is also no substitute for brightness.

anyway back to what HDR actually is …

what HDR is - is a fundamentally different way to represent brightness.

instead of encoding brightness as a percentage value of “white” it is now recorded as an absolute value in “nits”

so in the Paper / SDR days you had white paper as 100% white, 18% neutral gray stood for the average picture level ( and, by the old rule of thumb, roughly for skin tones ), some studios had their walls painted 18% neutral gray, and in fact i almost painted my room that way …

you also had 18% neutral gray cards to properly expose your camera and so on …

in modern HDR era paper is not a thing and thus neither are percentages of 100% white paper …

instead, brightness is recorded as an absolute value, not a percentage of anything. so that white paper will now be 100 nits, not 100% … and many other things will be much brighter … in fact the same white paper outdoors on a sunny day will be tens of thousands of nits.
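
for the curious, the common HDR formats ( HDR10, Dolby Vision ) store those absolute nit values using the SMPTE ST 2084 "PQ" curve, which squeezes 0 to 10,000 nits into a 10 or 12 bit code. a minimal sketch of the encode side:

```python
# PQ (SMPTE ST 2084) encode: absolute luminance in nits -> code value in 0..1
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_encode(nits: float) -> float:
    y = max(nits, 0.0) / 10_000.0        # PQ is defined up to 10,000 nits
    y_m1 = y ** M1
    return ((C1 + C2 * y_m1) / (1 + C3 * y_m1)) ** M2

for nits in (0.1, 100, 1_000, 4_000, 10_000):
    print(f"{nits:>7} nits -> 10 bit code {round(pq_encode(nits) * 1023)}")
```

notice how 100 nits ( old school paper white ) only lands around half of the code range - the upper half is reserved for brightness that SDR simply could not describe.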

so if white paper outdoors is tens of thousands of nits and the sun is over a billion nits - how can those values be recorded in a system like Dolby Vision that is limited to 10,000 nits ?

well obviously they can’t. it’s the same as how in the SDR days some things were brighter than white paper and those were obviously not reproduced properly. what is different with HDR is that we no longer put the artificial ceiling at white paper - rather, each movie is mastered at its own peak brightness UP TO the max that each HDR standard allows, so 10,000 nits for Dolby Vision for example.

in practice movies are mastered at lower levels, based both on what brightness modern TVs are capable of and on what brightness the studio mastering reference monitors are capable of. typically this is 1,000 nits, sometimes 4,000 nits. as i wrote elsewhere, 4,000 nits will soon become the new norm the way 1,000 nits has been until now, due to the popularization of Mini-LED TVs capable of higher brightness than OLEDs.

but you’re probably like - i still don’t understand what is the practical significance of this ?

the significance is that with HDR daylight scenes can be recorded as bright and night scenes as dark, whereas with SDR they all ended up at roughly the same level.

this is because when you’re working with percentages of an 8 bit maximum you really want to use the whole scale, or you’re going to run into quantization noise ( banding ).
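
a quick illustration of that quantization problem ( toy numbers i am assuming ) - if a dim night scene only uses the bottom 10% of a relative scale, look how few distinct levels are left for the entire scene:

```python
for bits in (8, 10, 12):
    codes_for_scene = int((2 ** bits) * 0.10)   # the scene occupies 10% of the scale
    print(f"{bits} bit: {codes_for_scene} distinct levels for the whole scene")
```

25 levels for an entire scene ( the 8 bit case ) is exactly where visible banding comes from.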

HDR introduces many mathematical tricks to stretch those 10 or 12 bits further, using various curves that can be defined per movie, per scene or even per frame. this allows some scenes to be very bright and others very dark without losing detail - just as it would be in real life.

HDR works like our eyes, not like paper. THAT is the big thing.

the exact details of how it is done are not important and they also vary by different HDR standards of which there are several. Dolby Vision is the gold standard of HDR though.

the point is that HDR is a set of mathematical tools that allow you to capture light as it is, in absolute values, master / color grade it in the studio ( mostly to the brightness level of the mastering reference monitor you have - 4,000 nits for the latest Sony that came out in 2023 ), encode it using standards like Dolby Vision, and then use the powerful processors in modern TVs to scale those brightness levels to what the TV can actually reproduce, using a process called “tone mapping” …

that is, instead of treating the max TV brightness as white paper and simply displaying percentage values, there is now much more complex math that tries to maintain the absolute brightness levels, as mastered, within the limits of what the TV can actually reproduce, and then remaps the values that cannot be reproduced in a way that doesn’t look obviously clipped …
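
the exact curve is every manufacturer’s secret sauce, but a toy tone mapper ( my own sketch, not any TV’s actual algorithm ) gives the flavor: reproduce nits 1:1 up to a knee point, then roll everything above it off toward the display’s peak instead of hard clipping:

```python
def tone_map(mastered_nits: float, display_peak: float = 1_500.0, knee: float = 0.75) -> float:
    knee_nits = display_peak * knee
    if mastered_nits <= knee_nits:
        return mastered_nits                       # within the TV's comfort zone: show as mastered
    # compress the excess so 4,000 nit highlights approach the peak instead of clipping
    excess = mastered_nits - knee_nits
    headroom = display_peak - knee_nits
    return knee_nits + headroom * (excess / (excess + headroom))

for nits in (100, 1_000, 2_000, 4_000):
    print(f"mastered {nits:>5} nits -> displayed {tone_map(nits):7.1f} nits")
```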

the result is simply a better looking picture. so much so that i can without any doubt say that on my 8K Dolby Vision capable TV i would rather watch a movie in 4K and HDR than in 8K and without HDR. the HDR makes a bigger difference than 8K, at least in properly mastered content.

and there is the rub - just like most content isn’t 8K also most content isn’t HDR …

so to get HDR in a setup like mine, namely

Windows 11
GeForce RTX 4080
Sony Z9J

you have to

  • enable HDR in windows
  • enable HDR in Sony HDMI input modes menu
  • make sure Nvidia drivers and Sony firmware play along ( not a given at all, took me many hours to get them to play nice )
  • use a player that supports HDR ( my former favorite player “pot player” does NOT, but both the Microsoft provided players and VLC do. i used pot player for foreign movies due to its excellent subtitle support )
  • the video file itself must be HDR ( most 4K files are, but most 1080p files are not ) - see the sketch after this list for one way to check a file
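
on that last point - one way to check a file ( a sketch assuming you have ffprobe from ffmpeg installed, and the file path is hypothetical ) is to look at its transfer curve: PQ files report smpte2084, HLG files report arib-std-b67:

```python
import subprocess

def looks_like_hdr(path: str) -> bool:
    # ask ffprobe for the transfer characteristic of the first video stream
    out = subprocess.run(
        ["ffprobe", "-v", "error", "-select_streams", "v:0",
         "-show_entries", "stream=color_transfer",
         "-of", "default=noprint_wrappers=1:nokey=1", path],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    return out in ("smpte2084", "arib-std-b67")   # PQ (HDR10 / Dolby Vision) or HLG

print(looks_like_hdr(r"D:\movies\some.2160p.movie.mkv"))   # hypothetical path
```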

now assume you did all that - how do you know you’re getting HDR ?

there is actually a bulletproof way to know. windows has a setting for SDR content brightness in its HDR settings ( in Windows 10 it was under “Windows HD Color settings” ).

the reason you need this setting is because, remember, SDR brightness values are relative but HDR values are absolute, so you need to be able to control those two brightness levels separately. or rather, when HDR is enabled, HDR content brightness is managed automatically by the TV’s processor, but SDR brightness must be set manually - basically by adjusting how bright the 100% “white paper” level is - like a dimmer switch on your office lights.

open your movie in a small window on one side of the screen, open this windows menu on the other side, and drag the slider. if the movie is in HDR, its brightness will remain constant while the rest of the screen ( such as your desktop background ), being SDR, dims or brightens as you drag the slider.

if the movie is SDR then it will also dim and brighten with the rest of the screen.

this is how you tell whether your movie is rendered as absolute values ( HDR ) or relative values ( SDR ). and believe me once you see the difference you will never want to watch SDR again.

most good new TVs can hit about double the brightness in HDR compared to SDR, and yet most HDR content looks DARKER than SDR content.

this is not a contradiction at all. SDR basically always looks over-exposed but lacking punch. HDR has rich shadow detail and punchy highlights.

watching a modern TV in SDR mode is like driving a Porsche using spare donuts instead of proper tires. the TV cannot reach its potential in SDR mode.

because SDR is represented as percentages of display white and not absolute values, everything ultimately ends up squeezed into the same narrow brightness range - looking even more compressed than 8 bits would suggest. SDR simply lacks the tools to properly map both dark scenes and bright highlights.

SDR never looks good - if you turn down the brightness it looks dim - if you turn it up it looks washed out. in HDR you don’t need to adjust anything - it just always looks good.

SDR basically always looks like something printed on a piece of paper - you can watch that paper in a dim room or in bright sunlight - but it will always look like paper - it will never look real.

HDR is opposite - it always looks real.

word of caution - if you switch windows to HDR it will usually make SDR content look WORSE

but you don’t need to restart for this switch - you just flip the toggle, the screen blinks for about 2 seconds and the mode is switched.

before watching a movie i check ( using the method previously described ) whether it is HDR or not, and if it’s SDR i switch windows to SDR to make sure i don’t make SDR look even worse than it already is.

i typically keep windows in HDR because even though 95% of all content ( such as YouTube ) is still SDR, i don’t really care about quality when it comes to something filmed with a phone that i am watching with the sun blasting through the windows.

but when it gets dark at night and i want to watch a movie shot with a $100,000 Arri camera and mastered on a $30,000 Sony Monitor that is when i begin to care about getting the best image possible, which means HDR if it is available.

and again, if HDR is not available in a given file and i still want to try to enjoy the movie, i will flip windows to SDR for the duration of the film, then put it back into HDR …

basically, viewing SDR content while windows is in HDR mode complicates the processing: the TV thinks it is receiving an HDR signal, but that HDR is actually created by windows from SDR that was arbitrarily mapped into some part of the HDR range ( whereas real HDR has to be mastered in a studio ), and this is never perfect - it ends up looking even more washed out than SDR already is.

so if you only have an SDR file and are determined to try to enjoy it anyway - put windows into SDR ( and the TV will automatically switch to SDR, even if you “enabled” HDR as the SIGNAL FORMAT for that HDMI input ).
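
if you are wondering what that “arbitrary mapping” looks like, here is a toy sketch of the general idea ( NOT windows’ actual internal pipeline ): decode the 8 bit sRGB value to linear light, pin “100% SDR white” to whatever nit level the slider picks, and the result then gets PQ encoded and sent to the TV as if it were HDR:

```python
def srgb_to_linear(code: int) -> float:
    c = code / 255.0
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def sdr_to_nits(code: int, sdr_white_nits: float = 200.0) -> float:
    # the SDR brightness slider essentially decides how many nits "SDR white" gets pinned to
    return srgb_to_linear(code) * sdr_white_nits

print(sdr_to_nits(255))   # 200.0 nits - whatever level was picked for SDR white
print(sdr_to_nits(128))   # ~43 nits for mid gray-ish
# the result is then PQ encoded ( see pq_encode above ) before it goes out over HDMI
```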

bottom line - as i said in the beginning - this is a little bit complicated. it’s something you need to actually use to understand.

the point of my write up here is merely to explain that it is WORTH THE EFFORT in getting it to work. you WILL SEE the difference.

it’s not like FLAC vs MP3 where you get a file 10 times bigger for barely any difference in sound. with HDR it’s a file that is maybe 20% bigger for a PRETTY OBVIOUS difference in picture quality.

the difference in image quality on the same TV between HDR and SDR is like the difference in picture quality between a High-End TV and a Mid-Range TV.

watching movies in SDR is like listening to music over a telephone call where your friend is calling you from a concert. it’s that bad once you try HDR. SDR tech is just woefully inadequate.

THE FUTURE IS NOW !

Videophiles with real experience with bright televisions and properly reproduced 2000+ nit HDR material will tell you that brightness is the most important part of the experience. You need a backlit LCD that can get extremely bright like your Sony to truly appreciate HDR.

Most people to whom I try to explain this don’t understand. They think OLED is better because it has better black levels and that will make for a better HDR experience. Hell, some people spend $30,000 on a projector that can only hit 100 nits on their screen and they think they are getting a good HDR experience.

If you want a mind-blowing cinematic HDR experience, get the biggest backlit LCD you can find that will hit 4000+ nits.

The best TV that money can buy is probably still the 2019 98" Sony Z9G. Looks like you can get one for $15,000. It has their backlight master drive technology with 8K resolution and 4,000 real nits of brightness. I’ve seen it in person and it is spectacular.
