
Netflix: High Dynamic Range is 'more important' than 4K

#1 ·
The following are selected quotes from The Telegraph; read more HERE.
"Forget 4K resolution, High Dynamic Range will make a bigger difference to your TV viewing experience, according to Netflix"

As anyone who is into photography will know, HDR stands for High Dynamic Range, which essentially means there is a strong contrast between the bright parts and the dark parts of the image. The higher the dynamic range of an image, the more realistic it looks.

When it comes to video, HDR is about making the bright pixels on the screen even brighter, to more accurately represent scenes from outdoors. According to Neil Hunt, chief product officer at Netflix, HDR will create a more noticeable difference to TV viewers than 4K alone.

"With 4K, there are enough pixels on the screen that your eyeball can’t really perceive any more detail, so now the quest for more realism turns into, can we put better pixels on the screen?" said Hunt. "I think that’s actually a more important quality improvement to get to the brightness and detail in the picture than the 4K is by itself."

Brightness (or luminance) is measured in 'nits'. Hunt explained that most TVs today have a peak brightness of around 100 nits, whereas the peak brightness of an HDR television is around 1,000 nits – representing a 10-fold increase in the brightness of the highlights on the screen.

A number of 4K TVs with HDR capabilities were on display at CES 2015, from the likes of Panasonic, Samsung, LG and Sony. All of these TVs were prototypes, but it is thought that the first HDR sets will go on sale later this year, and the technology will become mainstream in a couple of years.

Netflix is working with the newly-formed UHD Alliance to incorporate HDR into the 'Ultra HD' standard, along with 4K resolution and an expanded colour gamut (which means redder reds, greener greens and bluer blues). It is also working with directors and producers to shoot and capture video in 4K HDR.

Netflix said it is working on about 60 'Original Series' this year, around 10 of which are big action features that justify doing the work to make them HDR. Marco Polo, an American television drama series about Marco Polo's early years in the court of Kublai Khan, was the first to be remastered for HDR content.
Using my TV as an HTPC display, 4K UHD is just as important to me as HDR, so I hope my next TV/monitor has both, preferably with FALD.
 
#9 ·
Lol... says the company that crops many (if not most) 2.35:1 aspect ratio movies to fill a 16x9 screen. And of course forget about 1.85:1 showing properly...

I don't know how anyone who cares about movies watches Netflix. I only use Netflix for watching TV material but not even that really (since the best TV is mostly HBO).
 
#21 ·
I think you meant to say that, in your opinion, it's BS.

That ideal sitting position, I assume, is for normal 20/20 vision; who is to say you do not have 20/15 vision with glasses, or that others don't think you are the one talking BS. ;)

There is a range if you look at one of the charts; the exact line is the THX-certified one.

I would go by what they say, not Joe Six-Pack on a forum.

Joe Six-Pack sitting 12 feet away with his feet up is not seeing full 1080p on a 50" screen. Even then, LG said the most common size sold last year was 42", and in 2013 it was 32".

4K means bigger sets and sitting closer to them.
 
#30 ·
I think that HD (1080/720 resolution) was a pretty hard sell; it had been available for several years before it became truly popular. I knew both men and women who saw HD and said, "meh". I recently shared a house with a woman whom I'd catch watching the SD version of a channel that was available in HD, and who would shoo me away when I pointed it out to her. She was watching a 40" TV from over 10' away with letterboxing around a 16x9 image inside a pillarboxed 4:3 image; you'd think that she'd at least want the picture to fill up the screen.

HDR is going to be much, much more difficult to sell. If you can show people the difference then they might be impressed, but explaining the difference well enough to make them come check it out will be problematic. With "4K" TV you've at least got that "bigger/more is better" mentality on your side.
 
#34 ·
Quote:
I think that HD (1080/720 resolution) was a pretty hard sell; it had been available for several years before it became truly popular. I knew both men and women who saw HD and said, "meh". I recently shared a house with a woman whom I'd catch watching the SD version of a channel that was available in HD, and who would shoo me away when I pointed it out to her. She was watching a 40" TV from over 10' away with letterboxing around a 16x9 image inside a pillarboxed 4:3 image; you'd think that she'd at least want the picture to fill up the screen.

HDR is going to be much, much more difficult to sell. If you can show people the difference then they might be impressed, but explaining the difference well enough to make them come check it out will be problematic. With "4K" TV you've at least got that "bigger/more is better" mentality on your side.

I'm not sure about this, unless they are very dumb. They can just sell it like Dolby, by saying it gets the most brightness and contrast out of every pixel, or helps get the most out of every single pixel. Keep the message simple and crisp and the masses will get it. Make it complicated, like Darbee Vision, and the masses will yawn while the "philes" marvel.
 
#61 ·
The vast majority of that 4K content would just be upscaled before it gets streamed, so there's really no difference between that and streaming 1080p and having the set itself upscale it.

A lot of TV shows are now only being released on DVD, not even Blu-ray.

That of course misses the point that the 4K sets that certain people here are so in love with have inferior pictures. Nothing like a 4K image with blacks rendered as dark grey.
 
#33 ·
In a lit room HDR will probably do no harm, but I think it will be obnoxious for dark room viewing.

TVs today get more than bright enough for viewing in dark environments. The brighter the image gets, the more the pupil constricts - the constant dilation and constriction of the pupil causes eyestrain with some people (which is why bias lighting is often recommended).

One of the reasons I love OLED is that the max. brightness can be turned down substantially without losing picture quality, for comfortable dark room viewing.
 
#39 ·
My experience with 4K so far is limited mostly to photography, slideshows and visuals with the pre/post-production company I consult for in Stow, Massachusetts. And I must say, it makes a huge difference in those areas/applications. For everyday 1080i broadcast, upscaled HDTV viewing? Some will notice it, some won't. But straight 4K movie transfers played from disc through a 4K Blu-ray player and a similarly capable HDTV panel should yield superior results that anyone will notice, regardless of the size of the panel. I see the clear differences on monitors of 27" or less. The detail and full depth imaging is just unmatched, IMO. I do believe 4K and above, coupled with technology like OLED and QD, is a must-have premium for people who can stretch a bit to afford it. But I must admit, I am completely unexcited about 4K that is shoehorned through highly compressed pipes via internet apps like Netflix. I would guess that most people wouldn't see much of a difference, if any, between that and 1080p Blu-ray, or 1080i broadcasts for that matter, on any size panel.
 
#50 ·
One of the things that most hobbyists who shoot in 4K notice is the superior result obtained when that 4K footage is downscaled to 1920x1080. Obviously the 4K footage is far superior when viewed on a 4K monitor, but the down-sampled HD looks excellent.

Over the years, those of us who have shot in both HD and 4K have noticed that downscaled 4K-to-HD looks quite a bit better than anything we ever shot in HD. A significant part of that is because the HD cameras we used in the past simply weren't capable of resolving everything the HD format allows. With 4K acquisition equipment, it's easy to achieve full HD resolution once the footage is downscaled. For most of us, that's the first time we've seen full HD resolution.
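(Not from the post, just a rough Python/OpenCV illustration of that oversampling effect; the filenames are hypothetical.)

Code:
import cv2

# Hypothetical frame grab from a 4K (3840x2160) master.
frame_uhd = cv2.imread("frame_3840x2160.png")

# INTER_AREA averages each 2x2 block of UHD pixels into one HD pixel,
# which is why an oversampled 4K source tends to look cleaner and sharper
# at 1080p than footage captured natively in HD.
frame_hd = cv2.resize(frame_uhd, (1920, 1080), interpolation=cv2.INTER_AREA)

cv2.imwrite("frame_1920x1080.png", frame_hd)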
 
#40 ·
4K TVs will sell because they are viewed at the store from 3' away and you can see a noticeable difference at that distance. I am a fan of UHD, but do not intend to buy one until they make an affordable (to me) 80" screen so that it will make a noticeable difference at my 14' viewing distance. I currently have a 70" 1080P TV that has more than acceptable detail at that distance, although I would like to have better blacks and details in dark scenes.
HDR makes quite an impact in photography, where multiple pictures are taken at different exposure levels and then combined. This results in a photograph that comes close to the limits of what the eye can perceive, since both the darker and brighter parts of the image are more detailed at the same time, which is impossible to do with a single exposure.
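(Purely as an illustration of the bracketed-exposure merge described above; the filenames are made up, and Mertens exposure fusion is just one common way to combine brackets in Python/OpenCV.)

Code:
import cv2
import numpy as np

# Three hypothetical brackets of the same scene: under-, normal- and over-exposed.
exposures = [cv2.imread(name) for name in ("under.jpg", "normal.jpg", "over.jpg")]

# Mertens exposure fusion weights the well-exposed pixels from each bracket,
# keeping shadow detail from the bright frame and highlight detail from the dark one.
fusion = cv2.createMergeMertens()
merged = fusion.process(exposures)  # float32 image scaled roughly 0..1

# Clip and convert back to 8-bit so it can be viewed on an ordinary SDR display.
cv2.imwrite("merged.jpg", np.clip(merged * 255, 0, 255).astype("uint8"))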
I do not know how well HDR works in video by comparison, but if it is implemented well it would have a dramatic effect no matter what the resolution is. Increased detail and contrast are easily perceived. Of course, the TV/monitor has to be capable enough to show these differences. It does not require 4K to do so and still be noticed on a 50" screen at 12'-15'.
A television that can reach 1,000 nits does not have to produce that level of brightness over the entire picture. If it can also produce near-perfect blacks, imagine a scene in which a subject steps outside a home into a bright day, in which the sun is 1,000 nits, the sky is 500-700 and the shadows are at 200, all with the highest detail the resolution allows. Even at 1080p, it would be more impactful and realistic than 4K at a single exposure level. Because the entire image is not at a blindingly bright 1,000 nits, I do not believe it would be any more fatiguing to watch than doing the same in real life.
 
#43 · (Edited)
Quote:
4K TVs will sell because they are viewed at the store from 3' away and you can see a noticeable difference at that distance. I am a fan of UHD, but do not intend to buy one until they make an affordable (to me) 80" screen so that it will make a noticeable difference at my 14' viewing distance. I currently have a 70" 1080P TV that has more than acceptable detail at that distance, although I would like to have better blacks and details in dark scenes.
HDR makes quite an impact in photography, where multiple pictures are taken at different exposure levels and then combined. This results in a photograph that comes close to the limits of what the eye can perceive, since both the darker and brighter parts of the image are more detailed at the same time, which is impossible to do with a single exposure.
I do not know how well HDR works in video by comparison, but if it is implemented well it would have a dramatic effect no matter what the resolution is. Increased detail and contrast are easily perceived.
Of course, the TV/monitor has to be capable enough to show these differences. It does not require 4K to do so and still be noticed on a 50" screen at 12'-15'.

The most common misconception about HDR displays is that they have something to do with HDR capture.

They don't.

The footage used (35mm film, RED, Alexa, etc.) has no more shadow or highlight detail than we've been used to for the last 50+ years.

The only difference is that the content will be mastered with higher peak luminance. The 'container' is larger and basically allows us higher-contrast content with more gradations.

Quote:
A television that can reach 1,000 nits does not have to produce that level of brightness over the entire picture. If it can also produce near-perfect blacks, imagine a scene in which a subject steps outside a home into a bright day, in which the sun is 1,000 nits, the sky is 500-700 and the shadows are at 200, all with the highest detail the resolution allows. Even at 1080p, it would be more impactful and realistic than 4K at a single exposure level. Because the entire image is not at a blindingly bright 1,000 nits, I do not believe it would be any more fatiguing to watch than doing the same in real life.
A television that can do 1,000 nits can't produce that level of brightness over the full screen. It achieves it with power management: by dimming zone X it can brighten zone Y. I'm not sure what the minimum area for the peak-luminance spec is, but it could be a single zone. The new Samsung, I think, has 300 zones, so that's not a very large area. I think the first 'HDR' displays will have a more usable range of 0-700 nits. I don't know if we will ever see 4,000-nit displays that make the most of Dolby Vision, but they always find new ways to make LEDs more efficient, so you never know.

In the same respect, a display calibrated to 100 nits today doesn't produce that level of brightness over the whole picture, unless it's displaying a pure white screen of course. The scene you describe is ridiculously bright: the shadow detail would be double the brightness most people calibrate their peak white to, and more than twice as bright as a cinema screen ever gets!

Basically, anything over 90 nits is highlight territory. Imagine a gradient from left to right on your screen: 0 nits (black) on the left running all the way through to 100 nits (white) on the right. Now imagine it continuing all the way to 1,000. That's the extra headroom that HDR promises.

As you can imagine from that scale, our eyes aren't very good at noticing gradations in highlights. Black level is much more important for HDR. We can easily discern a step between 0.01 and 1 nit.

Good luck spotting the difference between 700 and 701 nits. :)
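(Not from the post, but a rough sketch of the SMPTE ST 2084 'PQ' curve used for HDR mastering makes the same point: mapping nits to 10-bit code values, full range here and purely for illustration, the codes are packed near black and spread thin in the highlights.)

Code:
# Rough sketch of the SMPTE ST 2084 (PQ) inverse EOTF: absolute luminance in
# nits -> normalised signal -> 10-bit code value (full range, illustration only).
def pq_code(nits):
    m1, m2 = 2610.0 / 16384, 2523.0 / 4096 * 128
    c1, c2, c3 = 3424.0 / 4096, 2413.0 / 4096 * 32, 2392.0 / 4096 * 32
    y = (nits / 10000.0) ** m1
    signal = ((c1 + c2 * y) / (1 + c3 * y)) ** m2
    return round(signal * 1023)

for nits in (0.01, 1, 100, 700, 701, 1000):
    print(nits, "nits ->", pq_code(nits))
# 0.01 nit and 1 nit are over a hundred code values apart,
# while 700 and 701 nits land on the same code value.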

The concern with fatigue is that if that 100-1000nit range is overused or is applied in rapidly cut scenes it could easily become uncomfortable.

But used properly, the extra headroom could allow smoother gradations in highlights and even be used as a creative tool to make certain scenes more immersive.

My feeling is that the first HDR masters will overuse it. Then, once it's part of the production from the concept stage, it could be used in more creative ways that really benefit the experience.

Think of it like LFE. You wouldn't want your sub rattling the windows constantly; it would be annoying, distracting and hurt your ears (maybe the foundations, looking at some home theatres!) after 90 minutes. But in the right place and moment it can be used as a tool to bring a scene to life.
 
#60 ·
I think Netflix will win the race to HDR later this year, possibly with some content mastered in HDR as soon as the first HDR-capable sets are available to buy. It sounds like Marco Polo has been mastered in HDR already, so it's ready to go.

A couple of details I'm not sure about... I think they are going with the SMPTE variety as opposed to DV, and I'm not sure who they are hooked up with in terms of display manufacturers (Sony, Samsung, LG?), or whether there's any kind of exclusivity period. It's still a little unclear which display supports what/who at the moment. Unless someone can enlighten me?

Then when UHD BD hits the market (I hear Christmas 2015), it will be possible to deliver HDR on disc.

I'm sure the studios will somehow milk two remasters out of legacy movies. First a 4K remaster. And then the 4K+ HDR Ultra remaster. ;)
 
#44 ·
I've read the Alexa does 15 stops of exposure range, which is roughly a 33,000:1 contrast ratio. I'm sure that varies with ISO setting, etc., but still.
I believe that's greater than the in-scene contrast range offered by LCD TVs today. Certainly the trend towards edge-lit backlighting doesn't help. A JVC LCOS projector can manage that kind of range, though.

Our eyes have an instantaneous dynamic range (analogous to not using a dynamic iris in a projector) or contrast ratio of either close to 1,000,000:1 for dark-adapted (not realistic in a movie) or 10,000-16,000:1 for brighter content.

The RED Dragon sensor is supposed to do 18 stops? That finally gets closer to what our eye can achieve. I suspect display presentation tech will lag behind that for a while.
 
#45 ·
Yeah, those are the manufacturer claims! I think the f-stop number war started with RED.

When it comes to usable image (not just measurable noise, thanks!) we still work within a 9-10 stop range. HDRx on RED can save you a little highlight headroom, but in real-world use that maybe gets us 11 stops to work with.

I thought most estimates of our eyes' instantaneous dynamic range (where the pupil opening is unchanged) are between 10 and 14 stops. I'm not sure how those numbers correlate to your contrast ratios, mind!
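(For what it's worth, the conversion is simple if you take each stop as a doubling of light; the figures below are just the estimates quoted above, worked through in Python.)

Code:
from math import log2

# One photographic stop is a doubling of light, so contrast ratio = 2 ** stops.
def stops_to_ratio(stops):
    return 2 ** stops

def ratio_to_stops(ratio):
    return log2(ratio)

print(stops_to_ratio(15))       # 32768  -> "15 stops" is roughly 33,000:1
print(ratio_to_stops(10000))    # ~13.3  -> 10,000:1 is about 13 stops
print(ratio_to_stops(16000))    # ~14.0  -> 16,000:1 is about 14 stops
print(ratio_to_stops(1000000))  # ~19.9  -> the 1,000,000:1 dark-adapted figure is about 20 stops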
 
#49 ·
That's the electronics age early adopter conundrum. Are you an early adopter? Do you pay a premium for latest tech?

One way that makes it more palatable is intra-household trickle-down.
Buy a best-bang-for-the-buck TV now if you need it. When HDR (or whatever tech du jour) becomes affordable for you (and the kinks are worked out), you buy that and bequeath the older TV to a less critical room: the formal living room, a daughter's room, etc.
 
#51 ·
So that's the 4K-for-production topic, but to be clear, it has nothing to do with 4K displays for consumers. Likewise 4K for monitors viewed from two feet away.

It's about Netflix saying that, given finite resources, HDR will be enjoyed more by their customers, and can be implemented more easily to give them ROI sooner.
 
#56 ·
When doing graphic design, there are a lot of times when the "objectively correct" color used elsewhere just doesn't look right. I have to fight part of my brain and say, "all that matters is the interpretation: if it looks bad, it's bad, Photoshop color picker notwithstanding."
 
#58 ·
The irony here is that many of the arguments made for HDR here were arguments for plasma TV before it went the way of the dinosaur. My 1080p VT60 looks far more lifelike than any 4k LCD I've seen.


But I'm glad that Captain Obvious has found a steady position with Netflix just the same.