SDR V HDR on projectors


Recommended Posts

 

I don't mean to rain on anyone's parade, but you can set up the projector to make SDR video look just like the "great HDR image", and set up older JVCs to look the same as the new models as well, for both SDR and HDR.

The new models are not brighter and don't go any blacker, and everything in between is adjustable via the projector's internal adjustments and/or external equipment. So if you know what you are doing you can have whatever "look" you like with any projector, and make any good projector look like any other.

The advantage of the "new" models is more pleasing remapping of HDR video for use on what is an SDR display device, a task previous models didn't do very well by default. They needed a range of custom gamma maps that could be called up to best suit each individual movie title, and the new models still will. The new automated system will no doubt be an advance for the average consumer who doesn't want to play around with gamma maps or doesn't know how, but it will always be a compromise.

If you want your SDR movies to look like your HDR movies you will still need to implement custom gamma maps for SDR. JVC could have automated this just like the auto HDR mapping but chose not to, probably because it would show people that HDR on a projector is nothing more than tweaked SDR, and that's not good for marketing.

 




On 21/02/2019 at 2:47 PM, Owen said:

 

I don't mean to rain on anyone's parade, but you can set up the projector to make SDR video look just like the "great HDR image".

 

No, you cannot. Unless you do it manually on a film-by-film basis, and even then it won't work for every shot in the film.

 

Stop with the bullshit; it actually shows you don't know what the current state of HDR tone mapping is right now.

 

Show me a dynamically adapting gamma algorithm for SDR Blu-ray that can do what MadVR does right now on a shot-by-shot basis with UHD HDR-mastered films.

 

I'll wait.

 

MadVR tone mapping is currently updating its gamma maps with HDR sources THOUSANDS of times per film, literally frame by frame.

 

 

Edited by Javs

Obviously there will be differences on some scenes, even if gamma is adjusted on a film-by-film basis; they are different masters after all. However, unless one does a side-by-side comparison the viewer will almost always be unaware of them. Even when I do a direct comparison and find individual scenes that do look "different", 9 times out of 10 I prefer the look of tweaked SDR over so-called "HDR" on a projector. The director/producer can do a FAR better job of optimising the image for projector brightness (SDR) than any tone mapping algorithm in the display chain ever could.

 

HDR is mastered for VERY bright TVs, and no matter what we do the display of HDR video on a projector is a compromise and gives inconsistent results from scene to scene. If we introduce dynamic display gamma into the mix the situation gets WORSE, as an algorithm is now deciding what we see, not the director/producer; it's remastering on the fly and is inherently inaccurate and inconsistent.

 

SDR is mastered for the sort of brightness projectors actually deliver. SDR video is therefore consistent, and EVERY scene in EVERY movie can be properly reproduced by a projector to convey the director's/producer's mastering intent, so a global approach to gamma adjustment to suit an individual display or viewer preference not only works but is ideal; the LAST thing we should want is any type of dynamic behaviour. A dynamic iris is bad enough and a compromise in and of itself; add in dynamic gamma that adjusts the picture on a scene-by-scene or frame-by-frame basis and it's a bloody mess and fundamentally inaccurate.

 

Would we want an audio system that adjusts its frequency response profile on the fly depending on what music is being played? 


6 hours ago, SandS said:

So what about Dolby Vision, will it be an improvement on this range of PJs? No difference with my old X500, I take it.

Dolby Vision is for tellys, not home projectors, as consumer Dolby Vision is targeted at much brighter tellys. It's not to be confused with cinematic Dolby Vision, which is aimed at roughly 100 nit (about 30 fL) commercial theatres, i.e. the same 100 nits or ~30 fL we target for HDR, versus the 12-16 fL of SDR in both commercial and home theatres.
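For anyone juggling the two units in that post: nits (cd/m²) and foot-lamberts convert by a fixed factor (1 fL is approximately 3.426 cd/m²), so 100 nits is roughly 29 fL, commonly rounded to 30. A quick sketch:

```python
# Conversion between nits (cd/m^2) and foot-lamberts.
# 1 fL is approximately 3.426 cd/m^2.
NITS_PER_FOOTLAMBERT = 3.426

def nits_to_fl(nits: float) -> float:
    """Convert luminance in nits to foot-lamberts."""
    return nits / NITS_PER_FOOTLAMBERT

def fl_to_nits(fl: float) -> float:
    """Convert luminance in foot-lamberts to nits."""
    return fl * NITS_PER_FOOTLAMBERT

print(round(nits_to_fl(100), 1))  # 29.2 - the "30 fL" HDR target above
print(round(fl_to_nits(14), 1))   # 48.0 - a mid-range SDR target of 14 fL
```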




22 hours ago, Owen said:

Obviously there will be differences on some scenes, even if gamma is adjusted on a film-by-film basis; they are different masters after all. However, unless one does a side-by-side comparison the viewer will almost always be unaware of them. Even when I do a direct comparison and find individual scenes that do look "different", 9 times out of 10 I prefer the look of tweaked SDR over so-called "HDR" on a projector. The director/producer can do a FAR better job of optimising the image for projector brightness (SDR) than any tone mapping algorithm in the display chain ever could.

 

HDR is mastered for VERY bright TVs, and no matter what we do the display of HDR video on a projector is a compromise and gives inconsistent results from scene to scene. If we introduce dynamic display gamma into the mix the situation gets WORSE, as an algorithm is now deciding what we see, not the director/producer; it's remastering on the fly and is inherently inaccurate and inconsistent.

 

SDR is mastered for the sort of brightness projectors actually deliver. SDR video is therefore consistent, and EVERY scene in EVERY movie can be properly reproduced by a projector to convey the director's/producer's mastering intent, so a global approach to gamma adjustment to suit an individual display or viewer preference not only works but is ideal; the LAST thing we should want is any type of dynamic behaviour. A dynamic iris is bad enough and a compromise in and of itself; add in dynamic gamma that adjusts the picture on a scene-by-scene or frame-by-frame basis and it's a bloody mess and fundamentally inaccurate.

 

Would we want an audio system that adjusts its frequency response profile on the fly depending on what music is being played? 

I don't disagree with some of those points, for sure.

 

SDR Blu-ray looks rubbish at 100 nits in a light-treated room on a big screen, because the mid tones are searing bright; it looks FAR better at 50. When you have good tone-mapped HDR, everything under 100 nits (encoded) will be at approximately the same light levels as you get in SDR in your preferred setup (at least, that's how you should aim to do it), but you have an order of magnitude more headroom for highlights. You can get up to 150 nits without terrible hassle from projectors these days. I personally use 85 in low lamp, because I can tell you as a matter of fact that the image I get with tone-mapped HDR now is significantly better than anything an SDR-encoded Blu-ray can do. The time for static tone mapping is over. Once you have played with dynamic tone mapping in depth for months like I have (I have been right there in the MadVR development threads testing every single beta and giving feedback), it's not even funny how much better the image is dynamically. And I am talking shot to shot, scene to scene. From super dark shots, such as the 6 nit (HDR encoded) scene from The Revenant, right through to super bright, it adapts to all of it.

 

In SDR you get a relatively flat (by comparison) image across the board, and it's consistently so. It looks pretty good at 50 nits peak, but it can't touch what can be done now if you directly compare them, which I have done at incredible length over the past couple of years on every display I have in the house.

 

HDR, unless on an OLED, looks rubbish at anything approaching 1000 real nits too. I have four HDR-capable displays in my house (Samsung / Hisense / Panasonic OLED / JVC). The OLED is by far the best and I can run that full on, but it's tiny; the JVC is a VERY, VERY close second. The Hisense does 700 nits, the Samsung 1400 peak, and they both look rubbish due to the LCD black floor, as expected, so on both I actually tone down the HDR output to probably 250/300 nits.

 

What the latest MadVR tone mapping is doing is remarkable; dim displays are no longer an issue whatsoever. You simply enter your real peak light output into the settings, and any time there is a scene with peak nits under your real peak nits, it will be tone mapped 1:1. Then anything over your peak nits will start to scale, and it uses pretty in-depth analysis of the scene to do this, including not just how bright the pixels are, but what percentage of the pixels are at what brightness. A 6 nit encoded scene, which is spectacularly dim, will really be 6 nits on your display with this, and you will be viewing it at exactly the same brightness as on a display which can do 10,000 nits.
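That behaviour (pass-through below the display's capability, roll-off above) can be sketched as a toy curve. To be clear, this is only an illustration of the idea, not MadVR's actual algorithm; the knee position and roll-off exponent here are made-up numbers:

```python
def tone_map(nits: float, display_peak: float, knee: float = 0.75) -> float:
    """Toy tone-mapping curve: encoded luminance below a knee point passes
    through 1:1, and everything above is compressed so that the 10,000 nit
    PQ ceiling lands exactly at the display's real peak."""
    knee_nits = knee * display_peak
    if nits <= knee_nits:
        return nits  # 1:1 region: a 6 nit shot really displays at 6 nits
    headroom = display_peak - knee_nits   # output range left above the knee
    excess = nits - knee_nits             # input luminance above the knee
    max_excess = 10000.0 - knee_nits      # worst-case input excess
    return knee_nits + headroom * (excess / max_excess) ** 0.5

print(tone_map(6.0, 85.0))      # 6.0 - dark scenes are untouched
print(tone_map(10000.0, 85.0))  # 85.0 - the PQ ceiling maps to display peak
```

A real dynamic mapper would additionally measure each frame's actual peak (and brightness histogram) and adapt the curve per shot, rather than assuming a 10,000 nit worst case.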

 

Here is the SDR Blu-ray of that shot from The Revenant.

 

Tp8Q735.png

 

Then the HDR statically tone-mapped version, with a 250 nit target entered. This is probably something like a 3:1 ratio.

 

KCaIPfK.png

 

Then the Dynamic Tone Mapped version:

 

8cU4t9m.jpg

 

Lucy

 

SDR Blu-ray

 

qd0Ku0w.png

 

HDR Tone Mapped

 

e1UbxHv.png

 

Before this, we had to settle on a static tone mapping ratio of something like 4:1 with a curve on it; go much lower than that and the whole image started to suffer. But that meant 6 nits was projected at 1.5 nits, which is completely unacceptable. That was the cost of making the brightest scenes pop a bit more rather than compressing highlights to hell.
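The arithmetic behind that, ignoring the curve component, is just a fixed division, which is why dark scenes came out so dim. A trivial sketch:

```python
# A 4:1 static compression ratio: every encoded luminance is divided by 4,
# so the 6 nit scene mentioned above lands at an unwatchable 1.5 nits.
# (Real static maps also applied a curve; this shows only the ratio part.)
STATIC_RATIO = 4.0

def static_map(encoded_nits: float) -> float:
    return encoded_nits / STATIC_RATIO

print(static_map(6.0))     # 1.5
print(static_map(1000.0))  # 250.0 - bright scenes fit, dark ones suffer
```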

 

Every single display on the market is tone mapping. None of them are displaying 1:1 encoded pixels across the entire EOTF range. So you just need to understand that this is what HDR is now. It's actually very powerful when done right; it seems like it's taken a couple of years to get there, but there are finally huge strides.

 

I encourage you to get up to date with the latest builds and play with it; you will probably need to go with one of the more stable ones (build 46 is a good one).


On 28/02/2019 at 1:00 PM, Javs said:

SDR Blu-ray looks rubbish at 100 nits in a light-treated room on a big screen, because the mid tones are searing bright; it looks FAR better at 50.

If you use standard HDR gamma that's true, but that's NOT what I was talking about. Gamma for SDR must be changed so that shadows and mid tones are no brighter than with HDR, and since peak white level and black level are also the same as HDR displayed on the same projector, the resulting SDR picture looks just like the HDR picture. I can get them so close that I can't tell them apart.
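The sort of gamma change being described can be seen with a simple power-law display model (the 2.6 exponent below is just a hypothetical "darker" value, not a recommendation): raising the exponent pulls shadows and mid tones down while peak white stays put.

```python
def display_light(signal: float, gamma: float, peak_nits: float) -> float:
    """Simple power-law display model: light out = peak * signal^gamma."""
    return peak_nits * signal ** gamma

# The same 50% input signal, at two different display gammas:
print(round(display_light(0.5, 2.2, 100), 1))  # 21.8 nits at standard 2.2
print(round(display_light(0.5, 2.6, 100), 1))  # 16.5 nits at a darker gamma
print(display_light(1.0, 2.6, 100))            # 100.0 - peak white unchanged
```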

No matter what "look" I dial in for HDR I can get the same "look" out of SDR. Is every scene the same? No, it's not, but unless there is a side-by-side comparison no one would know, and as I said SDR is FAR more consistent and accurate; it needs no dynamic alteration of ANY kind.

 

On 28/02/2019 at 1:00 PM, Javs said:

The time for static tone mapping is over.

Not if you want picture accuracy. I would much rather have the movie director/producer dictate what I see than an algorithm that is remastering the movie on the fly. To each their own.

 

On 28/02/2019 at 1:00 PM, Javs said:

In SDR you get a relatively flat (by comparison) image across the board, and it's consistently so. It looks pretty good at 50 nits peak, but it can't touch what can be done now if you directly compare them, which I have done at incredible length over the past couple of years on every display I have in the house.

HDR video is VERY, VERY flat and bland with undersaturated colour compared to SDR video when both are displayed with the same display gamma. What gives HDR the apparent contrast and "pop" that people tend to like is the VERY exaggerated gamma applied by the display for HDR. SDR video needs comparatively very little gamma adjustment to get the same apparent contrast and "pop".

 

On 28/02/2019 at 1:00 PM, Javs said:

Here is the SDR Blu-ray of that shot from The Revenant.

Regarding the pictures of The Revenant, I actually prefer the look of the SDR version you posted; the dynamically adjusted HDR version looks harsh and artificial to me. It's a personal preference thing, and I suspect that the exaggerated high-contrast look that you, and many others, find attractive is not my thing. It's more reminiscent of a TV image, and I really don't like the way TVs display movies.

 

As for the Lucy images, it's dead easy to get the SDR version to look just like that so-called "HDR" image, and I have done so. I would prefer more shadow detail, which the SDR version is providing in your example, and that can be achieved without sacrificing the highlights, which will be the same brightness with SDR as HDR when gamma is adjusted appropriately and the projector is run at the same brightness.

 

On 28/02/2019 at 1:00 PM, Javs said:

Before this, we had to settle on a static tone mapping ratio of something like 4:1 with a curve on it; go much lower than that and the whole image started to suffer. But that meant 6 nits was projected at 1.5 nits, which is completely unacceptable.

SDR-mastered content doesn't have that problem, so there is no need for any dynamic correction. We see what the director/producer intended us to see, which can't happen with HDR-mastered content.

 

On 28/02/2019 at 1:00 PM, Javs said:

Every single display on the market is tone mapping. None of them are displaying 1:1 encoded pixels across the entire EOTF range.

True HDR displays don't use dynamically adjusted tone mapping, for good reason: they don't need it.

Dynamic mapping is an imperfect solution to a problem that should not exist, and doesn't with SDR.

 

On 28/02/2019 at 1:00 PM, Javs said:

I encourage you to get up to date with the latest builds and play with it; you will probably need to go with one of the more stable ones (build 46 is a good one).

I'll download build 46 to test out of curiosity, but I have stopped bothering with HDR movies due to HDR's inherent unsuitability for projectors, which are all currently SDR. Maybe when/if hybrid log gamma gets worked out and implemented on Blu-ray the situation might change.


12 hours ago, Owen said:

If you use standard HDR gamma that's true, but that's NOT what I was talking about. Gamma for SDR must be changed so that shadows and mid tones are no brighter than with HDR, and since peak white level and black level are also the same as HDR displayed on the same projector, the resulting SDR picture looks just like the HDR picture. I can get them so close that I can't tell them apart.

 

You are forgetting everything in the middle! They should look very similar, yes, but if you spent any considerable time looking at a good HDR image rather than trying to spend every moment shooting it down, you might see what all the fuss is about.

 

Quote

HDR video is VERY, VERY flat and bland with undersaturated colour compared to SDR video when both are displayed with the same display gamma. What gives HDR the apparent contrast and "pop" that people tend to like is the VERY exaggerated gamma applied by the display for HDR. SDR video needs comparatively very little gamma adjustment to get the same apparent contrast and "pop".

 

You are not supposed to view HDR in its raw state; it's essentially a log gamma if you look at it like that. Films are shot in such a way; that's the raw data as it sits, but in grading they apply look-up tables, and this is VERY similar. If you say it's very flat, it's because you are viewing it with the WRONG gamma. You should be using PQ and at least following some kind of EOTF curve, preferably BT.2390. I know you are knowledgeable, but you are talking about how HDR works all wrong. You also know that, I know you do, so I'm not sure why you are saying this.
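For reference, the PQ curve mentioned here is the SMPTE ST 2084 EOTF. A minimal sketch of it, using the constants from the spec, converting a normalised code value into absolute luminance:

```python
# SMPTE ST 2084 (PQ) EOTF: normalised code value E in [0, 1] -> nits.
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_eotf(E: float) -> float:
    Ep = E ** (1 / M2)
    num = max(Ep - C1, 0.0)
    den = C2 - C3 * Ep
    return 10000.0 * (num / den) ** (1 / M1)

print(pq_eotf(0.0))         # 0.0 nits
print(round(pq_eotf(0.5)))  # ~92 nits: half the code range is still dim
print(round(pq_eotf(1.0)))  # 10000 nits: the PQ ceiling
```

This is why HDR video "looks flat" when pushed through a 2.2 power gamma: the PQ signal allocates most of its code range to low luminance and has to be decoded with this curve instead.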

 

Quote

True HDR displays don't use dynamically adjusted tone mapping, for good reason: they don't need it. Dynamic mapping is an imperfect solution to a problem that should not exist, and doesn't with SDR.

 

Yes they do. All of them still need to tone map above the display's native peak brightness; it's even been measured what some displays do, and it's absolutely tone mapping, even on 800 nit OLEDs. Minimal, sure, but still happening.

 

Also, dynamic tone mapping when it's working properly should be indistinguishable from Dolby Vision, which is the apex of HDR at this time; the metadata within tells the display exactly what to do, and this is really no different.

 

Quote

I'll download build 46 to test out of curiosity, but I have stopped bothering with HDR movies due to HDR's inherent unsuitability for projectors, which are all currently SDR. Maybe when/if hybrid log gamma gets worked out and implemented on Blu-ray the situation might change.

 

You may be surprised; you can pretty easily tweak in that extra shadow detail you like so much. Hybrid log gamma is not going to happen for films.


17 hours ago, Javs said:

You are forgetting everything in the middle! They should look very similar, yes, but if you spent any considerable time looking at a good HDR image rather than trying to spend every moment shooting it down, you might see what all the fuss is about.

Come on mate, I'm not forgetting anything. I can dial in whatever look I like for both SDR and HDR which is exactly why I don't see what the fuss is about. No matter what gamma setup is used for HDR video I can achieve the same visual results with SDR video.

 

 

17 hours ago, Javs said:

You are not supposed to view HDR in its raw state; it's essentially a log gamma if you look at it like that

Again, COME ON MAN, I never said it should be. When we view SDR and HDR video with standard SDR 2.2 gamma, for example, just for giggles, it's easy to see that HDR video looks VERY, VERY flat and uncolourful compared to SDR video, and needs a wild gamma curve just to get an image with as much apparent contrast and colour saturation as SDR video. The gamma correction required to get SDR looking right at whatever peak output one desires on a projector is FAR, FAR less extreme than that required for HDR video, AND SDR video is much more consistent because it was mastered for projector-like peak output light levels; HDR video is NOT, so it often looks wrong.

Dynamic gamma derived from an algorithm at the display, as opposed to mastering intent applied in the studio, simply introduces another layer of inaccuracy.

 

17 hours ago, Javs said:

Yes they do. All of them still need to tone map above the display's native peak brightness; it's even been measured what some displays do, and it's absolutely tone mapping, even on 800 nit OLEDs. Minimal, sure, but still happening.

If the display can achieve the peak output level the video was mastered for, there is no need for ANY "tone mapping", and to use it would be inaccurate and not display the director's/producer's intent.

 

17 hours ago, Javs said:

Also, dynamic tone mapping when it's working properly should be indistinguishable from Dolby Vision, which is the apex of HDR at this time; the metadata within tells the display exactly what to do, and this is really no different.

How can an algorithm at the display end know what the director's/producer's mastering intent was, especially when the display is incapable of doing what is required? It CAN'T.

With Dolby Vision the display does what it's told to do, using information from the video production team embedded in the video. Obviously if the display can't do what the mastering requires the results will never be as intended, and projectors are not even remotely close.

 

SDR video suffers NONE of these issues because it's mastered for the sort of light output levels projectors actually deliver. Set the peak white level you want, adjust gamma to suit and get the "look" you like, and you are good to go. No need to fiddle, and absolutely ZERO need for any type of dynamic "enhancement".

 

17 hours ago, Javs said:

You may be surprised; you can pretty easily tweak in that extra shadow detail you like so much.

Not surprised, it's a given. I can get whatever "look" I like with ANY video. It's a real shame more projector owners don't understand what is possible and how to achieve it.

Projector manufacturers could easily include a preset to make SDR video look just like HDR video at no cost to the consumer, and one has to wonder why they don't. However, it would expose the inconvenient reality that "HDR" is not all it's cracked up to be on a projector, and that's not good for marketing.

Projectors are SDR display devices. I've said this MANY times but it's not sinking in. No matter what video you run on them you are looking at Standard Dynamic Range on screen, or close enough. The only difference between the default presentation of SDR and HDR on a projector is gamma and default peak light output, and we can have complete control over both if we want.

SDR video is mastered for projector-like light output levels, and it works just fine up to 300 nits or more with appropriately corrected gamma. No domestic projector is going to give you anywhere near that.

HDR is on a different planet; it's designed for 1000 nits plus. When displayed on a projector it MUST be "tone mapped" back to an SDR-like response to get a usable picture, BUT since the mastering is for 1000 nits plus it will NEVER look as intended.

SDR mastering is MUCH more appropriate for projectors and is going to be for the foreseeable future; it's about time projector manufacturers got on board, as it would benefit 99.9% of content released to date. Only slight tweaking is required to get an "HDR" look out of SDR video displayed at the same peak light output as "HDR" on a projector, AND the result will be more consistent and more accurate to the director's intent. Using any sort of display-side "dynamic" gamma with HDR video, just to get a result that is similar to but less accurate than SDR video, doesn't make any sense to me.

 

Edited by Owen

Not surprised, it's a given. I can get whatever "look" I like with ANY video. It's a real shame more projector owners don't understand what is possible and how to achieve it.
Projector manufacturers could easily include a preset to make SDR video look just like HDR video at no cost to the consumer, and one has to wonder why they don't. However, it would expose the inconvenient reality that "HDR" is not all it's cracked up to be on a projector, and that's not good for marketing.
Projectors are SDR display devices. I've said this MANY times but it's not sinking in. No matter what video you run on them you are looking at Standard Dynamic Range on screen, or close enough. The only difference between the default presentation of SDR and HDR on a projector is gamma and default peak light output, and we can have complete control over both if we want.
SDR video is mastered for projector-like light output levels, and it works just fine up to 300 nits or more with appropriately corrected gamma. No domestic projector is going to give you anywhere near that.
HDR is on a different planet; it's designed for 1000 nits plus. When displayed on a projector it MUST be "tone mapped" back to an SDR-like response to get a usable picture, BUT since the mastering is for 1000 nits plus it will NEVER look as intended.
SDR mastering is MUCH more appropriate for projectors and is going to be for the foreseeable future; it's about time projector manufacturers got on board, as it would benefit 99.9% of content released to date. Only slight tweaking is required to get an "HDR" look out of SDR video displayed at the same peak light output as "HDR" on a projector, AND the result will be more consistent and more accurate to the director's intent. Using any sort of display-side "dynamic" gamma with HDR video, just to get a result that is similar to but less accurate than SDR video, doesn't make any sense to me.
 


Interesting observations. Have you thought about posting the same on AVS?



Well, at least I learnt about HDR and Dolby Vision and projectors.

But I would like to know people's views on native 4K vs e-shift.

What is the minimum spec on an HDMI lead 10m long for real 4K?


26 minutes ago, SandS said:

Well, at least I learnt about HDR and Dolby Vision and projectors.

But I would like to know people's views on native 4K vs e-shift.

What is the minimum spec on an HDMI lead 10m long for real 4K?

It's really simple re cables: there is no 10m Premium Certified cable that is guaranteed to run 4K UHD at full bandwidth, so you either go for 30ft (9-something metres) or go for the 10m Comsol. That's what I have been using for a couple of years now with NO ISSUES at full-bandwidth 4K UHD. It does depend on your equipment combo whether it works or not, and there are plenty of ways to test: the Apple TV and Xbox One have cable tests, and the Oppo main menu is a good test too. That's the good thing, it will either work or it won't; if it doesn't you just return it, and the cost is affordable at $65 from Officeworks. Otherwise you are looking at a 10m fibre cable at hundreds of dollars, and it still won't be Premium Certified, though it will likely work.

 

Re 4K vs e-shift: like the cable thing, it's covered in other threads in this sub-forum, one even dedicated to the discussion. The reality is e-shift is capable of somewhere in the region of 3-3.5K, and many will suggest that's about all you're typically going to get off UHD anyway. It also comes down to screen size and viewing distance, and whether you're even going to resolve the difference. Anyway, all this has been covered many a time in this very sub-forum across a few threads. Many of us have moved on from this, I guess, and are just enjoying the benefits of UHD for the last few years.


Oh bugger, I thought those posts were the most interesting on that JVC thread. After all, most posts were just reposts from other forums... copyright? Haha.

 

Seems there are very few buyers around anyway. Oh well, we should be able to take the good with the bad on a forum thread.

 

Come on Owen, come on Javs. It's all spirited fun in the end.

 

All this SDR vs HDR stuff combined with all the 4K stuff just bamboozles mere mortals and plays into the hands of retailers/manufacturers. I mean, human eyes and ears don't get upgrades every year.

Edited by hopefullguy

2 minutes ago, hopefullguy said:

Oh bugger, I thought those posts were the most interesting on that JVC thread. After all, most posts were just reposts from other forums... copyright? Haha.

Seems there are very few buyers around anyway. Oh well, we should be able to take the good with the bad on a forum thread.

Come on Owen, come on Javs. It's all spirited fun in the end.

All this SDR vs HDR stuff combined with all the 4K stuff just bamboozles mere mortals and plays into the hands of retailers/manufacturers. I mean, human eyes and ears don't get upgrades every year.

HDR is here to stay, which is why you won't see me go on about how stupid it is and how it will never work unless you have 1000 nit displays. That's a completely glass-half-empty way of thinking, and it's very tiring to read here ad nauseam; it's a running theme with the person perpetuating those points of view!

 

I think dynamically tone-mapped HDR looks considerably better than SDR as a whole. I have been watching it long enough on enough displays to come to that conclusion. I have tested the tone mapping on most UHD players: done Samsung, done Oppo, done Panasonic (not the 820), done custom curves (there are literally hundreds of people running my custom curves on JVCs). I've even got 4 UHD HDR displays in my house right now! They all treat HDR slightly differently, and dynamically tone-mapped HDR has the potential to be a game changer.

 

Sorry, but HDR works even on displays which are brightness-challenged to a certain degree. I run 85 nits right now in HDR and I could run 125, but I don't need to. I know if I had the choice I would choose the HDR version of any given film, so long as they did not muck up the mastering. As time goes on, these films are getting better and better. The Marvel stuff is exceptional, Infinity War especially a stand-out.

 

The glass is most definitely half full over here!

 

:)




Any thoughts on the "plug and play" dynamic tone mapping solutions that are emerging? I.e. the latest Lumagen Radiance Pro beta firmware and the recently announced MadVR Envy.

 

Although I get that dynamic tone mapping already exists (and is "free") when using MadVR on an appropriately specced HTPC, the thought of ripping 4K Blu-rays and setting up MadVR on an HTPC is off-putting. I'd rather just feed a UHD signal in via HDMI and output a dynamically tone-mapped signal to my projector.


48 minutes ago, Davo1972 said:

Although I get that dynamic tone mapping already exists (and is "free") when using MadVR on an appropriately specced HTPC, the thought of ripping 4K Blu-rays and setting up MadVR on an HTPC is off-putting. I'd rather just feed a UHD signal in via HDMI and output a dynamically tone-mapped signal to my projector.

You don't have to rip them. Just have the appropriate UHD drive and decrypting software (e.g. AnyDVD HD) and then play them with a supporting player (e.g. JRiver Media Center).

Edited by Satanica

Would someone mind describing, or providing a direct link for, how to set up an SDR projector for HDR? The MadVR doom9 thread is pretty long!

 

I have a Sony HW45 and use JRiver with MadVR. Is it feasible to actually output HDR using this setup?


On 04/03/2019 at 6:56 AM, dandlj said:

Interesting observations. Have you thought about posting the same on AVS?

I no longer waste my time reading or posting on AVS; it's been 10 years or more since I was active there. It's a sea of rubbish posts with little useful information to be found, so I just don't bother. I've got better things to do with my time, mate.




On 04/03/2019 at 2:22 PM, Javs said:

HDR is here to stay

It is, and so is the fact that domestic projectors are SDR display devices, and that's all they can display no matter what video you run into them.

This isn't a bad thing, because on a big screen in a dark room there is no need for the stupidly high peak output of true HDR, as you freely admit.

 

On 04/03/2019 at 2:22 PM, Javs said:

I think dynamic tone mapped HDR looks considerably better than SDR as a whole.

Dynamic tone mapping is a form of distortion; it's not accurate to the source. You may like what it does most of the time, but that's a personal preference thing.

 

I draw your attention to the images you posted, a dark scene from The Revenant and a night scene from Lucy. We are viewing them in SDR on our PC screens and yet the "differences" can easily be seen; that's because the only real difference between them is gamma, and we can adjust gamma to make the image look however we want it to look. There is nothing "HDR" about any of those images viewed on a PC monitor. They are all SDR, just as they would be when viewed via a projector.

With projectors we can run SDR at exactly the same peak brightness as HDR, so the dynamic range of the on-screen image is the same for both if we want it to be; we can then tone map (adjust gamma) to get the "look" we want, whatever that may be, and end up with virtually identical results. The scene-to-scene differences will come down to mastering, but all else being equal the SDR-mastered video displayed at SDR-like brightness (as with a projector) will be more accurate to the director's intent than a 1000-4000 nit HDR master displayed at the same SDR-like brightness level. Add dynamic tone mapping into the mix for HDR video and accuracy goes out the window; an algorithm will dictate what you see, not the director/producer.

 

On 04/03/2019 at 2:22 PM, Javs said:

Sorry, but HDR works even on displays which are brightness challenged to a certain degree

That's because tone mapping has converted the image into something usable on an SDR display; it's no longer High Dynamic Range on screen, it's SDR. The display dictates the "dynamic range" of the on-screen image, not the video.

Depending on the mastering and the tone mapping used at the display end, HDR video may or may not look the same as the SDR version of the movie on a particular display, but it is supposed to be very close; if it's obviously different, there is a problem.

It's odd to me that people will put a lot of effort into the tone mapping of HDR video to get the look they like, but won't do the same for SDR video.

 

 


15 hours ago, BradC said:

Is it feasible to actually output HDR using this setup?

Yes, HDR video can be displayed on any projector; it doesn't need to be 4K/HDR compatible or have HDMI 2.x ports either.

It's the display that sets the dynamic range of the picture, NOT the video; it's dictated by the display's contrast ratio and brightness.

It makes no difference what video you use: you will only get a standard dynamic range image on screen no matter which domestic projector you have. Some have higher contrast than others and some are brighter, BUT they are ALL SDR ONLY.
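The arithmetic behind that claim is simple: on-screen dynamic range in stops is fixed by the display's black level and peak output, regardless of what's on the disc. A quick sketch with hypothetical projector numbers:

```python
import math

def dynamic_range_stops(peak_nits, black_nits):
    """On-screen dynamic range in photographic stops (doublings of light)."""
    return math.log2(peak_nits / black_nits)

# Hypothetical projector: 100 nit peak, native on/off contrast of 40,000:1
peak = 100.0
black = peak / 40000  # 0.0025 nits
print(round(dynamic_range_stops(peak, black), 1))  # ~15.3 stops
```

Feed it SDR or HDR video and that number doesn't change; that's the sense in which the display, not the video, sets the dynamic range.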


The images were tone mapped for 120 nits at 2.2 gamma in order to post here for a reason. That's why you can view them. When you do that in a light-controlled room, the tone mapped images look better without a doubt. Also comical that you say the Revenant shot looked harsh.

I am otherwise not really interested in arguing with a brick wall mate. So my last comment on it is below.

It's a shame you are not on AVS. Funny that you say it's a waste of time; is that because there are a LOT of genuine industry professionals on there whom you don't wish to engage with? There are not many posters on here with the far-reaching knowledge to challenge you, and it's tiring for me to try. Maybe you don't post on there because they would set your narrow views on the subject straight pretty quickly, maybe not. Don't you like a good discussion? Ten years is an eternity. There are lots of people there who would have the patience to sustain a longer conversation, and people who could articulate other points, even calibration professionals (whom you would no doubt just claim don't know what they are talking about), but you have been repeating the same few points over and over for the past couple of years now, so I am pretty tired of it myself.

You clearly haven't paid much attention to the UHD format or done much in-depth real-world testing with it, despite some fleeting occasions by the sounds of it. I am pretty confident in that, so I don't see how your limited testing invalidates the entire idea of HDR and projection just because you don't have a 1000 nit display. Especially when SDR is mastered for 100 nits but people have been watching it just fine at 40 or 50 nits for years; it's no different here really.

I would say perhaps read up more on how the EOTF works, and on calibration in relation to it. Above 100 nits (virtual) on the EOTF all bets are off in terms of how you want to address the rolloff, granted. But you are only going to come back saying the same things over and over.
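For anyone reading along: HDR10 is encoded with the SMPTE ST 2084 "PQ" EOTF, and the constants below come straight from that spec. A minimal Python decode shows why the rolloff matters so much on a 100 nit projector:

```python
# SMPTE ST 2084 (PQ) EOTF constants, as published in the spec
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_eotf(signal):
    """Decode a normalised PQ code value [0..1] to absolute luminance in nits."""
    p = signal ** (1 / M2)
    return 10000.0 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)

print(round(pq_eotf(1.0)))    # 10000 -- full-scale PQ
print(round(pq_eotf(0.508)))  # ~100 -- code values above ~0.51 exceed a 100 nit projector's reach
```

Everything the grade places above your real peak has to be rolled off somehow, and the spec doesn't dictate how; that's exactly where the tone-mapping choices live.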

Stop focusing on the dictionary definition of SDR and HDR for a moment and focus on the actual format we have and its results; then you will get somewhere. You have been arguing semantics this whole time.

And around the world we go.

Anyone else interested in HDR for projection, I can assure you the uptick in video experience is NOT subtle! It has gone from generally unwatchable on projectors two years ago to frankly stunning today, and most definitely clearly superior (in REAL WORLD VIEWING) to the SDR versions.


3 hours ago, Javs said:

The images were tone mapped for 120 nits at 2.2 gamma in order to post here for a reason. Thats why you can see them.

We can view those images at 50 nits or 150 nits and the relative differences will remain the same. The images will just be dimmer or brighter, and with gamma adjustment they can all be equalised.

 

3 hours ago, Javs said:

when you do that in a light controlled room, the tone mapped images look better without a doubt.

Tone mapping (gamma mapping) can provide whatever "look" one desires for both SDR and HDR source video, but which map is accurate? With HDR it's a guess.

When using an SDR display, any look that is mastered into HDR video can be mastered into SDR video; the projector is the limitation, not the video.

As for dynamic tone mapping applied at the display end: no matter how much you may like what it does to a particular scene, it's not accurate to the source. No dynamic system is, including a dynamic iris. As I said before, it's like playing with the tone controls of your stereo as the music plays; you might like what you hear, but it's not what the music producers intended you to hear. I dislike dynamic behaviour in a display system, but to each their own. I tolerate a dynamic iris for some movies because black levels without it are ordinary.

3 hours ago, Javs said:

Its a shame you are not on AVS. Funny you said it's a waste of time, is that because there are a LOT of genuine industry professionals on there whom you don't wish to engage with??

The knowledgeable people don't post very often, and I can't be bothered reading dozens if not hundreds of posts to find something of interest. I typically only go to AVS when I am searching for something specific, or to check out something that is being discussed elsewhere.

 

I have over two decades of experience manipulating image gamma, so I'm fully aware of what can and cannot be done and why; I don't need to read about it.

 

When it comes to SDR displays (projectors), any "look" that can be applied to the image in HDR mastering can also be applied in SDR mastering; the display is the limiting factor, and it defines image dynamic range, not the video.

 

3 hours ago, Javs said:

You clearly haven't paid much attention to the UHD format or done much in depth real world testing with it despite some fleeting occasions by the sounds of it

But I have. I can make HDR look like SDR, and SDR look like any tone-mapped HDR, or anything in between. Just set the same peak white level for both, which gives the same "dynamic range", and adjust gamma and colour to suit. After that, any "differences" are academic.

I prefer the consistency of SDR video over HDR; it doesn't need any dynamic crap because it was mastered appropriately for a projector in the first place. Not perfect, but close, unless the mastering was stuffed up, and obviously we have no control over that.

I have alternative gamma maps I can call up to compensate for dodgy mastering, and I can modify maps on the fly very quickly to fine tune, but I don't need to do that very often as most SDR movies are pretty right with a fixed SDR-to-HDR "look" conversion map.

 

When comparing SDR video to HDR video on a projector, it's imperative that the same peak white level be used for both to get the same dynamic range, AND that the gamma of SDR be adjusted to get the same "look" as whatever tone map you are using for HDR video. If you don't do that, what are you actually comparing? This is a task that MUST be done by eye and it can be very time consuming to get right, but once done you are good to go for most titles.

 

Dynamic tone mapping is cheating, as it re-masters the video on the fly; it's not the job of a replay system to do that. If the director-producer mastered in a very dark look with subtle dim highlights for a particular scene, that's the way it should look on screen. A dynamic tone mapping system will make such a scene brighter and/or boost the highlights when they should not be boosted, and in turn this will affect the relative contrast between scenes. It also affects how a dynamic iris system behaves, which may be visually good or bad depending on the scene. To get an accurate picture, the dynamic iris and ALL other dynamic systems must be disabled.
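To illustrate what "re-mastering on the fly" means, here's a toy Python comparison. It's deliberately simplified (real dynamic tone mappers are far more sophisticated), but the scene-to-scene effect is the same in kind:

```python
import numpy as np

def static_map(frame_nits, display_peak=100.0, master_peak=1000.0):
    """One fixed curve for the whole film: relative scene brightness is preserved."""
    return frame_nits * (display_peak / master_peak)

def dynamic_map(frame_nits, display_peak=100.0):
    """Per-shot scaling by each shot's OWN peak, like a crude dynamic tone mapper."""
    return frame_nits * (display_peak / frame_nits.max())

dim_scene    = np.array([0.5, 2.0, 10.0])     # deliberately dark grade, 10 nit highlights
bright_scene = np.array([5.0, 50.0, 1000.0])  # full-range scene

# Static: the dim scene stays dim relative to the bright one (1 vs 100 nit peaks).
print(static_map(dim_scene).max(), static_map(bright_scene).max())
# Dynamic: both scenes now peak at 100 nits -- the graded 100:1 contrast
# between the two scenes has been rewritten by the algorithm.
print(dynamic_map(dim_scene).max(), dynamic_map(bright_scene).max())
```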

 

4 hours ago, Javs said:

Especially when SDR is mastered for 100 nits but people have been watching it just fine at 40 or 50 nits for years. Its no different here really.

40 to 50 nits is fine for SDR if you use a standard SDR gamma, BUT that's not what I am proposing. Displaying SDR dimmer than HDR is a choice, not a necessity; the projector will happily display both at exactly the same brightness. When we do that, standard SDR gamma is no longer appropriate as the average picture level will be too high, so we adjust gamma to pull down the mid-tones, which makes the highlights look comparatively brighter and brings the average picture level down to where it was for SDR at 40 to 50 nits. This is what so-called HDR is actually doing on a projector, so we are achieving the same overall result.

50 to 100 nits is only a one-stop change in brightness, and that's not much. 100 to 1000 nits is about 3.3 stops, which is a LOT more.
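The stop arithmetic is just base-2 logarithms of the brightness ratios:

```python
import math

def stops(from_nits, to_nits):
    """Brightness change expressed in stops (doublings of light)."""
    return math.log2(to_nits / from_nits)

print(round(stops(50, 100), 2))    # 1.0  -- one stop
print(round(stops(100, 1000), 2))  # 3.32 -- over three stops
```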

 

4 hours ago, Javs said:

Stop focusing on the dictionary definition of SDR and HDR for a moment and focus on the actual format we have and it's results, and then you will get somewhere, you have been arguing semantics this whole time.

SDR can be displayed at the same brightness as HDR on a projector and deliver exactly the same "dynamic range", and once gamma is equalised the resulting picture looks near identical.

On a projector, HDR is effectively SDR with tweaked gamma, not HDR at all. I'm fine with that, and I don't see how "true" HDR would ever be appropriate on a big screen in a dark room. 100 nits with appropriate gamma is plenty, and that's Standard Dynamic Range, like it or not.

4 hours ago, Javs said:

Anyone else interested in HDR for projection I can assure you all the uptick in video experience is NOT subtle! It has gone from generally unwatchable on projectors 2 years ago to frankly stunning today and most definitely clearly superior (in REAL WORLD VIEWING) to the SDR versions.

When both are viewed at the same brightness and with equalised gamma? How so, when the picture looks the same?



