JVC 4K E-shift & UHD?



On 24/04/2018 at 10:15 PM, Javs said:

We follow an EOTF for HDR and there are guidelines; you just use a multiplier for HDR gamma, but you are not actually departing from it otherwise. It's not a mere stab in the dark, far from it.

Which do you think is more likely to be accurate: video mastered for 100 nits and displayed at around 100 nits with standard gamma, or video mastered for 1000 nits with its gamma remapped to get an acceptable image when displayed at 100 nits or less?

 

On 24/04/2018 at 10:15 PM, Javs said:

If you add an s-curve to an SDR power gamma you are.

No s-curve is required for SDR video content. We would only apply an s-curve if we were trying to make SDR content resemble the deliberately and artificially spiced-up look of HDR, which CANNOT be accurate on a projector.

People may like the exaggerated contrast look, but if that's the way the studio thought the image should look when viewed at around 100 nits they would have mastered the 1080 Bluray to look like that in the first place. The fact that they didn't is VERY telling IMHO.

If HDR is remapped properly it should look just like SDR when displayed at around 100 nits. Even if we set out to adjust HDR gamma to match the SDR version of the movie, we can't achieve an accurate result, because the HDR "effect" can be applied differently from scene to scene, making it impossible to accurately reverse.

 

On 24/04/2018 at 10:15 PM, Javs said:

Also, the JVCs only lose about 7% light due to the filter.

I'll take your word on the 7%, but that's very low according to the measurements of others, and as you already pointed out the Z1 loses about 40%.

The point is, wide-gamut colours are VERY bright and require MUCH more light output, yet enabling wide-gamut mode on the projector reduces light output. That's a fail, mate.

 

On 24/04/2018 at 10:15 PM, Javs said:

You should really stop being such an HDR detractor.

I'm a realist: THERE IS NO SUCH THING AS HDR ON A PROJECTOR, and HDR-encoded video is NOT at all ideal for projector use. We need video that is mastered for the sort of light output projectors typically provide, NOT 10 to 20 times higher.




I think we have examples of HDR being created through remapping of SDR material in the form of UHD discs that are simply upscaled 2K SDR versions. There is no HDR to start with, so it must be created from SDR.

Edited by Tasso

There is no such thing as "HDR" material. The dynamic range of movies has always been limited by the camera and film; there is no special "HDR" camera.

Cinema camera and film manufacturers have always strived to provide the widest dynamic range possible, and the same goes for still cameras. The full dynamic range of these cameras has always been used to create both HDR and SDR versions of movies.

The only difference between the HDR and SDR versions is the gamma applied to the video (gamma is the mapping of video levels between black and white) and the target brightness of the display device. SDR is mastered with a gamma suitable for a display running at about 100 nits, and HDR for a display running 1000 nits plus. Obviously, if we use video mastered for 1000 nits plus and display it at 100 nits or less we won't get the desired result. What we will get is a contrast-exaggerated presentation that many people may like, but it is not and can never be accurate.
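
To make that concrete, here is a minimal sketch of the two curves being argued about, assuming a plain 2.4 power gamma as a stand-in for SDR decoding on a 100-nit display and the published SMPTE ST 2084 (PQ) curve for HDR; it just shows how differently the same signal level maps to nits:

```python
# A minimal sketch of the two transfer functions in question (not from any
# particular projector): SDR approximated here as a plain 2.4 power gamma on a
# 100-nit display, HDR as the published SMPTE ST 2084 (PQ) EOTF, which is an
# absolute curve defined up to 10,000 nits.
def sdr_nits(signal, peak=100.0, gamma=2.4):
    return peak * signal ** gamma

def pq_nits(signal):
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    e = signal ** (1 / m2)
    return 10000.0 * (max(e - c1, 0.0) / (c2 - c3 * e)) ** (1 / m1)

for s in (0.25, 0.50, 0.75, 1.00):
    print(f"signal {s:.2f}: SDR {sdr_nits(s):6.1f} nits, PQ {pq_nits(s):7.1f} nits")
# e.g. a 75% signal is ~50 nits in SDR but ~1000 nits in PQ - same video levels,
# very different target brightness.
```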

Changing gamma has no effect on "dynamic range"; dynamic range is totally dictated by the brightness and contrast ratio of the display. If we run a projector brighter, the black level increases in proportion, so dynamic range stays the same; we sacrifice black level for brighter whites. SDR video can be displayed at exactly the same brightness as HDR and will have the same black level on a projector, and therefore identical dynamic range. What separates SDR from HDR presentation on a projector is simply display gamma, and I question how there can be two different representations of the same content that are both "accurate".

 

Since 1080 Bluray content is mastered for 100 nits, which is very close to what projector owners will be using, and 4K HDR Bluray is mastered for 1000-plus nits, way, way higher than any projector is capable of, it's not rocket science to work out which should provide the more accurate presentation on a projector screen.

 

If HDR content was mastered properly and consistently it would be possible to remap its gamma so that the on-screen image was indistinguishable from the SDR version when viewed at 100 nits, which IS SDR. Typically that's not the case, because projector manufacturers feel the need to give HDR content a deliberately different presentation, and the studios want to make the HDR versions of movies look deliberately "different" to encourage consumers to buy the "new" HDR version. It's a scam, because the exact same "look" could have been given to the 1080 SDR Bluray version if desired.

 

With current movies there is no excuse to give the SDR and HDR disc versions a different look; they were mastered at the same time. When viewed at the same brightness they should look identical, and if they don't there is a problem, either with the mastering of the content or with the display's presentation of it.

 

Edited by Owen

^ Like I said... an HDR Detractor. :)

 

A lot of what you are saying is not entirely the case. If you had spent much time with HDR (clearly you haven't) you might sing a different tune.

 

Some films have completely different grades in HDR on purpose; The Revenant is one, where both versions were done independently of each other with the director present.

 

Also, you yourself admitted, if I recall, that you don't watch SDR at 100 nits? Then why don't you allow HDR grades the same treatment? You also need to realise that HDR gamma is not linear in scale like SDR is; it's more logarithmic, and the content creators know this, with most of the content existing at about 100 nits in any case.

 

Using the above example: with my JVC, if I watch a film in SDR at -12 on the iris, which is what I do, then the dynamic iris system (the key point here), if called for, will shut the iris down to the smallest possible point, giving me a black level of, let's say, 0.0002 nits measured, for example's sake. Now, if I watch the same film on the UHD HDR disc, I will probably be at -0 on the iris to hit 100 nits, and if the same scene calls for it, the iris will again close down to its maximum position, to 0.0002 nits again (I've tested this). So, yes, in fact, I have considerably increased the dynamic range possible. Rather than having 45 nits or so at -12, a black level of 0.0002 nits, and a possible contrast ratio of 225,000:1, I will now have more than double that, at 500,000:1. Since I can't stand to watch SDR at 100 nits, I completely disagree with you.
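
To spell out where those contrast figures come from (using the example numbers above, nothing more):

```python
# Peak white divided by the dynamic-iris black floor, using the example
# figures quoted above (the 0.0002 nit black is an example measurement).
di_black = 0.0002   # nits, with the dynamic iris driven fully closed

sdr_peak = 45       # nits, SDR at iris -12
hdr_peak = 100      # nits, HDR at iris -0

print(f"SDR at -12: {sdr_peak / di_black:,.0f}:1")   # 225,000:1
print(f"HDR at -0 : {hdr_peak / di_black:,.0f}:1")   # 500,000:1
```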

 

Case in point. Look at a couple of waveforms of HDR content:

 

Here is how to read the waveform: MiW8LkI.jpg

Blade Runner 2049 waveforms: msQHw5S.png, 7m30CId.png, KHbXKKE.png, kPPgy2N.png

 

That film barely goes over 100 nits. Now, that same approximate level on the waveform and grade in the SDR version of that film, if I were watching that version at -12, would likely sit at around 20 nits for a peak white of 45 nits. What we do with HDR curves is set that same reference white level at 20 real nits, but allow 20-100 nits for the specular highlights, whereas before we would have only allowed 20-45 nits for the exact same information. That is the part that is logarithmic. In fact, with a proper HDR curve the bottom 0-20 nits should be essentially identical to what you see in SDR for the most part (Lucy is a good film where this is true), but you now have this extended range on tap for a better rendition of highlights.
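
If it helps, here is a toy illustration of that split around reference white (the numbers are the ones above; the straight-line segments are a simplification of the actual curve, which rolls off more gently):

```python
# Toy mapping only: diffuse/reference white lands at ~20 real nits either way;
# what changes is how much display headroom the above-reference range gets
# (roughly 20-45 nits in SDR at -12 vs 20-100 nits with an HDR curve).
def displayed_nits(graded_nits, headroom_peak, ref_graded=100.0, ref_display=20.0,
                   grade_peak=1000.0):
    if graded_nits <= ref_graded:
        return graded_nits * ref_display / ref_graded       # bottom end matches SDR
    frac = (graded_nits - ref_graded) / (grade_peak - ref_graded)
    return ref_display + frac * (headroom_peak - ref_display)

for level in (50, 100, 500, 1000):                          # graded nits
    sdr = displayed_nits(level, headroom_peak=45)           # SDR-style headroom
    hdr = displayed_nits(level, headroom_peak=100)          # HDR-curve headroom
    print(f"graded {level:>4} nits -> SDR {sdr:5.1f} nits, HDR curve {hdr:5.1f} nits")
```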

 

Blade Runner: The Final Cut is a film which looks spectacular in HDR; it literally has never, ever looked better. This is one which makes excellent use of these highlights. Here the neon signs go up to near 1000 nits in LOG, but using HDR curves on the JVC that information sits at roughly 20 to 100 nits, which gives the image a much more realistic appearance; they genuinely look very bright against the rest of the image:

 

PuEbRfy.png

 

 

Edited by Javs

4 hours ago, Javs said:

Using the above example: with my JVC, if I watch a film in SDR at -12 on the iris, which is what I do, then the dynamic iris system (the key point here), if called for, will shut the iris down to the smallest possible point, giving me a black level of, let's say, 0.0002 nits measured, for example's sake. Now, if I watch the same film on the UHD HDR disc, I will probably be at -0 on the iris to hit 100 nits, and if the same scene calls for it, the iris will again close down to its maximum position, to 0.0002 nits again (I've tested this). So, yes, in fact, I have considerably increased the dynamic range possible.

 

Yes, you will get the same blackest of black with the dynamic iris at 0 or -12 or whatever, but, if I understand correctly, this will only be for scenes which are completely black or virtually completely black (i.e. space scenes showing space and stars). For all other scenes, and I'm talking the absolute vast majority, the dynamic iris won't close down and your black levels will be severely compromised viewing them.

Edited by Satanica



8 hours ago, Satanica said:

Yes, you will get the same blackest of black with the dynamic iris at 0 or -12 or whatever, but, if I understand correctly, this will only be for scenes which are completely black or virtually completely black (i.e. space scenes showing space and stars). For all other scenes, and I'm talking the absolute vast majority, the dynamic iris won't close down and your black levels will be severely compromised viewing them.

If the iris doesn't begin to close down you are now talking about a range greater than 4-5% APL, at which point the difference in contrast ratios is much smaller than you realise. Your statement is absolutely correct when the DI is not involved; when it is, it changes everything. I also use Auto 1 in HDR, which is more aggressive and acts sooner.

 

The other interesting thing is that when you close down the iris you actually lose ANSI contrast at the same time (you go from about 300:1 down to 220:1 at 50% APL). In fact, the JVCs have higher ANSI contrast with the iris fully open vs closed, so past a certain point there is actually more contrast in the brighter scenes at a fully open iris; the DI then takes care of the darker stuff.

Edited by Javs

31 minutes ago, Javs said:

If the iris doesn't begin to close down you are now talking about a range greater than 4-5% APL, at which point the difference in contrast ratios is much smaller than you realise. Your statement is absolutely correct when the DI is not involved; when it is, it changes everything. I also use Auto 1 in HDR, which is more aggressive and acts sooner.

 

The other interesting thing is that when you close down the iris you actually lose ANSI contrast at the same time (you go from about 300:1 down to 220:1 at 50% APL). In fact, the JVCs have higher ANSI contrast with the iris fully open vs closed, so past a certain point there is actually more contrast in the brighter scenes at a fully open iris; the DI then takes care of the darker stuff.

Auto 1, if I remember reading correctly from the AVS Forum, also changes the image in dark areas; yes, it actually processes the image and highlights dark areas.

Auto 2 makes virtually no changes.

I can't imagine using my JVC with the iris at 0; it looks eye-burningly terrible.

ANSI contrast is far less important IMO than black levels.

Edited by Satanica

1 minute ago, Satanica said:

Auto 1, if I remember reading correctly from the AVS Forum, also changes the image in dark areas; yes, it actually processes the image and highlights dark areas.

Auto 2 makes virtually no changes.

I can't imagine using my JVC with the iris at 0; it looks eye-burningly terrible.

ANSI contrast is far less important IMO than black levels.

Auto 1 in HDR is different to how it acts in SDR because of the shape of the gamma. I am well aware of the threads on AVS; I am hugely active over there. :)

 

Yes, iris on 0 with an SDR grade and gamma is terrible, because your diffuse white level (not to be confused with peak white level) will likely be up near 60-70 nits, which is far too much. That's precisely what I was saying in the above conversation, which it seems you are not understanding. With an HDR grade and a suitable EOTF it will mostly look identical to the image you have in SDR at -10, since the diffuse white level will be matched, until there are highlights on screen graded for HDR; then that extra punch comes into play, such as the Blade Runner shot above.

 

Which JVC have you got? Are you HDR capable? Are you running custom curves?

 

I have three HDR displays in my house at this very moment: a 1200-nit UHD TV, a 600-nit UHD TV and my ~120-nit JVC... guess which display looks far better than the others in HDR?

 

Quote

ANSI contrast is far less important IMO than black levels.

You might need to read my posts again.


4 minutes ago, Javs said:

Auto 1 in HDR is different to how it acts in SDR because of the shape of the gamma. I am well aware of the threads on AVS; I am hugely active over there. :)

 

Yes, iris on 0 with an SDR grade and gamma is terrible, because your diffuse white level (not to be confused with peak white level) will likely be up near 60-70 nits, which is far too much. That's precisely what I was saying in the above conversation, which it seems you are not understanding. With an HDR grade and a suitable EOTF it will mostly look identical to the image you have in SDR at -10, since the diffuse white level will be matched, until there are highlights on screen graded for HDR; then that extra punch comes into play, such as the Blade Runner shot above.

 

Which JVC have you got? Are you HDR capable? Are you running custom curves?

 

I have three HDR displays in my house at this very moment: a 1200-nit UHD TV, a 600-nit UHD TV and my ~120-nit JVC... guess which display looks far better than the others in HDR?

 

You might need to read my posts again.

-10 in SDR looks terrible too. ;)

The biggest increase in light output occurs between -14 and -13.

I have an X7000.

Edited by Satanica

48 minutes ago, Satanica said:

-10 in SDR looks terrible too. ;)

The biggest increase in light output occurs between -14 and -13.

I have an X7000.

What size and gain screen are you running?

 

I have an X9500, which hits 100k:1 at -10. I am running -12 right now, though, for ~45 nits.

 

An X7000; I had one of those. So have you played around with HDR curves, or only with Gamma D?




1 minute ago, Javs said:

What size and gain screen are you running?

 

I have an X9500, which hits 100k:1 at -10. I am running -12 right now, though, for ~45 nits.

 

An X7000; I had one of those. So have you played around with HDR curves, or only with Gamma D?

110" and 1.0.

 

No I haven't tried HDR. Please don't tell me I don't have a right to post about something I haven't experimented with.

 

Other than for virtually all black scenes (when the dynamic iris shuts right down) it appears you're getting inferior black levels with HDR compared to SDR. Correct or incorrect?


57 minutes ago, Satanica said:

No I haven't tried HDR. Please don't tell me I don't have a right to post about something I haven't experimented with.

Well, it's the truth: you have no experience with HDR on projectors (Gamma D is completely broken, so that doesn't count either), and it seems neither does Owen; you seem to 'like' every one of his rants about it. As such, neither of you is qualified to make any trustworthy judgements on it. It's as simple as that, like it or not, sorry.
 

Quote

Other than for virtually all black scenes (when the dynamic iris shuts right down) it appears you're getting inferior black levels with HDR compared to SDR. Correct or incorrect?

 

It's not as absolute as that; I don't/wouldn't watch HDR with the iris non-functional, since I value the black-level performance of the machine. Before the HDFury Linker I used to just convert HDR to SDR using the Panasonic UB900. Now I have control of the iris. The iris does not only go fully closed/fully open; there is a range of circumstances where it works in between. So if I must answer: you are incorrect, there is no compromise to the black levels.

 

The JVC has a ~10x contrast multiplier with the DI when the iris is at -0, and about ~2x when the iris is at -10. So there is more contrast range possible, from the brightest whites to the darkest blacks, when viewing content at -0 with the DI on; do you agree? These are not observations, they are facts. What I am telling you is that at 4 and 5% APL the contrast has already declined a lot (from about 120k:1 down to about 3,000:1 at -10), so any marginal difference at 5% APL vs the iris at -0 is completely irrelevant; it might be something like 2,500 vs 3,000.
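
A back-of-envelope version of that multiplier argument, using the on/off figures quoted elsewhere in this thread (treat them as illustrative, not fresh measurements):

```python
# Native on/off contrast multiplied by how much further the DI can close from
# that manual iris position. 44k:1 at -0 and 100k:1 at -10 are the X9500
# figures mentioned in this thread; 10x and 2x are the multipliers above.
def effective_on_off(native_contrast, di_multiplier):
    return native_contrast * di_multiplier

print(f"iris -0,  DI on: {effective_on_off(44_000, 10):,}:1")   # 440,000:1
print(f"iris -10, DI on: {effective_on_off(100_000, 2):,}:1")   # 200,000:1
```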

 

Here are some tests I did a long time ago using % grey cards measured against 0% black with my X7000, which my X9500 would beat handily now (I get 44k:1 at -0 now, same throw). I don't have the numbers for 5% since I only just found that file today.

 

DI Off.

 

Iris -0

 

0% - 22,750:1
1% - 22,377:1
2% - 10,073:1
3% - 4,105:1
4% - 2,363:1

 

Iris -15

 

0% - 112,307:1
1% - 73,000:1
2% - 20,277:1
3% - 7,643:1
4% - 3,518:1

 

Of course the -15 wins here with no DI... what about when you are using the DI?

 

I just ran a test right now with 5% and 1% cards to see what the Auto 2 DI does when it's on at -15 and -0.

 

Iris -0, DI Off, 5%: ehEBjov.jpg
Iris -15, DI Off, 5%: fubGEav.jpg
Iris -0, DI On, 5%: dGQFsP7.jpg
Iris -15, DI On, 5%: mLOpxsu.jpg
Iris -0, DI Off, 1%: PBTgBdG.jpg
Iris -15, DI Off, 1%: iVyDGg5.jpg
Iris -0, DI On, 1%: QPLV3xR.jpg
Iris -15, DI On, 1%: otLMvfz.jpg

 

At 5% and 1% the DI closes down the exact same amount in both cases. Interesting, huh?

Edited by Javs

4 minutes ago, Javs said:

Well, it's the truth: you have no experience with HDR on projectors (Gamma D is completely broken, so that doesn't count either), and it seems neither does Owen; you seem to 'like' every one of his rants about it. As such, neither of you is qualified to make any trustworthy judgements on it. It's as simple as that, like it or not, sorry.

If you think I have no right to make a judgement, then please stop quoting my posts and put me on your ignore list. I'm happy to do the same, if you'd like?


13 minutes ago, Javs said:

The JVC has a ~10x contrast multiplier with the DI when the iris is at -0, and about ~2x when the iris is at -10. So, there is more contrast range possible from the brightest whites to the darkest blacks available when viewing content at -0 and the DI on, do you agree? These are not observations, they are facts.

Yes I do agree. Do you agree that black levels have been raised/compromised though?

Edited by Satanica

10 minutes ago, Javs said:

Here are some tests I did a long time ago using % grey cards measured against 0% black with my X7000, which my X9500 would beat handily now (I get 44k:1 at -0 now, same throw). I don't have the numbers for 5% since I only just found that file today.

 

It doesn't look like you had your camera in a consistent position, so inconclusive.

Take the 1% for example: the iris looks considerably more shut down at -15 than at -0.

But I can't tell if it's because you took the photo closer at -15 than at -0.




15 minutes ago, Javs said:

...it seems neither does Owen; you seem to 'like' every one of his rants about it. As such, neither of you is qualified to make any trustworthy judgements on it. It's as simple as that, like it or not, sorry.

You're getting emotional and trying to insult me. But that's OK, it's good for a chuckle.


1 hour ago, Satanica said:

Yes I do agree. Do you agree that black levels have been raised/compromised though?

Not with the DI in play. Without it, absolutely, yes.

 

1 hour ago, Satanica said:

It doesn't look like you had your camera in a consistent position, so inconclusive.

Take the 1% for example: the iris looks considerably more shut down at -15 than at -0.

But I can't tell if it's because you took the photo closer at -15 than at -0.

The iris is fully closed at 1% with the DI on in both photos.

 

Overlaid:

 

E66RJEt.png

 

1 hour ago, Satanica said:

You're getting emotional and trying to insult me. But that's OK, it's good for a chuckle.

Ha, no, not really.

Edited by Javs

25 minutes ago, Javs said:

Not with the DI in play. Without it, absolutely, yes.

 

The iris is fully closed at 1% with the DI on in both photos.

I question whether grey cards at this low a percentage are enough to trigger the iris to open up; well, according to your experiment they are not.

From my observations, what the dynamic iris does is driven more by how much white is in the image than by how much grey.

I propose that real-world content is more likely to trigger the dynamic iris to open up the higher the aperture value, and thus your black level will be raised.

Edited by Satanica

I think you agree that without the dynamic iris HDR content will have significantly higher black levels.

I'm not convinced that when in use (with real world content) it's anything better than hit and miss and more likely to miss.

Edited by Satanica

1 hour ago, Satanica said:

I think you agree that without the dynamic iris HDR content will have significantly higher black levels.

Of course there would be higher black levels without the DI; I never once contended otherwise. What I contend is that in the scenes where it would be visible at all (if the DI were active but not triggered enough to make it close down), it would not matter one single bit, because the delta at that APL is very, very small and absolutely nowhere near the kind of contrast that these machines are known for. If that were important to you, you should get a Sony or a current Epson; they both beat the JVC at APL content above 4-5% true ADL quite considerably due to higher ANSI contrast. Hell, get a DLP! But you don't hear people complaining about that regarding JVCs.

 

If you think my grey cards are not a great test (I agree to a point; they are an odd data point), then you also have to admit that the data I provided, which is measured against those cards, shows how drastically contrast drops the second you come out of true black. Those grey cards are DARK. Once you hit 5% you probably only have about 1,500:1 contrast at best in any mode, and the difference becomes smaller and smaller all the way to the final ANSI point.

 

Quote

I'm not convinced that when in use (with real world content) it's anything better than hit and miss and more likely to miss.

This is just another example of making a judgement based on assumption and not experience, and so should be taken with a huge grain of salt.

 

All of the above, by the way, is one reason why the JVC Z1, with its phenomenal laser dimming system but only ~12k:1 native contrast with dimming off, can go up against an X9500 and be more or less indistinguishable from it. Just ask @wooferocau

Edited by Javs



19 minutes ago, Javs said:

Of course there would be higher black levels without the DI; I never once contended otherwise. What I contend is that in the scenes where it would be visible at all (if the DI were active but not triggered enough to make it close down), it would not matter one single bit, because the delta at that APL is very, very small and absolutely nowhere near the kind of contrast that these machines are known for. If that were important to you, you should get a Sony or a current Epson; they both beat the JVC at APL content above 4-5% true ADL quite considerably due to higher ANSI contrast. Hell, get a DLP! But you don't hear people complaining about that regarding JVCs.

Thanks for the advice, but I'll disregard it.

 

19 minutes ago, Javs said:

If you think my grey cards are not a great test (I agree to a point; they are an odd data point), then you also have to admit that the data I provided, which is measured against those cards, shows how drastically contrast drops the second you come out of true black. Those grey cards are DARK. Once you hit 5% you probably only have about 1,500:1 contrast at best in any mode, and the difference becomes smaller and smaller all the way to the final ANSI point.

I think your test is not good enough to convince me. But yes, I admit the drop in contrast out of true black is considerable.

 

19 minutes ago, Javs said:

This is just another example of making a judgement based on assumption and not experience, and so should be taken with a huge grain of salt.

I said "I'm not convinced that when in use (with real world content) it's anything better than hit and miss and more likely to miss."

When I say "I'm not convinced..." that actually means I'm not sure.

Your information today (or lack thereof) has not convinced me to try it any more than when I got out of bed this morning. Yawn.

 

19 minutes ago, Javs said:

All of the above, by the way, is one reason why the JVC Z1, with its phenomenal laser dimming system but only ~12k:1 native contrast with dimming off, can go up against an X9500 and be more or less indistinguishable from it. Just ask @wooferocau

Has that guy even had any of his numerous JVC projectors even somewhat calibrated? I've asked in the past and got no reply.

Edited by Satanica

2 hours ago, oztheatre said:

Anyone seen any good movies lately?

I've kinda given up on movies of late. TV series seem to be where it's at. Recently I've watched Dark, The Alienist, Fargo and Mindhunter. Westworld is back on too.


OK, at the risk of repeating myself, a few fundamentals need to be sorted out and taken on board by the HDR brigade.

 

The vital point is that 100 nits is STANDARD DYNAMIC RANGE by the book, so EVERYTHING we view on a projector will be SDR, or in most cases less than SDR, no matter what video we view.

A change in gamma DOES NOT constitute HDR; HDR requires a tenfold increase in brightness, which projectors obviously can't achieve.

 

I'm sure Javs knows this, but for the benefit of other readers I think it's time to explain how the SDR standards came about. CRT displays have a natural transfer function (gamma) that is not flat, and it's due to the way a CRT tube works, not signal processing. The rise in on-screen light output mapped against signal input is non-linear and follows a curve that we call gamma; for a CRT the exponent was about 2.2.

To get a correct image on a CRT, the video signal has to have the inverse gamma curve (an exponent of roughly 1/2.2) applied to it, so that the end-to-end transfer function is flat and linear.

To put this another way, we must apply a distorted, non-flat input to counteract what the CRT display will do.
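
A minimal sketch of that encode/decode pairing, assuming a simple 2.2 power law at both ends (real standards differ slightly, but the principle is the same):

```python
GAMMA = 2.2

def encode(linear_light):        # pre-emphasis applied at the camera/encoder end
    return linear_light ** (1 / GAMMA)

def crt_response(signal):        # what the CRT itself does to the incoming signal
    return signal ** GAMMA

for scene in (0.05, 0.2, 0.5, 1.0):
    on_screen = crt_response(encode(scene))
    print(f"scene {scene:.2f} -> encoded {encode(scene):.3f} -> on screen {on_screen:.2f}")
# The two curves cancel, so the end-to-end (on-screen) response is linear.
```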

 

With the advent of digital video it turned out that applying this gamma pre-emphasis was also useful for 8-bit digital video encoding, because it allocated more code values to the darker parts of the picture, where they were needed to minimise colour banding and posterisation. So the SDR gamma standard stuck and remains to this day.
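
For the curious, a quick sketch of why that helps with 8-bit encoding: count how many of the 256 code values land in the darkest tenth of the linear-light range with and without the gamma pre-emphasis (the numbers come from the sketch itself, not from any measurement):

```python
codes = range(256)

# straight linear coding vs 2.2 gamma-encoded values decoded back to linear light
linear_shadow = sum(1 for c in codes if (c / 255) <= 0.1)
gamma_shadow  = sum(1 for c in codes if (c / 255) ** 2.2 <= 0.1)

print(f"code values covering the darkest 10% of linear light: "
      f"linear {linear_shadow}, gamma-encoded {gamma_shadow}")
# roughly 26 vs 90 - far finer steps in the shadows, where banding shows most.
```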

 

CRT TVs could comfortably achieve about 100 nits, which is where the 100-nit standard came from; however, viewing a TV at 100 nits is perceptually different to viewing a large projector screen that fills our field of view in a pitch-black room at 100 nits.

The SDR standards were designed for TVs viewed in a non-dark environment, not for projectors, so some people find the picture too bright at 100 nits (assuming they can achieve it) unless gamma is altered to suit. Having said that, there are plenty of projector owners running high-gain screens so that they can get a very bright picture, because they like it.

Owners of modern TVs typically like SDR displayed at 200, 300 or more nits, either because of a bright viewing environment or often just because they like BRIGHT. To each their own.

 

There is no such thing as High Dynamic Range video; it's just 10-bit video that has been graded for a display running VERY bright. The high dynamic range is entirely created by the display, and if the display can't provide the high light output required there is no HDR on screen.

 

At this point I should point out the difference between grading, which is responsible for the look of the movie, and "gamma". Remember, SDR video has a "gamma" applied to it during video encoding which is intended to be reversed by the display device; it's just an encoding and decoding process that, if done accurately, has no effect on the grading you see on screen. The "look" of the movie can be whatever the studio wants it to be within the brightness limitations of the display. If they want comparatively bright highlights they can have them, but the mid-tones will have to be pushed down to achieve that look, just as with HDR displayed on a projector.

 

 

4K video doesn't need this "gamma" applied as part of the video encoding, because it's 10-bit and there is no need. However, the video is mastered and graded with the expectation that the display will apply a very exaggerated non-linear gamma that pushes up the white end drastically AND will be running at 1000 nits plus. When the display is limited to 100 nits or less, regrading (tone mapping, gamma remapping, call it what you will) has to take place, either in the projector or externally. The bright parts of the image have to be truncated to "fit" the 100-nit-or-less limit of the display. This is simply regrading the video to SDR at the display end, rather than at the studio end as happens with SDR video. The end result will typically look different, because it's not possible to accurately reverse the "HDR" grading.
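
As a rough illustration of that display-end regrading (not any particular projector's algorithm), a simple tone-mapping curve passes the lower range through untouched and rolls the graded highlights off into whatever headroom the display has left:

```python
# Toy tone map: content graded up to 1000 nits squeezed onto a ~100-nit display.
# Everything below the knee passes straight through; the highlights above it
# are compressed into the remaining headroom with a soft roll-off.
def tone_map(graded_nits, display_peak=100.0, knee=50.0, grade_peak=1000.0):
    if graded_nits <= knee:
        return graded_nits
    frac = (graded_nits - knee) / (grade_peak - knee)
    return knee + (display_peak - knee) * frac ** 0.5   # gentle shoulder

for level in (10, 50, 200, 500, 1000):
    print(f"graded {level:>4} nits -> displayed {tone_map(level):6.1f} nits")
```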

However, and this is the important point here, those differences need not exist.

The studio could have graded the SDR version of the video so that on screen it would look exactly like the HDR version after it has been remapped by the projector for display at 100 nits or less. Remember, we have the same dynamic range and peak brightness (SDR or less) available for both, so the on-screen grading need not look different.

Studios are not thinking about projectors when they grade 1080 or 4K Bluray; it's all about how things look on a TV, and they WANT to make the 4K version look different.

 

Now, Javs says he can run HDR video at 100 nits and prefers to run SDR at about 50 nits because SDR looks too bright at 100 nits; that's a choice, not a requirement, and people who love SDR at 100 nits or more will certainly disagree with that choice.

 

Now, I'm not a fan of bright images; in fact I strongly dislike bright. However, there is no need to throw the baby out with the bathwater by simply using low lamp power or a lower iris setting to get the average picture level of SDR down to where it would be with HDR at 100 nits. There is this amazing image control called gamma that allows us to display SDR content at the exact same peak white level as HDR while at the same time lowering the average picture level to whatever is desired.

This is basically regrading SDR video at the display end to emulate HDR grading, and while it won't look exactly the same on a title-by-title basis due to the poor mastering of some movies, the overall look is close enough IMHO.
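
One simple way to picture that (just an illustration, not a recommended calibration): raising the display gamma above the usual ~2.2-2.4 leaves black and peak white exactly where they are but pulls the mid-tones down, so the average picture level drops while the full 100-nit peak stays available for highlights.

```python
PEAK_NITS = 100.0

def displayed(signal, gamma):
    return PEAK_NITS * signal ** gamma

for signal in (0.25, 0.50, 0.75, 1.00):
    std = displayed(signal, 2.4)     # standard-ish SDR decoding gamma
    dim = displayed(signal, 2.8)     # steeper gamma: darker mids, same black and peak
    print(f"signal {signal:.2f}: gamma 2.4 -> {std:5.1f} nits, gamma 2.8 -> {dim:5.1f} nits")
```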

 

The advantage in doing this, for people who like the HDR "look", is that it can be applied to all movies, which opens the door to thousands of titles that are not available on 4K disc and never will be.

If I limited myself to 4K titles my projector would only be used a couple of times a year. 98% of the movies I want to view are not available on 4K and likely never will be, so optimising the replay of 1080 Bluray is paramount as far as I am concerned.

 

A word on the JVC's iris system, at least on the ones I have used.

The 7 and 9 series JVCs have two irises: a lens iris, and a lamp iris that sits between the lamp and the optical block. When we adjust the iris setting down manually, the first click reduces the lens iris, the second click the lamp iris, and so on. If we set -12 we have the lens iris at -6 and the lamp iris at -6. Now, when we switch back to auto iris mode, the lamp iris stays where it was manually set; only the lens iris has a dynamic function, and it is prevented from opening up wider than what it was manually set to; it can only close down.

As Javs mentioned, the best ANSI contrast is achieved when the lens iris is fully open. This is because closing the lens iris down increases the lens's non-glass internal surface area, which reflects light back at the imaging chips and scatters it back at the screen. The lamp iris doesn't have this problem, but we don't have independent control of it. We can't close the lamp iris without the lens iris closing as well, and that's an unfortunate oversight IMHO, although the visual improvement provided by the lamp iris is minimal to my eyes.

 

Best overall performance IMHO is achieved with the iris system fully open and the auto iris engaged. In this situation the lamp iris stays fully open and the lens iris can dynamically adjust between minimum and maximum as needed. We get the best possible ANSI contrast and good on/off contrast and blacks, as the lens iris provides most of the contrast improvement.

 

Adjusting the manual iris setting to anything other than 0 (off) is a compromise that I don't find worthwhile.

 

 

The notion that HDR video will provide a larger dynamic range than SDR on a projector is flawed. If you run the iris system wide open, SDR and HDR have the same dynamic range, and if you close down the iris, which is not an option for HDR, SDR has greater contrast and hence greater dynamic range, if on/off contrast is the measure.

 

 

So, to round up: drastically remapping gamma at the display end to get HDR to work at SDR brightness levels on a projector is fine, yet insisting that SDR content MUST stick to standard gamma and cannot be remapped at the display end to run at the exact same peak white level, with the same overall on-screen gamma as HDR, is just silly and makes any comparison irrelevant.

 

I'm not saying there are no advantages to 4K disc content, but when the on-screen grading of SDR video is equalised to closely match the HDR version, and the same peak white level is used, I just don't see enough advantage to be bothered.

 

By the way, I have Lucy and Blade Runner 2049 in both 1080 and 4K, plus a couple of others, mainly to see what all the fuss was about.

Lucy looks good but is not a great movie IMHO. BR 2049 is a good movie and I LOVE the audio, but it's nothing special visually and I certainly would not recommend it as a great example of "4K".

I have the original Blade Runner in 4K only and it's a shocker IMHO; I've never watched it through. I really thought I was watching the standard-definition DVD version, and I'm not a fan of the movie to begin with. It really looks like the old film title that it is, and it has not aged well despite remastering, IMHO.

Edited by Owen


