
mdm1979

JVC 4K E-shift & UHD?


16 hours ago, Tasso said:

So what differences are we seeing with HDR vs SDR? They do look different.

On a projector it's the different mastering, and predominantly a different on-screen gamma curve that results from that mastering and the subsequent dramatic gamma remapping in the projector. Gamma is the distribution of levels between black and white; it's very adjustable, and you can dial in whatever "look" you like.
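
To make that concrete, here is a minimal Python sketch of a gamma remap; the exponent values are illustrative, not any particular standard:

def apply_gamma(level, gamma=2.4):
    """Map a normalized video level (0 = black, 1 = white) through a power curve."""
    return level ** gamma

# The same mid-grey input lands at a different output level under each
# curve, which is all a "look" change really is:
for g in (2.2, 2.4, 2.6):
    print(f"gamma {g}: 0.5 -> {apply_gamma(0.5, g):.3f}")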

If HDR content is mastered and displayed accurately, the image should look very much the same as an SDR-mastered image except for the brightness of the highlights. Unfortunately the "look" of the HDR version can often be quite different in areas of the picture that are not highlights, which is indicative of inaccuracy, due either to a deliberate mastering/grading change or to display inaccuracy or lack of capability.

 

Quote Javs: "To get the same effect in SDR you would need to add an S-curve to the 2.4 power gamma you would otherwise be using; that breaks calibration/specs and is completely not how the content was mastered or intended to be viewed."

 

In response to Javs' comment quoted above: how is remapping the gamma of SDR video a problem when DRAMATICALLY remapping the gamma of HDR video to make it suitable for a projector is not?

Remapping HDR gamma totally breaks calibration/specs and is completely not how the content was mastered or intended to be viewed.

 

Projectors are standard dynamic range display devices and can be calibrated to conform accurately to the SDR standards, so when we view a 1080p Blu-ray movie on a calibrated projector we see what the producers of the disc wanted us to see.

In stark contrast, the HDR standards, especially for brightness and gamma, are completely out of reach, and the HDR "effect" can be applied differently from scene to scene. All we can have on a projector is an interpretation of HDR content that we either like or don't. We can alter the look to taste, just as we can with SDR content, but we can never see what the HDR disc producers wanted us to see.

 

As for the deceptive HDR promotional material entitled "Chasing the human vision system" posted by Al, I bring your attention to a few discrepancies.

Look at the black and white levels quoted: 0.05 cd/m2 black and 100 cd/m2 white for SDR, versus 0.0005 cd/m2 black and 10,000 cd/m2 white for HDR (1 cd/m2 = 1 nit).

The black level of SDR is THE SAME as the black level of HDR, NOT 100 times higher, and it is entirely dictated by the display, not the video. If you run a projector brighter for "HDR" content the black level will be higher, not lower, and that's not good as far as I am concerned.

The 10,000 nit white level quoted for HDR is 10 times the current 1,000 nit HDR mastering standard, so it's a deceptive exaggeration.
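
To put numbers on those quoted figures, here is the arithmetic in Python (the luminance values are the ones from the promotional material above):

import math

sdr_black, sdr_white = 0.05, 100        # quoted SDR levels, cd/m2
hdr_black, hdr_white = 0.0005, 10_000   # quoted HDR levels, cd/m2

print(math.log2(sdr_white / sdr_black))   # ~11.0 stops (2,000:1 contrast)
print(math.log2(hdr_white / hdr_black))   # ~24.3 stops (20,000,000:1 contrast)

Note the claimed HDR range is far beyond what we can actually see in a single scene, which brings me to the next point.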

 

While the human eye does have a dynamic range of 20 stops or more from bright scene to dark scene, provided we allow adequate time for our eyes to adjust, the dynamic range we can see in any single scene is limited to about 10 stops, far less than a good camera or a good projector running SDR levels.

What I like about movies on a projector is the subtlety of the presentation and what the low average picture levels allow my eyes to comfortably see in the image. High brightness is counterproductive to my eyes; it looks more like a TV.

 

Al also goes on about HDR and WCG working together, and they do. Wide-gamut colours are EXTREMELY bright and garish, so to recreate them on screen we need a display that is VERY bright. Now, to widen the gamut of a projector a special filter is required in the light path; the problem is that the filter drops light output by about 40%, which is exactly the opposite of what we need. So if we try to expand the gamut, most projection setups will not even achieve the 100 nit SDR standard.
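
As a rough worked example of what that loss does (the 120 nit starting point is a hypothetical setup; the 40% figure is the one above):

peak_without_filter = 120                     # nits on screen, hypothetical setup
filter_loss = 0.40                            # approximate loss from the filter
peak_with_filter = peak_without_filter * (1 - filter_loss)
print(peak_with_filter)                       # 72 nits, short of the 100 nit SDR standard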

I'll bet that the vast majority of projector owners run without the wide gamut filter, so no WCG for them, and no HDR either.

 

Let's go over what HDR actually is one more time. It's VERY high peak display brightness combined with a video gamma suitable for that high peak output; without BOTH there is no "HDR", and projectors provide neither.

 

The video sent to cinemas is 10-bit with 4:4:4 colour and very little compression, so it is potentially much better than 4K Blu-ray. It runs a gamma suitable for projection at 50 nits, half the SDR level. We can take that exact video and remaster it with HDR gamma for display at 1000 nits, or remaster the gamma to the 100 nit SDR standard. In fact we can convert between any of these gammas as often as we like without loss, as long as we stay in an uncompressed 10-bit digital domain; it's just a remapping of video levels, and no information need be removed.
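
A minimal sketch of that reversibility, using simple power gammas as stand-ins for the real transfer curves and floating point as a stand-in for a sufficiently deep uncompressed digital domain:

def encode(linear, gamma):
    """Linear light -> video level."""
    return linear ** (1.0 / gamma)

def decode(code, gamma):
    """Video level -> linear light."""
    return code ** gamma

linear = 0.18                            # a normalized mid-grey luminance
cinema = encode(linear, 2.6)             # master to one curve...
sdr = encode(decode(cinema, 2.6), 2.4)   # ...then remap to another
print(abs(decode(sdr, 2.4) - linear) < 1e-9)   # True: nothing was lost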

 

So 4K HDR Blu-ray video should be thought of as just compressed 10-bit 4:2:0 video with a gamma appropriate for a 1000 nit display. If we display it at 100 nits it's SDR, so the HDR gamma is inappropriate and must be remapped. If we display at 50 nits we should really remap for 50 nits, as the average picture level of 100-nit-mastered video displayed at 100 nits can be a bit much on a big screen in a dark room.
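
For reference, the HDR gamma in question is the SMPTE ST 2084 "PQ" curve; here is a sketch of its EOTF (signal in, nits out) using the published constants:

def pq_eotf(signal):
    """ST 2084 PQ EOTF: normalized signal (0..1) -> absolute luminance in nits."""
    m1 = 2610 / 16384
    m2 = 2523 / 4096 * 128
    c1 = 3424 / 4096
    c2 = 2413 / 4096 * 32
    c3 = 2392 / 4096 * 32
    p = signal ** (1 / m2)
    return 10_000 * (max(p - c1, 0) / (c2 - c3 * p)) ** (1 / m1)

print(pq_eotf(1.0))   # 10,000 nits at full code value

A projector peaking at around 100 nits obviously cannot follow this curve literally, which is why the remapping is unavoidable.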

 

1080p Blu-ray is an 8-bit format that's suitable for SDR 100 nit distribution; it's not suitable for dramatic gamma remapping, as doing so can introduce colour banding. However, with care the look of the image can be very significantly altered without problems if we keep on-screen brightness under 100 nits, which is normally the case with projectors.

8-bit video is just fine for 100 nit display, the standard it was developed for, and plenty for 50 nit display. More bits are only really needed if we are using higher image brightness, and at 1000 nits plus they are definitely necessary. We can take the same gamma curve that a projector creates by remapping HDR video, encode it to 8-bit video, and it will look the same on screen as the 10-bit remapped image.
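
A small sketch of why bit depth matters when remapping: push all 256 8-bit levels through a steep curve (the 0.45 exponent is an arbitrary stand-in) and count how many distinct output codes survive:

codes_in = [i / 255 for i in range(256)]                  # every 8-bit level
codes_out = {round((v ** 0.45) * 255) for v in codes_in}  # remap, requantize to 8 bits
print(len(codes_out))   # noticeably fewer than 256 distinct codes -> banding risk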

 

For those who don't believe me, try this simple test. Take a quality photo from a good camera that is 14-bit or better and, using a suitable image editor on your PC, make 10-bit and 8-bit copies in a lossless file format. Now connect the PC to your projector, display one image and then the other, and see if they look different. If the software did its job properly, they won't.

We can also manipulate the gamma of the 14-bit image as we see fit and save off 10-bit and 8-bit versions for comparison, with the same result.
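
A sketch of that test with NumPy and Pillow, assuming the camera raw has been exported to a 16-bit greyscale TIFF (the file names are placeholders):

import numpy as np
from PIL import Image

img16 = np.asarray(Image.open("photo_16bit.tif")).astype(np.uint16)

# Truncate to 10-bit and 8-bit precision while keeping the 16-bit container:
img10 = (img16 >> 6) << 6
img8 = (img16 >> 8) << 8

Image.fromarray(img10).save("photo_10bit_precision.tif")
Image.fromarray(img8).save("photo_8bit_precision.tif")
# Display each full screen on the projector and compare.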

 

So, displaying "HDR"-mastered video on a projector does NOT give you an HDR image on screen; it's just SDR with a "different" gamma and typically no WCG. That's not to say that plenty of people don't like what they see, no doubt they do, but they are fooling themselves if they think what they see is HDR. It's not.

 

I have always thought 4K Blu-ray discs should have contained the cinema release of the movie without any dynamic scene-by-scene HDR gamma trickery that can't be properly reversed. HDR should have been a command layer that tells the player what gamma to use on a frame-by-frame basis, and only used when wanted by the user. That way we could see what the production team intended us to see via our projectors, not some remapped version of an HDR-"enhanced" version intended for TVs.

Quote mdm1979: [the full post above]
We follow an EOTF for HDR and there are guidelines: you just use a multiplier for the HDR gamma, but you are not otherwise departing from it. It's not a mere stab in the dark, far from it.

If you add an S-curve to an SDR power gamma, you are.
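
In code terms the distinction might look like this (a sketch; the eotf argument is any HDR curve, such as the pq_eotf function sketched earlier, and the scale value is illustrative):

def hdr_with_multiplier(signal, eotf, scale=0.01):
    # Same curve shape, output simply scaled to the display's capability:
    return eotf(signal) * scale

def sdr_with_s_curve(signal):
    base = signal ** 2.4
    # A smoothstep-style S bend layered on top -- no longer a 2.4 power law:
    return base * base * (3.0 - 2.0 * base)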

Also, the JVCs only lose about 7% of their light due to the filter.

You should really stop being such an HDR detractor.

