
JVC Redux, JVC's DLA-X7000 4K E-shift projector - an end user perspective



Talking about calibration, I should update this thread. A few of us got together one evening last week at my place and we gave the JVC's autocal a run, documented in the link below.
 
 
To summarise: under the direction of imdave, we used the Datacolor Spyder5 colorimeter, which JVC supports through its auto calibration software linked to the projector.
 
Datacolor-Spyder5-Monitor-Calibrator_63cd1118b9dfec13854dcddbbae104bf.jpg
 
Below, a shot of getting set up, with imdave giving us a rundown:
36197955534_f8673f0310_b.jpg
 
Colorimeter set up and in place:
36636545480_07290dc514_b.jpg
 
and JVC's software confirming everything ready to go before it was lights off to give the autocal a run...
36636553650_d0b87fcc6e_b.jpg
 
Very happy with the end results, for both Rec.709 and for UHD and HDR.
 
Last night I fine-tuned both brightness and contrast and confirmed colour was spot on!
 
With only the hundred-something dollar outlay for the colorimeter, the JVC autocal is a fantastic tool; not sure I could buy a projector now without it [emoji3]


Spot on Al, do it yourself is the best way.


Sent from my iPhone using Tapatalk
  • Like 2



Manual calibration by someone who knows what they are doing is "the best way", not dumb auto cal.

For HDR content there is no way to properly "calibrate" a projector; it's entirely a subjective thing because HDR mastering is totally random. There is no "standard".


On 9/10/2017 at 11:31 PM, Owen said:

Manual calibration by someone who knows what they are doing is "the best way", not dumb auto cal.

For HDR content there is no way to properly "calibrate" a projector; it's entirely a subjective thing because HDR mastering is totally random. There is no "standard".

 

Hey Owen, I don't think anyone here is questioning that a manual calibration is the ultimate, but of course that asks much more of the user.

I've only done the JVC Autocal myself using imported REC 709 colour profiles for my X7000.

It certainly looks better now than it did out of the box; the main thing I noticed is that skin tones were way too rich before, too orange, too red.

I'm absolutely welcoming a full manual calibration, so whenever you're up this way the beer will be good and cold. :thumb::ahappy:

Edited by Satanica

  • 2 weeks later...

Bit of a movie marathon the last few days :D and the first decent run of the projector since the autocal.

 

We watched the new Blade Runner UHD, which I have to say was really pretty amazing! Yes, a bit of grain, but you'd expect that of a movie of its era. I'm so glad they were kind in their remastering; it has come up a treat. Great use of HDR (not for glitz and glamour): it genuinely looks very natural and is used to add to the picture. The audio track, to follow, is pretty amazing too!

 

And then we watched Wonder Woman on UHD, and I have to say this one is just jaw-dropping in picture. Simply stunning!

 

What helps is that it's such naturally shot material: the natural scenery of the beautiful Greek island coastline it kicks off in, the faces, the costumes, the gleam of the crowns, and then the scenes in London. It all comes up so naturally and so beautifully. Wonder Woman is reference material here.

 

Today we watched Return of the Jedi on Blu-ray (I'm going through the Star Wars series with my daughter in original order). Best I have seen it: great detail in the faces and scenery, beautiful blacks of Darth Vader, etc.

 

Really enjoying the picture quality of the JVC since the autocal, and very much enjoying some movies while at it :D

 

I keep telling myself I really don't know how this can get any better!

 

 

  • Like 1



Al, I've said this before but it obviously has not sunk in. You get EXACTLY the same "dynamic" range when viewing a normal 1080 Blu-ray as you do when viewing an HDR 4K Blu-ray; the black and white levels should be EXACTLY the same if the projector is set up properly.

So what you are seeing has absolutely NOTHING to do with HDR display, because you are viewing in LESS than SDR. The different look is entirely down to the altered gamma applied by the studio for "HDR" mastering and the subsequent remapping of that gamma by the projector. The same gamma could have been applied to the 1080 mastered version if desired.

Studios go out of their way to make 4K disk versions look deliberately "different" to entice people to buy the title again. Quite obviously we don't need 1000 nits to see the effects of remastered content, nor do we need 1000 nits to get "high dynamic range"; the high native contrast of JVC projectors provides that even without HDR video.

The X7xxx and X9xxx JVCs have greater dynamic range than the cameras used to shoot movies, so they are not the limitation some brightness-obsessed people seem to think.

  • Like 1

Guys, reading on the net I came across this from a JVC owner and what he was quoted for calibration:

Quote

"Hello everyone, long time lurker, first post. Amazing site, thanks to all the regular posters for the insights and advice that have helped me immeasurably me in setting up my first dedicated HT.

I recently bought the X7500, and want to get it calibrated. I've had a play using the Spears & Munsil disc, but I just can't get it right. The detailed settings are still beyond my understanding, though I reckon I'll get there with lots more reading!

Anyhow, a local HT shop has quoted me the following

Pricing for calibration is:
$658 JVC 7500
$133 3D addition
$199 4K addition

This is in $AUD, so around $790 in USD. I don't need the 3D calibration at this point, but it seems really expensive? The rep said it was 2-3 hours' work. Can anyone advise if this is in line with calibration pricing elsewhere? Are there any Aussies (Sydney / Central Coast) who can recommend someone good that I can get a comparable quote from?

Thanks again guys for all your insights... loving my HT so far"

 

I'm a bit blown away, especially given the projector will need a calibration tidy-up every 250-500 hours. I'm sure he will get a more competitive quote, but it just goes to show the need to learn how to fish. I am thoroughly enjoying things on the JVC. No, Spears & Munsil on its own won't get you there; that's probably about 20% of what's needed and will only help on the Blu-ray side. Yes, the JVCs out of the box are actually pretty damn good on their own, but every projector needs setting up for its installation, for output at least, and these days some basic calibration for HDR, e.g. with the RM patterns. And the autocal I can see being such a useful thing to calibrate and whip things back into shape.

 

Certainly very much enjoying the benefits of UHD with its HDR and WCG, and I certainly hope others can as well.


6 hours ago, :) al said:

certainly very much enjoying the benefits of UHD with its HDR and WCG.

And all on a display running about half the brightness of the SDR standard and watching movies that almost never contain any colour beyond Rec.709.

 

6 hours ago, :) al said:

and I certainly hope others can as well.

Sure they can, but hopefully they will have a better understanding of what they are seeing and why.


As a point of note, the Xx000 series JVCs, I've found out, aren't afflicted by the CMD bug that the later Xx500 series appear to be, which need to be taken in to get sorted (PS: not that JVC AU will likely admit to it, but it's what the guys in the US are organising). It's apparently a hardware fix for the later projectors, but not something to worry about with the X7000. Not that I use CMD in any case :D


On 11/09/2017 at 12:01 AM, Owen said:

Manual calibration by someone who knows what they are doing is "the best way", not dumb auto cal.

For HDR content there is no way to properly "calibrate" a projector; it's entirely a subjective thing because HDR mastering is totally random. There is no "standard".

Actually, manual touch-ups after the Autocal are the best way, and the very best JVC calibrators overseas do this very thing.

 

The average end user cannot manually do a 33-point gamma calibration across red, green and blue, a total of 99 individual adjustments for the gamma; only the Autocal can. You can do the 12-point manual gamma calibration through the Autocal software, or if you have access to the ISF JVC software you can attack it from 64 points, but that is NOT easy to get. Even Calman and the like cannot do this, since it needs to be done internally on the gamma tables, and those commercial calibration suites have zero access to the tables. Even if you do manual gamma calibrations with either the ISF software or the manual 12-point calibration in the Autocal software (it used to be in the JVC projector menu itself, now moved), you are forced to use the 'Import' gamma slot in the projector, and the baseline gamma for the ST 2084 EOTF on the more recent JVCs will be totally broken, so you cannot correct the inevitable droop.

 

Autocal can certainly do a most excellent job; I would not be paying a professional when I can get results like this with fairly minimal effort.

 

OibjUlR.png

 

In saying that, the very most recent JVC models have had an Autocal red-push bug, which has only just been corrected. I am still seeing anomalies with the new software in tracking targets with Autocal, which have required me to create a new custom colour space taking into account meter errors from a log scan on the default Rec.709 target profile. When this is done and a new profile is created based on the offset errors of the old one, you actually don't even need to run Autocal; you just upload the new colour profile and it's bang-on accurate immediately. An Excel spreadsheet literally makes this a walk in the park! The same method is used with the BT.2020 colour space. The above images are actually my X9500 using this spreadsheet method. My X7000 was even more accurate than this, since that version of Autocal actually has zero bugs and the spreadsheet method was not required, as Al no doubt has experienced since he has the same projector.
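For anyone curious how a spreadsheet like that can work, here is a rough sketch of the aim-off idea in Python rather than Excel: treat the meter's error against the Rec.709 targets as an offset and build a new target profile that leans the other way by the same amount. The xy numbers below are made up for illustration, and this is only my reading of the method, not the actual spreadsheet or the JVC software.

```python
# Sketch of the "offset profile" idea: if the meter consistently misses the
# Rec.709 targets by a known error, build a new target profile that aims off
# by the opposite amount so the calibrated result lands on the true target.
# Illustrative values only.

REC709_TARGETS = {            # CIE xy chromaticity targets (Rec.709 primaries)
    "red":   (0.640, 0.330),
    "green": (0.300, 0.600),
    "blue":  (0.150, 0.060),
}

# Hypothetical post-autocal measurements from a log scan of the default profile
measured = {
    "red":   (0.655, 0.325),   # e.g. the "red push" error
    "green": (0.302, 0.597),
    "blue":  (0.151, 0.062),
}

def offset_targets(targets, readings):
    """Aim-off: new_target = target - (measured - target)."""
    out = {}
    for name, (tx, ty) in targets.items():
        mx, my = readings[name]
        out[name] = (round(2 * tx - mx, 4), round(2 * ty - my, 4))
    return out

print(offset_targets(REC709_TARGETS, measured))
# {'red': (0.625, 0.335), 'green': (0.298, 0.603), 'blue': (0.149, 0.058)}
```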

 

Most of us over on AVS have arrived at a most excellent way to calibrate and set up the HDR EOTF for our projectors' specific peak white output. The Arve tool software we use creates an EOTF based on some input settings which are specific to our units and light output, and we all seem to have settled on a very similar methodology, which indicates we are on the right track. HDR looks fantastic on the JVC with custom EOTF gamma curves. In fact, the Arve curve I am using now is strikingly similar to the JVC ST 2084 EOTF, so to say that HDR is broken would be incorrect.

 

This is a JVC X7500, from a YouTube video, but you can see it gets damn close to the correct ST 2084 EOTF after calibration.

 

f9XtIdv.png

 

Looks like this particular unit is set to clip at 1000 nits, which, given what the Arve tool can do and that at least 40% of UHD Blu-rays are clipping at 4000 nits, I would say is not correct. The correct way is to set the EOTF for a 4000 nit clipping point with a soft roll-off starting at a specific point dictated by your peak white level and the reference white level you wish to use. There is very little real content between 1000 and 4000 nits on those 40% of titles, so having the general APL level set for 1000 nit content, not a hard cut-off, and a soft roll-off in place so you don't clip harshly what's above 1000 nits, is the way to go right now.

Peak white and reference white are very different things. If, for example, you used to watch SDR Blu-ray at -10 iris in low lamp with 50 nits peak white, and you now have high lamp and -0 iris for HDR, then the input variables into the Arve curve will set your reference white to approximately the same 50 nits (representing 100 nits on the EOTF) at which you used to watch SDR Blu-ray, and the roll-off above that will be smooth all the way up to 4000 nits, making use of the other 50 nits on top for a 100 nit peak white in the case of my HDR setup.

In actual fact, for the most part you should be experiencing a similar average picture brightness to what you are used to in your SDR viewing, for example if the scene is indoors and you have two people in a room talking, but then the specular highlights etc. are going to be far brighter, giving you that HDR look. Technically the dynamic range has not increased, since your black level has increased at the same rate as your white level by going to high lamp and -0, but this is what Al was referring to: we now have 50 nits of extra peak white on tap versus the 50 we used to use for SDR Blu-ray, giving the image a lot more apparent range and a much increased sense of depth.
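To make the EOTF talk above a bit more concrete, here is a small Python sketch of the two pieces involved: the published ST 2084 (PQ) EOTF that turns the disc's signal back into mastered nits, and a toy soft roll-off that squeezes a 4000 nit grade into a roughly 100 nit projector. It is only an illustration of the general idea; the knee, peak and roll-off shape are assumptions, not the Arve tool's actual maths.

```python
# SMPTE ST 2084 (PQ) EOTF constants
M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_to_nits(signal):
    """ST 2084 EOTF: normalised PQ signal (0..1) -> absolute luminance in nits."""
    p = signal ** (1 / M2)
    return 10000 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)

def tone_map(scene_nits, display_peak=100.0, knee=50.0, source_peak=4000.0):
    """Toy soft roll-off: track the source 1:1 up to the knee, then squeeze
    knee..source_peak into knee..display_peak with a smooth shoulder.
    Knee and peak values are illustrative, not the Arve tool's mapping."""
    if scene_nits <= knee:
        return scene_nits
    x = (scene_nits - knee) / (source_peak - knee)   # 0 at the knee, 1 at source peak
    a = 0.3                                          # shoulder softness (arbitrary)
    return knee + (display_peak - knee) * (1 + a) * x / (x + a)

for signal in (0.25, 0.50, 0.75, 0.90):
    mastered = pq_to_nits(signal)                    # PQ 0.50 is roughly 92 nits, 0.75 roughly 1000 nits
    print(f"PQ {signal:.2f} -> {mastered:7.1f} nits mastered, {tone_map(mastered):5.1f} nits on screen")
```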

 

This is what my Arve generated curve looks like.

 

Hn4Edjc.png

 

For reference, this is what a true 1000 nit display's curve would look like, generated with the same software assuming a 1000 nit peak and 100 nit reference white.

 

ahbsLur.png

 

This tool makes the JVC projectors, IMO, the very best tone-mapping HDR displays on the market, bar none. I don't know of ANY other display device that has this much latitude available to custom-configure the HDR gamma EOTF for precisely the peak white output of the display, with total control over the curve at almost 100 data points.

 

Viewing recent back-catalogue films with properly set-up HDR curves, such as Crouching Tiger, Hidden Dragon, Blade Runner, E.T. and even Independence Day, I have been awestruck at how good these titles look now. The picture is VERY natural, the colour is excellent, and the depth in the image, taking advantage of the full spectrum of light from the projector, is something to behold. There is no way I could go back to watching the above titles on SDR Blu-ray, especially something like Blade Runner; it is so much better on a good HDR setup it is not even funny.

 

Perhaps you shouldn't be too quick to judge HDR until you have owned HDR-capable JVCs and set them up properly, something it is clear from your comments you have not done.

Edited by Javs
  • Like 2



The SDR standard is 100 nits (30 fL), and unless a small or unusually high gain screen is used even that is a struggle to attain with a JVC X series projector. The HDR standard is 1000 nits plus, which is 10 times or more the output level that most JVC projector owners will ever see. That's a shortfall of more than 3 stops in dynamic range.
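The "more than 3 stops" figure follows straight from those numbers; a quick check, assuming the 100 nit SDR reference and 1000 nit HDR level quoted above (the 50 nit projector figure is just an illustrative real-world level):

```python
import math

sdr_ref = 100        # nits - SDR reference white
hdr_peak = 1000      # nits - common HDR10 mastering/display target quoted above
jvc_real = 50        # nits - a typical real-world projector screen level (illustrative)

print(f"1000 vs 100 nits: {math.log2(hdr_peak / sdr_ref):.1f} stops")    # ~3.3
print(f"1000 vs 50 nits : {math.log2(hdr_peak / jvc_real):.1f} stops")   # ~4.3
```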

 

So, no matter how you cut it there is no such thing as "HDR" in the conventional sense on a JVC projector, or any domestic projector for that matter. All we can have is HDR video gamma remapped to look good at SDR or sub-SDR peak light levels, which means it's not HDR on screen, it's just 10-bit video with tweaked gamma.

Do people like the look of remastered video with "different" gamma? Obviously they do, but the "look" is due to the mastering of the content and subsequently how the projector remaps it; the on-screen image is not HDR.

There is nothing to stop normal 1080 Blu-ray content from being mastered to have the same overall look, created by having the same on-screen gamma and the same peak on-screen output, using a projector. At 100 nits peak or less we don't even need 10-bit video; 8-bit works fine.

 

In the end it all comes down to the mastering of the content and what look you like; HDR mastering can be stuffed up just as easily as SDR. Pushing up highlights relative to mid tones has its limits, and it's easy to go overboard and get a look that I'm not keen on. HDR demo content is a good example: it looks bloody dreadful to me, about as subtle as a house brick, WAY too "in your face" for this little black duck, but others think it's fantastic. To each their own.

1080 Blu-ray titles are often stuffed up by not using the full dynamic range available; I see that all too often and there is no need for it. The dull, flat image that results is not good, but an over-contrasty HDR one is not either IMHO. Dud mastering is what it is; I don't blame the disk format for studio stuff-ups.

 

I've experimented with the gamma of both SDR and HDR content, taking SDR towards HDR with the lamp on high and iris wide open, and gone the other way, taking HDR towards SDR presentation. It's interesting to play around, but I end up preferring a more subtle look that does not resemble a digital TV; it's a personal preference thing.

 

The great thing about the JVC projectors is that everything we view has a very high effective dynamic range due to the high native contrast. That said, I still find JVC's best wanting in the black-level department, and I need brighter whites like a hole in the head, so the projector is run on low lamp for the lowest black level.

 

Content availability, or the lack thereof in the case of 4K, is where it begins and ends for me. It doesn't matter how fantastic 4K disk content becomes; if the movies I want to watch are not available, and they are not, I could not care less what happens with 4K and I will not buy the disks.

  • Like 1

I hear what you mean on brightness. I love the deep blacks, and have never found myself wanting more brightness.

 

In line with that, is light bleed an issue with these styles of reflective-sensor projectors (SXRD/DILA/DLP)? I imagine not, unlike LCD. It is my personal bugbear...

Edited by Mobe1969

31 minutes ago, Mobe1969 said:

I hear what you mean on brightness. I love the deep blacks, and have never found myself wanting more brightness.

 

In line with that, is light bleed an issue with these styles of reflective-sensor projectors (SXRD/DILA/DLP)? I imagine not, unlike LCD. It is my personal bugbear...

 

Never heard of anyone whinging about bleed or anything, but if it's a major concern for you, go check something like this out for yourself in person.

 

Brightness is not an issue as long as you don't go nuts on size. You very simply need to be able to achieve 12-16 fL for Blu-ray and around 30 fL with UHD, with some up the sleeve as the lamp ages, and you will be right. There are goodness knows how many people enjoying and achieving this. With my screen size and a relatively benign 1.1 gain screen I am quite happily running low lamp on both UHD and Blu-ray to achieve it. And lamp life is great with this series: a 4500-hour rating, and drop-off is very slow by the looks of it and all reports.
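For anyone wanting to sanity-check those fL figures against their own setup, the arithmetic is just lumens and screen area; a rough sketch (the lumen and screen numbers are illustrative, not measurements of any particular projector, and lens/zoom losses are ignored):

```python
import math

FL_TO_NITS = 3.426    # 1 foot-lambert is roughly 3.426 cd/m2 (nits)

def screen_area_sqft(diagonal_in, aspect=16 / 9):
    """Area in square feet of a flat screen, given its diagonal in inches."""
    height = diagonal_in / math.sqrt(1 + aspect ** 2)
    return height * (height * aspect) / 144.0

def foot_lamberts(lumens_on_screen, diagonal_in, gain=1.0, aspect=16 / 9):
    """Approximate on-screen brightness in fL."""
    return lumens_on_screen * gain / screen_area_sqft(diagonal_in, aspect)

# Illustrative only: a 110" 16:9 screen with 1.1 gain at a few lumen outputs
for lumens in (600, 900, 1300):
    fl = foot_lamberts(lumens, 110, gain=1.1)
    print(f"{lumens:4d} lm -> {fl:4.1f} fL ({fl * FL_TO_NITS:5.1f} nits)")
```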

 

  • Like 1

11 hours ago, Owen said:

The SDR standard is 100 nits (30 fL), and unless a small or unusually high gain screen is used even that is a struggle to attain with a JVC X series projector. The HDR standard is 1000 nits plus, which is 10 times or more the output level that most JVC projector owners will ever see. That's a shortfall of more than 3 stops in dynamic range.

...

So, are you telling me you watch SDR Blu-ray at 100 nits? Pretty much nobody does; 50 nits is more like it. So already, you are breaking the mould with SDR. SDR is also designed with display devices of 1000:1 contrast in mind, which as you know is just a joke next to the JVCs; you run out of gradients with it and the image can become a little flat. HDR takes that and expands on it, giving you over 1000 gradients.

 

By the way, I have a true 55" 1000 nit 4K display; I use it as my PC monitor in actual fact. So I have seen proper HDR10 content almost nit for nit on it, and you know what, the JVC looks better, more subtle, and far more refined. It's more filmic, and closer to that SDR Blu-ray we all love, but the image has more depth and everything looks 'right'. The extra brightness lost going from that small screen to a, by comparison, stratospheric screen size is moot; the JVC 'seems' brighter due to the extra light in the room, plain and simple. Tone mapping is a fact of life; we do it every day, and I am betting you do it yourself.

 

It's pretty clear from your comments that you have no proper experience with HDR, so literally everything you say about it is frankly an assumption.

  • Like 2



4 hours ago, Mobe1969 said:

So for a 3m sitting distance, 4m projector distance, 100" screen,  I should be fine, and stick with low gain, even with 3D.

E.g. from Harkness, as far as digital cinema goes, below is the kind of result you want for Blu-ray etc.

 

DCI_Compliance.pdf

 

For UHD and HDR you want more in the ilk of 30 fL.

 

What you are thinking is pretty close to what I have for viewing distance and screen. For projection distance that's a lot closer than mine, so I suspect you will not be far off and will have plenty up the sleeve with Blu-ray, UHD and 3D. You'd probably just want to check screen size vs zoom for throw with the calculator to make sure you get what you want. Also, in the JVCs I would look at the X7x00 or X9x00 series as they have the dual iris; you'll likely need it to clamp down further than the single-iris machines can, as I suspect you will have a bit of output up your sleeve :D

 

 

  • Like 1

On 9/27/2017 at 9:30 PM, Mobe1969 said:

In line with that, is light bleed an issue with these styles of reflective-sensor projectors (SXRD/DILA/DLP)? I imagine not, unlike LCD. It is my personal bugbear...

SXRD and DILA are similar in concept but very different in contrast performance, with JVC's system outperforming Sony's by a LARGE margin.

DLP has pathetic native contrast and is totally uncompetitive; it's the IPS LCD of the projector world, while DILA is the OLED.

 

Even though DLP projectors have very poor native contrast, "light bleed" is much less significant than on a much brighter LCD TV, for example. Projector screen uniformity is also much better than on LCD flat-panel TVs.

Even a 6-year-old base model JVC projector makes the best plasma TV look sad for contrast and black level, and the high-end models are much better again. Black is never absolutely black to the eye, because in a dark room the human eye is very sensitive, but the black level is more than 50 times lower than the best plasma TV and hundreds of times lower than native LCD performance.

  • Like 2

19 hours ago, Javs said:

So, are you telling me you watch SDR Blu-ray at 100 nits? Pretty much nobody does; 50 nits is more like it.

I don't even use 50 nits with a projector because I have a strong dislike of "bright", and that includes bright highlights.

However, there seems to be plenty of people on the AVS forum using very high gain projection screens who view SDR content at MORE THAN 100 nits. Lots of people like BRIGHT.

 

19 hours ago, Javs said:

Pretty much nobody does; 50 nits is more like it. So already, you are breaking the mould with SDR.

The SDR standard was constructed around CRT TVs, which could comfortably achieve 100 nits. I find 100 nits totally inappropriate on a large projection screen viewed in a totally dark room, but I reckon I'm the exception these days.

TV viewers have always used around 100 nits, and with modern TVs they are viewing SDR content at 200, 300 or more nits because they like BRIGHT.

While domestic SDR content (1080 Blu-ray) is mastered for 100 nits, cinema video is mastered for 50 nits, is 10-bit and uses 4:4:4 colour, which is better than anything on 4K Blu-ray disk, and with no bloody HDR gamma either. Mastering for the intended display brightness obviously provides the intended viewing experience and the best result, so 1000 or 4000 nit mastering is totally wrong for projectors.

 

19 hours ago, Javs said:

SDR is also designed with display devices of 1000:1 contrast in mind, which as you know is just a joke next to the JVCs; you run out of gradients with it and the image can become a little flat. HDR takes that and expands on it, giving you over 1000 gradients.

CRT was a high-contrast technology and put LCD to shame when a good TV was properly set up. Admittedly few consumers had well-set-up, high-quality CRT TVs, so most never saw the true potential, but the mastering engineers sure did on their pro-grade CRT monitors.

HDR and the number of "gradients" are separate issues. 10-bit video has 1000 gradients with or without HDR gamma. 8-bit gradients are just fine for a 100 nit display or less, because each step in video level is too small to be visible; at 50 nits it's plenty.

As for "running out of gradients", I've never seen it with properly encoded 8 bit video display via a quality projector, its a non issue and well mastered 8 bit video NEVER looks "flat", that's a mastering problem plain and simple.

One of the main points I am trying to get across is that we MUST separate mastering differences from the 1080 vs 4K HDR thing. When we view 4K disks on a projector WE ARE VIEWING IN SDR, NO IFs, NO BUTs. 100 nits is SDR by definition, so let's have no talk about "HDR" at 100 nits or less; it's IMPOSSIBLE and always will be.

Different gamma and mastering, which is all we see when viewing HDR content on a projector, changes the "look" of the content, but it does not change the on-screen "dynamic range" at all; only increasing the peak output of the display VERY significantly can do that, assuming black level does not increase in proportion with white level, which for projectors it does.

 

19 hours ago, Javs said:

By the way, I have a true 55" 1000 nit 4K display; I use it as my PC monitor in actual fact. So I have seen proper HDR10 content almost nit for nit on it, and you know what, the JVC looks better, more subtle, and far more refined. It's more filmic, and closer to that SDR Blu-ray we all love, but the image has more depth and everything looks 'right'. The extra brightness lost going from that small screen to a, by comparison, stratospheric screen size is moot; the JVC 'seems' brighter due to the extra light in the room, plain and simple.

No argument from me about that; I find super-bright HDR TVs look bloody terrible and I never want to view movies like that. The image provided by a JVC projector is so much better and more natural it's silly.

 

Really, the whole thing comes down to your preferred content, disk mastering and system gamma. There is no "HDR" on a projector; there is only SDR video displayed as SDR, or HDR video displayed as SDR with tweaked gamma.

Yes, some people are in a position to run the projector slightly brighter for "HDR" content at the expense of black level, but many are not, because the projector is maxed out already or they don't want to sacrifice black level. In any event they will be viewing SDR no matter what they do, and there is also the option of running SDR content at the same or higher peak output level than HDR, with tweaked gamma for that HDR look. No, it won't look exactly the same as the HDR-mastered version because it was mastered differently, although if one spends a bit of time experimenting with gamma the overall result can be very bloody similar to the "HDR" version, depending on the title and how drastically it was changed from the SDR version.

I find that by the time I tweak the gamma of the 4K version to get the "look" I want, it ends up looking VERY close to the 1080 disk version, at least for the couple of titles I have experimented with. The exaggerated high-contrast "HDR" look that so many people seem to like does not appeal to me. I can understand why people would like that exaggerated contrast "look", and good luck to them, but it's not what I'm after.

 

I should stress that I consider the way 4K disks are mastered is not ideal for projector users. On a big screen in a dark room, high peak output levels WAY above what's possible with a projector are counterproductive to the viewing experience IMHO. 1000-plus nits on a BIG screen is quite frankly ridiculous and highly undesirable IMHO. A super-low black level is vastly more important to picture "quality" as far as I am concerned. JVC's best efforts are still not good enough IMHO; progress in regard to contrast ratio has completely stalled, and black level has gone backwards over the last 3 or 4 generations due to the push for more lumens.

 

As we here can appreciate, we don't need a super-bright picture to get a fantastic image; in fact an image with super-bright highlights would be unviewable on a big screen, and with current technology super-high peak output comes at the cost of a dramatically increased black level, which is the LAST thing I would want. High contrast and a super-deep black level rule for both SDR and HDR content.

 

The whole 4K HDR ecosystem is designed around the Joe Average consumer who does not understand or appreciate what constitutes a "quality", accurate picture and views on a relatively small, uncalibrated TV in a non-dark room. It's understandable why a new product would be directed at the biggest market, but projector users are not best served by content mastered for 1000 nits plus. Mastering the content for the peak output the display will achieve is by far the best approach.

 

19 hours ago, Javs said:

It's pretty clear from your comments that you have no proper experience with HDR, so literally everything you say about it is frankly an assumption.

Don't treat me like a fool, mate; I didn't come down in the last shower.

Let's get one thing perfectly clear: THERE IS NO SUCH THING AS "HDR" ON A PROJECTOR; it's displayed as SDR with tweaked gamma, end of story. The fact that you, and others, like the look of 4K remastered video displayed that way is irrelevant; it is what it is, and it sure isn't HDR display.

As I said before, "true" 1000 nit plus HDR display is TOTALLY inappropriate for big screen dark room viewing, so dont think I am denigrating the image produced by the JVC projectors in any way. Its vastly better than any HDR TV as far s I am concerned. I wouldn't watch a move on one of them if you paid me.

 

 

Edited by Owen
  • Like 1

7 hours ago, Owen said:

SXRD and DILA are similar in concept but very different in contrast performance, with JVC's system outperforming Sony's by a LARGE margin.

Even though DLP projectors have very poor native contrast, "light bleed" is much less significant than on a much brighter LCD TV, for example. Projector screen uniformity is also much better than on LCD flat-panel TVs.

...

Can you give any comments on light bleed on DILA and SXRD? Is there some?

 

I like your views on HDR, brightness and projectors. I'm also a little personally undecided on HDR itself. I still recall an interview with Jodie Foster on her concerns about existing material being post-processed to be something it never was. Just like colorization. I understand shooting images in HDR, but take a photo or film shot at a single exposure...

It is like the Hubble photographs, when you realise the images you see are fantastic but they're taking a larger chunk of the EM spectrum and compressing or shifting it into the visible range.

 

I just want something where the image displayed has dark blacks that aren't blown out by bright spots in the image (or ruined by subtitles being turned on, where you can see the light-bleed effect go on and off as each subtitle is displayed...).


3 hours ago, Owen said:

I don't even use 50 nits with a projector because I have a strong dislike of "bright", and that includes bright highlights.

However, there seems to be plenty of people on the AVS forum using very high gain projection screens who view SDR content at MORE THAN 100 nits. Lots of people like BRIGHT.

...

 

 

Firstly, I don't know what AVS you are reading, but none of the guys I converse with over there (and I am a BIG part of that forum) are watching SDR at 100 nits; perhaps one or two, but everybody else is in the 12-16 fL range of brightness. Personally I watch SDR Blu-ray at 45 nits, which is about 14-15 fL. I find it perfect; it's still enough to blind me momentarily in some scenes, and the black level is nice and low.

But on a projector reaching 160k:1, the difference between low lamp with -10 on the iris and high lamp with -0 on the iris for HDR viewing is almost irrelevant, since the DI comes into play in both scenarios when APL gets low enough anyway. I get about 45k:1 at -0 and probably 75k:1 at -10. When the iris moves past -10 it starts to more aggressively increase the contrast, plus my PJ is at minimum throw, so it's not hitting 160k even if the iris were at -15; that would be at the long-throw end. Your eyes are barely going to notice a 45k-75k:1 difference; it IS noticeable but it's getting difficult, and when the DI is thrown into the mix I get black fades so dark in either mode that I can't see my hand in front of my face for a good 4-5 seconds after a great fade to black.

As I said before, the average picture level in HDR, with about 100 nits peak, is actually THE SAME as when I watch Blu-ray at 45 nits. It's ONLY the real specular highlights that move higher than that: things like light sources, neon signs, explosions, and even then some of them don't do much. Blade Runner is a recent release that has been jaw-on-the-floor stunning in HDR, with its neon landscape and deadly inky blacks scattered throughout.

 

Here is a simple plot showing what is actually going on in a scene graded in HDR: the left image is log gamma, and the right is the HDR grade of the image. I am sure you know how to read this since you are a video expert, but as you can see, most of the content sits below 100 nits and the highlights above that are scaled. So, given we never watch SDR Blu-ray at 100 nits anyway (which, by the way, is graded nit for nit), you can see that HDR follows a different scale after 100 nits. 0-100 nits is actually 50% of the entire possible HDR signal range, and most titles are only maxing out at 4000 nits, so that top end has to be tone mapped; you can see that even if a display were 4000 nit capable, there is generally not much up there.
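That "0-100 nits is about half the range" point can be checked directly against the published ST 2084 inverse EOTF; a quick sketch (the nit values chosen are just the common reference points):

```python
# Inverse ST 2084 (PQ) EOTF: absolute luminance in nits -> normalised signal (0..1)
M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def nits_to_pq(nits):
    y = (nits / 10000.0) ** M1
    return ((C1 + C2 * y) / (1 + C3 * y)) ** M2

for nits in (100, 1000, 4000, 10000):
    print(f"{nits:5d} nits sits at {nits_to_pq(nits):.1%} of the PQ signal range")
# roughly: 100 nits ~ 51%, 1000 nits ~ 75%, 4000 nits ~ 90%, 10000 nits = 100%
```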

 

JL7l47A.png

 

In real life, it's generally tiny details, such as LED diodes in a room or a small specular highlight of the sun off a car window, which are the really blinding things; HDR GRADING allows far more latitude for these kinds of things. If you were to apply the same logic to a 100 nit SDR grade, you would end up with a far less contrasty image; it will be flatter.

 

I stand by the last sentence of my previous post. Perhaps there is no need to spew your 'no such thing as HDR' mantra onto every person who casually mentions it (I felt compelled to post after reading you stomping all over people's enjoyment over the past couple of months; you actually talk to users like they are children, feeling you need to school them, and it's frankly very rude) when you have no experience with the state of UHD HDR as it sits today. You are fixating on the by-the-book definition of HDR, but ignoring that they are now mastering films for display devices with far more dynamic range than before. In the case of our specific JVCs, a well-set-up HDR curve surpasses anything I have actually seen on flat panels. And there is no need, Owen, to assume your taste in film contrast and colour grading supersedes and is superior to everybody else's; no, we don't like jacked-up contrast for the sake of jacking up contrast either. I am MOST impressed with the back-catalogue films getting the HDR treatment over anything else; they have literally never, ever looked better.

 

Crouching Tiger, Hidden Dragon. E.T. Blade Runner. Those films are freaking stunning in HDR. They completely blow away their SDR Blu-ray counterparts. Before you say they were specifically remastered for UHD, remember that Blade Runner is still going off the Final Cut remaster from 2007. I did see that film projected in 4K back in '07 too, and have watched the Blu-ray probably 50 times, and the UHD version completely blows it right out of the water.

 

Whatever voodoo magic or false sense of HDR you want to call it, it's a far superior audiovisual experience to anything Blu-ray can offer. End of story.

 

I can't help but think that if you saw what I am seeing, if you saw Blade Runner on my X9500 with its extremely tasteful ST 2084 gamma, not overdone in the slightest, you might seriously change your tune. But if you want to be stubborn about HDR and its advantages, all the power to you I guess; it's not my place to force my views on you, but just remember, that's what you are doing to others first.




2 hours ago, Javs said:

Firstly, I don't know what AVS you are reading, but none of the guys I converse with over there (and I am a BIG part of that forum) are watching SDR at 100 nits; perhaps one or two, but everybody else is in the 12-16 fL range of brightness. Personally I watch SDR Blu-ray at 45 nits, which is about 14-15 fL.

To support this:

 

In ALL my time on AVS I have NEVER come across anyone who knows what they are doing watching SDR on a projection setup at 100 nits. There are people at one extreme like zombie, who owns a 150" HP 2.8 gain screen, but as he recently highlighted he is enjoying that because he can run HDR at low lamp with a screen that size. At the other extreme there's Manni (who maintains a massive resource on calibration of JVCs on AVS) and Dave Vaughn (tech writer and reviewer for Sound & Vision magazine), who run screens a tad smaller than mine with their 84/88" and get away with relatively low gain screens like I do. None of these guys, like Javs here, are running SDR at 100 nits, and all are absolutely meticulous in their setups and know the capabilities and ins and outs of setting up these projectors to optimum. To suggest they might not know what they are doing would be ludicrous in the extreme.

 

And for the record, as I posted with the Harkness suggestion re 14 fL +/- 3 fL:

SMPTE (the Society of Motion Picture and Television Engineers) state 16 fL, accepting as low as 4.5 fL for certain conditions and as high as 22 fL for drive-ins. This is the society of motion pictures stating how movies should be appreciated!

 

20F%20TC%20On%20Screen%20Light%20Measure

 

Similarly DCI (Digital Cinema Initiatives), which is by definition an initiative by the major studios to make sure their movies are enjoyed and viewed as intended, specify 48 candela/m², or 14 fL.

 

http://dcimovies.com/archives/DCI_CTP_v1_1.pdf

 

I hope NO ONE walks away from reading this thread thinking for one minute that they should be watching SDR on a projection setup at 100 nits, as that would simply be an incorrect assumption.

  • Like 2

16 hours ago, Javs said:

I can't help but think that if you saw what I am seeing, if you saw Blade Runner on my X9500 with its extremely tasteful ST 2084 gamma, not overdone in the slightest, you might seriously change your tune.

Mate, I'm not saying I don't want anything to do with 4K disks, or that I would not like the way Blade Runner looks on the JVC. I'm simply disputing that, however good it or other titles look on a JVC, it is because of HDR. It looks good because it has been remastered; that's the big difference.

 

At this point we should define what HDR actually is.

 

First, let's look at the video used in commercial cinemas: be it 2K or 4K, it's 10-bit, uses DCI colour and the same display gamma, because the target brightness is the same for both.

Would we classify cinema presentation, or the video used in cinemas, as HDR? No, we would not. It's actually below SDR by definition.

 

 

So how does 4K HDR differ from the above? It's 10-bit and uses DCI colour, so no difference there. What differentiates HDR is the requirement for a VERY bright display and a skewed gamma curve to go with it.

The very different gamma doesn't alter the dynamic range of the content in any way; it's ONLY the super-bright display that increases on-screen “dynamic range”, and with a projector we are lucky to get the brightness of an old CRT TV, which is SDR by definition.

The “HDR” video contains no more information than the 10-bit cinema version; it's the same information displayed differently due to gamma manipulation.

 

Yes, some people may be able and willing to run their projector a little brighter when viewing HDR content, but it's always going to be a long way from HDR presentation. They could also run normal SDR at the same peak level and use a tweaked gamma curve to get the average picture level back down to what it would have been at 50 nits; that would be a much fairer comparison.

If it's OK to remap HDR gamma to suit the display, what's wrong with doing the same with SDR?

 

Even if we can reach an arbitrary 100 nits, which is no magic number by the way so I don't know why you bring it up, it's only as bright as a 30-year-old CRT TV. Who would classify an OLD CRT TV as an HDR display device?

 

 

 

Now for a little experiment. If you have a PC capable of HDR playback this is easy; if not, a disk player and a camera can be used for a comparison.

Play a HDR clip and pause at a scene that you think best displays what HDR is capable of on the projector. Do a screen capture and save the image in a lossless 10 bit format and as an 8 bit file.

Now compare the 4K 10 bit image to the 4K 8 bit image on the projector using the exact same gamma and brightness for both. You will see that the two images will look effectively identical. 

We can also downscale the 8-bit version to 2K to emulate a 1080 Blu-ray and it will still look the same on the projector, although with potentially less resolution if the original source was exceptional. This proves that what people see and like about 4K “HDR” remastered content displayed on a projector is predominantly due to the remastering and has stuff all to do with “HDR”.

10-bit isn't needed, because 100 nits or less doesn't need 10-bit. For 1000 nit displays (HDR), 10-bit is required because the steps in video level are too large without it.
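The quantisation side of that experiment can be mimicked in a few lines: encode the same 0-100 nit ramp at 8 and 10 bits with the same gamma and see how far the round trip lands from the original. A toy sketch, assuming gamma 2.4, full-range codes and a 100 nit peak, rather than an actual screen capture:

```python
import numpy as np

peak, gamma = 100.0, 2.4
scene = np.linspace(0.0, peak, 100_000)            # smooth 0-100 nit luminance ramp

def roundtrip(nits, bits):
    """Encode the ramp to integer codes at the given bit depth, then decode it back."""
    levels = 2 ** bits - 1
    code = np.round((nits / peak) ** (1 / gamma) * levels)
    return (code / levels) ** gamma * peak

for bits in (8, 10):
    err = np.abs(roundtrip(scene, bits) - scene).max()
    print(f"{bits:2d}-bit worst-case error on a 100 nit display: {err:.3f} nits")
```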

 

People commonly attribute what they see with 4K disks on a projector to 4K, HDR and WCG, when in fact all three have very little influence on what we see; it's all about the content mastering and display setup.

 

I hope this clarifies why I say HDR in and of itself has little if any relevance to projectors. The remastered content obviously does, for those who can find something to watch, but that's a separate issue IMHO.

 

Just because 1080 disks typically look different to 4K remastered titles doesn't mean it's not possible to master 2K disks for the same look on a projector; the experiment above proves it is possible. It's not going to happen, so it's a moot point; if one wants the remastered “look”, the 4K disk is obviously the way to go.

Edited by Owen
  • Like 3

I keep thinking of HDR in terms of the Hubble NASA pictures. They look beautiful, but they don't represent the reality of what a human eye can see. They shift and compress the spectrum to make it visible.

 

I'm not so concerned with HDR manipulation of SDR-captured images (e.g. each frame of the damn film was shot at a single exposure) as opposed to a projector having full 4K resolution and being able to reproduce an image as it would be if I printed a frame onto a print. So no blooming, no light bleed. I don't want to turn subtitles on, cover the bottom half of the frame, look just at the top half and still be able to tell you when the subtitles are displaying even though I can't physically see them.


17 hours ago, Mobe1969 said:

Can you give any comments on light bleed on DILA and SXRD? Is there some?

Can you define what you mean by "light bleed"? I'm not clear on your interpretation.

 

2 hours ago, Mobe1969 said:

I'm not so concerned with HDR manipulation of SDR-captured images (e.g. each frame of the damn film was shot at a single exposure) as opposed to a projector having full 4K resolution and being able to reproduce an image as it would be if I printed a frame onto a print.

I'm not aware of any cinema camera that uses multiple exposures to create a true HDR image; moving images don't allow that technique for a start, due to the blur that would result from combining images captured at different points in time. All we get is a single shutter speed and a single aperture setting per frame. The native dynamic range of the camera is what it is; it's not like there is some "special" HDR camera.

The exact same source video is used to create the SDR and HDR versions of a movie. HDR is an "effect" created by running the display very bright and mastering with a particular gamma to complement the super-bright display.

 

As for 4K resolution, there is no such thing. 4K video is limited to about 3K luma resolution, and with 4K Blu-ray about 1.5K chroma resolution due to colour subsampling, and that's best case in a lab-perfect capture environment with perfect focus, optimal lens aperture and absolutely ZERO motion. Real-world capture, especially at 24 fps, is FAR more challenging and VERY limited; any movement of camera or subject trashes resolution, as does imperfect lens focus.

 

35mm cinema film has no usable amplitude response past 2K; we can scan film at 4K or 8K, but that just reduces the digitisation losses, it doesn't get around the limitations of the source.

Edited by Owen


