
Projector calibration



On 02/06/2018 at 10:41 AM, franin said:

I'm with you Dave, do it yourself. I've just recently received my X5900 (last night) and am very curious about the JVC Autocal (coming from Calman), and now you're mentioning Arve's gamma curves. What are Arve's gamma curves, and can you upload different gamma curves for the JVC?

 

 

Franin, did you manage to generate the S gamma curve?




It seems that part of the OPPO 203 Tone Mapping is to create an S-curve, so anyone using the OPPO 203 with the Tone Mapping FW should have this.

 

 

 

Edited by CAVX

  • 2 weeks later...

Not that I'm using either an Oppo with Tone Mapping or a JVC with Autocal and the Arve gamma curves.

 

But it seems to me that the Arve gamma curve is very similar to BT.1886 when compared to the long-used movie/TV standard of 2.2 gamma.

 

Which is fine if a movie has been mastered at that gamma, but the truth is that is seldom, if ever, the case...usually it's mastered to the long-used standard gamma of 2.2.

 

Things never really moved on from the standard CRT display gamma of 2.2. BT.1886 was designed to bring the blacks out quicker than 2.2 on a flat panel/projector display...worth a try with FTA TV on a flat panel, but I wouldn't use it for movies.

If a movie is not mastered to BT.1886, you lose contrast, so it's pointless using it, and likewise the Arve gamma curves...it's an option, yes, but few movies are going to benefit from using it IMHO.

 

It remains a switchable option until HDR10+-encoded movies become the norm; then gamma will be changed by the metadata sent and received on a scene-to-scene basis...basically making all displays "gamma neutral", with little need for specific calibration apart from greyscale and colour.


1 hour ago, Tweaky said:

Not that I'm using either an Oppo with Tone Mapping or a JVC with Autocal and the Arve gamma curves.

But it seems to me that the Arve gamma curve is very similar to BT.1886 when compared to the long-used movie/TV standard of 2.2 gamma.

Which is fine if a movie has been mastered at that gamma, but the truth is that is seldom, if ever, the case...usually it's mastered to the long-used standard gamma of 2.2.

Things never really moved on from the standard CRT display gamma of 2.2. BT.1886 was designed to bring the blacks out quicker than 2.2 on a flat panel/projector display...worth a try with FTA TV on a flat panel, but I wouldn't use it for movies.

If a movie is not mastered to BT.1886, you lose contrast, so it's pointless using it, and likewise the Arve gamma curves...it's an option, yes, but few movies are going to benefit from using it IMHO.

It remains a switchable option until HDR10+-encoded movies become the norm; then gamma will be changed by the metadata sent and received on a scene-to-scene basis...basically making all displays "gamma neutral", with little need for specific calibration apart from greyscale and colour.

I rarely say this on forums...but wtf are you talking about?

" You know nuffing John Snow"

Is that really a troll post, because you got me?

What a load of rubbish.

 


I won't be that harsh but what I will say is that using the S-Curve gives the image real pop.  It makes you wonder why we weren't doing this sooner.  Images have depth now.    

 

What I find interesting is that prior to getting ISF certified, I could never get a gamma of 2.2 (measured on the charts) on my system.  The one time I did, it killed contrast.  It measured flat and also looked flat.

 

Since the training, I understand what it does and how to use it.  But that 2.2 value is questionable, because a gamma of 2.35 looks better.  Part of me thinks 2.2 was chosen simply because it sits in the middle of the range of gamma options, and nothing more.

 

As for program mastering, just look at Universal's botch up on Blu-ray for FLASH GORDON, TOTAL RECALL, TERMINATOR 2 and the list goes on.  All of these have a gamma other than 2.2 used for mastering.  The quick fix on my system was to use a higher value and readjust brightness/contrast.  And at one point, I had a "universal" mode set up on my projector for these titles. 

 

 

 

    

 


 

Link to comment
Share on other sites



Well, I suspect my last post was received with "WTF are you on about" because you need to know a bit about the history of calibration specs, why they were chosen, and what equipment they were originally used on...plus some bad post editing on my part.

My mistake for assuming that was the case.

 

A gamma of 2.2 is the old standard that was originally the default norm for calibrating CRT displays and projectors; that's where its use comes from. It's what I've calibrated all my displays and projectors to [especially the old 3-lens CRT PJs when I ran them] for at least 15 years.

It looked great on those CRT displays because of their huge advantage in black levels, and consequently contrast, compared to all but the very latest OLED screens.

 

When flat-screen displays became the norm, people found that a gamma of 2.2 no longer looked so good on them, mainly because black levels and contrast are so much worse on a flat panel than on a CRT. So a new gamma curve, better suited to flat-panel displays and intended to emulate the now-lost contrast ratio, was specified: BT.1886.

BT.1886 looks similar to the S-curve calibration you guys are using [but not as severe] compared to a straight gamma of 2.2/2.3/2.4 etc. Like the S-curve, it basically leaves the midrange alone and tweaks the low end/blacks and high end/whites; it's very easy to see this in action if you switch to BT.1886 from another gamma.
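For anyone who wants to see this numerically, here's a minimal Python sketch of the BT.1886 EOTF next to a pure power-law gamma. The white and black luminances (100 and 0.1 nits) are illustrative values only, not measurements from any real display:

```python
# Sketch of the ITU-R BT.1886 EOTF versus a pure power-law gamma.
# Lw/Lb below are assumed example luminances, not measured values.
def bt1886(v, lw=100.0, lb=0.1, gamma=2.4):
    """Luminance (nits) for a normalised signal v in [0, 1] per BT.1886."""
    root = 1.0 / gamma
    a = (lw ** root - lb ** root) ** gamma       # user gain term
    b = lb ** root / (lw ** root - lb ** root)   # black-level lift term
    return a * max(v + b, 0.0) ** gamma

def power_gamma(v, lw=100.0, gamma=2.2):
    """Pure power-law display gamma for comparison."""
    return lw * v ** gamma

# Near black, BT.1886 "brings the blacks out quicker" than a flat 2.2:
for v in (0.05, 0.1, 0.5, 1.0):
    print(f"{v:4.2f}  BT.1886: {bt1886(v):8.3f} nits   gamma 2.2: {power_gamma(v):8.3f} nits")
```

Run it and you can see the low end lifted and the top end pinned to the same peak white, which is the "leaves the midrange alone, tweaks the ends" behaviour described above.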

 

Some people prefer a higher gamma, as it's perceived to give a picture more punch, which I get, but to me it looks slightly out of whack...I know Tony [forum member and ISF tech REC 709] will often calibrate a customer's display to 2.4, and I know of several others at the AVS forum who say they regularly calibrate to 2.5.

I suppose 2.35 is a good compromise for projector use, but I find anything higher than that annoying.

I suppose it depends what you look for and consider a great picture. To me, an amazing picture is being able to see all the gradients in shadow detail, like the folds in clothes, with all those gradients smooth; the higher the gamma above 2.2, the less smooth those gradients become...don't believe me? If possible, do two calibrations on the one TV/PJ using different HDMI inputs, one at 2.2 and one at 2.4, and compare for yourself.
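That gradient claim is easy to check with a few lines of Python: compare the light produced by the first 8-bit code values above black at gamma 2.2 versus 2.4, normalised so peak white is 1.0. The code values chosen are arbitrary shadow levels, purely for illustration:

```python
# Shadow output of a few 8-bit code values at two gamma settings.
# Higher gamma pushes the same shadow codes much closer to black.
def step_luminance(code, gamma, bits=8):
    """Relative luminance (peak white = 1.0) of an integer code value."""
    return (code / (2 ** bits - 1)) ** gamma

for code in (8, 16, 24, 32):
    l22 = step_luminance(code, 2.2)
    l24 = step_luminance(code, 2.4)
    print(f"code {code:3d}: gamma 2.2 -> {l22:.6f}   gamma 2.4 -> {l24:.6f}")
```

At gamma 2.4 each of these shadow codes produces roughly half the light it does at 2.2, so the same near-black gradient is squeezed into a smaller luminance range on the screen.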

By the way, I always calibrate to a luminance of 120 cd/m². You might get a brighter picture if you calibrate at a higher light output, but in the end it's like using compression in music playback: you lose dynamic range, in this instance visually.

 

Anyway, what I was trying to convey in my original, misunderstood post was this:

Using the S-curve gamma settings seems a double-edged sword...one that would end up cutting the user most often, given most of the playback material available, unless they knew exactly what was going on and just how far from 'how it should look' the picture they are viewing is.

 

Choosing to use these S-curves as your prime/blanket settings could be doing you a disservice in the long run, as unfortunately Blu-ray discs, or any other format you care to name, never state what gamma they were mastered at.

Most program material [movies/TV shows] prior to 1990/8 would almost certainly have been mastered at gamma 2.2.

I don't know the take-up level of BT.1886 in mastering houses once their old CRT monitors died and were replaced with flat screens, or whether it was actually adhered to...old habits die hard.

 

It depends whether you want to view what the director envisioned, or a variation of it that gives you more visual punch...sort of like using the loudness button on a '70s amp.

 

PS: I have a set of 3 NOS tubes for a NEC 9PJ CRT PJ, plus a point board and setup remote, if anybody needs one.

I've got a spare deflection board for the same PJ as well. :)

 

CAVX...how did you find the 'Last Jedi' Blu-ray?

I found horrible colour shifts [I'm not the only one; see the review elsewhere at this forum]...reds were garish, blues were oversaturated...go figure.

 

 

Edited by Tweaky

I'll just zip up my flame suit now.  

 

The definition of gamma is "a display's non-linear response to an electrical signal."

 

In theory, the ideal response is twice the light out for twice the signal in: a 45-degree line from 0 (black) to 100 (white) on an x/y chart.  In the real world, it does not happen that way.  The capture lens and image chip do one thing to the signal, and we compensate on the display end with our gamma curves.  The curves are created to try to bring it back to that "perfect" straight line.  Our eyes are non-linear too, so gamma curves in a TV are now user-selectable to better match the room environment the display is actually viewed in.
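That compensation can be sketched in a few lines of Python, assuming a simple pure power law (real camera and display curves have extra terms this ignores): the capture side encodes with the inverse power 1/gamma, the display decodes with gamma, and the round trip lands back on the straight 45-degree line.

```python
# Minimal encode/decode round trip for a pure power-law gamma.
GAMMA = 2.2

def encode(linear):
    """Camera-side transfer: linear light -> signal (inverse gamma)."""
    return linear ** (1.0 / GAMMA)

def decode(signal):
    """Display-side transfer: signal -> linear light (gamma)."""
    return signal ** GAMMA

# The round trip restores the original linear values.
for light in (0.0, 0.18, 0.5, 1.0):   # 0.18 is the classic mid grey
    assert abs(decode(encode(light)) - light) < 1e-12
```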
 

Throughout the history of TV, the room is supposed to be dark, with ambient light no brighter than 10% of the peak white from the display itself.  Does that happen in the real-world homes where I calibrate TVs?  No!  Almost never.  In most homes it's white walls and ceilings, white tiles on floors, and big windows.  Light simply can NOT be controlled.

 

I like to have the customer in the room when I calibrate their display, and I explain the process and what I am doing as simply as I can.  When I explain gamma to a customer in their home, I like to refer to it as the rate at which the display comes out of black.  The higher the number, the slower it comes out (which works better in a darker room).  In a light or bright room, we have to go to a lower value or a correction curve to come out of black faster, to allow us to actually see the shadow details.  I don't like doing it because it goes against everything I've been taught, but in some homes I have no choice.  So what we think of as "textbook" goes right out the window sometimes.  In the end, I have to get as close to the standard as I can and still keep my customer happy.

 

Whilst I learned so much from doing that ISF course, I wish there was a support network.  If there is, I don't know about it, so I sometimes feel like the Terminator: "Skynet presets the switch to read-only and we are sent out alone".  This industry has changed more in the last 5 years than it has in the previous 20, or maybe even 30.

 

Gamma in software:  I have pretty much bought every version of T2 on every packaged-media format it has been released on.  There is a HUGE difference between the Skynet Edition BD and the latest 4K release, whose BD is much darker than the previous release.  If the Skynet Edition was mastered at 2.2, then that is crap!  Apart from the overuse of DNR, the new transfer looks stellar.  Comparing the two is like upping the brightness control of the display for the Skynet Edition: the lower midtones are quite bold, and you see things that even James Cameron says (in the running audio commentary of one previous release) were not seen on film.  Therefore, the latest version is much closer to the "director's vision", or what was seen on film, IMO.

 

CAVX...how did you find the 'Last Jedi' Blu-ray?



I found horrible colour shifts [I'm not the only one; see the review elsewhere at this forum]...reds were garish, blues were oversaturated...go figure.

 

I am still running an older 1080p projector, but am now using the OPPO 203.

 

HDR-10 has been a challenge, and then TLJ comes out in Dolby Vision and throws a really big spanner in the works.

HDR-10 is an open standard and allows studios to master at varying levels.  The OPPO allows user tweaks to make the picture better (or worse).
Dolby Vision is a closed standard, and it locks out everything.  I get a message that states "User controls cannot be used for Dolby Vision content."

 

So I had to find a workaround to make this work on my system, and some (most) of what I have done is thinking right outside the box.  The Last Jedi has a certain look, so how much of that is the director's vision versus technical error?  I don't know.  The Matrix, on the other hand, is a completely different beast.  I don't have any other DV titles to compare at this time.

 

The latest FW update for the OPPO 203 created a huge white line down the left-hand side of the 16:9 image.  Very annoying on my CIH system when viewing 16:9 content.  OPPO gave me instructions on how to create a system log from the player, save the txt file, and email it off to them.  Hopefully they can find a solution from the data the player generated and fix this in the next FW.  In the meantime, I have found that using the "Convert to Dolby Vision" option in the setup has worked for me.  This is where it gets really interesting, as I have had to re-calibrate everything.

 

So my first screening of The Last Jedi in Dolby Vision was crappy, based on a system set up for HDR-10.  It looked smoky.  I didn't like it.  It also killed the 21:9 mode, but that is another story.

So I did some experimentation and found that you can really go to town on the contrast and brightness.  Because there are no test patterns for Dolby Vision, my workaround was (did I say outside the box?) quite different, but it has given me a result that works.

 

To set the black level, I switched to the 4x3 mode of the projector and, whilst watching the letterbox bars, set brightness to match the projector-generated side pillars.  This sets what I would call the "black floor" of the projector, or as black as this older DLP can go.

I then found that the image was dark, so I had to up the gamma, then reset brightness/contrast.

So I had to go from 2.4 to 2.8.  

Contrast was set using (again, due to no test pattern) the scene where Kylo smashes his helmet.  I paused the image and used the bright white sparks and shattered glass as my reference.  I took contrast way up until the whites crushed, then backed it off to allow me to see the details.  Later I noticed other lights in the rebel ships were crushed, so a click or two down revealed the details I was meant to see.

So contrast on my system has gone from typically around 50 up to 78!  

Brightness is currently at about 42.  At 2.4 gamma, I had to drop brightness down into the 30s to level-match, which made the image darker.  So this is how user-selecting a gamma curve can be made to work for me: increasing the correction value allows me to increase the brightness setting.  It maintains the black level but gives the lower midtones a bit more life.

 

Once I was OK with brightness/contrast, I took a white-field measurement, then set colour levels at their percentage level against white.  I am finding colours to be bright, but not harsh or oversaturated.

 

The OPPO 203 has a variable HDR-to-SDR slider as part of the converter.  If this slider was too low, the peak whites in the signal would flash, alternating between crushed whites and what we should be seeing.  It has been a challenge to find the fine balance where this flashing does not occur while still having a nice bright image.  Since converting to DV, this slider does nothing, but the flashing issue is no more.

 

I'll leave it there for now.  I'm sure after reading this you will be like WTF?             

 

 

             

 


It seems those HDR-10 patterns work with this conversion as well; though there are some limitations, it is still better than trying to adjust using actual pictures.

 

There are two black-crush tests, and it seems my DLP can only go down to 2% black.  So again, using the side pillars of the 4x3 mode, I can adjust brightness until I can see all the visible black bars and level-match the black background to the side pillars.

There are three white-crush patterns, and the first one, up to 1000 nits, behaves like any SDR contrast pattern: take contrast as high as it will go whilst still being able to see all the bars.  Easy.

 

Recheck brightness.    


  • 11 months later...
On 27/06/2017 at 8:10 PM, betty boop said:

It's really simple: just hold the sensor at the centre of the screen pointing back at the projector, and you want to close the iris down for about 12-14 fL for Blu-ray, DVD, FTA etc.  fL = fc multiplied by your screen gain; fc is usually what you measure with the light meter :)

fL to nits is a very simple conversion; use the link below if you like, but 30 fL is about 100 nits...

http://www.unitconversion.org/luminance/foot-lamberts-to-nits-conversion.html
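The two conversions quoted above, sketched in Python (the 12 fc / unity-gain example values are purely illustrative, not a recommendation):

```python
# fL <-> nits, and foot-candles -> foot-lamberts via screen gain.
# One foot-lambert is approximately 3.4263 nits (cd/m^2).
NITS_PER_FL = 3.4262591

def fl_to_nits(fl):
    """Convert foot-lamberts to nits (cd/m^2)."""
    return fl * NITS_PER_FL

def fc_to_fl(foot_candles, screen_gain=1.0):
    """Reflected foot-lamberts from incident foot-candles and screen gain."""
    return foot_candles * screen_gain

print(f"30 fL is about {fl_to_nits(30):.0f} nits")             # ~103 nits
print(f"12 fc on a unity-gain screen = {fc_to_fl(12):.1f} fL")
```

This matches the "30 fL is about 100 nits" rule of thumb, and 12 fc on a unity-gain screen lands right at the bottom of the 12-14 fL target range.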

Sorry to drag up an old topic.  I have the light meter you recommended.  To measure the light output, is there a specific test pattern/screen you have to use, or do you just turn the projector on and measure while the menu screen is up?


Just now, T800 said:

Sorry to drag up an old topic.  I have the light meter you recommended.  To measure the light output, is there a specific test pattern/screen you have to use, or do you just turn the projector on and measure while the menu screen is up?

Just a pure white screen is needed :) most test discs have the pattern.  WOW, THX or any others will be the same :)



