
Owen

Members
  • Content Count

    13,022
  • Joined

  • Last visited

Community Reputation

100 Good

About Owen

  • Rank
    10,000+ Post Club

Profile Fields

  • Location
    NSW
  • Country
    Australia


  1. Owen

    madVR settings and JVC x5900be

    The TV show was likely 60Hz.
  2. 8 bit is, and always has been, just fine for displays running at SDR-like brightness, and we have been editing digital photos and video on 8 bit PC displays since the software first became available to do it back in the 1980s. I have been doing so since then and have never had any problems with banding in dark areas of images. The only time I see issues in dark areas of video images is when they are over compressed, such as with Netflix, because colour resolution has been trashed; it has nothing to do with any 8 bit limitation. All the 1080 Bluray and 4K Bluray titles I have compared look the same after gamma has been corrected, and there is no technical reason they should not. That's not to say that there aren't poorly mastered movies out there, or movies in which shadows were deliberately crushed to black for effect. I have seen both, but again it's not due to 8 bit encoding.

It should be noted that 8 bit is suitable for video and image distribution, but not for editing that requires very significant alteration to the image; it was never designed for that purpose. Decent still cameras can record 12 to 16 bit RAW uncompressed images for editing and 8 bit compressed .jpg images for immediate display simultaneously, for good reason. If the image is as you like it out of the camera the 8 bit .jpg is fine, but for serious gamma or colour correction the RAW version should be used for best results. When we "tone map" SDR video to look right on a projection system running at the same peak brightness as it would for HDR video, the corrections required are minor and well within the limits of 8 bit video, so no banding issues. In comparison, the "tone mapping" required to get HDR video to look decent on a projector requires VERY dramatic correction, so it's a good thing it's 10 bit to start with. When doing adjustments to video it is upsampled to 10, 12 or 14 bits first to avoid the rounding errors that would occur if we were to do the adjustments in 8 bit directly (see the sketch at the end of this post). The old Sony SXRD TVs that you and I owned back in the day used 12 and 14 bit internal video processing and a 10 bit display according to the service manual, and that was in 2007.

The black level of the display can have a dramatic effect on perceived shadow detail and the required gamma calibration. Let's take a typical LCD TV and a high end Plasma TV with a black level 10 times lower than the LCD. If an industry standard SDR gamma curve is used on both, the shadows on the Plasma will be 10 times darker than on the LCD, which may make them perceptually too dark and difficult to see even though all the video information is on screen. To correct this, gamma needs to be adjusted to bring the Plasma out of black quicker, but this is a non standard gamma. CRT TVs, and CRT projectors in particular, have a black level 10 times or more lower than any Plasma TV and come out of black very slowly. They also had no way of adjusting gamma, so they tend to crush dark shadows to black VERY easily unless the brightness control was used to elevate the shadows, but doing that elevated the black level and destroyed contrast. I was using a PC to do gamma correction for CRT projection 20 years ago to address this problem. JVC projectors have a black level MUCH lower than any Plasma TV, and the two I have owned both come out of black slowly, which required gamma correction similar to CRT.
The inbuilt preset gamma curves are far from ideal, and if you calibrate by the numbers using a colour meter and software you won't get the right outcome either, because of measurement errors near black (due to the low black level) and the inability of the software to compensate for screen size, viewing distance and ambient lighting. A big screen that fills your field of view in a dark room will be perceptually brighter than a small screen (TV) running at the same brightness. This is why running SDR video on a big projector screen at the 100 nits it was mastered for, with standard SDR gamma intended for TVs, looks too bright. It looks wrong because it is wrong; we MUST adjust gamma to compensate, which involves pulling the mid tones down in level, lowering the average picture level, while leaving the white level alone to get bright highlights. That's what provides the high contrast "HDR" look that people like, even though the "dynamic range" of the image is unchanged. We certainly don't need HDR video to achieve this, because the projector is the limiting factor in dynamic range, NOT the video.

Bluray movies are mastered on and for flat screen displays, not projectors, and if a projector is used as part of the process it's unlikely to have been in a pitch dark room. The result is that there is going to be a disconnect between what the production staff saw and what we see at home if we run our projectors at 100 nits with standard SDR gamma; the images will be perceived differently, and that needs to be taken into account and corrected. With HDR video the disconnect is VAST, and getting an on screen image that looks like it should is pretty much impossible. Can you get an image you like with HDR video? Sure you can, but that doesn't make it technically "better" for projector use. If people crank up the bass and treble on their audio system it may sound "better" to them than a perfectly flat and accurate audio system, but that's a personal preference, not "better". Gamma adjustment is for video what a multi band graphic or parametric equaliser is for audio: we can use it to adjust out inaccuracy or to impart whatever "look" we like. However, just as with audio equalisation, we can make a right royal mess of the picture if we don't know what we are doing and why.
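A minimal sketch of the rounding-error point, assuming NumPy is available; the 1.2 exponent and the back-to-back adjustments are purely illustrative, not any particular product's processing. It shows that pushing intermediate results through 8 bit collapses distinct levels (banding), while doing the same maths at higher precision and quantising once at the end does not.

```python
import numpy as np

ramp_8bit = np.arange(256, dtype=np.uint8)   # a full 0-255 grey ramp

# Naive pipeline: apply a gamma tweak, force the intermediate result back
# to 8 bit, then apply a second (inverse) tweak and quantise again.
intermediate = np.round(255 * (ramp_8bit / 255.0) ** 1.2)
naive = np.round(255 * (intermediate / 255.0) ** (1 / 1.2))

# Higher-precision pipeline: the same two adjustments done in float
# ("more bits"), quantised to 8 bit only once at the very end.
precise = np.round(255 * ((ramp_8bit / 255.0) ** 1.2) ** (1 / 1.2))

print("unique levels, 8 bit intermediate :", len(np.unique(naive)))    # fewer than 256
print("unique levels, float intermediate :", len(np.unique(precise)))  # all 256
```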
  3. Sorry guys, I haven't had time to respond to this thread till now. I think it would be a good idea to explain how "SDR" and "HDR" video is actually created and what "tone mapping" is supposed to achieve. We can create our own at home without any need for a cinema camera and professional studio gear; all that's needed is a DSLR or good quality mirrorless digital camera that can capture 12 bit (or higher) "RAW" uncompressed and unprocessed images, a PC and some free image editing software. We don't need to shoot video, still images are fine and actually preferable for demonstration purposes. I should add that the bit depth a camera uses for capture tells us NOTHING about the dynamic range of the images; digital image dynamic range is entirely governed by the camera's imaging chip and associated electronic systems, NOT bits. On screen dynamic range is entirely governed by the display device, NOT the digital images that are being displayed.

OK, let's get going. First we need a few test images: an average daylight scene, a dark scene with bright highlights like a street at night, and a couple of challenging bright scenes such as a sunlit landscape with clouds in a blue sky and a typical sunset scene with lots of contrast. Now, using a display set up for the SDR standard of 100 nits peak connected to our PC, we load each image into the editing software and adjust the gamma and colour of each image to look how we would like it to look; this creates our "SDR" master images. Next we do the same thing again, but this time using a display set up to conform to the 1000 nit HDR standard; this gives us our "HDR" master images. Note that both the SDR and HDR masters were created from the same source images, which is the same situation as with movies. A cinema camera is just a camera, there is no special "HDR" movie camera. The only difference between the SDR and HDR versions of movies is how the content is mastered and subsequently displayed.

Now to the "tone mapping" thing. Tone mapping is simply a gamma adjustment that converts video mastered for 1000 nits to video suitable for a 100 nit display, so let's do that. Note: adjusting gamma alters the relative brightness of video levels between black and white but NOT the black or white levels; black and white levels are unchanged by gamma (see the sketch at the end of this post). So we take our test HDR master images, and using our editing software and a 100 nit display we adjust the gamma of our "HDR" images so that they look how we wanted them to look at 100 nits. Since we have already established what that "look" is with our 100 nit SDR master images, after gamma adjusting (tone mapping) our HDR images they should look identical to our SDR master images if we have done the job properly; any visible difference represents an error.

Now if we look at the gamma correction we had to use for each of our test images, each representing a different scene in a movie, we will see that there are significant differences in the correction required from image to image, so we cannot use a standard static "tone map" to correct them all, yet this is what projectors have been doing up until the latest generation. It's a HIGHLY compromised approach and gives VERY ordinary results, as even Javs admits. It's amusing to me how some on this forum say that HDR video on their X series JVC projector looks fantastic with the standard projector tone mapping; to each their own.
Anyway, this is where dynamic tone mapping comes in. It does a scene by scene gamma adjustment that can be much better than static tone mapping, BUT it will always be guessing at how the director-producer intended the scene to look when displayed at SDR-like brightness levels, and it will always be less accurate than SDR video for projector use. I have no doubt that dynamic tone mapping can be "good enough" most of the time, and since the viewer will normally have no reference point they will never be aware of any significant differences or problems. I should add that remapping gamma affects apparent colour saturation in mid tones and shadows, so if dynamic gamma is used, colour saturation tracking will suffer unless it is also corrected dynamically.

What gets me is that all this is a lot of stuffing around just to get a result that SDR video provides without ANY processing. With SDR video on a well set up projector we see what the director-producer intended us to see in every scene in every movie, AND we can have exactly the same dynamic range as HDR video on screen. I understand that people may like the way HDR video looks on their projector, as it will be "different" to the way SDR looks by default, BUT what needs to be understood is that they can have the same "look" with SDR video if they want; all that's needed is a suitable gamma setup and to run the projector at the same brightness for both SDR and HDR. No dynamic gamma (tone mapping) is required with SDR, and there will be a consistency from scene to scene that HDR video with dynamic tone mapping will never match. It's about time we educated consumers rather than perpetuate manufacturers' marketing BS about "High Dynamic Range" on projectors; it doesn't exist, and nor does it need to, as is obvious when viewing. Subtlety of tone and shade along with high native contrast are FAR more important to the movie viewing experience IMHO; I want super bright highlights like a hole in the head. HDR was never intended for BIG screen dark room viewing.
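A minimal sketch of the point that a pure gamma adjustment moves the mid tones but leaves black and white exactly where they were, assuming NumPy; the 1.4 exponent is just an illustrative value, not a recommended tone map.

```python
import numpy as np

def adjust_gamma(signal, exponent):
    """Apply a power-law gamma tweak to a normalised 0.0-1.0 signal."""
    return np.clip(signal, 0.0, 1.0) ** exponent

levels = np.array([0.0, 0.25, 0.5, 0.75, 1.0])
print(adjust_gamma(levels, 1.4))
# -> roughly [0.0, 0.144, 0.379, 0.669, 1.0]
# Mid tones are pulled down, while black (0.0) and white (1.0) are untouched.
```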
  4. That's an oversimplification mate. The display sets the on screen dynamic range, NOT the video. If your projector can provide 14 stops of dynamic range it can deliver 14 stops with SDR video and 14 stops with HDR video; it's a function of the display's contrast ratio. SDR video is not limited to 6 stops either. The mastering is intended for a display running 100 nits peak, so notionally that is the white level, but the black level can be ultra low, giving a crazy high dynamic range. 8 bit encoding has proved to be ample for a display running 100 nits peak, which is what it was designed for, as each step in video level is too small to be seen. However, when we move up to 1000 nits peak display brightness each step in video level gets too large with 8 bit and we get colour banding-posterisation, so 10 bit is required. In future, when 4000 nit and higher displays become available, it's anticipated that 10 bit will not be enough and 12 bit or maybe even more will become necessary (some rough numbers below). Now, when HDR video is "tone mapped" for 100 nits, the video levels above about 80% of peak white will be progressively compressed-truncated or even clipped to fit within the new 100 nit limit, and the end result is the same as SDR provided the mastering is not stuffed up. Since we are back down to 100 nits peak, or thereabouts, we no longer need 10 bit to avoid colour banding. If we compare a 16 bit DSLR photo with a lossless 8 bit version of the same photo side by side on a suitable PC monitor, they have exactly the same dynamic range on screen; more bits don't help.
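Some back-of-envelope arithmetic behind the two claims above, as a sketch in plain Python; the contrast ratios are illustrative examples, and the per-step figures assume a simple linear spread of code values, which real gamma/PQ encodings deliberately avoid.

```python
import math

# Dynamic range in "stops" follows from the display's contrast ratio.
for contrast in (20_000, 400_000):   # e.g. a decent LCD vs a high end projector (illustrative)
    print(f"{contrast}:1 contrast ratio ~ {math.log2(contrast):.1f} stops")

# The same number of code values has to cover a much wider brightness
# range on a 1000 nit display than on a 100 nit one.
for peak_nits, codes in ((100, 2**8), (1000, 2**8), (1000, 2**10)):
    print(f"{peak_nits:>5} nits over {codes:>4} codes ~ "
          f"{peak_nits / codes:.2f} nits per step (linear approximation)")
```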
  5. So will I. Cinemas don't use domestic "SDR" mastered content; the "SDR" 100 nit standard was designed for TVs, not projectors. It's not "necessary" to display SDR video dimmer than HDR video on a projector either. SDR can be displayed at 100 or 150 nits IF gamma is adjusted appropriately, and it will look the same as HDR tone mapped and displayed at 100 or 150 nits. If you display SDR at 100 nits on a big screen in a dark room using standard SDR gamma it will not look its best at all, and that's what you have done, so your comparisons are completely invalid. It's apparently great to tone map HDR video to suit the display brightness but totally unacceptable to do the same for SDR. Where is the logic in that? The original movie content isn't HDR or SDR; it doesn't conform to any domestic format out of the camera, and the same video is used to make both the SDR and HDR masters that target different display brightness. Tone mapping HDR is simply converting HDR video that was mastered for 1000 nits or more down to suit a display running a much lower peak output, be it 100 nits or less (a sketch of a typical highlight roll-off is below). With SDR video the "tone mapping" is done in the studio and every scene is optimised for the 100 nit standard, so when the movie is displayed at or around 100 nits on a little TV in a NON dark room it will look as intended. On a large projection screen viewed in a dark room this is not optimal, so slight adjustment is required to compensate. If you don't do this correction, don't complain that the picture doesn't look right. With HDR video we must "tone map" at the display end for projector use, but since the director-producer has no control over this process there are significant scene to scene inconsistencies and the results are far from optimal. Dynamic tone mapping is an attempt to guess what the production team would have done to "tone map" the image to suit SDR-like display brightness, but obviously this will never be as good or as accurate as what the studio would do, and did do, with the SDR master.
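A minimal sketch of the kind of highlight roll-off described here and in the previous post, assuming NumPy; the 80 nit knee, the square-root shape and the 1000 nit source peak are illustrative assumptions, not any particular projector's or madVR's actual curve. Levels below the knee pass through unchanged, while everything above is progressively squeezed into the remaining headroom so the source peak still lands at the display's peak.

```python
import numpy as np

def tone_map_nits(scene_nits, display_peak=100.0, knee=80.0, source_peak=1000.0):
    """Map luminance mastered for `source_peak` nits onto a `display_peak` nit display."""
    scene_nits = np.clip(scene_nits.astype(float), 0.0, source_peak)
    out = scene_nits.copy()
    top = scene_nits > knee
    headroom = display_peak - knee            # output range left above the knee
    span = source_peak - knee                 # input range that must fit into it
    out[top] = knee + headroom * np.sqrt((scene_nits[top] - knee) / span)
    return out

levels = np.array([10.0, 50.0, 80.0, 200.0, 500.0, 1000.0])
print(tone_map_nits(levels))
# -> roughly [10, 50, 80, 87, 94, 100]: everything below the knee is untouched,
#    and the 80-1000 nit range is squeezed into the last 20 nits of display output.
```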
  6. If you use a PC for playback it will be doing the upscaling to 4K in video hardware, like it or not. I can't comment on the quality of Intel's inbuilt video; it never used to be much good. Software upscaling with sharpening is definitely better but not strictly necessary. When playing PAL DVDs (50Hz) make sure you run your desktop at 4K 25Hz or motion will be bad (see the cadence numbers below). The picture will be pretty ordinary no matter what you do, as DVD is very low resolution and old DVDs used dodgy compression.
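A quick illustration of the refresh rate point in plain Python: 25 fps content divides evenly into a 25 Hz or 50 Hz refresh, but not into a 60 Hz desktop, so at 60 Hz frames are held for an uneven number of refreshes and motion judders.

```python
from fractions import Fraction

frame_rate = 25  # PAL DVD content after deinterlacing
for refresh_hz in (25, 50, 60):
    ratio = Fraction(refresh_hz, frame_rate)
    even = ratio.denominator == 1
    print(f"{refresh_hz} Hz desktop: {float(ratio):.2f} refreshes per frame "
          f"({'even cadence' if even else 'uneven cadence, i.e. judder'})")
```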
  7. We can view those images at 50 nits or 150 nits and the relative differences will remain the same. The images will just be dimmer or brighter, and with gamma adjustment they can all be equalised. Tone mapping (gamma mapping) can provide whatever "look" one desires for both SDR and HDR video sources, but which map is accurate? With HDR it's a guess. When using an SDR display, any look that is mastered into HDR video can be mastered into SDR video; the projector is the limitation, not the video. As for dynamic tone mapping applied at the display end, no matter how much you may like what it does to a particular scene, it's not accurate to the source; no dynamic system is, including a dynamic iris. As I said before, it's like playing with the tone controls of your stereo as the music plays: you might like what you hear, but it's not what you were intended to hear by the music producers. I dislike dynamic behaviour in a display system, but to each their own. I tolerate a dynamic iris for some movies because black levels without it are ordinary. The knowledgeable people don't post very often and I can't be bothered reading dozens if not hundreds of posts to find something of interest. I typically only go to AVS when I am searching for something specific or to check out something that is being discussed elsewhere. I have over 2 decades of experience manipulating image gamma, so I'm fully aware of what can and cannot be done and why; I don't need to read about it. When it comes to SDR displays (projectors), any "look" that can be applied to the image in HDR mastering can also be applied in SDR mastering; the display is the limiting factor, and it defines image dynamic range, not the video. But I have. I can make HDR look like SDR and SDR look like any tone mapped HDR, or anything in between. Just set the same peak white level for both, which gives the same "dynamic range", and adjust gamma and colour to suit. After that any "differences" are academic. I prefer the consistency of SDR video over HDR; it doesn't need any dynamic crap because it was mastered appropriately for a projector in the first place. Not perfect but close, unless the mastering was stuffed up, and obviously we don't have any control over that. I have alternative gamma maps I can call up to compensate for dodgy mastering and can modify maps on the fly very quickly to fine tune, but I don't need to do that very often as most SDR movies are pretty right with a fixed SDR to HDR "look" conversion map. When comparing SDR video to HDR video on a projector it's imperative that the same peak white level be used for both, to get the same dynamic range, AND that the gamma of SDR be adjusted to get the same "look" as whatever tone map you are using for HDR video; if you don't do that, what are you actually comparing? This is a task that MUST be done by eye and it can be very time consuming to get right, but once done you are good to go for most titles. Dynamic tone mapping is cheating, as it's remastering the video on the fly; it's not the job of a replay system to do that. If the director-producer mastered in a very dark look with subtle dim highlights for a particular scene, that's the way it should look on screen.
A dynamic tone mapping system will make such a scene brighter and/or boost the highlights when they should not be, and in turn this will affect the relative contrast between scenes. It also affects how a dynamic iris system behaves, which may be visually good or bad depending on the scene. To get an accurate picture, the dynamic iris and ALL dynamic systems must be disabled. 40 to 50 nits is fine for SDR if you use a standard SDR gamma, BUT that's not what I am proposing. Displaying SDR dimmer than HDR is a choice, not a necessity; the projector will happily display both at exactly the same brightness. Now when we do that, standard SDR gamma is no longer appropriate as the average picture level will be too high, so we adjust gamma to pull down the mid tones, which makes the highlights look comparatively brighter and brings the average picture level down to where it was for SDR at 40 to 50 nits. This is what so called HDR is actually doing on a projector, so we are achieving the same overall result. 50 to 100 nits is only a one stop change in brightness, and that's not much. 100 to 1000 nits is about 3.3 stops, and that's a LOT more. SDR can be displayed at the same brightness as HDR on a projector and deliver exactly the same "dynamic range", and once gamma is equalised the resulting picture looks near identical. On a projector HDR is effectively SDR with tweaked gamma, not HDR at all. I'm fine with that and don't see how "true" HDR would ever be appropriate on a big screen in a dark room. 100 nits with appropriate gamma is plenty, and that's Standard Dynamic Range, like it or not. When both are viewed at the same brightness and with equalised gamma? How so, when the picture looks the same?
  8. Yes, HDR video can be displayed on any projector; it doesn't need to be 4K-HDR compatible or have HDMI 2.x ports either. It's the display that sets the dynamic range of the picture, NOT the video; it's dictated by the display's contrast ratio and brightness. It makes no difference what video you use, you will only get a standard dynamic range image on screen no matter what domestic projector you have. Some have higher contrast than others and some are brighter, BUT they are ALL SDR ONLY.
  9. It is, and so is the fact that domestic projectors are SDR display devices, and that's all they can display no matter what video you run into them. This isn't a bad thing, because on a big screen in a dark room there is no need for the stupidly high peak output of true HDR, as you freely admit. Dynamic tone mapping is a form of distortion; it's not accurate to the source. You may like what it does most of the time, but that's a personal preference thing. I draw your attention to the images you posted, a dark scene from The Revenant and a night scene from Lucy. We are viewing them in SDR on our PC screens and yet the "differences" can be easily seen; that's because the only real difference between them is gamma, and we can adjust gamma to make the image look however we want it to look. There is nothing "HDR" about any of those images viewed on a PC monitor, they are all SDR, just as they would be when viewed via a projector. With projectors we can run SDR at exactly the same peak brightness as HDR, so the dynamic range of the on screen image is the same for both if we want it to be; we can then tone map (adjust gamma) to get the "look" we want, whatever that may be, and end up with virtually identical results. The scene to scene differences will come down to mastering, but all else being equal the SDR mastered video displayed at SDR-like brightness (as with a projector) will be more accurate to the director's intent than a 1000-4000 nit HDR master displayed at the same SDR-like brightness level. Add dynamic tone mapping into the mix for HDR video and accuracy goes out the window; an algorithm will dictate what you see, not the director-producer. That's because tone mapping has converted the image into something usable with an SDR display; it's no longer High Dynamic Range, it's SDR on screen. The display dictates the "Dynamic Range" of the on screen image, not the video. Depending on the mastering and the tone mapping used at the display end, HDR video may or may not look the same as the SDR version of the movie on a particular display, but it is supposed to be very close; if it's obviously different there is a problem. It's odd to me that people will put a lot of effort into the tone mapping of HDR video to get the look they like, but they won't do the same for SDR video.
  10. That's the way I look at it mate, no angst at my end.
  11. I no longer waste my time reading or posting on AVS; it's been 10 years or more since I was active there. It's a sea of rubbish posts with little useful information to be found, so I just don't bother. I've got better things to do with my time, mate.
  12. "Oxygen free" is marketing BS; it makes no bloody difference whatsoever to video cables. If you want high bandwidth over extended lengths go optical, end of story.
  13. Come on mate, I'm not forgetting anything. I can dial in whatever look I like for both SDR and HDR, which is exactly why I don't see what the fuss is about. No matter what gamma setup is used for HDR video, I can achieve the same visual results with SDR video. Again, COME ON MAN, I never said it should. When we view SDR and HDR video with standard SDR 2.2 gamma, for example, just for giggles, it's easy to see that HDR video looks VERY, VERY flat and uncolourful compared to SDR video and needs a wild gamma curve just to get an image with as much apparent contrast and colour saturation as SDR video (some numbers on this at the end of this post). The gamma correction required to get SDR looking right at whatever peak output one desires on a projector is FAR, FAR less extreme than that required for HDR video, AND SDR video is much more consistent because it was mastered for projector-like peak output light levels; HDR video is NOT, so it often looks wrong. Dynamic gamma derived from an algorithm at the display, as opposed to mastering intent applied in the studio, simply introduces another layer of inaccuracy. If the display can achieve the peak output level the video was mastered for there is no need for ANY "tone mapping", and to use it would be inaccurate and not display the director/producer's intent. How can an algorithm at the display end know what the director/producer's mastering intent was, especially when the display is incapable of doing what is required? It CAN'T. With Dolby Vision the display does what it's told to do, using information from the video production team embedded in the video. Obviously if the display can't do what the mastering requires the results will never be as intended, and projectors are not even remotely close. SDR video suffers NONE of these issues because it's mastered for the sort of light output levels projectors actually deliver. Set the peak white level you want, adjust gamma to suit and get the "look" you like, and you are good to go. No need to fiddle and absolutely ZERO need for any type of dynamic "enhancement". Not surprised, it's a given. I can get whatever "look" I like with ANY video. It's a real shame more projector owners don't understand what is possible and how to achieve it. Projector manufacturers could easily include a preset to make SDR video look just like HDR video at no cost to the consumer, and one has to wonder why they don't. However, it would expose the inconvenient reality that "HDR" is not what it's cracked up to be on a projector, and that's not good for marketing. Projectors are SDR display devices; I've said this MANY times but it's not sinking in. No matter what video you run on them you are looking at Standard Dynamic Range on screen, or close enough. The only difference between the default presentation of SDR and HDR on a projector is gamma and default peak light output, and we can have complete control over both if we want to. SDR video is mastered for projector-like light output levels and it works just fine up to 300 nits or more with appropriately corrected gamma. No domestic projector is going to give you anywhere near that. HDR is on a different planet; it's designed for 1000 nits plus. When displayed on a projector it MUST be "tone mapped" back to an SDR-like response to get a usable picture, BUT since the mastering is for 1000 nits plus it will NEVER look as intended.
SDR mastering is MUCH more appropriate for projectors and is going to be for the foreseeable future; it's about time projector manufacturers got on board, as it would benefit 99.9% of content released to date. Only slight tweaking is required to get an "HDR" look out of SDR video displayed at the same peak light output as "HDR" on a projector, AND the result will be more consistent and more accurate to the director's intent. Using any sort of dynamic display gamma with HDR video, just to get a result that is similar to but less accurate than SDR video, doesn't make any sense to me.
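Some quick numbers behind the "flat" observation above, as a sketch in plain Python. The PQ (SMPTE ST 2084) constants are the published ones; the comparison display is assumed to be a 100 nit, 2.2 gamma SDR device, purely for illustration. Scene levels that sit near the top of the SDR signal range only reach the middle of the PQ scale, so decoding raw PQ with plain 2.2 gamma lands them at a fraction of their intended brightness, which is exactly the washed-out look described here.

```python
# SMPTE ST 2084 (PQ) constants
m1, m2 = 1305 / 8192, 2523 / 32
c1, c2, c3 = 107 / 128, 2413 / 128, 2392 / 128

def pq_encode(nits):
    """Scene luminance in nits -> normalised PQ signal value (0-1)."""
    y = (nits / 10000.0) ** m1
    return ((c1 + c2 * y) / (1 + c3 * y)) ** m2

def sdr_encode(nits, peak=100.0, gamma=2.2):
    """Scene luminance in nits -> normalised SDR signal for a `peak` nit display."""
    return (min(nits, peak) / peak) ** (1 / gamma)

for nits in (1, 10, 50, 100):
    pq, sdr = pq_encode(nits), sdr_encode(nits)
    shown = 100.0 * pq ** 2.2   # what a 100 nit, 2.2 gamma display makes of the raw PQ code
    print(f"{nits:>4} nits: SDR code {sdr:.2f}, PQ code {pq:.2f}, "
          f"raw PQ decoded as 2.2 gamma -> {shown:5.1f} nits on screen")
```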
  14. If you use standard HDR gamma that's true, but that's NOT what I was talking about. Gamma for SDR must be changed so that shadows and mid tones are no brighter than with HDR, and since peak white level and black level are also the same as HDR displayed on the same projector, the resulting SDR picture looks just like the HDR picture; I can get them so close that I can't tell them apart. No matter what "look" I dial in for HDR I can get the same "look" out of SDR. Is every scene the same? No it's not, but unless there is a side by side comparison no one would know, and as I said SDR is FAR more consistent and accurate; it needs no dynamic alteration of ANY kind. Not if you want picture accuracy. I would much rather the movie director/producer dictate what I see than an algorithm that is remastering the movie on the fly. To each their own. HDR video is VERY, VERY flat and bland with undersaturated colour compared to SDR video when both are displayed with the same display gamma. What gives HDR the apparent contrast and "pop" that people tend to like is the VERY exaggerated gamma applied by the display for HDR. SDR video needs comparatively very little gamma adjustment to get the same apparent contrast and "pop". Regarding the pictures of The Revenant, I actually prefer the look of the SDR version you posted; the dynamically adjusted HDR version looks harsh and artificial to me. It's a personal preference thing, and I suspect that the exaggerated high contrast look that you, and many others, find attractive is not my thing. It's more reminiscent of a TV image, and I really don't like the way TVs display movies. As for the Lucy images, it's dead easy to get the SDR version to look just like that so called "HDR" image, and I have done so. I would prefer more shadow detail, which the SDR version is providing in your example, and that can be achieved without sacrificing the highlights, which will be the same brightness with SDR as HDR when gamma is adjusted appropriately and the projector is run at the same brightness. SDR mastered content doesn't have that problem, so there is no need for any dynamic correction. We see what the director/producer intended us to see, which can't happen with HDR mastered content. True HDR displays don't use dynamically adjusted tone mapping for good reason: they don't need it. Dynamic mapping is an imperfect solution to a problem that should not exist, and doesn't with SDR. I'll download build 46 to test out of curiosity, but I have stopped bothering with HDR movies due to HDR's inherent unsuitability for projectors, which are all currently SDR. Maybe when/if hybrid log gamma gets worked out and implemented on Bluray the situation might change.
  15. Obviously there will be differences on some scenes, even if gamma is adjusted on a film by film basis; they are different masters after all. However, unless one does a side by side comparison the viewer will almost always be unaware of them. Even when I do a direct comparison and find individual scenes that do look "different", 9 out of 10 times I prefer the look of tweaked SDR over so called "HDR" on a projector. The director/producer can do a FAR better job of optimising the image for projector brightness (SDR) than any tone mapping algorithm in the display chain ever could. HDR is mastered for VERY bright TVs, and no matter what we do the display of HDR video on a projector is a compromise and gives inconsistent results from scene to scene. If we introduce dynamic display gamma into the mix the situation gets WORSE, as an algorithm is now deciding what we see, not the director/producer; it's remastering on the fly and is inherently inaccurate and inconsistent. SDR is mastered for the sort of brightness projectors actually deliver. SDR video is therefore consistent, and EVERY scene in EVERY movie can be properly reproduced by a projector to convey the director's/producer's mastering intent, so a global approach to gamma adjustment to suit an individual display or viewer preference not only works but is ideal; the LAST thing we should want is any type of dynamic behaviour. A dynamic iris is bad enough and a compromise in and of itself; add in dynamic gamma that adjusts the picture on a scene by scene or frame by frame basis and it's a bloody mess and fundamentally inaccurate. Would we want an audio system that adjusts its frequency response profile on the fly depending on what music is being played?