Owen

Members
  • Content count

    12,742
  • Joined

  • Last visited

About Owen

  • Rank
    10,000+ Post Club

Profile Fields

  • Location
    NSW
  • Country
    Australia
  1. That's right, there is no such thing as 4K video with 4K of actual usable resolution, even in lab perfect capture conditions. The purpose of my posts was to explain to the pixel obsessed out there why a "true" 4K display is not needed. I am NOT saying 4K video is without benefit, so can we drop that please. The video still has to be low pass filtered to fit within the sampling limit of a 4K system, so MTF is still going to be in the 25-30% range at 3K and effectively zero at 4K. Nyquist has us by the nuts. There are small MTF gains to be had by capturing in 6K, 8K or even higher and then down scaling, but they are very subtle. Panavision worked out that 3 times over sampling was ideal.

Don't fall into the trap of comparing totally different masters mate, it's an apples to oranges comparison, they were mastered to look different. Down scaling the 4K version to 2K to strip out any resolution over 2K, and then scaling back up to 4K for comparison, is much fairer as it removes all the other mastering variables like colour, gamma, data rate compression etc (see the sketch at the end of this post). When you do that it's remarkable how difficult, if not impossible, it is to pick them apart in a double blind test where you don't know which you are viewing, and that's with the best quality digitally shot content. Sure, if you blow the images up and study them closely you can tell which is which, but we don't view movies like that. This ties in with what was explained in the Panavision video: it's low to medium frequency MTF that's important for movie viewing, not super high "resolution".

I find it remarkable just how effective and impressive 1080 Bluray can be. The highest quality titles look fantastic and I'm completely satisfied with the picture quality. When we consider that 1080 Bluray is only delivering about 1.5K luma and about 0.75K chroma resolution it's all the more remarkable. Even then, few 1080 Bluray titles are up to the standard set by the very best titles, which proves that the quality of the original video source and the subsequent mastering and encoding dominate what we see in most cases. The 1080 Bluray format is extremely capable when the rest of the chain is up to scratch.

Not as flawed as you suggest. It is almost impossible to hold a camera completely still, even on a very good tripod. Just a small vibration is enough to move the image as seen by the camera a single pixel or more during the shutter open period, and resolution is instantly halved or worse. We don't notice because there is still plenty of resolution to get the job done and there is an averaging effect over consecutive frames. To reliably capture high resolution images we need a fast shutter speed, the faster the better, to mitigate the effects of vibration during the shutter open period. The 1/24th second shutter speed used for motion picture capture is totally unsuitable for high resolution capture, and there is normally a camera operator holding onto the camera, in many cases supporting it. Any lens zoom magnifies movement. Limitations also apply to focus: unless it's a long distance shot with the lens focused to infinity, the camera is focused manually via a distance measurement from subject to lens, which is prone to error. With the large lens apertures used in most shots to provide shallow depth of field, which puts the subject in focus and the background out of focus, even a 1cm error in subject distance will throw focus off and trash resolution.

I know it comes across that way mate, but it's not my intention to shoot the format down.
I just want to cut through the 4K marketing BS and the misunderstandings it has generated in the minds of consumers. 4K Bluray has thus far done a very good job of shooting itself down as far as I am concerned by the movies that are chosen for release. Of the movies I have viewed since the release of 4K Bluray, over 90% are not available on 4K disk, and the titles that are were movies I do not think highly of, or are old and no longer of interest. The Revenant is top of my dislike list, couldn't even watch it to the end.

I always thought DVD was crap, much better than VHS tape but I was never satisfied. I started upscaling DVD up to 300% with sharpening on a PC back in 2002-2003 in an attempt to get a better result. It was an improvement but still lipstick on a pig.

True, but for the majority of movies, especially older titles shot on film, it's obvious they are not even using the full capabilities of 1080 Bluray for whatever reason. Unless the mastering or compression was the culprit I fail to see how more pixels are going to make much of a difference. Well, maybe things have changed recently, but the titles I have looked at have been around for a few months and they were no better than the 1080 Bluray. Yet people raved about how great 4K is when viewing these titles that are 4K in pixel count only. Those people obviously believe what they want to believe.

MTF correction is sharpening, they are one and the same. I run the video through FFDShow first for upscaling with luma and chroma sharpening, which provides a subtle low to mid frequency boost with little effect at the top end. I then run it through MadVR for a final touch up depending on the title. For lower quality titles I typically turn MadVR off and run just FFDShow. I don't like the way MadVR looks on its own and I will not use it that way.
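For anyone who wants to try the down scale - up scale comparison for themselves, here is a minimal Python sketch of the idea. The file name and the use of OpenCV are just my example choices, not a prescription:

```python
# Minimal sketch: strip detail above 2K from a 4K frame, then rescale it back
# to 4K so it can be compared blind against the untouched original on the same
# display. Assumes OpenCV (pip install opencv-python) and a lossless frame grab.
import cv2

frame = cv2.imread("frame_4k.png")          # hypothetical 3840x2160 source frame
h, w = frame.shape[:2]

# Down to 2K (1920x1080) with a decent resampler: detail above ~1080 lines is gone.
half = cv2.resize(frame, (w // 2, h // 2), interpolation=cv2.INTER_AREA)

# Back up to 4K for display, so only "resolution" differs, not size or mastering.
restored = cv2.resize(half, (w, h), interpolation=cv2.INTER_LANCZOS4)

cv2.imwrite("frame_2k_limited.png", restored)  # view this against frame_4k.png blind
```

View both files full screen on the same display with identical settings and see how often you can pick them.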
  2. Come on mate, fair go. I specifically stated the comparison was between 1080 Bluray disk and 4K Netflix. The maximum bit rate 1080 Netflix stream is CRAP compared to 1080 Bluray. It's bad on a TV and intolerable on the projection screen as far as I am concerned. Not only has the resolution been trashed, but there are heaps of compression problems, particularly in shadow areas, with pixelation, low colour resolution etc. It's ugly, so I will not watch any movies on Netflix; quality 720 content eats 1080 Netflix. On top of all that the sound quality SUCKS. It's no loss to me as their movie content is either old and I've seen it, or of no interest.
  3. Here are some links to videos I posted here years ago; they cover the issues of digital sampling, the need to low pass filter, and the effects of MTF on what we see. The presenters go to great lengths to point out that high “resolution” is not useful for movies, the game is all about high MTF at low to medium spatial frequencies, NOT super high “resolution”. 4K is useful not because it provides 4K resolution but because it provides higher MTF at 2K and under, which is what the eye perceives as sharpness.

There is one noted difference between what John Galt has to say about usable resolution limits and what the Arri paper suggests. Arri suggest that the 10% MTF point is a useful limit, while John and just about all major camera manufacturers and testers of both still and video cameras typically use 30%. The 30% limit is MUCH more realistic given that a 100% full amplitude input signal is only valid in a test environment; real world detail has MUCH lower contrast, with 30% being generous. A 30% contrast input signal going into a system with 30% MTF at a given spatial frequency will give an output of only about 10% in the video, which is useless. All properly engineered digital cameras have a 30% MTF point at about 75% of the pixel count (3K for a 4K camera) because MTF (amplitude response) MUST fall to almost zero at spatial frequencies at or above the sensor's sampling limit. Higher resolution cameras tend to be worse because they are more lens limited. It should be noted that the usable resolution limit of 35mm film exposed in a camera via a lens and scanned at 4K is about 2K best case under laboratory perfect test conditions.

Neither the Arri paper nor the Panavision presentation takes into account colour sub sampling, video compression or, most importantly, MOTION. Movies are captured at 24fps and almost always with 1/24th shutter speed, which is really slow and incapable of high resolution if there is ANY motion whatsoever. So real world “resolution” is going to be less than the ideal, and often MUCH, MUCH less. For example, a small error in focus distance can halve resolution at 4K or worse. Anything more than slow motion can drag resolution down to VHS tape levels, no matter if the camera is 40K “resolution”. The slow shutter speeds and the motion blur they create are needed to disguise low frame rate judder that would otherwise give the audience a headache with medium panning shots. So, until much higher frame rates and much faster shutter speeds are used to make movies, high “resolution” will remain a dream. Even that won't work most of the time because there is typically not enough light to run a faster shutter speed unless it's an outdoor daytime shot. https://www.youtube.com/watch?v=ht4Mv2wIRyQ https://www.youtube.com/watch?v=v96yhEr-DWM
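To make the contrast arithmetic explicit, here is a trivial Python sketch using the illustrative 30% figures from above (they are examples, not measurements):

```python
# Sketch of the MTF cascade: recorded contrast = scene contrast x system MTF.
scene_contrast = 0.30   # typical real-world detail, not a 100% test chart
system_mtf     = 0.30   # camera MTF at the spatial frequency of interest (e.g. 3K)

output_contrast = scene_contrast * system_mtf
print(f"Recorded contrast: {output_contrast:.0%}")   # ~9%, i.e. barely usable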
  4. I suspect you may be right about the studios playing it safe with 1080 Bluray production, especially with lower quality movies. Any sort of digital artefact would be a total fail IMHO, so better to be on the safe side. Having said that, high quality 1080 titles like Lucy are anything but "soft" on my 100" screen viewed from 3m, especially after appropriate MTF correction has been applied, which makes a BIG difference.

MTF correction is like speaker or room correction for audio: it's a way to get a flatter amplitude (MTF) response on the particular display in use, to get the "correct" level of sharpness over a wider range of spatial frequencies, not too much and not too little. It can also be tweaked for individual movie titles if desired. To do this properly we need control of which spatial frequencies are boosted and by how much, more like a graphic or parametric equaliser rather than the basic treble (sharpness) control provided by displays (see the sketch after this post).

Remember image sharpness is directly related to contrast, so if you want to do a proper comparison of 1080 and 4K content from various sources you simply MUST calibrate the display so that major variables like display brightness are eliminated. Brighter will always look sharper even if "resolution" is lower, and HDR video will trigger the display into "bright high contrast" mode, making a valid comparison to SDR video impossible. With projectors we can run the same peak output for HDR and SDR using custom gamma mapping.

As for Netflix so-called 4K, I'm very sure that if you down scaled it to 2K to strip out any detail above 2K, and then scaled it back up to 4K for display, it would look the same. On a BIG screen 1080 Bluray is still superior, particularly for audio. I have done the above experiment with some 4K "rips" and there was nothing 4K about them other than the pixel count, even though the file sizes were huge. People who download those "rips" are deluding themselves if they think they are the real deal.

Look, I'm not suggesting that 1080 Bluray is the equal of 4K Bluray, it's plainly not technically, however given how great 1080 Bluray can look on a properly set up, MTF corrected big screen system, and the VASTLY greater range of content available on 1080 disk, I just can't get excited about 4K video at all.
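To give a rough idea of what I mean by a spatial frequency "equaliser", here is a minimal Python/OpenCV sketch using two unsharp mask passes at different radii. The file name, radii and gains are purely illustrative, and this is only a crude stand-in for what a proper processing chain like FFDShow or MadVR does:

```python
# Rough "spatial frequency equaliser": two unsharp-mask bands with separate gains.
# A larger blur radius boosts lower spatial frequencies; a smaller radius boosts
# higher ones, so each pass acts a bit like one band on an equaliser.
import cv2
import numpy as np

img = cv2.imread("frame.png").astype(np.float32)   # hypothetical frame grab

def band_boost(image, sigma, gain):
    """Boost detail around the spatial frequency band set by the blur radius."""
    blurred = cv2.GaussianBlur(image, (0, 0), sigma)
    return image + gain * (image - blurred)

eq = band_boost(img, sigma=3.0, gain=0.4)   # low/mid frequency lift
eq = band_boost(eq,  sigma=1.0, gain=0.2)   # gentler high frequency lift

cv2.imwrite("frame_mtf_corrected.png", np.clip(eq, 0, 255).astype(np.uint8))
```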
  5. As far as I am concerned a "tangible difference" is something that can reliably be detected in isolation, WITHOUT scrutinising captured images and comparing them side by side. You have stressed that we CANNOT see such subtle differences when actually viewing a movie, even with an exceptionally high quality genuine 4K title like Lucy that VERY few other titles can match. For lesser quality titles it's pissing in the wind. For those interested in image "quality" rather than pixel numbers on a spec sheet it's MUCH better to concentrate on what really matters, CONTRAST, as you yourself can attest. When we sit down to view a movie we simply will not know if the image is from a "true" 4K Sony or an E-Shift JVC as far as visible detail is concerned; in fact the processing we use makes more difference than the projector's native performance.

Absolutely, on my older 9 series JVC, feeding the projector 1080 and letting it upscale resulted in very ordinary performance, and JVC's image sharpening was very crude and harsh looking. I soon learned that upscaling and sharpening on the PC and feeding the projector a 4K signal improved performance immensely. It looks like a completely different projector, and I'm so impressed with the results I could not give a rats about 4K movies, even if I could find one I wanted to watch. When viewing a high quality 1080 Bluray title I never feel the need for more sharpness, clarity or detail; there is plenty already and it's easy to make the image too sharp, so moderation is required. For less than the best titles the source is the limitation, and that applies to 4K disks as well.

I do hope that JVC's processing has improved with later generations. I wouldn't use internal processing myself but most people will, so it's important. Sony's Reality Creation was not great either, OK if used in moderation but it's easy to overcook the image. External processing still rules but not everyone wants to use a PC as a playback system.
  6. I wasn't sure how to interpret the downloadable photos either. In any event Javs says it all here: Seems to reinforce what I have said very well.
  7. You are taking things out of context mate; beyond a certain point more "resolution" becomes very much a case of diminishing returns. DVD to 1080 Bluray was a massive jump and very noticeable because the original movie source was very limited by DVD, however that's not the case with 1080 Bluray as it's more than good enough for 98% of the content available on it. A 1080 display has much more resolution than any 1080 video.

Like most people you are making the mistake of comparing totally different video streams. The 720p stream not only has fewer pixels, it also has a MUCH lower data rate, so it's not comparable at all. If you want to see what altering "resolution" alone does you have to take a different approach. Take the 1080 stream and down scale it to 720 to strip out any detail above 720, then scale it back up to 1080 with MTF correction to compensate for the double scaling. The original and 720 resolution limited versions can then be compared at the same size on the same screen, side by side if need be, which allows us to compare the effect of less "resolution" alone without any other factors affecting the result. You MUST NOT use any video compression when manipulating the video as that will degrade the picture and make any comparison unfair and therefore pointless. I can do this down scale - up scale thing in real time while actually playing the video on a PC and out to the projector if desired, and it's obvious that 1080 streaming is not delivering even 720 resolution. In fact high quality, high bit rate 720p is better than 1080 streaming in every way. Plenty of 1080 Bluray titles can be down scaled to 720 in real time and lose stuff all to the eye, even on a BIG screen, because the original movie source just wasn't up to it. So, the number of pixels doesn't tell us anything about the actual "resolution" of the content, which is always lower than people believe it to be. A 4K display is almost always overkill for movies; most of the differences people see in 4K Bluray are due to the mastering, not 4K. Display the 4K content down scaled to 2K in real time and good luck spotting the difference, it's VERY subtle even with the best titles.

You are forgetting that ALL video is deliberately low pass filtered so has no detail at the pixel level by design. If that was not done nasty aliasing would result and irreparably ruin the video. People think of digital imaging in a simplistic way: we point a 4K camera at a scene, each pixel (photo site) on the camera's imaging chip faithfully captures the information without loss, and the digital data can then be faithfully and perfectly reproduced on a 4K display. Well it doesn't work that way boys and girls; digital cameras have an analogue response with a deliberate steep high frequency roll off. A 4K camera has good but not perfect amplitude response up to about 1K, then response starts to roll off so that at 2K it's noticeably down compared to 1K. At 3K response has dropped so far that only very high contrast detail remains visible, and most detail in real world scenes is not high contrast. Beyond about 3K response crashes because it MUST be effectively zero at 4K to avoid going over the sampling limit and causing devastating aliasing that can never be removed (see the sketch at the end of this post).

We can relate the frequency response of video to audio. For audio we want a ruler flat response to the limit of normal human hearing (20kHz). Audio electronics does this with ease, although speakers often don't.
If we say that 20kHz in audio is much like the spatial frequency of a 4K test pattern, people expect a flat amplitude response out to 4K (no loss); unfortunately we don't get anything like that with video. In audio terms the response of 4K video is down at 10kHz, WAY down to the point of being barely audible at 15kHz, and off the chart at 20kHz. Now that's not what people expect, but it is an inconvenient fact that the "resolution" obsessed need to understand. Shoot a 4K black and white alternating line test pattern with a 4K camera and all you will get in the video is a grey blur with no lines discernible, and that's without any video compression. A 4K system is limited to about 3K weakly resolved best case under lab perfect conditions; for real world movie capture at 24fps and the slow shutter speeds it requires, even 3K actual image resolution is not possible unless the camera and subject are PERFECTLY still, which isn't really practical or useful in a motion picture.

Domestic video is also colour sub sampled, which means colour is encoded in 4 pixel blocks rather than each pixel getting its own colour. This means that colour is half resolution from the get go; a 1080 display has more than enough resolution to display all the colour detail in 4K video. 4K 4:2:0 video down scales to 2K 4:4:4 video with each pixel getting its own colour. The E-Shift JVC's are about 3K effective resolution for luma and chroma (colour), which is a good fit for what 4K video actually delivers, assuming the content was actually 4K plus to begin with. Yes, amplitude response (MTF) is not as good as a "true" 4K projector at 3K, but with the right MTF correction the visual differences are very small and will go unnoticed when actually viewing a movie. However, the very large contrast advantage of the better JVC models is very obvious in dark scenes, which most movies have plenty of.
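As a rough illustration of why response has to collapse towards the sampling limit, here is a small Python sketch of a simplified camera MTF model: just an ideal 100% fill factor pixel aperture multiplied by a basic birefringent optical low pass filter, with lens and compression losses ignored (so it is optimistic if anything):

```python
# Simplified camera MTF: pixel aperture (sinc) x birefringent OLPF (cosine),
# lens and compression ignored. Frequencies are expressed as fractions of the
# 4K sampling limit, so 0.25 ~ "1K", 0.5 ~ "2K", 0.75 ~ "3K", 1.0 ~ "4K".
import numpy as np

def system_mtf(fraction_of_4k):
    f = fraction_of_4k * 0.5            # cycles per pixel (Nyquist = 0.5)
    aperture = np.abs(np.sinc(f))       # np.sinc is sin(pi x)/(pi x)
    olpf = np.abs(np.cos(np.pi * f))    # designed to hit zero at the Nyquist limit
    return aperture * olpf

for label, frac in [("1K", 0.25), ("2K", 0.5), ("3K", 0.75), ("4K", 1.0)]:
    print(f"{label}: {system_mtf(frac):.0%}")
# Prints roughly 90%, 64%, 30%, 0%: strong at low frequencies, ~30% at 3K, nothing at 4K.
```

Even this idealised model lands at about 30% MTF at 3K and zero at 4K, which is the shape of the curve I keep describing.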
  8. Enabling "anamorphic" on JVC X9500

That depends on the projector and the characteristics of its lens, you may only get 5%. In any event 14% is stuff all, less than 0.2 of a stop in camera terms. How about no reduction in MTF for a sharper picture, no added chromatic aberrations, no added geometric distortions, and no issues due to non-linear scaling. Little wonder cinemas don't use A-lenses any more, they are a thing of a bygone era.
  9. It's a shame that more people don't understand that pixels mean stuff all, and that there is no such thing as 4K resolution video, and never will be, so there is no need for a "true" 4K display. Image sharpness has stuff all to do with pixels; it's dictated by system MTF, which is relative contrast based and easy to manipulate with image processing. You can have whatever sharpness you desire if you use external processing. Image sharpness and resolution (which are not the same thing) are limited by the video source, NOT the display. The X7xxx and X9xxx JVC E-Shift projectors easily outperform the base model "true 4K" Sony for picture "quality", and even owners of the $30K ES1100 Sony have dumped them to buy a JVC because they have found that high contrast is much more important than some notional increase in resolution that you will never be aware of when viewing a movie.

The new so-called "4K" DLP projectors have VERY poor native contrast that would have been considered VERY average 10 years ago, and have gone backwards very significantly in contrast compared to quality DLP models from years past that used larger, better performing DLP chips and dynamic iris systems. That's not to mention DLP's other inherent problems, so in 2017 DLP is a bit of a bad joke.

As for the "review" posted above I have a few comments. Clearly the two projectors were NOT properly calibrated, as the gamma of the Sony is VASTLY different to the Optoma's; that's why the Sony looks brighter yet has a lower peak output level. The colours we see in the "review" on our PC monitors are a combination of the projector, the camera and our monitors, so are not representative of actual performance in any way. Thank god for that, because the projector screen shots look bloody DREADFUL on my calibrated PC monitor. Both the Sony and Optoma look highly over saturated and the Optoma has a serious purple bias. Laser projectors, with their narrow band primary colours, are pretty much impossible to calibrate properly. The narrow band primary colours look very different from person to person, from camera to camera and from calibration device to calibration device, so there is no way of knowing if what the viewer is seeing is what the producers of the video saw in the studio. Lamp based projectors don't have that problem.

I have never found quality 1080 Bluray titles to be lacking in resolution or detail at all when properly processed and displayed on my 100" screen viewed from 3m. In fact VERY few movies are limited by 1080 Bluray full stop; nothing shot on film is, and stuff all digitally shot titles are either, even the handful that are mastered in 4K. Film is NOT a 4K medium so it doesn't matter what it is mastered in, and more than 99% of content shot to date is on film.

JVC do make a "true" 4K projector, the Z1. It costs about $40K, it's VERY bright, and it has about half the native contrast and a MUCH higher black level than the X9xxx E-Shift model. Unless you need the high light output to light up a VERY large screen I don't see the point. The JVC 4K chips still have more than 3 times the native contrast of the best Sonys and maintain their contrast with use, unlike Sony light engines which lose contrast with use. DLP "4K" projectors are not in any way competitive, with native contrast at least 70 times lower than an X9xxx JVC.
  10. I didn't say I haven't seen what you are talking about mate, I have. What I said was I can't remember seeing any issues when actually viewing a movie, and I can't. It has been a non issue for me. The increase in black level when bright content is on screen is far more noticeable to me and I see it with all projectors, however I only see it on the JVC if I am looking for it, so it's not a problem. Individual perceptions are obviously very different. The light scatter-streaking is likely to be irrelevant to Mobe1969 as it's VERY different to what he/she has observed on flat panel TV's, and found wanting. The TV issues don't apply to quality projectors.

If I had the ear of JVC engineers I would ask for much better native contrast first, even though a high end JVC is WAY ahead of the competition, followed by better ANSI contrast. The light scatter issues that seem to bug you so much wouldn't even rate a mention; what I see on TV's is more objectionable.

Before replacing my old non E-Shift JVC I actually considered opening the new projector up and removing the E-Shift element from the light path as I thought it would do more harm than good. However that turned out not to be the case and I would not go without E-Shift now. Had I used the JVC's internal upscaling and sharpening I would not have bothered with E-Shift, as the results were less than best IMHO, but with external processing via a PC the improvement in clarity and sharpness possible by feeding the projector an upscaled 4K signal, which requires E-Shift, is very significant compared to no upscaling and displaying 1080 Bluray natively.
  11. A resolution test pattern has no relevance to video shot with a camera, any camera. This is because ALL video is low pass filtered and has effectively no amplitude response at the single pixel level. Everything at the pixel level is a total blur, so there is no need for the display to reproduce sharp clear lines on a test pattern. In fact sharply defined square-edged pixels are an impediment to image quality because they represent spatial frequencies FAR, FAR above anything in the video and are nothing but distortion.
  12. OK, I think I know what you are getting at and it's not what others here have mentioned, especially since you have seen it on OLED. TV's have a relatively thick glass or acrylic screen that protects the underlying thin and fragile surface of the panel. Light is refracted as it passes through this protective layer, resulting in a glow around bright objects such as white text on a black background. Early Plasma TV's had significant issues because they used two sheets of relatively thick glass, one as part of the panel and the other as a screen protector. Late model Panasonic Plasmas addressed this by using one sheet of glass, so the panel surface was also the screen surface, with no screen protector. This was an improvement but there was still refraction in the single sheet of glass.

Since projection screens are not covered in glass or acrylic they don't suffer this problem and therefore perform much better in this regard, however there is light scatter in the optical path of the projector. This manifests itself as an overall increase in black level when bright content is on screen (poor ANSI contrast). I find this much more natural and less distracting than what happens on TV's with MUCH better ANSI contrast. It's one of the reasons I much prefer viewing my projector to any TV. As an example, my VT60 Plasma TV has over 30,000:1 ANSI contrast which is crazy high, yet in dark scenes the JVC projector, with its poor ANSI contrast but much higher native contrast, totally destroys the Plasma, no contest. I can't remember seeing the other light scatter issues mentioned here while viewing real content, so they have never bugged me.
  13. Can you define what you mean by "light bleed"? I'm not clear on your interpretation. I'm not aware of any cinema camera that uses multiple exposures to create a true HDR image; moving images don't allow that technique for a start, due to the blur that would result from combining images captured at different points in time. All we get is a single shutter speed and single aperture setting per frame. The native dynamic range of the camera is what it is, it's not like there is some "special" HDR camera. The exact same source video is used to create the SDR and HDR versions of a movie. HDR is an "effect" created by running the display very bright and mastering with a particular gamma to complement the super bright display.

As for 4K resolution, there is no such thing. 4K video is limited to about 3K luma resolution, and with 4K Bluray about 1.5K chroma resolution due to colour sub sampling, and that's best case in a lab perfect capture environment with perfect focus, optimal lens aperture and absolutely ZERO motion. Real world capture, especially at 24fps, is FAR more challenging and VERY limited; any movement of camera or subject trashes resolution, as does imperfect lens focus. 35mm cinema film has no usable amplitude response past 2K; we can scan film at 4K or 8K but that just reduces the digitisation losses, it doesn't get around the limitations of the source.
  14. Mate, I'm not saying I don't want anything to do with 4K disks, or that I would not like the way Blade Runner looks on the JVC. I'm simply saying that however good it or other titles look on a JVC, it's not simply because of HDR. It looks good because it's been remastered, that's the big difference.

At this point we should define what HDR actually is. First let's look at the video used in commercial cinemas: be it 2K or 4K it's 10 bit, uses DCI colour and the same display gamma, because the target brightness is the same for both. Would we classify cinema presentation, or the video used in cinemas, as HDR? No we would not. It's actually below SDR by definition. So how does 4K HDR differ from the above? It's 10 bit and uses DCI colour, so no difference there. What differentiates HDR is the requirement for a VERY bright display and a skewed gamma curve to go with it. The very different gamma doesn't alter the dynamic range of the content in any way; it's ONLY the super bright display that increases on screen “dynamic range”, and with a projector we are lucky to get the brightness of an old CRT TV, which is SDR by definition. The “HDR” video contains no more information than the 10 bit cinema version, it's the same information displayed differently due to gamma manipulation.

Yes, some people may be able and willing to run their projector a little brighter when viewing HDR content, but it's always going to be a long way from HDR presentation. They could also run normal SDR at the same peak level and use a tweaked gamma curve to get the average picture level back down to what it would have been at 50 nits; that would be a much fairer comparison. If it's OK to remap HDR gamma to suit the display, what's wrong with doing the same with SDR? Even if we can reach an arbitrary 100 nits, which is no magic number by the way so I don't know why you bring it up, it's only as bright as a 30 year old CRT TV. Who would classify an OLD CRT TV as a HDR display device?

Now for a little experiment (see the sketch at the end of this post). If you have a PC capable of HDR playback this is easy; if not, a disk player and a camera can be used for a comparison. Play a HDR clip and pause at a scene that you think best displays what HDR is capable of on the projector. Do a screen capture and save the image in a lossless 10 bit format and as an 8 bit file. Now compare the 4K 10 bit image to the 4K 8 bit image on the projector using the exact same gamma and brightness for both. You will see that the two images look effectively identical. We can also down scale the 8 bit version to 2K to emulate a 1080 Bluray and it will still look the same on the projector, although with potentially less resolution if the original source was exceptional. This proves that what people see and like about 4K “HDR” remastered content displayed on a projector is predominantly due to the remastering and has stuff all to do with “HDR”. 10 bit isn't needed because 100 nits or less doesn't need 10 bit. For 1000 nit displays (HDR) 10 bit is required because the steps in video level are too large without it.

People commonly attribute what they see with 4K disks on a projector to 4K, HDR and WCG when in fact all three have very little influence on what we see; it's all about the content mastering and display setup. I hope this clarifies why I say HDR in and of itself has little if any relevance to projectors. The remastered content obviously does, for those who can find something to watch, but that's a separate issue IMHO.
Just because 1080 disks typically look different to 4K remastered titles doesn't mean it's not possible to master 2K disks for the same look on a projector; the experiment above proves it is possible. It's not going to happen though, so it's a moot point, and if one wants the remastered “look” the 4K disk is obviously the way to go.
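For those with a PC, here is a minimal Python sketch of the quantisation half of that experiment. It uses a 16 bit frame grab as a stand-in for the 10 bit source, and the file name and OpenCV usage are just my assumptions for the example:

```python
# Sketch: compare a high bit depth frame grab against an 8 bit version of itself.
# Assumes a 16-bit PNG screen capture named "hdr_frame_16bit.png" and OpenCV.
import cv2
import numpy as np

src16 = cv2.imread("hdr_frame_16bit.png", cv2.IMREAD_UNCHANGED)  # uint16 data
as8   = (src16 >> 8).astype(np.uint8)                            # truncate to 8 bit

# Re-expand the 8 bit version so both can be shown with identical gamma/brightness.
back16 = as8.astype(np.uint16) << 8

max_err = np.max(np.abs(src16.astype(np.int32) - back16.astype(np.int32)))
print(f"Largest per-channel error: {max_err} / 65535 ({max_err / 655.35:.2f}% of range)")

cv2.imwrite("hdr_frame_8bit.png", as8)   # view both full screen and compare blind
```

View the original and the 8 bit version full screen with identical settings and judge for yourself whether the difference is visible.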
  15. I don't even use 50 nits with a projector because I have a strong dislike of "bright", and that includes bright highlights. However, there seem to be plenty of people on AVS forum using very high gain projection screens who view SDR content at MORE THAN 100 nits. Lots of people like BRIGHT. The SDR standard was constructed around CRT TV's, which could comfortably achieve 100 nits. I find 100 nits totally inappropriate on a large projection screen viewed in a totally dark room, but I reckon I'm the exception these days. TV viewers have always used around 100 nits, and with modern TV's they are viewing SDR content at 200, 300 or more nits because they like BRIGHT. While domestic SDR content (1080 Bluray) is mastered for 100 nits, cinema video is mastered for 50 nits, is 10 bit and uses 4:4:4 colour, which is better than anything on 4K Bluray disk, and no bloody HDR gamma either. Mastering for the intended display brightness obviously provides the intended viewing experience and best result, so 1000 or 4000 nit mastering is totally wrong for projectors.

CRT was a high contrast technology and put LCD to shame when a good TV was properly set up. Admittedly few consumers had well set up, high quality CRT TV's so most never saw the true potential, but the mastering engineers sure did on their pro grade CRT monitors.

HDR and the number of "gradients" are separate issues. 10 bit video has about 1,000 gradients with or without HDR gamma. 8 bit gradients are just fine for a 100 nit display or less because each step in video level is too small to be visible; at 50 nits it's plenty (see the step size arithmetic sketched at the end of this post). As for "running out of gradients", I've never seen it with properly encoded 8 bit video displayed via a quality projector, it's a non issue, and well mastered 8 bit video NEVER looks "flat", that's a mastering problem plain and simple.

One of the main points I am trying to get across is that we MUST separate mastering differences from the 1080 Vs 4K HDR thing. When we view 4K disks on a projector WE ARE VIEWING IN SDR, no ifs, no buts. 100 nits is SDR by definition, so let's have no talk about "HDR" at 100 nits or less, it's IMPOSSIBLE and always will be. Different gamma and mastering, which is all we see when viewing HDR content on a projector, changes the "look" of the content but it does not change the on screen "dynamic range" at all; only increasing the peak output of the display VERY significantly can do that, assuming black level does not increase in proportion with white level, which for projectors it does.

No argument from me about that, I find super bright HDR TV's look bloody terrible and I never want to view movies like that. The image provided by a JVC projector is so much better and more natural it's silly. Really the whole thing comes down to your preferred content, disk mastering and system gamma. There is no "HDR" on a projector; there is only SDR video displayed as SDR, or HDR video displayed as SDR with tweaked gamma. Yes, some people are in a position to run the projector slightly brighter for "HDR" content at the expense of black level, but many are not, because the projector is maxed out already or they don't want to sacrifice black level. In any event they will be viewing SDR no matter what they do, and there is also the option of running SDR content at the same or higher peak output level than HDR, with tweaked gamma for that HDR look.
No it won't look exactly the same as the HDR mastered version because it was mastered differently, although if one spends a bit of time experimenting with gamma the overall result can be very bloody similar to the "HDR" version, depending on the title and how drastically it was changed from the SDR version. I find that by the time I tweak the gamma of the 4K version to get the "look" I want it ends up looking VERY close to the 1080 disk version, at least for the couple of titles I have experimented with. The exaggerated high contrast "HDR" look that so many people seem to like does not appeal to me. I can understand why people would like that exaggerated contrast "look", and good luck to them, but it's not what I'm after.

I should stress that I consider the way 4K disks are mastered to be less than ideal for projector users. On a big screen in a dark room, peak output levels WAY above what's possible with a projector are counterproductive to the viewing experience IMHO. 1000 plus nits on a BIG screen is quite frankly ridiculous and highly undesirable IMHO. A super low black level is vastly more important to picture "quality" as far as I am concerned. JVC's best efforts are still not good enough IMHO; progress in regard to contrast ratio has completely stalled and black level has gone backwards over the last 3 or 4 generations due to the push for more lumens. As we here can appreciate, we don't need a super bright picture to get a fantastic image; in fact an image with super bright highlights would be unviewable on a big screen, and with current technology super high peak output comes at the cost of dramatically increased black level, which is the LAST thing I would want. High contrast and super deep black level rule for SDR and HDR content.

The whole 4K HDR eco system is designed around the Joe Average consumer, who does not understand or appreciate what constitutes a "quality" accurate picture and views on a relatively small uncalibrated TV in a non dark room. It's understandable why a new product would be directed at the biggest market, but projector users are not best served by content mastered for 1000 nits plus. Mastering the content for the peak output the display will achieve is by far the best approach.

Don't treat me like a fool mate, I didn't come down in the last shower. Let's get one thing perfectly clear: THERE IS NO SUCH THING AS "HDR" ON A PROJECTOR, it's displayed as SDR with tweaked gamma, end of story. The fact that you, and others, like the look of 4K remastered video displayed that way is irrelevant; it is what it is, and it sure isn't HDR display. As I said before, "true" 1000 nit plus HDR display is TOTALLY inappropriate for big screen dark room viewing, so don't think I am denigrating the image produced by the JVC projectors in any way. It's vastly better than any HDR TV as far as I am concerned. I wouldn't watch a movie on one of them if you paid me.
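To put some rough numbers on the "gradients" point, here is a small Python sketch of the step sizes near mid grey, assuming a plain 2.4 power law gamma and a 100 nit peak purely for illustration (real HDR encoding uses PQ and studio swing, so treat this as ballpark only):

```python
# Sketch: luminance step between adjacent code values for 8 bit versus 10 bit
# at the same 100 nit peak, using a simple 2.4 power-law gamma. This ignores
# PQ and studio range; it is only meant to show the relative step sizes.
peak_nits = 100.0
gamma = 2.4

def step_near_mid_grey(bits):
    levels = 2 ** bits - 1
    mid = levels // 2
    lo = peak_nits * (mid / levels) ** gamma
    hi = peak_nits * ((mid + 1) / levels) ** gamma
    return hi - lo, (hi - lo) / lo * 100.0   # absolute nits and percentage jump

for bits in (8, 10):
    nits, pct = step_near_mid_grey(bits)
    print(f"{bits} bit: {nits:.3f} nits per step near mid grey ({pct:.2f}% jump)")
# 8 bit steps come out roughly four times larger than 10 bit steps at the same peak.
```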