Everything posted by Owen

  1. People, put the cash towards something useful, like a couple of movies. A light meter won't tell you what you like, so it's pointless IMHO; one person will find a given light output too bright and another too dim. It's all subjective, so forget about measurements. I have owned calibration gear for more than 15 years and NEVER bother with light measurements; the picture either looks right or it doesn't, and I don't need a light meter to tell me.

     30 ftL is Standard Dynamic Range by the book; it's what 1080 Blu-ray discs are mastered for. On a big screen, 30 ftL with standard SDR gamma can look too bright to some people, because the image fills far more of your vision than a TV and you are viewing in the dark, while others may think it's too dim and want brighter. Only slight gamma correction is needed for SDR at 30 ftL or more on a projector.

     HDR, on the other hand, will be WAY, WAY, WAY too dark if standard HDR gamma is used, because HDR is mastered for 10 to 40 times higher output than the projector is capable of, so the projector MUST dramatically "tone map" gamma for HDR video to get an acceptable picture at a MUCH lower peak output than the video was mastered for. How well this is done is critical for the on-screen image, but don't fool yourselves into thinking what you are seeing is in any way "HDR"; it's not. The same result or better can be achieved with SDR video, because the studio engineers mastered the SDR video for 30 ftL and can do a far more accurate job than any "tone mapping" algorithm in any projector, be it dynamic or static. Dynamic HDR tone mapping is an attempt at a solution for a problem that should not exist, and doesn't with SDR video.
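     To make the tone-mapping point concrete, here is a minimal Python sketch (my own illustration, NOT any projector's actual algorithm): decode the PQ (SMPTE ST 2084) signal to absolute nits, then squeeze the mastered range into the roughly 103 nits (30 ftL) a projector can actually deliver.

     ```python
     def pq_to_nits(e):
         """SMPTE ST 2084 EOTF: normalised PQ code value (0-1) -> nits."""
         m1, m2 = 2610 / 16384, 2523 / 4096 * 128
         c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
         p = e ** (1 / m2)
         return 10000 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)

     def tone_map(nits, display_peak=103.0):
         """Illustrative Reinhard-style curve: rolls highlights off toward
         the display peak instead of hard clipping them."""
         return display_peak * nits / (display_peak + nits)

     # PQ 0.75 is roughly 1000 nits as mastered; the projector can't show it.
     for code in (0.5, 0.65, 0.75, 0.9):
         mastered = pq_to_nits(code)
         print(f"PQ {code:.2f}: mastered {mastered:7.0f} nits -> shown {tone_map(mastered):5.1f} nits")
     ```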
  2. +1 These UST "laser" projectors use single chip DLP projection systems that provide very poor contrast and are NOT true 4K, so don't be taken in by marketing BS. An old, used, decent quality 1080 conventional projector will provide better image quality at lower cost. They do serve a purpose for those who are just looking for a big screen image at relatively low cost, can't accommodate a conventional projector and are not too fussy about picture quality.
  3. The colour filter CANNOT cause "yellowing"; it affects all output levels equally. EVERYTHING you view on a projector is SDR, like it or not. The colour filter significantly reduces light output, making "HDR" even more impossible. To get something closer to HDR the wide gamut filter MUST be disabled; don't worry, you will never notice any difference in colour.
  4. Neither optics nor iris opening affects colour temperature (yellowing); show me a camera lens that shows such properties. Lamp dimming most certainly does cause yellowing, and since native contrast is lacking in the NX5 and NX7, it would seem JVC is dimming the lamp in dark scenes on those models in an effort to get more acceptable black levels. The NX9 has better native contrast, can get away without dimming the lamp, and therefore does not suffer yellowing. The previous E-Shift models ("X series") had much better contrast than the "N series" and did NOT use lamp dimming, therefore they don't suffer from the "yellowing" issues. It's as simple as that. Dynamic contrast systems are inherently flawed; there is NO substitute for high NATIVE contrast, which the "N" series lacks. That being said, other manufacturers selling LCD projectors are offering far worse contrast than the base model N series, while others selling DLP projectors are providing FAR FAR FAR worse contrast. Only Sony is in any way competitive with the N series JVC for native contrast, but the big problem for JVC is that the N series is not competitive with the previous X series. More pixels, which are bloody pointless, and less contrast, which is VITAL.
  5. Since no one else has chimed in, I think I should. The video is a good example of why side by side comparisons like this are a BAD idea, especially when no effort has been put into equalising gamma between all the projectors, and that was NOT done. Anyone wonder why the JVC looks so dark in this comparison even though the peak white level is the same as the other two projectors after calibration? I'll tell you why: it's because gamma was NOT equalised. If gamma was adjusted at all it was done "by the numbers" using calibration software, and since the JVC has the lowest black level, ALL levels below peak white will be lower than on the other projectors. When it gets down to shadow areas the JVC will be MUCH darker because the black level is a lot lower. The result is a significantly lower average picture level even though peak white and gamma "by the numbers" may be the same. This is compounded by the camera, which will expose for the brightest parts of the image, which were from the Epson, and be underexposed for the JVC, making it look too dark. Viewed in isolation the JVC would not look like that, as the camera, and the human eye, expose for the image in front of them. A numeric sketch of this black-floor effect follows this post.

     The Epson is NOT brighter than the other contenders after calibration, which is what matters, but the gamma setup is pushing up every level below peak white in order to get a significantly higher average picture level, even at the expense of clipping near-white areas. This will make it "look" much brighter even though peak brightness is the same. It's not accurate and is straight up cheating; ANY projector with adjustable gamma can be set up to look like the Epson. If gamma is corrected between those projectors they will ALL look identical in brightness until you get down to very dark shadows. Only levels below about 5% will be darker on the JVC, as they should be due to the much lower black level.

     The contrast of DLP, ANY DLP, is straight up unacceptable to me and I would not buy one at any price. It's like comparing an old LCD TV with a good Plasma TV: no contest. The Epson picture is downright ugly, and contrast is poor by the standards of 7 years ago. I wouldn't even consider one.

     The JVC needs an external scaler. It's unforgivable that they can't provide decent scaling; it costs stuff all to provide, so there is absolutely ZERO excuse. Using a PC for scaling and HDR tone mapping will remain the highest quality option for years to come, so a projector doesn't need to do those jobs for my use. Contrast and blacks are also a significant step down from the standards of the previous generation JVCs and I can't accept that. No amount of extra "resolution" can make up for lower contrast IMHO. 9 times out of 10 the actual visible resolution of the movie is the limitation, not the resolution of the projector. Resolution is just a number and is HIGHLY overrated; when you're actually viewing a movie and not pixel peeping still images, you aren't going to notice.
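     The black-floor effect is easy to show numerically. A rough sketch (my own model with made-up but representative numbers, not measurements from that video): same 100-nit peak white, same 2.2 power gamma "by the numbers", different native blacks.

     ```python
     def screen_nits(v, black, peak=100.0, gamma=2.2):
         """Signal level v (0-1) -> on-screen luminance with a raised black floor."""
         return black + (peak - black) * v ** gamma

     for v in (0.05, 0.10, 0.25, 0.50, 1.00):
         low_black  = screen_nits(v, black=0.01)   # high native contrast (10,000:1)
         high_black = screen_nits(v, black=0.10)   # ~1000:1 native contrast
         # identical at peak white, increasingly different down in the shadows
         print(f"signal {v:4.0%}: {low_black:7.3f} nits vs {high_black:7.3f} nits")
     ```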
  6. Rainbow effect will never be a "thing of the past" with single chip DLP systems, because they are incapable of displaying more than one colour at a time, which is the reason for the rainbows. They MUST time share a single imaging chip to create the 3 primary colours. 3 chip systems (LCD, SXRD, D-ILA) simply don't have that issue because they have a separate imaging chip for each of the 3 primary colours, so they display ALL colours simultaneously without issue; no rainbows are possible and the image is not fatiguing to the viewer. The visibility of "rainbow effect" is also up to the individual; some people see it and others never will, it's a personal thing. For those who are susceptible, single chip DLP will always be a potential problem with rainbows and viewer fatigue. Commercial cinema projectors are often DLP because DLP is the most light efficient, which is critical for large venues where every lumen counts; however, they use 3 MASSIVE imaging chips to achieve maximum light efficiency and display all colours at once, so no rainbows or viewer fatigue issues. Single chip DLP exists because it's "cheap", not because it's good. It's the bottom of the projector world heap. You get what you pay for.
  7. The native contrast of ALL so called "4K" DLP projection systems is significantly worse than the 1080 models they are supposed to replace. Low native contrast is an inherent problem for ALL DLP projectors at ANY price, and the move to "4K" has taken DLP contrast performance back 10 years or more. Increasing the number of micro mirrors to get closer to, but never equal to, "4K" introduces many more mirror edges to refract light and degrade contrast, and there is nothing a dynamic iris system can do to fix that.
  8. The French review of the Benq W2700 reveals its significant performance shortcomings. Lumen output after colour calibration = 981 lumens on high lamp and 725 on low lamp, a LONG way from the advertised 2000 lumens; that's a worse than normal light loss after calibration for DLP projectors. If wide gamut colour is used, output drops to 609 lumens in high lamp mode. Bloody useless. Native contrast ratio comes in at a staggering 991:1 (in-scene contrast). Yes, that's LESS THAN 1000:1 people, and that's worse than most cheap 1080 DLP projectors. Dynamic contrast (scene to scene contrast) came in at 2769:1, which is around one tenth the native contrast of a 7 year old base model JVC with no "dynamic" enhancement. Note that Benq quote a contrast ratio of 30,000:1 for this projector; they are only off by a factor of 10. BS marketing obviously works and sucks in the uneducated, who make purchase decisions based on spec sheet numbers that are downright deceptive and therefore meaningless.
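     Running the quoted review numbers (taking the figures as published; I haven't re-measured anything):

     ```python
     advertised_lumens, calibrated_high, calibrated_wide = 2000, 981, 609
     claimed_contrast, measured_dynamic = 30000, 2769

     print(f"light kept after calibration: {calibrated_high / advertised_lumens:.0%}")  # 49%
     print(f"light kept with wide gamut:   {calibrated_wide / advertised_lumens:.0%}")  # 30%
     print(f"contrast overstatement: {claimed_contrast / measured_dynamic:.1f}x")       # 10.8x
     ```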
  9. Sorry mate, but must I always point out the obvious? The test pattern image you posted above does NOT show 4K. The left side block shows a 2K horizontal pattern, NOT 4K, and the right side pattern is 2K vertical resolution. To be fully resolved, the alternating lines must be full black and full white, which equals 100% MTF (no loss). White lines that are grey and black lines that are grey represent a loss of MTF and "resolution"; the closer the lines get to 50% grey, the greater the loss of MTF and usable resolution. Here is the full test pattern. Note that there are 4 blocks of vertical lines to the left of centre and 4 blocks of horizontal lines to the right, not 3 as in the close up image you posted above. Only the blocks on the far left and far right are 4K. The blocks 3 out from centre are 2K, the blocks 2 out from centre are 1K, and the blocks either side of centre are 0.5K. So the close up of the test pattern you posted above only goes out to 2K, and as we can see even 2K is very poorly resolved with VERY low MTF. The colour test pattern that was posted earlier is a mess; no 4K resolution there. It's interesting that a colour pattern was used in place of the standard black and white pattern. I'm confident this was done because the black and white line pattern looks a lot worse.
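     For reference, the MTF figure I keep quoting is just Michelson contrast measured on the displayed line pair. A sketch with illustrative levels (not measurements from that photo):

     ```python
     def mtf_percent(white_level, black_level):
         """Michelson contrast of a displayed line pair, as a percentage."""
         return 100 * (white_level - black_level) / (white_level + black_level)

     print(mtf_percent(1.00, 0.00))   # fully resolved lines: 100% MTF
     print(mtf_percent(0.65, 0.35))   # greyish lines: 30% MTF
     print(mtf_percent(0.52, 0.48))   # near-uniform grey: 4% MTF, unresolved
     ```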
  10. If any manufacturer, distributor or retailer tells you that, ask for detailed "specifications" stating what level of colour non-uniformity is "normal", so you can get an independent evaluation, and watch them back down. Poor screen uniformity is a defect, plain and simple, and if they can't fix it you are entitled to a refund, as the projector is not fit for purpose.
  11. Ok, I'll make it simple guys. A "TRUE" 4K / native 4K (call it what you will) projector can resolve a 4K resolution test pattern. None of the projectors using a shift system can do that; all they will display at 4K is a grey blur instead of alternating black and white lines. Sony 4K projectors don't resolve the 4K test pattern either, because they use so called "convergence" correction by default and the user can't turn it off. There is a service menu fix for this but it may void the warranty if the user plays around in there; have the dealer do it. The "TRUE" 4K JVCs do resolve the 4K test patterns because the convergence correction system is under full user control and can (must) be turned off to avoid resolution loss. How can a projector that cannot resolve 4K be called a "True" 4K projector? If that's not deceptive marketing, what the hell is?

      As I said before, the very best 4K video is limited to 3K (weakly resolved with 30% MTF or less) for luma (the grey scale image) and only 1.5K for chroma (the colour overlay) due to 4:2:0 colour subsampling. A 2K (1080) display can FULLY resolve all the chroma (colour) detail in the very best 4K video with ease. MTF is a measure of image contrast at any specified spatial frequency: 30% MTF at 3K means there is a 70% loss of contrast to details at a spatial frequency equal to 3K. That's very poor, as the human eye requires high contrast to resolve fine detail and almost all fine detail in real world scenes is of low contrast to begin with. If the detail in the original scene had only 30% contrast, which is typical, and we display it with 30% MTF, we have 30% of 30%, which is 9%. MTF that low is absolutely useless, and detail with such low contrast will not be visible. This is why the spatial frequency at which 30% MTF occurs is considered the resolution limit of cameras, and that's being rather generous IMHO, as only very high contrast details have any hope of being visible in the final on-screen image.

      Exactly, properly processed and displayed 1080 Blu-ray looks great and viewers will not know it's not 4K unless there is a side by side comparison. Any difference that subtle isn't worth much in my book. The differences people will be aware of when viewing 1080 and 4K versions of a particular movie on a True or E-Shift 4K projector have stuff all to do with resolution. The biggest difference by far is gamma, followed by colour calibration, both of which have a VERY significant effect on the image we see. When we equalise the brightness, gamma and colour calibration between 1080 and 4K movies, the visible "differences" magically disappear and you won't know which you are viewing.

      I am very reluctant to use photos to make any judgment, but the gamma of the left and right side images is so different that comparisons are difficult. The left side image looks like it has undergone some very primitive image enhancement and looks ugly; it sure doesn't look like a lens difference. Just because the sharpness settings in the user menus are the same (default) doesn't mean anything; what's going on in the background can be VERY different, and sharpening "off" does NOT mean it is off. Sony has long used significant sharpening even when the user menu shows it's off, as a marketing advantage; it's cheating but it sure sucks consumers in.
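      The "30% of 30%" arithmetic and the 4:2:0 point are easy to sanity check. A tiny sketch using the numbers from the post above:

      ```python
      # MTF multiplies stage by stage through an imaging chain.
      scene_detail_contrast = 0.30   # typical low-contrast real-world detail
      system_mtf_at_3k      = 0.30   # camera/encode/display chain at 3K
      print(f"on-screen contrast: {scene_detail_contrast * system_mtf_at_3k:.0%}")  # 9%

      # 4:2:0 stores chroma at half resolution on both axes, so a 3840x2160
      # stream carries only a 1920x1080 grid of colour detail, before the
      # encode's own low-pass losses take it lower still.
      luma_w, luma_h = 3840, 2160
      print(f"chroma grid: {luma_w // 2}x{luma_h // 2}")
      ```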
  12. The pixels still overlap, mate, and each pixel is around 4 times larger than it should be, so no TRUE 4K.
  13. Yes it is 1080p and low lumens, but it has much higher native contrast and a much darker native black level. Any projector can "do HDR"; you just need to do the appropriate tone mapping externally, and PC based tone mapping is ahead of anything used in current projectors. Even cheap data grade projector lenses can resolve 20K plus; if they did not, you would not be able to see the black gaps between the pixels. What matters to the human perception of image sharpness is the MTF performance of the lens at MUCH lower spatial frequencies, down around 1K. The thing is, digital sharpening is very effective at correcting MTF over the range of spatial frequencies that matter for image sharpness, and when viewing a projector you won't know if the sharpness you see is due to the lens or digital sharpening unless the lens is REALLY poor. Any lens that bad on a projector costing $4k plus would be defective. It's a good trick for manufacturers to improve the sharpening system and claim the sharper image is due to a lens improvement. Clever sharpening makes a really significant difference, more so than a lens will make as far as apparent sharpness is concerned. Good quality projectors don't have CA issues unless the lens is faulty, and bad convergence is a manufacturing defect and warrants a replacement projector.
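      A minimal 1-D sketch of the kind of digital sharpening described above (a plain unsharp mask, assuming numpy; not any manufacturer's actual processing): subtracting a blurred copy boosts mid spatial frequencies, which raises apparent MTF without the lens changing at all.

      ```python
      import numpy as np

      def unsharp(signal, radius=2, amount=0.8):
          """signal + amount * (signal - local box blur): boosts edge contrast."""
          kernel = np.ones(2 * radius + 1) / (2 * radius + 1)
          return signal + amount * (signal - np.convolve(signal, kernel, mode="same"))

      # A soft (low-MTF) edge: sharpening steepens it and adds a little
      # overshoot, which the eye reads as extra sharpness.
      soft_edge = np.concatenate([np.zeros(8), np.linspace(0, 1, 6), np.ones(8)])
      print(np.round(unsharp(soft_edge), 2))
      ```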
  14. To be described as "TRUE" 4K, the layout of the pixels must exactly match the domestic "4K" standard of 3840x2160 for 1:1 pixel mapping. The Texas Instruments system does not do that, so the image must be scaled to fit the non standard pixel grid. On top of that, the shift system moves the pixels diagonally half a pixel and projects the second set of "shifted" pixels overlapping the first set. Each shifted pixel overlaps the corners of 4 of the first set of pixels, as there is no space to put them without doing this. A "True" 4K projector does not have this pixel overlap, nor does it require any scaling to display a "True" 4K image. So no, the Benq is NOT "TRUE" 4K, and to describe it as that is deceptive marketing. True 4K is only relevant to PC text, graphics and test patterns; 4K movies are ALL low pass filtered and have no detail at the pixel level, therefore "true" 4K is not important for movies. High contrast, on the other hand, is vital to image quality, much more so than pixels, and these so called 4K DLP projectors provide very poor contrast, worse than the 1080 DLPs they are designed to replace, and vastly inferior to the LCoS competition.
  15. There are no native 'scope' projectors, and video is constant image width. For scope you zoom the image so it fits the width of your scope screen; the black bars, which are encoded into the video, will then be projected above and below the screen, just as with the Panasonic or any other projector. As long as you only play scope movies you don't need to adjust anything, but 16:9 content will be displayed the same width as scope and has active picture content in the black bar area that goes unused in scope titles. You will then need to zoom the picture down to a smaller size so that the screen height is filled but the width isn't. Without lens memory you will have to do this manually. The advantage of a 16:9 screen is that it matches the native output of projectors and domestic video formats, which are all 16:9 natively; the image will always fill the width of the screen, and only the image height will vary according to the video being displayed. That's the way the system is designed to work and it's totally set and forget for all aspect ratios. If a high contrast projector is used, and the viewing environment is not too reflective, the unused areas of the screen, the black bars, are so dark you forget they are there. No need to mask IMHO.
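      The zoom arithmetic, for anyone setting this up, with a hypothetical 2.35:1 screen as the example (my numbers; adjust for your own screen):

      ```python
      # Constant image width: scope titles zoom to fill the screen width,
      # 16:9 titles zoom down until the picture HEIGHT fits instead.
      screen_w, screen_h = 2.82, 1.20       # metres; a 2.35:1 scope screen

      # Scope title (2.39:1 inside a 16:9 frame) zoomed to full screen width:
      frame_h = screen_w * 9 / 16           # 1.59 m full frame; bars overshoot
      picture_h = screen_w / 2.39           # 1.18 m of actual picture: fits

      # 16:9 title: the frame height itself must fit, so the picture narrows.
      w_169 = screen_h * 16 / 9             # 2.13 m wide on a 2.82 m screen
      print(f"scope picture height {picture_h:.2f} m, 16:9 picture width {w_169:.2f} m")
      ```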
  16. Cut the marketing talk mate, the Benq is a pixel shifter, so it's NOT "True 4K". Texas Instruments, who make the imaging system, call it Expanded Pixel Resolution. It's just like the E-Shift systems used by JVC and Epson. The real issue is: does it matter? Remember, there are no 4K visible resolution movies. Any image captured from the real world with a camera, any camera, and encoded to a 4K video format is limited to at best 3K luma resolution and 1.5K chroma resolution. Therefore movies don't need a "True 4K" display device.
  17. It's a single chip DLP projector with a laser light source, so the actual native contrast will be more like 1000:1, just like other projectors that use the same so called "4K" DLP imaging chip. The older 1080 DLP units had higher contrast. The silly 2,000,000:1 marketing contrast number is obtained by shutting off the laser for a full black screen, but unfortunately you can't display a picture with the laser shut off, now can you? So it's BS marketing yet again.
  18. 2m, wow that's close. Do you live in a bedsit apartment? If you are happy with your 60" definitely go 65" with a new TV, otherwise stay with what you have.
  19. You are cherry picking, mate. Don't evaluate using bright content; look at normal movies, in particular dim scenes like ones shot indoors at night. That's where compression throws away a lot of data, because it's expected that people viewing on TVs likely won't notice, and it seems they typically don't. Just because people are happy with streaming 4K doesn't make it comparable quality to 4K Blu-ray with its MUCH higher data rate, and even that's never actually 4K resolution; it just has 4K pixels, which is not the same thing at all, as you well know. What the 4K streams do reveal is how poor the bit rate starved 1080 streams are. For image "quality" on a big projection screen I'll take 1080 Blu-ray over 4K streaming without hesitation.
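      The bits-per-pixel arithmetic behind that preference (typical rates assumed for illustration; actual rates vary by title and service, and HEVC's better efficiency claws some of the gap back):

      ```python
      def bits_per_pixel(mbps, width, height, fps=24):
          """Average encoded bits available per pixel per frame."""
          return mbps * 1_000_000 / (width * height * fps)

      print(f"1080 Blu-ray @ ~30 Mbps: {bits_per_pixel(30, 1920, 1080):.2f} bits/pixel")
      print(f"4K stream    @ ~25 Mbps: {bits_per_pixel(25, 3840, 2160):.2f} bits/pixel")
      # roughly 0.60 vs 0.13: nearly 5x more data per pixel on the disc
      ```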
  20. The Hecto screen is angular reflective and designed for ultra short throw projectors mounted close to and above or below the screen. Light from a normal projector will come in at the wrong angle and mostly be rejected, resulting in a dim picture.
  21. Colour can be calibrated accurately for 1080 SDR Rec.709 content, but less accurately for WCG (P3) 4K HDR. With projectors, the more you strive for a wider gamut the more light output you lose, and this is typically a poor trade off for HDR; wide gamut colours are VERY bright and garish, so they need more brightness, not less. On top of that, it's rare to find wide gamut colours in real world scenes, so the loss of wide gamut capability is no big deal and goes unnoticed. The most important aspect of calibration IMHO is gamma, and while it's possible to "calibrate" to one of the SDR standard gamma curves like 2.4, the ideal gamma for the particular viewing situation can be quite different, as it is dependent on ambient lighting, screen brightness, screen size and viewing distance. There is no one setup that fits all situations. HDR gamma is a complete $hit fight, and with projectors, which are all SDR display devices and must use "tone mapping", there is no way to "calibrate", especially if the projector is dynamically tone mapping (gamma mapping) each movie differently. With HDR on projectors it's all about what looks good with most movies; there is no way to "calibrate" to any standard, and that's why I have little time for HDR. It's all over the bloody place and horribly inconsistent, unlike SDR.
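      To see why the target gamma matters so much, compare where the same signal levels land under different power curves (a simple power-law model of my own; BT.1886 adds a black-level term on top):

      ```python
      # Percent of peak white for a few signal levels under different gammas.
      # Shadows move far more than highlights, which is why the "right" gamma
      # depends on screen brightness, screen size and room conditions.
      for v in (0.10, 0.25, 0.50):
          row = ", ".join(f"gamma {g}: {100 * v ** g:5.2f}%" for g in (2.2, 2.4, 2.6))
          print(f"signal {v:.2f} -> {row}")
      ```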
  22. True, HOWEVER there is a BIG difference in colour inaccuracy between manufacturers in "brightest" mode, so spec sheet comparisons are invalid and highly deceptive. JVC don't allow "brightest" mode to be very colour inaccurate, but other manufacturers sure do. There is a VERY large difference in lumen output drop after calibration between manufacturers, with JVCs typically suffering the least. DLP projectors typically lose almost half their lumens after colour calibration, which is dramatic to say the least. Comparing the lumen output of uncalibrated projectors is therefore pointless and irrelevant; some manufacturers cheat something tragic. Manufacturers' quoted lumen output and contrast numbers should be taken with a bucket of salt, because after colour calibration the end result can be VERY different to the spec sheet numbers. I've had no CA issues with my JVC projectors, plus focus has been universally good from edge to edge, corner to corner. The "hand picked" X9000 lens is REALLY good, no complaints at all. Having said that, the consistency of JVC's lenses dropped off after the new supplier was appointed after the X700-X900 series.
  23. You think you do. Problem is, what you see is not due to upsampling.
  24. Today's video compression systems are in NO WAY lossless. There is NO SUBSTITUTE FOR HIGH BIT RATE, and if you had done even basic experiments you would know this. 25Mbps for 4K video, which is all streaming services offer, is nowhere near enough with HEVC compression. Viewed on a 65" TV it may look decent, but on a BIG screen the deficiencies are obvious. 1080 Blu-ray has better video and audio "quality" than 4K streaming; the bit rate per pixel is MUCH higher and it shows.

      I say again, RUBBISH. I have evaluated 8K images from my Nikon DSLR camera displayed 1:1 pixel mapped as well as downscaled to 4K and 2K. Beyond the distance where 4K is resolvable to my eyes, 8K provides ZERO benefit, and beyond the distance 2K is resolvable, 4K shows no visible benefit as well. The web site that claimed the above is a MARKETING COMPANY and will say whatever they are paid to say. I suggest you do your own experiments, and do it double blind with an assistant changing the images so you cannot know what you are viewing. 4K isn't 30% better than 2K; it's maybe 10% and borderline visible when 2K is processed properly, even on a BIG screen. Going from 4K to 8K provided a fraction of that "difference", and that's just not visible in a double blind test. As for depth, it's totally dominated by relative contrast and gamma beyond the distance 2K or 4K is resolvable to the individual. The Darbee video processing system provided FAR more visible difference to image depth than extra pixels ever could, because it manipulates relative contrast, and this can be seen from a distance far beyond the resolution limit. My claims have nothing to do with the "Snellen" eyesight chart; it's all from practical hands on experience over many years. I suggest you get some of that under your belt rather than rehash what you read on a web site.

      Exactly, and the more compression used, the more the losses. What's missing may not be perceptible to you viewing on a 65" TV at 10 to 12', which is a LONG way back by the way, but it sure as hell is to me viewing on a big projection screen at 9'.

      No they do not; to do so would be creating something from nothing. Put a resolution test pattern through upscaling and see how you get on: there is no increase in image resolution WHATSOEVER. Video upscaling has been my hobby for the last 15 years; it's a topic I am very familiar with. There is no way to add information that never existed in the video; to do so would be distortion, plain and simple. It's marketing BS, why do you believe that crap? The world is flat, I read it on the internet. It must be true.

      Dude, Netflix is crap on a big screen and I can't stand viewing it. Good old 1080 Blu-ray blows it into the weeds for video and sound quality. Bitrate RULES.

      That's fine mate, I'm genuinely glad you are happy with your purchase; however, the reasons why you like what you see are not what you think they are.

      A $6K TV is not expensive mate. I remember when a 26" standard definition CRT TV without a remote cost more than that in today's money. It's 2 months toy money to me.