
10 bit projectors


Recommended Posts

My Pioneer 989 has an HDMI output. I can configure this for 10-bit output rather than the 8-bit output of other DVD players. Is it worth ruling out projectors that only have 8-bit conversion and scaling (there are many fine ones out there)? Is the improved 10-bit colour worth it? I'm looking closely at the InFocus IN76, to be released soon. I've seen it selling in England for about 1900 pounds, which is good value for 720p. Or should I wait for a 1080i projector? Any help appreciated. Colin




In times past, A/D and D/A converters were not quite capable of perfect conversion of signals at the lowest levels.

For 16-bit conversion of audio signals, a 24-bit converter was often used to ensure the conversion was faithful across its full dynamic range.

I suspect the 10-bit video converter is simply guaranteeing true 8-bit conversion and may not add any more colour depth than a good 8-bit converter. Often this is used as a selling point.

C.M
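
To put some rough numbers on C.M's point, here is a small Python sketch (my own illustration, not anything from the Pioneer's or any projector's spec sheet) comparing the worst-case rounding error of an 8-bit and a 10-bit converter against the size of one 8-bit step. The test level and the whole setup are arbitrary.

# Illustration only: how far quantisation can shift a level, compared with
# the size of one 8-bit step of the full scale.

def quantise(x, bits):
    # round x (a level between 0.0 and 1.0) to the nearest of 2**bits codes
    levels = (1 << bits) - 1
    return round(x * levels) / levels

signal = 0.123456          # an arbitrary analogue level
step_8bit = 1 / 255        # one 8-bit step as a fraction of full scale

for bits in (8, 10):
    err = abs(quantise(signal, bits) - signal)
    worst = 0.5 / ((1 << bits) - 1)
    print(f"{bits}-bit: error on this level = {err:.5f}, "
          f"worst case = {worst:.5f} ({worst / step_8bit:.2f} of an 8-bit step)")

The point is just that a 10-bit converter's worst-case error sits well inside one 8-bit step, so it can deliver a genuinely accurate 8-bit result, much like the 24-bit audio converters C.M describes.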



Thanks, chicken man. What you said makes sense. Could you speculate on why DVI is only 8-bit delivery whereas HDMI is 10-bit? Possibly for high-bandwidth 1080p signals from Blu-ray and HD DVD? Why did they upgrade HDMI? Thankfully InFocus has put both HDMI and DVI inputs on the IN76, so I'll switch cables and see what happens. The 989 also has an option for 8-bit output over HDMI. Thanks, from an audio/video nut and computer neophyte. Colin



You are only talking about colour depth here, so the extra bandwidth required for higher resolution is irrelevant. The reason DVI is 8-bit is that it's based on the old VGA RGB system. DVI from most computers is DVI-I, which carries both the digital signal and the old analogue signal in order to retain backwards compatibility with older HD15-equipped monitors. The PC industry learned early on that people will not adopt new technology unless it retains backwards compatibility with their older hardware (IBM learnt this the hard way, nearly went broke as a result and handed the PC platform over to Intel. Amazingly, people seem to have forgotten this with the push to HDMI, which will require everyone to update virtually EVERY component in their system in order to get HDCP compliance under Vista).

As for why they "upgraded" HDMI, well, I'm a bit cynical. HDMI offers very little to the consumer. Sure, it can now carry audio over the same cable (though there seem to be many incompatibilities between components when trying to do this at this stage), but how much of a hardship is it to run one extra cable when setting up your HT system? HDMI's reason for existence is Digital Rights Management. It is a way of locking down digital content. Now, if you're trying to get people to take up this new system, are you going to say "look, pay all this money for this new connection that offers you nothing but lets me control how you use your hardware"? No. You're going to tell them it's better than their existing system. People are suckers for numbers, so 10-bit must be better than 8-bit, right?

CM's analysis is quite correct. If you do all your processing in 10-bit and only round to 8-bit at the end of the process, you will end up with a better result than doing everything in 8-bit. And personally, I believe that if this is done, the difference between a 10-bit display and an 8-bit display would be negligible.
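
A minimal sketch of that idea, using a toy pipeline of my own invention (halve the contrast, then double it again; the numbers are deliberately exaggerated and have nothing to do with any particular player or projector): rounding to 8 bits between the two steps throws away roughly half the grey codes, while carrying a 10-bit intermediate and rounding once at the end keeps nearly all of them.

# Toy example: two contrast adjustments that roughly cancel, run over an
# 8-bit greyscale ramp, with the intermediate result held at 8 bits vs 10 bits.

def contrast(v, factor, scale):
    mid = scale / 2
    return min(max((v - mid) * factor + mid, 0), scale)

ramp = range(256)

# Path A: the intermediate result is rounded back to 8 bits
out_8bit = set()
for v in ramp:
    v8 = round(contrast(v, 0.5, 255))
    out_8bit.add(round(contrast(v8, 2.0, 255)))

# Path B: promote to the 10-bit scale, round intermediates at 10 bits,
# and only convert back to 8 bits once at the very end
out_10bit = set()
for v in ramp:
    v10 = v * 1023 / 255
    v10 = round(contrast(v10, 0.5, 1023))
    v10 = contrast(v10, 2.0, 1023)
    out_10bit.add(round(v10 * 255 / 1023))

print("distinct output codes, 8-bit pipeline :", len(out_8bit))    # roughly 128
print("distinct output codes, 10-bit pipeline:", len(out_10bit))   # roughly 255

The codes lost in the all-8-bit path are what show up on screen as banding in smooth gradients, which is why doing the sums at a higher depth and rounding once matters.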

There are other factors that will influence PQ far more than whether it uses 8 bit colour or 10 bit colour.

With an LCD projector, for example, they just cannot produce a true black. As a result, the lower end of the colour scale is crushed. Now, the internal colour processing may be working at 10-bit colour, but if the panel itself is not capable of displaying all of those colours, who cares? I'd rather have a projector that can process 16.8 million colours and display 16.8 million colours than one that can process 1.07 billion colours but only display 1 million.
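
For what it's worth, those two figures just fall out of the per-channel depths (my arithmetic, not a quote from any spec sheet):

# 8 bits per channel across R, G and B, versus 10 bits per channel
print(f"{2 ** (8 * 3):,}")     # 16,777,216 -> the "16.8 million colours" figure
print(f"{2 ** (10 * 3):,}")    # 1,073,741,824 -> the "1.07 billion colours" figure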

The best consumer projector in the world is arguably the Sony G90 (I say arguably; some people prefer the new 1080p 3-chip DLPs, but the G90 is still there or thereabouts). It's 8-bit. At the end of the day they're only numbers, and more useful to marketing than in the real world.




Thank you, preacher. I am pretty sure DVI (with HDCP encryption) won't cut the mustard with Blu-ray et al. Point taken about colour depth versus bandwidth. pasey25, I couldn't agree more with the G90 three-gun projector bloke who saw the grey scale improve with a 10-bit signal. I think there's an analogy with audio DACs: the higher the bit depth, the smoother the analogue output, as an oscilloscope demonstrates. In the future, hi-def will put out a wider colour range (trying to future-proof myself). :blink:



Hmm. You must be seeing something in that AVS thread that I'm not. A G90 won't process a 10 bit signal.

I summarise that thread as such: A CRT projector is capable of displaying 10 bit colour but can't actually input it. A digital projector can input 10 bit colour but can't display it!

The only post there that said 10 bit was useful for digital PJs was backing up CM's previous post that it's useful for processing the information. If you're doing that processing outboard (either in the DVD player or an external scaler), then 10 bit isn't going to help much.


My mistake, preacher. I meant cris wiggle, who said he had an 808 (possibly a Barco?). He said (if correct, who knows?) that he could perceive a better grey scale with Avia Pro. I've got 20/10 vision in one eye (with glasses); maybe his is better? :blink:



Yeah. He said that he could determine the individual 256 shades of grey (using an 8 bit input). As such he believed it would be worthwhile if his CRT PJ could do more than 8 bit. Unfortunately it can't.
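
If anyone wants to run the same check on their own display, here is a rough sketch that writes out a 256-step greyscale ramp as a PNG. It assumes the Pillow imaging library is installed, and the band width and image height are arbitrary choices of mine.

# Draw a horizontal 256-step greyscale ramp and save it as a PNG.
# Requires Pillow (pip install pillow).
from PIL import Image

STEP_W, HEIGHT = 8, 256                    # 8 pixels per grey level, 256 px tall
img = Image.new("L", (256 * STEP_W, HEIGHT))

for level in range(256):                   # one vertical band per 8-bit grey level
    band = Image.new("L", (STEP_W, HEIGHT), color=level)
    img.paste(band, (level * STEP_W, 0))

img.save("greyscale_ramp_256.png")

If you can see distinct steps between neighbouring bands rather than a smooth sweep, you're in the same boat as cris wiggle; if it looks continuous, 8 bits is already more than your eyes (or your display) can resolve.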




If an RGBHV source has 10-bit sampling, then virtually any CRT projector will display it; generally a CRT projector does no image processing of its own. The DVI card in question in that thread is limited to 8-bit, not CRT projectors in general.

Anyway, my point was that few if any people will be able to see more than a 256-shade greyscale ramp. It is a minor attribute in the scheme of evaluating image quality.



Interesting. I thought they were limited to 8-bit. (Not through any technical reason, just that that was all that was built).

I agree with your last point. 10 bit processing has more to do with marketability than any real increase in PQ. It would certainly rate way down on my list of priorities when selecting a PJ.

