DTS HDMA vs 5.1 LPCM - Which is better?


Recommended Posts

It is not common to have discs which provide both DTS HDMA and 5.1 LPCM audio from the same source.  I have three classical music discs from 2L which I have been comparing: 2L the Nordic Sound, Magnificat, Divertimenti.

EDIT: Found other BDA discs that I have with both 5.1 formats at 24/96: Pink Floyd - Endless River, Genesis - Selling England by the Pound, Steven Wilson - Hand. Cannot. Erase. & The Raven That Refuses to Sing

In all 3 cases, the 5.1 LPCM came across as cleaner, with more precise placement of instruments. E.g. while not a regular choral music listener, I had been playing the Magnificat BDA DTS HDMA a number of times (takes me to the church where it was recorded!) before trying the SACD. To my surprise, the SACD double bass & other instruments came across cleaner, with more precise placement. Same result with Nordic Sound & Divertimenti with both formats on the same BDA disc.

Using Oppo 103 and Anthem MRX 710 feeding SGR Audio CX3 fronts, Monitor Audio Apex A10 rears. Oppo connected via HDMI to Anthem. Anthem handles DTS HDMA and LPCM but not DSD (the Oppo converts SACD DSD to LPCM).

Cannot see 2L changing the presentation for different formats. So likely the Anthem is interpreting the DTS HDMA audio slightly differently. This may not be apparent in movies and non-classical music.

EDIT: 2L confirms it should be the same, see later post.  And difficult to tell the difference with the Pink Floyd, Genesis, Steven Wilson discs.

Looking at other forums, people have dismissed similar claims, arguing (at times nastily) that it is a placebo effect or someone pushing a particular barrow.

Your thoughts please?

Edited by Snoopy8
Added additional findings in italics while I can still edit post




As a general statement the sound ought to be identical, as DTS HDMA is a lossless codec. If the 5.1 channel LPCM version is encoded without processing as 5.1 channel DTS HDMA, then what is decoded should be the same as the 5.1 channel LPCM version.
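A quick aside to make "lossless" concrete. This is purely my own illustration, with zlib standing in for the DTS HDMA packer (it is not the actual DTS algorithm): lossless means the unpacked samples are bit-identical to the LPCM that went in, regardless of how the packed stream looks on the disc.

```python
import array
import zlib

def pack(samples):
    """Losslessly compress raw LPCM samples (zlib as a stand-in packer)."""
    return zlib.compress(samples.tobytes(), level=9)

def unpack(packed):
    """Recover the samples; a lossless codec must be bit-exact."""
    out = array.array("h")
    out.frombytes(zlib.decompress(packed))
    return out

# 16-bit LPCM-style samples, including runs of digital silence,
# which lossless packers compress particularly well.
pcm = array.array("h", [0, 0, 0, 0, 1200, -1200, 32767, -32768] * 100)

packed = pack(pcm)
assert unpack(packed) == pcm             # bit-exact round trip
assert len(packed) < len(pcm.tobytes())  # and smaller on the disc
```

If the bits out equal the bits in, any audible difference has to come from what the player or AVR does after decoding (level offsets, bass management, room correction), which is where the rest of this thread goes.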

It could be that bit depth and sample rate differ (e.g. 24 bit vs 16 bit dithered, or 48kHz vs 96kHz) but even if they do, that should not result in the distinct differences in soundstage placement that you report hearing.

What about bass management: does the Anthem MRX 710 do something special with the bass from DTS HDMA that it doesn't do with a discrete 5.1 source?


If the name lossless is true, then there should not be a difference apart from file size. However my AVR shows DN (dialogue normalization) differences of 4dB on the Nirvana Nevermind Blu-ray Audio disc, which has LPCM, Dolby TrueHD and DTS Master Audio. The lossless encodes are both audibly louder than the LPCM track.

Sent from my SM-G900I using Tapatalk


1 hour ago, MLXXX said:

As a general statement the sound ought to be identical, as DTS HDMA is a lossless codec. If the 5.1 channel LPCM version is encoded without processing as 5.1 channel DTS HDMA, then what is decoded should be the same as the 5.1 channel LPCM version.

Ought is the right word. 

1 hour ago, MLXXX said:

It could be that bit depth and sample rate differ (e.g. 24 bit vs 16 bit dithered, or 48kHz vs 96kHz) but even if they do, that should not result in the distinct differences in soundstage placement that you report hearing.

DTS HDMA and LPCM are listed as 24/192 on the BDA discs.  SACD is listed as 2.8224 Mbit/s/ch.   And Anthem ARC is limited to 96 kHz.

1 hour ago, MLXXX said:

What about bass management: does the Anthem MRX 710 do something special with the bass from DTS HDMA that it doesn't do with a discrete 5.1 source?

ARC (Anthem Room Correction) is running all the time, with no ability to differentiate when using the same source, i.e. Oppo on HDMI 1.

Good question, because it was the double bass which prompted me.  I had been through several rotations of the DTS HDMA BDA and was more than happy with it.  So I was very familiar with the material, and the SACD double bass stood out when I was not in dedicated listening mode, not looking for differences.

59 minutes ago, cavx1503565372 said:

If the name lossless is true, then there should not be a difference apart from file size. However my AVR shows DN (dialogue normalization) differences of 4dB on the Nirvana Nevermind Blu-ray Audio disc, which has LPCM, Dolby TrueHD and DTS Master Audio. The lossless encodes are both audibly louder than the LPCM track.

Have not measured, but no obvious difference in loudness in the 2L discs.  Do not have Nevermind, but I think you are referring to 2.0 LPCM. Some of my other discs show a similar loudness difference for 2.0 LPCM.

-----------------

Don't get me wrong - am delighted with the DTS HDMA playback, and the bonus is the 5.1 LPCM.  In non-classical music or movies, the instrument (e.g. guitar) or sound effect is directional but the exact placement is not critical.  Go to a classical concert in a dedicated hall and you can hear the seamless nature of the music but can still pick out the placement of the instruments.  The 5.1 LPCM shows this but the DTS HDMA is more "fuzzy".

On a slightly different note, classical purists do not like the 2L reproductions because it jars their senses to hear instruments behind them.

Edited by Snoopy8
Took out redundant quotes

Have not measured, but no obvious difference in loudness in the 2L discs.  Do not have Nevermind, but I think you are referring to 2.0 LPCM. Some of my other discs show a similar loudness difference for 2.0 LPCM.


Not just 2.0 stuff.

My copy of Superman Returns has both LPCM and one of the lossless codecs (Dolby TrueHD), and again there is a flag with a +number DN reading.

Sent from my SM-G900I using Tapatalk




5 hours ago, MLXXX said:

As a general statement the sound ought to be identical, as DTS HDMA is a lossless codec. If the 5.1 channel LPCM version is encoded without processing as 5.1 channel DTS HDMA, then what is decoded should be the same as the 5.1 channel LPCM version.

Ought is the right word. Just received this from 2L to confirm...

Quote

In innumerable tests and in our own experience over more than a decade, DTS HDMA is indeed a bit-exact lossless container. If you experience any difference from a comparable LPCM stream, that means the local unpacking of the DTS HDMA in your processor/receiver is not quite right.

Will write to Anthem and see how they respond.

 


2 hours ago, Snoopy8 said:

Will write to Anthem and see how they respond.

 

They may say it's very difficult to match dB levels exactly, Snoopy, and just a smidge is enough to advantage one format. Did you use an analog dB meter or an exxy digital one? A-weighted readings are best, as you likely know

Quote

Sound level meters set to the A-weighting scale will filter out much of the low-frequency noise they measure, similar to the response of the human ear.
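For the curious, the A-weighting curve quoted above is a published formula (the IEC 61672 curve), so it is easy to sketch; the function name here is my own:

```python
import math

def a_weighting_db(f):
    """A-weighting relative response in dB at frequency f (Hz), per the
    standard IEC 61672 curve, normalised to ~0 dB at 1 kHz."""
    f2 = f * f
    ra = (12194.0**2 * f2 * f2) / (
        (f2 + 20.6**2)
        * math.sqrt((f2 + 107.7**2) * (f2 + 737.9**2))
        * (f2 + 12194.0**2)
    )
    return 20.0 * math.log10(ra) + 2.00

# The steep low-frequency roll-off is exactly what "filters out much of
# the low-frequency noise" in the quote:
#   a_weighting_db(1000) ~  0.0 dB
#   a_weighting_db(100)  ~ -19.1 dB
#   a_weighting_db(50)   ~ -30.3 dB
```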

Here's an interesting article on the one most of us have, and how pointing it drastically changes its upper frequency response :o

http://realtraps.com/art_microphones.htm

How accurate is the ARC one? Is it flat like a UMIK, or calibrated like one?

Edited by cwt

Does the Anthem display the dialogue offset? It won't make the volume adjustments itself, but if you can see the flash of dialogue normalization and a number (most DTS HD MA tracks seem to be encoded with a +4dB gain) then you can compensate by turning the volume down 4dB.
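The compensation arithmetic is simple enough to sketch (the function names here are illustrative, not from any AVR's API): a dB offset maps to a linear amplitude factor of 10^(dB/20), so a -4dB trim exactly cancels a +4dB dialnorm gain.

```python
def db_to_linear(db):
    """Convert a level offset in dB to a linear amplitude factor."""
    return 10.0 ** (db / 20.0)

def trim_for_dialnorm(gain_db):
    """Volume trim (dB) that cancels an encoded gain offset."""
    return -gain_db

# A track flagged +4dB plays at about 1.58x the amplitude...
factor = db_to_linear(4.0)
# ...so a -4dB master volume trim restores unity gain:
assert abs(factor - 1.585) < 0.001
assert db_to_linear(4.0 + trim_for_dialnorm(4.0)) == 1.0
```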

I personally hate the idea of them doing this, but was told it is done for those not using an external sound system. So once again, they cater to those that don't know and don't care.

Sent from my SM-G900I using Tapatalk


14 hours ago, cwt said:

They may say it's very difficult to match dB levels exactly, Snoopy, and just a smidge is enough to advantage one format.

My past dealings with Anthem have been good.  Will post reply when I get it.

14 hours ago, cwt said:

How accurate is the ARC one? Is it flat like a UMIK, or calibrated like one?

Anthem supplied the calibrated mike for use with ARC (as well as a good mike stand).  The calibration file has its own proprietary format. :angry2:  Had to get a UMM-6 for use with REW.

14 hours ago, cavx1503565372 said:

Does the Anthem display the dialogue offset? It won't make the volume adjustments itself, but if you can see the flash of dialogue normalization and a number (most DTS HD MA tracks seem to be encoded with a +4dB gain) then you can compensate by turning the volume down 4dB.

No ability to display or tweak the dialogue offset within Anthem and ARC.

While I can hear certain instruments more clearly in 5.1 LPCM, the overall loudness is about the same for both 5.1 LPCM and DTS HDMA.  The recordings place the mikes to capture the ambience and placement of instruments, often in a church.  Unlike non-classical recordings, the placement of instruments is not altered in post-processing. Pictures below of 2L recording sessions.  Difficult to see in the Magnificat picture, but the cluster of mikes is in front of the conductor.

 

2L.jpg

Magnificat.jpg


19 hours ago, Snoopy8 said:

No ability to display or tweak the dialogue offset within Anthem and ARC.

While I can hear certain instruments more clearly in 5.1 LPCM, the overall loudness is about the same for both 5.1 LPCM and DTS HDMA.  The recordings place the mikes to capture the ambience and placement of instruments, often in a church.  Unlike non-classical recordings, the placement of instruments is not altered in post-processing. Pictures below of 2L recording sessions.  Difficult to see in the Magnificat picture, but the cluster of mikes is in front of the conductor.

 

Is there anything in a sub menu?  Yamaha AVRs do not display this offset on the main front panel (I wish all processors did, given the issues this raises) but the info can be found in the GUI.  




I know Blu-ray Audio discs, and the 2L ones, where all the formats (DTS-HDMA, TrueHD, etc.) are included, and yeah, they all sound different.

 

I put it down to differences in mastering, and the nature of the formats in question. While they claim to be lossless, they do handle things differently, e.g. TrueHD uses MLP (remember DVD-A), which does things like treating gaps of silence as no sound; this is part of how they compress... "losslessly".

 

PCM tracks are subject to "jitter", whereas DTS-HDMA and TrueHD, which are unpacked at the AVR, offer less opportunity for this.

 

With movies it's a lot easier decision... they are typically native tracks and it is easy enough to choose those :)


My own experience with DTS MA vs TrueHD has been different - maybe my system or my ears can't resolve the differences, or I am too busy enjoying the movie :)

Many HKG discs have two flavors to choose from, so you can A/B the same track.


  • 3 weeks later...
Quote

As with all audio codecs, some modes will result in a different output level, imaging, soundscape, and also spaciousness.

Technically, 2L is correct, in that THEIR discs shouldn't differ... but keep in mind that the DTS FORMAT has usually sounded more dynamic when compared to other formats (usually, not always).
So it is most likely due to the DTS algorithm itself, and not the disc.

Received this from Anthem today, more than 3 weeks after I contacted them.  So they acknowledge that there is a difference. 


I might be in the minority here.

But I've always found that PCM signals are much more susceptible to jitter.

Whereas an encoded DTSHD or TrueHD stream will need to be decoded and reclocked inside the AVR/AVP itself and is less dependent on jitter. With high jitter in HDMI I invariably find TrueHD or HDMA to sound better than when I let the BDP do the decoding.


57 minutes ago, doggiehowser1503564764 said:

With high jitter in HDMI I invariably find TrueHD or HDMA to sound better than when I let the BDP do the decoding.

Perhaps your AVR has insufficient buffering to ensure a smooth clocking out of the LPCM bitstream it receives via HDMI. Perhaps it is over-aggressive in trying to match the incoming clock rate, resulting in cyclical overflow and emptying of the buffer.

I'd have thought though that a modern AVR would have no difficulty in ensuring it processes the incoming bits at a steady rate. The specification for the accuracy for the clocking out of HDMI audio packets from a Blu-ray player is quite tight. It should not be difficult for the AVR to find the incoming clock rate and keep to that rate with slow minor adjustments for drift, using an adequate sized buffer, and well designed clock rate control algorithms, thereby fully neutralizing any incoming jitter. I do mean "fully". No error in decoding the bits. And clocking out these 100% accurate bits smoothly and steadily so as to avoid any jitter being generated by the AVR itself.
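The buffering argument above can be sketched as a toy simulation (entirely my own illustration, not any AVR's actual firmware): samples arrive in jittery bursts, but a FIFO clocked out at a steady rate delivers them bit-perfect, merely delayed.

```python
from collections import deque
from itertools import cycle

samples = list(range(1000))               # the "correct" incoming audio data
arrival = iter(samples)
bursts = cycle([2, 0, 0, 2, 1, 1, 0, 2])  # jittery arrivals, mean 1 per tick

buffer = deque()
for _ in range(8):                        # pre-fill absorbs the arrival jitter
    buffer.append(next(arrival))

output = []
for _ in range(len(samples)):             # steady output clock, one pop per tick
    for _ in range(next(bursts)):         # whatever turned up this tick
        try:
            buffer.append(next(arrival))
        except StopIteration:
            break
    if buffer:
        output.append(buffer.popleft())

assert output == samples                  # bit-exact: incoming jitter neutralized
```

The buffer only fails if it under- or over-runs, which is exactly the over-aggressive rate matching scenario described above.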




Have a listen to the same BD player play a CD through both coaxial and HDMI and see which you prefer.

One has native higher jitter than the other.

Denon used to have DenonLink which used a separate clock signal for HDMI that I found really improved PCM playback.


2 hours ago, doggiehowser1503564764 said:

But I've always found that PCM signals are much more susceptible to jitter.

Whereas an encoded DTSHD or TrueHD stream will need to be decoded and reclocked inside the AVR/AVP itself and is less dependent on jitter. With high jitter in HDMI I invariably find TrueHD or HDMA to sound better than when I let the BDP do the decoding.

Yes, they are; the new Oppo 205 has better jitter clocking than previous models, which matches well with the Emo XMC-1 (it has asynchronous clocking) :)
 

Quote

 

HDMI Audio Jitter Reduction:
The UDP-205 features a high-stability, high-precision HDMI clock and a special HDMI audio jitter reduction circuit. This unique design significantly reduces jitter and eliminates timing errors, allowing you to enjoy your music with increased accuracy when you use the audio-only HDMI output port for connecting the audio signal. PCM and DSD signals rely on the HDMI clock directly, so the HDMI audio jitter reduction circuitry can improve the sound quality of PCM and DSD audio. For compressed bitstream audio, it helps to ensure error-free transmission, and may improve the audio performance depending on whether the audio decoder in the A/V processor or receiver uses a synchronous or asynchronous clock scheme.

 

Here's a post from an engineer @ Emotiva on the importance of good asynchronous clocking, and where it's done in the signal path...

Quote

The XMC-1 has an ASRC between its audio DSP stages.... which means that ANY digital audio passing through the XMC-1 is re-clocked to remove jitter. (This is true as long as the audio goes through the DSP, which includes all digital audio, and includes analog audio in any mode that includes processing options; in Reference Stereo Mode the analog input signal remains analog throughout, and doesn't go through the DSP, so it doesn't get re-clocked... or clocked at all.)

In essence, the only clock that has an impact on sound quality is the clock associated with the data WHEN IT IS FED INTO THE DAC. It is critical that the DAC receive the correct data at the correct times - as controlled by the clock. An imperfect clock at the input to the DAC will have the same effect as imperfect data - the output will be incorrect. The clock at other points along the signal path really doesn't matter as long as it's "cleaned up" when the signal gets sent to the DAC. (The data on a CD is a list of numbers. A clock is used to facilitate moving those numbers around, but its accuracy doesn't matter unless it's so far off that it actually causes data to be lost or corrupted. Imagine reading a description of events in a news article. The dates of the events described in the article won't change if you read it too slowly or too quickly, unless your voice actually becomes unintelligible. If the data is going straight from the CD mechanism to the DAC, without being re-clocked, then the rate of the clock matters, because it's the clock being used by the DAC to convert the data. However, if the data is being re-clocked, then the accuracy of the clock before that point doesn't matter much... because the data is reaching the DAC under the control of the last clock it passes through.)

The ASRC in the XMC-1 is permanently installed between the two DSPs.... it is ALWAYS enabled for any audio that passed through the DSPs.

 


43 minutes ago, doggiehowser1503564764 said:

One has native higher jitter than the other.

An AVR ought to be able to neutralize jitter whether from an incoming coaxial, optical, or HDMI signal.  If the AVR gives an audibly inferior result  from an incoming HDMI mixture of video and audio packets, compared with other digital sources, that may indicate a flaw in its processing of the packets of HDMI data.  Alternatively perhaps the HDMI connection is just on the edge of failure and some audio packets are being lost completely (though I haven't read about anyone having that precise issue for audio) as can happen with a very long HDMI cable giving rise to "sparklies" for the video. Provided the packets are actually getting through intact, it really shouldn't be so hard to buffer the correctly received contents, and thus neutralize jitter. 

I know that audiophiles who prefer 2-channel gear can get very concerned about jitter. I've listened to stereo test files with non-buffered artificially induced jitter and it does need to be pretty high to be audible for my hearing, much higher than the almost unmeasurable level of jitter of buffered LPCM streams.
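There is a standard back-of-envelope figure for this (a textbook DSP result, not something from the thread): sampling a full-scale sine of frequency f with RMS clock jitter sigma limits the achievable SNR to roughly -20*log10(2*pi*f*sigma).

```python
import math

def jitter_limited_snr_db(f_hz, sigma_s):
    """Approximate SNR ceiling (dB) for a full-scale sine at f_hz
    sampled with RMS clock jitter sigma_s (seconds)."""
    return -20.0 * math.log10(2.0 * math.pi * f_hz * sigma_s)

# Even 1 ns of RMS jitter on a 10 kHz tone leaves an SNR ceiling of
# about 84 dB, consistent with jitter needing to be "pretty high"
# before it becomes audible.
snr = jitter_limited_snr_db(10_000, 1e-9)
```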

 

3 minutes ago, cwt said:

Here's a post from an engineer @ Emotiva on the importance of good asynchronous clocking, and where it's done in the signal path...

" In essence, the only clock that has an impact on sound quality is the clock associated with the data WHEN IT IS FED INTO THE DAC. It is critical that the DAC receive the correct data at the correct times - as controlled by the clock. An imperfect clock at the input to the DAC will have the same effect as imperfect data - the output will be incorrect. The clock at other points along the signal path really doesn't matter as long as it's "cleaned up" when the signal gets sent to the DAC. (The data on a CD is a list of numbers. A clock is used to facilitate moving those numbers around, but its accuracy doesn't matter unless it's so far off that it actually causes data to be lost or corrupted. Imagine reading a description of events in a news article.  "


Ah yes, that's what I was trying to say in my previous post. The critical element is not what jitter may have been present in the signal during its journey to the AVR in a digital form. What counts is how the AVR assembles the correctly received incoming digital data and feeds it to its DACs.

 

1 hour ago, doggiehowser1503564764 said:

Have a listen to the same BD player play a CD through both coaxial and HDMI and see which you prefer.

That would be easier to evaluate if the two happened to give exactly the same audible level when the AVR was switched between the two connections. I'll see what my recently acquired UHD BD player does in conjunction with an AVR I've had for a few years, and report back.


55 minutes ago, MLXXX said:

That would be easier to evaluate if the two happened to give exactly the same audible level when the AVR was switched between the two connections. I'll see what my recently acquired UHD BD player does in conjunction with an AVR I've had for a few years, and report back.

My Samsung UBD-K8500 player has optical out, no coaxial out. I connected the player via its optical output and its second (or "SUB")  HDMI output to a mid-priced AVR  (a Pioneer VSX-820). The AVR was connected to good quality main speakers (psb Imagine T Towers).

Although I didn't measure the main speaker output level with instruments, the sound level appeared to be the same when switching between the Toslink and HDMI inputs. And for my partner and for me the sound quality appeared to be exactly the same. This was when playing CD tracks with solo cello, and cello + orchestra.

Certainly there was no obvious difference in the sound when swapping between HDMI and optical cable. 




1 hour ago, doggiehowser1503564764 said:

So Emotiva's and Oppo's improved HDMI clocks are useless?
 

 

It only works on PCM, like you suggested; the Oppo 205 also utilises it for DSD. But no, according to them it doesn't do much for losslessly packed formats... depending on the style of decoder...

 

  • "HDMI Audio Jitter Reduction
    The UDP-205 features a high-stability, high-precision HDMI clock and a special HDMI audio jitter reduction circuit. This unique design significantly reduces jitter and eliminates timing errors, allowing you to enjoy your music with increased accuracy when you use the audio-only HDMI output port for connecting the audio signal. PCM and DSD signals rely on the HDMI clock directly, so the HDMI audio jitter reduction circuitry can improve the sound quality of PCM and DSD audio. For compressed bitstream audio, it helps to ensure error-free transmission, and may improve the audio performance depending on whether the audio decoder in the A/V processor or receiver uses a synchronous or asynchronous clock scheme."

and further info here,

https://www.oppodigital.com/KnowledgeBase.aspx?KBID=129&ProdID=UDP-205

but then read the below post from an owner,

"Official OPPO UDP-205 UHD Blu-ray Player Owner's Thread

I'm actually debating and thinking about returning the Oppo 205 and just keeping my 105. I paid retail and it's just not a big enough upgrade over the 105. I'm not getting my uhd tv until later in the year and in the meantime maybe I can get one later down the line discounted online. It works and sounds great but so did my 105. emoji482.png to all enjoying there new player. "
 
So any benefit is in the yet-to-be-proven category for now, I would suggest.
 
As a note, I have personally owned quite a few items over the last decade or so with jitter elimination methods... starting from i.LINK, which Pioneer, Denon, Marantz and Sony utilised back in the CD and SACD days, i.e. for PCM and DSD... that's how far this goes... and even earlier with Denon utilising Denon Link, which it has done over many iterations: over LAN, then over S/PDIF, and also over just HDMI. There is also Pioneer, who utilised "PQLS" over HDMI.
 
So, lots of effort over the years to reduce jitter over HDMI...

34 minutes ago, doggiehowser1503564764 said:

So Emotiva's and Oppo's improved HDMI clocks are useless?
 

As implied earlier, doggie, there are well-engineered jitter solutions and ones that don't cut the mustard. There's a close relationship between noise and jitter; it's illustrated well in the following PDF. Even HDMI.org introduced a low-jitter mechanism with HDMI 1.3 ARC [audio rate control] that synced the timing of the sink's clock with the video clock; it didn't catch on... same as Sony's HATS and Pio's PQLS, as you know

Quote

The ADC
 An ADC converts an analogue electrical signal of some volts into a digital signal. For this the audio signal has to be sampled. How precise the obtained digital signal will represent the original analogue audio depends on the sampling rate and the jitter on the sampling signal. This sampling signal is supplied by a so called clock oscillator. A noisy/jittery clock signal will distort the digital audio representation, which errors can't be corrected later on!

There's a good example under .5 of how manufacturers can do a better circuit by re-clocking and avoiding any crosstalk: ''The clock- and data-signals should be re-clocked in different circuits, so that no cross talk will take place''. There are 3 examples, each one a better implementation ;)

https://www.by-rutgers.nl/PDFiles/Audio Jitter.pdf


2 hours ago, doggiehowser1503564764 said:

So Emotiva's and Oppo's improved HDMI clocks are useless?
 

Well, they will give peace of mind to certain fastidious buyers. Using DVD or Blu-ray players to play CDs has been frowned upon in two-channel audiophile circles because of concerns about the video disturbing the audio, and about the audio data being compromised by jitter.

Using a very stable HDMI clock for controlling the sending of packets of data into the HDMI cable and specifically referring to jitter reducing qualities in promotional material for a Blu-ray player may be enough for that Blu-ray player to be sanctioned for use by serious 2-channel audiophiles. It will of course not satisfy all of them! Some will still insist on a dedicated CD transport.

Buyers may see the special HDMI clocking as insurance against having any possible issues with jitter. It doesn't matter whether they have ever heard audible jitter in their lifetimes attributable to the reproduction chain, as distinct from the recording chain. (I note that very early transfers to CD might have involved relatively high levels of jitter in the sampling of the analogue output of open reel tape recorders: very possibly inaudible to the human ear, but high by today's technical standards.)
 

 

1 hour ago, cwt said:

The ADC
 An ADC converts an analogue electrical signal of some volts into a digital signal. For this the audio signal has to be sampled. How precise the obtained digital signal will represent the original analogue audio depends on the sampling rate and the jitter on the sampling signal. This sampling signal is supplied by a so called clock oscillator. A noisy/jittery clock signal will distort the digital audio representation, which errors can't be corrected later on!

Analogue to Digital Converters for sound were originally primarily the domain of studio recording engineers. Two-channel audiophiles can still avoid them altogether by using amplifiers that run entirely in analogue mode.

But the advent of Audyssey and other digital room correction algorithms has tempted many 2-channel audiophiles into digital processing. And it seems to have become the norm for home theatre buffs. So we find that ADCs these days commonly do play a role when music is being reproduced in the home.

AVRs operate in the digital domain, converting any analogue input (such as from the RCA line out of a radio tuner, or from a vinyl disc cartridge) to digital using an ADC. And AVRs resample incoming digital audio (such as 44.1kHz stereo PCM from a CD player) to a sample rate and bit depth the AVR can use for tone control and other effects such as reverb, gain control, delay for certain channels depending on where speakers have been placed, and -- if activated -- room correction. After all of that has been achieved, there is a conversion to the analogue domain using a DAC for each analogue power amplifier channel.
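The resampling step in that chain can be sketched crudely (my own toy example; real AVRs use far better polyphase/ASRC filters than the linear interpolation here):

```python
def resample_linear(x, sr_in, sr_out):
    """Resample a list of samples from sr_in to sr_out by linear
    interpolation between neighbouring input samples."""
    n_out = int(len(x) * sr_out / sr_in)
    out = []
    for n in range(n_out):
        t = n * sr_in / sr_out          # position in input-sample units
        i = int(t)
        frac = t - i
        a = x[i]
        b = x[i + 1] if i + 1 < len(x) else x[i]  # clamp at the end
        out.append(a + frac * (b - a))
    return out

ramp = list(range(441))                  # 10 ms of signal at 44.1 kHz
up = resample_linear(ramp, 44100, 48000)
assert len(up) == 480                    # 10 ms at 48 kHz
```

The quality of the interpolation filter is where real ASRC designs differ; the sample-count bookkeeping is the same.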
 

This extensive use of digital processing for room correction and other functions makes it very important to use stable low-noise clocking within the home theatre AVR itself. The digital buck stops with the AVR itself: it performs the all-critical final DAC conversions. 

As for any incoming digital audio stream, a well designed AVR ought to be able to tidy that up with appropriately sized buffers and with appropriately designed rate control for the clocking out of the content stored in those buffers.

6 hours ago, doggiehowser1503564764 said:


One has native higher jitter than the other.

Denon used to have DenonLink which used a separate clock signal for HDMI that I found really improved PCM playback.

Jitter has never been an issue for me. I had to go searching for artificially created examples on the net to get some idea of what it might sound like. That's the only time I'm specifically aware of having heard it.  Perhaps I'm not particularly sensitive to the effects of jitter.

Years ago, in the era of audio cassettes, I would hear wow and flutter, distortion, hiss, and high frequency roll off. And I was never impressed with the sound from vinyl discs, even with premium cartridges. (I can't understand the love affair some audiophiles still have with vinyl.) Today my concern is with low and medium bitrates used for digital radio broadcasting, not because I particularly wish to listen to DAB+ at its current usual standard in Australia, but because it has the potential in the long-term to lead to a decision for the spectrum space currently allocated in Australia for FM broadcasting to be used for non-radio purposes. Despite FM radio having limitations to its sound quality even in the home and with a good antenna, I find it far more agreeable to listen to than moderate bitrate digital radio. I'd hate to see FM go without being replaced by something better.

Edited by MLXXX

There's a good example under .5 of how manufacturers can do a better circuit by re-clocking and avoiding any crosstalk: ''The clock- and data-signals should be re-clocked in different circuits, so that no cross talk will take place''. There are 3 examples, each one a better implementation [emoji6]
https://www.by-rutgers.nl/PDFiles/Audio Jitter.pdf

I'm aware of ARC from before it was the acronym for Audio Return Channel, but from what I recall very few companies adopted it, instead focusing on their own proprietary systems. I think HATS and PQLS from Sony and Pioneer didn't work across systems, even though they were derived from the ARC.


