Why digital seems to be affected by power and cables


jolon


Yes... so, can digital sound analogue... Yes :)

 

Sort of simply...

 

Digital audio is pulse-code modulation (PCM, or DSD) of groups of 1s and 0s (16/24/etc. bit words), where a 1 is effectively like a clipped/truncated analogue sine wave, which is effectively noise... I believe!

 

This is why a good power cable and interconnect (and good connectors) are important: noise + noise equals more noise, which is bad for digital signal transmission integrity and for analogue playback, especially for (low energy) harmonic reproduction. Our ears usually do not like the resulting noise-affected DAC reproduction. With analogue playback, noise is more easily absorbed/integrated because no conversion/translation is required, although the result is usually harmonic smearing/coagulation (which may not appeal to some people). It appears that our ears usually do not find analogue (low energy) harmonic smearing/coagulation as offensive as noise-affected DAC reproduction/translation :)

 

Can noiseless digital sound as good as noiseless analogue? ... theoretically NO (although both formats will sound excellent), because analogue will contain more harmonic content - but neither format is noiseless!

Can low noise digital sound as good as low noise analogue? ... theoretically YES, because both formats may contain about the same (real) harmonic content, and digital may even sound better due to (possibly) less harmonic smearing/coagulation :)

Can low cost digital sound as good as low cost analogue? ... well, YES and NO, because noise is the issue/killer in this area and it will depend on how it has been treated/absorbed!

Edited by Ping



Yes... so, can digital sound analogue... Yes :)

 

Sort of simply...

 

Digital audio is pulse-code modulation (PCM, or DSD) of groups of 1s and 0s (16/24/etc. bit words), where a 1 is effectively like a clipped/truncated analogue sine wave, which is effectively noise... I believe!

 

 

 

It depends on the digital format.  SPDIF is bi-phase mark encoded, with the clock embedded in the data stream.  USB uses NRZI (non-return-to-zero inverted) signalling, where each data packet is prefixed with a sync field for clock recovery.
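To make the S/PDIF scheme concrete, here is a minimal sketch of biphase-mark coding (my illustration, not taken from the spec text): the line toggles at every bit-cell boundary regardless of the data, which is what lets the receiver recover the clock from the stream.

```python
def biphase_mark_encode(bits, level=0):
    """Biphase-mark coding, as used by S/PDIF: the line level toggles
    at the start of every bit cell (so the clock is recoverable), and
    a '1' adds an extra mid-cell transition while a '0' does not."""
    halves = []
    for bit in bits:
        level ^= 1            # transition at every cell boundary
        halves.append(level)  # first half of the bit cell
        if bit:
            level ^= 1        # extra mid-cell transition encodes a '1'
        halves.append(level)  # second half of the bit cell
    return halves

print(biphase_mark_encode([1, 0, 1, 1, 0]))
```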

 

I'm not sure what you mean by "harmonic content".  A harmonic is an integer multiple of a fundamental frequency.  What does the term mean to you?


Harmonic content/elements...

 

If you were to break music down into pure mathematics, you would end up dealing with acoustics and harmonics. An instrument's acoustics and harmonics define that instrument's unique sound. As an acoustic note decays (loses energy), harmonics of that note are produced. Instruments get their specific sounds, their timbre, because their sound comes from many different tones all sounding together at different frequencies, and from the decay each of these tones produces.

 

Essentially all instruments produce overtones, which are frequencies other than the dominant frequency of the note. When an overtone is an integer multiple of the base (or fundamental) frequency, it is called a harmonic.

 

On a guitar, when you strike an open string, the wavelength λ of the sound produced is double the length of the string. When you play the twelfth fret, the wavelength is half of that (λ / 2), so the frequency is doubled. The sound produced by the open string actually has that doubled frequency as a harmonic. You can think of it as the string vibrating at both frequencies, as if you were somehow playing both the open string and the twelfth fret at the same time. In fact, a vibrating guitar string has components at many multiples of the base frequency (call it F). To your ear it still sounds like the fundamental, but mathematically it's more like this:

 

a*F + b*2F + c*3F + ...

 

The higher-frequency elements/content give the note its timbre, and this is how you can tell two instruments apart, or even tell different kinds of guitar strings apart. For example, the sound where a=1, b=0.6, c=0.3 will sound different from a=1, b=0.5, c=0.4. Note that a is usually the largest coefficient, since F is the fundamental frequency. If it weren't, it would sound like you were playing a different note, or multiple notes.
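As a small illustration of that weighted sum (the coefficients are the ones from the example above; the 220 Hz fundamental and the sample rate are my own assumptions):

```python
import numpy as np

def synth_note(F, coeffs, sr=44100, dur=1.0):
    """Sum sinusoids at integer multiples of F, weighted by coeffs -
    i.e. a*F + b*2F + c*3F + ... as in the expression above."""
    t = np.arange(int(sr * dur)) / sr
    return sum(a * np.sin(2 * np.pi * (k + 1) * F * t)
               for k, a in enumerate(coeffs))

# Same pitch, different harmonic weights -> different timbre:
bright = synth_note(220.0, [1.0, 0.6, 0.3])  # a=1, b=0.6, c=0.3
mellow = synth_note(220.0, [1.0, 0.5, 0.4])  # a=1, b=0.5, c=0.4
```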

 

Imagine a pure sine wave, x, with period T. The note one octave higher is the sine wave y with period T/2. If you add the two together, that's the harmonic effect. The frequency of the resulting signal is 1/T, with a clear sub-frequency of 2/T.

[Attached images: the x, y, and x + y waveforms]

 

Note that the second half of the combined wave (x + y) is the inverse of the first half, just like any regular sine wave. The period is still T, but the distinct crests make it "feel" like it has period T/2. Note that normally y would have a smaller amplitude than x, since it is not the fundamental.

Edited by Ping

Harmonic content/elements...

 

Which is precisely what I said.  So how does your web definition of harmonic content relate to the difference in sound perception between analogue and digital media?  You need to be specific and technical here....




Actually, no, not really; this is a forum and I have given my explanation, which for me is sufficient! What would you like made clearer? You are welcome to be as specific and technical as you want to be; I will not take offense :)  And very nice that you provided some measurements :) It is good for people to know that even reasonable RCAs can be as good as true high-quality 75-ohm BNCs, and that RCAs can be even better :)

 

The Topic though is "Why digital seems to be affected by power and cables" and that it can sound very analogue. I assume that includes SPDIF, USB, etc. - all digital formats :)

 

So be as specific and technical as you want to be (but please expect comments from others) and I am sure that the OP will appreciate it :)

Edited by Ping

One type of thing "more sensitively affected by power and cables" than digital audio converters themselves is the test equipment used to measure them.

 

Manufacturers of test equipment (e.g. Audio Precision) have a lot of published knowledge on how to use and deploy test equipment... because, of course, you can't measure an audio system for the types of issues being discussed if the device doing the measurement is also plagued by the same issues.

 

It's all very dry of course, but the point is that the issues/solutions to the question posed in the thread title are at completely the other end of the spectrum from controversial or poorly understood.


We listen to music - sine wave tests are irrelevant IMO.

 

Perhaps consider whether you understand the types of things a "sine wave test" is able to tell you about a device's ability to carry any type of signal?

 

I don't know the answer to that question ....   but the word "irrelevant" does strike a chord.

 

 

It is certainly the case that people do "a test" (eg. one involving a sine wave) and then use those test results to make claims which are unjustified .....  but that is not always the case.

 

... and a claim similar to  "sine waves don't represent real music" ....  with no further caveats, raises alarm bells.


The observation is very audible, so how are the observed differences measured, realtime!

 

It is usually very difficult.....   but your implication that a "realtime" (assuming with "music") measurement is required is generally unjustified.

 

In fact, specific test signals can be used which would be even more revealing of the problems under discussion than music itself.


Perhaps consider whether you understand the types of things a "sine wave test" is able to tell you about a device's ability to carry any type of signal?

 

I don't know the answer to that question ....   but the word "irrelevant" does strike a chord.

 

 

It is certainly the case that people do "a test" (eg. one involving a sine wave) and then use those test results to make claims which are unjustified .....  but that is not always the case.

 

... and a claim similar to  "sine waves don't represent real music" ....  with no further caveats, raises alarm bells.

 

You are absolutely correct, Dave. :thumb:

 

I was simply being short in my reply ... if I had added caveats, I would've had to spend an hour replying.  :sorry:

 

Playing a sine wave through your system and listening to what comes out of the speakers will, I suggest ... not be very enlightening.

Playing a percussive track and listening to how the leading edges of the cymbals, rim-shots and triangles sound ... will (be enlightening)!! :D

 

 

Regards,

 

Andy




Data transmission differences (errors) should not occur with USB, because USB transmission carries checksummed, packetised digital data

 

However, those checksums are NOT used to retransmit data (without custom drivers; none are known to me).

 

 

what measurable changes to the data could there be?

 

The correct 1s and 0s (in the right order) are not difficult to achieve....    but it is possible that the timing of the data on the DAC side of the USB receiver can be affected by what happens on the computer side (by various complex means)

 

.... leading to jitter in the digital signal at time of conversion.
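For a rough sense of scale of that jitter argument: the worst-case amplitude error from sampling-clock jitter on a full-scale sine is approximately the slew rate times the timing error, i.e. 2*pi*f*dt. A quick sketch (the 1 ns figure is illustrative, not a measured value):

```python
import math

def jitter_error_dbfs(f_hz, jitter_s):
    """Worst-case error from sampling-clock jitter on a full-scale
    sine wave: amplitude error ~= slew rate * timing error = 2*pi*f*dt,
    expressed here relative to full scale in dB."""
    return 20 * math.log10(2 * math.pi * f_hz * jitter_s)

print(jitter_error_dbfs(20e3, 1e-9))  # 1 ns jitter at 20 kHz: ~ -78 dBFS
```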


I'm in agreement with just about all of that (but not the harmonic content stuff - sorry Ping).  Setting up a good clinical digital link to minimise clock-recovery jitter is one thing (necessary, but not sufficient).  But that is only one side of the coin - the difficulties are always to do with the conversion of the digital data to an analogue signal.  And that depends not only on the DAC itself, but also on circuit board layout, quality of power supplies, etc.

 

Actually, consider the following - every digital data communication method between two pieces of audio gear is serial.  In one way or another the clock is embedded in the data itself.  Even so, it is possible to synchronise everything to a master clock, provided that the equipment has a word clock input (very few do), and there is some evidence that you can hear what ought to be vanishingly small imperfections in master clock performance in high-resolution gear.

 

But if you had a clean sheet to design the best possible digital data link, what would it be?  Since all the digital-domain processing in whatever bit of gear you are considering is done as 16 (or 24) bit parallel words, which are then used in the same format in the DAC - why convert to serial in the first place, with all its shortcomings?  Why not use a parallel twisted-pair differentially-signalled connection, and synchronise using a master clock?

 

Serial (SPDIF, USB, CAT5,6,7 etc) is a bit like the old guy leaning on the five bar gate, and the lost motorist stops to ask directions.  The old boy looks at him and says "Well, if I was you I wouldn't have started from here".

Edited by CraigS

 

Serial (SPDIF, USB, CAT5,6,7 etc) is a bit like the old guy leaning on the five bar gate, and the lost motorist stops to ask directions.  The old boy looks at him and says "Well, if I was you I wouldn't have started from here".

 

That's a beauty, now I get it....lol


I'm in agreement with just about all of that (but not the harmonic content stuff - sorry Ping).  Setting up a good clinical digital link to minimise clock-recovery jitter is one thing (necessary, but not sufficient).  But that is only one side of the coin - the difficulties are always to do with the conversion of the digital data to an analogue signal.  And that depends not only on the DAC itself, but also on circuit board layout, quality of power supplies, etc.

 

Actually, consider the following - every digital data communication method between two pieces of audio gear is serial.  In one way or another the clock is embedded in the data itself.  Even so, it is possible to synchronise everything to a master clock, provided that the equipment has a word clock input (very few do), and there is some evidence that you can hear what ought to be vanishingly small imperfections in master clock performance in high-resolution gear.

 

But if you had a clean sheet to design the best possible digital data link, what would it be?  Since all the digital-domain processing in whatever bit of gear you are considering is done as 16 (or 24) bit parallel words, which are then used in the same format in the DAC - why convert to serial in the first place, with all its shortcomings?  Why not use a parallel twisted-pair differentially-signalled connection, and synchronise using a master clock?

 

Serial (SPDIF, USB, CAT5,6,7 etc) is a bit like the old guy leaning on the five bar gate, and the lost motorist stops to ask directions.  The old boy looks at him and says "Well, if I was you I wouldn't have started from here".

 

The Topic though is "Why digital seems to be affected by power and cables" and that it can sound very analogue. I assume that includes SPDIF, USB, etc. - all digital formats :)

All good, but are you saying that harmonic integrity is not important to ensuring that digital reproduction sounds like analogue reproduction?  Especially as harmonics (overtones) identify a singer or an instrument and provide the analogueness (made that word up :) ) of the reproduction :) Do you think that we should be using I2S?

 

USB should not have a clock or cable issue, but davewantsmoore has provided an interesting answer re USB :) which appears to say that USB is no better than SPDIF or I2S, and perhaps/probably not as good :)

Edited by Ping



I2S has some distinct potential advantages - although the left and right channels are multiplexed onto a single line, at least the clock is sent separately.  It is not without problems, however - we used it for internal comms inside an instrument which was most definitely not audio.  Both send and receive were implemented to industry best practice.  The (short - 20cm max) cables were so prone to interference pick-up that the system real-time clock would randomly reset, and every now and again the microcontroller would crash.  We ended up using galvanic isolation on each and every one of the I/O lines, which cured the problem at a stroke.

 

Also the Philips I2S specification does not define cable impedance, load impedance or whether galvanic isolation is recommended or necessary.  At least they made a better fist of their (and Sony's) SPDIF definition, imperfect though the link is.
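As a toy illustration of the I2S framing described above (the real spec also delays the data one bit-clock after each word-select edge, which is omitted here for brevity):

```python
def i2s_bits(stereo_samples, bits=16):
    """Serialise stereo samples into an I2S-like stream of
    (word_select, data_bit) pairs: WS low selects the left channel,
    WS high the right, and each sample is clocked out MSB-first."""
    for left, right in stereo_samples:
        for ws, sample in ((0, left), (1, right)):
            for i in reversed(range(bits)):
                yield ws, (sample >> i) & 1

stream = list(i2s_bits([(0x1234, 0xABCD)]))  # one stereo frame
print(len(stream))  # 32 (ws, bit) pairs
```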

 

USB was never designed for audio use - it has gained acceptance because PCs use it.  Then there is the defunct FireWire, and now Thunderbolt from Apple - so if you stream from an Apple product, your DAC needs that interface too.

 

Still not sure what the issue is with harmonic integrity.  If the frequency response is flat (and DACs are ruler-flat in general), musical instrument harmonics are all going to be in the correct relationship.  If what you are saying is that harmonics can drop below the noise floor, that is going to be much worse with vinyl, since the SNR is only 70dB or so, whereas with digital sources it is >100dB, and signal distortion is several orders of magnitude lower.  Of course the ear/brain is a funny old thing and has a low tolerance for particular types of nonlinearity, which is why CD sounded absolutely dire until relatively recently - but that was nothing to do with mucking up instrument harmonics.
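For a sense of scale, those SNR figures expressed as linear amplitude ratios:

```python
def snr_db_to_ratio(db):
    """Convert an SNR in dB to a signal/noise amplitude ratio."""
    return 10 ** (db / 20)

print(snr_db_to_ratio(70))   # vinyl, ~70dB:    about 3160:1
print(snr_db_to_ratio(100))  # digital, >100dB: at least 100000:1
```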

Edited by CraigS

I haven't said much about mains cables and their effect on sound quality.  I have to declare an interest here: the Russ Andrews Superkord cables are my design.  Generally - yes, mains cables make a difference.  Why?  Management of RFI getting into and out of equipment in a massively RF-polluted world, and it is getting worse each year.

 

Why not use a filtered IEC socket?  Because they are only specified to the regulatory limit of 30MHz, whereas the most powerful sources of interference run up to 1GHz (and beyond) - mobile phones, wifi, networking over mains etc.  And unfortunately you can (for some reason I don't have a credible physics explanation for) hear the ferrite-cored chokes in the filtered IECs.  I take them out and put an unfiltered IEC in their place in every piece of gear I own (Furutech make some very tasty ones).


If what you are saying is that harmonics can drop below the noise floor, that is going to be much worse with vinyl, since the SNR is only 70dB or so, whereas with digital sources it is >100dB, and signal distortion is several orders of magnitude lower.  Of course the ear/brain is a funny old thing and has a low tolerance for particular types of nonlinearity, which is why CD sounded absolutely dire until relatively recently - but that was nothing to do with mucking up instrument harmonics.

Yes, but this effect seems to be worse for digital than analogue to our ears/brain, perhaps due to the loss of harmonics during the recording (compression) process, or the conversion/translation process, or both :)


Neither of which works at all over a few tens of kHz.  Let's put it this way - how many times do you hear the zzt zzt sound when your mobile phone is picked up by your audio system?  That is 800MHz getting into the case via mains and interconnects, being radiated in there and being demodulated by semiconductor junctions.

 

Just look at a few spec sheets for PSRR and CMRR as a function of frequency for a few chips.




This is one of the best audio op-amps around - the LM4562.  Low noise, vanishingly small distortion.  Datasheet here http://www.ti.com/lit/ds/symlink/lm4562.pdf

 

PSRR is not symmetrical (this is typical) and is worse for the negative power supply than the positive.  Negative-line PSRR starts off pretty great at 20Hz, at 120dB or so.  By 200kHz, where the data stops, it has degraded to 50dB and shows no sign of levelling off.  What happens at 1MHz, 10MHz... up to 1GHz?  Nothing nice.
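A quick illustrative calculation of what that degradation means for supply ripple reaching the output (the 1 mV ripple figure is hypothetical):

```python
def ripple_at_output(ripple_volts, psrr_db):
    """Supply ripple leaking to the output for a given PSRR in dB."""
    return ripple_volts / (10 ** (psrr_db / 20))

print(ripple_at_output(1e-3, 120))  # 20Hz, 120dB PSRR:  ~1 nV
print(ripple_at_output(1e-3, 50))   # 200kHz, 50dB PSRR: ~3.2 uV
```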

 

CMRR is better, but also degrades progressively with frequency - and in I-to-V converters and unity-gain buffers it suffers from a phenomenon called common mode distortion.  Not well publicised, this one.  The LM4562 is better on this than just about everything else out there, though.


... and in I-to-V converters and unity-gain buffers it suffers from a phenomenon called common mode distortion.  Not well publicised, this one ...

Interesting ... does this occur with balanced XLR line inputs?

Edited by Ping

Yes.  Common mode distortion is a feature of XLR connections if implemented wrongly.  One thing I ought to have made clear is that this is only an issue when the source impedance (ie the feedback and input resistors) is relatively high.  If you choose a decent opamp and run it with resistors of less than a few k, you are *probably* OK.  In fact the best way to do it for balanced is to use two unity-gain buffers, one on + and one on - (resistors = 0, so no common mode distortion), where you can set the input impedance to a decent value of 100k, followed by a regular differential circuit using resistors of say 1k.  Common mode distortion is essentially not an issue with that.  But use 47k feedback and series resistors and you will have a problem with most ICs.

 

But the main thing is RFI.  There are two issues - differential mode and common mode RFI.

 

Differential mode is the normal way a balanced input works, helping to defeat hum loops.  But since semiconductors are not perfect, RF getting into the differential inputs will definitely be a bad thing (demodulated into the audio band, or cross-modulated with the audio stream, producing music-coherent distortion).  The usual way of dealing with this is to use an RC filter, or multiple RCRC filters, on the + and - lines to restrict the range of frequencies getting in.
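For a feel of the component values involved (the R and C figures here are hypothetical, not from the post), the -3dB corner of such an RC filter is 1/(2*pi*R*C):

```python
import math

def rc_corner_hz(r_ohms, c_farads):
    """-3dB corner frequency of a single-pole RC low-pass filter."""
    return 1 / (2 * math.pi * r_ohms * c_farads)

# Hypothetical values: 1k series resistor, 100pF to ground
print(rc_corner_hz(1e3, 100e-12))  # ~1.6 MHz: passes audio, attenuates RF
```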

 

There are implementation problems with balanced inputs which can still inject substantial RF through the cable shield, connected to pin 1 of the XLR - known as the "pin 1 problem".  The best treatises on this are http://www.rane.com/note151.html and http://www.rane.com/note165.html .

 

Common mode noise is a much bigger problem, ie when both + and - lines move in phase.  Leaving aside RF for the moment, CMRR in a real-world link is massively dominated by resistor tolerance - see http://sound.westhost.com/articles/balanced-interfaces.pdf .  Now Bill Whitlock is the founder of Jensen Transformers, so his conclusion is: either use a transformer, or one of these http://www.thatcorp.com/1200-series_High_CMRR_Balanced_Line_Receiver_ICs.shtml .  The THAT opamp is great for CMRR, but is not particularly low noise or low distortion.  And it will also degrade with frequency, although THAT's datasheet shows a network that is effective for both differential and common mode RF: http://www.thatcorp.com/datashts/THAT_1200-Series_Datasheet.pdf
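As a rough sketch of the resistor-tolerance point: a common approximation for the worst-case CMRR of a simple four-resistor differential stage is (G + 1)/(4t), with G the stage gain and t the resistor tolerance, assuming an otherwise perfect op-amp:

```python
import math

def worst_case_cmrr_db(gain, tolerance):
    """Worst-case CMRR of a four-resistor differential stage, limited
    purely by resistor tolerance: (G + 1) / (4 * t), perfect op-amp."""
    return 20 * math.log10((gain + 1) / (4 * tolerance))

print(worst_case_cmrr_db(1, 0.01))   # 1% resistors:   ~34 dB
print(worst_case_cmrr_db(1, 0.001))  # 0.1% resistors: ~54 dB
```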

 

Like all these things, the devil is in the detail - there are lots of gotchas.  Properly implemented, a balanced connection via XLR is capable of excellent performance - it is of course used in both recording studios and live performances with hundred-metre cable runs.  Do I have confidence that domestic audio (or pro audio for that matter) manufacturers are clinical and rigorous with their balanced drivers and receivers?  Well no, alas.

 

We have not even touched on balanced cables.  Just to fuel your paranoia, there is something called shield-induced current noise: http://www.rane.com/note166.html .

Edited by CraigS

Yes.  Common mode distortion is a feature of XLR connections if implemented wrongly.  One thing I ought to have made clear is that this is only an issue when the source impedance (ie the feedback and input resistors) is relatively high.  If you choose a decent opamp and run it with resistors of less than a few k, you are *probably* OK.  In fact the best way to do it is to use a unity-gain buffer (resistors = 0, so no common mode distortion), where you can set the input impedance to a decent value of 100k, followed by an inverter using resistors of say 2.2k.  Common mode distortion is essentially not an issue with that.  But use 47k feedback and series resistors and you will have a problem with most ICs.

 

... Like all these things, the devil is in the detail - there are lots of gotchas.  Properly implemented, a balanced connection via XLR is capable of excellent performance - it is of course used in both recording studios and live performances with hundred-metre cable runs.  Do I have confidence that domestic audio (or pro audio for that matter) manufacturers are clinical and rigorous with their balanced drivers and receivers?  Well no, alas.

 

We have not even touched on balanced cables.  Just to fuel your paranoia, there is something called shield-induced current noise: http://www.rane.com/note166.html

Interesting... so these could easily be a few of the many reasons that a low-noise unbalanced/RCA design (which is less complicated) can be, and probably is, superior to a balanced/XLR design :) for domestic HiFi, and even within mixing desks.   A friend has always said to me that the best mixing desk he ever used was unbalanced, though of course the desk accepted balanced mic connections :)

Edited by Ping

It is very difficult to design a balanced input which approaches an unbalanced input for noise.  Partly this is because most balanced inputs use high-value resistors of typically 10k, and have an input-referred noise of about -105dBu.  A good single-ended input can get close to -120dBu, so 15dB quieter than a simple balanced input.  That is a *lot* quieter.  So for noise (hiss), an unbalanced input can always win if done properly.

 

You *can* design a balanced input with <-115dBu noise, so a lot quieter than the regular balanced input which is most frequently used, and close to an unbalanced input - but you have to use a lot of opamps to do it (12!).
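For a sense of scale, converting those dBu noise figures to volts (0 dBu = 0.7746 V RMS):

```python
def dbu_to_volts(dbu):
    """Convert a dBu level to volts RMS (0 dBu = 0.7746 V RMS)."""
    return 0.7746 * 10 ** (dbu / 20)

print(dbu_to_volts(-105))  # simple balanced input noise: ~4.4 uV
print(dbu_to_volts(-120))  # good unbalanced input:       ~0.77 uV
```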

Edited by CraigS


