This may be a red herring, but I've seen some self-proclaimed "audiophile"
publications which claim that when headphones are driven from a resistive
source impedance of a few tens of Ohms, they "sound better" than when they
are driven from an ideal (very low impedance) voltage source. As far as I
recall, these statements were referring to listening tests of
professional-quality headphones with nominal impedances of 200 Ohms.
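For what it's worth, the basic electrical effect is easy to estimate: the
source resistance and the headphone impedance form a voltage divider, so any
frequency-dependence in the headphone's impedance becomes a frequency-dependent
level change. Here's a rough back-of-envelope sketch (the 30 Ohm source and the
200/350 Ohm impedance figures are illustrative assumptions on my part, not
values taken from those publications):

    import math

    def level_change_db(z_load, z_source):
        # Voltage-divider attenuation relative to an ideal (0 Ohm) source
        return 20 * math.log10(z_load / (z_load + z_source))

    for z_source in (0.0, 30.0):
        nominal = level_change_db(200.0, z_source)  # away from driver resonance
        peak = level_change_db(350.0, z_source)     # at the impedance peak
        print("Rs = %2.0f Ohm: %5.2f dB nominal, %5.2f dB at peak"
              % (z_source, nominal, peak))

With those assumed numbers, the response ripple introduced by a 30 Ohm source
works out to only about half a dB for a 200 Ohm headphone, which is a small
effect.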
These publications didn't present any measurements or analysis to suggest why
this might be the case, but the output resistance added to many headphone
amps might not be there solely to prevent damage or distortion; it might also
be there to persuade audiophiles that they're getting the best sound quality.
For music produced and mixed to be listened to via loudspeakers, adding a
series resistor may indeed make the headphones sound more like what the
original mixing engineer or producer intended, but for scientific
perceptual experiments I can't see any advantage in artificially increasing
the source resistance.
Steve Beet
-----Original Message-----
From: AUDITORY - Research in Auditory Perception
[mailto:AUDITORY@xxxxxxxxxxxxxxx] On Behalf Of Bob Masta
Sent: 11 December 2014 17:03
To: AUDITORY@xxxxxxxxxxxxxxx
Subject: Re: USB sound cards
However, there *is* a problem getting low output impedance as well. The
native design of modern amplifier stages relies on negative feedback to give
essentially zero output impedance (milliohms or less). That means that if
you connect such an amp to a low-impedance load, the current draw can be
high... high enough to damage the output stages, or at least cause massive
distortion as they go into protective current limiting. Since these are for
consumer use, where anyone can plug in most anything that fits the jack,
manufacturers typically add some output impedance.
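To put rough numbers on that trade-off (the 1 Vrms drive level, 16 Ohm load
and 32 Ohm series resistor below are purely illustrative assumptions, not
figures from any particular device):

    def load_current_ma(v_rms, r_load, r_source):
        # RMS current (in mA) the output stage must supply
        return 1000.0 * v_rms / (r_load + r_source)

    for r_source in (0.0, 32.0):
        i = load_current_ma(1.0, 16.0, r_source)
        print("Rs = %2.0f Ohm -> %4.1f mA into a 16 Ohm load" % (r_source, i))

With a direct (near-zero impedance) output the stage has to supply about
62 mA, whereas the series resistor caps it at roughly 21 mA, and even a short
circuit across the jack can then only draw V/Rs, which is presumably the kind
of protection being described here.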