Glossary definition of 'Total Harmonic Distortion Plus Noise' (THD+N): A measure of the distortion and noise in a signal, obtained by measuring what is left after the wanted signal is filtered out. It is usually expressed as a percentage or dB ratio of the wanted signal level.
THD+N is traditionally measured using a sine wave at a set frequency. A notch filter is used to remove this frequency, and the remaining signal, which consists of distortion, noise, hum and so on, is measured. The level of this residue can be expressed in absolute units such as volts, but it is more common to express it as a percentage or dB ratio of the level of the original signal. In this way, 0.1% THD+N means that the level of the distortion and noise in a signal is 0.1% of the desired signal level. Note, however, that for this statement to be meaningful it must also state the bandwidth of the measurement and the frequency and amplitude of the test signal. The screenshot below shows the FFT spectrum of a 1kHz sine wave (red trace) driven into clipping. To the right of the fundamental are all the distortion harmonics, and to its left are some mains-frequency components. In blue is the filter response, clearly showing the fundamental being notched out, along with a 22Hz high-pass and 22kHz low-pass filter.
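The percentage and dB forms of a THD+N figure are two expressions of the same ratio. A minimal sketch of the conversion (the function name is illustrative, not part of any dScope API):

```python
import math

def thdn_percent_to_db(percent):
    """Convert a THD+N figure in percent to a dB ratio relative to the signal."""
    return 20 * math.log10(percent / 100.0)

# 0.1% THD+N corresponds to a residue 60 dB below the signal level:
print(round(thdn_percent_to_db(0.1), 1))  # -60.0
```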
dScope can measure THD+N either in the time domain using the Continuous Time Detector (as shown above), or in the frequency domain using an FFT detector. With an FFT detector, a much sharper filter can be used to remove the fundamental frequency completely.
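The FFT-domain approach can be sketched as follows, assuming a hypothetical test signal (1kHz sine plus a little third-harmonic distortion and noise) and a coherent one-second capture so the fundamental lands on a single bin. The 22Hz-22kHz band comes from the measurement above; the 10Hz notch half-width is an illustrative assumption, not a dScope setting:

```python
import numpy as np

fs = 48000          # sample rate (Hz)
n = 48000           # 1-second capture -> 1 Hz FFT bin spacing
f0 = 1000           # fundamental frequency (Hz)
t = np.arange(n) / fs

# Hypothetical test signal: 1 kHz sine, small 3rd harmonic, low-level noise
rng = np.random.default_rng(0)
x = np.sin(2 * np.pi * f0 * t) + 0.001 * np.sin(2 * np.pi * 3 * f0 * t)
x = x + 1e-4 * rng.standard_normal(n)

spec = np.fft.rfft(x) / n
freqs = np.fft.rfftfreq(n, 1 / fs)

# Total RMS within the 22 Hz - 22 kHz measurement bandwidth
band = (freqs >= 22) & (freqs <= 22000)
total = np.sqrt(np.sum(np.abs(spec[band]) ** 2))

# "Notch out" the fundamental by excluding the bins around 1 kHz,
# then measure everything that is left (distortion + noise)
notch = band & (np.abs(freqs - f0) > 10)
residual = np.sqrt(np.sum(np.abs(spec[notch]) ** 2))

thdn_percent = 100 * residual / total
print(f"THD+N = {thdn_percent:.3f}%")
```

Because the exclusion is a hard cut on FFT bins, the "filter" is far sharper than any analogue notch, which is the advantage of the FFT detector noted above.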
Note also that THD+N is often incorrectly referred to as "THD". THD is a more specialised measurement in which, rather than filtering out the fundamental and measuring everything that is left, narrow band-pass filters are placed on the distortion harmonics and these are measured in isolation, greatly reducing the impact of noise on the measurement. This is most easily achieved in the frequency domain.
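The distinction can be sketched in the same FFT framework: instead of summing everything outside the notch, sum only the bins at the harmonic frequencies. This is an illustrative sketch (the function and signal are hypothetical, not dScope's implementation), assuming a coherent capture so each harmonic falls exactly on one bin:

```python
import numpy as np

def thd_percent(x, fs, f0, n_harmonics=5):
    """THD from an FFT: measure only the harmonic bins, ignoring broadband noise.

    Assumes x contains an integer number of cycles of f0 (coherent capture),
    so the fundamental and each harmonic land exactly on single FFT bins.
    """
    spec = np.abs(np.fft.rfft(x)) / len(x)
    bin_hz = fs / len(x)
    fund = spec[int(round(f0 / bin_hz))]
    harms = [spec[int(round(h * f0 / bin_hz))] for h in range(2, 2 + n_harmonics)]
    return 100 * np.sqrt(sum(h ** 2 for h in harms)) / fund

# Hypothetical signal: 1 kHz sine with 0.2% second-harmonic distortion
fs, n, f0 = 48000, 48000, 1000
t = np.arange(n) / fs
x = np.sin(2 * np.pi * f0 * t) + 0.002 * np.sin(2 * np.pi * 2 * f0 * t)
print(f"THD = {thd_percent(x, fs, f0):.3f}%")  # close to 0.200%
```

Noise between the harmonic bins is simply never summed, which is why THD read this way sits below the THD+N figure for the same device.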