Recordings and timing and phase linearity
- Has it not occurred to people that recordings
are practically NEVER like this (phase linear) if they are made
with spaced microphones?
Audiophiles rave on about phase and timing while
also raving on about Mercuries and RCAs and Telarcs.
A recording where the microphones are separated
by substantial distances--anything over a few inches (even ORTF, with only 17 cm of spacing, has phase errors on playback below 700 Hz)--
has already made a complete hash of the phase of signals
at anything but extremely deep bass--unless the signal is
picked up by only one of the microphones, which NEVER HAPPENS.
If you remember the high school trigonometry formulas,
write down what happens when you add a sine wave to another sine
wave of the same frequency but with a substantial time delay.
Phase shift city! Except if you luck out on the relationship
between the period and the time delay.
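For the skeptical, the sum-to-product identity makes this concrete (here ω is the angular frequency and τ the delay between the two microphone paths):

```latex
\sin(\omega t) + \sin\bigl(\omega (t - \tau)\bigr)
  = 2 \cos\!\left(\frac{\omega \tau}{2}\right)
    \sin\!\left(\omega t - \frac{\omega \tau}{2}\right)
```

Same frequency, but the amplitude is scaled by 2cos(ωτ/2) and the phase is shifted by ωτ/2--both depend on the delay, so the shift lands wherever the spacing and the source position put it.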
This is what happens when two sine wave signals of the same
frequency are added together--one still gets a sine wave
of that same frequency, no harmonics in the combination,
but the PHASE (and amplitude) shifts!
Believe me, I know what I am talking about here. This is straight mathematics.
Only by fortunate accident could the resulting sum be of the same phase as the original signal.
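A quick numerical sketch of the claim--the test-tone frequency and the delay here are arbitrary choices, not taken from any particular recording:

```python
import numpy as np

f = 1000.0                     # arbitrary test-tone frequency, Hz
tau = 0.3e-3                   # arbitrary inter-microphone delay, s
w = 2 * np.pi * f
t = np.linspace(0.0, 0.01, 10_000)

# Sum of the direct signal and its delayed copy
summed = np.sin(w * t) + np.sin(w * (t - tau))

# The sum-to-product identity says the result is one sine of the
# SAME frequency, amplitude 2*cos(w*tau/2), phase shifted by w*tau/2
predicted = 2 * np.cos(w * tau / 2) * np.sin(w * t - w * tau / 2)

print(np.allclose(summed, predicted))   # -> True: no new harmonics, just a phase/amplitude change
```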
Spaced miking makes a hash of phase everywhere except in the deep deep deep bass (where the spacing is small compared to the wavelength).
And yet Telarcs were adored by almost everyone--including the same
audiophiles who like to carry on about phase. Does this not give people pause?
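To put numbers on "small compared to the wavelength", here is a rough worst-case calculation for an ORTF-like 17 cm spacing. It assumes sound arriving along the microphone axis, so the path difference equals the full spacing--a simplifying assumption, since off-axis sources see a smaller difference:

```python
import math

C = 343.0     # speed of sound in air, m/s (room temperature)
D = 0.17      # ORTF-style capsule spacing, m

for freq in (50, 200, 700, 2000):
    wavelength = C / freq
    # worst case: path difference between capsules equals the full spacing
    phase_deg = 360.0 * D / wavelength
    print(f"{freq:5d} Hz: {phase_deg:6.1f} degrees of phase error")
```

At 50 Hz the worst-case error is under 10 degrees--inaudibly small--but by 700 Hz it is already over 120 degrees, and by 2 kHz it approaches a full cycle.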