Abstract:
Theory [O. Rudenko and A. Chirkin, Sov. Phys. JETP 40, 945 (1975)] predicts, and experiment [Larraza et al., J. Acoust. Soc. Am. 100, 3554 (1996)] confirms, that the absorption of a small-amplitude monofrequency signal by finite-amplitude shockless noise in one dimension causes the signal amplitude to decrease as a Gaussian in the distance from the source. Numerical simulations based on Riemann's exact implicit solution for the unidirectional propagation of sound are presented, in which the noise at the source is composed of 50 equally spaced, equal-amplitude frequency components with random phases. The Gaussian attenuation is confirmed and extended to the case in which the signal is injected downstream from the noise source. When the phases of the spectral components of the noise are equal, the attenuation of the signal is dramatically reduced. The extent to which the Gaussian attenuation is restored by small random variations of the amplitudes, phases, and frequencies of the spectral components of the noise is investigated. Preliminary results show that the degree of restoration is governed by the statistical nature of the noise at the source, specifically by the degree to which the instantaneous amplitude is normally distributed. [Work supported by ONR.]
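
The simulation setup described above admits a compact numerical sketch. The Python script below is an illustrative reconstruction, not the reported code: it assumes the implicit Riemann (Earnshaw) solution for a lossless forward-traveling simple wave in retarded time, u(x, tau) = u_src(tau + beta*x*u/c0^2) with tau = t - x/c0, solved by fixed-point iteration (valid only for shockless propagation), driven by a 50-component, equally spaced, equal-amplitude, random-phase noise comb plus a weak monofrequency signal at the source. All parameter values (c0, beta, f0, u_noise, f_sig, u_sig, and the propagation distances) are assumptions chosen for illustration and are not those of the reported simulations.

```python
# A minimal sketch, not the reported code: implicit Riemann (Earnshaw)
# solution for a lossless forward-traveling simple wave in retarded time
# tau = t - x/c0,
#     u(x, tau) = u_src(tau + beta * x * u / c0**2),
# solved by fixed-point iteration, which converges only for shockless
# propagation (contraction requires beta*x*max|du_src/dtau|/c0**2 < 1).
# All parameter values below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

c0 = 343.0        # small-signal sound speed in air [m/s] (assumed)
beta = 1.2        # coefficient of nonlinearity for air (assumed)
f0 = 100.0        # spacing of the noise comb [Hz] (assumed)
n_comp = 50       # 50 equally spaced, equal-amplitude noise components
u_noise = 0.1     # per-component noise velocity amplitude [m/s] (assumed)
f_sig = 200 * f0  # signal frequency, above the noise band [Hz] (assumed)
u_sig = 1e-3      # small-amplitude signal [m/s] (assumed)

T = 1.0 / f0                                        # period of the comb
tau = np.linspace(0.0, T, 2 ** 13, endpoint=False)  # retarded-time grid
phases = rng.uniform(0.0, 2.0 * np.pi, n_comp)      # random source phases

def make_source(with_signal):
    """Source waveform at x = 0: noise comb, optionally plus the signal."""
    k = np.arange(1, n_comp + 1)[:, None]
    def u_src(t):
        noise = (u_noise * np.sin(2.0 * np.pi * f0 * k * t
                                  + phases[:, None])).sum(axis=0)
        return noise + (u_sig * np.sin(2.0 * np.pi * f_sig * t)
                        if with_signal else 0.0)
    return u_src

def propagate(u_src, x, n_iter=100):
    """Waveform at distance x via iteration of u = u_src(tau + beta*x*u/c0**2)."""
    u = u_src(tau)
    for _ in range(n_iter):
        u = u_src(tau + beta * x * u / c0 ** 2)
    return u

def coherent_signal_amplitude(x):
    """Amplitude of the spectral line at f_sig, with the noise-only field
    subtracted so that noise harmonics do not contaminate the estimate."""
    diff = (propagate(make_source(True), x)
            - propagate(make_source(False), x))
    spec = 2.0 * np.abs(np.fft.rfft(diff)) / diff.size
    return spec[int(round(f_sig / f0))]

# Signal line versus distance; per the abstract this should fall off
# roughly as a Gaussian in x (distances in metres are assumed values).
for x in (0.0, 0.5, 1.0, 1.5):
    print(f"x = {x:3.1f} m   signal amplitude = {coherent_signal_amplitude(x):.4e}")
```

Setting all entries of phases to a single common value reproduces the equal-phase case that the abstract contrasts with the random-phase case, and perturbing the component amplitudes, phases, or frequencies before propagation gives a way to explore how the Gaussian attenuation is restored.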