Digital Signal Processing
A simple method is presented for adaptively canceling sinusoidal disturbances with known frequencies in a time series. The system is characterized by phase and amplitude parameters, which are updated directly by a least mean square (LMS)-style algorithm. The computational complexity of the algorithm is proportional to the number of interfering sinusoids. Convergence behavior and the variances of the parameter estimates are derived and verified by computer simulations. It is also shown that the performance of the least squares realization of the interference canceller can attain the Cramér-Rao lower bound (CRLB).
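The paper itself updates amplitude and phase parameters directly; an equivalent and widely used formulation adapts an in-phase/quadrature weight pair per known frequency with an LMS update. The sketch below illustrates that general idea under this assumption (function name, step size, and demo signal are illustrative, not from the paper):

```python
import numpy as np

def lms_sine_canceller(x, freqs, fs, mu=0.005):
    """Remove sinusoids of known frequencies from x.

    One cos/sin reference pair per frequency; the weight pair
    encodes the amplitude and phase of each interferer, adapted
    with an LMS-style update. Complexity per sample is linear in
    the number of interfering sinusoids, as in the abstract.
    """
    n = np.arange(len(x))
    refs = []
    for f in freqs:
        w0 = 2 * np.pi * f / fs
        refs.append(np.cos(w0 * n))  # in-phase reference
        refs.append(np.sin(w0 * n))  # quadrature reference
    R = np.array(refs)               # shape (2K, N)
    w = np.zeros(R.shape[0])         # adaptive weights
    e = np.empty(len(x))
    for k in range(len(x)):
        r = R[:, k]
        y = w @ r                    # current interference estimate
        e[k] = x[k] - y              # error = cleaned output sample
        w += 2 * mu * e[k] * r       # LMS weight update
    return e, w

# Demo: a 50 Hz interferer of amplitude 1.5 and phase 0.7 rad
# buried in weak white noise, sampled at 1 kHz.
fs = 1000
n = np.arange(4000)
rng = np.random.default_rng(0)
noise = 0.01 * rng.standard_normal(len(n))
x = 1.5 * np.cos(2 * np.pi * 50 * n / fs + 0.7) + noise
e, w = lms_sine_canceller(x, [50.0], fs)
# After convergence the output power drops to near the noise floor,
# and hypot(w[0], w[1]) recovers the interferer's amplitude.
print(np.mean(e[-1000:] ** 2), np.hypot(w[0], w[1]))
```

The recovered amplitude is `hypot(w[0], w[1])` and the phase is `atan2(-w[1], w[0])`, which is how the weight-pair view maps back onto the amplitude/phase parameterization the abstract describes.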