Importance Sampling and Mean-Square Error in Neural Detector Training
Neural Processing Letters
Importance Sampling Techniques in Neural Detector Training
EMCL '01 Proceedings of the 12th European Conference on Machine Learning
Adaptive Importance Sampling Technique for Neural Detector Training
ICANN '02 Proceedings of the International Conference on Artificial Neural Networks
We present a stochastic technique for the efficient simulation of adaptive systems that employ diversity in the presence of frequency-nonselective slow Rayleigh fading and additive white Gaussian noise. Computational efficiency is achieved through importance sampling (IS). We use a stochastic gradient descent (SGD) algorithm to determine the near-optimal IS parameters that characterize the dominant fading process. After accounting for the overhead of the optimization algorithm, average speed-up factors of up to six orders of magnitude over conventional Monte Carlo (MC) simulation were attained for error probabilities as low as 10^-11 with a fourth-order diversity model.
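To illustrate why importance sampling yields such large speed-ups over conventional MC for rare-event probabilities, here is a minimal, self-contained sketch. It is not the authors' adaptive scheme for Rayleigh fading; it estimates the simpler rare-event probability P(X > t) for a standard Gaussian X, using an assumed biasing density shifted to the threshold (a mean-translation strategy) and reweighting each sample by the likelihood ratio of the nominal to the biasing density:

```python
import math
import random

def mc_estimate(threshold, n, seed=0):
    # Naive Monte Carlo estimate of P(X > threshold) for X ~ N(0, 1).
    # For a rare event, virtually no samples land past the threshold,
    # so the estimate is usually exactly zero.
    rng = random.Random(seed)
    hits = sum(1 for _ in range(n) if rng.gauss(0.0, 1.0) > threshold)
    return hits / n

def is_estimate(threshold, n, seed=0):
    # Importance-sampling estimate: draw from the biasing density
    # N(threshold, 1), so the rare region is hit about half the time,
    # then reweight each hit by the likelihood ratio f(x)/g(x) of the
    # nominal N(0, 1) density to the biasing density.
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = rng.gauss(threshold, 1.0)
        if x > threshold:
            # Ratio of the N(0, 1) and N(threshold, 1) densities at x;
            # the 1/sqrt(2*pi) normalizers cancel.
            w = math.exp(-0.5 * x * x + 0.5 * (x - threshold) ** 2)
            total += w
    return total / n
```

With threshold t = 5 (true tail probability about 2.9e-7), a naive MC run of 10^5 samples almost always returns zero, while the IS estimator with the same budget lands within a few percent of the true value. The mean shift used here is a fixed, hand-picked biasing parameter; the paper's contribution is precisely to tune such parameters automatically with SGD.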