Stochastic Resonance in Continuous and Spiking Neuron Models With Lévy Noise
IEEE Transactions on Neural Networks
We present several necessary and sufficient conditions, and a learning algorithm, for noise benefits in threshold neural signal detection based on error probabilities. The first condition ensures noise benefits in threshold detection of discrete binary signals and applies to noise types from scale families. It also gives an easy way to compute optimal noise values for scale-family noise densities with closed forms. A related condition ensures noise benefits in threshold detection of signals that have absolutely continuous distributions. This condition reduces to a simple weighted-derivative comparison of the signal densities at the detection threshold when the signal densities are continuously differentiable and the additive noise is either zero-mean discrete bipolar or finite-variance symmetric scale-family noise. A gradient-ascent learning algorithm can find the optimal noise value for thick-tailed stable densities and many other noise probability densities that lack a closed form.
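The noise benefit the abstract describes can be illustrated with a minimal sketch (not the paper's algorithm): a threshold detector must decide between an equiprobable "off" signal 0 and a subthreshold "on" signal a < θ. With no noise the detector never fires, so the error probability is 1/2; adding zero-mean Gaussian scale-family noise of standard deviation σ lowers the error until too much noise degrades it again. The signal level `a = 0.7` and threshold `theta = 1.0` below are arbitrary illustrative choices; the error probability follows from the Gaussian tail function Q.

```python
import math

def q(x):
    """Gaussian tail probability Q(x) = P(N(0,1) > x)."""
    return 0.5 * math.erfc(x / math.sqrt(2.0))

def error_prob(sigma, theta=1.0, a=0.7):
    """P(error) for equiprobable signals {0, a}; decide 'on' if sample > theta."""
    if sigma == 0.0:
        p_fa = 0.0 if theta > 0 else 1.0     # noise-free false-alarm rate
        p_miss = 1.0 if a < theta else 0.0   # subthreshold signal never fires
    else:
        p_fa = q(theta / sigma)              # noise alone crosses threshold
        p_miss = 1.0 - q((theta - a) / sigma)  # signal + noise stays below
    return 0.5 * (p_fa + p_miss)

# sweep the noise scale and locate the stochastic-resonance minimum
sigmas = [0.01 * k for k in range(1, 301)]
errs = [error_prob(s) for s in sigmas]
best = min(range(len(errs)), key=errs.__getitem__)
print(f"no-noise error: {error_prob(0.0):.3f}")
print(f"optimal sigma ~ {sigmas[best]:.2f}, error {errs[best]:.3f}")
```

The error curve is non-monotonic in σ: it falls from 0.5, bottoms out at a strictly positive noise level, and climbs back toward 0.5 as noise swamps the signal. The paper's gradient-ascent algorithm searches for that interior optimum directly, which matters for thick-tailed stable noise whose density has no closed form and so cannot be swept analytically as above.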