In this paper we develop the gamma neural model, a new neural net architecture for processing temporal patterns. Time-varying patterns are normally segmented into a sequence of static patterns that are successively presented to a neural net. In the approach presented here, segmentation is avoided: only current signal values are presented to the neural net, which adapts its own internal memory to store the past. Thus, in the gamma neural net, an adaptive short-term memory mechanism obviates a priori signal segmentation. We evaluate the relation between the gamma net and competing dynamic neural models. Interestingly, the gamma model brings many popular dynamic net architectures, such as the time-delay neural net and the concentration-in-time neural net, into a unifying framework. In fact, the gamma memory structure appears to be as general as a temporal convolution memory structure with an arbitrary time-varying weight kernel w(t). Yet the gamma model remains mathematically equivalent to the additive (Grossberg) model with constant weights. We present a backpropagation procedure to adapt the weights in a particular feedforward structure, the focused gamma net.
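To make the memory structure concrete, below is a minimal Python sketch of the discrete-time gamma memory as it is commonly formulated: a cascade of identical leaky integrators whose single feedback parameter is adapted along with the network weights. The function name, the depth K, and the parameter name mu are illustrative assumptions, not the paper's notation.

```python
import numpy as np

# Sketch of a discrete-time gamma memory (illustrative, not the paper's code).
# Tap recursion, with u[n] the current input sample:
#   x_0[n] = u[n]
#   x_k[n] = (1 - mu) * x_k[n-1] + mu * x_{k-1}[n-1],   k = 1..K
# mu is the adaptive memory parameter (0 < mu < 2 for stability).

def gamma_memory(u, K=4, mu=0.5):
    """Return the K+1 gamma tap signals for the input sequence u."""
    x = np.zeros(K + 1)          # tap states x_0 .. x_K
    taps = []
    for sample in u:
        x_prev = x.copy()        # states from the previous time step
        x[0] = sample            # tap 0 carries the raw input
        for k in range(1, K + 1):
            x[k] = (1 - mu) * x_prev[k] + mu * x_prev[k - 1]
        taps.append(x.copy())
    return np.array(taps)        # shape: (len(u), K+1)

# A "focused" gamma net feeds these taps into a static feedforward layer,
# so standard backpropagation adapts the layer weights while mu is adapted
# by gradient descent through the memory recursion.
taps = gamma_memory(np.sin(0.1 * np.arange(200)), K=4, mu=0.5)
print(taps.shape)  # (200, 5)
```

Note the special case mu = 1: the recursion collapses to an ordinary tapped delay line, which illustrates how the gamma model subsumes the time-delay neural net within the unifying framework the abstract describes.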