We propose a novel approach to building finite memory predictive models similar in spirit to variable memory length Markov models (VLMMs). The models are constructed by first transforming the n-block structure of the training sequence into a geometric structure of points in a unit hypercube, such that the longer the common suffix shared by two n-blocks, the closer their point representations lie. This transformation embodies a Markov assumption: n-blocks with long common suffixes are likely to produce similar continuations. Prediction contexts are then found by detecting clusters in the geometric n-block representation of the training sequence via vector quantization. We compare our model with both classical (fixed order) and variable memory length Markov models on five data sets with different memory and stochastic components. Fixed order Markov models (MMs) fail on three large data sets on which the advantage of allowing variable memory length can be exploited. On these data sets, our predictive models achieve performance superior or comparable to that of VLMMs, yet their construction is fully automatic, which is shown to be problematic for VLMMs. On one data set, VLMMs are outperformed by classical MMs, while our models perform significantly better than MMs. On the remaining data set, classical MMs outperform the variable context length strategies.
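The abstract describes the construction only at a high level. The sketch below shows one possible realisation under stated assumptions: a standard chaos-game style encoding (symbols mapped to hypercube corners, contraction factor 0.5, so blocks sharing a long suffix land close together), a plain k-means quantizer for finding the prediction contexts, and smoothed next-symbol counts per context. The function names (chaos_game_points, quantize_contexts, context_predictors) and all parameter values are illustrative placeholders, not the authors' implementation.

```python
import numpy as np
from collections import Counter

def chaos_game_points(sequence, alphabet, n):
    """Map each n-block of `sequence` to a point in the unit hypercube.

    Each symbol is assigned a distinct corner of [0, 1]^d and an n-block is
    encoded by iterating the contractive map x -> 0.5*x + 0.5*corner(symbol)
    over its symbols, so the most recent symbols (the suffix) dominate the
    final position: n-blocks with long common suffixes end up close together.
    """
    d = int(np.ceil(np.log2(len(alphabet)))) or 1
    corners = {s: np.array([(i >> b) & 1 for b in range(d)], dtype=float)
               for i, s in enumerate(alphabet)}
    points = []
    for t in range(n, len(sequence) + 1):
        x = np.full(d, 0.5)                      # start at the cube centre
        for s in sequence[t - n:t]:              # process the n-block left to right
            x = 0.5 * x + 0.5 * corners[s]
        points.append(x)
    return np.array(points)

def quantize_contexts(points, k, iters=50, seed=0):
    """Plain k-means: cluster the point cloud into k prediction contexts."""
    rng = np.random.default_rng(seed)
    centres = points[rng.choice(len(points), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((points[:, None] - centres[None]) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centres[j] = points[labels == j].mean(axis=0)
    return centres, labels

def context_predictors(sequence, labels, n, alphabet, smoothing=1.0):
    """Estimate a smoothed next-symbol distribution for every context.

    labels[i] is the cluster of the n-block ending at position n + i - 1;
    the symbol following that block is sequence[n + i] (when it exists).
    """
    counts = {}
    for i, c in enumerate(labels[:-1]):          # last block has no continuation
        counts.setdefault(c, Counter())[sequence[n + i]] += 1
    predictors = {}
    for c, cnt in counts.items():
        total = sum(cnt.values()) + smoothing * len(alphabet)
        predictors[c] = {s: (cnt[s] + smoothing) / total for s in alphabet}
    return predictors
```

In this sketch, prediction proceeds by mapping the current n-block to its point, assigning it to the nearest quantization centre, and reading off that context's next-symbol distribution; the number of contexts k plays the role that the context tree depth plays in a VLMM.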