An Efficient, Probabilistically Sound Algorithm for Segmentation and Word Discovery
Machine Learning - Special issue on natural language learning
Feature extraction through LOCOCODE
Neural Computation
Unsupervised discovery of morphemes
MPL '02 Proceedings of the ACL-02 workshop on Morphological and phonological learning - Volume 6
Unsupervised discovery of Persian morphemes
EACL '06 Proceedings of the Eleventh Conference of the European Chapter of the Association for Computational Linguistics: Posters & Demonstrations
An object-based visual attention model for robotic applications
IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics
A spiking neural network model of multi-modal language processing of robot instructions
Biomimetic Neural Learning for Intelligent Robots
A redundancy reduction strategy, which can be applied in stages, is proposed as a way to learn as efficiently as possible the statistical properties of an ensemble of sensory messages. The method works best for inputs consisting of strongly correlated groups, that is features, with weaker statistical dependence between different features. This is the case for localized objects in an image or for words in a text. A local feature measure determining how much a single feature reduces the total redundancy is derived, which turns out to depend only on the probability of the feature and of its components, but not on the statistical properties of any other features. The locality of this measure makes it ideal as the basis for a "neural" implementation of redundancy reduction, and an example of a very simple non-Hebbian algorithm is given. The effect of noise on learning redundancy is also discussed.
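The locality property described above can be illustrated with a minimal sketch: a score for a candidate feature that depends only on the empirical probability of the feature and of its components. The specific formula below (a pointwise-mutual-information-style quantity weighted by the feature's probability) is an illustrative assumption, not the measure derived in the paper.

```python
import math
from collections import Counter

def local_redundancy_score(samples, feature):
    """Score how much treating `feature` (a frozenset of components) as a
    single coded symbol would reduce redundancy.  The score uses only the
    empirical probability of the feature and of its individual components,
    never the statistics of other features -- the locality property the
    abstract describes.  The exact formula is a hypothetical illustration."""
    n = len(samples)
    # marginal occurrence counts for every component across all samples
    comp_counts = Counter(c for s in samples for c in set(s))
    # how often the whole feature co-occurs in one sample
    feat_count = sum(1 for s in samples if feature <= set(s))
    if feat_count == 0:
        return 0.0  # feature never occurs: no redundancy to remove
    p_f = feat_count / n
    # log of the product of the component marginals (independence baseline)
    log_prod = sum(math.log(comp_counts[c] / n) for c in feature)
    return p_f * (math.log(p_f) - log_prod)

# Strongly correlated components ("a" and "b" always co-occur) give a
# positive score; components paired only at chance score near zero.
correlated = [{"a", "b"}] * 50 + [{"c", "d"}] * 50
independent = [{"a", "c"}] * 25 + [{"a"}] * 25 + [{"c"}] * 25 + [set()] * 25
```

Because the score touches no statistics beyond the feature's own, each candidate feature can be evaluated independently, which is what makes a simple local ("neural") implementation plausible.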