Artificial neural network techniques have been successfully applied to vector quantization (VQ) encoding. The objective of VQ is to statistically preserve the topological relationships in a data set and to project the data onto a lower-dimensional lattice for visualization, compression, storage, or transmission. However, a major drawback in applying artificial neural networks is the difficulty of specifying a lattice structure that best preserves the topology of the data. To overcome this problem, in this paper we introduce merging algorithms for machine-fusion, boosting-fusion-based, and hybrid-fusion ensembles of SOM, NG, and GSOM networks. In these ensembles it is not the output signals of the base learners that are combined; instead, their architectures are merged. We empirically demonstrate the quality and robustness of the topological representations produced by our proposed algorithms on both synthetic and real benchmark datasets.
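To make the idea concrete, the following is a minimal, illustrative sketch (not the paper's actual fusion algorithms): it trains two small self-organizing maps on the same data and then "merges" them at the architecture level by pairing each neuron of one map with its nearest neuron in the other and averaging the paired codebook vectors. The function names `train_som` and `fuse_soms`, the grid size, and the decay schedules are all assumptions made for this sketch only.

```python
import numpy as np

def train_som(data, grid=(4, 4), epochs=30, seed=0):
    """Train a minimal 2-D self-organizing map (SOM) on `data`.
    Returns the codebook: one weight vector per lattice unit."""
    rng = np.random.default_rng(seed)
    n_units = grid[0] * grid[1]
    # lattice coordinates of each neuron, used by the neighborhood function
    coords = np.array([(i, j) for i in range(grid[0])
                               for j in range(grid[1])], dtype=float)
    weights = rng.normal(size=(n_units, data.shape[1]))
    for epoch in range(epochs):
        lr = 0.5 * (1 - epoch / epochs)                 # decaying learning rate
        sigma = max(1.0 * (1 - epoch / epochs), 0.3)    # decaying radius
        for x in rng.permutation(data):
            bmu = np.argmin(((weights - x) ** 2).sum(axis=1))  # best-matching unit
            d2 = ((coords - coords[bmu]) ** 2).sum(axis=1)     # lattice distances
            h = np.exp(-d2 / (2 * sigma ** 2))                 # Gaussian neighborhood
            weights += lr * h[:, None] * (x - weights)
    return weights

def fuse_soms(w_a, w_b):
    """Fuse two SOM codebooks: pair each neuron of map A with its
    nearest neuron in map B and average the paired weight vectors.
    (A toy stand-in for the architecture-level merging in the paper.)"""
    fused = np.empty_like(w_a)
    for i, wa in enumerate(w_a):
        j = np.argmin(((w_b - wa) ** 2).sum(axis=1))
        fused[i] = (wa + w_b[j]) / 2.0
    return fused
```

The point of the sketch is the contrast the abstract draws: nothing here votes over or averages the *outputs* of the base learners at prediction time; the two trained lattices themselves are merged into a single codebook, which then acts as one quantizer.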