The application of Radial Basis Function neural networks to domains involving prediction and classification of symbolic data requires a careful reconsideration and definition of the concept of distance between patterns. Besides conveying the proximity of patterns, this distance must satisfy certain mathematical criteria in order to be applicable. Traditional distances are inadequate for assessing the differences between symbolic patterns. This work proposes a statistically extracted distance measure for Generalized Radial Basis Function (GRBF) networks. The main properties of these networks are retained in the new metric space; in particular, their regularization potential can be realized with this type of distance. However, in applications involving symbolic patterns, the examples of the training set are not all of equal importance and reliability. The construction of effective decision boundaries should therefore account for the numerous exceptions to the general classification motifs that are frequently encountered in data mining applications. The paper argues that heuristic Instance Based Learning (IBL) training approaches can uncover information within the uneven structure of the training set. This information is exploited to select an adequate subset of the training patterns to serve as RBF centers and to estimate effective parameter settings for those centers. The IBL learning steps are applicable in both the traditional and the statistical distance metric spaces and significantly improve performance in both cases. The results obtained with this two-level learning method are significantly better than those of traditional nearest neighbor schemes on many data mining problems.
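As a concrete illustration of a statistically extracted distance for symbolic attributes, the dissimilarity between two attribute values can be estimated from their class-conditional frequencies in the training set, in the spirit of the Value Difference Metric (VDM). The sketch below is an illustrative assumption, not necessarily the exact measure used in the paper; the dataset, function name, and exponent `q` are hypothetical.

```python
from collections import Counter, defaultdict

def vdm_distance(X, y, a_index, v1, v2, q=2):
    """Statistical (VDM-style) distance between two symbolic values
    v1 and v2 of attribute a_index, estimated from training data (X, y).
    Values are close when they induce similar class distributions."""
    counts = defaultdict(Counter)   # value -> Counter over class labels
    totals = Counter()              # value -> total number of occurrences
    for row, label in zip(X, y):
        v = row[a_index]
        counts[v][label] += 1
        totals[v] += 1
    # Sum the differences of the class-conditional probabilities
    d = 0.0
    for c in set(y):
        p1 = counts[v1][c] / totals[v1] if totals[v1] else 0.0
        p2 = counts[v2][c] / totals[v2] if totals[v2] else 0.0
        d += abs(p1 - p2) ** q
    return d

# Toy symbolic dataset: one attribute ("color"), two classes
X = [["red"], ["red"], ["red"], ["blue"], ["blue"], ["green"]]
y = ["A", "A", "B", "B", "B", "A"]
```

A value is at distance zero from itself, while "red" (mostly class A) and "blue" (only class B) are far apart; such pairwise value distances can then be combined per attribute into a full pattern distance for the GRBF kernel.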