A valuation of state of object based on weighted Mahalanobis distance. Pattern Recognition.
Generalization of the Mahalanobis distance in the mixed case. Journal of Multivariate Analysis.
Prediction of Euclidean distances with discrete and continuous outcomes. Journal of Multivariate Analysis.
Maximum trimmed likelihood estimator for multivariate mixed continuous and categorical data. Computational Statistics & Data Analysis.
Bearing similarity measures for self-organizing feature maps. IDEAL'05: Proceedings of the 6th International Conference on Intelligent Data Engineering and Automated Learning.
A distance for mixed nominal, ordinal, and continuous data is developed by applying the Kullback-Leibler divergence to the general mixed-data model, an extension of the general location model that allows ordinal variables to be incorporated. The resulting distance can be regarded as a generalization of the Mahalanobis distance to data with a mixture of nominal, ordinal, and continuous variables. Moreover, it includes as special cases the earlier Mahalanobis-type distances developed by Bedrick et al. (Biometrics 56 (2000) 394) and Bar-Hen and Daudin (J. Multivariate Anal. 53 (1995) 332). Asymptotic results for the maximum likelihood estimator of the distance are discussed. The results of a simulation study on the level and power of the associated tests are reported, and a real-data example illustrates the method.
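As context for the continuous-only special case mentioned above, the classical Mahalanobis distance between two mean vectors under a common covariance matrix can be sketched as follows. This is only a minimal illustration of the baseline being generalized, not the paper's mixed-data estimator; the function name and variables are hypothetical:

```python
import numpy as np

def mahalanobis(mu1, mu2, cov):
    """Classical Mahalanobis distance between mean vectors mu1 and mu2
    that share the covariance matrix `cov` (continuous-only case)."""
    diff = np.asarray(mu1, dtype=float) - np.asarray(mu2, dtype=float)
    # Solve cov @ x = diff instead of forming the explicit inverse.
    return float(np.sqrt(diff @ np.linalg.solve(cov, diff)))

# Connection to the Kullback-Leibler divergence used in the paper:
# for two Gaussians N(mu1, cov) and N(mu2, cov) with equal covariance,
# the symmetrized divergence KL(P||Q) + KL(Q||P) equals the squared
# Mahalanobis distance (mu1 - mu2)' cov^{-1} (mu1 - mu2).
```

For example, with the identity covariance the distance reduces to the Euclidean distance: `mahalanobis([0, 0], [1, 1], np.eye(2))` returns `sqrt(2)`.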