In the real world, it is common for different experts to hold different opinions on the same problem due to their differing experience, and reaching a consistent decision becomes a critical issue. As an example, patient similarity assessment is an important task in the context of patient cohort identification for comparative effectiveness studies and clinical decision-support applications. The goal is to derive a clinically meaningful distance metric that measures the similarity between patients represented by their key clinical indicators. It is desirable to learn the distance metric based on experts' knowledge of clinical similarity among subjects. However, different physicians often have different understandings of patient similarity based on the specifics of the cases, so the distance metric learned for each individual physician captures only a limited view of the true underlying metric. The key challenge is how to integrate the individual distance metrics obtained from a group of physicians into a globally consistent unified metric. To achieve this goal, we propose the composite distance integration (Comdi) approach in this paper. Comdi first constructs discriminative neighborhoods from each individual metric, then combines all discriminative information in those neighborhoods to learn a single optimal distance metric. We formulate Comdi as a quadratic optimization problem and propose an efficient alternating strategy to find the solution. Besides learning a globally consistent metric, Comdi provides an elegant way to share knowledge across multiple experts without sharing the underlying data, which lowers the risk of disclosing private data. Our experiments on several benchmark data sets show approximately 10% improvement in classification accuracy over baseline methods, which suggests that Comdi is an effective and general metric learning approach. We also demonstrate two case studies applying Comdi to real-world clinical data sets.
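The core idea described above, combining several expert-specific metrics into one globally consistent metric via an iterative optimization, can be illustrated with a minimal sketch. This is not the paper's actual Comdi formulation (which operates on discriminative neighborhoods and a quadratic program); here, as a simplifying assumption, the combined metric is restricted to a convex combination M = Σₖ αₖ Mₖ of the experts' Mahalanobis matrices, with the weights α learned by projected gradient descent onto the probability simplex. The helper names `project_simplex` and `combine_metrics` are hypothetical.

```python
import numpy as np

def project_simplex(v):
    # Euclidean projection of v onto the probability simplex
    # (sort-based algorithm; keeps weights nonnegative, summing to 1).
    u = np.sort(v)[::-1]
    css = np.cumsum(u)
    rho = np.nonzero(u * np.arange(1, len(v) + 1) > (css - 1))[0][-1]
    theta = (css[rho] - 1) / (rho + 1.0)
    return np.maximum(v - theta, 0.0)

def combine_metrics(metrics, sim_pairs, dis_pairs, X, lr=0.1, iters=100):
    """Learn convex weights alpha so that M = sum_k alpha_k * M_k pulls
    similar pairs together and pushes dissimilar pairs apart.
    metrics   : list of K symmetric PSD matrices (one per expert)
    sim_pairs : index pairs (i, j) that should be close under M
    dis_pairs : index pairs (i, j) that should be far apart under M
    """
    K = len(metrics)
    alpha = np.full(K, 1.0 / K)

    def pair_dists(pairs):
        # Squared Mahalanobis distance of each pair under each expert metric.
        return np.array([[(X[i] - X[j]) @ Mk @ (X[i] - X[j]) for Mk in metrics]
                         for i, j in pairs])

    S = pair_dists(sim_pairs)   # shape (num_sim_pairs, K)
    D = pair_dists(dis_pairs)   # shape (num_dis_pairs, K)
    for _ in range(iters):
        # Gradient of (mean similar distance - mean dissimilar distance) w.r.t. alpha.
        grad = S.mean(axis=0) - D.mean(axis=0)
        alpha = project_simplex(alpha - lr * grad)
    return alpha, sum(a * M for a, M in zip(alpha, metrics))
```

For example, with two expert metrics, one attending only to a class-separating feature and one attending only to a noise feature, the learned weights concentrate on the discriminative metric, mimicking how a globally consistent metric should favor the experts whose neighborhoods are most discriminative.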
© 2012 Wiley Periodicals, Inc. Statistical Analysis and Data Mining 5: 54–69, 2012