A good distance metric for the input data is crucial in many pattern recognition and machine learning applications. Past studies have demonstrated that learning a metric from labeled samples can significantly improve the performance of classification and clustering algorithms. In this paper, we investigate the problem of learning a distance metric that measures the semantic similarity of input data for regression problems; the particular application we consider is human age estimation. Our guiding principle for learning the distance metric is to preserve local neighborhoods, defined by a specially designed distance, while maximizing the distances between data that do not lie in the same neighborhood in the semantic space. Without any assumption about the structure or distribution of the input data, we show that this can be done by semidefinite programming. Furthermore, the low-level feature space can be mapped to the high-level semantic space by a linear transformation at very low computational cost. Experimental results on the publicly available FG-NET database show that 1) the learned metric correctly discovers the semantic structure of the data even when the amount of training data is small, and 2) the learned metric yields a significant improvement over the traditional Euclidean metric for regression. Most importantly, simple regression methods such as k-nearest neighbors (kNN), combined with our learned metric, become quite competitive in accuracy (and sometimes even superior) compared with state-of-the-art human age estimation approaches.
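To make the idea concrete, the sketch below shows how a learned linear map L induces a Mahalanobis-style metric d(x, y) = ||Lx - Ly|| that a plain kNN regressor can then use. The function name, the toy data, and the hand-picked L are all hypothetical illustrations; the paper's actual L would come from the semidefinite program described above, which is not reproduced here.

```python
import numpy as np

def mahalanobis_knn_regress(X_train, y_train, x_query, L, k=3):
    """kNN regression under the metric induced by a linear map L:
    d(x, y) = ||L x - L y||_2, i.e. Mahalanobis distance with M = L^T L."""
    Z = X_train @ L.T              # map training data to the learned space
    z = L @ x_query                # map the query the same way
    d = np.linalg.norm(Z - z, axis=1)
    nn = np.argsort(d)[:k]         # indices of the k nearest neighbors
    return y_train[nn].mean()      # simple kNN average as the prediction

# Toy demo (hypothetical data): the target depends only on feature 1,
# while feature 2 is pure noise. A learned map that down-weights the
# noisy dimension gives more semantically meaningful neighborhoods.
rng = np.random.default_rng(0)
ages = rng.uniform(5, 60, size=50)
X = np.column_stack([ages / 60.0, rng.normal(size=50)])
L = np.diag([1.0, 0.1])            # stand-in for an SDP-learned map
pred = mahalanobis_knn_regress(X, ages, X[0], L, k=5)
```

With L = I this reduces to ordinary Euclidean kNN, which is the baseline the paper's learned metric is compared against.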