Nonlinear component analysis as a kernel eigenvalue problem. Neural Computation.
Multi-Objective Optimization Using Evolutionary Algorithms.
Laplacian Eigenmaps for dimensionality reduction and data representation. Neural Computation.
Kernel Methods for Pattern Analysis.
Learning a kernel matrix for nonlinear dimensionality reduction. ICML '04: Proceedings of the Twenty-First International Conference on Machine Learning.
IEEE Transactions on Pattern Analysis and Machine Intelligence.
Unsupervised Learning of Image Manifolds by Semidefinite Programming. International Journal of Computer Vision.
An introduction to nonlinear dimensionality reduction by maximum variance unfolding. AAAI'06: Proceedings of the 21st National Conference on Artificial Intelligence, Volume 2.
A fast and elitist multiobjective genetic algorithm: NSGA-II. IEEE Transactions on Evolutionary Computation.
Objective reduction in evolutionary multiobjective optimization: Theory and applications. Evolutionary Computation.
Constrained many-objective optimization: a way forward. CEC'09: Proceedings of the Eleventh Conference on Congress on Evolutionary Computation.
Multiobjective optimization: redundant and informative objectives. CEC'09: Proceedings of the Eleventh Conference on Congress on Evolutionary Computation.
A study of multiobjective metaheuristics when solving parameter scalable problems. IEEE Transactions on Evolutionary Computation.
Multi-objective control systems design with criteria reduction. SEAL'10: Proceedings of the 8th International Conference on Simulated Evolution and Learning.
EMO'11: Proceedings of the 6th International Conference on Evolutionary Multi-Criterion Optimization.
Discriminant sparse neighborhood preserving embedding for face recognition. Pattern Recognition.
Many-objective optimization using differential evolution with variable-wise mutation restriction. Proceedings of the 15th Annual Conference on Genetic and Evolutionary Computation.
International Journal of Hybrid Intelligent Systems.
In our recent publication [1], we noted that many real-world applications of multi-objective optimization involve a large number of objectives (10 or more), whereas existing evolutionary multi-objective optimization (EMO) methods have primarily been applied to problems with a smaller number of objectives (5 or fewer). After highlighting the major impediments in handling a large number of objectives, we proposed a principal component analysis (PCA) based EMO procedure for dimensionality reduction, whose efficacy was demonstrated by solving up to 50-objective optimization problems. Here, we address the fact that when the data points lie on a nonlinear manifold, or the data distribution is non-Gaussian, PCA, which yields a lower-dimensional 'linear' subspace, may be ineffective in revealing the underlying dimensionality. To overcome this, we propose two new nonlinear dimensionality reduction algorithms for evolutionary multi-objective optimization, namely C-PCA-NSGA-II and MVU-PCA-NSGA-II. While the former is based on the recently introduced correntropy PCA [2], the latter implements the maximum variance unfolding principle [3,4,5] in a novel way. We also establish the superiority of these new EMO procedures over the earlier PCA-based procedure, both in accuracy and computational time, by solving up to 50-objective optimization problems.
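To illustrate the linear-PCA baseline that the nonlinear methods improve upon, the following is a minimal sketch of PCA-style objective reduction: it estimates how many principal directions of the objective vectors are needed to explain most of the variance. The function name `pca_objective_reduction`, the variance threshold, and the toy data are illustrative assumptions, not the authors' actual procedure.

```python
import numpy as np

def pca_objective_reduction(F, var_threshold=0.95):
    """Estimate the number of principal components of the objective
    vectors (rows: solutions, columns: objectives) needed to capture
    var_threshold of the total variance. Illustrative sketch only."""
    cov = np.cov(F, rowvar=False)                 # covariance of objectives
    eigvals = np.linalg.eigvalsh(cov)[::-1]       # eigenvalues, descending
    ratios = np.cumsum(eigvals) / eigvals.sum()   # cumulative variance ratio
    k = int(np.searchsorted(ratios, var_threshold) + 1)
    return k, eigvals

# Toy example: 4 objectives, but the last two are linear combinations of
# the first two, so the essential (linear) dimensionality is 2.
rng = np.random.default_rng(0)
f12 = rng.normal(size=(200, 2))
F = np.hstack([f12, f12 @ np.array([[1.0, -0.5], [0.3, 2.0]])])
k, _ = pca_objective_reduction(F)
print(k)  # -> 2
```

Note that this linear analysis would report the full dimensionality for objectives related nonlinearly (e.g. lying on a curved manifold), which is precisely the failure mode the correntropy-PCA and MVU-based procedures are designed to address.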