We propose a method for classifying matrices. We use a linear classifier with a novel regularization scheme based on the spectral l1-norm (trace norm) of its coefficient matrix. This spectral regularization not only provides principled complexity control but also determines the rank of the coefficient matrix automatically. Using a linear matrix inequality (LMI) reformulation, we cast the inference task as a single convex optimization problem. We apply the method to motor-imagery EEG classification, where it not only outperforms conventional methods in classification accuracy but also identifies a subspace of the signal that concentrates the discriminative information, without any additional feature-extraction step. The method extends readily to regression problems by changing the loss function. Connections to other methods are also discussed.
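To make the idea concrete, here is a minimal sketch of trace-norm regularized learning for a linear matrix model. It is not the paper's LMI formulation or its loss: as an illustrative assumption it uses a squared loss and solves the convex problem by proximal gradient descent, where the proximal step is singular-value soft-thresholding. That step zeroes out small singular values, which is exactly how this kind of regularization determines the rank of the coefficient matrix automatically. All names, the data-generating setup, and the hyperparameters (`lam`, `step`) are illustrative.

```python
import numpy as np

def svt(W, tau):
    """Singular-value soft-thresholding: the prox operator of tau * trace norm."""
    U, s, Vt = np.linalg.svd(W, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

# Synthetic data: n matrix-valued inputs X_i, targets from a rank-1 W_true.
rng = np.random.default_rng(0)
n, r, c = 200, 6, 8
X = rng.standard_normal((n, r, c))
W_true = np.outer(np.ones(r), np.ones(c))        # rank-1 ground truth
y = np.einsum('nij,ij->n', X, W_true)            # y_i = <X_i, W_true>

# Proximal gradient descent on 0.5 * mean squared loss + lam * trace norm.
lam, step = 2.0, 0.3
W = np.zeros((r, c))
for _ in range(300):
    resid = np.einsum('nij,ij->n', X, W) - y     # residuals <X_i, W> - y_i
    grad = np.einsum('n,nij->ij', resid, X) / n  # gradient of the smooth part
    W = svt(W - step * grad, step * lam)         # gradient step, then SVT prox

# Small singular values were thresholded to zero, revealing a low rank.
rank = np.linalg.matrix_rank(W, tol=1e-3)
```

The same loop applies to classification by swapping the squared loss for a (smoothed) classification loss; only the gradient computation changes, while the soft-thresholding step, and hence the automatic rank selection, stays the same.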