Improved Group Sparse Classifier
Pattern Recognition Letters
A new classification assumption was recently proposed in [1]: the training samples of a particular class approximately form a linear basis for any test sample belonging to that class. The classification algorithm in [1] builds on the idea that all the correlated training samples from the correct class should jointly represent the test sample, and Lasso regularization was used to select representative training samples from the entire training set. Lasso, however, tends to select only a single sample from a group of correlated training samples and therefore does not promote representing the test sample in terms of all the training samples from the correct group. To overcome this problem, we propose two alternative regularization methods, Elastic Net and Sum-Over-l2-norm. Both favor the selection of multiple correlated training samples to represent the test sample. Experimental results on benchmark datasets show that our regularization methods yield better recognition rates than [1].
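The sparse-representation classification scheme the abstract describes can be sketched as follows: stack the training samples as columns of a dictionary, regress the test sample on that dictionary with a sparsity-promoting penalty, and assign the class whose samples best reconstruct the test sample. The sketch below uses scikit-learn's `Lasso` and `ElasticNet` estimators on synthetic data; all variable names, the toy data, and the hyperparameter values are illustrative assumptions, not the paper's actual experimental setup, and the Sum-Over-l2-norm penalty (a group-sparse regularizer) is omitted for brevity.

```python
import numpy as np
from sklearn.linear_model import Lasso, ElasticNet

rng = np.random.default_rng(0)
d, n_per_class = 50, 10

# Synthetic data (an assumption for illustration): samples within a class
# are correlated, being noisy copies of a shared class prototype.
base0 = rng.standard_normal(d)
base1 = rng.standard_normal(d)
X0 = base0[None, :] + 0.05 * rng.standard_normal((n_per_class, d))
X1 = base1[None, :] + 0.05 * rng.standard_normal((n_per_class, d))

A = np.vstack([X0, X1]).T                       # columns = training samples
labels = np.array([0] * n_per_class + [1] * n_per_class)
y = base0 + 0.05 * rng.standard_normal(d)       # test sample from class 0

def src_classify(model, A, y, labels):
    """Fit y ~ A @ w with the given regularized regressor, then pick the
    class whose training samples give the smallest reconstruction residual."""
    model.fit(A, y)
    w = model.coef_
    residuals = [np.linalg.norm(y - A @ np.where(labels == c, w, 0.0))
                 for c in np.unique(labels)]
    return int(np.argmin(residuals)), w

# fit_intercept=False keeps the model a pure linear combination of samples.
pred_lasso, w_lasso = src_classify(
    Lasso(alpha=0.01, fit_intercept=False), A, y, labels)
pred_enet, w_enet = src_classify(
    ElasticNet(alpha=0.01, l1_ratio=0.5, fit_intercept=False), A, y, labels)

print("Lasso prediction:", pred_lasso,
      "| nonzero coefficients:", np.count_nonzero(w_lasso))
print("Elastic Net prediction:", pred_enet,
      "| nonzero coefficients:", np.count_nonzero(w_enet))
```

On data like this, the Elastic Net's additional l2 penalty typically spreads nonzero weight across more of the correlated same-class columns than Lasso does, which is the grouping behavior the abstract argues for.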