Statistical learning methods are emerging as a valuable tool for decoding information from neural imaging data. The noisy signal and the limited number of training patterns typically recorded in functional brain imaging experiments pose a challenge for applying statistical learning methods to brain data. To overcome this difficulty, we propose using prior knowledge based on the behavioral performance of human observers to enhance the training of support vector machines (SVMs). We collect behavioral responses from human observers performing a categorization task during functional magnetic resonance imaging scanning. We use the psychometric function generated from the observers' behavioral choices as a distance constraint for training an SVM. We call this method the behavior-constrained SVM (BCSVM). Our findings confirm that BCSVM consistently outperforms the standard SVM.
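The abstract does not specify how the psychometric constraint enters the SVM optimization, so the following is only a minimal sketch of the general idea, under an assumed simplification: each training pattern is weighted by how reliably the observer categorized the corresponding stimulus, so that behaviorally ambiguous stimuli exert less influence on the decision boundary. The synthetic data, the `psycho` values, and the weighting rule are all illustrative assumptions, not the authors' actual BCSVM formulation.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Synthetic stand-in for voxel activation patterns from two stimulus
# categories: few samples, weak separation, as in noisy fMRI data.
n, d = 80, 20
X = rng.normal(size=(n, d))
y = rng.integers(0, 2, size=n)
X[y == 1] += 0.5

# Hypothetical psychometric values in [0.5, 1.0]: the observer's
# probability of categorizing each stimulus correctly (0.5 = chance).
psycho = 0.5 + 0.5 * rng.random(n)

# Assumed behavioral prior: rescale to [0, 1] and use as per-sample
# weights, so confidently categorized stimuli constrain the margin more.
weights = (psycho - 0.5) / 0.5

clf = SVC(kernel="linear", C=1.0)
clf.fit(X, y, sample_weight=weights)
print(clf.score(X, y))
```

Per-sample weighting via `sample_weight` is just one convenient proxy for a behavioral constraint; a closer match to a distance constraint would modify the margin requirement for each pattern directly, which standard scikit-learn SVMs do not expose.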