Numerous studies have reported that the characteristics of a dataset strongly affect the performance of different classifiers. This observation motivates a Bayesian approach: valuable quantitative features of a dataset can be captured and used to update the classification problem so as to guarantee good class separability. The purpose of this learning method is to exploit an attractive, pragmatic feature of the Bayesian approach, namely a quantitative description of the class imbalance problem. To this end, we discuss a programming problem that mixes in probability information: Bayesian Support Vector Machines (BSVMs). We first modify some of the objectives and constraints of the original programming problems and then examine what effect these changes have. Experiments on several existing datasets show that, when prior distributions are assigned to the programming problem, the estimated classification errors are reduced.
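The abstract does not spell out the BSVM formulation, but one common way to mix prior class-probability information into an SVM's programming problem is to make the slack penalty class-dependent, penalizing errors on the rare class more heavily. The sketch below illustrates this idea with scikit-learn's `class_weight` parameter; the dataset, the weighting rule (inverse prior probability), and all variable names are illustrative assumptions, not the paper's actual method.

```python
# Hedged sketch: a cost-sensitive SVM as a stand-in for incorporating prior
# class probabilities into the SVM programming problem on imbalanced data.
import numpy as np
from sklearn.svm import SVC
from sklearn.metrics import recall_score

rng = np.random.default_rng(0)

# Synthetic imbalanced two-class data: 500 majority vs. 25 minority samples.
X_maj = rng.normal(loc=0.0, scale=1.0, size=(500, 2))
X_min = rng.normal(loc=2.0, scale=1.0, size=(25, 2))
X = np.vstack([X_maj, X_min])
y = np.array([0] * 500 + [1] * 25)

# Empirical class priors, and per-class penalties C_k proportional to 1/P(k),
# so that errors on the rare class incur a larger slack penalty.
priors = {0: 500 / 525, 1: 25 / 525}
weights = {k: 1.0 / p for k, p in priors.items()}

plain = SVC(kernel="rbf", C=1.0).fit(X, y)
weighted = SVC(kernel="rbf", C=1.0, class_weight=weights).fit(X, y)

# Minority-class (training) recall typically improves once the
# prior-informed weights are added.
r_plain = recall_score(y, plain.predict(X))
r_weighted = recall_score(y, weighted.predict(X))
```

The weighting rule here plays the role the abstract attributes to the prior distributions: it injects quantitative knowledge of the class imbalance directly into the objective of the programming problem.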