This paper proposes a novel approach for directly tuning the Gaussian kernel matrix for one-class learning. The popular Gaussian kernel includes a free parameter, σ, whose value significantly affects model performance and is typically tuned through validation. This paper explores an automated method for tuning the kernel via a hill-climbing optimization of statistics computed from the kernel matrix.
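The abstract's idea can be sketched as follows. This is a minimal, hypothetical illustration, not the paper's actual method: the specific kernel-matrix statistic used here (variance of the off-diagonal entries, which vanishes as σ → 0 or σ → ∞ and peaks at an intermediate value) and the hill-climbing schedule are assumptions for demonstration only.

```python
import numpy as np

def gaussian_kernel_matrix(X, sigma):
    """Gaussian (RBF) kernel matrix K[i, j] = exp(-||x_i - x_j||^2 / (2 sigma^2))."""
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    d2 = np.maximum(d2, 0.0)  # guard against tiny negative values from round-off
    return np.exp(-d2 / (2.0 * sigma ** 2))

def kernel_statistic(K):
    """Illustrative statistic: variance of the off-diagonal kernel entries.
    (A stand-in for the statistics the paper optimizes, which are not given here.)"""
    off_diagonal = K[~np.eye(K.shape[0], dtype=bool)]
    return off_diagonal.var()

def hill_climb_sigma(X, sigma0=1.0, step=0.5, n_iter=50):
    """Simple hill climbing on sigma: try moving up and down by `step`,
    keep any move that improves the statistic, shrink the step when stuck."""
    sigma = sigma0
    best = kernel_statistic(gaussian_kernel_matrix(X, sigma))
    for _ in range(n_iter):
        improved = False
        candidates = (sigma + step, max(sigma - step, 1e-6))
        for cand in candidates:
            val = kernel_statistic(gaussian_kernel_matrix(X, cand))
            if val > best:
                sigma, best, improved = cand, val, True
        if not improved:
            step *= 0.5  # refine the search around the current sigma
        if step < 1e-4:
            break
    return sigma
```

On a small dataset, `hill_climb_sigma(X)` returns a σ at which the chosen kernel-matrix statistic is locally maximal, with no validation set required, which is the kind of direct, automated tuning the abstract describes.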