Algorithms for better representation and faster learning in radial basis function networks. Advances in Neural Information Processing Systems 2.
Least Squares Support Vector Machine Classifiers. Neural Processing Letters.
Swarm Intelligence.
Response Surface Methodology: Process and Product Optimization Using Designed Experiments.
The Equivalence of Support Vector Machine and Regularization Neural Networks. Neural Processing Letters.
Recursive Update Algorithm for Least Squares Support Vector Machines. Neural Processing Letters.
On the Kernel Widths in Radial-Basis Function Networks. Neural Processing Letters.
Digital Least Squares Support Vector Machines. Neural Processing Letters.
An Equivalence between SILF-SVR and Ordinary Kriging. Neural Processing Letters.
New Least Squares Support Vector Machines Based on Matrix Patterns. Neural Processing Letters.
Least Square Transduction Support Vector Machine. Neural Processing Letters.
Sequential Approximate Multiobjective Optimization Using Computational Intelligence.
Kernel Width Optimization for Faulty RBF Neural Networks with Multi-node Open Fault. Neural Processing Letters.
Sequential approximate multi-objective optimization using radial basis function network. Structural and Multidisciplinary Optimization.
This paper presents a simple method to estimate the width of the Gaussian kernel based on an adaptive scaling technique. The Gaussian kernel is widely employed in radial basis function (RBF) networks, support vector machines (SVM), least squares support vector machines (LS-SVM), Kriging models, and other machine learning techniques, and it is well known that the kernel width plays an important role in their performance. Determining the optimal width, however, is a time-consuming task, so it is preferable to estimate the width in a simple manner. In this paper, we first examine a simple estimate of the width proposed by Nakayama et al. Through this examination, four sufficient conditions for a simple estimate of the width are described. A new simple estimate of the width is then proposed, in which all dimensions are scaled equally; a simple technique called the adaptive scaling technique is developed for this purpose. The proposed method is expected to be applicable to a wide range of machine learning techniques employing the Gaussian kernel. Its validity is examined through examples.
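To make the setting concrete, the sketch below builds a Gaussian kernel (Gram) matrix after scaling every input dimension to a common range, so that no single dimension dominates the Euclidean distances inside the kernel. The width heuristic used here (the median pairwise distance) is a widely used, cheap substitute chosen only for illustration; it is not the estimate proposed in the paper, and the function names are ours.

```python
import numpy as np

def minmax_scale(X):
    """Scale each dimension of X (m samples x n dims) to [0, 1] so that
    all dimensions contribute equally to the distances in the kernel.
    This is a simple stand-in for the equal-scaling idea, not the paper's
    adaptive scaling technique."""
    lo, hi = X.min(axis=0), X.max(axis=0)
    span = np.where(hi > lo, hi - lo, 1.0)  # guard against constant dims
    return (X - lo) / span

def median_width(X):
    """Median pairwise Euclidean distance: a common, cheap width heuristic
    (illustrative only; NOT the estimate proposed in the paper)."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
    iu = np.triu_indices(len(X), k=1)  # distinct pairs only
    return float(np.sqrt(np.median(d2[iu])))

def gaussian_kernel(X, width):
    """Gram matrix K[i, j] = exp(-||x_i - x_j||^2 / (2 * width^2))."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * width ** 2))

if __name__ == "__main__":
    X = np.array([[0.0, 0.0], [1.0, 2.0], [3.0, 1.0], [2.0, 4.0]])
    Xs = minmax_scale(X)
    K = gaussian_kernel(Xs, median_width(Xs))
    print(K.shape)  # (4, 4), symmetric with ones on the diagonal
```

The resulting matrix K is what RBF networks, SVM/LS-SVM, and Kriging all consume, which is why a single simple width estimate can serve this whole family of techniques.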