Hybrid Wavelet Model Construction Using Orthogonal Forward Selection with Boosting Search

  • Authors:
  • Meng Zhang; Jiaogen Zhou; Lihua Fu; Tingting He

  • Affiliations:
  • Central China Normal University, 430079 Wuhan, China; Wuhan University, 430072 Wuhan, China; China University of Geosciences, 430079 Wuhan, China; Central China Normal University, 430079 Wuhan, China

  • Venue:
  • FSKD '07 Proceedings of the Fourth International Conference on Fuzzy Systems and Knowledge Discovery - Volume 03
  • Year:
  • 2007

Abstract

For non-flat function estimation problems, which are often encountered in science and engineering, conventional kernel methods are unsuitable: they adopt a single common variance for all kernel regressors and estimate both steep and smooth variations at one unchanged scale. This paper considers sparse regression modeling using a generalized kernel model in which each kernel regressor has its individually tuned center vector and diagonal covariance matrix. An orthogonal least squares forward selection procedure is employed to select the regressors one by one using a guided random search algorithm. To prevent possible over-fitting, a practical method for selecting the termination threshold is used. A novel hybrid wavelet kernel is constructed to make the model sparser. Experimental results show that this generalized model outperforms traditional methods in terms of precision and sparseness, and that models with the wavelet and hybrid kernels converge much faster than those with the conventional RBF kernel.

Recently, a revised version of SVR, multi-scale support vector regression (MSSVR) [4, 5], was proposed; it combines several feature spaces rather than the single feature space of standard SVR, with the multi-feature space induced by a set of kernels of different scales. MSSVR outperforms traditional methods in terms of precision and sparseness, which is also illustrated in our experiments. The kernel basis pursuit (KBP) algorithm [6] is another possible solution, which builds an ℓ1-regularized multiple-kernel estimator for regression. However, KBP is prone to over-fit noisy data, and we compare its performance with that of our new algorithm.
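The abstract does not reproduce the paper's exact hybrid wavelet form, boosting-search schedule, or threshold rule, so the following is only a minimal sketch of the general procedure it describes: orthogonal least squares forward selection over kernel regressors with individually tuned centers and diagonal covariances, chosen by a guided random search and stopped by an error-reduction threshold. The `hybrid_wavelet_kernel` form (a Mexican-hat wavelet factor under a Gaussian envelope), the candidate-sampling ranges, and the stopping test are all assumptions, not the authors' method.

```python
# Illustrative sketch only; kernel form, sampling ranges, and stopping
# rule are assumptions, not the paper's exact algorithm.
import numpy as np

def hybrid_wavelet_kernel(X, center, var):
    """One plausible hybrid kernel: a Mexican-hat wavelet factor times a
    Gaussian envelope, with a per-regressor diagonal covariance `var`."""
    r2 = np.sum((X - center) ** 2 / var, axis=-1)
    return (1.0 - r2) * np.exp(-0.5 * r2)

def ols_forward_selection(X, y, n_candidates=50, max_terms=30, tol=1e-4, seed=None):
    """Orthogonal least squares forward selection: at each stage, randomly
    sample candidate (center, variance) pairs (a stand-in for the paper's
    guided boosting search), orthogonalize each candidate column against
    the already-selected regressors, and keep the one with the largest
    normalized error-reduction ratio."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    y = y.astype(float)
    residual = y.copy()
    selected, Q = [], []  # chosen (center, var) pairs and orthonormal columns
    for _ in range(max_terms):
        best = None
        for _ in range(n_candidates):
            # Guided random search (simplified): draw a center from the
            # data and an independent diagonal scale for each dimension.
            center = X[rng.integers(n)]
            var = rng.uniform(0.05, 2.0, size=d)
            col = hybrid_wavelet_kernel(X, center, var)
            # Gram-Schmidt: remove components along selected regressors.
            for q in Q:
                col = col - (q @ col) * q
            norm = np.linalg.norm(col)
            if norm < 1e-12:
                continue  # candidate is linearly dependent on the model
            q = col / norm
            err_red = (q @ residual) ** 2 / (y @ y)  # error-reduction ratio
            if best is None or err_red > best[0]:
                best = (err_red, q, center, var)
        # Termination threshold guards against over-fitting the noise.
        if best is None or best[0] < tol:
            break
        _, q, center, var = best
        residual = residual - (q @ residual) * q
        Q.append(q)
        selected.append((center, var))
    # Recovering weights in the original (non-orthogonal) basis via
    # back-substitution is omitted for brevity.
    return selected
```

The orthogonalization step is what makes greedy one-at-a-time selection sound: each candidate's error-reduction ratio is measured on the component orthogonal to the regressors already in the model, so contributions do not double-count and regressors can be ranked independently at every stage.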