Regularized multi-view learning machine based on response surface technique

  • Authors:
  • Zhe Wang; Jin Xu; Songcan Chen; Daqi Gao

  • Affiliations:
  • Zhe Wang, Jin Xu, Daqi Gao: Department of Computer Science & Engineering, East China University of Science & Technology, Shanghai 200237, PR China; Songcan Chen: Department of Computer Science & Engineering, Nanjing University of Aeronautics & Astronautics, Nanjing 210016, PR China

  • Venue:
  • Neurocomputing
  • Year:
  • 2012


Abstract

Multi-view learning is designed to process data with multiple information sources. Our previous work extended multi-view learning and proposed an effective learning machine named MultiV-MHKS. MultiV-MHKS first decomposes a base classifier into M different sub-classifiers and then designs one joint learning process for the generated M sub-classifiers, each of which is taken as one view of MultiV-MHKS. However, MultiV-MHKS assumed that each sub-classifier plays an equal role in the ensemble; thus the weight values r_q, q=1...M, for the sub-classifiers were all set to the same value. In practice, this hypothesis is neither flexible nor appropriate, since the r_q should reflect the different contributions of their corresponding views. In order to make the r_q flexible and appropriate, in this paper we propose a regularized multi-view learning machine named RMultiV-MHKS with optimized r_q. Specifically, we optimize the r_q using the Response Surface Technique (RST) on cross-validation data and thus obtain a regularized multi-view learning machine. Doing so can assign a certain view a zero weight in the combination, which means that this specific view carries no discriminative information for the problem and hence can be pruned. The experimental results validate the effectiveness of the proposed RMultiV-MHKS and explore the effect of some important parameters. The characteristics of RMultiV-MHKS are: (1) it distributes more weight to the favorable views, which reflects the property of the problem; (2) it has a tighter generalization risk bound than its corresponding single-view learning machine in terms of the Rademacher complexity; (3) it achieves statistically superior classification performance to the original MultiV-MHKS.
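The two ideas in the abstract, a weighted combination of M sub-classifier views in which a zero weight prunes a view, and a response-surface fit to cross-validation error for choosing a weight, can be sketched minimally as follows. This is an illustrative sketch only: the scores, the candidate weight grid, the hypothetical CV errors, and the one-dimensional search are assumptions for exposition, not the paper's exact RST procedure.

```python
import numpy as np

# Step 1: M = 3 sub-classifier views score 4 samples; weights r_q combine them.
# All numbers below are made up for illustration.
scores = np.array([[ 0.9, -0.4,  0.6, -0.8],   # view 1
                   [ 0.7, -0.2,  0.1, -0.5],   # view 2
                   [-0.1,  0.0,  0.1,  0.0]])  # view 3 (weakly informative)
r = np.array([0.6, 0.4, 0.0])   # a zero weight prunes view 3 entirely
combined = r @ scores           # weighted ensemble decision values
pred = np.sign(combined)        # final class labels in {-1, +1}

# Step 2: response-surface idea for a single weight -- fit a quadratic
# surface to cross-validation error measured at a few candidate weight
# values, then take the analytic minimizer of the fitted surface
# (clipped to the feasible range [0, 1]).
r_grid = np.array([0.0, 0.25, 0.5, 0.75, 1.0])
cv_err = np.array([0.30, 0.22, 0.18, 0.20, 0.27])   # hypothetical CV errors
a, b, c = np.polyfit(r_grid, cv_err, deg=2)          # quadratic surface
r_star = float(np.clip(-b / (2.0 * a), 0.0, 1.0))    # vertex of the parabola
```

The appeal of the response-surface step is that a handful of cross-validation evaluations yields a smooth surrogate of the error whose minimizer is available in closed form, rather than exhaustively grid-searching the weight.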