Comparing Linear Feature Space Transformations for Correlated Features

  • Authors:
  • Daniel Vásquez, Rainer Gruhn, Raymond Brueckner, Wolfgang Minker

  • Affiliations:
  • Daniel Vásquez, Rainer Gruhn: Department of Information Technology, University of Ulm, Ulm, Germany; Harman/Becker Automotive Systems, Speech Dialog Systems, Ulm, Germany
  • Raymond Brueckner: Harman/Becker Automotive Systems, Speech Dialog Systems, Ulm, Germany
  • Wolfgang Minker: Department of Information Technology, University of Ulm, Ulm, Germany

  • Venue:
  • PIT '08: Proceedings of the 4th IEEE Tutorial and Research Workshop on Perception and Interactive Technologies for Speech-Based Systems (Perception in Multimodal Dialogue Systems)
  • Year:
  • 2008

Abstract

In automatic speech recognition, a common method to decorrelate features and to reduce feature space dimensionality is Linear Discriminant Analysis (LDA). In this paper, the performance of LDA is compared with other linear feature space transformation schemes, since many alternative methods have been suggested that, in some cases, yield higher recognition accuracy. Different approaches such as MLLT, HLDA, SHLDA, PCA, and combined schemes were implemented and compared. Experiments show that all methods lead to similar results. In addition, recent research has shown that the LDA algorithm is unreliable if its input features are strongly correlated. In this paper a stable solution to the correlated feature problem, consisting of a concatenation scheme with PCA and LDA, is proposed and verified. Finally, several transformation algorithms are evaluated on uncorrelated and strongly correlated features.
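
As a rough illustration of the proposed concatenation scheme, the sketch below chains PCA (to decorrelate the inputs) with LDA (to project onto a lower-dimensional, discriminative space). This is a minimal sketch using scikit-learn; the synthetic data, dimensionalities, and class labels are illustrative assumptions and are not taken from the paper.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.pipeline import make_pipeline

# Synthetic, strongly correlated features (illustrative only):
# three nearly identical 13-dim blocks stacked into a 39-dim vector.
rng = np.random.default_rng(0)
base = rng.standard_normal((1000, 13))
X = np.hstack([base,
               base + 0.01 * rng.standard_normal((1000, 13)),
               base + 0.01 * rng.standard_normal((1000, 13))])
y = rng.integers(0, 5, size=1000)  # hypothetical class labels (e.g. HMM states)

# Concatenated transform: PCA decorrelates (and optionally reduces) the
# features first, then LDA projects onto at most (n_classes - 1) axes.
pca_lda = make_pipeline(PCA(n_components=20),
                        LinearDiscriminantAnalysis(n_components=4))
X_proj = pca_lda.fit_transform(X, y)
print(X_proj.shape)  # (1000, 4)
```

Applying PCA before LDA removes the near-linear dependencies that can make the LDA estimate unstable, which is the motivation for the concatenation scheme described in the abstract.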