Discussion on score normalization and language robustness in text-independent multi-language speaker verification

  • Authors:
  • Jian Zhao; Yuan Dong; Xianyu Zhao; Hao Yang; Liang Lu; Haila Wang

  • Affiliations:
  • Beijing University of Posts and Telecommunications, Beijing, China; France Telecom Research & Development Center, Beijing, China and Beijing University of Posts and Telecommunications, Beijing, China; France Telecom Research & Development Center, Beijing, China; Beijing University of Posts and Telecommunications, Beijing, China; Beijing University of Posts and Telecommunications, Beijing, China; France Telecom Research & Development Center, Beijing, China

  • Venue:
  • ICIC'07: Proceedings of the 3rd International Conference on Intelligent Computing: Advanced Intelligent Computing Theories and Applications
  • Year:
  • 2007

Abstract

In the field of speaker recognition, score normalization is a widely used and effective technique for improving recognition performance, and it continues to be developed. In this paper, we focus on a comparison of several candidate score normalization methods and present a new implementation of speaker adaptive test normalization (ATnorm) based on a cross-similarity measurement, which does not require an extra corpus for speaker-adaptive impostor cohort selection. We also investigate the use of ATnorm to improve the language robustness of multi-language speaker verification. Experiments are conducted on the core task of the 2006 NIST Speaker Recognition Evaluation (SRE) corpus. The results indicate that all of the score normalization methods considered improve recognition performance, with ATnorm performing best. Moreover, ATnorm further contributes to performance as a means of improving language robustness.
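
As a rough illustration of the score normalization discussed above, the following is a minimal Python sketch of test normalization (Tnorm) and of a speaker-adaptive cohort variant in the spirit of ATnorm. The abstract does not specify the paper's cross-similarity measurement, so the cosine similarity between model parameter vectors, the function names, and the cohort size used here are illustrative assumptions, not the authors' implementation.

import numpy as np

def tnorm(raw_score, cohort_scores):
    # Tnorm: standardize the claimed-speaker score using the mean and
    # standard deviation of the test utterance's scores against an
    # impostor cohort.
    mu = np.mean(cohort_scores)
    sigma = np.std(cohort_scores)
    return (raw_score - mu) / max(sigma, 1e-12)

def select_adaptive_cohort(target_model, impostor_models, cohort_size=50):
    # Speaker-adaptive cohort selection (ATnorm-style): rank impostor
    # models by similarity to the target model and keep the closest ones.
    # Cosine similarity between model parameter vectors is an assumed
    # stand-in for the paper's cross-similarity measurement.
    target = target_model / np.linalg.norm(target_model)
    sims = [float(np.dot(target, m / np.linalg.norm(m))) for m in impostor_models]
    ranked = np.argsort(sims)[::-1]        # most similar impostors first
    return ranked[:cohort_size]

def atnorm(raw_score, test_scores_vs_impostors, target_model, impostor_models,
           cohort_size=50):
    # ATnorm: Tnorm computed over a speaker-adaptive impostor cohort.
    # test_scores_vs_impostors is a NumPy array of the test utterance's
    # scores against every impostor model, indexed consistently with
    # impostor_models.
    cohort_idx = select_adaptive_cohort(target_model, impostor_models, cohort_size)
    cohort_scores = np.asarray(test_scores_vs_impostors)[cohort_idx]
    return tnorm(raw_score, cohort_scores)

In plain Tnorm, the cohort is fixed for all target speakers; in the adaptive variant sketched here, the cohort is chosen per target speaker from the most similar impostor models, which is what removes the need for a separate corpus dedicated to cohort selection.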