Robust iris verification based on local and global variations

  • Authors:
  • Nima Tajbakhsh
  • Babak Nadjar Araabi
  • Hamid Soltanian-Zadeh

  • Affiliations:
  • Control and Intelligent Processing Center of Excellence, School of Electrical and Computer Engineering, University of Tehran, Tehran, Iran
  • Control and Intelligent Processing Center of Excellence, School of Electrical and Computer Engineering, University of Tehran, Tehran, Iran and School of Cognitive Sciences, Institute for Research ...
  • Control and Intelligent Processing Center of Excellence, School of Electrical and Computer Engineering, University of Tehran, Tehran, Iran and School of Cognitive Sciences, Institute for Research in Fundamental Sciences, Tehran, Iran a ...

  • Venue:
  • EURASIP Journal on Advances in Signal Processing
  • Year:
  • 2010

Abstract

This work addresses the growing demand for a sensitive yet user-friendly iris-based authentication system, with the aim of reducing the False Rejection Rate (FRR). The primary source of high FRR is the presence of degradation factors in the iris texture. To reduce FRR, we propose a feature extraction method that is robust against such adverse factors. Founded on local and global variations of the texture, the method is designed specifically to cope with blurred and unfocused iris images. Global variations capture a general representation of the texture, while local yet soft variations encode texture details that depend minimally on image quality. The Discrete Cosine Transform and wavelet decomposition are used to capture the local and global variations, respectively. In the matching phase, a support vector machine fuses the similarity values obtained from the global and local features. The verification performance of the proposed method is examined and compared on the CASIA Ver.1 and UBIRIS databases. The method's efficiency on the degraded images of UBIRIS is corroborated by experimental results, which show a significant decrease in FRR compared with other algorithms. The experiments on CASIA show that, despite neglecting detailed texture information, our method still yields results comparable to those of recent methods.
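To make the pipeline in the abstract concrete, the sketch below illustrates the general idea: DCT coefficients of small patches of an unwrapped iris strip stand in for the local variations, Haar approximation coefficients stand in for the wavelet-based global variations, and a fixed linear rule stands in for the trained SVM at the fusion stage. All patch sizes, coefficient selections, weights, and the Haar choice are illustrative assumptions, not the paper's actual parameters.

```python
import numpy as np
from scipy.fftpack import dct

def local_features(strip, patch=8):
    """Local variations: 1-D DCT over small overlapping windows of each
    row of the unwrapped iris strip. Keeping only a few low-frequency
    coefficients (and dropping the DC term) makes the features less
    sensitive to blur. Patch geometry here is an illustrative choice."""
    feats = []
    for r in range(strip.shape[0]):
        for c in range(0, strip.shape[1] - patch + 1, patch // 2):
            coeffs = dct(strip[r, c:c + patch], norm='ortho')
            feats.extend(coeffs[1:4])  # skip DC, keep 3 low-frequency terms
    return np.asarray(feats)

def global_features(strip, levels=3):
    """Global variations: row-wise Haar lowpass (approximation)
    coefficients after a few decomposition levels -- a simple stand-in
    for the paper's wavelet decomposition."""
    rows = strip.astype(float)
    for _ in range(levels):
        rows = (rows[:, 0::2] + rows[:, 1::2]) / np.sqrt(2.0)  # Haar lowpass
    return rows.ravel()

def similarity(a, b):
    """Cosine similarity between two feature vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

def fuse(sim_local, sim_global, w=(0.5, 0.5), bias=-0.9):
    """Score-level fusion. The paper trains an SVM on the pair of
    similarity values; a fixed linear decision stands in for it here."""
    return w[0] * sim_local + w[1] * sim_global + bias > 0.0
```

In this toy setup a genuine comparison (same strip) yields both similarities near 1 and is accepted, while unrelated strips produce a low local similarity and are rejected; in the paper this accept/reject boundary is learned by the SVM rather than fixed.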