Sound direction estimation using an artificial ear for robots

  • Authors:
  • Sungmok Hwang;Youngjin Park;Youn-sik Park

  • Venue:
  • Robotics and Autonomous Systems
  • Year:
  • 2011

Abstract

We propose a novel design of an artificial robot ear for sound direction estimation using only two measured outputs. The spectral features in the interaural transfer functions (ITFs) of the proposed artificial ears are distinctive and move monotonically with the sound direction. These features therefore provide effective cues for estimating the sound direction from the two measured output signals. Bilateral asymmetry of the microphone positions enhances the estimation performance even in the median plane, where interaural differences vanish. We propose a localization method that estimates the lateral and vertical angles simultaneously. The lateral angle is estimated from the interaural time difference using Woodworth and Schlosberg's formula, and front-back discrimination is achieved by finding the spectral features in the ITF estimated from the two measured outputs. The vertical angle of a sound source in the frontal region is estimated by comparing the spectral features in the estimated ITF with those in a database built in an anechoic chamber. The feasibility of the designed artificial ear and of the estimation method was verified in a real environment. The experiment showed that both front-back discrimination and sound direction estimation in the frontal region can be achieved with reasonable accuracy. We therefore expect that robots equipped with the proposed artificial ear can estimate the direction of a speaker from two output signals only.
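The lateral-angle step described above can be sketched in code. Woodworth and Schlosberg's formula relates the interaural time difference (ITD) of a spherical head to the lateral angle theta as ITD = (a/c)(theta + sin theta), where a is the head radius and c the speed of sound; inverting it for theta requires a numerical root search because the relation is transcendental. The sketch below is a minimal illustration, not the authors' implementation: the head radius, speed of sound, and the bisection solver are assumptions chosen for clarity.

```python
import math

def lateral_angle_from_itd(itd, head_radius=0.0875, c=343.0):
    """Invert the Woodworth-Schlosberg formula itd = (a/c) * (theta + sin(theta)).

    itd         : interaural time difference in seconds (sign gives the side)
    head_radius : assumed spherical head radius in meters (0.0875 m is a
                  common average-head value, not a figure from the paper)
    c           : speed of sound in m/s

    Returns the lateral angle in radians, in [-pi/2, pi/2].
    """
    target = abs(itd)
    # f(theta) is strictly increasing on [0, pi/2], so bisection converges.
    f = lambda th: (head_radius / c) * (th + math.sin(th)) - target
    lo, hi = 0.0, math.pi / 2
    for _ in range(60):  # ~60 halvings: far below float precision
        mid = 0.5 * (lo + hi)
        if f(mid) < 0.0:
            lo = mid
        else:
            hi = mid
    theta = 0.5 * (lo + hi)
    return math.copysign(theta, itd) if itd != 0.0 else 0.0
```

In practice the ITD itself would first be estimated from the two microphone signals (e.g., by cross-correlation), and the front-back ambiguity left by this formula is what the paper resolves using the spectral features of the ITF.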