Development of visualizing earphone and hearing glasses for human augmented cognition

  • Authors:
  • Byunghun Hwang; Cheol-Su Kim; Hyung-Min Park; Yun-Jung Lee; Min-Young Kim; Minho Lee

  • Affiliations:
  • School of Electronics Engineering, Kyungpook National University, Korea (all authors except Hyung-Min Park); Department of Electronic Engineering, Sogang University, Korea (Hyung-Min Park)

  • Venue:
  • ICONIP'11: Proceedings of the 18th International Conference on Neural Information Processing - Volume Part II
  • Year:
  • 2011

Abstract

In this paper, we propose a human augmented cognition system realized by a visualizing earphone and hearing glasses. The visualizing earphone, which uses two cameras and a headphone set mounted in a pair of glasses, interprets both the wearer's intention and the outward visual surroundings, and translates visual information into an audio signal. The hearing glasses capture sound signals such as human voices, and not only find the direction of sound sources but also recognize human speech. They then convert the audio information into visual context and display it on a head-mounted display device. The two proposed systems include incremental feature extraction, object selection and sound localization based on selective attention, and face, object, and speech recognition algorithms. The experimental results show that the developed systems can expand the limited capacity of human cognition, such as memory, inference, and decision-making.
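
The following is a minimal, hypothetical sketch of the kind of dataflow the hearing glasses describe: estimating the direction of a sound source from the delay between two microphones and converting it to an azimuth for placement of captions on the head-mounted display. The function names, the GCC-PHAT delay estimator, the 0.15 m microphone spacing, and the simulated signals are illustrative assumptions, not the paper's actual selective-attention localization method.

```python
import numpy as np

def gcc_phat_delay(sig, ref, fs):
    """Estimate the time delay (seconds) of `sig` relative to `ref` via GCC-PHAT."""
    n = len(sig) + len(ref)
    SIG = np.fft.rfft(sig, n=n)
    REF = np.fft.rfft(ref, n=n)
    cross = SIG * np.conj(REF)
    cross /= np.abs(cross) + 1e-12                # PHAT weighting (whitening)
    cc = np.fft.irfft(cross, n=n)
    max_shift = n // 2
    cc = np.concatenate((cc[-max_shift:], cc[:max_shift + 1]))
    shift = np.argmax(np.abs(cc)) - max_shift     # peak location = delay in samples
    return shift / fs

def direction_of_arrival(delay, mic_distance=0.15, c=343.0):
    """Convert an inter-microphone delay into an azimuth angle (degrees)."""
    sin_theta = np.clip(delay * c / mic_distance, -1.0, 1.0)
    return np.degrees(np.arcsin(sin_theta))

if __name__ == "__main__":
    fs = 16000
    rng = np.random.default_rng(0)
    src = rng.standard_normal(fs)                 # stand-in for a captured voice
    delay_samples = 4                             # simulated arrival-time difference
    left, right = src, np.roll(src, delay_samples)
    tau = gcc_phat_delay(right, left, fs)
    print(f"azimuth ~ {direction_of_arrival(tau):.1f} deg")  # where to anchor the caption on the HMD
```

In a full system, the recognized speech text would be paired with this azimuth so the display can render the caption near the speaker's direction; recognition itself is omitted here.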