Human activity recognition for a content search system considering situations of smartphone users

  • Authors:
  • Tomohiro Mashita, Kentaro Shimatani, Mayu Iwata, Hiroki Miyamoto, Daijiro Komaki, Takahiro Hara, Kiyoshi Kiyokawa, Haruo Takemura, Shojiro Nishio

  • Affiliations:
  • Cybermedia Center, Osaka Univ. (Mashita, Kiyokawa, Takemura); Grad. Sch. of Information Science and Tech., Osaka Univ. (Shimatani, Iwata, Miyamoto, Komaki, Hara, Nishio)

  • Venue:
  • VR '12: Proceedings of the 2012 IEEE Virtual Reality Conference
  • Year:
  • 2012

Abstract

Smartphone users can search for information about nearby facilities or a route to their destination. However, searching for and reading information while walking is difficult because of the screen's low legibility; to cope, users must stop walking or enlarge the display. Our previously proposed smartphone system switches its information-presentation policy in response to the user's context. In this paper, we describe the context recognition mechanism of this system, which estimates the user's context from sensors embedded in the smartphone. We use a Support Vector Machine (SVM) for context classification and compare four types of feature values: FFT coefficients and three types of wavelet transforms. Experimental results show recognition rates of 87.2% with FFT, 90.9% with the Gabor wavelet, 91.8% with the Haar wavelet, and 92.1% with the Mexican-hat wavelet.
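The feature-extraction step described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: it computes an FFT magnitude spectrum and a full orthonormal Haar wavelet decomposition for one fixed-length sensor window (the window length, synthetic signal, and function names are assumptions; the paper's Gabor and Mexican-hat variants and the SVM classifier itself are not shown).

```python
import numpy as np

def fft_features(window):
    """Normalized magnitude spectrum of a fixed-length sensor window."""
    return np.abs(np.fft.rfft(window)) / len(window)

def haar_features(window):
    """Full orthonormal Haar wavelet decomposition.

    Window length must be a power of two. At each level, pairwise
    averages and differences (scaled by 1/sqrt(2)) split the signal
    into a coarse approximation and detail coefficients.
    """
    coeffs = np.asarray(window, dtype=float)
    details = []
    while len(coeffs) > 1:
        avg = (coeffs[0::2] + coeffs[1::2]) / np.sqrt(2.0)
        diff = (coeffs[0::2] - coeffs[1::2]) / np.sqrt(2.0)
        details.append(diff)
        coeffs = avg
    details.append(coeffs)  # final approximation coefficient
    return np.concatenate(details[::-1])

# Example: one 128-sample window of synthetic accelerometer-like data
rng = np.random.default_rng(0)
window = np.sin(2 * np.pi * 2 * np.arange(128) / 128) \
    + 0.1 * rng.standard_normal(128)
feature_vector = np.concatenate([fft_features(window),
                                 haar_features(window)])
```

In a pipeline like the one the paper describes, such feature vectors (one per window, per feature type) would be fed to an SVM trained on labeled activity windows, e.g. via a standard SVM library.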