Time-ART: a tool for segmenting and annotating multimedia data in early stages of exploratory analysis

  • Authors:
  • Yasuhiro Yamamoto; Atsushi Aoki; Kumiyo Nakakoji

  • Affiliations:
  • NAIST, Takayama, Nara, Japan and TOREST, JST, Miyagino, Miyagi, Japan; SRA Inc., Shinjyuku, Tokyo, Japan; NAIST, Takayama, Nara, Japan and TOREST, JST, Miyagino, Miyagi, Japan and SRA Inc., Shinjyuku, Tokyo, Japan

  • Venue:
  • CHI '01 Extended Abstracts on Human Factors in Computing Systems
  • Year:
  • 2001

Abstract

Time-ART is a tool that helps a user conduct empirical multimedia (video/sound) data analysis as an exploratory, iterative process. Time-ART supports the user in (1) identifying seemingly interesting parts, (2) annotating them both textually and visually by positioning them in a 2D space, and (3) producing a summary report. The system consists of Movie/SoundEditor, which segments a part of a movie or sound; ElementSpace, a free 2D space where the user can position segmented parts as objects; TrackListController, which synchronously plays multiple sound/video data; AnnotationEditor, with which the user can textually annotate each positioned object; DocumentViewer, which automatically compiles the positioned parts and their annotations in the space; ViewFinder, which provides a 3D view of ElementSpace, allowing the user to use different "depths" as layers for classifying positioned objects; and TimeChart, another 3D view of ElementSpace that helps the user understand where each segmented part lies within the original movie/sound.
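
The component list above implies a simple data model: each segment keeps a reference to its source media, a time range, a position in the 2D ElementSpace (plus a ViewFinder-style depth layer), and a textual annotation, and DocumentViewer compiles these into a summary. The minimal sketch below illustrates that model only; the class and field names (Segment, ElementSpace.report, etc.) are hypothetical and not taken from the paper.

```python
from dataclasses import dataclass, field
from typing import List


@dataclass
class Segment:
    """One user-selected part of a movie/sound source (hypothetical model)."""
    source: str        # original movie/sound file
    start_sec: float   # segment start within the source
    end_sec: float     # segment end within the source
    x: float = 0.0     # position in the free 2D ElementSpace
    y: float = 0.0
    depth: int = 0     # "depth" layer, as ViewFinder uses to classify objects
    note: str = ""     # textual annotation, as added via AnnotationEditor


@dataclass
class ElementSpace:
    """Free 2D space holding positioned segments."""
    segments: List[Segment] = field(default_factory=list)

    def place(self, seg: Segment) -> None:
        self.segments.append(seg)

    def report(self) -> str:
        """Compile positioned parts and their annotations into a summary,
        roughly the role DocumentViewer plays."""
        lines = [
            f"{s.source} [{s.start_sec:.1f}-{s.end_sec:.1f}s] @({s.x}, {s.y}) "
            f"layer {s.depth}: {s.note}"
            for s in sorted(self.segments, key=lambda s: (s.y, s.x))
        ]
        return "\n".join(lines)


# Example usage with made-up media files and annotations.
space = ElementSpace()
space.place(Segment("interview.mov", 12.0, 34.5, x=40, y=10, note="subject hesitates"))
space.place(Segment("ambient.wav", 3.0, 8.0, x=40, y=60, depth=1, note="background noise spike"))
print(space.report())
```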