Automatically testing interactive multimodal systems using task trees and fusion models

  • Authors:
  • Laya Madani; Ioannis Parissis

  • Affiliations:
  • University Al Baath, Al Baath, Syria; University of Grenoble, Grenoble, France

  • Venue:
  • Proceedings of the 6th International Workshop on Automation of Software Test
  • Year:
  • 2011

Abstract

Multimodal systems support communication with the user through different modalities such as voice and gesture. In such systems, modalities may be used sequentially or concurrently, and independently or combined synergistically. Their use is characterized by four properties, known as the CARE properties (Complementarity, Assignment, Redundancy and Equivalence). This paper presents a method for automatically generating test data for multimodal systems, based on two models: a model describing the multimodal aspects and a task tree model, both supporting operational profile specification.
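
To illustrate the kind of test data generation the abstract describes, below is a minimal, hypothetical Python sketch (not from the paper): it annotates a toy set of tasks with CARE relations, assumes a simple operational profile over modalities, and draws weighted random event sequences. All names (TASK_TREE, OPERATIONAL_PROFILE, generate_test_sequence) and the structure of the models are invented for illustration only.

  import random

  # Hypothetical task model: each task lists the modalities that can trigger it
  # and the CARE relation governing how those modalities are combined.
  TASK_TREE = {
      "open_map":     {"care": "equivalence",     "modalities": ["voice", "gesture"]},
      "zoom":         {"care": "redundancy",      "modalities": ["voice", "gesture"]},
      "select_point": {"care": "complementarity", "modalities": ["voice", "gesture"]},
  }

  # Hypothetical operational profile: probability of each modality being chosen
  # when the CARE relation leaves the user a choice (equivalence).
  OPERATIONAL_PROFILE = {"voice": 0.7, "gesture": 0.3}

  def pick_modality(profile):
      """Weighted random choice of a single modality."""
      names = list(profile)
      weights = [profile[name] for name in names]
      return random.choices(names, weights=weights, k=1)[0]

  def generate_test_sequence(task_tree, profile, length=5):
      """Generate one test sequence of (task, modalities used) events."""
      sequence = []
      tasks = list(task_tree)
      for _ in range(length):
          task = random.choice(tasks)
          spec = task_tree[task]
          if spec["care"] == "equivalence":
              # Equivalence: any one of the equivalent modalities suffices.
              used = [pick_modality(profile)]
          else:
              # Redundancy / complementarity: the modalities are used together.
              used = list(spec["modalities"])
          sequence.append((task, used))
      return sequence

  if __name__ == "__main__":
      for event in generate_test_sequence(TASK_TREE, OPERATIONAL_PROFILE):
          print(event)

In the actual method, the task tree would constrain the order of tasks and a fusion model would describe how concurrent modal events are combined; this sketch only shows how an operational profile can bias the random selection of modalities under CARE-style annotations.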