Reasoning on gestural interfaces through syndetic modelling

  • Authors:
  • G. P. Faconti

  • Affiliations:
  • CNUCE Institute, National Research Council of Italy, Pisa, Italy

  • Venue:
  • ACM SIGCHI Bulletin
  • Year:
  • 1996


Abstract

Recent advances in user interface development have been driven mainly by technical innovation, based either on new interaction devices and paradigms, or on algorithms for achieving realistic audio/visual effects [Connor92] [Robertson91a] [Robertson91b] [Berners92]. In such a rich environment, the user potentially interacts with the computer by addressing different modalities concurrently. While the technology-driven approach has made possible the implementation of systems in specific application areas, it largely lacks an underlying theory. This makes it difficult to assess whether this technology will be effective for users, with the consequence that cognitive ergonomics is becoming an urgent requirement for the design of new interactive systems. Attention has been paid to the psychology of terminal users from the very beginning of human-computer interface research [Martin73]. However, existing established design techniques do not readily accommodate issues such as concurrency and parallelism, or the potential for interaction with multiple interface techniques [Coutaz93]. Recently, work has taken place investigating models and techniques for the analysis and design of interactionally rich systems from a variety of disciplinary perspectives, as can be found in the DSV-IS book series edited by [Paterno94] and [Bastide95]. Formal methods have been one of a number of approaches; others include cognitive user models, design space representations, and software architecture models. Applications of formal methods are well known; see, for example, [Bowen95] and [Gaudel94]. However, none of the cited applications use formal methods to examine the user interface. One reason is that the factors that influence the design of interactive systems depend mostly on psychological and social properties of cognition and work, rather than on abstract mathematical models of programming semantics.
For this reason, claims made through formal methods about the properties of interactive systems must be grounded in some psychological or social theory. This paper builds on previous work carried out within the ESPRIT Amodeus project and shows how a new approach to human-computer interaction, called syndetic modelling, can be used to gain insight into user-oriented properties of interactive systems. The word syndesis comes from ancient Greek and means conjunction. It is used to emphasize the key point of this approach: user and system models are described within a common framework that enables one to reason about how cognitive resources are mapped onto the functionality of the system.