Automating human-performance modeling at the millisecond level

  • Authors and affiliations:
  • Alonso H. Vera (NASA Ames Research Center, Moffett Field, CA, and Carnegie Mellon University)
  • Bonnie E. John (Human-Computer Interaction Institute, Carnegie Mellon University, Pittsburgh, PA)
  • Roger Remington (NASA Ames Research Center, Moffett Field, CA)
  • Michael Matessa (NASA Ames Research Center, Moffett Field, CA)
  • Michael A. Freed (NASA Ames Research Center, Moffett Field, CA, and University of West Florida)

  • Venue:
  • Human-Computer Interaction
  • Year:
  • 2005

Abstract

A priori prediction of skilled human performance has the potential to be of great practical value but is difficult to carry out. This article reports on an approach that facilitates modeling of human behavior at the level of cognitive, perceptual, and motor operations, following the CPM-GOMS method (John, 1990). CPM-GOMS is a powerful modeling method that has remained underused because of the expertise and labor it requires. We describe a process for automatically generating CPM-GOMS models from a hierarchical task decomposition expressed in a computational modeling tool, taking advantage of reusable behavior templates and their efficacy for generating zero-parameter a priori predictions of complex human behavior. To demonstrate the process, we present a model of automated teller machine interaction. The model shows that it is possible to string together existing behavioral templates that compose basic HCI tasks (e.g., mousing to a button and clicking on it) to generate powerful human performance predictions. Because interleaving of templates is now automated, it becomes possible to construct arbitrarily long sequences of behavior. In addition, the manipulation and adaptation of complete models have the potential to become dramatically easier. Thus, the tool described here provides an engine for CPM-GOMS that may facilitate computational modeling of human performance at the millisecond level.
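
The core idea of the approach — composing behavior templates built from cognitive, perceptual, and motor operators and letting a scheduler interleave them across processing resources — can be illustrated with a minimal sketch. The Python code below is not the authors' tool; the Operator class, the move_and_click_template function, the template structure, and the specific durations (50 ms cognitive and 100 ms perceptual operators, a stand-in value for a Fitts'-law movement time) are illustrative assumptions in the spirit of CPM-GOMS, used only to show how interleaved scheduling yields a zero-parameter time prediction.

```python
from dataclasses import dataclass, field

# Illustrative operator durations (ms); the exact values are assumptions
# in the spirit of CPM-GOMS / Model Human Processor estimates.
COGNITIVE_MS = 50
PERCEPTUAL_MS = 100

@dataclass
class Operator:
    name: str
    resource: str                      # e.g. "cognition", "vision", "right-hand"
    duration: float                    # milliseconds
    predecessors: list = field(default_factory=list)
    start: float = 0.0
    end: float = 0.0

def schedule(operators):
    """Earliest-start schedule: an operator begins once all its predecessors
    have finished AND its resource is free.  The finish time of the last
    operator is the predicted task time."""
    resource_free = {}
    for op in operators:               # operators listed in dependency order
        ready = max((p.end for p in op.predecessors), default=0.0)
        op.start = max(ready, resource_free.get(op.resource, 0.0))
        op.end = op.start + op.duration
        resource_free[op.resource] = op.end
    return max(op.end for op in operators)

def move_and_click_template(target, fitts_ms=420):
    """Hypothetical 'mouse to a button and click it' template.
    fitts_ms stands in for a Fitts'-law estimate of the pointing movement."""
    perceive = Operator(f"perceive {target}", "vision", PERCEPTUAL_MS)
    verify   = Operator(f"verify {target}", "cognition", COGNITIVE_MS, [perceive])
    initiate = Operator(f"initiate move to {target}", "cognition", COGNITIVE_MS, [verify])
    move     = Operator(f"move to {target}", "right-hand", fitts_ms, [initiate])
    click    = Operator(f"click {target}", "right-hand", 100, [move])
    return [perceive, verify, initiate, move, click]

# String two templates together to form a longer behavior sequence.
ops = (move_and_click_template("'Withdraw' button")
       + move_and_click_template("'$80' button"))
print(f"Predicted task time: {schedule(ops):.0f} ms")
```

Because operators that use different resources may overlap, the scheduler interleaves the perception and cognition of the second template with the hand movement of the first, so the composed prediction is shorter than the sum of the two templates run in isolation — the same effect that automated interleaving provides in the CPM-GOMS models described above.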