Building Multi-modal Personal Sales Agents as Interfaces to E-commerce Applications

  • Authors:
  • Yasmine Arafa; Abe Mamdani

  • Affiliations:
  • -;-

  • Venue:
  • AMT '01 Proceedings of the 6th International Computer Science Conference on Active Media Technology
  • Year:
  • 2001

Abstract

The research presented explores a new paradigm for human-computer interaction with electronic retailing applications: face-to-face interaction with intelligent, visual, lifelike, multi-modal conversational agents that take on the role of electronic sales assistants. This paper discusses the motivations for enriching current e-commerce application interfaces with multi-modal interface agents and the technical development issues they raise, as realised in the MAPPA (EU project EP28831) system architecture design and development. The paper addresses three distinct components of an overall framework for developing lifelike, multi-modal agents for real-time, dynamic applications: knowledge representation and manipulation, grounded affect models, and the convergence of both into support for multimedia visualisation of lifelike, social behaviour. The research presents a novel specification for such a medium and a functional agent-based system scenario (e-commerce) implemented with it, setting forth a framework for building multi-modal interface agents and yielding a conversational form of human-machine interaction that may have potential for shaping tomorrow's interface to the world of e-commerce.