Falling off the cliff: when systems go nonlinear

  • Authors: Yvonne Coady; Russ Cox; John DeTreville; Peter Druschel; Joseph Hellerstein; Andrew Hume; Kimberly Keeton; Thu Nguyen; Christopher Small; Lex Stein; Andrew Warfield
  • Affiliations: University of Victoria; MIT; Microsoft Research; Rice University; IBM Research; AT&T Research; HP Labs; Rutgers University; Vanu, Inc.; Harvard University; University of Cambridge

  • Venue: HOTOS'05: Proceedings of the 10th Conference on Hot Topics in Operating Systems - Volume 10
  • Year: 2005


Abstract

As the systems we build become more complex, understanding and managing their behavior becomes more challenging. If a system's inputs are within an acceptable range, it behaves predictably. However, the system may "fall off the cliff" if input values stray outside this range. Such nonlinear behavior is undesirable because the system no longer behaves predictably: it may not be possible to use, control, or even recover the system. In this paper, we describe what it means for a system to fall off the cliff. We outline methods for detecting and predicting these modes of nonlinear behavior, and propose several approaches for designing systems to cope with these instabilities, or to avoid them altogether. We conclude by outlining open research questions for investigation by the systems community.
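As an illustrative sketch (not drawn from the paper itself), a classic instance of this cliff is queueing delay near saturation: for an M/M/1 queue with service rate mu and arrival rate lam, the mean response time is 1/(mu - lam), which is nearly flat at low load and then explodes nonlinearly as lam approaches mu. The function name and rates below are hypothetical, chosen only to show the shape of the curve.

```python
def mm1_response_time(lam: float, mu: float) -> float:
    """Mean time in system for an M/M/1 queue (requires lam < mu).

    Illustrative model only: real systems often fall off the cliff
    for reasons this simple formula does not capture.
    """
    if lam >= mu:
        raise ValueError("queue is unstable: arrival rate >= service rate")
    return 1.0 / (mu - lam)

mu = 100.0  # hypothetical service capacity, requests/sec
for load in (0.50, 0.90, 0.99):
    t = mm1_response_time(load * mu, mu)
    print(f"utilization {load:.0%}: mean response time {t * 1000:.1f} ms")
```

Going from 50% to 99% utilization (a 2x increase in offered load) inflates response time by 50x here, which is exactly the kind of disproportionate, hard-to-predict response the abstract describes.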