Significance-Based Failure and Interference Detection in Data Streams

  • Authors:
  • Nickolas J. Falkner
  • Quan Z. Sheng

  • Affiliations:
  • School of Computer Science, The University of Adelaide, Adelaide, Australia 5005 (both authors)

  • Venue:
  • DEXA '09 Proceedings of the 20th International Conference on Database and Expert Systems Applications
  • Year:
  • 2009

Abstract

Detecting the failure of a data stream is relatively easy when the stream is continually full of data: the transfer of large amounts of data allows for simple detection of interference, whether accidental or malicious. During interference, however, data transmission can become irregular rather than smooth. When traffic is intermittent, it is harder to detect that a failure has occurred, and the application at the receiving end may request retransmission or disconnect. Retransmission requests place additional load on the system, and disconnection can lead to unnecessary reversion to a checkpointed database before reconnecting and reissuing the same request or response. In this paper, we model the traffic in data streams as a set of significant events whose arrivals follow a Poisson distribution. Once the arrival rate has been estimated, overdue or lost events can be identified with greater reliability. The model also allows the rate parameter to be adjusted to reflect changes in the system and supports multiple levels of data aggregation. One significant benefit of the Poisson-based model is that transmission events can be deliberately manipulated in time to provide a steganographic channel that confirms sender/receiver identity.
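
To make the detection idea concrete, the sketch below is one plausible reading of the abstract, not the authors' implementation: estimate the arrival rate of significant events, then flag the stream when the silence since the last event becomes improbable under exponential inter-arrival times. The class and method names (PoissonFailureDetector, observe, is_suspect) and the significance level alpha are hypothetical.

    import math

    class PoissonFailureDetector:
        """Hypothetical sketch (not the paper's code): flag a stream as suspect
        when the gap since the last significant event is improbable under a
        Poisson arrival model."""

        def __init__(self, alpha=0.01):
            self.alpha = alpha        # significance level for declaring a gap anomalous
            self.rate = None          # estimated events per unit time (lambda)
            self.start = None         # time of the first observed event
            self.last_arrival = None  # time of the most recent observed event
            self.count = 0            # number of events observed so far

        def observe(self, timestamp):
            """Record a significant event and update the arrival-rate estimate."""
            if self.start is None:
                self.start = timestamp
            self.count += 1
            self.last_arrival = timestamp
            elapsed = timestamp - self.start
            if elapsed > 0:
                self.rate = self.count / elapsed

        def is_suspect(self, now):
            """Return True when the current silence is unlikely under the model.
            For Poisson arrivals, inter-arrival gaps are exponential, so the
            probability of seeing no event in a gap g is exp(-lambda * g)."""
            if self.rate is None or self.last_arrival is None:
                return False
            gap = now - self.last_arrival
            return math.exp(-self.rate * gap) < self.alpha

Under this reading, a receiver would call observe() on each significant event and periodically check is_suspect() before requesting retransmission or disconnecting; re-estimating or re-seeding the rate would correspond to the abstract's adjustable rate parameter.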