Situation-Aware Adaptive Visualization for Sensory Data Stream Mining

  • Authors:
  • Pari Delir Haghighi;Brett Gillick;Shonali Krishnaswamy;Mohamed Medhat Gaber;Arkady Zaslavsky

  • Affiliations:
  • Centre for Distributed Systems and Software Engineering, Monash University, Caulfield East, VIC, Australia (all authors)

  • Venue:
  • Sensor-KDD '08: Proceedings of the Second International Conference on Knowledge Discovery from Sensor Data
  • Year:
  • 2008

Abstract

With the emergence of ubiquitous data mining and recent advances in mobile communications, there is a need for visualization techniques that enhance user interaction, real-time decision making, and comprehension of the results of mining algorithms. In this paper, we propose a novel architecture for situation-aware adaptive visualization that applies intelligent visualization techniques to data stream mining of sensory data. The proposed architecture incorporates fuzzy logic principles for modeling and reasoning about context/situations, and gradually adapts data mining and visualization parameters according to the situations that occur. A prototype of the architecture has been implemented and is described in the paper through a real-world scenario in the area of healthcare monitoring.
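
To make the adaptation idea concrete, the sketch below illustrates (it is not the paper's implementation) how fuzzy membership degrees over a sensed value could be blended into gradually changing mining/visualization parameters, in the spirit of the healthcare-monitoring scenario. All names, membership functions, situations, and parameter values here are illustrative assumptions.

```python
# Minimal sketch (not from the paper): fuzzy situation inference driving
# gradual adaptation of stream-mining/visualization parameters.
# Situations, membership functions, and parameter settings are hypothetical.

def triangular(x, a, b, c):
    """Triangular fuzzy membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

# Hypothetical situations for a healthcare-monitoring scenario, each defined
# by a fuzzy membership function over heart rate (beats per minute).
SITUATIONS = {
    "resting":    lambda hr: triangular(hr, 40, 60, 85),
    "walking":    lambda hr: triangular(hr, 70, 95, 125),
    "exercising": lambda hr: triangular(hr, 110, 140, 185),
}

def infer_situation(heart_rate):
    """Return the membership degree of each situation for the current reading."""
    return {name: mu(heart_rate) for name, mu in SITUATIONS.items()}

def adapt_parameters(memberships):
    """Adapt parameters gradually as a membership-weighted blend of
    per-situation settings (illustrative values only)."""
    settings = {  # (sampling interval in seconds, chart refresh rate in Hz)
        "resting":    (10.0, 0.5),
        "walking":    (5.0, 1.0),
        "exercising": (1.0, 4.0),
    }
    total = sum(memberships.values()) or 1.0
    sampling = sum(m * settings[s][0] for s, m in memberships.items()) / total
    refresh = sum(m * settings[s][1] for s, m in memberships.items()) / total
    return {"sampling_interval_s": sampling, "refresh_rate_hz": refresh}

if __name__ == "__main__":
    for hr in (65, 100, 150):
        degrees = infer_situation(hr)
        print(hr, degrees, adapt_parameters(degrees))
```

Because the parameters are computed as a weighted blend rather than a hard switch between situations, small changes in the sensed value produce small changes in the visualization and mining settings, which is the "gradual adaptation" behaviour the abstract describes.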