StreamGlobe: processing and sharing data streams in grid-based P2P infrastructures

  • Authors: Richard Kuntschke, Bernhard Stegmaier, Alfons Kemper, Angelika Reiser
  • Affiliations: Technische Universität München, Munich, Germany (all authors)
  • Venue: VLDB '05: Proceedings of the 31st International Conference on Very Large Data Bases
  • Year: 2005

Abstract

Data stream processing is currently gaining importance due to developments in novel application areas such as e-science, e-health, and e-business (considering RFID, for example). Focusing on e-science, scientific experiments and observations in many fields, e.g., physics and astronomy, create huge volumes of data that have to be interchanged and processed. Since experimental and observational data originate in particular from sensors, online simulations, etc., the data has an inherently streaming nature. Furthermore, continuing advances will result in even higher data volumes, making it increasingly impractical to store all of the delivered data prior to processing. Hence, in such e-science scenarios, processing and sharing of data streams will play a decisive role. This opens up new possibilities for researchers, who can subscribe to interesting data streams of other scientists without having to set up their own devices or experiments, resulting in much better utilization of expensive equipment such as telescopes and satellites. Further, processing and sharing data streams on-the-fly in the network helps to reduce network traffic and to avoid network congestion. Thus, even huge streams of data can be handled efficiently by removing unnecessary parts early on, e.g., through early filtering and aggregation, and by sharing previously generated data streams and processing results.
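
The in-network reduction the abstract alludes to can be pictured as a chain of filter and aggregation operators placed close to the data source, so that only condensed results travel further through the network. The following Python sketch is purely illustrative and should not be read as StreamGlobe's actual operators or API; the stream generator, field names, and thresholds are hypothetical.

```python
# Illustrative sketch of early filtering and windowed aggregation on a data
# stream. All names (sensor_stream, magnitude, error, thresholds) are
# hypothetical and stand in for whatever a real source would deliver.
import random
from typing import Dict, Iterable, Iterator


def sensor_stream() -> Iterator[Dict[str, float]]:
    """Hypothetical unbounded stream of sensor readings."""
    while True:
        yield {"magnitude": random.uniform(0.0, 20.0),
               "error": random.uniform(0.0, 1.0)}


def early_filter(stream: Iterable[Dict[str, float]],
                 max_error: float) -> Iterator[Dict[str, float]]:
    """Drop low-quality readings as early as possible to shrink the stream."""
    for reading in stream:
        if reading["error"] <= max_error:
            yield reading


def windowed_average(stream: Iterable[Dict[str, float]],
                     window_size: int) -> Iterator[float]:
    """Aggregate readings into per-window averages; only one value per window
    needs to be forwarded downstream."""
    window = []
    for reading in stream:
        window.append(reading["magnitude"])
        if len(window) == window_size:
            yield sum(window) / window_size
            window.clear()


# Only the aggregated values would be forwarded to subscribers downstream.
for avg in windowed_average(early_filter(sensor_stream(), max_error=0.1),
                            window_size=100):
    print(f"window average: {avg:.2f}")
    break  # demo: stop after the first window
```

In this toy pipeline, filtering and aggregation reduce the forwarded volume by orders of magnitude (one value per hundred accepted readings), which is the kind of in-network data reduction the abstract argues for.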