Fulfilling end-to-end latency constraints in large-scale streaming environments

  • Authors:
  • Stamatia Rizou; Frank Dürr; Kurt Rothermel

  • Affiliations:
  • Universität Stuttgart, Institute of Parallel and Distributed Systems, Universitätsstraße 38, 70569 Stuttgart, Germany (all authors)

  • Venue:
  • PCCC '11 Proceedings of the 30th IEEE International Performance Computing and Communications Conference
  • Year:
  • 2011

Abstract

The on-line processing of high-volume data streams is a prerequisite for many modern applications relying on real-time data, such as global sensor networks or multimedia streaming. To achieve efficient data processing and scalability with respect to the number of distributed data sources and applications, in-network processing of data streams in an overlay network of data processing operators has been proposed. For such stream processing overlay networks, the placement of operators onto physical hosts plays an important role for the resulting quality of service, in particular the end-to-end latency, and for the network load. In this paper, we present an enhanced placement algorithm that minimizes the network load induced by a stream processing task while respecting user-defined delay constraints. Our algorithm first finds the optimal solution in terms of network load and then degrades this solution to find a constrained optimum. To reduce the overhead of the placement algorithm, we include mechanisms that shrink the search space, i.e., the set of hosts considered during operator placement. Our evaluations show that this approach leads to operator placements of high quality while inducing communication overhead proportional to only a small percentage of the total hosts.
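The two-phase idea sketched in the abstract (compute the unconstrained network-load optimum first, then fall back to the best placement that satisfies the latency bound) can be illustrated with a small brute-force sketch. This is not the authors' algorithm: the inputs (`network_load` metric, `latency_between`, `proc_delay`, the set of `candidate_hosts`) are hypothetical placeholders, and the exhaustive enumeration merely stands in for the paper's far more scalable constrained search with host pruning.

```python
# Illustrative sketch only; names and cost model are assumptions, not the
# placement algorithm proposed in the paper.
import itertools

def network_load(placement, data_rates, latency_between):
    """Network load as sum over operator links of data rate * link latency
    (a common load metric; the paper's exact definition may differ)."""
    return sum(rate * latency_between(placement[a], placement[b])
               for (a, b), rate in data_rates.items())

def path_latency(placement, path, latency_between, proc_delay):
    """End-to-end latency along one source-to-sink operator path."""
    hops = sum(latency_between(placement[a], placement[b])
               for a, b in zip(path, path[1:]))
    return hops + sum(proc_delay[op] for op in path)

def place_operators(operators, candidate_hosts, data_rates, paths,
                    latency_between, proc_delay, latency_bound):
    """Phase 1: track the unconstrained load-optimal placement.
    Phase 2: among placements meeting the latency bound, keep the one with
    the lowest load; return it, or the unconstrained optimum if none exists.
    Returns a (load, placement) tuple."""
    best, best_feasible = None, None
    for hosts in itertools.product(candidate_hosts, repeat=len(operators)):
        placement = dict(zip(operators, hosts))
        load = network_load(placement, data_rates, latency_between)
        if best is None or load < best[0]:
            best = (load, placement)
        feasible = all(path_latency(placement, p, latency_between, proc_delay)
                       <= latency_bound for p in paths)
        if feasible and (best_feasible is None or load < best_feasible[0]):
            best_feasible = (load, placement)
    return best_feasible or best
```

The exhaustive product over all hosts is only workable for toy instances; the point of the paper's search-space reduction is precisely to avoid considering most hosts as candidates in the first place.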