Analysis of Self-Similar Workload on Real-Time Systems

  • Authors:
  • Enrique Hernandez-Orallo; Joan Vila-Carbo

  • Venue:
  • RTAS '10 Proceedings of the 2010 16th IEEE Real-Time and Embedded Technology and Applications Symposium
  • Year:
  • 2010

Abstract

Real-time systems used in media processing and transmission produce bursty workloads with highly variable execution and transmission times. To avoid the drawbacks of the worst-case approach for these workloads, this paper uses a variation of the usual real-time task model in which the WCET is replaced by a discrete statistical distribution. With this approach, tasks are characterized by their processing time over a sampling period. One could expect that increasing the sampling period would smooth the workload variability and that the proposed analysis would yield more deterministic long-term results. Surprisingly, however, this variability does not decrease with the sampling period: workloads are bursty on many time scales. This property is known as self-similarity and is measured using the Hurst parameter.

This paper studies how to properly model and analyze self-similar task sets, showing the influence of the Hurst parameter on the schedulability analysis. It shows, through an analytical model and simulations, that this parameter may have a very negative impact on system performance. The conclusion is that this factor should be taken into account in the statistical analysis of real-time systems, since simplistic workload models can lead to inaccurate results. It also shows that the negative effect of this parameter can be bounded using scheduling policies based on the bandwidth isolation principle.
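
As a rough illustration of the Hurst-parameter measurement the abstract refers to, the sketch below estimates H from a workload trace using the aggregated-variance method, a standard estimator that is not necessarily the one used in the paper. The trace, function name, and block sizes are illustrative assumptions. The idea is that for a self-similar process the variance of the m-aggregated series scales as m^(2H - 2), so H can be read off the slope of a log-log fit; an i.i.d. trace yields H ≈ 0.5, while long-range-dependent (self-similar) traces yield H closer to 1.

```python
# Minimal sketch (not from the paper): estimate the Hurst parameter of a
# workload trace with the aggregated-variance method. Each sample in the
# trace is the processing time consumed during one sampling period.
import numpy as np

def hurst_aggregated_variance(series, block_sizes=None):
    """Estimate H from the scaling of the variance of the m-aggregated series.

    For a self-similar process Var(X^(m)) ~ m**(2H - 2), so H is recovered
    from the slope of log(variance) versus log(m).
    """
    x = np.asarray(series, dtype=float)
    n = len(x)
    if block_sizes is None:
        # Logarithmically spaced block sizes (illustrative choice).
        block_sizes = np.unique(np.logspace(0.5, np.log10(n // 10), 20).astype(int))

    log_m, log_var = [], []
    for m in block_sizes:
        n_blocks = n // m
        if n_blocks < 2:
            continue
        # Average the trace over non-overlapping blocks of length m.
        blocks = x[: n_blocks * m].reshape(n_blocks, m).mean(axis=1)
        log_m.append(np.log(m))
        log_var.append(np.log(blocks.var()))

    slope, _ = np.polyfit(log_m, log_var, 1)
    return 1.0 + slope / 2.0  # slope = 2H - 2

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic i.i.d. trace, so the estimate should come out near H = 0.5.
    # A genuinely self-similar workload (e.g. fractional Gaussian noise)
    # would produce a noticeably larger estimate.
    processing_times = rng.lognormal(mean=0.0, sigma=1.0, size=100_000)
    print(f"Estimated H = {hurst_aggregated_variance(processing_times):.2f}")
```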