Race to idle: New algorithms for speed scaling with a sleep state

  • Authors:
  • Susanne Albers; Antonios Antoniadis

  • Affiliations:
  • Humboldt-Universität zu Berlin, Berlin, Germany; Humboldt-Universität zu Berlin, Berlin, Germany

  • Venue:
  • ACM Transactions on Algorithms (TALG)

  • Year:
  • 2014

Abstract

We study an energy conservation problem where a variable-speed processor is equipped with a sleep state. Executing jobs at high speeds and then setting the processor asleep is an approach that can lead to further energy savings compared to standard dynamic speed scaling. We consider classical deadline-based scheduling, that is, each job is specified by a release time, a deadline and a processing volume. For general convex power functions, Irani et al. [2007] devised an offline 2-approximation algorithm. Roughly speaking, the algorithm schedules jobs at a critical speed s_crit that yields the smallest energy consumption while jobs are processed. For power functions P(s) = βs^α + γ, where s is the processor speed, Han et al. [2010] gave an (α^α + 2)-competitive online algorithm. We investigate the offline setting of speed scaling with a sleep state. First, we prove NP-hardness of the optimization problem. Additionally, we develop lower bounds for general convex power functions: No algorithm that constructs s_crit-schedules, which execute jobs at speeds of at least s_crit, can achieve an approximation factor smaller than 2. Furthermore, no algorithm that minimizes the energy expended for processing jobs can attain an approximation ratio smaller than 2. We then present an algorithmic framework for designing good approximation algorithms. For general convex power functions, we derive an approximation factor of 4/3. For power functions P(s) = βs^α + γ, we obtain an approximation factor of 137/117 ≈ 1.171. We finally show that our framework yields the best approximation guarantees for the class of s_crit-schedules. For general convex power functions, we give another 2-approximation algorithm. For functions P(s) = βs^α + γ, we present tight upper and lower bounds on the best possible approximation factor. The ratio is exactly eW_{-1}(−e^{−1−1/e}) / (eW_{-1}(−e^{−1−1/e}) + 1) ≈ 1.211, where W_{-1} is the lower branch of the Lambert W function.
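
For concreteness, the sketch below (not part of the paper; the helper names critical_speed and lambert_ratio are illustrative) computes the critical speed s_crit for a power function P(s) = βs^α + γ, assuming s_crit is the speed minimizing the energy per unit of work P(s)/s as is standard in this line of work, and numerically evaluates the Lambert-W expression quoted in the abstract.

```python
# Minimal sketch, assuming P(s) = beta*s**alpha + gamma with alpha > 1.
# Setting d/ds [P(s)/s] = 0 gives s_crit = (gamma / (beta*(alpha - 1)))**(1/alpha).
import numpy as np
from scipy.special import lambertw


def critical_speed(alpha: float, beta: float, gamma: float) -> float:
    """Speed minimizing energy per unit of work P(s)/s for P(s) = beta*s**alpha + gamma."""
    return (gamma / (beta * (alpha - 1))) ** (1.0 / alpha)


def lambert_ratio() -> float:
    """Constant e*W_{-1}(-e^(-1-1/e)) / (e*W_{-1}(-e^(-1-1/e)) + 1) from the abstract."""
    w = lambertw(-np.exp(-1 - 1 / np.e), k=-1).real  # lower branch W_{-1}
    return np.e * w / (np.e * w + 1)


if __name__ == "__main__":
    # Example: cubic dynamic power (alpha = 3, beta = 1) plus static power gamma = 2.
    print(critical_speed(alpha=3, beta=1, gamma=2))  # 1.0
    print(lambert_ratio())                           # ~1.21
```

The example illustrates the trade-off behind "race to idle": running slower than s_crit wastes static power γ per time unit, while running faster wastes dynamic power, so schedules built around s_crit (as in the s_crit-schedules discussed above) are the natural reference point for the bounds stated in the abstract.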