Learning to manage combined energy supply systems

  • Authors:
  • Azalia Mirhoseini; Farinaz Koushanfar

  • Affiliations:
  • Rice University, Houston, TX, USA; Rice University, Houston, TX, USA

  • Venue:
  • Proceedings of the 17th IEEE/ACM international symposium on Low-power electronics and design
  • Year:
  • 2011

Abstract

The operability of a portable embedded system is severely constrained by its supply's duration. We propose a novel energy management strategy for a combined (hybrid) supply consisting of a battery and a set of supercapacitors to extend the system's lifetime. Batteries alone cannot efficiently handle the high load fluctuations and demands of modern complex systems. Supercapacitors hold promise for complementing battery supplies because they possess higher power density, a larger number of charge/recharge cycles, and less sensitivity to operational conditions. However, supercapacitors are not efficient as a stand-alone supply because of their comparatively higher leakage and lower energy density. Due to the nonlinearity of the hybrid supply elements, the multiplicity of possible supply states, and the stochastic nature of the workloads, deriving an optimal management policy is a challenge. We pose this problem as a stochastic Markov Decision Process (MDP) and develop a reinforcement learning method, called Q-learning, to derive an efficient approximation of the optimal management strategy. This method studies a diverse set of workload profiles for a mobile platform and learns the best policy in the form of an adaptive approximation. Evaluations on measurements collected from mobile phone users show the effectiveness of our proposed method in maximizing the combined energy system's lifetime.
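To make the MDP/Q-learning formulation concrete, the following is a minimal, self-contained sketch of tabular Q-learning applied to a toy hybrid-supply model. Everything here is an illustrative assumption rather than the paper's actual model: the state discretization, the supercapacitor leakage rate, the load-dependent battery loss, the two-level stochastic workload, and the survival reward are all simplified placeholders.

```python
import random

# Toy hybrid-supply Q-learning sketch. Actions choose which element serves
# the current load; the agent learns to route high-load peaks to the
# supercapacitor while it still holds charge. All constants are assumptions.

ACTIONS = ("battery", "supercap")

def step(battery, supercap, action, load):
    """Toy dynamics: the supercapacitor leaks but delivers peaks efficiently;
    battery conversion loss grows with load (assumed nonlinearity)."""
    supercap = max(0.0, supercap - 0.02 * supercap)  # assumed 2% leakage/step
    if action == "supercap" and supercap >= load:
        supercap -= load                             # efficient peak delivery
    else:
        battery -= load * (1.0 + 0.5 * load)         # load-dependent loss
    battery = max(0.0, battery)
    alive = battery > 0.0                            # lifetime ends with battery
    return battery, supercap, alive

def train(episodes=2000, alpha=0.1, gamma=0.95, eps=0.1, seed=0):
    rng = random.Random(seed)
    Q = {}  # maps (discretized state, action) -> estimated value

    def key(b, s, load):
        # Coarse state: bucketed charge levels plus a high/low load flag.
        return (int(b * 5), int(s * 5), load > 0.5)

    for _ in range(episodes):
        battery, supercap, alive = 1.0, 1.0, True
        while alive:
            load = rng.choice((0.05, 0.8))           # stochastic workload
            s = key(battery, supercap, load)
            if rng.random() < eps:                   # epsilon-greedy exploration
                a = rng.choice(ACTIONS)
            else:
                a = max(ACTIONS, key=lambda x: Q.get((s, x), 0.0))
            battery, supercap, alive = step(battery, supercap, a, load)
            r = 1.0 if alive else 0.0                # reward each survived step
            s2 = key(battery, supercap, load)
            best_next = max(Q.get((s2, x), 0.0) for x in ACTIONS)
            old = Q.get((s, a), 0.0)
            # Standard Q-learning update; future value is zero on termination.
            Q[(s, a)] = old + alpha * (r + gamma * best_next * alive - old)
    return Q
```

After training on this toy model, the learned policy prefers the supercapacitor for high-load states at full charge, since a large draw routed through the battery incurs the assumed nonlinear conversion loss.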