Self-economy in cloud data centers: statistical assignment and migration of virtual machines
Euro-Par'11 Proceedings of the 17th international conference on Parallel processing - Volume Part I
Enacting SLAs in clouds using rules
Euro-Par'11 Proceedings of the 17th international conference on Parallel processing - Volume Part I
Balancing electricity bill and performance in server farms with setup costs
Future Generation Computer Systems
Auto-scaling to minimize cost and meet application deadlines in cloud workflows
Proceedings of 2011 International Conference for High Performance Computing, Networking, Storage and Analysis
Energy-efficient and SLA-aware management of IaaS clouds
Proceedings of the 3rd International Conference on Future Energy Systems: Where Energy, Computing and Communication Meet
Adaptive resource configuration for Cloud infrastructure management
Future Generation Computer Systems
Proceedings of the 2013 ACM Cloud and Autonomic Computing Conference
Proceedings of the Ninth IEEE/ACM/IFIP International Conference on Hardware/Software Codesign and System Synthesis
Cloud providers, such as Amazon, lease their data centers' computational and storage capacities to paying customers. The high electricity consumption associated with running a data center not only increases its carbon footprint but also raises the cost of operating the data center itself. This paper addresses the problem of maximizing Cloud providers' revenues by trimming down their electricity costs. As a solution, allocation policies based on dynamically powering servers on and off are introduced and evaluated. The policies aim to satisfy the conflicting goals of maximizing the users' experience while minimizing the amount of electricity consumed. Results of numerical experiments and simulations show that the proposed scheme performs well under different traffic conditions.
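The abstract does not specify the policies themselves, but the idea of dynamically powering servers on and off can be illustrated with a minimal sketch. The thresholds, the `vms_per_server` capacity, the `spare` headroom, and the one-server-per-step drain rule below are all illustrative assumptions, not the paper's actual mechanism:

```python
# Illustrative sketch only (NOT the paper's policy): a threshold-based
# allocation loop that powers servers on when VM load grows and drains
# idle servers off to save electricity.

def required_servers(active_vms, vms_per_server=8):
    """Minimum servers needed to host the current VM load (ceiling division)."""
    return max(1, -(-active_vms // vms_per_server))

def step(powered_on, active_vms, vms_per_server=8, spare=1):
    """Return the new number of powered-on servers.

    Keeps `spare` extra servers on as headroom so load bursts do not
    immediately degrade user experience, and powers surplus servers off
    (one per step, an assumed drain rate) to cut electricity use.
    """
    needed = required_servers(active_vms, vms_per_server) + spare
    if powered_on < needed:
        return needed          # power on enough servers to meet demand
    if powered_on > needed:
        return powered_on - 1  # gradually power off one idle server
    return powered_on

# Example: load drops from 40 VMs to 10; servers are drained gradually.
on = 6
for load in [40, 30, 20, 10, 10, 10]:
    on = step(on, load)
# on settles at 3: ceil(10/8) = 2 servers for the load, plus 1 spare
```

The gradual power-off captures the trade-off the abstract describes: shutting servers down too eagerly hurts user experience when traffic rebounds, while keeping them all on wastes electricity.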