This paper addresses the problem of dynamic scheduling of data-intensive multiprocessor jobs. Each job requires a number of CPUs and an amount of data that must be downloaded into local storage before the job can start. Completing a job yields some benefit (utility) to the system, and the goal is to find the scheduling policy that maximizes the average utility per unit of time obtained from all completed jobs. A co-evolutionary solution methodology is proposed, in which the utility-based policy for managing local storage and the policy for scheduling jobs onto the available CPUs mutually affect each other's environments, with both policies adaptively tuned using reinforcement learning. Our simulation results demonstrate the feasibility of this approach and show that it outperforms the best heuristic scheduling policy we could find for this domain.
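The scheduling side of the approach can be illustrated with a minimal sketch. This is not the paper's actual algorithm; it is a hypothetical simplification assuming a linear value estimate over made-up job features (`utility`, `cpus`, `data_mb`) and an average-reward-style incremental update, to show how a utility-driven admission policy and an adaptively tuned value function might interact:

```python
import random

class Job:
    """A job needing some CPUs and some staged data, worth some utility."""
    def __init__(self, cpus, data_mb, utility):
        self.cpus = cpus
        self.data_mb = data_mb
        self.utility = utility

def score(job, weights):
    # Linear value estimate over simple job features (hypothetical featurization):
    # reward jobs with high utility, penalize CPU and data footprint.
    return weights[0] * job.utility - weights[1] * job.cpus - weights[2] * job.data_mb

def schedule(queue, free_cpus, weights):
    # Greedily admit the highest-scoring jobs that fit in the free CPUs.
    chosen = []
    for job in sorted(queue, key=lambda j: score(j, weights), reverse=True):
        if job.cpus <= free_cpus:
            chosen.append(job)
            free_cpus -= job.cpus
    return chosen

def update(weights, job, reward, avg_reward, lr=0.01, beta=0.01):
    # Average-reward-style update: shift weights toward jobs whose completion
    # utility beat the running average, and track that average incrementally.
    delta = reward - avg_reward
    feats = (job.utility, -job.cpus, -job.data_mb)
    new_weights = [w + lr * delta * f for w, f in zip(weights, feats)]
    new_avg = avg_reward + beta * delta
    return new_weights, new_avg

if __name__ == "__main__":
    random.seed(0)
    weights, avg = [1.0, 0.1, 0.001], 0.0
    queue = [Job(random.randint(1, 8), random.randint(10, 500), random.uniform(1, 10))
             for _ in range(20)]
    chosen = schedule(queue, free_cpus=16, weights=weights)
    for job in chosen:
        weights, avg = update(weights, job, job.utility, avg)
```

In the paper's full setting a second, co-evolving policy would decide which data to keep in the limited local storage, changing which jobs are even eligible for admission; here that interaction is omitted for brevity.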