Markov Decision Processes: Discrete Stochastic Dynamic Programming
Adaptive Service Composition in Flexible Processes. IEEE Transactions on Software Engineering.
A Framework for QoS-aware Binding and Re-binding of Composite Web Services. Journal of Systems and Software.
Online Optimization in Application Admission Control for Service Oriented Systems. APSCC '08: Proceedings of the 2008 IEEE Asia-Pacific Services Computing Conference.
Optimal Replacement Policy of Services Based on Markov Decision Process. SCC '09: Proceedings of the 2009 IEEE International Conference on Services Computing.
Web Service Composition Using Markov Decision Processes. WAIM '05: Proceedings of the 6th International Conference on Advances in Web-Age Information Management.
QoS-aware Management of Monotonic Service Orchestrations. Formal Methods in System Design.
In the service computing paradigm, a service broker can build new applications by composing network-accessible services offered by loosely coupled, independent providers. In this paper, we address the admission control problem for a service broker that offers prospective users a composite service with a range of Quality of Service (QoS) classes. We formulate the problem as a Markov Decision Process (MDP) with the goal of maximizing the broker's revenue while guaranteeing the non-functional QoS requirements of its already admitted users. To assess the effectiveness of MDP-based admission control, we present experimental results comparing the optimal decisions obtained by the analytical solution of the MDP with those of other policies.
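To make the MDP formulation concrete, the following is a minimal sketch of an admission-control MDP solved by value iteration. All of the model parameters (capacity, per-class revenues, arrival and departure probabilities, discount factor) are illustrative assumptions for this sketch, not values or the exact state space from the paper: states count the broker's currently admitted users, and for each arriving request the broker chooses between admitting (earning that class's revenue) and rejecting.

```python
# Hedged sketch: admission-control MDP via value iteration.
# All parameters below are hypothetical placeholders, not the paper's model.

CAPACITY = 3            # assumed max concurrent users the broker can serve
REVENUES = [1.0, 2.0]   # assumed revenue per admitted user, per QoS class
P_ARRIVAL = [0.3, 0.2]  # assumed arrival probability per class in one step
P_DEPART = 0.4          # assumed per-step departure probability
GAMMA = 0.95            # discount factor

STATES = range(CAPACITY + 1)  # state = number of currently admitted users


def value_iteration(tol=1e-6):
    """Return the optimal discounted value for each state."""
    V = [0.0] * (CAPACITY + 1)
    while True:
        newV = []
        for n in STATES:
            v = 0.0
            # On each class-c arrival, choose the better of admit / reject.
            for pa, r in zip(P_ARRIVAL, REVENUES):
                if n < CAPACITY:
                    admit = r + GAMMA * V[n + 1]
                else:
                    admit = float("-inf")  # no capacity left: admit infeasible
                reject = GAMMA * V[n]
                v += pa * max(admit, reject)
            # With no arrival, a user may depart (state can only shrink).
            p_no_arrival = 1.0 - sum(P_ARRIVAL)
            v += p_no_arrival * (P_DEPART * GAMMA * V[max(n - 1, 0)]
                                 + (1 - P_DEPART) * GAMMA * V[n])
            newV.append(v)
        if max(abs(a - b) for a, b in zip(V, newV)) < tol:
            return newV
        V = newV
```

The value function directly yields the admission policy: in state n, a class-c request is admitted exactly when its immediate revenue plus the discounted value of the fuller system exceeds the discounted value of staying put. Comparing this optimal policy against simpler heuristics (e.g. always-admit) mirrors the kind of policy comparison the abstract describes.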