Markov Decision Processes: Discrete Stochastic Dynamic Programming
Approximate Dynamic Programming for Ambulance Redeployment
INFORMS Journal on Computing
Fair Dynamic Routing in Large-Scale Heterogeneous-Server Systems
Operations Research
Deciding which servers to dispatch to which customers is an important aspect of service systems, and it becomes more complicated when servers must be dispatched equitably as well as efficiently. In this paper, we formulate a model for optimally dispatching distinguishable servers to prioritized customers subject to a set of equity constraints. These issues are examined through the lens of emergency medical service (EMS) dispatch, for which a Markov decision process model is developed that captures how to dispatch ambulances (servers) to prioritized patients (customers). Customers are assumed to arrive sequentially, with the priority and location of each customer becoming known upon arrival. Four types of equity constraints are considered, two reflecting customer equity and two reflecting server equity, all drawing on the decision-analytic and social science literature to compare the effects of different notions of equity on the resulting dispatching policies. The Markov decision processes are formulated as equity-constrained linear programs. A computational example based on an EMS system compares the different equity models.
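To give a concrete sense of the technique the abstract names, the sketch below solves a toy discounted MDP via its standard occupancy-measure linear program and adds one extra linear inequality playing the role of an equity constraint. The problem data (states, actions, transition probabilities, rewards, the 30% service-level threshold) are all made up for illustration and are not the paper's model.

```python
import numpy as np
from scipy.optimize import linprog

# Illustrative sketch only: a tiny discounted MDP solved through its
# occupancy-measure LP, with one hypothetical equity constraint appended.
gamma = 0.9                     # discount factor
n_states, n_actions = 2, 2      # toy problem size

# P[s, a, s']: transition probabilities; r[s, a]: rewards (invented data)
P = np.array([[[0.8, 0.2], [0.3, 0.7]],
              [[0.5, 0.5], [0.1, 0.9]]])
r = np.array([[1.0, 0.5],
              [0.2, 2.0]])
alpha = np.full(n_states, 1.0 / n_states)   # initial state distribution

# Variables: occupancy measures x[s, a], flattened to length n_states*n_actions.
# Flow balance: sum_a x(s',a) - gamma * sum_{s,a} P(s'|s,a) x(s,a) = alpha(s')
A_eq = np.zeros((n_states, n_states * n_actions))
for sp in range(n_states):
    for s in range(n_states):
        for a in range(n_actions):
            A_eq[sp, s * n_actions + a] = (sp == s) - gamma * P[s, a, sp]
b_eq = alpha

# Hypothetical equity constraint: at least 30% of total discounted activity
# must occur in state 1 (a stand-in for a minimum service level for one
# customer class):  sum_a x(1,a) >= 0.3 * sum_{s,a} x(s,a),
# rewritten as  0.3 * total - sum_a x(1,a) <= 0.
A_ub = np.full((1, n_states * n_actions), 0.3)
for a in range(n_actions):
    A_ub[0, 1 * n_actions + a] -= 1.0
b_ub = np.zeros(1)

# Maximize expected discounted reward = minimize its negation.
res = linprog(c=-r.flatten(), A_ub=A_ub, b_ub=b_ub,
              A_eq=A_eq, b_eq=b_eq, bounds=(0, None))
x = res.x.reshape(n_states, n_actions)
policy = x / x.sum(axis=1, keepdims=True)   # randomized policy from occupancy
```

The optimal occupancy measures recover a (possibly randomized) dispatching policy, and the equity constraint simply rides along as one more row of the LP, which is the structural point the abstract makes.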