Model Checking MDPs with a Unique Compact Invariant Set of Distributions

  • Authors:
  • Rohit Chadha;Vijay Anand Korthikanti;Mahesh Viswanathan;Gul Agha;Youngmin Kwon


  • Venue:
  • QEST '11 Proceedings of the 2011 Eighth International Conference on Quantitative Evaluation of SysTems
  • Year:
  • 2011


Abstract

The semantics of Markov Decision Processes (MDPs), when viewed as transformers of probability distributions, can be described as a labeled transition system over the probability distributions on the states of the MDP. The MDP can then be seen as defining a set of executions, where each execution is a sequence of probability distributions. Reasoning about sequences of distributions allows one to express properties not expressible in logics like PCTL; examples include expressing bounds on transient rewards and expected values of random variables, as well as comparing the probability of being in one set of states at a given time with the probability of being in another set of states. With respect to such a semantics, the problem of checking that the MDP never reaches a bad distribution is undecidable~\cite{qest10}. In this paper, we identify a special class of MDPs called \emph{semi-regular} MDPs that have a unique non-empty, compact, invariant set of distributions, and we show that checking any $\omega$-regular property is decidable for this class. Our decidability result also implies that the emptiness problem is decidable for semi-regular probabilistic finite automata with isolated cut-points.
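To make the distribution-transformer semantics concrete, the following is a minimal sketch of how one execution of an MDP evolves as a sequence of distributions: at each step a scheduler picks an action, the current distribution is multiplied by that action's stochastic matrix, and a predicate over distributions (here, a bound on the mass in a set of "bad" states) is evaluated along the way. The matrices, the scheduler, and the threshold are illustrative assumptions, not taken from the paper, and simulating one finite execution is of course not the model-checking problem the paper studies.

```python
import numpy as np

# Two actions, each given by a row-stochastic transition matrix over 3 states.
# These matrices and the threshold below are illustrative, not from the paper.
P = {
    "a": np.array([[0.5, 0.5, 0.0],
                   [0.0, 1.0, 0.0],
                   [0.3, 0.0, 0.7]]),
    "b": np.array([[1.0, 0.0, 0.0],
                   [0.2, 0.6, 0.2],
                   [0.0, 0.0, 1.0]]),
}

def run(mu0, scheduler, steps):
    """Generate the execution mu_0, mu_1, ... where mu_{k+1} = mu_k @ P[action]."""
    mu = np.asarray(mu0, dtype=float)
    yield mu
    for k in range(steps):
        mu = mu @ P[scheduler(k, mu)]   # one distribution-to-distribution step
        yield mu

def safe(mu, bad_states=(2,), threshold=0.8):
    """A predicate on distributions: mass on the 'bad' states stays below a bound."""
    return mu[list(bad_states)].sum() < threshold

# A simple deterministic scheduler that alternates the two actions.
sched = lambda k, mu: "a" if k % 2 == 0 else "b"

for k, mu in enumerate(run([1.0, 0.0, 0.0], sched, 10)):
    print(k, mu, "safe" if safe(mu) else "BAD")
```

In this view a safety property is a constraint on every distribution in every such sequence (over all schedulers), which is exactly the kind of question shown undecidable in general and decidable for the semi-regular MDPs introduced in the paper.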