On reachability and minimum cost optimal control

  • Authors:
  • John Lygeros

  • Affiliations:
  • Department of Electrical and Computer Engineering, University of Patras, Rio, Patras GR-26500, Greece

  • Venue:
  • Automatica (Journal of IFAC)
  • Year:
  • 2004

Abstract

Questions of reachability for continuous and hybrid systems can be formulated as optimal control or game theory problems, whose solution can be characterized using variants of the Hamilton-Jacobi-Bellman or Isaacs partial differential equations. The formal link between the solution to the partial differential equation and the reachability problem is usually established in the framework of viscosity solutions. This paper establishes such a link between reachability, viability and invariance problems and viscosity solutions of a special form of the Hamilton-Jacobi equation. This equation is developed to address optimal control problems where the cost function is the minimum of a function of the state over a specified horizon. The main advantage of the proposed approach is that the properties of the value function (uniform continuity) and the form of the partial differential equation (standard Hamilton-Jacobi form, continuity of the Hamiltonian and simple boundary conditions) make the numerical solution of the problem much simpler than other approaches proposed in the literature. This fact is demonstrated by applying our approach to a reachability problem that arises in flight control and using numerical tools to compute the solution.
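The core idea in the abstract — a value function whose running cost is the minimum of a state function over the horizon, with sub-level sets characterizing reachable sets — can be illustrated with a minimal numerical sketch. The code below is not from the paper: the dynamics `f`, the target function `l`, the semi-Lagrangian discretization, and all numerical values are assumptions chosen for illustration. It approximates V(x, t) = min over s in [t, T] of l(x(s)) along trajectories of dx/dt = f(x) on a 1-D grid; the set {x : V(x, 0) <= 0} then approximates the states that reach the target K = {x : l(x) <= 0} within the horizon.

```python
import numpy as np

# Assumed example dynamics and target (not from the paper):
f = lambda x: -x                         # stable 1-D vector field
l = lambda x: np.abs(x - 1.0) - 0.25     # l(x) <= 0 on the target K = [0.75, 1.25]

xs = np.linspace(-2.0, 2.0, 401)         # state grid
dt, T = 0.01, 1.0                        # time step and horizon
V = l(xs)                                # terminal condition V(x, T) = l(x)

# Backward recursion implementing the dynamic programming principle for the
# min-over-horizon cost: V(x, t) = min( l(x), V(x + f(x) dt, t + dt) ).
for _ in range(int(T / dt)):
    V_next = np.interp(xs + f(xs) * dt, xs, V)  # follow the flow one step
    V = np.minimum(l(xs), V_next)

# Grid points whose trajectories enter K within the horizon.
reach = xs[V <= 0]
print(reach.min(), reach.max())
```

With these assumed dynamics the flow contracts toward the origin, so only initial states at or above the target's lower edge (about x = 0.75) reach K within the horizon; the printed interval reflects that. The uniform continuity of the value function noted in the abstract is what makes simple grid-based schemes like this one viable.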