The Isaacs equation for differential games, totally optimal fields of trajectories and related problems

  • Authors:
  • E. A. Galperin

  • Affiliations:
  • Département de mathématiques, Université du Québec à Montréal, C.P. 8888, Succ. Centre-Ville, Montréal, Québec H3C 3P8, Canada

  • Venue:
  • Computers & Mathematics with Applications
  • Year:
  • 2008

Abstract

Some aspects of the Isaacs principle of transition and his main equation for differential games are considered to demonstrate that those equations contain implicit assumptions and are valid under a certain contiguity condition, which is defined and analyzed for differential games. The notion of total optimality is defined, and totally optimal fields of trajectories and control curves are introduced and studied in relation to the Isaacs principle of transition, the Bellman principle of optimality, the Pontryagin maximum principle, and the variational principles of mechanics. It is demonstrated that the Isaacs, Bellman and Pontryagin theories are valid if and only if the optimal trajectories and optimal control curves generated by those methods are totally optimal. In this context, the Hamilton-Jacobi partial differential equation can be used for the sequential solution of multi-games, defined as n-person games with m controls, r cost functionals, and multiple min, max, min-max, etc., operators applied in a fixed order, so that no multi-objective game problem arises. Over totally optimal fields, the structure of controls is invariant under time uncertainty. Parallel and series games are considered, for which the Isaacs procedure can be reduced to a twofold application of the Bellman equations. Control systems with incomplete information or with structural limitations on controls do not, in general, satisfy the contiguity condition and thus are not totally optimal. Game problems for such systems may have optimal solutions which, however, cannot be obtained by the Isaacs equations. This fact is illustrated by an example of a widely used engineering system whose optimal trajectory has all of its parts non-optimal and non-contiguous to the optimal trajectory. The paper presents a theoretical justification of the Isaacs equations for contiguous systems, a comparison of optimal control principles with the variational principles of mechanics, the consideration of total optimality and totally optimal fields of trajectories as a necessary and sufficient condition for the validity of the three major optimal control theories, and some other results important for applications.
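
For orientation, the Isaacs main equation referred to above can be sketched in its standard textbook form; the notation below (dynamics f, running cost L, terminal cost g, value function V) is generic and is not taken from the paper itself. For a two-player zero-sum differential game with dynamics \dot{x} = f(x, u, v), where the control u minimizes and v maximizes the cost, the value function V(t, x) formally satisfies

\[
\frac{\partial V}{\partial t}
+ \min_{u}\,\max_{v}\,\Bigl[\,\nabla_x V \cdot f(x,u,v) + L(x,u,v)\,\Bigr] = 0,
\qquad V(T,x) = g(x),
\]

with the Isaacs (minimax) condition requiring that the min and max operators commute on the bracketed Hamiltonian. When the maximizing player is absent, this reduces to the Bellman (Hamilton-Jacobi-Bellman) equation of dynamic programming, which is the sense in which the abstract relates the Isaacs procedure to a repeated application of the Bellman equations.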