There exist various methods for designing dynamical systems and dynamical games so as to ensure correctness and optimality. This paper organizes them systematically, starting from two variational principles. First, solutions must be stationary; this yields necessary conditions and gradient algorithms. Second, solutions, if any exist, must be optimal or correct; this yields sufficient conditions and dynamic-programming algorithms. Methods based on these principles make it possible to design dynamical systems and games such as control systems, hybrid systems, and reactive systems. Time may be discrete or continuous, and correctness can be viewed as an abstraction of optimality. The structured presentation of these design methods is intended to foster their understanding, integration, cross-fertilization, and improvement.
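The dynamic-programming side of this program can be sketched for a finite, discrete-time control system. The following is an illustrative sketch only, not the paper's own construction: the state space, control set, transition map, and stage cost below are hypothetical toy choices.

```python
def value_iteration(states, controls, step, cost, horizon):
    """Backward dynamic programming over a finite horizon.

    step(x, u) -> next state reached by applying control u in state x
    cost(x, u) -> stage cost of applying u in x
    Returns (V, policy): optimal cost-to-go from each state, and a
    per-stage map from states to optimal controls.
    """
    V = {x: 0.0 for x in states}  # terminal cost taken to be zero
    policy = []
    for _ in range(horizon):
        new_V, stage_policy = {}, {}
        for x in states:
            # Sufficient-condition recursion: pick the control minimizing
            # stage cost plus cost-to-go of the successor state.
            best_u = min(controls, key=lambda u: cost(x, u) + V[step(x, u)])
            stage_policy[x] = best_u
            new_V[x] = cost(x, best_u) + V[step(x, best_u)]
        V = new_V
        policy.insert(0, stage_policy)  # stages are computed backward
    return V, policy

# Toy system (hypothetical): drive an integer state toward 0 with
# controls {-1, 0, +1}, penalizing both distance from 0 and effort.
states = list(range(-3, 4))
controls = (-1, 0, 1)
step = lambda x, u: max(-3, min(3, x + u))
cost = lambda x, u: x * x + abs(u)
V, policy = value_iteration(states, controls, step, cost, horizon=5)
```

The recursion embodies the sufficiency principle mentioned above: a policy built from stage-wise minimizations of cost-plus-cost-to-go is optimal by construction, whereas a gradient algorithm derived from stationarity would only certify necessary conditions.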