Analysis seminar: Dante Kalise

Speaker: Dante Kalise, Johann Radon Institute for Computational and Applied Mathematics (RICAM), Linz, Austria.

Title: Hamilton-Jacobi equations in optimal control: theory and numerics.

Abstract: In this talk we will review some classical and recent results concerning the link between Hamilton-Jacobi equations and optimal control, its numerical approximation, and different applications. A standard tool for the solution of optimal control problems is the application of the Dynamic Programming Principle proposed by Bellman in the 1950s. In this context, the value function of the optimal control problem is characterized as the solution of a first-order, fully nonlinear Hamilton-Jacobi-Bellman (HJB) equation. The solution is understood in the viscosity solution sense introduced by Crandall and Lions. A major advantage of the approach is that a feedback mapping connecting the current state of the system and the optimal control can be obtained by means of the Pontryagin principle. However, since the HJB equation has to be solved in a state space of the same dimension as the system dynamics, the approach is only feasible for low-dimensional dynamics. In the first part of the talk, we will present the main results related to HJB equations, viscosity solutions and links to optimal control. The second part will be devoted to the construction of efficient and accurate numerical schemes for the approximation of HJB equations.
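
For concreteness, here is a minimal sketch of the HJB characterization mentioned in the abstract, written for an infinite-horizon discounted problem; the running cost $\ell$, discount rate $\lambda$, dynamics $f$, and control set $U$ are generic placeholders, not objects taken from the talk.

Given the controlled dynamics $\dot y(t) = f(y(t), u(t))$, $y(0) = x$, with controls $u(t) \in U$, the value function
$$ v(x) \;=\; \inf_{u(\cdot)} \int_0^{\infty} e^{-\lambda t}\, \ell\big(y(t), u(t)\big)\, dt $$
satisfies, in the viscosity sense, the stationary HJB equation
$$ \lambda\, v(x) \;-\; \inf_{u \in U} \big\{ f(x,u) \cdot \nabla v(x) + \ell(x,u) \big\} \;=\; 0, $$
and an optimal feedback can then be read off from the minimization in the Hamiltonian,
$$ u^{*}(x) \;\in\; \arg\min_{u \in U} \big\{ f(x,u) \cdot \nabla v(x) + \ell(x,u) \big\}. $$

The need to know $\nabla v$ on the whole state space is what ties the feedback synthesis to solving the HJB equation globally, and hence to the curse of dimensionality noted in the abstract.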