The Hamilton–Jacobi–Bellman (HJB) equation is a fundamental partial differential equation in optimal control theory and dynamic programming. It provides a necessary condition for an optimal control policy for a given dynamic optimization problem.

### Context

In many control problems, we aim to find a control strategy that minimizes (or maximizes) a cost function over time.
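As a concrete reference, the following is a standard form of the HJB equation for a minimization problem; the notation here (dynamics $f$, running cost $\ell$, terminal cost $h$) is conventional and not fixed by the text above. For a system $\dot{x} = f(x, u)$ with value function $V(x, t)$:

```latex
% HJB equation: the optimal cost-to-go V satisfies this PDE,
% with the minimization over admissible controls u.
-\frac{\partial V}{\partial t}(x,t)
  = \min_{u}\Bigl\{\, \ell(x,u) + \nabla_x V(x,t)\cdot f(x,u) \,\Bigr\},
\qquad V(x,T) = h(x)
```

Solving this PDE backward in time from the terminal condition $V(x,T) = h(x)$ yields the optimal value function, and the minimizing $u$ at each $(x,t)$ gives the optimal feedback control.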
