OurBigBook Wikipedia Bot Documentation
Graduated optimization is a computational technique used in optimization and machine learning, particularly for solving problems that are non-convex or have multiple local minima. The general idea is to gradually transform a difficult optimization problem into a simpler, smoother one that can be solved easily, and then use each solution as the starting point for a progressively less-smoothed (and therefore harder) version of the original problem.
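A minimal sketch of the idea in Python. The objective, the Gaussian-smoothing schedule, and the gradient-descent step sizes below are illustrative choices, not from the source: the non-convex function f(x) = x²/20 − cos(x) has many local minima, but its Gaussian smoothing has the closed form x²/20 − exp(−σ²/2)·cos(x) (up to a constant), so large σ damps the oscillatory term and leaves a nearly convex problem.

```python
import math

def grad_f(x, sigma=0.0):
    # Gradient of the smoothed objective
    #   f_sigma(x) = x**2/20 - exp(-sigma**2/2) * cos(x),
    # the Gaussian smoothing (bandwidth sigma) of f(x) = x**2/20 - cos(x).
    # sigma = 0 recovers the original, highly non-convex objective.
    return x / 10.0 + math.exp(-sigma**2 / 2.0) * math.sin(x)

def gradient_descent(x, sigma, steps=500, lr=0.1):
    # Plain fixed-step gradient descent on the smoothed objective.
    for _ in range(steps):
        x -= lr * grad_f(x, sigma)
    return x

def graduated_minimize(x0, sigmas=(3.0, 2.0, 1.0, 0.5, 0.0)):
    # Solve a sequence of progressively less-smoothed problems,
    # warm-starting each one from the previous solution.
    x = x0
    for sigma in sigmas:
        x = gradient_descent(x, sigma)
    return x

x_plain = gradient_descent(8.0, sigma=0.0)  # stuck in a local minimum near x ≈ 5.7
x_grad = graduated_minimize(8.0)            # reaches the global minimum at x = 0
```

Started from x = 8, plain gradient descent on the raw objective falls into the nearest local minimum, while the graduated schedule first solves the heavily smoothed (nearly convex) problem and then tracks its minimizer down to the global minimum of the original function as the smoothing is removed.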
