The method which requires the fewest function calls and is therefore
often the fastest method to minimize functions of many variables is
fmin_ncg. This method is a modified Newton's method and
uses a conjugate gradient algorithm to (approximately) invert the
local Hessian. Newton's method is based on fitting the function locally
to a quadratic form:

$$f\left(\mathbf{x}\right)\approx f\left(\mathbf{x}_{0}\right)+\nabla f\left(\mathbf{x}_{0}\right)\cdot\left(\mathbf{x}-\mathbf{x}_{0}\right)+\frac{1}{2}\left(\mathbf{x}-\mathbf{x}_{0}\right)^{T}\mathbf{H}\left(\mathbf{x}_{0}\right)\left(\mathbf{x}-\mathbf{x}_{0}\right),$$

where $\mathbf{H}\left(\mathbf{x}_{0}\right)$ is a matrix of second
derivatives (the Hessian). If the Hessian is positive definite, then
the local minimum of this function can be found by setting the gradient
of the quadratic form to zero, resulting in

$$\mathbf{x}_{\mathrm{opt}}=\mathbf{x}_{0}-\mathbf{H}^{-1}\nabla f.$$
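To make this step concrete, the following sketch computes a single Newton
direction for the Rosenbrock function by solving
$\mathbf{H}\,\Delta\mathbf{x}=-\nabla f$ with a conjugate-gradient solver
instead of inverting $\mathbf{H}$. This is only an illustration, not
fmin_ncg's internal code, and it assumes a SciPy version that provides the
rosen, rosen_der, and rosen_hess helpers:

  import numpy as np
  from scipy.optimize import rosen, rosen_der, rosen_hess
  from scipy.sparse.linalg import cg

  x0 = np.array([1.3, 0.7, 0.8, 1.9, 1.2])  # current iterate

  g = rosen_der(x0)    # gradient of the Rosenbrock function at x0
  H = rosen_hess(x0)   # Hessian matrix at x0

  # Solve H dx = -g iteratively; H is never inverted explicitly.
  dx, info = cg(H, -g)

  x1 = x0 + dx         # one full Newton step (info == 0 means CG converged)
  print(rosen(x0), rosen(x1))  # compare function values before and after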
The inverse of the Hessian is evaluated using the conjugate-gradient
method. An example of employing this method to minimize the Rosenbrock
function is given below. To take full advantage of the Newton-CG method,
a function which computes the Hessian must be provided. The Hessian
matrix itself does not need to be constructed; only the product of the
Hessian with an arbitrary vector needs to be available to the
minimization routine. As a result, the user can provide either a
function to compute the Hessian matrix or a function to compute the
product of the Hessian with an arbitrary vector.
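For instance (a minimal sketch of both calling conventions, using
SciPy's built-in rosen helpers in place of hand-written derivative
functions, and assuming a SciPy version that provides them):

  import numpy as np
  from scipy.optimize import (fmin_ncg, rosen, rosen_der,
                              rosen_hess, rosen_hess_prod)

  x0 = np.array([1.3, 0.7, 0.8, 1.9, 1.2])   # starting guess

  # Variant 1: supply a function that returns the full Hessian matrix.
  xopt = fmin_ncg(rosen, x0, fprime=rosen_der, fhess=rosen_hess,
                  avextol=1e-8, disp=0)

  # Variant 2: supply only the Hessian-vector product (x, p) -> H(x) @ p,
  # so the full matrix is never constructed.
  xopt = fmin_ncg(rosen, x0, fprime=rosen_der, fhess_p=rosen_hess_prod,
                  avextol=1e-8, disp=0)

  print(xopt)   # both variants converge near [1. 1. 1. 1. 1.]

The Hessian-product form is generally preferable for large problems,
since it requires only matrix-vector products rather than storage of the
full matrix.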