
Optimization (optimize)

There are several classical optimization algorithms provided by SciPy in the optimize package. An overview of the module is available using help (or pydoc.help):


>>> from scipy import optimize
>>> help(optimize)

 Optimization Tools

A collection of general-purpose optimization routines.

  fmin --       Nelder-Mead Simplex algorithm
                 (uses only function calls)
  fmin_bfgs --  Quasi-Newton method (can use function and gradient)
  fmin_ncg --   Line-search Newton Conjugate Gradient (can use
                 function, gradient and hessian).
  leastsq --    Minimize the sum of squares of M equations in
                 N unknowns given a starting estimate.
  fminbound --  Bounded minimization of a scalar function.
  fsolve --     Non-linear equation solver.
The first four algorithms are unconstrained minimization algorithms (fmin: Nelder-Mead simplex, fmin_bfgs: BFGS, fmin_ncg: Newton Conjugate Gradient, and leastsq: Levenberg-Marquardt). The fifth algorithm, fminbound, only works for functions of a single variable but allows minimization over a specified interval. The last algorithm, fsolve, actually finds the roots of a general function of possibly many variables. It is included in the optimization package because at the (non-boundary) extreme points of a function the gradient is equal to zero.
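As a brief illustration of the calling convention these routines share, consider the following minimal sketch (it is not part of the original tutorial; the objective function, the cubic, and the starting guesses are invented for demonstration). Each routine takes a Python callable and an initial estimate; the convergence report that fmin prints is omitted here:

>>> from scipy import optimize
>>> def f(x):
...     # a paraboloid whose unique minimum is at x = (2, 3)
...     return (x[0] - 2)**2 + (x[1] - 3)**2
>>> xopt = optimize.fmin(f, [0.0, 0.0])    # Nelder-Mead: uses only function calls
>>> # xopt is approximately array([2., 3.])
>>> root = optimize.fsolve(lambda x: x**3 - 1.0, 0.5)    # solves g(x) = 0
>>> # root is approximately 1.0, the real root of x**3 - 1

Note that fsolve is driven by the same kind of interface as the minimizers: a callable and a starting estimate.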


