
Nelder-Mead Simplex algorithm (optimize.fmin)

The simplex algorithm is probably the simplest way to minimize a fairly well-behaved function. It requires only function evaluations and is a good choice for simple minimization problems. However, because it does not use any gradient evaluations, it may take longer to find the minimum. To demonstrate the minimization routine, consider the problem of minimizing the Rosenbrock function of \( N \) variables:

\begin{displaymath}
f\left(\mathbf{x}\right) = \sum_{i=1}^{N-1} 100\left(x_{i}-x_{i-1}^{2}\right)^{2} + \left(1-x_{i-1}\right)^{2}.
\end{displaymath}

The minimum value of this function is 0, which is achieved when \( x_{i}=1 \) for all \( i \). This minimum can be found using the fmin routine, as shown in the example below:


>>> from scipy.optimize import fmin
>>> def rosen(x):  # The Rosenbrock function
...     return sum(100.0*(x[1:]-x[:-1]**2.0)**2.0 + (1-x[:-1])**2.0)

>>> x0 = [1.3, 0.7, 0.8, 1.9, 1.2]
>>> xopt = fmin(rosen, x0)
Optimization terminated successfully.
         Current function value: 0.000000
         Iterations: 516
         Function evaluations: 825

>>> print(xopt)
[ 1.  1.  1.  1.  1.]
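By default, fmin uses fairly loose convergence tolerances. As a minimal sketch of the routine's optional keyword arguments (xtol, ftol, full_output, and disp; see the fmin docstring for the full set), the call below tightens both tolerances and requests extra diagnostic output. Tighter tolerances generally cost additional function evaluations:

>>> # full_output=True also returns the final function value, the
>>> # iteration and evaluation counts, and a warnflag (0 on success);
>>> # disp=False suppresses the printed convergence message
>>> xopt, fopt, niter, funcalls, warnflag = fmin(rosen, x0,
...     xtol=1e-8, ftol=1e-8, full_output=True, disp=False)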


