
Broyden-Fletcher-Goldfarb-Shanno algorithm (optimize.fmin_bfgs)

In order to converge more quickly to the solution, this routine uses the gradient of the objective function. If the gradient is not supplied by the user, it is estimated using first differences. The Broyden-Fletcher-Goldfarb-Shanno (BFGS) method typically requires fewer function calls than the simplex algorithm; however, unless the gradient is provided by the user, the savings in running time will not be significant.
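For concreteness, the following sketch shows one way such a first-difference estimate can be computed. The helper name approx_gradient and the step size eps are illustrative only, not part of scipy (which provides a comparable routine, scipy.optimize.approx_fprime):

>>> import numpy as np
>>> def approx_gradient(f, x, eps=1.0e-8):
        # forward first-difference estimate of the gradient of f at x
        x = np.asarray(x, dtype=float)
        fx = f(x)
        grad = np.zeros_like(x)
        for i in range(len(x)):
            xp = x.copy()
            xp[i] = xp[i] + eps   # perturb one coordinate at a time
            grad[i] = (f(xp) - fx) / eps
        return grad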

To demonstrate this algorithm, the Rosenbrock function is again used. The gradient of the Rosenbrock function is the vector:

\begin{eqnarray*}
\frac{\partial f}{\partial x_{j}} & = & \sum _{i=1}^{N}200\left( x_{i}-x_{i-1}^{2}\right) \left( \delta _{i,j}-2x_{i-1}\delta _{i-1,j}\right) -2\left( 1-x_{i-1}\right) \delta _{i-1,j}\\
 & = & 200\left( x_{j}-x_{j-1}^{2}\right) -400x_{j}\left( x_{j+1}-x_{j}^{2}\right) -2\left( 1-x_{j}\right) .
\end{eqnarray*}



This expression is valid for the interior derivatives. Special cases are

\begin{eqnarray*}
\frac{\partial f}{\partial x_{0}} & = & -400x_{0}\left( x_{1}-x_{0}^{2}\right) -2\left( 1-x_{0}\right) ,\\
\frac{\partial f}{\partial x_{N-1}} & = & 200\left( x_{N-1}-x_{N-2}^{2}\right) .
\end{eqnarray*}
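As a quick sanity check, setting $x_{j}=1$ for every $j$ makes each of these expressions vanish, confirming that the gradient is zero at the known minimum of the Rosenbrock function; for the interior formula,

\begin{eqnarray*}
200\left( 1-1^{2}\right) -400\cdot 1\cdot \left( 1-1^{2}\right) -2\left( 1-1\right) & = & 0.
\end{eqnarray*}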



A Python function that computes this gradient is constructed by the following code segment:


>>> from numpy import zeros
>>> def rosen_der(x):
        xm = x[1:-1]       # interior elements x_1 ... x_{N-2}
        xm_m1 = x[:-2]     # left neighbors   x_0 ... x_{N-3}
        xm_p1 = x[2:]      # right neighbors  x_2 ... x_{N-1}
        der = zeros(x.shape, x.dtype)
        # interior derivatives
        der[1:-1] = 200*(xm-xm_m1**2) - 400*(xm_p1 - xm**2)*xm - 2*(1-xm)
        # special cases at the two endpoints
        der[0] = -400*x[0]*(x[1]-x[0]**2) - 2*(1-x[0])
        der[-1] = 200*(x[-1]-x[-2]**2)
        return der
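It is prudent to verify an analytic gradient against a numerical estimate before handing it to an optimizer. A minimal check uses scipy.optimize.check_grad, which returns the norm of the difference between the supplied gradient and a first-difference approximation (this assumes rosen as defined in the previous section; recent scipy versions also ship it as scipy.optimize.rosen):

>>> import numpy as np
>>> from scipy.optimize import check_grad
>>> x0 = np.array([1.3, 0.7, 0.8, 1.9, 1.2])
>>> check_grad(rosen, rosen_der, x0)   # difference norm; should be close to zero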

The calling signature for the BFGS minimization algorithm is similar to fmin, with the addition of the fprime argument. The following example minimizes the Rosenbrock function using fmin_bfgs and the analytic gradient.


>>> from scipy.optimize import fmin_bfgs

>>> x0 = [1.3, 0.7, 0.8, 1.9, 1.2]
>>> xopt = fmin_bfgs(rosen, x0, fprime=rosen_der)
Optimization terminated successfully.
         Current function value: 0.000000
         Iterations: 109
         Function evaluations: 262
         Gradient evaluations: 110
>>> print(xopt)
[ 1.  1.  1.  1.  1.]
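If the fprime argument is omitted, fmin_bfgs falls back on the first-difference gradient estimate described above. The minimization still converges, but each gradient estimate then costs N additional evaluations of the objective function, so the total number of function calls grows substantially (exact counts will vary):

>>> xopt = fmin_bfgs(rosen, x0)   # gradient estimated by first differences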


