
Full Hessian example:

The Hessian of the Rosenbrock function is

\begin{eqnarray*}
H_{ij}=\frac{\partial ^{2}f}{\partial x_{i}\partial x_{j}} & = & \left( 202+1200x_{i}^{2}-400x_{i+1}\right) \delta _{i,j}-400x_{i}\delta _{i+1,j}-400x_{i-1}\delta _{i-1,j},
\end{eqnarray*}



for \( i,j\in \left[ 1,N-2\right] \), where the indices \( i,j\in \left[ 0,N-1\right] \) label the rows and columns of the \( N\times N \) matrix. The other non-zero entries of the matrix are

\begin{eqnarray*}
\frac{\partial ^{2}f}{\partial x_{0}^{2}} & = & 1200x_{0}^{2}-400x_{1}+2,\\
\frac{\partial ^{2}f}{\partial x_{0}\partial x_{1}} & = & -400x_{0},\\
\frac{\partial ^{2}f}{\partial x_{N-1}\partial x_{N-2}} & = & -400x_{N-2},\\
\frac{\partial ^{2}f}{\partial x^{2}_{N-1}} & = & 200.
\end{eqnarray*}
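These corner entries can be sanity-checked numerically. The sketch below compares each closed-form expression with a central finite-difference approximation, using SciPy's built-in rosen; the helper second_deriv is illustrative and not part of SciPy:

```python
import numpy as np
from scipy.optimize import rosen  # SciPy's built-in Rosenbrock function

def second_deriv(f, x, i, j, h):
    """Central-difference approximation of d^2 f / dx_i dx_j."""
    ei = np.zeros_like(x)
    ej = np.zeros_like(x)
    ei[i] = h
    ej[j] = h
    return (f(x + ei + ej) - f(x + ei - ej)
            - f(x - ei + ej) + f(x - ei - ej)) / (4.0 * h * h)

x = np.array([1.3, 0.7, 0.8, 1.9, 1.2])
h = 1e-4
N = len(x)

# Compare each closed-form corner entry with its finite-difference value.
print(second_deriv(rosen, x, 0, 0, h), 1200*x[0]**2 - 400*x[1] + 2)
print(second_deriv(rosen, x, 0, 1, h), -400*x[0])
print(second_deriv(rosen, x, N-1, N-2, h), -400*x[N-2])
print(second_deriv(rosen, x, N-1, N-1, h), 200.0)
```

The differences are on the order of the truncation error of the finite-difference stencil, confirming the formulas.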



For example, the Hessian when \( N=5 \) is

\begin{displaymath}
\mathbf{H}=\left[ \begin{array}{ccccc}
1200x_{0}^{2}-400x_{1}+2 & -400x_{0} & 0 & 0 & 0\\
-400x_{0} & 202+1200x_{1}^{2}-400x_{2} & -400x_{1} & 0 & 0\\
0 & -400x_{1} & 202+1200x_{2}^{2}-400x_{3} & -400x_{2} & 0\\
0 & 0 & -400x_{2} & 202+1200x_{3}^{2}-400x_{4} & -400x_{3}\\
0 & 0 & 0 & -400x_{3} & 200
\end{array}\right] .\end{displaymath}
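This displayed matrix can be built entry by entry and compared against SciPy's reference implementation rosen_hess (shipped in scipy.optimize) at a sample point; a minimal check:

```python
import numpy as np
from scipy.optimize import rosen_hess  # SciPy's reference Rosenbrock Hessian

# Build the N=5 matrix displayed above entry by entry at a sample point.
x = np.array([1.3, 0.7, 0.8, 1.9, 1.2])
H = np.array([
    [1200*x[0]**2 - 400*x[1] + 2, -400*x[0], 0, 0, 0],
    [-400*x[0], 202 + 1200*x[1]**2 - 400*x[2], -400*x[1], 0, 0],
    [0, -400*x[1], 202 + 1200*x[2]**2 - 400*x[3], -400*x[2], 0],
    [0, 0, -400*x[2], 202 + 1200*x[3]**2 - 400*x[4], -400*x[3]],
    [0, 0, 0, -400*x[3], 200.0],
])
print(np.allclose(H, rosen_hess(x)))  # True
```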

The code that computes this Hessian, together with the code that minimizes the function using fmin_ncg, is shown in the following example:


>>> import numpy as np
>>> from scipy.optimize import fmin_ncg, rosen, rosen_der
>>> def rosen_hess(x):
        x = np.asarray(x)
        # Off-diagonal bands: -400*x[i] at entries (i, i+1) and (i+1, i)
        H = np.diag(-400*x[:-1], 1) - np.diag(400*x[:-1], -1)
        diagonal = np.zeros_like(x, dtype=float)
        diagonal[0] = 1200*x[0]**2 - 400*x[1] + 2
        diagonal[-1] = 200
        diagonal[1:-1] = 202 + 1200*x[1:-1]**2 - 400*x[2:]
        H = H + np.diag(diagonal)
        return H

>>> x0 = [1.3, 0.7, 0.8, 1.9, 1.2]
>>> xopt = fmin_ncg(rosen, x0, rosen_der, fhess=rosen_hess)
Optimization terminated successfully.
         Current function value: 0.000000
         Iterations: 19
         Function evaluations: 40
         Gradient evaluations: 19
         Hessian evaluations: 19
>>> print(xopt)
[ 0.9999  0.9999  0.9998  0.9996  0.9991]
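Current SciPy versions expose the same solver through the scipy.optimize.minimize interface, and ship rosen, rosen_der, and rosen_hess as built-in helpers; a sketch of the equivalent call:

```python
import numpy as np
from scipy.optimize import minimize, rosen, rosen_der, rosen_hess

# Same minimization via the modern minimize() interface; the Newton-CG
# method accepts the gradient (jac) and Hessian (hess) callables directly.
x0 = [1.3, 0.7, 0.8, 1.9, 1.2]
res = minimize(rosen, x0, method='Newton-CG',
               jac=rosen_der, hess=rosen_hess)
print(res.x)  # all entries close to 1
```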



2001-07-27