The simplex algorithm is probably the simplest way to minimize a fairly
well-behaved function. It requires only function evaluations and is a
good choice for simple minimization problems. However, because it does
not use any gradient evaluations, it may take longer to find the
minimum. To demonstrate the minimization function, consider the problem
of minimizing the Rosenbrock function of N variables:
>>> from scipy.optimize import fmin
>>> def rosen(x):
...     """The Rosenbrock function"""
...     return sum(100.0*(x[1:]-x[:-1]**2.0)**2.0 + (1-x[:-1])**2.0)
...
>>> x0 = [1.3, 0.7, 0.8, 1.9, 1.2]
>>> xopt = fmin(rosen, x0)
Optimization terminated successfully.
         Current function value: 0.000000
         Iterations: 516
         Function evaluations: 825
>>> print(xopt)
[ 1.  1.  1.  1.  1.]
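In more recent SciPy versions the same Nelder-Mead simplex method is also available through the unified scipy.optimize.minimize interface. A minimal sketch, assuming a SciPy version that provides minimize and the Nelder-Mead xatol option:

```python
import numpy as np
from scipy.optimize import minimize

def rosen(x):
    """The Rosenbrock function."""
    return sum(100.0*(x[1:]-x[:-1]**2.0)**2.0 + (1-x[:-1])**2.0)

x0 = np.array([1.3, 0.7, 0.8, 1.9, 1.2])
# method='Nelder-Mead' selects the simplex algorithm; xatol tightens
# the absolute tolerance on the parameters at convergence.
res = minimize(rosen, x0, method='Nelder-Mead', options={'xatol': 1e-8})
print(res.x)  # close to [1. 1. 1. 1. 1.], the known minimum
```

The result object also carries the number of iterations and function evaluations (res.nit, res.nfev), which fmin only prints to the screen.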