# Nonlinear Optimization Examples

## Global Versus Local Optima

All the IML optimization algorithms converge toward local rather than global optima. The smallest local minimum of an objective function is called the global minimum, and the largest local maximum of an objective function is called the global maximum. Hence, the subroutines can occasionally fail to find the global optimum. Suppose you have the function f(x) = (1/27)(3x₁⁴ - 28x₁³ + 84x₁² - 96x₁ + 64) + x₂², which has a local minimum at f(1,0) = 1 and a global minimum at the point f(4,0) = 0.

The following statements use two calls of the NLPTR subroutine to minimize the preceding function. The first call specifies the initial point xa = (0.5, 1.5), and the second call specifies the initial point xb = (3, -1). The first call finds the local optimum x* = (1,0), and the second call finds the global optimum x* = (4,0).

```
proc iml;
start F_GLOBAL(x);
   f = (3*x[1]**4 - 28*x[1]**3 + 84*x[1]**2 - 96*x[1] + 64)/27 + x[2]**2;
   return(f);
finish F_GLOBAL;

xa   = {.5 1.5};   /* starting point for the first call           */
xb   = {3 -1};     /* starting point for the second call          */
optn = {0 2};      /* optn[1]=0: minimize; optn[2]=2: print level */
call nlptr(rca, xra, "F_GLOBAL", xa, optn);
call nlptr(rcb, xrb, "F_GLOBAL", xb, optn);
print xra xrb;
```

One way to find out whether the objective function has more than one local optimum is to run various optimizations with a pattern of different starting points.
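Such a pattern of starting points can be sketched as follows; the grid of x₁ values below is arbitrary, chosen only to bracket the region of interest:

```
proc iml;
start F_GLOBAL(x);
   f = (3*x[1]**4 - 28*x[1]**3 + 84*x[1]**2 - 96*x[1] + 64)/27 + x[2]**2;
   return(f);
finish F_GLOBAL;

x1grid  = {0 1 2 3 4 5};            /* arbitrary grid of x1 starting values */
optn    = {0 0};                    /* minimize; suppress iteration output  */
results = j(ncol(x1grid), 3, .);    /* one row per start: x1, x2, f         */
do i = 1 to ncol(x1grid);
   x0 = x1grid[i] || 0;             /* start at (x1grid[i], 0)              */
   call nlptr(rc, xr, "F_GLOBAL", x0, optn);
   results[i,] = xr || F_GLOBAL(xr);
end;
print results[colname={"x1" "x2" "f"}];
```

If the printed function values differ across starting points, the objective function has more than one local optimum, and the row with the smallest value of f is the best candidate for the global minimum.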

For a more mathematical definition of optimality, refer to the Kuhn-Tucker theorem in standard optimization literature. In nonmathematical language, a local minimizer x* satisfies the following conditions:

• There exists a small, feasible neighborhood of x* that does not contain any point x with a smaller function value f(x) < f(x*).
• The vector of first derivatives (gradient) of the objective function f (projected toward the feasible region) at the point x* is zero.
• The matrix of second derivatives (Hessian matrix) of the objective function f (projected toward the feasible region) at the point x* is positive definite.

A local maximizer has the largest value in a feasible neighborhood and a negative definite Hessian.
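In the unconstrained case, you can check these conditions numerically at a candidate point with the NLPFDD subroutine, which computes finite-difference approximations of the function, gradient, and Hessian. The sketch below examines the global minimizer x* = (4,0) of the example function; a near-zero gradient and all-positive Hessian eigenvalues are consistent with a local minimum:

```
proc iml;
start F_GLOBAL(x);
   f = (3*x[1]**4 - 28*x[1]**3 + 84*x[1]**2 - 96*x[1] + 64)/27 + x[2]**2;
   return(f);
finish F_GLOBAL;

x = {4 0};                            /* candidate local minimizer          */
call nlpfdd(f, g, h, "F_GLOBAL", x);  /* finite-difference f, gradient g,   */
                                      /* and Hessian h at x                 */
eval = eigval(h);                     /* all eigenvalues > 0 implies the    */
                                      /* Hessian is positive definite       */
print f, g, h, eval;
```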

The iterative optimization algorithm terminates at the point xₜ, which should be in a small neighborhood (in terms of a user-specified termination criterion) of a local optimizer x*. If the point xₜ is located on one or more active boundary or general linear constraints, the local optimization conditions are valid only for the feasible region. That is,

• the projected gradient, Zᵀg(xₜ), must be sufficiently small
• the projected Hessian, ZᵀG(xₜ)Z, must be positive definite for minimization problems or negative definite for maximization problems

If there are n active constraints at the point xₜ, where n is the number of parameters, the null space Z has zero columns and the projected Hessian has zero rows and columns. A matrix with zero rows and columns is considered positive as well as negative definite.
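A minimal sketch of how such a projection can be formed: the HOMOGEN function returns a basis Z for the null space of the active constraint matrix, and the projected gradient is Zᵀg. The constraint row and gradient vector below are hypothetical, chosen only for illustration:

```
proc iml;
A  = {1 1};          /* hypothetical active constraint row: x1 + x2 = b */
Z  = homogen(A);     /* columns of Z span the null space of A           */
g  = {2, -2};        /* hypothetical gradient g(x_t) at the point x_t   */
pg = Z` * g;         /* projected gradient Z`g(x_t)                     */
print Z, pg;
```

The projected Hessian is formed the same way, as Z` * G * Z, where G is the Hessian at xₜ.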
