
Introduction to Optimization

The NLP procedure (**N**on**L**inear **P**rogramming)
offers a set of optimization techniques for minimizing or
maximizing a continuous nonlinear function *f*(*x*) of *n*
decision variables, *x* = (*x _{1}*, ... , *x _{n}*),
subject to boundary constraints and to general linear and
nonlinear constraints.

The NLP procedure provides a number of algorithms for solving
this problem that take advantage of special structure in
the objective function, the constraints, or both.
One example is the **quadratic programming problem**, in which a quadratic objective is minimized subject to linear constraints:

*f*(*x*) = (1/2) *x*^{T}*Gx* + *g*^{T}*x*

where *G* is a symmetric *n* × *n* matrix and *g* is a vector of length *n*.

Another example is the **least-squares problem**, in which the objective is half the sum of squares of *m* functions:

*f*(*x*) = (1/2){*f _{1}*^{2}(*x*) + ... + *f _{m}*^{2}(*x*)}

The following optimization techniques are supported in PROC NLP:

- Quadratic Active Set Technique
- Trust-Region Method
- Newton-Raphson Method With Line Search
- Newton-Raphson Method With Ridging
- Quasi-Newton Methods
- Double-Dogleg Method
- Conjugate Gradient Methods
- Nelder-Mead Simplex Method
- Levenberg-Marquardt Method
- Hybrid Quasi-Newton Methods

These optimization techniques require
a continuous objective function *f*, and all but
one (NMSIMP, the Nelder-Mead simplex method) require continuous
first-order derivatives of the objective function *f*.
Some of the techniques also require
continuous second-order derivatives.
There are three ways to compute derivatives in PROC NLP:

- analytically (using a special derivative compiler), the default method
- via finite difference approximations
- via user-supplied exact or approximate numerical functions
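As a language-neutral illustration of the second option, a central-difference gradient approximation can be sketched in a few lines of Python (the objective and the step size `h` are illustrative choices, not PROC NLP internals):

```python
# Central finite-difference gradient: df/dx_i ~ (f(x + h*e_i) - f(x - h*e_i)) / (2h).
# The objective is the Rosenbrock-style sum of squares used later in this chapter.

def objective(x):
    f1 = 10.0 * (x[1] - x[0] * x[0])
    f2 = 1.0 - x[0]
    return 0.5 * (f1 * f1 + f2 * f2)

def fd_gradient(f, x, h=1e-6):
    grad = []
    for i in range(len(x)):
        xp = list(x)
        xm = list(x)
        xp[i] += h
        xm[i] -= h
        grad.append((f(xp) - f(xm)) / (2.0 * h))
    return grad

g = fd_gradient(objective, [0.0, 0.0])
# The analytic gradient at (0, 0) is (-1, 0); the approximation is close.
```

PROC NLP selects its difference intervals automatically; the fixed `h` here only illustrates the basic approximation.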

Nonlinear programs can be input into the procedure in various ways. The objective, constraint, and derivative functions are specified using the programming statements of PROC NLP. In addition, information in SAS data sets can be used to define the structure of objectives and constraints, and to specify constants used in objectives, constraints, and derivatives.

PROC NLP uses data sets to input various pieces of information.

- The DATA= data set enables you to specify data shared by all functions involved in a least squares problem.
- The INQUAD= data set contains the arrays appearing in a quadratic programming problem.
- The INVAR= data set specifies initial values for the decision variables, the values of constants that are referred to in the program statements, and simple boundary and general linear constraints.
- The MODEL= data set specifies a model (functions, constraints, derivatives) saved at a previous execution of the NLP procedure.

PROC NLP uses data sets to output various results.

- The OUTVAR= data set saves the values of the decision variables, the derivatives, the solution, and the covariance matrix at the solution.
- The OUT= output data set contains variables generated in the program statements defining the objective function as well as selected variables of the DATA= input data set, if available.
- The OUTMODEL= data set saves the programming statements. It can be used to input a model in the MODEL= input data set.

**Figure 1.5:** Input and Output Data Sets in PROC NLP

As an alternative to supplying data in SAS data sets, some or all data for the model can be specified using SAS programming statements. These are similar to those used in the SAS DATA step.

Consider the simple example of minimizing the Rosenbrock function (Rosenbrock, 1960):

*f*(*x*) = (1/2){100 (*x _{2}* - *x _{1}*^{2})^{2} + (1 - *x _{1}*)^{2}} = (1/2){*f _{1}*^{2}(*x*) + *f _{2}*^{2}(*x*)}

where *f _{1}*(*x*) = 10(*x _{2}* - *x _{1}*^{2}), *f _{2}*(*x*) = 1 - *x _{1}*, and *x* = (*x _{1}*, *x _{2}*). The minimum function value is *f* = 0 at the point (1,1).

The following PROC NLP run can be used to solve this problem:

```
proc nlp;
   min f;
   decvar x1 x2;
   f1 = 10 * (x2 - x1 * x1);
   f2 = 1 - x1;
   f = .5 * (f1 * f1 + f2 * f2);
run;
```

The MIN statement identifies the symbol *f* that characterizes
the objective function in terms of *f*1 and *f*2, and the
DECVAR statement names the decision variables *X*1 and *X*2.
Because no optimization algorithm is specified explicitly (with the TECH= option),
PROC NLP uses the Newton-Raphson method with ridging,
the default algorithm when there are no constraints.
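The ridging idea itself can be sketched in plain Python (a hedged illustration of the general technique applied to the example's objective; PROC NLP's actual rules for updating the ridge factor are more elaborate):

```python
# Newton-Raphson with ridging on the Rosenbrock-style objective from the example.
# Each step solves (H + lam*I) d = -g and increases the ridge value lam
# until the step strictly reduces f.

def f(x1, x2):
    return 0.5 * ((10.0 * (x2 - x1 * x1)) ** 2 + (1.0 - x1) ** 2)

def grad(x1, x2):
    return (-200.0 * x1 * (x2 - x1 * x1) - (1.0 - x1),
            100.0 * (x2 - x1 * x1))

def hess(x1, x2):
    # Analytic second derivatives of f; H is symmetric, so three entries suffice.
    return (-200.0 * (x2 - x1 * x1) + 400.0 * x1 * x1 + 1.0,  # d2f/dx1dx1
            -200.0 * x1,                                       # d2f/dx1dx2
            100.0)                                             # d2f/dx2dx2

def ridged_step(x1, x2):
    g1, g2 = grad(x1, x2)
    h11, h12, h22 = hess(x1, x2)
    lam = 0.0
    for _ in range(60):  # ridge up at most 60 times, then give up
        a11, a22 = h11 + lam, h22 + lam
        det = a11 * a22 - h12 * h12
        if det > 0.0:
            d1 = (-g1 * a22 + g2 * h12) / det
            d2 = (-g2 * a11 + g1 * h12) / det
            if f(x1 + d1, x2 + d2) < f(x1, x2):
                return x1 + d1, x2 + d2
        lam = 10.0 * lam + 1.0
    return x1, x2  # no improving step found

x1, x2 = -1.2, 1.0
for _ in range(20):
    if f(x1, x2) < 1e-12:
        break
    x1, x2 = ridged_step(x1, x2)
```

With `lam = 0` this is a pure Newton step; a large `lam` turns the step into a short steepest-descent move, which is what makes ridging robust far from the solution.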

A better way to solve this problem is to take advantage of the fact
that *f* is a sum of squares of *f*1 and *f*2 and to treat it as a
least-squares problem.
Using the LSQ statement instead of the MIN statement tells
the procedure that this is a least-squares problem, which results
in the use of
one of the specialized algorithms for solving least-squares
problems (for example, Levenberg-Marquardt).

```
proc nlp;
   lsq f1 f2;
   decvar x1 x2;
   f1 = 10 * (x2 - x1 * x1);
   f2 = 1 - x1;
run;
```

The LSQ statement results in the minimization of a function that is the sum of squares of functions that appear in the LSQ statement.

The least-squares specification is preferred because it enables the procedure to exploit the structure in the problem for numeric stability and performance.
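To see what that structure buys, here is a minimal Gauss-Newton sketch in Python for the example's two residuals (illustrative only; Levenberg-Marquardt, which PROC NLP may select, adds a ridge term and step control to the same normal-equations step):

```python
# Gauss-Newton on the example's residuals: f1 = 10*(x2 - x1^2), f2 = 1 - x1.
# Each iteration solves the 2x2 normal equations (J^T J) d = -J^T r,
# where J is the Jacobian of the residual vector r.

def residuals(x1, x2):
    return [10.0 * (x2 - x1 * x1), 1.0 - x1]

def jacobian(x1, x2):
    # Row i holds the partial derivatives of residual i w.r.t. (x1, x2).
    return [[-20.0 * x1, 10.0],
            [-1.0, 0.0]]

def gauss_newton(x1, x2, iters=10):
    for _ in range(iters):
        r = residuals(x1, x2)
        J = jacobian(x1, x2)
        # Normal equations A d = b with A = J^T J (symmetric 2x2), b = -J^T r.
        a11 = J[0][0] ** 2 + J[1][0] ** 2
        a12 = J[0][0] * J[0][1] + J[1][0] * J[1][1]
        a22 = J[0][1] ** 2 + J[1][1] ** 2
        b1 = -(J[0][0] * r[0] + J[1][0] * r[1])
        b2 = -(J[0][1] * r[0] + J[1][1] * r[1])
        det = a11 * a22 - a12 * a12  # equals det(J)^2 = 100 for this problem
        x1 += (b1 * a22 - b2 * a12) / det
        x2 += (a11 * b2 - a12 * b1) / det
    return x1, x2

x1, x2 = gauss_newton(-1.2, 1.0)
# Converges to the minimizer (1, 1), where both residuals vanish.
```

Because the objective is a zero-residual sum of squares, the Gauss-Newton step is exact near the solution; this is the structure that least-squares algorithms exploit.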

Several other NLP statements can be used to specify additional parts of the model, such as bounds on the variable values and linear and nonlinear constraints. The following is an example of a problem with bounds and with linear and nonlinear constraints:

```
proc nlp tech=QUANEW;
   min f;
   decvar x1 x2;
   bounds x1 - x2 <= .5;
   lincon x1 + x2 <= .6;
   nlincon c1 >= 0;
   c1 = x1 * x1 - 2 * x2;
   f1 = 10 * (x2 - x1 * x1);
   f2 = 1 - x1;
   f = .5 * (f1 * f1 + f2 * f2);
run;
```
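For comparison outside SAS, the same bounded and constrained problem can be sketched with SciPy's `minimize` (a hedged illustration, not PROC NLP; SLSQP is used simply because it accepts bounds and general inequality constraints, and the starting point is an arbitrary feasible choice):

```python
from scipy.optimize import minimize

def objective(x):
    f1 = 10.0 * (x[1] - x[0] * x[0])
    f2 = 1.0 - x[0]
    return 0.5 * (f1 * f1 + f2 * f2)

constraints = [
    # LINCON  x1 + x2 <= 0.6
    {"type": "ineq", "fun": lambda x: 0.6 - (x[0] + x[1])},
    # NLINCON c1 >= 0  with  c1 = x1^2 - 2*x2
    {"type": "ineq", "fun": lambda x: x[0] * x[0] - 2.0 * x[1]},
]

# BOUNDS x1 - x2 <= .5 bounds both variables above by 0.5.
result = minimize(objective, x0=[-1.0, -1.0], method="SLSQP",
                  bounds=[(None, 0.5), (None, 0.5)],
                  constraints=constraints)
```

At the solution the bound on *x*1 and the nonlinear constraint are the binding restrictions that keep the iterates away from the unconstrained minimizer (1,1).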


Copyright © 1999 by SAS Institute Inc., Cary, NC, USA. All rights reserved.