*Nonlinear Optimization Examples*

## Kuhn-Tucker Conditions

The nonlinear programming (NLP) problem with one objective
function *f* and *m* constraint functions *c*_{i}, which
are continuously differentiable, is defined as follows:

$$
\begin{aligned}
\min_{x \in \mathbb{R}^n} \; & f(x) \\
\text{subject to} \; & c_i(x) = 0, && i = 1, \ldots, m_e \\
& c_i(x) \ge 0, && i = m_e + 1, \ldots, m
\end{aligned}
$$

In the preceding notation, *n* is the dimension of the argument
*x* of the function *f*(*x*), and *m*_{e} is the number of equality constraints.
The linear combination of objective and constraint functions

$$
L(x, \lambda) = f(x) - \sum_{i=1}^{m} \lambda_i \, c_i(x)
$$

is the *Lagrange function,* and the coefficients $\lambda_i$
are the *Lagrange multipliers.*
If the functions *f* and *c*_{i} are twice differentiable, the
point *x*^{*} is an *isolated local minimizer* of the NLP
problem if there exists a vector $\lambda^{*} = (\lambda_1^{*}, \ldots, \lambda_m^{*})$ that meets the following conditions:

- *Kuhn-Tucker conditions:*

$$
\begin{aligned}
& c_i(x^{*}) = 0, && i = 1, \ldots, m_e \\
& c_i(x^{*}) \ge 0, \quad \lambda_i^{*} \ge 0, \quad \lambda_i^{*} \, c_i(x^{*}) = 0, && i = m_e + 1, \ldots, m \\
& \nabla_x L(x^{*}, \lambda^{*}) = 0
\end{aligned}
$$

- *Second-order condition:* each nonzero vector $y$ with

$$
y^{T} \nabla_x c_i(x^{*}) = 0 \quad \text{for } i = 1, \ldots, m_e \text{ and for all active constraints with } \lambda_i^{*} > 0
$$

satisfies

$$
y^{T} \nabla_x^{2} L(x^{*}, \lambda^{*}) \, y > 0
$$

In practice, you cannot expect the constraint
functions *c*_{i}(*x*^{*}) to vanish within machine
precision, and determining the set of active
constraints at the solution *x*^{*} may not be simple.
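One common workaround, sketched below in plain Python (not SAS/IML code; the function name and the tolerance value are assumptions for this example), is to treat an inequality constraint as active whenever its computed value lies within a feasibility tolerance of zero:

```python
# Hypothetical sketch: since c_i(x*) rarely vanishes exactly in floating
# point, classify a constraint as active when its value at the computed
# solution is within a tolerance of zero.

def active_set(c_values, m_e, tol=1e-6):
    """Return 0-based indices of constraints treated as active.

    c_values : constraint values c_i(x*) at the computed solution
    m_e      : number of equality constraints (the first m_e entries)
    tol      : feasibility tolerance (an assumed, tunable value)
    """
    active = []
    for i, ci in enumerate(c_values):
        if i < m_e:
            active.append(i)       # equality constraints are always active
        elif abs(ci) <= tol:
            active.append(i)       # inequality constraint on its boundary
    return active

# One equality constraint and two inequalities; the last inequality is
# slack (value 0.3), so only the first two constraints are active.
print(active_set([1e-9, 5e-8, 0.3], m_e=1))  # -> [0, 1]
```

The choice of tolerance matters: too tight a tolerance misses genuinely active constraints, while too loose a tolerance includes slack ones.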

Copyright © 1999 by SAS Institute Inc., Cary, NC, USA. All rights reserved.