Language Reference 
Optimization Subroutines 
Conjugate Gradient Optimization Method 

Double Dogleg Optimization Method 

Nelder-Mead Simplex Optimization Method

Newton-Raphson Optimization Method

Newton-Raphson Ridge Optimization Method

(Dual) Quasi-Newton Optimization Method

Quadratic Optimization Method

Trust-Region Optimization Method

Least-Squares Subroutines
Hybrid Quasi-Newton Least-Squares Methods

Levenberg-Marquardt Least-Squares Method

Supplementary Subroutines 
Approximate Derivatives by Finite Differences 

Feasible Point Subject to Constraints 

Note: The names of the optional arguments can be used as keywords. For example, the following statements are equivalent:
call nlpnrr(rc,xr,"fun",x0,,,ter,,,"grad");
call nlpnrr(rc,xr,"fun",x0) tc=ter grd="grad";
All the optimization subroutines require at least two input arguments: the name of the module that defines the objective function and a starting point. Optional arguments can be supplied either positionally or with the keyword=argument syntax.
All the optimization subroutines return the following results: the scalar return code rc, which is positive when the subroutine terminates successfully, and the row vector xr, which contains the optimal point found.
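As a minimal sketch of the calling convention described above, assuming a hypothetical quadratic objective module fun (not from the original), a complete NLPNRR invocation might look like:

```
proc iml;
   /* hypothetical objective: quadratic with minimum at (2,-1) */
   start fun(x);
      f = (x[1] - 2)##2 + (x[2] + 1)##2;
      return (f);
   finish fun;

   x0   = {0 0};            /* starting point (required input)        */
   optn = {0 1};            /* optn[1]=0: minimize; optn[2]=1: brief  */
                            /* iteration printing                     */
   call nlpnrr(rc, xr, "fun", x0) opt=optn;
   print rc xr;             /* rc > 0 signals successful termination; */
                            /* xr holds the optimal point             */
quit;
```

Here the gradient module is omitted, so the subroutine approximates derivatives by finite differences; supplying grd="..." as in the note above avoids that cost when an analytic gradient is available.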
Copyright © 1999 by SAS Institute Inc., Cary, NC, USA. All rights reserved.