*Details of the OPTEX Procedure*

## Memory and Run-Time Considerations

The OPTEX procedure provides a computationally intensive approach to
designing an experiment, and therefore some finesse is called for to
make the most efficient use of computer resources.
The OPTEX procedure must retain the entire set of candidate
points in memory. This is necessary because all of the search
algorithms access these points repeatedly. If this requires more
memory than is available, consider using knowledge of the problem to
reduce the set of candidate points. For example, for first- or
second-order models, it is usually adequate to restrict the
candidates to just the center and the edges of the experimental
region or perhaps an even smaller set; see the introductory
examples
"Handling Many Variables" and "Constructing a Mixture-Process Design".
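As a rough illustration of this kind of reduction (a Python sketch, not PROC OPTEX syntax; the coding of the factors to [-1, 1] and the function names are assumptions made for the example), the following compares the full three-level grid with a candidate set restricted to the center and the edges of a cubic region:

```python
# Sketch: shrinking a candidate set for a cubic region in k coded
# factors.  Factors are assumed coded to [-1, 1]; names are hypothetical.
from itertools import product

def full_grid(k):
    """All 3**k points of the full three-level factorial grid."""
    return list(product((-1, 0, 1), repeat=k))

def center_and_edges(k):
    """Center point plus the edges of the cube: the 2**k vertices and
    the k * 2**(k-1) edge midpoints (one coordinate 0, the rest +/-1)."""
    vertices = list(product((-1, 1), repeat=k))
    midpoints = [signs[:i] + (0,) + signs[i:]
                 for i in range(k)
                 for signs in product((-1, 1), repeat=k - 1)]
    return vertices + midpoints + [(0,) * k]

print(len(full_grid(6)))         # 729 candidate points
print(len(center_and_edges(6)))  # 257 candidate points
```

For six factors, the restricted set holds roughly a third as many points as the full grid, with the gap widening as the number of factors grows.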

The distance-based criteria (CRITERION=U and CRITERION=S) also require
repeated access to the distance between candidate points. The
procedure will try to fit the matrix of these distances in memory; if
it cannot, it will recompute them as needed, but this will cause the
search to be dramatically slower.
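To see why this matrix can exhaust memory, note that storing all pairwise distances among *N*_{C} candidates requires on the order of *N*_{C}² entries. A back-of-envelope sketch (assuming one 8-byte double per entry; this is an illustration, not a description of OPTEX internals):

```python
import math

def distance_matrix_bytes(n_c, bytes_per_entry=8):
    """Memory for a full n_c x n_c matrix of pairwise distances.
    The 8-byte-double entry size is an assumption for illustration."""
    return n_c * n_c * bytes_per_entry

def distance(p, q):
    """Fallback when the matrix does not fit: recompute the Euclidean
    distance on demand (much slower when repeated over a whole search)."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

print(distance_matrix_bytes(5_000) / 1e6)   # 200.0 (MB)
print(distance_matrix_bytes(50_000) / 1e9)  # 20.0 (GB)
print(distance((0, 0), (3, 4)))             # 5.0
```

Because the matrix is symmetric, only the upper triangle is strictly needed, which roughly halves these figures; the quadratic growth in *N*_{C} remains either way.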

The run time of each search algorithm depends primarily on
*N*_{D}, the size of the target design, and on *N*_{C}, the
number of candidate points. For a given model, the run times of the
sequential, exchange, and DETMAX algorithms all grow roughly
linearly in *N*_{D} and *N*_{C} (that is, they are *O*(*N*_{D} + *N*_{C})).
The run times of the two simultaneous switching algorithms
(FEDOROV and M_FEDOROV) are roughly proportional to the product of
*N*_{D} and *N*_{C} (that is, they are *O*(*N*_{D}*N*_{C})).
The constant of proportionality is larger when searching for
A-optimal designs because the update formulas are more complicated
(see "Search Methods," which follows).
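These scalings can be made concrete with illustrative operation counts (the cost functions below are assumptions standing in for per-pass work, not measured OPTEX timings):

```python
def exchange_like_cost(n_d, n_c):
    """Sequential/exchange/DETMAX: roughly O(N_D + N_C) work per pass.
    Illustrative count only; constants of proportionality are ignored."""
    return n_d + n_c

def fedorov_like_cost(n_d, n_c):
    """FEDOROV/M_FEDOROV: roughly O(N_D * N_C) work per pass."""
    return n_d * n_c

n_d = 30
for n_c in (100, 1_000, 10_000):
    ratio = fedorov_like_cost(n_d, n_c) / exchange_like_cost(n_d, n_c)
    print(f"N_C={n_c:>6}: Fedorov-type work is ~{ratio:.0f}x exchange-type work")
```

As *N*_{C} grows large relative to *N*_{D}, the ratio approaches *N*_{D} itself, so the switching algorithms slow down by roughly a factor of the design size on candidate sets much larger than the design.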

For problems where either *N*_{D} or *N*_{C} is large, it is
a good idea to make a few test runs with a faster algorithm and a
small number of tries before attempting to use one of the slower and
more reliable search algorithms. For most problems, the efficiency
of a design found by a faster algorithm will be within one or two
percent of that for the best possible design, and this is usually
sufficient if it appears that searching with a slower algorithm is
infeasible.

Copyright © 1999 by SAS Institute Inc., Cary, NC, USA. All rights reserved.