The PatternSearch base class was conceived as part of an undergraduate honors thesis by Elizabeth Dolan under advisor Virginia Torczon in the Computer Science Department of the College of William and Mary. The project required access to a variety of pattern search algorithms that could be easily tested and compared. The base class is designed not only to support the pattern search algorithms that follow but also to provide leeway for the greatest possible variety of pattern searches. The base class includes routines for gathering user-defined information, for manipulating an explicitly stored pattern matrix, for initiating the general algorithm, and for providing basic data about the search and results. The base class is far from complete; in fact, it cannot even be instantiated on its own. Its sole purpose is to provide a foundation for child classes.
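The overall shape of such a base class can be sketched as follows. This is a hypothetical illustration (the names `PatternSearchSketch`, `DoNothingSearch`, and `ExploratoryMoves` are assumptions, not the library's actual interface): a pure virtual method prevents direct instantiation, while common bookkeeping lives in the base.

```cpp
#include <vector>

// Hypothetical sketch of a pattern-search base class. It cannot be
// instantiated directly because ExploratoryMoves() is pure virtual;
// only child classes that supply a search strategy can be created.
class PatternSearchSketch {
public:
    PatternSearchSketch(int dimensions) : n(dimensions), functionCalls(0) {}
    virtual ~PatternSearchSketch() {}

    // Child classes supply the actual search strategy here.
    virtual void ExploratoryMoves() = 0;

    // Basic data about the search, available to all children.
    long GetFunctionCalls() const { return functionCalls; }
    int  GetDimension()     const { return n; }

protected:
    int  n;              // problem dimension
    long functionCalls;  // objective evaluations used so far
    std::vector<std::vector<double> > pattern; // explicitly stored pattern matrix
};

// A trivial child class, just to show that instantiation happens
// through a derived class.
class DoNothingSearch : public PatternSearchSketch {
public:
    DoNothingSearch(int d) : PatternSearchSketch(d) {}
    void ExploratoryMoves() { /* a real search would poll trial points here */ }
};
```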

The Vector and Matrix classes, built from the Template Numerical Toolkit, implement vectors and matrices in a fashion suitable for mathematical computation. The Template Numerical Toolkit was originally created by Roldan Pozo, a research scientist in the Mathematical and Computational Sciences Division of the Information Technology Laboratory at the National Institute of Standards and Technology (NIST). This version was modified by Chris Siefert of the College of William and Mary to add functionality. New features include 2-norm computation for vectors (based on the BLAS implementation of dnrm2) and alternative methods for accessing matrix data.
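The point of following dnrm2 rather than the naive `sqrt(sum of squares)` is overflow and underflow safety. A minimal sketch of the idea (the function name `twoNorm` is an assumption; the library's member function may differ) keeps a running scale and a sum of squares of ratios:

```cpp
#include <cmath>
#include <cstddef>

// Overflow-safe Euclidean norm in the style of BLAS dnrm2: rather than
// summing squares directly (which can overflow for large entries or lose
// all precision for tiny ones), maintain the largest magnitude seen so far
// as a scale factor and accumulate squares of ratios against it.
double twoNorm(const double* x, std::size_t n) {
    double scale = 0.0, ssq = 1.0;
    for (std::size_t i = 0; i < n; ++i) {
        if (x[i] != 0.0) {
            double absxi = std::fabs(x[i]);
            if (scale < absxi) {
                double r = scale / absxi;   // rescale previous sum
                ssq = 1.0 + ssq * r * r;
                scale = absxi;
            } else {
                double r = absxi / scale;
                ssq += r * r;
            }
        }
    }
    return scale * std::sqrt(ssq);
}
```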

The NLessSearch class performs a simple search on trial points located at the vertices of a regular simplex centered about the current iterate. The number of trial steps in the search pattern is thus one more than the dimension of the problem. When the search finds objective function decrease, that improving trial point immediately becomes the new iterate and the search begins again. When no improving trial points can be located on the simplex, the size of the trial steps is halved. Convergence is declared when the step size falls below the user-defined threshold value of stoppingStepLength. Testing thus far indicates that the NLess core pattern shows the most promise in the area of global optimization but lacks efficiency in local search settings.
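One standard way to build the n+1 vertices of a regular simplex is sketched below; this is a hypothetical helper (`regularSimplex` is not the class's actual routine, and its construction may differ from the library's): take the unit coordinate vectors plus one extra vertex chosen so all pairwise distances match, then recenter on the current iterate.

```cpp
#include <cmath>
#include <vector>

// Hypothetical sketch: n+1 vertices of a regular simplex centered at a
// given point. Start from e_1..e_n plus the vertex a*(1,...,1) with
// a = (1 - sqrt(n+1))/n, which makes every pairwise distance sqrt(2);
// then shift so the centroid sits on `center` and scale the pattern.
std::vector<std::vector<double> >
regularSimplex(const std::vector<double>& center, double scaleFactor) {
    std::size_t n = center.size();
    double a = (1.0 - std::sqrt(n + 1.0)) / n;
    std::vector<std::vector<double> > v(n + 1, std::vector<double>(n, 0.0));
    for (std::size_t i = 0; i < n; ++i) v[i][i] = 1.0; // vertices e_1..e_n
    for (std::size_t j = 0; j < n; ++j) v[n][j] = a;   // extra vertex
    // Centroid of the raw vertices is (1 + a)/(n + 1) in every coordinate.
    double c = (1.0 + a) / (n + 1.0);
    for (std::size_t i = 0; i <= n; ++i)
        for (std::size_t j = 0; j < n; ++j)
            v[i][j] = center[j] + scaleFactor * (v[i][j] - c);
    return v;
}
```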

The CompassSearch class performs a simple search in the positive and negative coordinate directions for each dimension of the search space. When the search finds objective function decrease, that improving trial point immediately becomes the new iterate and the search begins again. When no improving trial points can be located, the size of the trial steps is halved. Convergence is declared when the step size falls below the user-defined threshold value of stoppingStepLength. The CompassSearch is recommended as a simple first example for individuals learning about pattern searches.
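The whole algorithm fits in a few lines. Here is a minimal sketch under assumed names (`compassSearch` is not the class's API; the real class is object-oriented and counts function calls, among other things):

```cpp
#include <vector>

// Minimal compass-search sketch: poll +/- each coordinate direction,
// move to the first improving trial point and restart the poll, halve
// the step when no direction improves, and stop once the step falls
// below stoppingStepLength.
template <typename F>
std::vector<double> compassSearch(F f, std::vector<double> x,
                                  double step, double stoppingStepLength) {
    std::size_t n = x.size();
    double fx = f(x);
    while (step >= stoppingStepLength) {
        bool improved = false;
        for (std::size_t i = 0; i < n && !improved; ++i) {
            for (int sign = -1; sign <= 1; sign += 2) {
                std::vector<double> trial = x;
                trial[i] += sign * step;
                double ft = f(trial);
                if (ft < fx) {            // accept immediately and re-poll
                    x = trial; fx = ft; improved = true; break;
                }
            }
        }
        if (!improved) step *= 0.5;       // pattern contraction
    }
    return x;
}
```

Note the early exit on the first improving point: the poll restarts from the new iterate rather than finishing the sweep over all 2n directions.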

The CoordinateSearch class searches along the positive and negative coordinate directions for each variable at each iteration. In other words, the difference between the CoordinateSearch and the CompassSearch lies in the CoordinateSearch not restarting the local search immediately upon finding function decrease in one of the variables. While this raises the minimum number of function evaluations per iteration from one to n, where n is the number of variables in the problem, the CoordinateSearch represents a great improvement over the CompassSearch for many problems.
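The distinction shows up in a single iteration. A hypothetical sketch of one such iteration (`coordinateIteration` is an assumed helper, not the class's interface): every variable is polled before the iteration ends, so several coordinates can change within one pass, whereas the compass variant would restart after the first improvement.

```cpp
#include <vector>

// Sketch of one coordinate-search iteration: visit every variable in
// turn, trying a positive then a negative step, and keep any improvement
// before moving on to the next variable. Returns whether any coordinate
// improved; the caller halves the step when this is false.
template <typename F>
bool coordinateIteration(F f, std::vector<double>& x, double& fx, double step) {
    bool improved = false;
    for (std::size_t i = 0; i < x.size(); ++i) {
        for (int sign = 1; sign >= -1; sign -= 2) {
            std::vector<double> trial = x;
            trial[i] += sign * step;
            double ft = f(trial);
            if (ft < fx) { x = trial; fx = ft; improved = true; break; }
        }
    }
    return improved;
}
```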

The HJSearch class is based on the algorithm first developed by Hooke and Jeeves in the 1960s. To begin the search, the initial point, or current iterate, is stored as the most recent permanent base point. A coordinate search about the current iterate that finds improvement at a trial point is followed by a pattern step: the search moves to a temporary base point, reached by stepping from the improving trial point along the direction given by the improving trial point minus the last permanent base point. The pattern step is taken whether or not function decrease exists at the temporary base point. A coordinate search is then performed relative to the location and function value of the temporary base point. If no trial point of that search shows improvement over the temporary base point, the temporary base point itself assumes the role of the best trial point for the comparison against the last permanent base point. If the best trial point represents improvement over the last permanent base point, the search continues with another pattern step from that trial point, again along the trial point minus the last permanent base point, and the trial point becomes the new permanent base point. If no improvement over the last permanent base point is found via the coordinate search around the temporary base point (including the value at the temporary base point itself), the search backtracks, making the last permanent base point the current iterate and initiating a coordinate search about it. Should a coordinate search about a permanent base point fail, the trial step size is decreased by some factor; convergence is declared when the search step length drops below some pre-defined tolerance. The Hooke and Jeeves algorithm performs well, especially for local minimization.
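The pattern step itself is a one-liner in vector form. A hypothetical helper (`patternStep` is an assumed name, not the class's routine) makes the geometry explicit: stepping from the new base along the direction that just produced improvement lands on `2*newBase - oldBase`.

```cpp
#include <vector>

// Sketch of the Hooke-and-Jeeves pattern step: from the improving point
// (the new base), step again by the vector that produced the improvement,
// i.e. temp = newBase + (newBase - oldBase) = 2*newBase - oldBase.
std::vector<double> patternStep(const std::vector<double>& newBase,
                                const std::vector<double>& oldBase) {
    std::vector<double> temp(newBase.size());
    for (std::size_t i = 0; i < newBase.size(); ++i)
        temp[i] = newBase[i] + (newBase[i] - oldBase[i]);
    return temp;
}
```

The function value at `temp` is not checked before moving there; the subsequent coordinate search around the temporary base point decides whether the aggressive move paid off.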

The EdHJSearch class represents a slightly more efficient version of the Hooke and Jeeves search for local searches. The change to the code is minimal: after a pattern contraction (a decrease of the trial step length), a simple check temporarily disallows pattern steps for a restricted iteration. The heuristic notion behind this modification is that a contraction of the pattern represents the time in the search where the norm of the gradient of the function can be more tightly bounded. In other words, it is our best indication that we are getting closer to the minimum. With that in mind, preventing pattern-extending steps allows the search to focus on the current neighborhood of high potential instead of concentrating on moving away from that neighborhood. Pattern steps return in the iteration following the restricted iteration if our hope of being within a trial step length of the minimum proves false.

The objective files are the home of the
end user's specifications, including the function to be optimized, the
function to return the initial point, *n* (the number of
variables of the problem), as well as variables to control the
starting
trial step length, the stopping tolerance on the trial step length,
and the upper tolerance on the number of function evaluations allowed.
The function declarations in `objective.h` should be kept.
If the user wishes to use another file, the `PatternSearch.h`
and main program files
must be modified to include the alternate file instead of
`objective.h`, and the compilation line referencing
`objective.cc` should be modified accordingly.
The objective files
included here test a somewhat
ludicrous function: a quadratic that most people
could minimize in their heads in no time. The starting point and
initial step length are also designed to force the minimum (the
origin)
to fall directly on the search lattice for most of the searches. In other words, it is
neither challenging nor representative of typical use. However, this example
is included mostly so that the user may be convinced that everything
is compiling and executing properly and to make the names of the
various parameters readily apparent. For a list of the information
required of the user and our few helpful hints for
choosing parameters, return to the top of this page.
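For orientation, a user-supplied objective file might look roughly like the sketch below. This is a hypothetical illustration only: the names `fcn`, `initpt`, and the parameter variables are assumptions, and the actual declarations in `objective.h` (which also use the library's Vector class rather than `std::vector`) must be matched instead of this example.

```cpp
#include <vector>

// Hypothetical objective file sketch; match the real declarations in
// objective.h rather than the names used here.
const int    n                  = 2;     // number of variables
const double initialStepLength  = 0.25;  // starting trial step length
const double stoppingStepLength = 1e-6;  // stopping tolerance on the step
const long   maxFunctionCalls   = 10000; // cap on function evaluations

// The function to be optimized: a simple quadratic, minimized at the origin.
void fcn(const std::vector<double>& x, double& f) {
    f = 0.0;
    for (std::size_t i = 0; i < x.size(); ++i) f += x[i] * x[i];
}

// The function to return the initial point.
void initpt(std::vector<double>& x) {
    x.assign(n, 0.0);
    x[0] = 1.0;
    x[1] = -1.0;
}
```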

This main program offers examples of declaring the necessary classes, running the searches, and retrieving some of the available result data. The searches run are the NLessSearch, CompassSearch, CoordinateSearch, HJSearch, and EdHJSearch. Instructions for compiling the program appear at the top of the `main.cc` file.