This module demonstrates the Nelder-Mead direct search method for
minimizing a nonlinear function in two dimensions. For a function of
n variables, the algorithm maintains a set of n+1 points that form
the vertices of a simplex, which in two dimensions is a triangle.
The user selects a problem either by choosing a preset example or by
typing a desired objective function.
Once the objective function, starting points, and algorithm parameters have been chosen, the steps of the Nelder-Mead method are carried out sequentially by repeatedly clicking NEXT or the currently highlighted step.

At each iteration, a tentative new point (light blue) is computed, at a location determined by α, along the line (magenta) determined by the worst point (i.e., the one with the highest function value) and the centroid of the other two points. If the function value at the tentative new point lies between those of the other two points, then it becomes the final new point (green) for this iteration.

Otherwise, a rescaling step occurs. If the tentative new point is better than the other two, then an even better point is sought by expanding along the magenta line by an amount determined by β. If the tentative new point is worse than the other two, then a better point is sought by contracting along the magenta line by an amount determined by γ. In either case, the resulting rescaling determines the final new point (green) for this iteration.
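The iteration just described can be summarized in a short Python sketch. This is only an illustration of the logic above, not the module's actual code: the default values α = 1, β = 2, and γ = 0.5 are conventional choices assumed here, and the shrink step of the full Nelder-Mead method is omitted because it is not part of the description above.

    import numpy as np

    def nelder_mead_step(points, f, alpha=1.0, beta=2.0, gamma=0.5):
        # Order the three vertices so `worst` has the highest function value.
        best, second, worst = sorted(points, key=f)
        # Centroid of the two better points.
        centroid = (np.asarray(best) + np.asarray(second)) / 2.0
        # Tentative new point, at a distance set by alpha along the line
        # from the worst point through the centroid (the magenta line).
        trial = centroid + alpha * (centroid - np.asarray(worst))
        if f(best) <= f(trial) < f(second):
            new = trial                  # value lies between the other two
        elif f(trial) < f(best):
            # Better than both others: expand further along the line by beta.
            expanded = centroid + beta * (centroid - np.asarray(worst))
            new = expanded if f(expanded) < f(trial) else trial
        else:
            # Worse than both others: contract along the line by gamma.
            new = centroid + gamma * (centroid - np.asarray(worst))
        return [best, second, new]       # worst vertex replaced by new point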
Example 1 is a simple quadratic function that is fairly easily
minimized. Example 2 is Rosenbrock's function, which with its
“banana shaped” contours is notoriously difficult to minimize. From
the classic starting point (-1.2, 1), the method must work its way
along the curved valley to reach the minimum at (1, 1).
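For concreteness, Rosenbrock's function in two dimensions is f(x, y) = 100(y - x^2)^2 + (1 - x)^2, with minimum value 0 at (1, 1). A minimal demonstration using the nelder_mead_step sketch above, started near the classic point (-1.2, 1), shows how slowly the simplex creeps along the curved valley (the simplified step lacks a shrink phase, so progress may stall on hard problems):

    def rosenbrock(p):
        # Rosenbrock's function: minimum 0 at (1, 1).
        x, y = p
        return 100.0 * (y - x**2)**2 + (1.0 - x)**2

    # Small initial simplex near the classic starting point (-1.2, 1).
    simplex = [np.array([-1.2, 1.0]),
               np.array([-1.1, 1.0]),
               np.array([-1.2, 1.1])]
    for _ in range(100):
        simplex = nelder_mead_step(simplex, rosenbrock)
    best = min(simplex, key=rosenbrock)
    print(best, rosenbrock(best))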
References:
P. E. Gill, W. Murray, and M. H. Wright, Practical Optimization, Academic Press, New York, 1981. See Section 4.2.2, pages 94-96.
Michael T. Heath, Scientific Computing: An Introductory Survey, 2nd edition, McGraw-Hill, New York, 2002. See Section 6.5.1.
Developers: Jessica Schoen and Michael Heath