Interactive Educational Modules in Scientific Computing

Nelder-Mead Method

This module demonstrates the Nelder-Mead direct search method for minimizing a nonlinear function in two dimensions. For a function of n variables, the algorithm maintains a set of n+1 points forming the vertices of a simplex in n-dimensional space. This simplex is successively updated at each iteration by discarding the vertex having the highest function value and replacing it with a new vertex having a lower function value. Such direct search methods have the advantage of requiring no derivative computations (indeed, the objective function need not even be smooth), but they tend to be efficient only in relatively low dimensions.
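As a concrete illustration of this bookkeeping, the following Python sketch represents the simplex for n = 2 as three points and orders them by objective value so that the worst vertex can be identified for replacement. The function names and the sample quadratic are illustrative choices, not part of the module itself.

    # A minimal sketch of the simplex bookkeeping described above, for
    # n = 2 (three vertices). Names are illustrative, not the module's.
    import numpy as np

    def order_simplex(vertices, f):
        """Sort the n+1 vertices from lowest (best) to highest (worst)
        objective value; the last row is the vertex to be replaced."""
        values = np.array([f(v) for v in vertices])
        order = np.argsort(values)
        return vertices[order], values[order]

    f = lambda v: v[0]**2 + 2.0 * v[1]**2                     # a simple quadratic
    simplex = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])  # one vertex per row
    simplex, fvals = order_simplex(simplex, f)                # worst vertex is simplex[-1]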

The user selects a problem either by choosing a preset example or by typing a desired objective function f(x, y) into the box provided. Contours of the chosen objective function are drawn in the graph. The scale of the plot can be adjusted by changing the values in the boxes just below the objective function definition and then clicking Refresh. Clicking Reset restores the state from when Refresh or Example was last clicked. If the objective function is typed in, three initial guesses (x, y) must be supplied by typing them into the boxes provided and then clicking Refresh. The current points (x, y) are indicated by yellow bullets in the plot, and the corresponding function values and other data are printed in the table below. The reflection coefficient α, the expansion coefficient β, and the contraction coefficient γ can be adjusted at any time using the corresponding sliders.

Once the objective function, starting points, and algorithm parameters have been chosen, the steps of the Nelder-Mead method are carried out sequentially by repeatedly clicking NEXT or the currently highlighted step. At each iteration, a tentative new point (light blue) is computed, at a location determined by α, along the line (magenta) through the worst point (i.e., the one with the highest function value) and the centroid of the other two points. If the function value at the tentative new point lies between those of the other two points, then it becomes the final new point (green) for this iteration. Otherwise, a rescaling step occurs. If the tentative new point is better than both of the other points, then an even better point is sought by expanding along the magenta line by an amount determined by β. If the tentative new point is worse than both of the other points, then a better point is sought by contracting along the magenta line by an amount determined by γ. In either case, the resulting rescaled point becomes the final new point (green) for this iteration.
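For readers who want to see these rules in code, here is a hedged Python sketch of a single iteration for n = 2, using the module's labeling of the coefficients (α for reflection, β for expansion, γ for contraction). It follows the simplified acceptance rules described above and omits the shrink step that some Nelder-Mead variants perform when contraction fails; the default coefficient values are common choices, not prescribed by the module.

    import numpy as np

    def nelder_mead_step(simplex, f, alpha=1.0, beta=2.0, gamma=0.5):
        """One iteration on a 3-vertex simplex (rows of `simplex`):
        replace the worst vertex as described above and return the
        updated simplex."""
        vals = np.array([f(v) for v in simplex])
        order = np.argsort(vals)
        best, second, worst = (simplex[i] for i in order)
        f_best, f_second = vals[order[0]], vals[order[1]]
        centroid = (best + second) / 2.0   # centroid of the two better points

        # Reflection: tentative point (light blue) along the magenta line.
        reflected = centroid + alpha * (centroid - worst)
        f_refl = f(reflected)

        if f_best <= f_refl < f_second:
            new = reflected                # value between the other two: accept
        elif f_refl < f_best:
            # Better than both: try expanding farther along the line.
            expanded = centroid + beta * (reflected - centroid)
            new = expanded if f(expanded) < f_refl else reflected
        else:
            # Worse than both: contract back toward the centroid.
            new = centroid + gamma * (worst - centroid)

        return np.array([best, second, new])

Iterating this step corresponds to repeatedly clicking NEXT in the module, with each returned simplex supplying the next iteration's three points.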

Example 1 is a simple quadratic function that is fairly easily minimized. Example 2 is Rosenbrock's function, whose “banana-shaped” contours make it notoriously difficult to minimize. From the classic starting point (−1.2, 1), the successive points generated by the Nelder-Mead algorithm crawl along the narrow curved valley, eventually finding the minimum at (1, 1).
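The classic Rosenbrock run can also be reproduced outside the module, for example with SciPy's Nelder-Mead implementation. Note that SciPy builds its own initial simplex from a single starting point and includes a shrink step, so its iterates differ in detail from the simplified variant described above, though it finds the same minimum.

    from scipy.optimize import minimize

    # Rosenbrock's function, with its minimum of 0 at (1, 1).
    rosenbrock = lambda v: (1.0 - v[0])**2 + 100.0 * (v[1] - v[0]**2)**2

    result = minimize(rosenbrock, x0=[-1.2, 1.0], method="Nelder-Mead")
    print(result.x)   # approximately [1. 1.]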

References:
P. E. Gill, W. Murray, and M. H. Wright, Practical Optimization, Academic Press, New York, 1981. See Section 4.2.2, pages 94-96.

Michael T. Heath, Scientific Computing, An Introductory Survey, 2nd edition, McGraw-Hill, New York, 2002. See Section 6.5.1.

Developers: Jessica Schoen and Michael Heath