The optimization package provides simplex-based direct search optimization algorithms.
The aim of this package is similar to that of the estimation package, but the algorithms are entirely different:
Direct search methods use only cost function values; they do not need derivatives and do not attempt to compute approximations of them. According to a 1996 paper by Margaret H. Wright (Direct Search Methods: Once Scorned, Now Respectable), they are used when computing the derivative is either impossible (noisy functions, unpredictable discontinuities) or difficult (complexity, computational cost). In the first case, rather than an optimum, a reasonably good point is all that is desired. In the second case, an optimum is desired but cannot be reasonably computed. In both cases, direct search methods can be useful.
Simplex-based direct search methods rely on comparing the cost function values at the vertices of a simplex (a set of n+1 points in dimension n), which is updated at each step of the algorithm.
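To illustrate the idea, here is a minimal Nelder-Mead-style simplex search sketched in Python. This is an illustration only, not the package's API: the function name, parameters, and coefficients (reflection, expansion, contraction, shrink) are assumptions for the sketch.

```python
import numpy as np

def nelder_mead(f, x0, step=0.5, tol=1e-8, max_iter=500,
                alpha=1.0, gamma=2.0, rho=0.5, sigma=0.5):
    """Minimize f by comparing cost values at the simplex vertices.
    No derivatives are used or approximated."""
    n = len(x0)
    # Build the initial simplex: n+1 vertices in dimension n.
    simplex = [np.asarray(x0, dtype=float)]
    for i in range(n):
        v = simplex[0].copy()
        v[i] += step
        simplex.append(v)
    for _ in range(max_iter):
        # Order vertices by cost: only function values are compared.
        simplex.sort(key=f)
        best, worst = simplex[0], simplex[-1]
        if abs(f(worst) - f(best)) < tol:
            break
        centroid = np.mean(simplex[:-1], axis=0)  # centroid of all but worst
        # Reflect the worst vertex through the centroid.
        xr = centroid + alpha * (centroid - worst)
        if f(xr) < f(best):
            xe = centroid + gamma * (centroid - worst)  # expansion
            simplex[-1] = xe if f(xe) < f(xr) else xr
        elif f(xr) < f(simplex[-2]):
            simplex[-1] = xr
        else:
            xc = centroid + rho * (worst - centroid)  # contraction
            if f(xc) < f(worst):
                simplex[-1] = xc
            else:
                # Shrink the whole simplex toward the best vertex.
                simplex = [best + sigma * (v - best) for v in simplex]
    return min(simplex, key=f)
```

On a smooth cost function such as f(x, y) = (x-1)^2 + (y+2)^2, the simplex contracts around the minimizer (1, -2) using nothing but cost comparisons.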
The instances can be built either in single-start or in multi-start mode. Multi-start is a traditional way to try to avoid being trapped in a local minimum and missing the global minimum of a function. It can also be used to verify the convergence of an algorithm. In multi-start mode, the minimizes method returns the best minimum found after all starts, and the getMinima method can be used to retrieve all the minima from all starts (including the one already provided by the minimizes method).
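The multi-start idea can be sketched as follows. This is a Python illustration, not the package's API: `local_search` (a tiny derivative-free coordinate pattern search) and `multi_start` are hypothetical helpers standing in for the actual solvers.

```python
import random

def local_search(f, x0, step=0.25, tol=1e-9):
    """Tiny derivative-free local optimizer (coordinate pattern search):
    try steps along each axis, halve the step when nothing improves."""
    x = list(x0)
    while step > tol:
        improved = False
        for i in range(len(x)):
            for d in (step, -step):
                trial = list(x)
                trial[i] += d
                if f(trial) < f(x):
                    x = trial
                    improved = True
        if not improved:
            step *= 0.5
    return x

def multi_start(f, bounds, n_starts=20, seed=42):
    """Run the local optimizer from several random start points and keep
    every minimum found, sorted best first.  The first entry plays the
    role of the single minimum a single-start run would return."""
    rng = random.Random(seed)
    minima = []
    for _ in range(n_starts):
        x0 = [rng.uniform(lo, hi) for lo, hi in bounds]
        x = local_search(f, x0)
        minima.append((f(x), x))
    minima.sort(key=lambda pair: pair[0])
    return minima
```

On a two-well function such as f(x) = (x^2 - 1)^2, different starts fall into different basins; collecting all the minima reveals both wells at x = -1 and x = +1, while a single start would report only one of them.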
The package provides two solvers. The first one is the classical Nelder-Mead method. The second one is Virginia Torczon's multi-directional method.
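Torczon's multi-directional method can also be sketched in Python. Unlike Nelder-Mead, which moves one vertex at a time, it reflects, expands, or contracts the whole simplex at once through its best vertex. The coefficients (mu = 2 for expansion, theta = 0.5 for contraction) are conventional choices assumed for this sketch, which is not the package's implementation.

```python
import numpy as np

def multi_directional(f, x0, step=0.5, tol=1e-8, max_iter=500,
                      mu=2.0, theta=0.5):
    """Multi-directional search: every step moves all non-best vertices
    simultaneously, keeping the best vertex fixed."""
    n = len(x0)
    simplex = [np.asarray(x0, dtype=float)]
    for i in range(n):
        v = simplex[0].copy()
        v[i] += step
        simplex.append(v)
    for _ in range(max_iter):
        simplex.sort(key=f)
        best = simplex[0]
        if f(simplex[-1]) - f(best) < tol:
            break
        # Reflect every non-best vertex through the best one.
        refl = [2 * best - v for v in simplex[1:]]
        if min(f(v) for v in refl) < f(best):
            # Reflection improved on the best vertex: try expanding farther.
            exp = [best + mu * (best - v) for v in simplex[1:]]
            moved = exp if min(f(v) for v in exp) < min(f(v) for v in refl) else refl
        else:
            # Reflection failed: contract the simplex toward the best vertex.
            moved = [best + theta * (v - best) for v in simplex[1:]]
        simplex = [best] + moved
    return min(simplex, key=f)
```

Because the whole simplex moves in lock-step, the n trial reflections (or expansions, or contractions) of one iteration are independent and can be evaluated in parallel, which is one practical appeal of the method.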