<?xml version="1.0"?>
<!--
Licensed to the Apache Software Foundation (ASF) under one or more
contributor license agreements. See the NOTICE file distributed with
this work for additional information regarding copyright ownership.
The ASF licenses this file to You under the Apache License, Version 2.0
(the "License"); you may not use this file except in compliance with
the License. You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
-->
<?xml-stylesheet type="text/xsl" href="./xdoc.xsl"?>
<!-- $Revision: 480435 $ $Date: 2006-11-29 08:06:35 +0100 (mer., 29 nov. 2006) $ -->
<document url="optimization.html">
<properties>
<title>The Commons Math User Guide - Optimization</title>
</properties>
<body>
<section name="13 Optimization">
<subsection name="13.1 Overview" href="overview">
<p>
The optimization package provides simplex-based direct search optimization algorithms.
</p>
<p>
The aim of this package is similar to the aim of the estimation package, but the
algorithms are entirely different, as:
<ul>
<li>
they do not need the partial derivatives of the measurements
with respect to the free parameters
</li>
<li>
they do not rely on residual-based quadratic cost functions but
can handle any cost function, including non-continuous ones
</li>
</ul>
</p>
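<p>
As an illustration, the following sketch shows a cost function with a
jump that would defeat derivative-based algorithms but is perfectly
acceptable here. It assumes the <code>CostFunction</code> interface of
this package, whose single <code>cost</code> method takes the array of
free parameters:
</p>
<source>// a cost function with a discontinuity at x = 1: no derivative exists
// there, but direct search methods only compare values, so it is handled
// like any other function (CostFunction interface assumed from the
// org.apache.commons.math.optimization package)
CostFunction stepped = new CostFunction() {
    public double cost(double[] x) {
        double parabola = (x[0] - 2.0) * (x[0] - 2.0);
        return (x[0] >= 1.0) ? parabola : parabola + 10.0;
    }
};</source>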
</subsection>
<subsection name="13.2 Direct Methods" href="direct">
<p>
Direct search methods only use cost function values; they do not
need derivatives and do not even try to compute approximations of
the derivatives. According to a 1996 paper by Margaret H. Wright
(<a href="http://cm.bell-labs.com/cm/cs/doc/96/4-02.ps.gz">Direct
Search Methods: Once Scorned, Now Respectable</a>), they are used
when the computation of the derivative is either impossible (noisy
functions, unpredictable discontinuities) or difficult (complexity,
computation cost). In the first case, rather than an optimum, a
<em>not too bad</em> point is desired. In the latter case, an
optimum is desired but cannot be reasonably found. In both cases,
direct search methods can be useful.
</p>
<p>
Simplex-based direct search methods compare the cost function
values at the vertices of a simplex (a set of n+1 points in
dimension n) that is updated by the algorithm's steps.
</p>
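<p>
For instance, in dimension 2 a simplex is simply a triangle, i.e. a set
of 3 points. A sketch of such an initial simplex (the values here are
arbitrary) could be:
</p>
<source>// in dimension n = 2, a simplex is a set of n + 1 = 3 points;
// here a small triangle around the point (1, 1)
double[][] vertices = new double[][] {
    { 0.9, 0.9 },
    { 1.1, 0.9 },
    { 1.0, 1.1 }
};</source>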
<p>
The instances can be built either in single-start or in
multi-start mode. Multi-start is a traditional way to try to avoid
being trapped in a local minimum and missing the global minimum of a
function. It can also be used to verify the convergence of an
algorithm. In multi-start mode, the <code>minimizes</code> method
returns the best minimum found after all starts, and the <code>getMinima</code>
method can be used to retrieve all the minima from all starts (including the one
already returned by the <code>minimizes</code> method).
</p>
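<p>
As a sketch only (the exact <code>minimizes</code> overloads, including
the way start points are randomly drawn in multi-start mode, should be
checked against the javadoc of the solvers), a multi-start run could
look like:
</p>
<source>// hedged sketch: the overload taking a number of starts and a random
// seed is an assumption, see the API documentation for the exact
// signatures; "checker" is assumed to be a previously built
// implementation of the ConvergenceChecker interface
NelderMead optimizer = new NelderMead();
PointCostPair best =
    optimizer.minimizes(stepped, 100, checker,
                        new double[] { 0.0, 0.0 },  // first  initial vertex
                        new double[] { 2.0, 2.0 },  // second initial vertex
                        10, 1742302356L);           // 10 starts, generator seed
PointCostPair[] allMinima = optimizer.getMinima();  // best minimum first</source>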
<p>
The package provides two solvers. The first one is the classical
<a href="../apidocs/org/apache/commons/math/optimization/NelderMead.html">
Nelder-Mead</a> method. The second one is Virginia Torczon's
<a href="../apidocs/org/apache/commons/math/optimization/MultiDirectional.html">
multi-directional</a> method.
</p>
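<p>
Both solvers extend the same base class, so switching from one to the
other is a one-line change (a sketch, assuming the
<code>DirectSearchOptimizer</code> base class named in the API docs):
</p>
<source>// the two solvers are interchangeable behind their common base class
DirectSearchOptimizer nelderMead       = new NelderMead();
DirectSearchOptimizer multiDirectional = new MultiDirectional();</source>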
</subsection>
</section>
</body>
</document>