diff --git a/src/site/xdoc/userguide/optimization.xml b/src/site/xdoc/userguide/optimization.xml
index b61c2033f..040780de9 100644
--- a/src/site/xdoc/userguide/optimization.xml
+++ b/src/site/xdoc/userguide/optimization.xml
@@ -63,21 +63,21 @@
are only four interfaces defining the common behavior of optimizers, one for each
supported type of objective function:
- A
- UnivariateRealOptimizer is used to find the minimal values of a univariate real-valued
+ A
+ UnivariateOptimizer is used to find the minimal values of a univariate real-valued
function f.
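As a sketch of what a univariate optimizer does (plain Java, not the Commons Math `UnivariateOptimizer` API; the class and method names below are illustrative only), a golden-section search narrows a bracketing interval around a minimum:

```java
import java.util.function.DoubleUnaryOperator;

// Illustrative golden-section search: the kind of bracketing minimization
// a univariate optimizer performs. Not the Commons Math API.
public class GoldenSection {

    static double minimize(DoubleUnaryOperator f, double lo, double hi, double tol) {
        final double g = (Math.sqrt(5.0) - 1.0) / 2.0; // inverse golden ratio
        double a = lo, b = hi;
        while (b - a > tol) {
            double c = b - g * (b - a);
            double d = a + g * (b - a);
            if (f.applyAsDouble(c) < f.applyAsDouble(d)) {
                b = d; // minimum lies in [a, d]
            } else {
                a = c; // minimum lies in [c, b]
            }
        }
        return (a + b) / 2.0;
    }

    public static void main(String[] args) {
        // Minimize f(x) = (x - 2)^2 on [0, 5]; the minimum is at x = 2.
        System.out.println(minimize(x -> (x - 2.0) * (x - 2.0), 0.0, 5.0, 1e-9));
    }
}
```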
@@ -174,10 +174,10 @@
The first two simplex-based methods do not handle simple bounds constraints by themselves.
However, there are two adapters (
- MultivariateRealFunctionMappingAdapter and
- MultivariateRealFunctionPenaltyAdapter) that can be used to wrap the user function in
+          <a href="../apidocs/org/apache/commons/math3/optimization/direct/MultivariateFunctionMappingAdapter.html">
+          MultivariateFunctionMappingAdapter</a> and
+          <a href="../apidocs/org/apache/commons/math3/optimization/direct/MultivariateFunctionPenaltyAdapter.html">
+          MultivariateFunctionPenaltyAdapter</a>) that can be used to wrap the user function in
such a way that the wrapped function is unbounded and can be used with these optimizers, despite
the fact that the underlying function is still bounded and will be called only with feasible
points that fulfill the constraints. Note however that using these adapters is only a
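The mapping idea behind these adapters can be sketched in plain Java. The logistic transform below is an assumption chosen for illustration; `MultivariateFunctionMappingAdapter`'s actual transform may differ.

```java
// Illustration of the bounds-mapping technique: an unconstrained optimizer
// searches over an unbounded variable t, while the wrapped function only
// ever sees a feasible value x in (lo, hi). The logistic transform here is
// an assumption for illustration, not the library's exact mapping.
public class BoundsMapping {

    // Unbounded t -> bounded x in (lo, hi).
    static double toBounded(double t, double lo, double hi) {
        return lo + (hi - lo) / (1.0 + Math.exp(-t));
    }

    // Bounded x -> unbounded t, e.g. to convert a feasible start point.
    static double toUnbounded(double x, double lo, double hi) {
        return Math.log((x - lo) / (hi - x));
    }

    public static void main(String[] args) {
        // Whatever t the optimizer tries, the mapped point stays feasible.
        System.out.println(toBounded(5.0, -1.0, 2.0));
        // Round-trip: mapping there and back recovers the original point.
        System.out.println(toBounded(toUnbounded(0.5, -1.0, 2.0), -1.0, 2.0)); // 0.5
    }
}
```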
@@ -238,8 +238,8 @@
In order to solve a vectorial optimization problem, the user must provide it as
an object implementing the
- DifferentiableMultivariateVectorialFunction interface. The object will be provided to
+          <a href="../apidocs/org/apache/commons/math3/analysis/DifferentiableMultivariateVectorFunction.html">
+          DifferentiableMultivariateVectorFunction</a> interface. The object will be provided to
the estimate method of the optimizer, along with the target and weight arrays,
thus allowing the optimizer to compute the residuals at will. The last parameter to the
estimate method is the point from which the optimizer will start its
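What the optimizer does with the target and weight arrays can be sketched as follows (plain Java; the method names are hypothetical, not the Commons Math signatures):

```java
// Hypothetical sketch of the least-squares bookkeeping: residuals are the
// differences between observed targets and model values, and the optimizer
// minimizes their weighted sum of squares. Not the Commons Math API.
public class Residuals {

    static double[] residuals(double[] target, double[] model) {
        double[] r = new double[target.length];
        for (int i = 0; i < r.length; i++) {
            r[i] = target[i] - model[i];
        }
        return r;
    }

    // Weighted sum of squared residuals: the quantity driven toward zero.
    static double cost(double[] r, double[] weight) {
        double s = 0.0;
        for (int i = 0; i < r.length; i++) {
            s += weight[i] * r[i] * r[i];
        }
        return s;
    }

    public static void main(String[] args) {
        double[] r = residuals(new double[] {3.0, 5.0}, new double[] {2.0, 5.0});
        System.out.println(cost(r, new double[] {1.0, 1.0})); // 1.0
    }
}
```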
@@ -251,9 +251,10 @@
- We are looking to find the best parameters [a, b, c] for the quadratic function f(x)=a*x^2 + b*x + c .
- The data set below was generated using [a = 8, b = 10, c = 16]. A random number between zero and one was added
- to each y value calculated.
+ We are looking to find the best parameters [a, b, c] for the quadratic function
+          f(x) = a x<sup>2</sup> + b x + c.
+ The data set below was generated using [a = 8, b = 10, c = 16].
+ A random number between zero and one was added to each y value calculated.
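A data set of this shape could be generated as follows. This is a sketch under assumptions: the x grid (1..10) and the seed are mine for illustration, not the original example's values.

```java
import java.util.Random;

// Sketch of generating the example data: y = 8 x^2 + 10 x + 16 plus a
// random value in [0, 1). The x grid and seed are assumptions; the
// original data set is not reproduced here.
public class QuadraticData {

    static double model(double x) {
        return 8.0 * x * x + 10.0 * x + 16.0;
    }

    public static void main(String[] args) {
        Random rng = new Random(42); // fixed seed for reproducibility
        for (int x = 1; x <= 10; x++) {
            double y = model(x) + rng.nextDouble(); // noise in [0, 1)
            System.out.println(x + " " + y);
        }
    }
}
```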
@@ -318,24 +319,23 @@ We'll tackle the implementation of the MultivariateMatrixFunction jacobian
In this case the Jacobian is the partial derivative of the function with respect
to the parameters a, b and c. These derivatives are computed as follows:
-d(ax^2+bx+c)/da = x2
-d(ax^2+bx+c)/db = x
-d(ax^2+bx+c)/dc = 1
+d(ax<sup>2</sup> + bx + c)/da = x<sup>2</sup>
+d(ax<sup>2</sup> + bx + c)/db = x
+d(ax<sup>2</sup> + bx + c)/dc = 1
For a quadratic which has three variables the Jacobian Matrix will have three columns, one for each variable, and the number
-of rows will equal the number of rows in our data set, which in this case is ten. So for example for [a = 1, b=1, c=1]
-the Jacobian Matrix is (Exluding the first column which shows the value of x):
+of rows will equal the number of rows in our data set, which in this case is ten. So for example for [a = 1, b = 1, c = 1], the Jacobian Matrix is (excluding the first column which shows the value of x):
x
-d(ax^2+bx+c)/da
-d(ax^2+bx+c)/db
-d(ax^2+bx+c)/dc
+d(ax<sup>2</sup> + bx + c)/da
+d(ax<sup>2</sup> + bx + c)/db
+d(ax<sup>2</sup> + bx + c)/dc
1
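The derivative pattern above can be checked with a short sketch (plain Java; `jacobian` here is an illustrative helper, not the `MultivariateMatrixFunction` implementation itself):

```java
// For f(x) = a x^2 + b x + c, the partials with respect to [a, b, c] are
// [x^2, x, 1] for every data point, independent of the current parameters.
public class QuadraticJacobian {

    static double[][] jacobian(double[] xs) {
        double[][] j = new double[xs.length][3];
        for (int i = 0; i < xs.length; i++) {
            j[i][0] = xs[i] * xs[i]; // d(ax^2+bx+c)/da = x^2
            j[i][1] = xs[i];         // d(ax^2+bx+c)/db = x
            j[i][2] = 1.0;           // d(ax^2+bx+c)/dc = 1
        }
        return j;
    }

    public static void main(String[] args) {
        double[][] j = jacobian(new double[] {1.0, 2.0, 3.0});
        // row for x = 2 is [4.0, 2.0, 1.0]
        System.out.println(j[1][0] + " " + j[1][1] + " " + j[1][2]);
    }
}
```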
@@ -405,8 +405,7 @@ parameter is an ArrayList containing the independent values of the data set):