diff --git a/src/main/java/org/apache/commons/math3/fitting/HarmonicCurveFitter.java b/src/main/java/org/apache/commons/math3/fitting/HarmonicCurveFitter.java
index 2ff63fdbb..e5c45c3aa 100644
--- a/src/main/java/org/apache/commons/math3/fitting/HarmonicCurveFitter.java
+++ b/src/main/java/org/apache/commons/math3/fitting/HarmonicCurveFitter.java
@@ -137,99 +137,99 @@ public class HarmonicCurveFitter extends AbstractCurveFitter
- * We know f (t) at some sampling points ti and want to find a,
- * ω and φ such that f (t) = a cos (ω t + φ).
+ * We know \( f(t) \) at some sampling points \( t_i \) and want
+ * to find \( a \), \( \omega \) and \( \phi \) such that
+ * \( f(t) = a \cos (\omega t + \phi) \).
  *
  * From the analytical expression, we can compute two primitives :
- *     If2  (t) = ∫ f2  = a2 × [t + S (t)] / 2
- *     If'2 (t) = ∫ f'2 = a2 ω2 × [t - S (t)] / 2
- *     where S (t) = sin (2 (ω t + φ)) / (2 ω)
+ * \[
+ *     If2(t) = \int f^2 dt  = a^2 (t + S(t)) / 2
+ * \]
+ * \[
+ *     If'2(t) = \int f'^2 dt = a^2 \omega^2 (t - S(t)) / 2
+ * \]
+ * where \(S(t) = \frac{\sin(2 (\omega t + \phi))}{2\omega}\)
  *
- * We can remove S between these expressions :
- *     If'2 (t) = a2 ω2 t - ω2 If2 (t)
+ * We can remove \(S\) between these expressions :
+ * \[
+ *     If'2(t) = a^2 \omega^2 t - \omega^2 If2(t)
+ * \]
  *
- * The preceding expression shows that If'2 (t) is a linear
- * combination of both t and If2 (t): If'2 (t) = A × t + B × If2 (t)
+ * The preceding expression shows that \(If'2 (t)\) is a linear
+ * combination of both \(t\) and \(If2(t)\):
+ * \[
+ *   If'2(t) = A t + B If2(t)
+ * \]
  *
  * From the primitive, we can deduce the same form for definite
- * integrals between t1 and ti for each ti :
- *   If2 (ti) - If2 (t1) = A × (ti - t1) + B × (If2 (ti) - If2 (t1))
+ * integrals between \(t_1\) and \(t_i\) for each \(t_i\) :
+ * \[
+ *   If2(t_i) - If2(t_1) = A (t_i - t_1) + B (If2 (t_i) - If2(t_1))
+ * \]
  *
- * We can find the coefficients A and B that best fit the sample
+ * We can find the coefficients \(A\) and \(B\) that best fit the sample
  * to this linear expression by computing the definite integrals for
  * each sample points.
  *
- * For a bilinear expression z (xi, yi) = A × xi + B × yi, the
- * coefficients A and B that minimize a least square criterion
- * ∑ (zi - z (xi, yi))2 are given by these expressions:
- *
- *         ∑yiyi ∑xizi - ∑xiyi ∑yizi
- *     A = ------------------------
- *         ∑xixi ∑yiyi - ∑xiyi ∑xiyi
- *
- *         ∑xixi ∑yizi - ∑xiyi ∑xizi
- *     B = ------------------------
- *         ∑xixi ∑yiyi - ∑xiyi ∑xiyi
+ * For a bilinear expression \(z(x_i, y_i) = A x_i + B y_i\), the
+ * coefficients \(A\) and \(B\) that minimize a least-squares criterion
+ * \(\sum (z_i - z(x_i, y_i))^2\) are given by these expressions:
+ * \[
+ *   A = \frac{\sum y_i y_i \sum x_i z_i - \sum x_i y_i \sum y_i z_i}
+ *            {\sum x_i x_i \sum y_i y_i - \sum x_i y_i \sum x_i y_i}
+ * \]
+ * \[
+ *   B = \frac{\sum x_i x_i \sum y_i z_i - \sum x_i y_i \sum x_i z_i}
+ *            {\sum x_i x_i \sum y_i y_i - \sum x_i y_i \sum x_i y_i}
+ * \]
  *
- * In fact, we can assume both a and ω are positive and
- * compute them directly, knowing that A = a2 ω2 and that
- * B = - ω2. The complete algorithm is therefore:
+ * In fact, we can assume that both \(a\) and \(\omega\) are positive and
+ * compute them directly, knowing that \(A = a^2 \omega^2\) and that
+ * \(B = -\omega^2\). The complete algorithm is therefore:
  *
- * for each ti from t1 to tn-1, compute:
- *   f  (ti)
- *   f' (ti) = (f (ti+1) - f(ti-1)) / (ti+1 - ti-1)
- *   xi = ti - t1
- *   yi = ∫ f2 from t1 to ti
- *   zi = ∫ f'2 from t1 to ti
- *   update the sums ∑xixi, ∑yiyi, ∑xiyi, ∑xizi and ∑yizi
- * end for
- *
- *            |--------------------------
- *         \  | ∑yiyi ∑xizi - ∑xiyi ∑yizi
- * a     =  \ | ------------------------
- *           \| ∑xiyi ∑xizi - ∑xixi ∑yizi
- *
- *            |--------------------------
- *         \  | ∑xiyi ∑xizi - ∑xixi ∑yizi
- * ω     =  \ | ------------------------
- *           \| ∑xixi ∑yiyi - ∑xiyi ∑xiyi
+ * For each \(t_i\) from \(t_1\) to \(t_{n-1}\), compute:
+ * \[ f(t_i) \]
+ * \[ f'(t_i) = \frac{f (t_{i+1}) - f(t_{i-1})}{t_{i+1} - t_{i-1}} \]
+ * \[ x_i = t_i - t_1 \]
+ * \[ y_i = \int_{t_1}^{t_i} f^2(t) dt \]
+ * \[ z_i = \int_{t_1}^{t_i} f'^2(t) dt \]
+ * and update the sums:
+ * \[ \sum x_i x_i, \sum y_i y_i, \sum x_i y_i, \sum x_i z_i, \sum y_i z_i \]
+ *
+ * Then:
+ * \[
+ *   a = \sqrt{\frac{\sum y_i y_i \sum x_i z_i - \sum x_i y_i \sum y_i z_i }
+ *                  {\sum x_i y_i \sum x_i z_i - \sum x_i x_i \sum y_i z_i }}
+ * \]
+ * \[
+ *   \omega = \sqrt{\frac{\sum x_i y_i \sum x_i z_i - \sum x_i x_i \sum y_i z_i}
+ *                       {\sum x_i x_i \sum y_i y_i - \sum x_i y_i \sum x_i y_i}}
+ * \]
  *
- * Once we know ω, we can compute:
- *    fc = ω f (t) cos (ω t) - f' (t) sin (ω t)
- *    fs = ω f (t) sin (ω t) + f' (t) cos (ω t)
+ * Once we know \(\omega\) we can compute:
+ * \[
+ *    fc = \omega f(t) \cos(\omega t) - f'(t) \sin(\omega t)
+ * \]
+ * \[
+ *    fs = \omega f(t) \sin(\omega t) + f'(t) \cos(\omega t)
+ * \]
  *
- * It appears that fc = a ω cos (φ) and
- * fs = -a ω sin (φ), so we can use these
- * expressions to compute φ. The best estimate over the sample is
+ * It appears that \(fc = a \omega \cos(\phi)\) and
+ * \(fs = -a \omega \sin(\phi)\), so we can use these
+ * expressions to compute \(\phi\). The best estimate over the sample is
  * given by averaging these expressions.
  *
  * Since integrals and means are involved in the preceding
- * estimations, these operations run in O(n) time, where n is the
+ * estimations, these operations run in \(O(n)\) time, where \(n\) is the
  * number of measurements.
  */
 public static class ParameterGuesser {
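
As a cross-check of the formulas above, here is a small self-contained sketch of the guessing scheme the Javadoc describes. It is not the library's ParameterGuesser code: the class and method names are invented, the integrals use a plain trapezoidal rule, and the derivative is a simple finite difference, so treat it as an illustration of the math rather than a reference implementation.

import java.util.Arrays;

/**
 * Standalone sketch of the guessing scheme described in the Javadoc above.
 * Not the library's ParameterGuesser: names and numerical details are illustrative only.
 */
public class HarmonicGuessSketch {

    /** Estimates {a, omega, phi} from samples (t[i], f[i]) sorted by increasing t. */
    static double[] guess(double[] t, double[] f) {
        final int n = t.length;

        // Accumulate x_i = t_i - t_1, y_i = integral of f^2, z_i = integral of f'^2
        // and the five sums needed for the least-squares expressions of A and B.
        double sxx = 0, syy = 0, sxy = 0, sxz = 0, syz = 0;
        double y = 0; // running integral of f^2 (trapezoidal rule)
        double z = 0; // running integral of f'^2 (piecewise-constant slope)
        for (int i = 1; i < n; i++) {
            final double dt = t[i] - t[i - 1];
            final double df = f[i] - f[i - 1];
            y += dt * (f[i - 1] * f[i - 1] + f[i] * f[i]) / 2;
            z += df * df / dt;
            final double x = t[i] - t[0];
            sxx += x * x;
            syy += y * y;
            sxy += x * y;
            sxz += x * z;
            syz += y * z;
        }

        // a^2 omega^2 = A and omega^2 = -B, which gives the two square roots of the Javadoc.
        final double a     = Math.sqrt((syy * sxz - sxy * syz) / (sxy * sxz - sxx * syz));
        final double omega = Math.sqrt((sxy * sxz - sxx * syz) / (sxx * syy - sxy * sxy));

        // phi from averaging fc = a omega cos(phi) and fs = -a omega sin(phi).
        double fc = 0, fs = 0;
        for (int i = 1; i < n - 1; i++) {
            final double fPrime = (f[i + 1] - f[i - 1]) / (t[i + 1] - t[i - 1]); // centered difference
            final double cos = Math.cos(omega * t[i]);
            final double sin = Math.sin(omega * t[i]);
            fc += omega * f[i] * cos - fPrime * sin;
            fs += omega * f[i] * sin + fPrime * cos;
        }
        final double phi = Math.atan2(-fs, fc);

        return new double[] { a, omega, phi };
    }

    public static void main(String[] args) {
        // Noiseless test signal f(t) = 1.5 cos(2 t + 0.7) sampled on [0, 10).
        final int n = 200;
        final double[] t = new double[n];
        final double[] f = new double[n];
        for (int i = 0; i < n; i++) {
            t[i] = 0.05 * i;
            f[i] = 1.5 * Math.cos(2 * t[i] + 0.7);
        }
        System.out.println(Arrays.toString(guess(t, f))); // approximately [1.5, 2.0, 0.7]
    }
}

The two square roots are just a = sqrt(A / omega^2) and omega = sqrt(-B) rewritten with the least-squares expressions for A and B, which is why the denominators of the two formulas in the Javadoc differ.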
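
For completeness: this guesser is what HarmonicCurveFitter relies on when no start point is supplied, so typical use of the public fitting API only needs the observations themselves. A minimal usage sketch against the commons-math3 fitting API (the sample data is made up for illustration):

import java.util.List;

import org.apache.commons.math3.fitting.HarmonicCurveFitter;
import org.apache.commons.math3.fitting.WeightedObservedPoint;
import org.apache.commons.math3.fitting.WeightedObservedPoints;

public class HarmonicFitExample {
    public static void main(String[] args) {
        // Collect (t, f(t)) observations; here a synthetic 1.5 cos(2 t + 0.7) signal.
        final WeightedObservedPoints obs = new WeightedObservedPoints();
        for (int i = 0; i < 100; i++) {
            final double t = 0.1 * i;
            obs.add(t, 1.5 * Math.cos(2 * t + 0.7));
        }
        final List<WeightedObservedPoint> points = obs.toList();

        // No start point is given, so the fitter obtains one from ParameterGuesser.
        final HarmonicCurveFitter fitter = HarmonicCurveFitter.create();
        final double[] best = fitter.fit(points); // {amplitude a, angular frequency omega, phase phi}
        System.out.printf("a=%.3f omega=%.3f phi=%.3f%n", best[0], best[1], best[2]);
    }
}

The returned array follows the parameter order of HarmonicOscillator.Parametric: amplitude, angular frequency, phase, i.e. the a, omega and phi of a cos(omega t + phi).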