
Commit f68b73f

Roman Donchenko authored and OpenCV Buildbot committed
Merge pull request opencv#1521 from nailbiter:optimCG
2 parents 3342920 + 0324932 commit f68b73f

15 files changed: +731 −19 lines

modules/optim/doc/downhill_simplex_method.rst

Lines changed: 7 additions & 6 deletions
@@ -8,14 +8,15 @@ optim::DownhillSolver
 
 .. ocv:class:: optim::DownhillSolver
 
-This class is used to perform the non-linear non-constrained *minimization* of a function, given on an *n*-dimensional Euclidean space,
+This class is used to perform the non-linear non-constrained *minimization* of a function, defined on an *n*-dimensional Euclidean space,
 using the **Nelder-Mead method**, also known as the **downhill simplex method**. The basic idea of the method can be obtained from
 (`http://en.wikipedia.org/wiki/Nelder-Mead\_method <http://en.wikipedia.org/wiki/Nelder-Mead_method>`_). It should be noted that
 this method, although deterministic, is rather a heuristic and therefore may converge to a local minimum, not necessarily a global one.
 It is an iterative optimization technique which at each step uses information about the values of the function evaluated only at
 *n+1* points, arranged as a *simplex* in *n*-dimensional space (hence the second name of the method). At each step a new point to
 evaluate the function at is chosen, the obtained value is compared with the previous ones, and based on this information the simplex changes its shape
-, slowly moving to the local minimum.
+, slowly moving to the local minimum. Thus this method uses *only* function values to make its decisions, in contrast to, say, the Nonlinear
+Conjugate Gradient method (which is also implemented in ``optim``).
 
 The algorithm stops when the number of function evaluations done exceeds ``termcrit.maxCount``, when the function values at the
 vertices of the simplex are within the ``termcrit.epsilon`` range, or when the simplex becomes so small that it
@@ -30,9 +31,9 @@ positive integer ``termcrit.maxCount`` and positive non-integer ``termcrit.epsil
     class CV_EXPORTS Function
     {
     public:
-       virtual ~Function() {}
-       //! ndim - dimensionality
-       virtual double calc(const double* x) const = 0;
+       virtual ~Function() {}
+       virtual double calc(const double* x) const = 0;
+       virtual void getGradient(const double* /*x*/,double* /*grad*/) {}
    };
 
    virtual Ptr<Function> getFunction() const = 0;
@@ -150,7 +151,7 @@ optim::createDownhillSolver
 This function returns the reference to the ready-to-use ``DownhillSolver`` object. All the parameters are optional, so this procedure can be called
 even without parameters at all. In this case, the default values will be used. As the default values for the terminal criteria are the only sensible ones,
 ``DownhillSolver::setFunction()`` and ``DownhillSolver::setInitStep()`` should be called upon the obtained object, if the respective parameters
-were not given to ``createDownhillSolver()``. Otherwise, the two ways (give parameters to ``createDownhillSolver()`` or miss the out and call the
+were not given to ``createDownhillSolver()``. Otherwise, the two ways (give parameters to ``createDownhillSolver()`` or miss them out and call the
 ``DownhillSolver::setFunction()`` and ``DownhillSolver::setInitStep()``) are absolutely equivalent (and will raise the same errors in the same way,
 should invalid input be detected).

modules/optim/doc/nonlinear_conjugate_gradient.rst

Lines changed: 136 additions & 0 deletions

@@ -0,0 +1,136 @@
+Nonlinear Conjugate Gradient
+===============================
+
+.. highlight:: cpp
+
+optim::ConjGradSolver
+---------------------------------
+
+.. ocv:class:: optim::ConjGradSolver
+
+This class is used to perform the non-linear non-constrained *minimization* of a function with *known gradient*,
+defined on an *n*-dimensional Euclidean space, using the **Nonlinear Conjugate Gradient method**. The implementation is based
+on the beautifully clear explanatory article `An Introduction to the Conjugate Gradient Method Without the Agonizing Pain <http://www.cs.cmu.edu/~quake-papers/painless-conjugate-gradient.pdf>`_
+by Jonathan Richard Shewchuk. The method can be seen as an adaptation of the standard Conjugate Gradient method (see, for example,
+`http://en.wikipedia.org/wiki/Conjugate_gradient_method <http://en.wikipedia.org/wiki/Conjugate_gradient_method>`_) for numerically solving
+systems of linear equations.
+
+It should be noted that this method, although deterministic, is rather a heuristic and therefore may converge to a local minimum,
+not necessarily a global one. What is even more disastrous, most of its behaviour is governed by the gradient, so it essentially
+cannot distinguish between local minima and maxima: if it starts sufficiently near to a local maximum, it may converge to it. Another
+obvious restriction is that it should be possible to compute the gradient of the function at any point, thus it is preferable to have
+an analytic expression for the gradient, and the computational burden should be borne by the user.
+
+The latter responsibility is accomplished via the ``getGradient(const double* x,double* grad)`` method of the
+``Solver::Function`` interface (which represents the function being optimized). This method takes a point in *n*-dimensional space
+(the first argument represents the array of coordinates of that point) and computes its gradient (which should be stored in the
+second argument as an array).
+
+::
+
+    class CV_EXPORTS Solver : public Algorithm
+    {
+    public:
+        class CV_EXPORTS Function
+        {
+        public:
+            virtual ~Function() {}
+            virtual double calc(const double* x) const = 0;
+            virtual void getGradient(const double* /*x*/,double* /*grad*/) {}
+        };
+
+        virtual Ptr<Function> getFunction() const = 0;
+        virtual void setFunction(const Ptr<Function>& f) = 0;
+
+        virtual TermCriteria getTermCriteria() const = 0;
+        virtual void setTermCriteria(const TermCriteria& termcrit) = 0;
+
+        // x contains the initial point before the call and the position of the minimum (if the
+        // algorithm converged) after it. x is assumed to be (something that after getMat() will
+        // return) a row-vector or column-vector. *Its size should be consistent with the
+        // previously given dimensionality data, if any (otherwise, it determines the dimensionality)*
+        virtual double minimize(InputOutputArray x) = 0;
+    };
+
+    class CV_EXPORTS ConjGradSolver : public Solver{
+    };
+
+Note that class ``ConjGradSolver`` thus does not add any new methods to the basic ``Solver`` interface.
+
+optim::ConjGradSolver::getFunction
+--------------------------------------------
+
+Getter for the optimized function. The optimized function is represented by the ``Solver::Function`` interface, whose
+implementers must provide the method ``calc(double*)`` to evaluate the function. It should be emphasized once more that, since the
+Nonlinear Conjugate Gradient method requires the gradient to be computable in addition to the function values, the
+``getGradient(const double* x,double* grad)`` method of the ``Solver::Function`` interface should also be implemented meaningfully.
+
+.. ocv:function:: Ptr<Solver::Function> optim::ConjGradSolver::getFunction()
+
+    :return: Smart-pointer to an object that implements the ``Solver::Function`` interface - it represents the function that is being optimized. It can be empty, if no function was given so far.
+
+optim::ConjGradSolver::setFunction
+-----------------------------------------------
+
+Setter for the optimized function. *It should be called at least once before the call to* ``ConjGradSolver::minimize()``, as
+the default value is not usable.
+
+.. ocv:function:: void optim::ConjGradSolver::setFunction(const Ptr<Solver::Function>& f)
+
+    :param f: The new function to optimize.
+
+optim::ConjGradSolver::getTermCriteria
+----------------------------------------------------
+
+Getter for the previously set terminal criteria for this algorithm.
+
+.. ocv:function:: TermCriteria optim::ConjGradSolver::getTermCriteria()
+
+    :return: Deep copy of the terminal criteria used at the moment.
+
+optim::ConjGradSolver::setTermCriteria
+------------------------------------------
+
+Set the terminal criteria for the nonlinear conjugate gradient method. Two things should be noted. First, it *is not necessary*
+to call this method before the first call to ``ConjGradSolver::minimize()``, as the default value is sensible. Second, the method will raise an error
+if ``termcrit.type!=(TermCriteria::MAX_ITER+TermCriteria::EPS)`` and ``termcrit.type!=TermCriteria::MAX_ITER``. This means that the termination criteria
+have to restrict the maximum number of iterations to be done and may optionally allow the algorithm to stop earlier if a certain tolerance
+is achieved (what we mean by "tolerance is achieved" will be clarified below). If ``termcrit`` restricts both tolerance and maximum iteration
+number, both ``termcrit.epsilon`` and ``termcrit.maxCount`` should be positive. In case ``termcrit.type==TermCriteria::MAX_ITER``,
+only the member ``termcrit.maxCount`` is required to be positive, and in this case the algorithm will just work for the required number of iterations.
+
+In the current implementation, "tolerance is achieved" means that we have arrived at the point where the :math:`L_2`-norm of the gradient is less
+than the tolerance value.
+
+.. ocv:function:: void optim::ConjGradSolver::setTermCriteria(const TermCriteria& termcrit)
+
+    :param termcrit: Terminal criteria to be used, represented as a ``TermCriteria`` structure (defined elsewhere in OpenCV). Mind you that it should meet ``termcrit.type==(TermCriteria::MAX_ITER+TermCriteria::EPS) && termcrit.epsilon>0 && termcrit.maxCount>0`` or ``termcrit.type==TermCriteria::MAX_ITER && termcrit.maxCount>0``, otherwise an error will be raised.
+
+optim::ConjGradSolver::minimize
+-----------------------------------
+
+The main method of the ``ConjGradSolver``. It actually runs the algorithm and performs the minimization. The sole input parameter determines the
+starting point (roughly, it tells where to start); all the others (terminal criteria and function to be minimized)
+are supposed to be set via the setters before the call to this method, or the default values (not always sensible) will be used. Sometimes it may
+throw an error, if these default values cannot be used (say, you forgot to set the function to minimize and the default value, that is, an empty
+function, cannot be used).
+
+.. ocv:function:: double optim::ConjGradSolver::minimize(InputOutputArray x)
+
+    :param x: The initial point. It is hard to overemphasize how important the choice of the initial point is when you are using a heuristic algorithm like this one. A badly chosen initial point can make the algorithm converge to a (local) maximum instead of a minimum, not converge at all, or converge to a local minimum instead of the global one.
+
+    :return: The value of the function at the point found.
+
+optim::createConjGradSolver
+------------------------------------
+
+This function returns a reference to a ready-to-use ``ConjGradSolver`` object. All the parameters are optional, so this procedure can be called
+even without parameters at all. In this case, the default values will be used. As the default values for the terminal criteria are the only sensible ones,
+``ConjGradSolver::setFunction()`` should be called upon the obtained object, if the function
+was not given to ``createConjGradSolver()``. Otherwise, the two ways (submit it to ``createConjGradSolver()`` or miss it out and call
+``ConjGradSolver::setFunction()``) are absolutely equivalent (and will raise the same errors in the same way,
+should invalid input be detected).
+
+.. ocv:function:: Ptr<optim::ConjGradSolver> optim::createConjGradSolver(const Ptr<Solver::Function>& f, TermCriteria termcrit)
+
+    :param f: Pointer to the function that will be minimized, similarly to the one you submit via ``ConjGradSolver::setFunction``.
+
+    :param termcrit: Terminal criteria to the algorithm, similarly to the one you submit via ``ConjGradSolver::setTermCriteria``.

modules/optim/doc/optim.rst

Lines changed: 1 addition & 0 deletions
@@ -10,3 +10,4 @@ optim. Generic numerical optimization
     linear_programming
     downhill_simplex_method
     primal_dual_algorithm
+    nonlinear_conjugate_gradient

modules/optim/include/opencv2/optim.hpp

Lines changed: 10 additions & 4 deletions
@@ -10,8 +10,7 @@
 //                          License Agreement
 //                For Open Source Computer Vision Library
 //
-// Copyright (C) 2000-2008, Intel Corporation, all rights reserved.
-// Copyright (C) 2008-2012, Willow Garage Inc., all rights reserved.
+// Copyright (C) 2013, OpenCV Foundation, all rights reserved.
 // Third party copyrights are property of their respective owners.
 //
 // Redistribution and use in source and binary forms, with or without modification,
@@ -30,7 +29,7 @@
 // This software is provided by the copyright holders and contributors "as is" and
 // any express or implied warranties, including, but not limited to, the implied
 // warranties of merchantability and fitness for a particular purpose are disclaimed.
-// In no event shall the Intel Corporation or contributors be liable for any direct,
+// In no event shall the OpenCV Foundation or contributors be liable for any direct,
 // indirect, incidental, special, exemplary, or consequential damages
 // (including, but not limited to, procurement of substitute goods or services;
 // loss of use, data, or profits; or business interruption) however caused
@@ -54,8 +53,8 @@ class CV_EXPORTS Solver : public Algorithm
     {
     public:
         virtual ~Function() {}
-        //! ndim - dimensionality
         virtual double calc(const double* x) const = 0;
+        virtual void getGradient(const double* /*x*/,double* /*grad*/) {}
     };
 
     virtual Ptr<Function> getFunction() const = 0;
@@ -86,6 +85,13 @@ CV_EXPORTS_W Ptr<DownhillSolver> createDownhillSolver(const Ptr<Solver::Function
         InputArray initStep=Mat_<double>(1,1,0.0),
         TermCriteria termcrit=TermCriteria(TermCriteria::MAX_ITER+TermCriteria::EPS,5000,0.000001));
 
+//! conjugate gradient method
+class CV_EXPORTS ConjGradSolver : public Solver{
+};
+
+CV_EXPORTS_W Ptr<ConjGradSolver> createConjGradSolver(const Ptr<Solver::Function>& f=Ptr<ConjGradSolver::Function>(),
+        TermCriteria termcrit=TermCriteria(TermCriteria::MAX_ITER+TermCriteria::EPS,5000,0.000001));
+
 //!the return codes for solveLP() function
 enum
 {

modules/optim/include/opencv2/optim/optim.hpp

Lines changed: 2 additions & 4 deletions
@@ -7,11 +7,9 @@
 // copy or use the software.
 //
 //
-//                         License Agreement
+//                          License Agreement
 //                For Open Source Computer Vision Library
 //
-// Copyright (C) 2000-2008, Intel Corporation, all rights reserved.
-// Copyright (C) 2009, Willow Garage Inc., all rights reserved.
 // Copyright (C) 2013, OpenCV Foundation, all rights reserved.
 // Third party copyrights are property of their respective owners.
 //
@@ -31,7 +29,7 @@
 // This software is provided by the copyright holders and contributors "as is" and
 // any express or implied warranties, including, but not limited to, the implied
 // warranties of merchantability and fitness for a particular purpose are disclaimed.
-// In no event shall the Intel Corporation or contributors be liable for any direct,
+// In no event shall the OpenCV Foundation or contributors be liable for any direct,
 // indirect, incidental, special, exemplary, or consequential damages
 // (including, but not limited to, procurement of substitute goods or services;
 // loss of use, data, or profits; or business interruption) however caused

0 commit comments
