This repository was archived by the owner on Mar 17, 2024. It is now read-only.

Commit 5366565
doc updates
1 parent 2731f48

4 files changed: 94 additions & 5 deletions

doc/index.rst

Lines changed: 2 additions & 1 deletion
@@ -12,7 +12,7 @@ in its `documentation <http://jump.readthedocs.org/en/latest/>`_.
 
 JuMPChance supports a particular class of chance constraints, namely those involving affine combinations of jointly normal random variables. Distributionally robust chance constraints are supported in the form of intervals on the mean and variance of the normally distributed random variables.
 
-JuMPChance is research code and not officially supported as part of JuMP. JuMPChance is released unter the terms of the `LGPL <http://www.gnu.org/licenses/lgpl-3.0.txt>`_ licence version 3:
+JuMPChance is research code and not officially supported as part of JuMP. JuMPChance is released under the terms of the `LGPL <http://www.gnu.org/licenses/lgpl-3.0.txt>`_ license version 3:
 
 .. code-block:: none
 
@@ -34,4 +34,5 @@ Contents
 installation.rst
 quickstart.rst
 solution.rst
+references.rst
 

doc/quickstart.rst

Lines changed: 4 additions & 4 deletions
@@ -8,7 +8,7 @@ This quick start guide will introduce the syntax of JuMPChance, again assuming
 familiarity with JuMP.
 
 
-Creating a Model
+Creating a model
 ^^^^^^^^^^^^^^^^
 
 JuMPChance models should be created by using the following constructor::
@@ -29,7 +29,7 @@ By default, JuMPChance will use `ECOS <https://github.com/JuliaOpt/ECOS.jl>`_,
 a lightweight open-source solver which supports the conic constraints needed for the
 reformulation method for solving chance-constrained problems.
 
-Defining Variables
+Defining variables
 ^^^^^^^^^^^^^^^^^^
 
 In JuMPChance, you can mix decision variables and random variables in expressions.
@@ -56,7 +56,7 @@ Index sets do not need to be ranges; they may be arbitrary Julia lists::
 
 defines two variables ``x[:cat]`` and ``x[:dog]``.
 
-Chance Constraints
+Chance constraints
 ^^^^^^^^^^^^^^^^^^
 
 A JuMPChance model may contain a combination of standard JuMP constraints
@@ -93,7 +93,7 @@ Chance constraints of the above form are added by using the ``addConstraint`` fu
 Adds the constraint :math:`P(z*x \leq -1) < 0.05`. Note that the ``with_probability`` argument specifies the *maximum* probability :math:`\epsilon` with which the constraint may be satisfied, and so should be a small number.
 
 
-Distributionally Robust Chance Constraints
+Distributionally robust chance constraints
 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
 
 One may also specify normally distributed random variables whose parameters
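
For illustration, the constraint and solve calls these headings refer to look as follows. This is a sketch only: it assumes a JuMPChance model ``m``, a decision variable ``x``, and an independent standard normal random variable ``z`` declared as in the quick start guide, and it uses only the names that appear in this documentation (``addConstraint``, ``with_probability``, ``solvechance``):

    # Sketch only: m, x, and z are assumed to be declared as in the quick
    # start guide; addConstraint, with_probability, and solvechance are the
    # names documented in these docs.
    addConstraint(m, z*x <= -1, with_probability = 0.05)  # enforces P(z*x <= -1) <= 0.05
    solvechance(m)  # default method = :Reformulate (second-order conic reformulation)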

doc/references.rst

Lines changed: 10 additions & 0 deletions
@@ -0,0 +1,10 @@
+
+----------
+References
+----------
+
+One of the first appearances of the convex reformulation discussed in the previous section was in 1958 by `Charnes, Cooper, and Symonds <http://www.jstor.org/stable/2627328>`_, subsequently studied by `Prékopa (1970) <http://rutcor.rutgers.edu/~prekopa/prob.pdf>`_ and others. It now appears in standard texts on stochastic programming. The field of chance constraints is quite broad, and a discussion of formulations which are not implemented here is beyond the scope of this document.
+
+The class of distributionally robust chance constraints considered is a special case of so-called robust second-order conic optimization. Tractability is discussed by `Ben-Tal, El Ghaoui, and Nemirovski (2009) <http://www2.isye.gatech.edu/~nemirovs/FullBookDec11.pdf>`_. The particular model was proposed by `Bienstock, Chertkov, and Harnett (2014) <http://dx.doi.org/10.1137/130910312>`_ with application to control of power systems under uncertainty.
+
+JuMPChance is being used in ongoing research projects and reports which are currently in preparation.

doc/solution.rst

Lines changed: 78 additions & 0 deletions
@@ -2,3 +2,81 @@
 -------------------------------
 Solution methods and parameters
 -------------------------------
+
+Standard reformulation
+^^^^^^^^^^^^^^^^^^^^^^
+
+Consider the chance constraint
+
+.. math::
+
+    P\left(\sum_{i=1}^k \left(c_i^Tx + d_i\right)z_i \geq b\right) \leq \epsilon
+
+where :math:`z \sim \mathcal{N}(\mu,\Sigma)` is a vector in :math:`\mathbb{R}^k` of jointly normal random
+variables with mean :math:`\mu` and covariance matrix :math:`\Sigma`. JuMPChance currently only supports a diagonal covariance matrix :math:`\Sigma`, i.e., all variables are independent, but we present the more general case here. For simplicity, we can introduce a new set of variables :math:`y_i = c_i^Tx + d_i` and reduce the constraint to:
+
+.. math::
+
+    P\left(y^Tz \geq b\right) \leq \epsilon
+
+Recall that :math:`y^Tz` is normally distributed with mean :math:`y^T\mu` and variance :math:`y^T\Sigma y`. Then
+
+.. math::
+
+    P\left(y^Tz \geq b\right) = P\left(y^Tz - y^T\mu \geq b - y^T\mu\right) = P\left( \frac{y^Tz - y^T\mu}{\sqrt{y^T\Sigma y}} \geq \frac{b - y^T\mu}{\sqrt{y^T\Sigma y}}\right)
+
+    = 1 - \Phi\left(\frac{b - y^T\mu}{\sqrt{y^T\Sigma y}}\right)
+
+where :math:`\Phi` is the standard normal cumulative distribution function.
+
+Therefore the chance constraint is satisfied if and only if
+
+.. math::
+
+    \Phi\left(\frac{b - y^T\mu}{\sqrt{y^T\Sigma y}}\right) \geq 1 - \epsilon
+
+or, since :math:`\Phi^{-1}` is monotonic increasing,
+
+.. math::
+
+    \frac{b - y^T\mu}{\sqrt{y^T\Sigma y}} \geq \Phi^{-1}(1-\epsilon)
+
+which is
+
+.. math::
+
+    y^T\mu + \Phi^{-1}(1-\epsilon)\sqrt{y^T\Sigma y} \leq b.
+
+For :math:`\epsilon < \frac{1}{2}`, :math:`\Phi^{-1}(1-\epsilon) > 0`, so the above constraint is convex and equivalent to
+
+.. math::
+
+    ||\Sigma^{\frac{1}{2}}y|| \leq (b-\mu^Ty)/\Phi^{-1}(1-\epsilon)
+
+which is a `second-order conic <http://en.wikipedia.org/wiki/Second-order_cone_programming>`_ constraint, where :math:`\Sigma^{\frac{1}{2}}` is the `square root <http://en.wikipedia.org/wiki/Square_root_of_a_matrix>`_ of :math:`\Sigma`.
+
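
For illustration, the reformulation above can be checked numerically. The following sketch uses present-day Julia with the Distributions package (an assumption; it is not part of JuMPChance) to compare a Monte Carlo estimate of :math:`P(y^Tz \geq b)` with :math:`1 - \Phi\left((b - y^T\mu)/\sqrt{y^T\Sigma y}\right)`, and to evaluate the deterministic form :math:`y^T\mu + \Phi^{-1}(1-\epsilon)\sqrt{y^T\Sigma y} \leq b` on a small instance:

    # Numeric sanity check of the chance-constraint reformulation
    # (illustrative sketch only; requires Distributions, not JuMPChance).
    using Distributions, LinearAlgebra, Random, Statistics

    Random.seed!(1)
    y       = [1.0, -2.0, 0.5]             # y_i = c_i' x + d_i, held fixed here
    mu      = [0.1, 0.2, -0.3]             # means of z
    Sigma   = Diagonal([0.4, 0.9, 0.25])   # diagonal covariance (independent z_i)
    b       = 2.0
    epsilon = 0.05

    m = dot(y, mu)                         # mean of y'z
    s = sqrt(dot(y, Sigma * y))            # standard deviation of y'z

    # P(y'z >= b): exact via the normal CDF vs. a Monte Carlo estimate
    p_exact = 1 - cdf(Normal(), (b - m) / s)
    d       = MvNormal(mu, Matrix(Sigma))
    p_mc    = mean(dot(y, rand(d)) >= b for _ in 1:10^6)

    # Deterministic equivalent: y'mu + Phi^{-1}(1-eps) * sqrt(y'Sigma y) <= b
    deterministic_ok = m + quantile(Normal(), 1 - epsilon) * s <= b
    println((p_exact, p_mc, deterministic_ok, p_exact <= epsilon))

For this data both checks agree: the estimated probability exceeds :math:`\epsilon` exactly when the deterministic inequality fails.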
+Methods for distributionally robust constraints
+^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
+
+Following the notation in the quick start guide, a distributionally robust
+chance constraint can be formulated as
+
+.. math::
+
+    ||\Sigma^{\frac{1}{2}}y|| \leq (b-\mu^Ty)/\Phi^{-1}(1-\epsilon)\quad \forall\, \mu \in M, \Sigma \in V
+
+This is a convex constraint because it is the intersection of a large (possibly infinite) set of convex constraints. These are challenging to reformulate into an explicit conic form. Instead, we approximate the constraint by a sequence of linear tangents, i.e., given a point :math:`y`, we detect if the constraint is violated for any choice of :math:`\mu` or :math:`\Sigma`, and if so we add a separating hyperplane which is simple to compute.
+
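
For illustration, the violation test behind such a cutting-plane scheme can be sketched for the diagonal-covariance interval model: given a candidate :math:`y`, the worst-case :math:`\mu` pushes each component toward its upper bound where :math:`y_i \geq 0` and toward its lower bound otherwise, and the worst-case variances are the interval upper bounds. The function below is a sketch only; the interval arguments and the function name are invented for the example and are not JuMPChance API:

    # Illustrative worst-case violation check for a cutting-plane (:Cuts) scheme.
    # The interval data and function name are invented for this sketch.
    using Distributions   # quantile(Normal(), .) gives the inverse normal CDF

    function worst_case_violation(y, b, epsilon, mu_lo, mu_hi, var_hi)
        t = quantile(Normal(), 1 - epsilon)   # positive for epsilon < 1/2
        # Worst-case mean: largest mu_i where y_i >= 0, smallest where y_i < 0
        mu_worst = [y[i] >= 0 ? mu_hi[i] : mu_lo[i] for i in eachindex(y)]
        # Worst-case diagonal covariance: the largest admissible variances
        lhs = sum(mu_worst[i] * y[i] for i in eachindex(y)) +
              t * sqrt(sum(var_hi[i] * y[i]^2 for i in eachindex(y)))
        return lhs - b   # > 0 means the robust chance constraint is violated at y
    end

If the returned value is positive, a linearization of the left-hand side at :math:`y` would serve as the separating hyperplane described above.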
+solvechance parameters
+^^^^^^^^^^^^^^^^^^^^^^
+
+The ``solvechance`` method has the following optional keyword parameters:
+
+- ``method::Symbol``, either ``:Reformulate`` to use the second-order conic formulation or ``:Cuts`` to approximate the constraints by a sequence of linear outer approximations. Defaults to ``:Reformulate``.
+- ``linearize_objective::Bool``, either ``true`` or ``false``, indicating whether to provide a convex quadratic objective directly to the solver or to use linear outer approximations. Defaults to ``false``.
+- ``probability_tolerance::Float64``, chance constraints are considered satisfied if within :math:`\epsilon` plus the given tolerance. Defaults to ``0.001``.
+- ``debug::Bool``, enables debugging output for the outer approximation algorithm. Defaults to ``false``.
+- ``iteration_limit::Int``, limits the number of iterations performed by the outer approximation algorithm. (In each iteration, a single linearization is added for each violated constraint.) Defaults to ``60``.
+- ``objective_linearization_tolerance::Float64``, absolute term-wise tolerance used when linearizing quadratic objectives. Defaults to ``1e-6``.
+- ``reformulate_quadobj_to_conic::Bool``, if ``true``, automatically reformulates a quadratic objective into second-order conic form. This is necessary for some solvers like Mosek or ECOS which don't support mixing quadratic and conic constraints. Defaults to ``false``, except when the solver is ECOS.
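
For illustration, these keywords might be combined as in the sketch below, which assumes a JuMPChance model ``m`` already built as in the quick start guide and uses only the keyword names documented above:

    # Sketch only: `m` is assumed to be a JuMPChance model built as in the
    # quick start guide; keyword names are those documented above.
    solvechance(m,
        method = :Cuts,                # linear outer approximations instead of the conic reformulation
        linearize_objective = true,    # outer-approximate a quadratic objective as well
        probability_tolerance = 1e-4,  # tighter tolerance on constraint satisfaction
        iteration_limit = 100,         # allow more outer-approximation iterations
        debug = true)                  # print progress of the outer-approximation algorithm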
