Add support to query the reduced cost of a variable #776
Comments
This was discussed early in the design of MOI. I am not in favor of adding a new attribute for this; I would be in favor of having a helper function in MOI.Utilities.
The utility in MOI.Utilities could then compute the reduced cost by aggregating the duals of the variable's bound constraints.
Having a universal fallback plus a way to call the solver's API sounds like a good compromise. I also understand that adding an attribute and tests means more code maintenance. One more thing I'd like to mention here (more for future readers of this issue).
If the MOIU fallback plus the possibility to call the solver API directly (if one wishes to) is acceptable to all, then I can volunteer a PR for this, most likely some time this summer.
A release wouldn't block the PR.
@chriscoey and I discussed it a bit further and, rather than
Time is not a pressing matter here (at least not for me). I'd rather think this through than rush into things.
This is related to constrained variables: #710.
I'm not sure I fully understand what you meant.

Math-wise, do we agree that, for a conic (vector) variable, the dual should come from the conic constraint that creates it?

Code-wise, I agree. For example, if I have a conic variable, then MOI.get(model, MOI.VariableDual(), x1) would return the corresponding component of the conic dual.
Yes, see #759
Yes, but currently (i.e. without #759) it is not.
Then it's probably best to wait until #759 is merged. Happy to keep the discussion open though.
I'm unsure I understand the motivation for defining another way to access the duals on variable bounds.

This also seems quite far away from the original issue of "Why can't I access reduced costs easily?".
May I contribute to this in some way? I am using the following excerpt in my code:

function reduced_cost(var)::Float64
    rc = 0.0
    has_upper_bound(var) && (rc += shadow_price(UpperBoundRef(var)))
    has_lower_bound(var) && (rc += shadow_price(LowerBoundRef(var)))
    is_fixed(var) && (rc += shadow_price(FixRef(var)))
    !has_upper_bound(var) && !has_lower_bound(var) && !is_fixed(var) &&
        @warn "CAUTION: reduced_cost was called over a variable with no bounds"
    rc
end

But I would prefer to have an official method to call instead. While I need to use reduced costs in my research, I do not have a deep understanding of the simplex method and the maths related to it, so I would have greater peace of mind if I could call a method already named reduced_cost.
Agreed that, rather than at the MOI level, this might make sense at the JuMP level. We have shadow_price there already.
What would you suggest changing in my method to qualify it as a contribution? With some direction I can make a PR.
Probably just the following, plus docs and tests. There's a question whether we want to use shadow_price or the raw duals.

function reduced_cost(x::VariableRef)::Float64
if !has_duals(owner_model(x))
error("... something descriptive ...")
end
if is_fixed(x)
return shadow_price(FixRef(x))
end
rc = 0.0
if has_upper_bound(x)
rc += shadow_price(UpperBoundRef(x))
end
if has_lower_bound(x)
rc += shadow_price(LowerBoundRef(x))
end
return rc
end |
Since the start of the week I have worked intermittently on the idea of a reduced_cost function.

using JuMP, GLPK

model = Model(GLPK.Optimizer)  # (model construction was missing from the excerpt)
@variable(model, 0 <= a <= 10)
@objective(model, Max, a)
optimize!(model)
@show value(a)
@show reduced_cost(a)
@show shadow_price(LowerBoundRef(a))
@show shadow_price(UpperBoundRef(a))

gives
Which makes complete sense, given the definition of shadow_price.
The following should answer your points 0, 2, 3, and 5. AFAIK, the notion of reduced cost is only used in LP, more specifically for simplex-type algorithms.
For a primal LP in standard form with variable bounds, the dual has one multiplier for each equality constraint and one for each bound constraint.

It then follows from complementary slackness that, if a bound constraint is not active, then the corresponding multiplier is zero. In the extreme case where one of the bounds is infinite, the corresponding multiplier is also zero. Finally, for the duality-related conventions, you may want to check the JuMP docs and MOI docs on duality.
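For reference, the formulation the comment refers to did not survive extraction; the following is a standard reconstruction (my notation, not necessarily the original comment's). For the primal LP

min_x  c'x   s.t.  Ax = b,  l <= x <= u,

the dual is

max_{y, z_l, z_u}  b'y + l'z_l - u'z_u   s.t.  A'y + z_l - z_u = c,  z_l >= 0,  z_u >= 0,

so the reduced cost of x_j is c_j - (A'y)_j = (z_l)_j - (z_u)_j. Complementary slackness gives (z_l)_j (x_j - l_j) = 0 and (z_u)_j (u_j - x_j) = 0, which is exactly why an inactive (or infinite) bound contributes a zero multiplier.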
Our definition of "reduced cost" is the dual on the bound constraints, i.e. the sum of the duals of the variable's lower- and upper-bound constraints. To answer your questions:
I am sorry, but I think I will step out of this PR, at least for the moment. I discovered that:
When I have more time, and if this issue continues to be open, I will try to give it some effort again.
You're so close! Ignore the "textbook" definitions. Here is the correct code and documentation.

"""
reduced_cost(x::VariableRef)
Return the reduced cost associated with variable `x`.
Equivalent to querying the shadow price of the active variable bound
(if one exists and is active).
See also: [`shadow_price`](@ref).
"""
function reduced_cost(x::VariableRef)::Float64
if !has_duals(owner_model(x))
error("Unable to query reduced cost of $(x) because duals are not available.")
end
if is_fixed(x)
return shadow_price(FixRef(x))
end
rc = 0.0
if has_upper_bound(x)
rc += shadow_price(UpperBoundRef(x))
end
if has_lower_bound(x)
rc += shadow_price(LowerBoundRef(x))
end
return rc
end |
We should only be defining reduced cost if we're confident that it's compatible with standard textbook uses. @chkwon, as a textbook author, do you have thoughts on this?
So this is the "textbook" value and sign of the reduced cost. For example, in @henriquebecker91's example, Excel (that bastion of a textbook implementation) also returns +1. It also matches the implementation in Gurobi, which explicitly calls it a reduced cost. I more meant that @henriquebecker91 can ignore "textbook" definitions of the reduced cost that assume a particular standard form.
If we match up with Gurobi and Excel for both minimization and maximization, and both lower and upper bounds, then we should be all set.
For some reason,

using JuMP, Gurobi, Test
function reduced_cost(x::VariableRef)::Float64
model = owner_model(x)
if !has_duals(model)
error("Unable to query reduced cost of $(x) because duals are not available.")
end
sense = objective_sense(model) == MOI.MIN_SENSE ? 1.0 : -1.0
if is_fixed(x)
return sense * dual(FixRef(x))
end
rc = 0.0
if has_upper_bound(x)
rc += dual(UpperBoundRef(x))
end
if has_lower_bound(x)
rc += dual(LowerBoundRef(x))
end
return sense * rc
end
@testset "Reduced cost" begin
@testset "Error" begin
model = direct_model(Gurobi.Optimizer())
set_silent(model)
@variable(model, x >= 1)
@objective(model, Min, x)
@test_throws ErrorException reduced_cost(x)
end
@testset "Free variable" begin
model = direct_model(Gurobi.Optimizer())
set_silent(model)
@variable(model, x == 1.0)
@variable(model, y)
@objective(model, Min, x)
optimize!(model)
@test reduced_cost(y) == 0.0 ==
Gurobi.get_dblattrelement(backend(model).inner, "RC", 2)
end
@testset "Fixed Variables" begin
model = direct_model(Gurobi.Optimizer())
set_silent(model)
@variable(model, x == 10)
@objective(model, Min, x)
optimize!(model)
@test reduced_cost(x) == 1.0 ==
Gurobi.get_dblattrelement(backend(model).inner, "RC", 1)
@objective(model, Max, x)
optimize!(model)
@test reduced_cost(x) == 1.0 ==
Gurobi.get_dblattrelement(backend(model).inner, "RC", 1)
@objective(model, Min, -x)
optimize!(model)
@test reduced_cost(x) == -1.0 ==
Gurobi.get_dblattrelement(backend(model).inner, "RC", 1)
@objective(model, Max, -x)
optimize!(model)
@test reduced_cost(x) == -1.0 ==
Gurobi.get_dblattrelement(backend(model).inner, "RC", 1)
end
@testset "Bounds" begin
model = direct_model(Gurobi.Optimizer())
set_silent(model)
@variable(model, 0 <= x <= 10)
@objective(model, Min, x)
optimize!(model)
@test reduced_cost(x) == 1.0 ==
Gurobi.get_dblattrelement(backend(model).inner, "RC", 1)
@objective(model, Max, x)
optimize!(model)
@test reduced_cost(x) == 1.0 ==
Gurobi.get_dblattrelement(backend(model).inner, "RC", 1)
@objective(model, Min, -x)
optimize!(model)
@test reduced_cost(x) == -1.0 ==
Gurobi.get_dblattrelement(backend(model).inner, "RC", 1)
@objective(model, Max, -x)
optimize!(model)
@test reduced_cost(x) == -1.0 ==
Gurobi.get_dblattrelement(backend(model).inner, "RC", 1)
end
end
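To make the sign convention concrete without a Gurobi license, here is a sketch of the same check against GLPK. It assumes the reduced_cost implementation from the comment above (with the sense * dual(...) flip) has been defined; I have not run this here.

```julia
using JuMP, GLPK

# Reproduce the "Bounds" case with an open-source solver.
model = Model(GLPK.Optimizer)
@variable(model, 0 <= x <= 10)
@objective(model, Max, x)
optimize!(model)

# With the sense * dual(...) convention, this should match the +1
# reduced cost the thread reports for Gurobi and Excel when the
# upper bound is active.
@show reduced_cost(x)
```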
This is exactly the kind of thing I would not catch, but as you already provided the implementation and documentation, I will try to whip up some tests and make a PR. I am stuck at the fact that the same kind of test does not seem possible with the solver I am using.
I think the name is fine.

Reading this thread, I learned many new things about the meaning of the reduced costs. And I also have a new doubt: is it called 'reduced cost' for maximization problems also? What does it mean by 'cost'?

As far as I know, almost all graduate-level textbooks are written for minimization problems, so no confusion there. Interestingly, many undergraduate-level textbooks are written for maximization problems. For example, Hillier & Lieberman is based on maximization and avoids the term 'reduced cost' at all. I see some books, written for maximization problems, just avoid the term entirely. I'm not suggesting any new name for reduced_cost.
Closed by jump-dev/JuMP.jl#2205 |
Querying the reduced cost of a variable is very common in linear programming (the corresponding function is in all linear solvers' APIs); however, there is no straightforward way of doing this through MOI.

Currently, to access the reduced cost of a variable x, one has to manually query the dual value of every possible bound constraint on x (if any), and aggregate the results. That's very cumbersome, given that this feature is supported by all linear solvers.

Proposal:
Add a variable attribute to query the reduced cost of a variable, e.g. ReducedCost, or (more verbose) VariableReducedCost. For now, this would only apply to linear programming solvers (see list below). The syntax would look like:

Solver wrappers to be updated:

If it makes sense, support could be extended to non-linear solvers as well.
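The syntax example from the original post did not survive extraction. As a purely hypothetical sketch, the attribute would presumably follow the usual MOI attribute pattern (ReducedCost is the name proposed above; it was never actually added to MOI):

```julia
# Hypothetical sketch only: MOI.ReducedCost is the attribute name
# proposed in this issue, not an attribute that exists in MOI.
rc = MOI.get(model, MOI.ReducedCost(), x)
```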