[Utilities] add PenaltyRelaxation #1995
Conversation
Not if we use
Bump. Does anyone want to take a look at this?
I like this and it looks good. Any reason why it isn't handling vector function-in-sets?
Only that it is a little more complicated to implement. I think this should cover 99% of the requests. |
It seems weird conceptually to use attributes to (destructively) transform a model. |
Indeed, maybe
Yes,
Any comments now that I've swapped to
This works quite nicely. Here's what it looks like from JuMP:
julia> using JuMP, HiGHS
julia> model = Model(HiGHS.Optimizer);
julia> set_silent(model)
julia> @variable(model, x >= 0);
julia> @objective(model, Max, 2x + 1);
julia> @constraint(model, c, 2x - 1 <= -2);
julia> optimize!(model)
julia> function penalty_relaxation!(
model::Model,
penalties;
default::Union{Nothing,Real} = 1.0,
)
if default !== nothing
default = Float64(default)
end
moi_penalties = Dict{MOI.ConstraintIndex,Float64}(
index(k) => Float64(v) for (k, v) in penalties
)
map = MOI.modify(
backend(model),
MOI.Utilities.PenaltyRelaxation(moi_penalties; default = default),
)
return Dict(
ConstraintRef(model, k, ScalarShape()) => jump_function(model, v) for
(k, v) in map
)
end
penalty_relaxation! (generic function with 2 methods)
julia> function penalty_relaxation!(model::Model; kwargs...)
return penalty_relaxation!(model, Dict(); kwargs...)
end
penalty_relaxation! (generic function with 2 methods)
julia> penalties = penalty_relaxation!(model; default = 2)
Dict{ConstraintRef{Model, MathOptInterface.ConstraintIndex{MathOptInterface.ScalarAffineFunction{Float64}, MathOptInterface.LessThan{Float64}}, ScalarShape}, AffExpr} with 1 entry:
c : 2 x - _[2] ≤ -1.0 => _[2]
julia> print(model)
Max 2 x - 2 _[2] + 1
Subject to
c : 2 x - _[2] ≤ -1.0
x ≥ 0.0
_[2] ≥ 0.0
julia> optimize!(model)
julia> value(penalties[c])
1.0
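For completeness, here is a minimal sketch of the same call directly at the MOI level, based on the PenaltyRelaxation constructor used in the wrapper above. The toy model and the comments are illustrative only, not output from this PR:

```julia
import MathOptInterface as MOI

# Hypothetical toy model mirroring the JuMP example above.
model = MOI.Utilities.Model{Float64}()
x = MOI.add_variable(model)
f = MOI.ScalarAffineFunction([MOI.ScalarAffineTerm(2.0, x)], 0.0)
c = MOI.add_constraint(model, f, MOI.LessThan(-1.0))
MOI.set(model, MOI.ObjectiveSense(), MOI.MAX_SENSE)
MOI.set(model, MOI.ObjectiveFunction{typeof(f)}(), f)

# Relax every supported constraint with a default penalty of 2.0.
# `MOI.modify` returns a map from each relaxed ConstraintIndex to the
# function measuring its violation.
map = MOI.modify(
    model,
    MOI.Utilities.PenaltyRelaxation(Dict{MOI.ConstraintIndex,Float64}(); default = 2.0),
)
map[c]  # the expression measuring the violation of `c`
```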
Bump.
Co-authored-by: Benoît Legat <[email protected]>
src/Utilities/penalty_relaxation.jl
Outdated
::MOI.ConstraintIndex,
::ScalarPenaltyRelaxation,
)
# A generic fallback if modification is not supported.
This is inconsistent: if a modification is not supported, an error should be thrown instead. Is that useful only for VariableIndex? Then we can check whether it is VariableIndex before calling modify.
No, it's for any set that isn't supported.
We either need supports or we need this fallback.
No, it's for any set that isn't supported.
Isn't the method below defined for any AbstractScalarSet? So is this in case an MOI extension defines another type of AbstractScalarFunction?
We could extend the definition of supports, but then we need to add supports methods for solvers.
Another option is to add a model-wise keyword to modify, like ignore_unsupported. If this is false, we just modify and let the error propagate. Otherwise, we use a try-catch and handle ModifyConstraintNotAllowed.
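With that keyword, the handling could look roughly like the following. This is a hypothetical sketch; the helper name `_relax_or_skip` and the warning are illustrative, not code from this PR:

```julia
# Hypothetical sketch of the try-catch option discussed above.
function _relax_or_skip(model::MOI.ModelLike, ci::MOI.ConstraintIndex, relaxation)
    try
        # Attempt the per-constraint relaxation; a solver or model that
        # does not support the change throws ModifyConstraintNotAllowed.
        return MOI.modify(model, ci, relaxation)
    catch err
        if err isa MOI.ModifyConstraintNotAllowed
            @warn("Skipping $ci: modification not supported")
            return nothing
        end
        rethrow(err)
    end
end
```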
I added the try-catch.
I would probably consider adding a default fallback throwing ModifyConstraintNotAllowed, but then we should do it for all modifications, so it's better done in a separate PR.
Part of jump-dev/JuMP.jl#3034. This PR explores the options we have to add a feasibility relaxation function to MOI.
The main decisions would be:
The other option, not implemented here, is some sort of Optimizer, like Dualization.Optimizer. That might be a better solution, but would add more code and overhead.
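For reference, the transformation applied in the JuMP example above is, for a less-than constraint with penalty ρ (a sketch; equality constraints need slacks in both directions):

```latex
% Relax f(x) <= b with a nonnegative slack y, penalized in the objective.
f(x) \le b
\quad\longrightarrow\quad
f(x) - y \le b, \quad y \ge 0,
\qquad
\max \; \text{obj} - \rho\, y \quad (\text{or } \min \; \text{obj} + \rho\, y)
```

In the example, c: 2x ≤ -1 becomes 2x - _[2] ≤ -1 with _[2] ≥ 0, and the objective Max 2x + 1 becomes Max 2x - 2 _[2] + 1, i.e. ρ = 2.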