```@meta
CurrentModule = MathOptInterface
DocTestSetup = quote
    using MathOptInterface
    const MOI = MathOptInterface
end
DocTestFilters = [r"MathOptInterface|MOI"]
```
The Utilities submodule provides a variety of functionality for managing
`MOI.ModelLike` objects.

`Utilities.Model` provides an implementation of a `ModelLike` that efficiently
supports all functions and sets defined within MOI. However, given the
extensibility of MOI, this might not cover all use cases.
Create a model as follows:
```julia
julia> model = MOI.Utilities.Model{Float64}()
MOIU.Model{Float64}
```
`Utilities.UniversalFallback` is a layer that sits on top of any `ModelLike`
and provides non-specialized (slower) fallbacks for constraints and attributes
that the underlying `ModelLike` does not support.
For example, `Utilities.Model` doesn't support some variable attributes like
`VariablePrimalStart`, so JuMP uses a combination of `Utilities.UniversalFallback`
and `Utilities.Model` as a generic problem cache:
```julia
julia> model = MOI.Utilities.UniversalFallback(MOI.Utilities.Model{Float64}())
MOIU.UniversalFallback{MOIU.Model{Float64}}
fallback for MOIU.Model{Float64}
```
!!! warning
    Adding a `UniversalFallback` means that your model will now support all
    constraints, even if the inner model does not! This can lead to unexpected
    behavior.
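For example, here is a minimal sketch of the behavior described above; the
values in the comments are what we expect, not doctest output:
```julia
inner = MOI.Utilities.Model{Float64}()
model = MOI.Utilities.UniversalFallback(inner)
x = MOI.add_variable(model)
# The inner model does not support this attribute...
MOI.supports(inner, MOI.VariablePrimalStart(), MOI.VariableIndex)  # false
# ...but the fallback stores it on the inner model's behalf:
MOI.supports(model, MOI.VariablePrimalStart(), MOI.VariableIndex)  # true
MOI.set(model, MOI.VariablePrimalStart(), x, 1.0)
```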
For advanced use cases that need efficient support for functions and sets
defined outside of MOI (but still known at compile time), we provide the
`Utilities.@model` macro.
The `@model` macro takes a name (for a new type, which must not exist yet),
eight tuples specifying the types of constraints that are supported, and then
a `Bool` indicating whether the type is a subtype of `MOI.AbstractOptimizer`
(if `true`) or `MOI.ModelLike` (if `false`).
The eight tuples are in the following order:

- Un-typed scalar sets, e.g., `Integer`
- Typed scalar sets, e.g., `LessThan`
- Un-typed vector sets, e.g., `Nonnegatives`
- Typed vector sets, e.g., `PowerCone`
- Un-typed scalar functions, e.g., `VariableIndex`
- Typed scalar functions, e.g., `ScalarAffineFunction`
- Un-typed vector functions, e.g., `VectorOfVariables`
- Typed vector functions, e.g., `VectorAffineFunction`
The tuples can contain more than one element. Typed sets must be specified
without their type parameter, i.e., `MOI.LessThan`, not `MOI.LessThan{Float64}`.
Here is an example:
```julia
julia> MOI.Utilities.@model(
           MyNewModel,
           (MOI.Integer,),  # Un-typed scalar sets
           (MOI.GreaterThan,),  # Typed scalar sets
           (MOI.Nonnegatives,),  # Un-typed vector sets
           (MOI.PowerCone,),  # Typed vector sets
           (MOI.VariableIndex,),  # Un-typed scalar functions
           (MOI.ScalarAffineFunction,),  # Typed scalar functions
           (MOI.VectorOfVariables,),  # Un-typed vector functions
           (MOI.VectorAffineFunction,),  # Typed vector functions
           true,  # <:MOI.AbstractOptimizer?
       )
MathOptInterface.Utilities.GenericOptimizer{T, MathOptInterface.Utilities.ObjectiveContainer{T}, MathOptInterface.Utilities.VariablesContainer{T}, MyNewModelFunctionConstraints{T}} where T

julia> model = MyNewModel{Float64}()
MOIU.GenericOptimizer{Float64, MOIU.ObjectiveContainer{Float64}, MOIU.VariablesContainer{Float64}, MyNewModelFunctionConstraints{Float64}}
```
!!! warning
    `MyNewModel` supports every `VariableIndex`-in-Set constraint, as well as
    `VariableIndex`, `ScalarAffineFunction`, and `ScalarQuadraticFunction`
    objective functions. Implement `MOI.supports_constraint` and `MOI.supports`
    as needed to forbid constraint and objective function combinations.
As another example, PATHSolver, which only supports
`VectorAffineFunction`-in-`Complements` constraints, defines its optimizer as:
```julia
julia> MOI.Utilities.@model(
           PathOptimizer,
           (),  # Scalar sets
           (),  # Typed scalar sets
           (MOI.Complements,),  # Vector sets
           (),  # Typed vector sets
           (),  # Scalar functions
           (),  # Typed scalar functions
           (),  # Vector functions
           (MOI.VectorAffineFunction,),  # Typed vector functions
           true,  # is_optimizer
       )
MathOptInterface.Utilities.GenericOptimizer{T, MathOptInterface.Utilities.ObjectiveContainer{T}, MathOptInterface.Utilities.VariablesContainer{T}, MathOptInterface.Utilities.VectorOfConstraints{MathOptInterface.VectorAffineFunction{T}, MathOptInterface.Complements}} where T
```
However, `PathOptimizer` does not support some `VariableIndex`-in-Set
constraints, so we must explicitly define:
```julia
julia> function MOI.supports_constraint(
           ::PathOptimizer,
           ::Type{MOI.VariableIndex},
           ::Type{<:Union{MOI.Semiinteger,MOI.Semicontinuous,MOI.ZeroOne,MOI.Integer}},
       )
           return false
       end
```
Finally, PATH doesn't support an objective function, so we need to add:
```julia
julia> MOI.supports(::PathOptimizer, ::MOI.ObjectiveFunction) = false
```
!!! warning
    This macro creates a new type, so it must be called from the top-level of
    a module, e.g., it cannot be called from inside a function.
A `Utilities.CachingOptimizer` is an MOI layer that abstracts the difference
between solvers that support incremental modification (e.g., they support
adding variables one-by-one), and solvers that require the entire problem in a
single API call (e.g., they only accept the `A`, `b`, and `c` matrices of a
linear program).
It has two parts:
- A cache, where the model can be built and modified incrementally
- An optimizer, which is used to solve the problem
```julia
julia> model = MOI.Utilities.CachingOptimizer(
           MOI.Utilities.Model{Float64}(),
           PathOptimizer{Float64}(),
       )
MOIU.CachingOptimizer{MOIU.GenericOptimizer{Float64, MOIU.ObjectiveContainer{Float64}, MOIU.VariablesContainer{Float64}, MOIU.VectorOfConstraints{MOI.VectorAffineFunction{Float64}, MOI.Complements}}, MOIU.Model{Float64}}
in state EMPTY_OPTIMIZER
in mode AUTOMATIC
with model cache MOIU.Model{Float64}
with optimizer MOIU.GenericOptimizer{Float64, MOIU.ObjectiveContainer{Float64}, MOIU.VariablesContainer{Float64}, MOIU.VectorOfConstraints{MOI.VectorAffineFunction{Float64}, MOI.Complements}}
```
A `Utilities.CachingOptimizer` may be in one of three possible states:

- `NO_OPTIMIZER`: The CachingOptimizer does not have any optimizer.
- `EMPTY_OPTIMIZER`: The CachingOptimizer has an empty optimizer, and it is not
  synchronized with the cached model. Modifications are forwarded to the cache,
  but not to the optimizer.
- `ATTACHED_OPTIMIZER`: The CachingOptimizer has an optimizer, and it is
  synchronized with the cached model. Modifications are forwarded to the
  optimizer. If the optimizer does not support modifications, an error will be
  thrown.
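You can query the current state and mode of a `Utilities.CachingOptimizer`
with `Utilities.state` and `Utilities.mode`:
```julia
MOI.Utilities.state(model)  # For the model above: EMPTY_OPTIMIZER
MOI.Utilities.mode(model)   # For the model above: AUTOMATIC
```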
Use `Utilities.attach_optimizer` to go from `EMPTY_OPTIMIZER` to
`ATTACHED_OPTIMIZER`:
```julia
julia> MOI.Utilities.attach_optimizer(model)

julia> model
MOIU.CachingOptimizer{MOIU.GenericOptimizer{Float64, MOIU.ObjectiveContainer{Float64}, MOIU.VariablesContainer{Float64}, MOIU.VectorOfConstraints{MOI.VectorAffineFunction{Float64}, MOI.Complements}}, MOIU.Model{Float64}}
in state ATTACHED_OPTIMIZER
in mode AUTOMATIC
with model cache MOIU.Model{Float64}
with optimizer MOIU.GenericOptimizer{Float64, MOIU.ObjectiveContainer{Float64}, MOIU.VariablesContainer{Float64}, MOIU.VectorOfConstraints{MOI.VectorAffineFunction{Float64}, MOI.Complements}}
```
!!! info
    You must be in `ATTACHED_OPTIMIZER` to use `optimize!`.
Use `Utilities.reset_optimizer` to go from `ATTACHED_OPTIMIZER` to
`EMPTY_OPTIMIZER`:
```julia
julia> MOI.Utilities.reset_optimizer(model)

julia> model
MOIU.CachingOptimizer{MOIU.GenericOptimizer{Float64, MOIU.ObjectiveContainer{Float64}, MOIU.VariablesContainer{Float64}, MOIU.VectorOfConstraints{MOI.VectorAffineFunction{Float64}, MOI.Complements}}, MOIU.Model{Float64}}
in state EMPTY_OPTIMIZER
in mode AUTOMATIC
with model cache MOIU.Model{Float64}
with optimizer MOIU.GenericOptimizer{Float64, MOIU.ObjectiveContainer{Float64}, MOIU.VariablesContainer{Float64}, MOIU.VectorOfConstraints{MOI.VectorAffineFunction{Float64}, MOI.Complements}}
```
!!! info
    Calling `MOI.empty!(model)` also resets the state to `EMPTY_OPTIMIZER`, so,
    after emptying a model, modifications are applied only to the cache.
Use `Utilities.drop_optimizer` to go from any state to `NO_OPTIMIZER`:
```julia
julia> MOI.Utilities.drop_optimizer(model)

julia> model
MOIU.CachingOptimizer{MOIU.GenericOptimizer{Float64, MOIU.ObjectiveContainer{Float64}, MOIU.VariablesContainer{Float64}, MOIU.VectorOfConstraints{MOI.VectorAffineFunction{Float64}, MOI.Complements}}, MOIU.Model{Float64}}
in state NO_OPTIMIZER
in mode AUTOMATIC
with model cache MOIU.Model{Float64}
with optimizer nothing
```
Pass an empty optimizer to `Utilities.reset_optimizer` to go from
`NO_OPTIMIZER` to `EMPTY_OPTIMIZER`:
```julia
julia> MOI.Utilities.reset_optimizer(model, PathOptimizer{Float64}())

julia> model
MOIU.CachingOptimizer{MOIU.GenericOptimizer{Float64, MOIU.ObjectiveContainer{Float64}, MOIU.VariablesContainer{Float64}, MOIU.VectorOfConstraints{MOI.VectorAffineFunction{Float64}, MOI.Complements}}, MOIU.Model{Float64}}
in state EMPTY_OPTIMIZER
in mode AUTOMATIC
with model cache MOIU.Model{Float64}
with optimizer MOIU.GenericOptimizer{Float64, MOIU.ObjectiveContainer{Float64}, MOIU.VariablesContainer{Float64}, MOIU.VectorOfConstraints{MOI.VectorAffineFunction{Float64}, MOI.Complements}}
```
Deciding when to attach and reset the optimizer is tedious, and you will often
write code like this:
```julia
try
    # modification
catch
    MOI.Utilities.reset_optimizer(model)
    # Re-try modification
end
```
To make this easier, `Utilities.CachingOptimizer` has two modes of operation:

- `AUTOMATIC`: The `CachingOptimizer` changes its state when necessary.
  Attempting to add a constraint or perform a modification not supported by
  the optimizer results in a drop to the `EMPTY_OPTIMIZER` state.
- `MANUAL`: The user must change the state of the `CachingOptimizer`.
  Attempting to perform an operation in the incorrect state results in an
  error.
By default, `AUTOMATIC` mode is chosen. However, you can create a
`CachingOptimizer` in `MANUAL` mode as follows:
```julia
julia> model = MOI.Utilities.CachingOptimizer(
           MOI.Utilities.Model{Float64}(),
           MOI.Utilities.MANUAL,
       )
MOIU.CachingOptimizer{MOI.AbstractOptimizer, MOIU.Model{Float64}}
in state NO_OPTIMIZER
in mode MANUAL
with model cache MOIU.Model{Float64}
with optimizer nothing

julia> MOI.Utilities.reset_optimizer(model, PathOptimizer{Float64}())

julia> model
MOIU.CachingOptimizer{MOI.AbstractOptimizer, MOIU.Model{Float64}}
in state EMPTY_OPTIMIZER
in mode MANUAL
with model cache MOIU.Model{Float64}
with optimizer MOIU.GenericOptimizer{Float64, MOIU.ObjectiveContainer{Float64}, MOIU.VariablesContainer{Float64}, MOIU.VectorOfConstraints{MOI.VectorAffineFunction{Float64}, MOI.Complements}}
```
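In `MANUAL` mode you drive the state transitions yourself. Here is a rough
sketch of a typical workflow; `SomeSolver.Optimizer` is a hypothetical solver
(the `PathOptimizer` defined above does not implement `optimize!`):
```julia
model = MOI.Utilities.CachingOptimizer(
    MOI.Utilities.Model{Float64}(),
    MOI.Utilities.MANUAL,
)
MOI.Utilities.reset_optimizer(model, SomeSolver.Optimizer())  # hypothetical solver
x = MOI.add_variable(model)            # EMPTY_OPTIMIZER: stored in the cache only
MOI.Utilities.attach_optimizer(model)  # copy the cache into the optimizer
MOI.optimize!(model)                   # allowed now that the state is ATTACHED_OPTIMIZER
```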
Use `print` to print the formulation of the model:
```julia
julia> model = MOI.Utilities.Model{Float64}();

julia> x = MOI.add_variable(model)
MathOptInterface.VariableIndex(1)

julia> MOI.set(model, MOI.VariableName(), x, "x_var")

julia> MOI.add_constraint(model, x, MOI.ZeroOne())
MathOptInterface.ConstraintIndex{MathOptInterface.VariableIndex, MathOptInterface.ZeroOne}(1)

julia> MOI.set(model, MOI.ObjectiveFunction{typeof(x)}(), x)

julia> MOI.set(model, MOI.ObjectiveSense(), MOI.MAX_SENSE)

julia> print(model)
Maximize VariableIndex:
 x_var
Subject to:
VariableIndex-in-ZeroOne
 x_var ∈ {0, 1}
```
Use `Utilities.latex_formulation` to display the model in LaTeX form:
```julia
julia> MOI.Utilities.latex_formulation(model)
$$ \begin{aligned}
\max\quad & x\_var \\
\text{Subject to}\\
 & \text{VariableIndex-in-ZeroOne} \\
 & x\_var \in \{0, 1\} \\
\end{aligned} $$
```
!!! tip
    In IJulia, calling `print` or ending a cell with
    `Utilities.latex_formulation` will render the model in LaTeX.
Pass `Utilities.PenaltyRelaxation` to `MOI.modify` to relax the problem by
adding penalized slack variables to the constraints. This is helpful when
debugging sources of infeasibility.
```julia
julia> model = MOI.Utilities.Model{Float64}();

julia> x = MOI.add_variable(model);

julia> MOI.set(model, MOI.VariableName(), x, "x")

julia> c = MOI.add_constraint(model, 1.0 * x, MOI.LessThan(2.0));

julia> map = MOI.modify(model, MOI.Utilities.PenaltyRelaxation(Dict(c => 2.0)));

julia> print(model)
Minimize ScalarAffineFunction{Float64}:
 0.0 + 2.0 v[2]
Subject to:
ScalarAffineFunction{Float64}-in-LessThan{Float64}
 0.0 + 1.0 x - 1.0 v[2] <= 2.0
VariableIndex-in-GreaterThan{Float64}
 v[2] >= 0.0

julia> map[c]
MathOptInterface.ScalarAffineFunction{Float64}(MathOptInterface.ScalarAffineTerm{Float64}[MathOptInterface.ScalarAffineTerm{Float64}(1.0, MathOptInterface.VariableIndex(2))], 0.0)
```
You can also modify a single constraint using `Utilities.ScalarPenaltyRelaxation`:
```julia
julia> model = MOI.Utilities.Model{Float64}();

julia> x = MOI.add_variable(model);

julia> MOI.set(model, MOI.VariableName(), x, "x")

julia> c = MOI.add_constraint(model, 1.0 * x, MOI.LessThan(2.0));

julia> f = MOI.modify(model, c, MOI.Utilities.ScalarPenaltyRelaxation(2.0));

julia> print(model)
Minimize ScalarAffineFunction{Float64}:
 0.0 + 2.0 v[2]
Subject to:
ScalarAffineFunction{Float64}-in-LessThan{Float64}
 0.0 + 1.0 x - 1.0 v[2] <= 2.0
VariableIndex-in-GreaterThan{Float64}
 v[2] >= 0.0

julia> f
MathOptInterface.ScalarAffineFunction{Float64}(MathOptInterface.ScalarAffineTerm{Float64}[MathOptInterface.ScalarAffineTerm{Float64}(1.0, MathOptInterface.VariableIndex(2))], 0.0)
```
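After optimizing the relaxed model with a solver, the function returned by
`Utilities.ScalarPenaltyRelaxation` (or stored in the map returned by
`Utilities.PenaltyRelaxation`) can be evaluated at the primal solution to
measure by how much the original constraint was violated. A hedged sketch,
assuming a solver wrapped in `optimizer` and the two-argument
`MOI.Utilities.eval_variables(value_fn, f)` method:
```julia
# Hypothetical post-solve query: `optimizer` is a placeholder for a real solver
# that has optimized the relaxed model, and `f` is the function returned above.
MOI.optimize!(optimizer)
violation = MOI.Utilities.eval_variables(
    vi -> MOI.get(optimizer, MOI.VariablePrimal(), vi),
    f,
)
```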
The constraints of `Utilities.Model` are stored as a vector of tuples of
function and set in a `Utilities.VectorOfConstraints`. Other representations
can be used by parameterizing the type `Utilities.GenericModel` (resp.
`Utilities.GenericOptimizer`). For instance, if all non-`VariableIndex`
constraints are affine, the coefficients of all the constraints can be stored
in a single sparse matrix using `Utilities.MatrixOfConstraints`. The constraint
storage can even be customized up to a point where it exactly matches the
storage of the solver of interest, in which case `copy_to` can be implemented
for the solver by first calling `copy_to` to copy the problem into this custom
model.
For instance, Clp defines the following model:
```julia
MOI.Utilities.@product_of_sets(LP, MOI.EqualTo{T}, MOI.LessThan{T}, MOI.GreaterThan{T})

const Model = MOI.Utilities.GenericModel{
    Float64,
    MOI.Utilities.MatrixOfConstraints{
        Float64,
        MOI.Utilities.MutableSparseMatrixCSC{Float64,Cint,MOI.Utilities.ZeroBasedIndexing},
        MOI.Utilities.Hyperrectangle{Float64},
        LP{Float64},
    },
}
```
The `copy_to` operation can now be implemented as follows (assuming that the
`Model` definition above is in the `Clp` module, so that it can be referred to
as `Model`, to distinguish it from `Utilities.Model`):
```julia
function _copy_to(dest::Optimizer, src::Model)
    @assert MOI.is_empty(dest)
    A = src.constraints.coefficients
    row_bounds = src.constraints.constants
    Clp_loadProblem(
        dest,
        A.n,
        A.m,
        A.colptr,
        A.rowval,
        A.nzval,
        src.lower_bound,
        src.upper_bound,
        # (...) objective vector (omitted),
        row_bounds.lower,
        row_bounds.upper,
    )
    # Set objective sense and constant (omitted)
    return
end

function MOI.copy_to(dest::Optimizer, src::Model)
    _copy_to(dest, src)
    return MOI.Utilities.identity_index_map(src)
end

function MOI.copy_to(
    dest::Optimizer,
    src::MOI.Utilities.UniversalFallback{Model},
)
    # Copy attributes from `src` to `dest` and error in case any unsupported
    # constraints or attributes are set in `UniversalFallback`.
    return MOI.copy_to(dest, src.model)
end

function MOI.copy_to(
    dest::Optimizer,
    src::MOI.ModelLike,
)
    model = Model()
    index_map = MOI.copy_to(model, src)
    _copy_to(dest, model)
    return index_map
end
```
Utilities provides `Utilities.ModelFilter` as a useful tool to copy a subset of
a model. For example, given an infeasible model, we can copy the irreducible
infeasible subsystem (for models implementing `ConstraintConflictStatus`) as
follows:
```julia
my_filter(::Any) = true  # Generic fallback: copy everything else.
function my_filter(ci::MOI.ConstraintIndex)
    # `src` is the infeasible model in which the conflict has been computed.
    status = MOI.get(src, MOI.ConstraintConflictStatus(), ci)
    return status != MOI.NOT_IN_CONFLICT
end
filtered_src = MOI.Utilities.ModelFilter(my_filter, src)
index_map = MOI.copy_to(dest, filtered_src)
```
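The same mechanism works for other subsets. For example, here is a hedged
sketch that copies only the `ScalarAffineFunction`-in-`LessThan` constraints of
`src` into `dest`, relying on the generic `true` fallback to pass variables and
attributes through unchanged:
```julia
affine_only(::Any) = true  # Generic fallback: keep everything else.
function affine_only(::MOI.ConstraintIndex{F,S}) where {F,S}
    return F <: MOI.ScalarAffineFunction && S <: MOI.LessThan
end
filtered_src = MOI.Utilities.ModelFilter(affine_only, src)
index_map = MOI.copy_to(dest, filtered_src)
```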
The value of some attributes can be inferred from the value of other
attributes. For example, the value of `ObjectiveValue` can be computed using
`ObjectiveFunction` and `VariablePrimal`.

When a solver gives direct access to an attribute, it is better to return this
value. However, if this is not the case, `Utilities.get_fallback` can be used
instead. For example:
```julia
function MOI.get(model::Optimizer, attr::MOI.ObjectiveValue)
    return MOI.Utilities.get_fallback(model, attr)
end
```
When writing MOI interfaces, we often need to handle situations in which we map
`ConstraintIndex`es to different values, for example, to a `String` for
`ConstraintName`.

One option is to use a dictionary like `Dict{MOI.ConstraintIndex,String}`.
However, this incurs a performance cost because the key is not a concrete type.
The DoubleDicts submodule helps this situation by providing two main types,
`Utilities.DoubleDicts.DoubleDict` and `Utilities.DoubleDicts.IndexDoubleDict`.
These types act like normal dictionaries, but internally they use more
efficient dictionaries specialized to the type of the function-set pair.
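As an illustration, here is a small sketch of using a `DoubleDict` directly. It
assumes the `DoubleDicts.DoubleDict{V}()` constructor; check the submodule's
docstrings for the exact API:
```julia
# Hedged sketch: map constraint indices to names with a DoubleDict. The value
# is stored in an inner dictionary specialized to (VariableIndex, ZeroOne).
names = MOI.Utilities.DoubleDicts.DoubleDict{String}()
ci = MOI.ConstraintIndex{MOI.VariableIndex,MOI.ZeroOne}(1)
names[ci] = "first_binary"
names[ci]  # "first_binary"
```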
The most common usage of a `DoubleDict` is in the `index_map` returned by
`copy_to`. Performance can be improved by using a function barrier. That is,
instead of code like:
```julia
index_map = MOI.copy_to(dest, src)
for (F, S) in MOI.get(src, MOI.ListOfConstraintTypesPresent())
    for ci in MOI.get(src, MOI.ListOfConstraintIndices{F,S}())
        dest_ci = index_map[ci]
        # ...
    end
end
```
use instead:
```julia
function function_barrier(
    dest,
    src,
    index_map::MOI.Utilities.DoubleDicts.IndexDoubleDictInner{F,S},
) where {F,S}
    for ci in MOI.get(src, MOI.ListOfConstraintIndices{F,S}())
        dest_ci = index_map[ci]
        # ...
    end
    return
end

index_map = MOI.copy_to(dest, src)
for (F, S) in MOI.get(src, MOI.ListOfConstraintTypesPresent())
    function_barrier(dest, src, index_map[F, S])
end
```