
Commit fc5418b

Add support for vector-valued objectives (#3176)
1 parent 4819d90 commit fc5418b

12 files changed: +543 −30 lines changed
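
In short, this commit lets `@objective` accept a vector of scalar expressions, and the documentation gains MultiObjectiveAlgorithms (MOA) as a solver layer for such models. The following minimal sketch is assembled from the documentation and tutorial changes below; the variable bounds, coefficients, and constraint are illustrative only and not part of the commit.

```julia
# Minimal sketch of the feature added in this commit, assembled from the
# documentation and tutorial changes below. The model data is illustrative only.
using JuMP
import HiGHS
import MultiObjectiveAlgorithms as MOA

model = Model(() -> MOA.Optimizer(HiGHS.Optimizer))
set_silent(model)
set_optimizer_attribute(model, MOA.Algorithm(), MOA.Lexicographic(; all_permutations = true))
@variable(model, 0 <= x[1:2] <= 1)
@constraint(model, x[1] + x[2] >= 1)
# New: pass a vector of scalar expressions to set a vector-valued objective.
@objective(model, Min, [x[1] + 2x[2], 3x[1] - x[2]])
optimize!(model)
# Each Pareto-optimal point is returned as a separate result.
for i in 1:result_count(model)
    println(objective_value(model; result = i), " at x = ", value.(x; result = i))
end
```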

docs/Project.toml (+3 −1)

@@ -12,6 +12,7 @@ JSONSchema = "7d188eb4-7ad8-530c-ae41-71a32a6d4692"
 JuMP = "4076af6c-e467-56ae-b986-b466b2749572"
 Literate = "98b081ad-f1c9-55d3-8b20-4c87d4299306"
 MathOptInterface = "b8f27783-ece8-5eb3-8dc8-9495eed66fee"
+MultiObjectiveAlgorithms = "0327d340-17cd-11ea-3e99-2fd5d98cecda"
 Pkg = "44cfe95a-1eb2-52ea-b672-e2afdf69b78f"
 Plots = "91a5bcdd-55d7-5caf-9e0b-520d859cae80"
 Random = "9a3f8284-a2c9-5f02-9a11-845980a1fd5c"
@@ -33,7 +34,8 @@ Ipopt = "=1.1.0"
 JSON = "0.21"
 JSONSchema = "1"
 Literate = "2.8"
-MathOptInterface = "=1.11.5"
+MathOptInterface = "=1.12.0"
+MultiObjectiveAlgorithms = "=0.1.1"
 Plots = "1"
 SCS = "=1.1.3"
 SQLite = "1"
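
The pins above apply only to the documentation build environment. A user who wants to try the new tutorials in their own environment would add the packages without pins; a hedged sketch:

```julia
# Sketch: install the packages used by the new multi-objective tutorials.
# Unlike the docs manifest above, no exact versions are pinned here.
import Pkg
Pkg.add(["JuMP", "HiGHS", "MultiObjectiveAlgorithms"])
```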

docs/make.jl (+3 −1)

@@ -131,6 +131,8 @@ const _PAGES = [
     "tutorials/linear/factory_schedule.md",
     "tutorials/linear/finance.md",
     "tutorials/linear/geographic_clustering.md",
+    "tutorials/linear/multi_objective_knapsack.md",
+    "tutorials/linear/multi_objective_examples.md",
     "tutorials/linear/knapsack.md",
     "tutorials/linear/multi.md",
     "tutorials/linear/n-queens.md",
@@ -285,7 +287,7 @@ function _add_moi_pages()
     !!! warning
         This documentation in this section is a copy of the official
         MathOptInterface documentation available at
-        [https://jump.dev/MathOptInterface.jl/v1.11.5](https://jump.dev/MathOptInterface.jl/v1.11.5).
+        [https://jump.dev/MathOptInterface.jl/v1.12.0](https://jump.dev/MathOptInterface.jl/v1.12.0).
         It is included here to make it easier to link concepts between JuMP and
         MathOptInterface.
     """
docs/src/manual/objective.md (+81 −0)

@@ -157,3 +157,84 @@ julia> @objective(model, Min, 2x)
 julia> @objective(model, Max, objective_function(model))
 2 x
 ```
+
+## Set a vector-valued objective
+
+Define a multi-objective optimization problem by passing a vector of objectives:
+
+```jldoctest; setup = :(model=Model())
+julia> @variable(model, x[1:2]);
+
+julia> @objective(model, Min, [1 + x[1], 2 * x[2]])
+2-element Vector{AffExpr}:
+ x[1] + 1
+ 2 x[2]
+
+julia> f = objective_function(model)
+2-element Vector{AffExpr}:
+ x[1] + 1
+ 2 x[2]
+```
+
+!!! tip
+    The [Multi-objective knapsack](@ref) tutorial provides an example of
+    solving a multi-objective integer program.
+
+In most cases, multi-objective optimization solvers will return multiple
+solutions, corresponding to points on the Pareto frontier. See
+[Multiple solutions](@ref) for information on how to query and work with
+multiple solutions.
+
+Note that you must set a single objective sense; that is, you cannot have both
+minimization and maximization objectives. Work around this limitation by
+choosing `Min` and negating any objectives you want to maximize:
+
+```jldoctest; setup = :(model=Model())
+julia> @variable(model, x[1:2]);
+
+julia> @expression(model, obj1, 1 + x[1])
+x[1] + 1
+
+julia> @expression(model, obj2, 2 * x[1])
+2 x[1]
+
+julia> @objective(model, Min, [obj1, -obj2])
+2-element Vector{AffExpr}:
+ x[1] + 1
+ -2 x[1]
+```
+
+Defining your objectives as expressions allows flexibility in how you can solve
+variations of the same problem, with some objectives removed and constrained to
+be no worse than a fixed value.
+
+```jldoctest; setup = :(model=Model())
+julia> @variable(model, x[1:2]);
+
+julia> @expression(model, obj1, 1 + x[1])
+x[1] + 1
+
+julia> @expression(model, obj2, 2 * x[1])
+2 x[1]
+
+julia> @expression(model, obj3, x[1] + x[2])
+x[1] + x[2]
+
+julia> @objective(model, Min, [obj1, obj2, obj3])  # Three-objective problem
+3-element Vector{AffExpr}:
+ x[1] + 1
+ 2 x[1]
+ x[1] + x[2]
+
+julia> # optimize!(model), look at the solution, talk to stakeholders, then
+       # decide you want to solve a new problem where the third objective is
+       # removed and constrained to be better than 2.0.
+       nothing
+
+julia> @objective(model, Min, [obj1, obj2])  # Two-objective problem
+2-element Vector{AffExpr}:
+ x[1] + 1
+ 2 x[1]
+
+julia> @constraint(model, obj3 <= 2.0)
+x[1] + x[2] ≤ 2.0
+```
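
Because only one sense is allowed, the value reported for a negated objective must be flipped back when results are read. A hedged sketch continuing the `obj1`/`obj2` example above; the solver choice and variable bounds are assumptions, not part of the manual page:

```julia
# Sketch: solve the Min-[obj1, -obj2] model above and undo the negation of
# obj2 when reporting results. Solver and bounds are assumptions.
using JuMP
import HiGHS
import MultiObjectiveAlgorithms as MOA

model = Model(() -> MOA.Optimizer(HiGHS.Optimizer))
set_silent(model)
@variable(model, 0 <= x[1:2] <= 1)
@expression(model, obj1, 1 + x[1])
@expression(model, obj2, 2 * x[1])
@objective(model, Min, [obj1, -obj2])  # obj2 is really a maximization objective
optimize!(model)
for i in 1:result_count(model)
    z = objective_value(model; result = i)
    println("obj1 = ", z[1], ", obj2 = ", -z[2])  # flip the sign back for obj2
end
```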

docs/src/manual/solutions.md (+4 −0)

@@ -573,6 +573,10 @@ for i in 2:result_count(model)
 end
 ```
 
+!!! tip
+    The [Multi-objective knapsack](@ref) tutorial provides an example of
+    querying multiple solutions.
+
 ## Checking feasibility of solutions
 
 To check the feasibility of a primal solution, use
docs/src/tutorials/linear/multi_objective_examples.jl (new file, +124 −0)

@@ -0,0 +1,124 @@
+# Copyright 2017, Iain Dunning, Joey Huchette, Miles Lubin, and contributors #src
+# This Source Code Form is subject to the terms of the Mozilla Public License #src
+# v.2.0. If a copy of the MPL was not distributed with this file, You can #src
+# obtain one at https://mozilla.org/MPL/2.0/. #src
+
+# # Simple multi-objective examples
+
+# This tutorial contains a number of examples of multi-objective programs from
+# the literature.
+
+# ## Required packages
+
+# This tutorial requires the following packages:
+
+using JuMP
+import HiGHS
+import MultiObjectiveAlgorithms as MOA
+
+# ## Bi-objective linear problem
+
+# This example is taken from Example 6.3 (from Steuer, 1985), page 154 of
+# Multicriteria Optimization (2nd ed), M. Ehrgott, Springer 2005. The code was
+# adapted from an example in [vOptGeneric](https://github.com/vOptSolver/vOptGeneric.jl)
+# by [@xgandibleux](https://github.com/xgandibleux).
+
+model = Model()
+set_silent(model)
+@variable(model, x1 >= 0)
+@variable(model, 0 <= x2 <= 3)
+@objective(model, Min, [3x1 + x2, -x1 - 2x2])
+@constraint(model, 3x1 - x2 <= 6)
+set_optimizer(model, () -> MOA.Optimizer(HiGHS.Optimizer))
+set_optimizer_attribute(
+    model,
+    MOA.Algorithm(),
+    MOA.Lexicographic(; all_permutations = true),
+)
+optimize!(model)
+solution_summary(model)
+
+#-
+
+for i in 1:result_count(model)
+    print(i, ": z = ", round.(Int, objective_value(model; result = i)), " | ")
+    println("x = ", value.([x1, x2]; result = i))
+end
+
+# ## Bi-objective linear assignment problem
+
+# This example is taken from Example 9.38 (from Ulungu and Teghem, 1994),
+# page 255 of Multicriteria Optimization (2nd ed), M. Ehrgott, Springer 2005.
+# The code was adapted from an example in [vOptGeneric](https://github.com/vOptSolver/vOptGeneric.jl)
+# by [@xgandibleux](https://github.com/xgandibleux).
+
+C1 = [5 1 4 7; 6 2 2 6; 2 8 4 4; 3 5 7 1]
+C2 = [3 6 4 2; 1 3 8 3; 5 2 2 3; 4 2 3 5]
+n = size(C2, 1)
+model = Model()
+set_silent(model)
+@variable(model, x[1:n, 1:n], Bin)
+@objective(model, Min, [sum(C1 .* x), sum(C2 .* x)])
+@constraint(model, [i = 1:n], sum(x[i, :]) == 1)
+@constraint(model, [j = 1:n], sum(x[:, j]) == 1)
+set_optimizer(model, () -> MOA.Optimizer(HiGHS.Optimizer))
+set_optimizer_attribute(model, MOA.Algorithm(), MOA.EpsilonConstraint())
+optimize!(model)
+solution_summary(model)
+
+#-
+
+for i in 1:result_count(model)
+    print(i, ": z = ", round.(Int, objective_value(model; result = i)), " | ")
+    println("x = ", round.(Int, value.(x; result = i)))
+end
+
+# ## Bi-objective shortest path problem
+
+# This example is taken from Exercise 9.5, page 269 of Multicriteria
+# Optimization (2nd ed), M. Ehrgott, Springer 2005. The code was adapted from
+# an example in [vOptGeneric](https://github.com/vOptSolver/vOptGeneric.jl) by
+# [@xgandibleux](https://github.com/xgandibleux).
+
+M = 50
+C1 = [
+    M 4 5 M M M
+    M M 2 1 2 7
+    M M M 5 2 M
+    M M 5 M M 3
+    M M M M M 4
+    M M M M M M
+]
+C2 = [
+    M 3 1 M M M
+    M M 1 4 2 2
+    M M M 1 7 M
+    M M 1 M M 2
+    M M M M M 2
+    M M M M M M
+]
+n = size(C2, 1)
+model = Model()
+set_silent(model)
+@variable(model, x[1:n, 1:n], Bin)
+@objective(model, Min, [sum(C1 .* x), sum(C2 .* x)])
+@constraint(model, sum(x[1, :]) == 1)
+@constraint(model, sum(x[:, n]) == 1)
+@constraint(model, [i = 2:n-1], sum(x[i, :]) - sum(x[:, i]) == 0)
+set_optimizer(model, () -> MOA.Optimizer(HiGHS.Optimizer))
+set_optimizer_attribute(model, MOA.Algorithm(), MOA.EpsilonConstraint())
+optimize!(model)
+solution_summary(model)
+
+#-
+
+for i in 1:result_count(model)
+    print(i, ": z = ", round.(Int, objective_value(model; result = i)), " | ")
+    X = round.(Int, value.(x; result = i))
+    print("Path:")
+    for ind in findall(val -> val == 1, X)
+        i, j = ind.I
+        print(" $i->$j")
+    end
+    println()
+end
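
The loop above prints the selected arcs of each result in arbitrary order. A hypothetical helper, not part of the commit, that orders them into a node sequence from node 1 to node `n`, assuming each result encodes a simple path:

```julia
# Hypothetical helper (not part of the commit): order the selected arcs of one
# result into a node sequence from 1 to n, assuming X encodes a simple path.
function ordered_path(X::Matrix{Int})
    n = size(X, 1)
    path, node = [1], 1
    while node != n
        node = findfirst(j -> X[node, j] == 1, 1:n)  # follow the outgoing arc
        push!(path, node)
    end
    return path
end

# Example use with the first result:
# ordered_path(round.(Int, value.(x; result = 1)))
```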
