
Commit 0935661

2 parents: 0fc29f0 + a302745

File tree

9 files changed, +208 −98 lines changed

docs/make.jl

Lines changed: 1 addition & 0 deletions
```diff
@@ -24,6 +24,7 @@ makedocs(
   "tutorial/sim.md",
   "tutorial/mc.md",
   "tutorial/fit.md",
+  "tutorial/gsa.md"
 ],
 "API" => "api.md"
 ],
```

docs/src/tutorial/gsa-fig01.png

18.5 KB

docs/src/tutorial/gsa-fig02.png

20.6 KB

docs/src/tutorial/gsa-fig03.png

21.3 KB

docs/src/tutorial/gsa-fig04.png

8.79 KB

docs/src/tutorial/gsa.md

Lines changed: 207 additions & 0 deletions
# Global Sensitivity Analysis

This tutorial describes methods of **Global Sensitivity Analysis (GSA)** using **HetaSimulator** together with the **GlobalSensitivity** package. The content is based on the [GlobalSensitivity tutorial](https://docs.sciml.ai/GlobalSensitivity/stable/tutorials/juliacon21/). The current version of this tutorial is based on **HetaSimulator v0.6.2** and **GlobalSensitivity v2.7.0**.

To run GSA we need to install the **GlobalSensitivity** package in addition to **HetaSimulator** and **Plots**.
We will also use the **QuasiMonteCarlo** package for sampling.

```julia
julia> ]
(gsa) pkg> add GlobalSensitivity QuasiMonteCarlo
```
## Working example

All GSA methods are illustrated with the one-compartment [Ethanol PK Model](https://github.com/insysbio/alco).
Download the following files to your working directory:
```
wget https://raw.githubusercontent.com/insysbio/alco/main/src/index.heta
wget https://raw.githubusercontent.com/insysbio/alco/main/src/mod1.heta
wget https://raw.githubusercontent.com/insysbio/alco/main/src/mod2.heta
wget https://raw.githubusercontent.com/insysbio/alco/main/src/mod3.heta
wget https://raw.githubusercontent.com/insysbio/alco/main/src/qsp-units.heta
wget https://raw.githubusercontent.com/insysbio/alco/main/data-mumenthaler-2000/scenarios.csv
```
Load the platform into the Julia environment. The first argument to `load_platform` is the path to the directory containing the `index.heta` file.
We will use the same working directory where the `index.heta` file is located.

```julia
using HetaSimulator, GlobalSensitivity, Plots

p = load_platform(".")

scen = read_scenarios("scenarios.csv")
add_scenarios!(p, scen)
```
According to the Model1 scenario we will study the sensitivity of the *BrAC* observable (breath alcohol concentration) to four parameters *(:k_a, :Vd, :Vmax, :Km)*.

```julia
scn1 = scenarios(p)[:scn1]
s = sim(scn1)
plot(s, vars=[:BrAC])
```

![plot-default](./gsa-fig01.png)

To run GSA we need to define a function that takes in parameter values and outputs an observable. The observable can be any function of the time series (e.g. mean, maximum, minimum, AUC, value at a certain timepoint). We can also define a vector output to study the sensitivity of different observables or of an observed variable at different timepoints.

Let's have a look at some examples of such functions.
```julia
pnames = [:k_a, :Vd, :Vmax, :Km]
nump = length(pnames)

# Concentration value at t=8 hours
function C8hours_func(params)
  sol = sim(scn1; parameters = pnames .=> params)
  return sol(8.0, :BrAC)
end

# Concentration values at two timepoints t=4,8 hours
function C4_8hours_func(params)
  sol = sim(scn1; parameters = pnames .=> params)
  return sol.([4.0, 8.0], :BrAC)
end
```
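An AUC-type observable (mentioned above) can be defined in the same way. The sketch below is an illustration only: it applies a simple trapezoid rule over the saved time grid, and the name `AUC_func` is hypothetical rather than part of any API.

```julia
# Approximate AUC of BrAC with the trapezoid rule over the saved time points
function AUC_func(params)
  sol = sim(scn1; parameters = pnames .=> params)
  t = times(sol)
  brac = sol.(t, :BrAC)
  return sum(0.5 .* (brac[1:end-1] .+ brac[2:end]) .* diff(t))
end
```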
In this tutorial we will observe the maximum alcohol concentration in the compartment.
```julia
function Cmax_func(params)
  sol = sim(scn1; parameters = pnames .=> params)
  brac = sol.(times(sol), :BrAC)
  return maximum(brac)
end
```
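Before running GSA it can be useful to sanity-check the observable function on a single parameter vector. The values below are arbitrary, chosen within the bounds defined in the next step.

```julia
# quick check with an arbitrary parameter set in the order k_a, Vd, Vmax, Km
Cmax_func([0.1, 0.5, 0.1, 1.0])  # returns a single Cmax value
```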
Now we define parameter bounds based on the mean/sd values reported in the original paper [Ethanol Pharmacokinetics in White Women](https://doi.org/10.1111/j.1530-0277.2000.tb02103.x). The parameter bounds imply a Uniform distribution. The use of other distributions (Normal, LogNormal, etc.) for parameter sampling will be demonstrated later in the tutorial.

```julia
bounds = [[0.01, 0.5], [0.01, 1.], [0.01, 0.5], [0.01, 5.0]]  # k_a, Vd, Vmax, Km
```
## GSA Methods

### Regression/Correlation coefficients method

[RegressionGSA method details](https://docs.sciml.ai/GlobalSensitivity/stable/methods/regression/)

The least computationally demanding way to estimate global sensitivities is to compute correlation/regression coefficients. This method is based on the assumption that the output can be approximated by a linear model of the parameters (e.g. `BrACmax = a1*k_a + a2*Vd + a3*Vmax + a4*Km`). This assumption is reasonable if **the output is monotonic in each of the input parameters**; otherwise the reported coefficients can be misleading.

The number of samples required by the method should be larger than 2^P (where P is the number of input parameters). 2^P is the bare minimum, and a reasonable number of samples for a model with four or five parameters is 1000-5000.

Usually **Standard Regression Coefficients (SRC)** and **Partial Correlation Coefficients (PCC)** are considered.
```julia
nsamples = 1000
reg_sens = GlobalSensitivity.gsa(Cmax_func, RegressionGSA(true), bounds, samples = nsamples)

p = plot(layout=(2,1), size=(400,600))
heatmap!(p[1],
  reg_sens.partial_correlation, fc = cgrad([:blue, :orange]),
  yticks=false, xticks = (1:nump, String.(pnames)),
  title = "Partial correlation")

heatmap!(p[2],
  reg_sens.standard_regression, fc = cgrad([:blue, :orange]),
  yticks=false, xticks = (1:nump, String.(pnames)),
  title = "Standard correlation")
p
```
![plot-regression](./gsa-fig02.png)

**(Note!)** Regression-based GSA can also be applied to the results of multiple simulations provided to the `gsa()` function as matrices. See the [GlobalSensitivity docs](https://docs.sciml.ai/GlobalSensitivity/stable/methods/regression/). This allows us to generate parameters in advance from pre-defined distributions.

```julia
# Matrix of input parameters
X = zeros(length(pnames), nsamples)
X[1,:] .= rand(LogNormal(-3.08, 0.77), nsamples) # k_a
X[2,:] .= rand(Uniform(0.01, 1.), nsamples) # Vd
X[3,:] .= rand(Uniform(0.01, 0.5), nsamples) # Vmax
X[4,:] .= rand(LogNormal(0.46, 0.49), nsamples) # Km

# output max concentration values
Y = [Cmax_func(X[:, i]) for i in 1:size(X,2)]'

reg_sens = GlobalSensitivity.gsa(X, Y, RegressionGSA(true))
```
### Sobol (and eFAST) method

[Sobol method details](https://docs.sciml.ai/GlobalSensitivity/stable/methods/sobol/)

The Sobol method is a variance-decomposition GSA method based on a mathematical result by I.M. Sobol. The variance of the output is decomposed into contributions from each parameter and their interactions. Two indices are used to measure these contributions. The contribution of each parameter alone is represented by the **first-order index**, and the contribution of the parameter together with all its interactions with other parameters is represented by the **total-order index**.

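For reference, these indices have a standard definition that is not specific to the GlobalSensitivity implementation. With `Y` denoting the model output, `X_i` a single parameter and `X_{~i}` all remaining parameters, they can be written as

```math
S_i = \frac{\operatorname{Var}\bigl(\operatorname{E}[Y \mid X_i]\bigr)}{\operatorname{Var}(Y)},
\qquad
S_{T_i} = \frac{\operatorname{E}\bigl[\operatorname{Var}(Y \mid X_{\sim i})\bigr]}{\operatorname{Var}(Y)}
        = 1 - \frac{\operatorname{Var}\bigl(\operatorname{E}[Y \mid X_{\sim i}]\bigr)}{\operatorname{Var}(Y)}
```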
The requirement for the number of samples is the same as for the Correlation/Regression coefficients method: **The number of samples required by the method should be larger than 2^P (where P is the number of input parameters).**

```julia
nsamples = 1000
sobol_sens1 = GlobalSensitivity.gsa(Cmax_func, Sobol(), bounds, samples = nsamples)

p = plot(layout=(2,1), size=(400,600), xticks = (1:nump, String.(pnames)))
bar!(p[1], 1:nump, sobol_sens1.S1, title = "Sobol First order indices", label=false)
bar!(p[2], 1:nump, sobol_sens1.ST, title = "Sobol Total order indices", label=false)
p
```
![plot-sobol1](./gsa-fig03.png)

**(Note 1 !)** It is recommended to use a low-discrepancy sequence instead of pure Monte Carlo sampling to cover the search space effectively. One can choose `SobolSample()`, `LatinHypercubeSample()`, `HaltonSample()` or other quasi-Monte Carlo samplers from the `QuasiMonteCarlo` package.

```julia
using QuasiMonteCarlo

lb = [b[1] for b in bounds]
ub = [b[2] for b in bounds]
sampler = SobolSample()
A, B = QuasiMonteCarlo.generate_design_matrices(nsamples, lb, ub, sampler)

sobol_sens2 = GlobalSensitivity.gsa(Cmax_func, Sobol(), A, B)
```
**(Note 2 !)** To ensure the convergence of Sobol indices and to choose a reasonable sample size, it is useful to run the analysis with several sample sizes and monitor (plot) how the index values stabilize as the sample size increases.

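A minimal sketch of such a convergence check is shown below; the sample sizes and the tracked index (first-order index of `:k_a`) are arbitrary choices for illustration.

```julia
# re-run Sobol GSA at increasing sample sizes and track the first-order index of :k_a
ns = [250, 500, 1000, 2000]
s1_ka = map(ns) do n
  res = GlobalSensitivity.gsa(Cmax_func, Sobol(), bounds, samples = n)
  res.S1[1]
end
plot(ns, s1_ka, xlabel = "samples", ylabel = "S1 of k_a", label = false)
```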
**(Note 3 !)** As of GlobalSensitivity v2.7.0 the `Sobol()` implementation supports Uniform input ranges and not parameters' distributions. If you prefer to input parameters' distributions you can choose `eFAST()`, which is based on the Sobol method. Please refer to the [eFAST docs](https://docs.sciml.ai/GlobalSensitivity/stable/methods/efast/) for a detailed example.

### Morris method (Elementary effects method)

[Morris method details](https://docs.sciml.ai/GlobalSensitivity/stable/methods/morris/)

The Morris method estimates the parameters' contribution to the output by computing individual "elementary effects" (EE), which are local sensitivities at different points in the parameter space. The method runs multiple trajectories through a grid of points in parameter space, computes the EEs and outputs the mean and std of the EEs per parameter. The sensitivities are calculated per parameter, so the method doesn't take into account interactions between parameters. The Morris method doesn't have as strong a mathematical foundation as the Sobol method. Due to its simplicity, it is commonly used as a pre-scan method with all the input parameters of interest. Parameters identified as noninfluential can then be fixed to run the computationally demanding but more reliable variance-based methods (`Sobol()`, `eFAST()`).

The requirement for the number of samples is the same as for the Correlation/Regression coefficients method: **The number of samples required by the method should be larger than 2^P (where P is the number of input parameters).**

The number of estimated points in the parameter space is controlled by the `total_num_trajectory` and `num_trajectory` arguments. See the [Morris method docs](https://docs.sciml.ai/GlobalSensitivity/stable/methods/morris/) for details.
```julia
morris_sens = GlobalSensitivity.gsa(Cmax_func, Morris(total_num_trajectory=5000, num_trajectory=600), bounds)

bar(1:nump, morris_sens.means_star[1,:], title = "Morris method", label=false)
```
![plot-morris](./gsa-fig04.png)

**(Note 1 !)** The Morris method also outputs the variance of the EEs per parameter: `morris_sens.variances`. A high variance reported by the Morris method can be a marker of parameter interactions or of a highly nonlinear relationship between the parameter and the output.

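For example, the spread can be plotted in the same way as the means, assuming `morris_sens.variances` has the same layout as `morris_sens.means_star` (one row per output).

```julia
# bar plot of the EE variance per parameter
bar(1:nump, morris_sens.variances[1,:], title = "Morris method, EE variance", label=false, xticks = (1:nump, String.(pnames)))
```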
**(Note 2 !)** As with the `Sobol()` method, it is useful to try different numbers of trajectories and monitor how the mean values stabilize as the sample size increases.

**(Note 3 !)** As of GlobalSensitivity v2.7.0 the `Morris()` implementation doesn't support the `generate_design_matrices` interface (see the Sobol method notes), so only the default sampling scheme (Monte Carlo) can be used.

## Parallel setup

It is natural to speed up GSA methods (e.g. the Sobol method) by running simulations in parallel. To do so we need to rewrite `Cmax_func` and use the `batch` keyword argument of the `gsa()` function.
```julia
function Cmax_batch_func(params_batch)
  batch_size = size(params_batch, 2)
  sol = mc(scn1, pnames .=> [params_batch[i,:] for i in eachindex(pnames)], batch_size; parallel_type=EnsembleDistributed())

  out = zeros(batch_size)
  for i in eachindex(out)
    brac = sol[i].(times(sol[i]), :BrAC)
    out[i] = maximum(brac)
  end
  return out
end

sobol_sens = GlobalSensitivity.gsa(Cmax_batch_func, Sobol(), A, B, batch=true)
```
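`EnsembleDistributed()` relies on Julia worker processes being available, with the required packages loaded on each of them. A minimal setup sketch (the number of workers is an arbitrary choice, adjust it to your machine) could look like this:

```julia
# start local worker processes and make HetaSimulator available on each of them
using Distributed
addprocs(4)
@everywhere using HetaSimulator
```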

src/HetaSimulator.jl

Lines changed: 0 additions & 2 deletions
```diff
@@ -71,7 +71,6 @@ module HetaSimulator
 include("monte_carlo.jl")
 include("ensemble_stats.jl")
 include("import_platform.jl")
-include("gsa.jl")
 include("save_as_heta.jl")
 include("heta_funcs.jl")
@@ -92,7 +91,6 @@ module HetaSimulator
 export update
 export times, vals, status, status_summary
 export save_results, read_mcvecs
-export gsa, pearson, partial, standard
 export save_as_heta
 export scale_params, unscale_params
 export piecewise
```

src/gsa.jl

Lines changed: 0 additions & 92 deletions
This file was deleted.

test/single_comp_test.jl

Lines changed: 0 additions & 4 deletions
```diff
@@ -65,7 +65,6 @@ end
 mc1 = mc(Scenario(model, (0., 200.), observables=[:r1], saveat=0:10:200), [:k1=>Normal(0.02,1e-3)], mciter)
 mc2 = mc([:one=>scn1,:two=>scn2], [:k1=>Normal(0.02,1e-3)], mciter)
 mc1_reduced = mc(Scenario(model, (0., 200.), observables=[:r1]), [:k1=>Normal(0.02,1e-3)], mciter; output_func=output_func)
-gsar = gsa(mc1, 200)
 ens = EnsembleSummary(mc1)
 @test typeof(ens) <: HetaSimulator.LabelledEnsembleSummary
 @test typeof(mc1) <: HetaSimulator.MCResult
@@ -77,9 +76,6 @@ ens = EnsembleSummary(mc1)
 @test typeof(parameters(mc1)) <: Vector
 @test times(mc1[1])[end] == 200.
 @test keys((parameters(mc1[1]))) == (:k1,)
-@test size(pearson(gsar)) == (1,1)
-@test size(partial(gsar)) == (1,1)
-@test size(standard(gsar)) == (1,1)
 @test length(mc1_reduced) == mciter
```
