
Commit 7c106e2

Add links to youtube tutorials
1 parent f5247d3 commit 7c106e2

File tree

6 files changed, +25 -1 lines changed
docs/source/figures/cvmc-many-models.png (425 KB)

docs/source/figures/cvmc.png (423 KB)

docs/source/figures/mc.png (426 KB)
tutorials/multi_fidelity/plot_control_variate_monte_carlo.py

+14
@@ -134,6 +134,20 @@
 #%%
 #Change ``eta`` to ``eta_mc`` to see how the variance reduction changes when the covariance between models is approximated
 
+
+#%%
+#Videos
+#------
+#Click on the image below to view a video tutorial on control variate Monte Carlo quadrature with one low-fidelity model
+#
+#.. image:: ../../figures/cvmc.png
+#    :target: https://youtu.be/2GpwGG8EmNU
+#
+#Click on the image below to view a video tutorial on control variate Monte Carlo quadrature with multiple lower-fidelity models
+#
+#.. image:: ../../figures/cvmc-many-models.png
+#    :target: https://youtu.be/lBvtWo2ZwaU
+
 #%%
 #References
 #^^^^^^^^^^
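For readers who want to experiment with the ``eta`` versus ``eta_mc`` comparison mentioned in this hunk without the tutorial's models, the following is a minimal NumPy sketch of a control variate estimator with one low-fidelity model. The models ``f0`` and ``f1``, the known low-fidelity mean ``mu1``, and the sample sizes are illustrative assumptions, not the tutorial's code; it only shows how replacing the exact weight ``eta`` with a per-trial estimate ``eta_mc`` affects the achieved variance reduction.

# Toy two-model control variate Monte Carlo estimator (illustrative
# models, not the tutorial's code). The low-fidelity mean mu1 is known.
import numpy as np

rng = np.random.default_rng(1)
f0 = lambda z: z**5                  # high-fidelity model (quantity of interest)
f1 = lambda z: z**4                  # low-fidelity model
mu1 = 1/5                            # exact mean of f1 for z ~ U(0, 1)

# "exact" control variate weight eta = -cov(f0, f1)/var(f1) from a large pilot sample
zz = rng.uniform(0, 1, 100000)
eta = -np.cov(f0(zz), f1(zz))[0, 1] / f1(zz).var(ddof=1)

nsamples, ntrials = 100, 2000
mc, cv, cv_mc = [], [], []
for _ in range(ntrials):
    z = rng.uniform(0, 1, nsamples)
    q0, q1 = f0(z).mean(), f1(z).mean()
    # eta_mc re-estimates the covariance from the same small sample
    eta_mc = -np.cov(f0(z), f1(z))[0, 1] / f1(z).var(ddof=1)
    mc.append(q0)
    cv.append(q0 + eta*(q1 - mu1))
    cv_mc.append(q0 + eta_mc*(q1 - mu1))

print("MC estimator variance        ", np.var(mc))
print("CV variance, exact eta       ", np.var(cv))
print("CV variance, estimated eta_mc", np.var(cv_mc))

Because ``eta_mc`` is computed from the same small sample, its variance reduction is typically somewhat worse than with the exact weight, which is the effect the tutorial asks the reader to observe.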

tutorials/multi_fidelity/plot_monte_carlo.py

+9
@@ -138,3 +138,12 @@ def plot_estimator_histrogram(nsamples, model_id, ax):
 # \mean{\left(Q_{\alpha}(\rvset_N)-\mean{Q}\right)^2}=\underbrace{N^{-1}\var{f_\alpha}}_{I}+\underbrace{\left(Q_{\alpha}-Q\right)^2}_{II}
 #
 #From this expression we can see that the MSE can be decomposed into two terms: a so-called stochastic error (I) and a deterministic bias (II). The first term is the variance of the Monte Carlo estimator, which comes from using a finite number of samples. The second term is due to using an approximation of :math:`f_0`. These two errors should be balanced; however, in the vast majority of MC analyses a single model :math:`f_\alpha` is used and the choice of :math:`\alpha`, e.g. mesh resolution, is made a priori without much concern for balancing the bias and variance.
+#
+
+#%%
+#Video
+#-----
+#Click on the image below to view a video tutorial on Monte Carlo quadrature
+#
+#.. image:: ../../figures/mc.png
+#    :target: https://youtu.be/OrLsYo11kvY?si=chAXN6UqssXY8pxh
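The decomposition shown in the context lines of this hunk can be checked numerically. The sketch below is not the tutorial's code; the exact model ``f0``, the biased approximation ``f_alpha``, and the sample sizes are assumptions chosen so the exact mean is known in closed form. It estimates the MSE empirically and compares it with the sum of the variance term (I) and the squared-bias term (II).

# Numerical check of MSE = N^{-1} Var[f_alpha] + (Q_alpha - Q)^2
# (illustrative models, not the tutorial's code).
import numpy as np

rng = np.random.default_rng(0)
f0 = lambda z: np.sin(np.pi*z)                # "exact" model
f_alpha = lambda z: np.sin(np.pi*z) + 0.05    # biased approximation of f0
exact_mean = 2/np.pi                          # Q = E[f0] for z ~ U(0, 1)

N, ntrials = 100, 20000
# empirical MSE of the MC estimator of Q built from f_alpha
estimates = np.array(
    [f_alpha(rng.uniform(0, 1, N)).mean() for _ in range(ntrials)])
mse = np.mean((estimates - exact_mean)**2)

zz = rng.uniform(0, 1, 1000000)
term_I = f_alpha(zz).var()/N                          # stochastic error (I)
term_II = (f_alpha(zz).mean() - exact_mean)**2        # squared deterministic bias (II)

print(mse, term_I + term_II)   # the two values should nearly agree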

tutorials/multi_fidelity/plot_multilevel_blue.py

+2-1
@@ -145,6 +145,7 @@
 #
 #where :math:`W_j` denotes the cost of evaluating the jth model and :math:`W_{\max}` is the total budget.
 #
+# This optimization problem can be solved effectively using semi-definite programming [CWARXIV2023]_.
 
 target_costs = np.array([1e1, 1e2, 1e3], dtype=int)
 estimators = [
@@ -182,4 +183,4 @@
 #
 #.. [SUSIAMUQ2021] `D. Schaden, E. Ullmann. Asymptotic Analysis of Multilevel Best Linear Unbiased Estimators. SIAM/ASA Journal on Uncertainty Quantification 9 (3):953-978, 2021. <https://doi.org/10.1137/20M1321607>`_
 #
-#.. [CWARXIV2023]_ `M. Croci, K. Willcox, S. Wright. Multi-output multilevel best linear unbiased estimators via semidefinite programming. (2023) <https://doi.org/10.1016/j.cma.2023.116130>`_
+#.. [CWARXIV2023] `M. Croci, K. Willcox, S. Wright. Multi-output multilevel best linear unbiased estimators via semidefinite programming. (2023) <https://doi.org/10.1016/j.cma.2023.116130>`_
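The line added in the first hunk of this file notes that the budget-constrained sample-allocation problem can be solved with semidefinite programming [CWARXIV2023]_. The sketch below is not that SDP; it is a much simpler continuous allocation, with made-up per-model variances and an assumed budget, meant only to illustrate the general shape of minimizing estimator variance subject to a cost constraint with an off-the-shelf convex solver (cvxpy).

# Simplified sample-allocation sketch (NOT the MLBLUE semidefinite
# program of [CWARXIV2023]_): minimize a sum of per-model variance
# contributions var_j / n_j subject to a cost budget. Variances and
# budget are made up for illustration.
import cvxpy as cp
import numpy as np

costs = np.array([1e1, 1e2, 1e3])        # W_j, matching target_costs above
variances = np.array([1.0, 0.9, 0.8])    # assumed per-model variances
budget = 1e4                             # assumed total budget W_max

n = cp.Variable(3, pos=True)             # continuous (relaxed) sample counts
objective = cp.Minimize(variances @ cp.inv_pos(n))
constraints = [costs @ n <= budget]
cp.Problem(objective, constraints).solve()
print(np.round(n.value))

The actual MLBLUE problem optimizes sample counts over subsets of models with a covariance-dependent objective, which is where the semidefinite reformulation of [CWARXIV2023]_ comes in.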
