Commit 3143ffb

Fix broken links in tutorials/09-variational-inference/ (#516)
Fixes #515
1 parent ef4165e commit 3143ffb

File tree

1 file changed: +3 lines, -5 lines

tutorials/09-variational-inference/index.qmd

+3 -5
@@ -13,7 +13,7 @@ Pkg.instantiate();
 In this post we'll have a look at what's know as **variational inference (VI)**, a family of _approximate_ Bayesian inference methods, and how to use it in Turing.jl as an alternative to other approaches such as MCMC. In particular, we will focus on one of the more standard VI methods called **Automatic Differentation Variational Inference (ADVI)**.
 
 Here we will focus on how to use VI in Turing and not much on the theory underlying VI.
-If you are interested in understanding the mathematics you can checkout [our write-up](../../docs/for-developers/variational_inference) or any other resource online (there a lot of great ones).
+If you are interested in understanding the mathematics you can checkout [our write-up](../../tutorials/docs-07-for-developers-variational-inference/) or any other resource online (there a lot of great ones).
 
 Using VI in Turing.jl is very straight forward.
 If `model` denotes a definition of a `Turing.Model`, performing VI is as simple as
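
For readers skimming the diff, here is a minimal, hypothetical sketch of the usage this context refers to: define a `Turing.Model`, pick a VI algorithm, and call `vi` (the `q = vi(m, vi_alg)` call appears in the next hunk's header). The toy model, the data, and the `ADVI(10, 1000)` settings below are illustrative assumptions, not the tutorial's actual code, and exported names may vary between Turing.jl versions.

```julia
# Hypothetical sketch (not the tutorial's code): fit a toy model with ADVI.
using Turing
using Turing: Variational   # the tutorial's own imports; exact names may vary by Turing.jl version

@model function gdemo(x)
    s² ~ InverseGamma(2, 3)        # prior on the observation variance
    μ ~ Normal(0, sqrt(s²))        # prior on the mean
    x .~ Normal(μ, sqrt(s²))       # likelihood of the observed data
end

m = gdemo([1.5, 2.0, 2.5])         # condition the model on some toy data
vi_alg = ADVI(10, 1000)            # 10 MC samples per gradient step, 1000 optimisation steps
q = vi(m, vi_alg)                  # returns a mean-field variational approximation `q`
rand(q, 5)                         # draw a few samples from the approximate posterior
```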
@@ -26,7 +26,7 @@ q = vi(m, vi_alg) # perform VI on `m` using the VI method `vi_alg`, which retur
 
 Thus it's no more work than standard MCMC sampling in Turing.
 
-To get a bit more into what we can do with `vi`, we'll first have a look at a simple example and then we'll reproduce the [tutorial on Bayesian linear regression](../../tutorials/5-linearregression) using VI instead of MCMC. Finally we'll look at some of the different parameters of `vi` and how you for example can use your own custom variational family.
+To get a bit more into what we can do with `vi`, we'll first have a look at a simple example and then we'll reproduce the [tutorial on Bayesian linear regression](../../tutorials/05-linear-regression/) using VI instead of MCMC. Finally we'll look at some of the different parameters of `vi` and how you for example can use your own custom variational family.
 
 We first import the packages to be used:
 
@@ -248,12 +248,10 @@ plot(p1, p2; layout=(2, 1), size=(900, 500))
 
 ## Bayesian linear regression example using ADVI
 
-This is simply a duplication of the tutorial [5. Linear regression](../regression/02_linear-regression) but now with the addition of an approximate posterior obtained using `ADVI`.
+This is simply a duplication of the tutorial on [Bayesian linear regression](../../tutorials/05-linear-regression/) (much of the code is directly lifted), but now with the addition of an approximate posterior obtained using `ADVI`.
 
 As we'll see, there is really no additional work required to apply variational inference to a more complex `Model`.
 
-This section is basically copy-pasting the code from the [linear regression tutorial](../regression/02_linear-regression).
-
 ```{julia}
 Random.seed!(1);
 ```
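
As a rough illustration of the claim in that hunk, that applying VI to a more complex `Model` needs no extra work, here is a hedged sketch of a Bayesian linear regression fitted with the same `vi`/`ADVI` call. The `linear_regression` model, the synthetic data, and the optimisation settings are assumptions made for illustration; the tutorial's actual code lives in the linked linear-regression tutorial.

```julia
# Hypothetical sketch (not the tutorial's code): ADVI on a Bayesian linear regression.
using Turing
using Turing: Variational
using LinearAlgebra   # for the identity scaling `I`
using Random

Random.seed!(1)

@model function linear_regression(X, y)
    σ² ~ truncated(Normal(0, 100), 0, Inf)                 # observation noise variance
    intercept ~ Normal(0, sqrt(3))                         # prior on the intercept
    coefficients ~ MvNormal(zeros(size(X, 2)), 10.0 * I)   # prior on the regression weights
    y ~ MvNormal(intercept .+ X * coefficients, σ² * I)    # Gaussian likelihood
end

X = randn(50, 3)                               # synthetic predictors
y = X * [1.0, -0.5, 2.0] .+ 0.3 .* randn(50)   # synthetic responses

model = linear_regression(X, y)
q = vi(model, ADVI(10, 1000))                  # same call as for the simple model above
```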

0 commit comments
