
Commit 0142603

penelopeysm and yebai authored
Reorganise introductory docs (#520)
* Remove quick start -- it's repeated content
* Restructure documentation
* Streamline introductory tutorial
  The excised material is more appropriate for the very first page that people click on.
* Streamline 'Getting Started' page
  1. Remove the section on posterior checks; this is the landing page and it's not necessary for people reading about the library for the first time to go through that.
  2. Signpost the way to the rest of the documentation at the bottom.
  3. Minor wording changes
* Update _quarto.yml

Co-authored-by: Hong Ge <[email protected]>
1 parent 71a7a60 · commit 0142603

File tree: 5 files changed, +59 −160 lines changed

_quarto.yml (+12, −11)
````diff
@@ -50,16 +50,15 @@ website:
     - text: documentation
       collapse-level: 1
       contents:
-        - section: "Documentation"
+        - section: "Users"
           # href: tutorials/index.qmd, This page will be added later so keep this line commented
           contents:
-            - section: "Using Turing - Modelling Syntax and Interface"
+            - tutorials/docs-00-getting-started/index.qmd
+            - tutorials/docs-12-using-turing-guide/index.qmd
+
+            - section: "Usage Tips"
               collapse-level: 1
               contents:
-                - tutorials/docs-00-getting-started/index.qmd
-                - text: "Quick Start"
-                  href: tutorials/docs-14-using-turing-quick-start/index.qmd
-                - tutorials/docs-12-using-turing-guide/index.qmd
                 - text: "Mode Estimation"
                   href: tutorials/docs-17-mode-estimation/index.qmd
                 - tutorials/docs-09-using-turing-advanced/index.qmd
@@ -70,7 +69,7 @@ website:
                 - text: "External Samplers"
                   href: tutorials/docs-16-using-turing-external-samplers/index.qmd

-            - section: "Using Turing - Tutorials"
+            - section: "Tutorials"
               contents:
                 - tutorials/00-introduction/index.qmd
                 - text: Gaussian Mixture Models
@@ -97,25 +96,27 @@ website:
                 - text: "Gaussian Process Latent Variable Models"
                   href: tutorials/12-gplvm/index.qmd

-        - section: "Developers: Contributing"
+        - section: "Developers"
+          contents:
+            - section: "Contributing"
               collapse-level: 1
               contents:
                 - text: "How to Contribute"
                   href: tutorials/docs-01-contributing-guide/index.qmd

-        - section: "Developers: PPL"
+            - section: "DynamicPPL in Depth"
               collapse-level: 1
               contents:
                 - tutorials/docs-05-for-developers-compiler/index.qmd
                 - text: "A Mini Turing Implementation I: Compiler"
                   href: tutorials/14-minituring/index.qmd
                 - text: "A Mini Turing Implementation II: Contexts"
                   href: tutorials/16-contexts/index.qmd
-                - tutorials/docs-06-for-developers-interface/index.qmd

-        - section: "Developers: Inference"
+            - section: "Inference (note: outdated)"
               collapse-level: 1
               contents:
+                - tutorials/docs-06-for-developers-interface/index.qmd
                 - tutorials/docs-04-for-developers-abstractmcmc-turing/index.qmd
                 - tutorials/docs-07-for-developers-variational-inference/index.qmd
                 - text: "Implementing Samplers"
````

tutorials/00-introduction/index.qmd (+15, −20)
````diff
@@ -1,5 +1,5 @@
 ---
-title: Introduction to Turing
+title: "Introduction: Coin Flipping"
 engine: julia
 aliases:
   - ../
@@ -12,23 +12,12 @@ using Pkg;
 Pkg.instantiate();
 ```

-### Introduction
+This is the first of a series of guided tutorials on the Turing language.
+In this tutorial, we will use Bayesian inference to estimate the probability that a coin flip will result in heads, given a series of observations.

-This is the first of a series of tutorials on the universal probabilistic programming language **Turing**.
+### Setup

-Turing is a probabilistic programming system written entirely in Julia. It has an intuitive modelling syntax and supports a wide range of sampling-based inference algorithms.
-
-Familiarity with Julia is assumed throughout this tutorial. If you are new to Julia, [Learning Julia](https://julialang.org/learning/) is a good starting point.
-
-For users new to Bayesian machine learning, please consider more thorough introductions to the field such as [Pattern Recognition and Machine Learning](https://www.springer.com/us/book/9780387310732). This tutorial tries to provide an intuition for Bayesian inference and gives a simple example on how to use Turing. Note that this is not a comprehensive introduction to Bayesian machine learning.
-
-### Coin Flipping Without Turing
-
-The following example illustrates the effect of updating our beliefs with every piece of new evidence we observe.
-
-Assume that we are unsure about the probability of heads in a coin flip. To get an intuitive understanding of what "updating our beliefs" is, we will visualize the probability of heads in a coin flip after each observed evidence.
-
-First, let us load some packages that we need to simulate a coin flip
+First, let us load some packages that we need to simulate a coin flip:

 ```{julia}
 using Distributions
@@ -43,8 +32,7 @@ and to visualize our results.
 using StatsPlots
 ```

-Note that Turing is not loaded here — we do not use it in this example. If you are already familiar with posterior updates, you can proceed to the next step.
-
+Note that Turing is not loaded here — we do not use it in this example.
 Next, we configure the data generating model. Let us set the true probability that a coin flip turns up heads

 ```{julia}
@@ -63,13 +51,20 @@ We simulate `N` coin flips by drawing N random samples from the Bernoulli distri
 data = rand(Bernoulli(p_true), N);
 ```

-Here is what the first five coin flips look like:
+Here are the first five coin flips:

 ```{julia}
 data[1:5]
 ```

-Next, we specify a prior belief about the distribution of heads and tails in a coin toss. Here we choose a [Beta](https://en.wikipedia.org/wiki/Beta_distribution) distribution as prior distribution for the probability of heads. Before any coin flip is observed, we assume a uniform distribution $\operatorname{U}(0, 1) = \operatorname{Beta}(1, 1)$ of the probability of heads. I.e., every probability is equally likely initially.
+
+### Coin Flipping Without Turing
+
+The following example illustrates the effect of updating our beliefs with every piece of new evidence we observe.
+
+Assume that we are unsure about the probability of heads in a coin flip. To get an intuitive understanding of what "updating our beliefs" is, we will visualize the probability of heads in a coin flip after each observed evidence.
+
+We begin by specifying a prior belief about the distribution of heads and tails in a coin toss. Here we choose a [Beta](https://en.wikipedia.org/wiki/Beta_distribution) distribution as prior distribution for the probability of heads. Before any coin flip is observed, we assume a uniform distribution $\operatorname{U}(0, 1) = \operatorname{Beta}(1, 1)$ of the probability of heads. I.e., every probability is equally likely initially.

 ```{julia}
 prior_belief = Beta(1, 1);
````
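The updating that the reorganised tutorial goes on to visualise has a closed form, since the Beta prior is conjugate to the Bernoulli likelihood. Below is a minimal sketch of that update, assuming only Distributions.jl; the `posterior` helper is illustrative and not part of the tutorial itself:

```julia
using Distributions

p_true = 0.5                         # true probability of heads
data = rand(Bernoulli(p_true), 100)  # simulated coin flips, as in the tutorial

# Starting from the uniform prior Beta(1, 1), observing h heads in
# n flips gives the conjugate posterior Beta(1 + h, 1 + n - h).
posterior(n) = Beta(1 + sum(data[1:n]), 1 + n - sum(data[1:n]))

mean(posterior(10)), mean(posterior(100))  # estimate tightens towards p_true
```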

tutorials/docs-00-getting-started/index.qmd (+29, −54)
````diff
@@ -16,96 +16,71 @@ Pkg.instantiate();

 To use Turing, you need to install Julia first and then install Turing.

-### Install Julia
+You will need to install Julia 1.7 or greater, which you can get from [the official Julia website](http://julialang.org/downloads/).

-You will need to install Julia 1.3 or greater, which you can get from [the official Julia website](http://julialang.org/downloads/).
-
-### Install Turing.jl
-
-Turing is an officially registered Julia package, so you can install a stable version of Turing by running the following in the Julia REPL:
+Turing is officially registered in the [Julia General package registry](https://github.com/JuliaRegistries/General), which means that you can install a stable version of Turing by running the following in the Julia REPL:

 ```{julia}
+#| eval: false
 #| output: false
 using Pkg
 Pkg.add("Turing")
 ```

-You can check if all tests pass by running `Pkg.test("Turing")` (it might take a long time)
-
-### Example
-
-Here's a simple example showing Turing in action.
+### Example usage

-First, we can load the Turing and StatsPlots modules
+First, we load the Turing and StatsPlots modules.
+The latter is required for visualising the results.

 ```{julia}
 using Turing
 using StatsPlots
 ```

-Then, we define a simple Normal model with unknown mean and variance
+We then specify our model, which is a simple Gaussian model with unknown mean and variance.
+Models are defined as ordinary Julia functions, prefixed with the `@model` macro.
+Each statement inside closely resembles how the model would be defined with mathematical notation.
+Here, both `x` and `y` are observed values, and are therefore passed as function parameters.
+`m` and `s²` are the parameters to be inferred.

 ```{julia}
 @model function gdemo(x, y)
     s² ~ InverseGamma(2, 3)
     m ~ Normal(0, sqrt(s²))
     x ~ Normal(m, sqrt(s²))
-    return y ~ Normal(m, sqrt(s²))
+    y ~ Normal(m, sqrt(s²))
 end
 ```

-Then we can run a sampler to collect results. In this case, it is a Hamiltonian Monte Carlo sampler
-
-```{julia}
-chn = sample(gdemo(1.5, 2), NUTS(), 1000, progress=false)
-```
-
-We can plot the results
+Suppose we observe `x = 1.5` and `y = 2`, and want to infer the mean and variance.
+We can pass these data as arguments to the `gdemo` function, and run a sampler to collect the results.
+Here, we collect 1000 samples using the No U-Turn Sampler (NUTS) algorithm.

 ```{julia}
-plot(chn)
+chain = sample(gdemo(1.5, 2), NUTS(), 1000, progress=false)
 ```

-In this case, because we use the normal-inverse gamma distribution as a conjugate prior, we can compute its updated mean as follows:
+We can plot the results:

 ```{julia}
-s² = InverseGamma(2, 3)
-m = Normal(0, 1)
-data = [1.5, 2]
-x_bar = mean(data)
-N = length(data)
-
-mean_exp = (m.σ * m.μ + N * x_bar) / (m.σ + N)
+plot(chain)
 ```

-We can also compute the updated variance
+and obtain summary statistics by indexing the chain:

 ```{julia}
-updated_alpha = shape(s²) + (N / 2)
-updated_beta =
-    scale(s²) +
-    (1 / 2) * sum((data[n] - x_bar)^2 for n in 1:N) +
-    (N * m.σ) / (N + m.σ) * ((x_bar)^2) / 2
-variance_exp = updated_beta / (updated_alpha - 1)
+mean(chain[:m]), mean(chain[:s²])
 ```

-Finally, we can check if these expectations align with our HMC approximations from earlier. We can compute samples from a normal-inverse gamma following the equations given [here](https://en.wikipedia.org/wiki/Normal-inverse-gamma_distribution#Generating_normal-inverse-gamma_random_variates).
+### Where to go next

-```{julia}
-function sample_posterior(alpha, beta, mean, lambda, iterations)
-    samples = []
-    for i in 1:iterations
-        sample_variance = rand(InverseGamma(alpha, beta), 1)
-        sample_x = rand(Normal(mean, sqrt(sample_variance[1]) / lambda), 1)
-        samples = append!(samples, sample_x)
-    end
-    return samples
-end
+::: {.callout-note title="Note on prerequisites"}
+Familiarity with Julia is assumed throughout the Turing documentation.
+If you are new to Julia, [Learning Julia](https://julialang.org/learning/) is a good starting point.

-analytical_samples = sample_posterior(updated_alpha, updated_beta, mean_exp, 2, 1000);
-```
+The underlying theory of Bayesian machine learning is not explained in detail in this documentation.
+A thorough introduction to the field is [*Pattern Recognition and Machine Learning*](https://www.springer.com/us/book/9780387310732) (Bishop, 2006); an online version is available [here (PDF, 18.1 MB)](https://www.microsoft.com/en-us/research/uploads/prod/2006/01/Bishop-Pattern-Recognition-and-Machine-Learning-2006.pdf).
+:::

-```{julia}
-density(analytical_samples; label="Posterior (Analytical)")
-density!(chn[:m]; label="Posterior (HMC)")
-```
+The next page on [Turing's core functionality](../../tutorials/docs-12-using-turing-guide/) explains the basic features of the Turing language.
+From there, you can either look at [worked examples of how different models are implemented in Turing](../../tutorials/00-introduction/), or [specific tips and tricks that can help you get the most out of Turing](../../tutorials/docs-17-mode-estimation/).
````
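Assembled from the added lines above, the revised landing-page example is self-contained and can be sanity-checked as below. Sampled values will vary from run to run; `progress=false` merely silences the progress bar:

```julia
using Turing, StatsPlots

# Gaussian model with unknown mean m and variance s²;
# x and y are observed, so they are passed as function parameters.
@model function gdemo(x, y)
    s² ~ InverseGamma(2, 3)
    m ~ Normal(0, sqrt(s²))
    x ~ Normal(m, sqrt(s²))
    y ~ Normal(m, sqrt(s²))
end

# Condition on x = 1.5, y = 2 and draw 1000 samples with NUTS.
chain = sample(gdemo(1.5, 2), NUTS(), 1000, progress=false)

plot(chain)                          # trace and density plots per parameter
mean(chain[:m]), mean(chain[:s²])    # posterior means of the parameters
```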

tutorials/docs-12-using-turing-guide/index.qmd (+3, −1)
````diff
@@ -1,5 +1,5 @@
 ---
-title: Guide
+title: "Core Functionality"
 engine: julia
 ---

@@ -10,6 +10,8 @@ using Pkg;
 Pkg.instantiate();
 ```

+This article provides an overview of the core functionality in Turing.jl, which are likely to be used across a wide range of models.
+
 ## Basics

 ### Introduction
````

tutorials/docs-14-using-turing-quick-start/index.qmd (−74): this file was deleted.
