tutorials/00-introduction/index.qmd (+15 −20)
@@ -1,5 +1,5 @@
 ---
-title: Introduction to Turing
+title: "Introduction: Coin Flipping"
 engine: julia
 aliases:
 - ../
@@ -12,23 +12,12 @@ using Pkg;
 Pkg.instantiate();
 ```
 
-### Introduction
+This is the first of a series of guided tutorials on the Turing language.
+In this tutorial, we will use Bayesian inference to estimate the probability that a coin flip will result in heads, given a series of observations.
 
-This is the first of a series of tutorials on the universal probabilistic programming language **Turing**.
+### Setup
 
-Turing is a probabilistic programming system written entirely in Julia. It has an intuitive modelling syntax and supports a wide range of sampling-based inference algorithms.
-
-Familiarity with Julia is assumed throughout this tutorial. If you are new to Julia, [Learning Julia](https://julialang.org/learning/) is a good starting point.
-
-For users new to Bayesian machine learning, please consider more thorough introductions to the field such as [Pattern Recognition and Machine Learning](https://www.springer.com/us/book/9780387310732). This tutorial tries to provide an intuition for Bayesian inference and gives a simple example on how to use Turing. Note that this is not a comprehensive introduction to Bayesian machine learning.
-
-### Coin Flipping Without Turing
-
-The following example illustrates the effect of updating our beliefs with every piece of new evidence we observe.
-
-Assume that we are unsure about the probability of heads in a coin flip. To get an intuitive understanding of what "updating our beliefs" is, we will visualize the probability of heads in a coin flip after each observed evidence.
-
-First, let us load some packages that we need to simulate a coin flip
+First, let us load some packages that we need to simulate a coin flip:
 
 ```{julia}
 using Distributions
@@ -43,8 +32,7 @@ and to visualize our results.
 using StatsPlots
 ```
 
-Note that Turing is not loaded here — we do not use it in this example. If you are already familiar with posterior updates, you can proceed to the next step.
-
+Note that Turing is not loaded here — we do not use it in this example.
 Next, we configure the data generating model. Let us set the true probability that a coin flip turns up heads
 
 ```{julia}
@@ -63,13 +51,20 @@ We simulate `N` coin flips by drawing N random samples from the Bernoulli distri
 data = rand(Bernoulli(p_true), N);
 ```
 
-Here is what the first five coin flips look like:
+Here are the first five coin flips:
 
 ```{julia}
 data[1:5]
 ```
 
-Next, we specify a prior belief about the distribution of heads and tails in a coin toss. Here we choose a [Beta](https://en.wikipedia.org/wiki/Beta_distribution) distribution as prior distribution for the probability of heads. Before any coin flip is observed, we assume a uniform distribution $\operatorname{U}(0, 1) = \operatorname{Beta}(1, 1)$ of the probability of heads. I.e., every probability is equally likely initially.
+
+### Coin Flipping Without Turing
+
+The following example illustrates the effect of updating our beliefs with every piece of new evidence we observe.
+
+Assume that we are unsure about the probability of heads in a coin flip. To get an intuitive understanding of what "updating our beliefs" is, we will visualize the probability of heads in a coin flip after each observed evidence.
+
+We begin by specifying a prior belief about the distribution of heads and tails in a coin toss. Here we choose a [Beta](https://en.wikipedia.org/wiki/Beta_distribution) distribution as prior distribution for the probability of heads. Before any coin flip is observed, we assume a uniform distribution $\operatorname{U}(0, 1) = \operatorname{Beta}(1, 1)$ of the probability of heads. I.e., every probability is equally likely initially.
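
The belief update this last added paragraph sets up has a closed form: the Beta distribution is conjugate to the Bernoulli likelihood, so each observed flip simply increments one of the two Beta parameters. A minimal sketch of that update, written in Python for illustration rather than the tutorial's Julia; `p_true`, `N`, and the seed are illustrative stand-ins for the tutorial's own values:

```python
import random

random.seed(12)

# Illustrative stand-ins for the tutorial's setup.
p_true = 0.5  # true probability of heads
N = 100       # number of simulated coin flips
data = [1 if random.random() < p_true else 0 for _ in range(N)]

# Uniform prior Beta(1, 1): every probability of heads is equally likely.
alpha, beta = 1.0, 1.0

# Conjugacy: after each Bernoulli observation the posterior is again a Beta,
# with heads incrementing alpha and tails incrementing beta.
for flip in data:
    alpha += flip
    beta += 1 - flip

# Posterior is Beta(1 + heads, 1 + tails); its mean estimates p(heads).
posterior_mean = alpha / (alpha + beta)
```

After all `N` flips the posterior is `Beta(1 + heads, 1 + tails)`, and its mean concentrates around the empirical frequency of heads as more evidence accumulates, which is exactly the "updating our beliefs" effect the tutorial goes on to visualize.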