
Commit 036c1aa

Streamline introductory tutorial
The excised material is more appropriate for the very first page that people click on.
1 parent eac5bdf commit 036c1aa

File tree

tutorials/00-introduction/index.qmd

1 file changed (+15 −20 lines)


Diff for: tutorials/00-introduction/index.qmd

+15 −20

@@ -1,5 +1,5 @@
 ---
-title: Introduction to Turing
+title: "Introduction: Coin Flipping"
 engine: julia
 aliases:
 - ../
@@ -12,23 +12,12 @@ using Pkg;
 Pkg.instantiate();
 ```
 
-### Introduction
+This is the first of a series of guided tutorials on the Turing language.
+In this tutorial, we will use Bayesian inference to estimate the probability that a coin flip will result in heads, given a series of observations.
 
-This is the first of a series of tutorials on the universal probabilistic programming language **Turing**.
+### Setup
 
-Turing is a probabilistic programming system written entirely in Julia. It has an intuitive modelling syntax and supports a wide range of sampling-based inference algorithms.
-
-Familiarity with Julia is assumed throughout this tutorial. If you are new to Julia, [Learning Julia](https://julialang.org/learning/) is a good starting point.
-
-For users new to Bayesian machine learning, please consider more thorough introductions to the field such as [Pattern Recognition and Machine Learning](https://www.springer.com/us/book/9780387310732). This tutorial tries to provide an intuition for Bayesian inference and gives a simple example on how to use Turing. Note that this is not a comprehensive introduction to Bayesian machine learning.
-
-### Coin Flipping Without Turing
-
-The following example illustrates the effect of updating our beliefs with every piece of new evidence we observe.
-
-Assume that we are unsure about the probability of heads in a coin flip. To get an intuitive understanding of what "updating our beliefs" is, we will visualize the probability of heads in a coin flip after each observed evidence.
-
-First, let us load some packages that we need to simulate a coin flip
+First, let us load some packages that we need to simulate a coin flip:
 
 ```{julia}
 using Distributions
@@ -43,8 +32,7 @@ and to visualize our results.
 using StatsPlots
 ```
 
-Note that Turing is not loaded here — we do not use it in this example. If you are already familiar with posterior updates, you can proceed to the next step.
-
+Note that Turing is not loaded here — we do not use it in this example.
 Next, we configure the data generating model. Let us set the true probability that a coin flip turns up heads
 
 ```{julia}
@@ -63,13 +51,20 @@ We simulate `N` coin flips by drawing N random samples from the Bernoulli distri
 data = rand(Bernoulli(p_true), N);
 ```
 
-Here is what the first five coin flips look like:
+Here are the first five coin flips:
 
 ```{julia}
 data[1:5]
 ```
 
-Next, we specify a prior belief about the distribution of heads and tails in a coin toss. Here we choose a [Beta](https://en.wikipedia.org/wiki/Beta_distribution) distribution as prior distribution for the probability of heads. Before any coin flip is observed, we assume a uniform distribution $\operatorname{U}(0, 1) = \operatorname{Beta}(1, 1)$ of the probability of heads. I.e., every probability is equally likely initially.
+
+### Coin Flipping Without Turing
+
+The following example illustrates the effect of updating our beliefs with every piece of new evidence we observe.
+
+Assume that we are unsure about the probability of heads in a coin flip. To get an intuitive understanding of what "updating our beliefs" is, we will visualize the probability of heads in a coin flip after each observed evidence.
+
+We begin by specifying a prior belief about the distribution of heads and tails in a coin toss. Here we choose a [Beta](https://en.wikipedia.org/wiki/Beta_distribution) distribution as prior distribution for the probability of heads. Before any coin flip is observed, we assume a uniform distribution $\operatorname{U}(0, 1) = \operatorname{Beta}(1, 1)$ of the probability of heads. I.e., every probability is equally likely initially.
 
 ```{julia}
 prior_belief = Beta(1, 1);
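
The diff cuts off at the prior. For readers skimming the commit, here is a minimal sketch (not part of this commit) of the conjugate Beta-Bernoulli update that the reorganized "Coin Flipping Without Turing" section describes: with a Beta(α, β) prior and h heads in n observed flips, the posterior is Beta(α + h, β + n − h). The helper name `updated_belief`, the 100-flip sample size, and the plotting choices are illustrative assumptions, not the tutorial's code.

```julia
# Sketch of the closed-form Beta-Bernoulli posterior update described in the tutorial text.
# Assumes only Distributions and StatsPlots, the two packages the tutorial loads.
using Distributions
using StatsPlots

p_true = 0.5                         # true probability of heads (illustrative value)
data = rand(Bernoulli(p_true), 100)  # simulated coin flips, as in the tutorial

prior_belief = Beta(1, 1)            # uniform prior over the probability of heads

# Posterior after the first n flips: Beta(α + heads, β + tails).
function updated_belief(prior::Beta, flips::AbstractVector{Bool}, n::Integer)
    heads = sum(flips[1:n])
    tails = n - heads
    return Beta(prior.α + heads, prior.β + tails)
end

# Overlay the belief after increasing amounts of evidence.
plt = plot(; xlabel="probability of heads", ylabel="density", legend=:topleft)
for n in (0, 1, 5, 25, 100)
    plot!(plt, updated_belief(prior_belief, data, n); label="after $n flips")
end
vline!(plt, [p_true]; label="true value", linestyle=:dash)
```

Running this should show the density concentrating around `p_true` as `n` grows, which is the visual intuition the restructured section builds toward.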
