* Update to Turing 0.37
* Update .~ syntax
* Cache _freeze and .julia even if build fails (#591)
* Update to Turing 0.37
* Fixes for Turing 0.37 compatibility
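For context on the `.~` syntax update above, a sketch of the typical rewrite (not taken from this diff; the exact deprecation and its replacement depend on the DynamicPPL version shipped with Turing 0.37): broadcasted tilde statements over a vector are generally replaced by a single multivariate `~` statement, e.g. via `filldist`:

```julia
using Turing

# Hypothetical model illustrating the rewrite; names are not from the diff.
@model function demo(n)
    # Old style (restricted/deprecated in newer DynamicPPL releases):
    #   x .~ Normal()
    # New style: one multivariate statement via filldist
    x ~ filldist(Normal(), n)
end
```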
`tutorials/gaussian-mixture-models/index.qmd` (+6 −4)
````diff
@@ -167,10 +167,12 @@ One solution here is to enforce an ordering on our $\mu$ vector, requiring $\mu_
 `Bijectors.jl` [provides](https://turinglang.org/Bijectors.jl/dev/transforms/#Bijectors.OrderedBijector) an easy transformation (`ordered()`) for this purpose:

 ```{julia}
+using Bijectors: ordered
+
 @model function gaussian_mixture_model_ordered(x)
     # Draw the parameters for each of the K=2 clusters from a standard normal distribution.
     K = 2
-    μ ~ Bijectors.ordered(MvNormal(Zeros(K), I))
+    μ ~ ordered(MvNormal(Zeros(K), I))
     # Draw the weights for the K clusters from a Dirichlet distribution with parameters αₖ = 1.
     w ~ Dirichlet(K, 1.0)
     # Alternatively, one could use a fixed set of weights.
````
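The effect of the `ordered` transformation can be checked directly; a minimal sketch (assuming `Bijectors.ordered` returns a transformed distribution that supports `rand`):

```julia
using Bijectors: ordered
using Distributions: MvNormal
using FillArrays: Zeros
using LinearAlgebra: I

# Draws from the ordered prior are sorted, which breaks the
# label-switching symmetry between mixture components.
d = ordered(MvNormal(Zeros(2), I))
μ = rand(d)
@assert issorted(μ)
```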
```diff
@@ -285,7 +287,7 @@ using LogExpFunctions
 @model function gmm_marginalized(x)
     K = 2
     D, N = size(x)
-    μ ~ Bijectors.ordered(MvNormal(Zeros(K), I))
+    μ ~ ordered(MvNormal(Zeros(K), I))
     w ~ Dirichlet(K, 1.0)
     dists = [MvNormal(Fill(μₖ, D), I) for μₖ in μ]
     for i in 1:N
```
```diff
@@ -325,7 +327,7 @@ The `logpdf` implementation for a `MixtureModel` distribution is exactly the mar
 @model function gmm_marginalized(x)
     K = 2
     D, _ = size(x)
-    μ ~ Bijectors.ordered(MvNormal(Zeros(K), I))
+    μ ~ ordered(MvNormal(Zeros(K), I))
     w ~ Dirichlet(K, 1.0)
     x ~ MixtureModel([MvNormal(Fill(μₖ, D), I) for μₖ in μ], w)
```
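The marginalization identity these hunks rely on, that the `logpdf` of a `MixtureModel` is the log-sum-exp of the weighted component log-densities, can be sanity-checked outside Turing; a sketch using only `Distributions` and `LogExpFunctions` (values chosen arbitrarily for illustration):

```julia
using Distributions
using LogExpFunctions: logsumexp

# logpdf of a MixtureModel equals the log-sum-exp of the component
# log-densities plus the log mixture weights.
w = [0.3, 0.7]
comps = [Normal(-1.0, 1.0), Normal(2.0, 1.0)]
mm = MixtureModel(comps, w)
x = 0.5
manual = logsumexp(log.(w) .+ [logpdf(c, x) for c in comps])
@assert manual ≈ logpdf(mm, x)
```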
`usage/probability-interface/index.qmd` (+2 −4)
````diff
@@ -18,7 +18,7 @@ Let's use a simple model of normally-distributed data as an example.

 ```{julia}
 using Turing
-using LinearAlgebra: I
+using DynamicPPL
 using Random

 @model function gdemo(n)
````
````diff
@@ -98,11 +98,9 @@ logjoint(model, sample)
 ```

 For models with many variables `rand(model)` can be prohibitively slow since it returns a `NamedTuple` of samples from the prior distribution of the unconditioned variables.
-We recommend working with samples of type `DataStructures.OrderedDict` in this case:
+We recommend working with samples of type `DataStructures.OrderedDict` in this case (which Turing re-exports, so can be used directly):
````
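A sketch of the recommended pattern (the model is hypothetical, and this assumes DynamicPPL's `rand(OrderedDict, model)` method and that `logjoint` accepts a dict of variable values):

```julia
using Turing  # per the note above, OrderedDict is re-exported by Turing

@model function demo()
    x ~ Normal()
    y ~ Normal(x, 1.0)
end

model = demo()
sample = rand(OrderedDict, model)  # OrderedDict of prior draws
logjoint(model, sample)            # evaluate the joint log-density at the draw
```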