
Commit 511d6ec

Better documentation (#417)

* Better documentation
* Apply reviews
* Fix some typos
* Fix typo

1 parent 4a2ce1b commit 511d6ec

13 files changed: +447 -294 lines

README.md (+7 -274)

Large diffs are not rendered by default.

docs/Project.toml (+6)

```diff
@@ -1,3 +1,9 @@
 [deps]
 AdvancedHMC = "0bf59076-c3b1-5ca4-86bd-e02cd72cde3d"
 Documenter = "e30172f5-a6a5-5a46-863b-614d45cd2de4"
+DocumenterCitations = "daee34ce-89f3-4625-b898-19384cb65244"
+
+[compat]
+AdvancedHMC = "0.7"
+Documenter = "1"
+DocumenterCitations = "1"
```

docs/make.jl (+23 -3)

```diff
@@ -1,8 +1,28 @@
 using Pkg
 
-using Documenter
+using Documenter, DocumenterCitations
 using AdvancedHMC
 
-# cp(joinpath(@__DIR__, "../README.md"), joinpath(@__DIR__, "src/index.md"))
+bib = CitationBibliography(joinpath(@__DIR__, "src", "refs.bib"))
 
-makedocs(; sitename="AdvancedHMC", format=Documenter.HTML(), warnonly=[:cross_references])
+makedocs(;
+    sitename="AdvancedHMC",
+    format=Documenter.HTML(;
+        assets=["assets/favicon.ico"],
+        canonical="https://turinglang.org/AdvancedHMC.jl/stable/",
+    ),
+    warnonly=[:cross_references],
+    plugins=[bib],
+    pages=[
+        "AdvancedHMC.jl" => "index.md",
+        "Get Started" => "get_started.md",
+        "Automatic Differentiation Backends" => "autodiff.md",
+        "Detailed API" => "api.md",
+        "Interfaces" => "interfaces.md",
+        "News" => "news.md",
+        "Change Log" => "changelog.md",
+        "References" => "references.md",
+    ],
+)
+
+deploydocs(; repo="github.com/TuringLang/AdvancedHMC.jl.git", push_preview=true)
```
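The updated script can be exercised locally; the following is a minimal sketch, assuming a checkout of the repository root and the `docs` environment defined by the Project.toml above:

```julia
# Build the documentation against the local checkout (run from the repo root)
using Pkg
Pkg.activate("docs")                  # activate docs/Project.toml
Pkg.develop(PackageSpec(path="."))    # build against the local AdvancedHMC
Pkg.instantiate()                     # install Documenter, DocumenterCitations, etc.
include(joinpath("docs", "make.jl"))  # runs makedocs; deploydocs skips deployment outside CI
```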

docs/src/api.md (+68 -16)

````diff
@@ -1,27 +1,79 @@
-# AdvancedHMC.jl
+# Detailed API for AdvancedHMC.jl
 
-Documentation for AdvancedHMC.jl
+An important design goal of AdvancedHMC.jl is modularity; we would like to support algorithmic research on HMC.
+This modularity means that different HMC variants can be easily constructed by composing various components, such as the preconditioning metric (i.e., mass matrix), leapfrog integrators, trajectories (static or dynamic), adaptation schemes, etc. In this section, we explain the detailed usage of the different modules in AdvancedHMC.jl to provide a comprehensive understanding of how AdvancedHMC.jl achieves both modularity and efficiency. The section highlights the key components of AdvancedHMC.jl, with the complete documentation provided at the end.
 
-```@contents
-```
+### [Hamiltonian mass matrix (`metric`)](@id hamiltonian_mm)
 
-## Types
+- Unit metric: `UnitEuclideanMetric(dim)`
+- Diagonal metric: `DiagEuclideanMetric(dim)`
+- Dense metric: `DenseEuclideanMetric(dim)`
 
-```@docs
-ClassicNoUTurn
-HMCSampler
-HMC
-NUTS
-HMCDA
-```
+where `dim` is the dimensionality of the sampling space.
+
+### [Integrator (`integrator`)](@id integrator)
+
+- Ordinary leapfrog integrator: `Leapfrog(ϵ)`
+- Jittered leapfrog integrator with jitter rate `n`: `JitteredLeapfrog(ϵ, n)`
+- Tempered leapfrog integrator with tempering rate `a`: `TemperedLeapfrog(ϵ, a)`
+
+where `ϵ` is the step size of leapfrog integration.
+
+### [Kernel (`kernel`)](@id kernel)
+
+- Static HMC with a fixed number of steps (`n_steps`) from [neal2011mcmc](@Citet): `HMCKernel(Trajectory{EndPointTS}(integrator, FixedNSteps(n_steps)))`
+- HMC with a fixed total trajectory length (`trajectory_length`) from [neal2011mcmc](@Citet): `HMCKernel(Trajectory{EndPointTS}(integrator, FixedIntegrationTime(trajectory_length)))`
+- Original NUTS with slice sampling from [hoffman2014no](@Citet): `HMCKernel(Trajectory{SliceTS}(integrator, ClassicNoUTurn()))`
+- Generalised NUTS with slice sampling from [betancourt2017conceptual](@Citet): `HMCKernel(Trajectory{SliceTS}(integrator, GeneralisedNoUTurn()))`
+- Original NUTS with multinomial sampling from [betancourt2017conceptual](@Citet): `HMCKernel(Trajectory{MultinomialTS}(integrator, ClassicNoUTurn()))`
+- Generalised NUTS with multinomial sampling from [betancourt2017conceptual](@Citet): `HMCKernel(Trajectory{MultinomialTS}(integrator, GeneralisedNoUTurn()))`
 
-## Functions
+### Adaptor (`adaptor`)
 
-```@docs
-sample
+- Adapt the mass matrix `metric` of the Hamiltonian dynamics: `mma = MassMatrixAdaptor(metric)`
+
+  + This is lowered to `UnitMassMatrix`, `WelfordVar` or `WelfordCov` based on the type of the mass matrix `metric`.
+
+- Adapt the step size of the leapfrog integrator `integrator`: `ssa = StepSizeAdaptor(δ, integrator)`
+
+  + It uses Nesterov's dual averaging with `δ` as the target acceptance rate.
+- Combine the two above *naively*: `NaiveHMCAdaptor(mma, ssa)`
+- Combine the first two using Stan's windowed adaptation: `StanHMCAdaptor(mma, ssa)`
+
+## The `sample` functions
+
+```julia
+sample(
+    rng::Union{AbstractRNG,AbstractVector{<:AbstractRNG}},
+    h::Hamiltonian,
+    κ::HMCKernel,
+    θ::AbstractVector{<:AbstractFloat},
+    n_samples::Int,
+    adaptor::AbstractAdaptor=NoAdaptation(),
+    n_adapts::Int=min(div(n_samples, 10), 1_000);
+    drop_warmup=false,
+    verbose::Bool=true,
+    progress::Bool=false,
+)
 ```
 
-## More types
+Draw `n_samples` samples using the kernel `κ` under the Hamiltonian system `h`.
+
+- The randomness is controlled by `rng`.
+
+  + If `rng` is not provided, the default random number generator (`Random.default_rng()`) will be used.
+
+- The initial point is given by `θ`.
+- The adaptor is set by `adaptor`, for which the default is no adaptation.
+
+  + It will perform `n_adapts` steps of adaptation, for which the default is `1_000` or 10% of `n_samples`, whichever is lower.
+- `drop_warmup` specifies whether to drop the warmup (adaptation) samples from the returned set.
+- `verbose` controls the verbosity.
+- `progress` controls whether to show the progress meter.
+
+Note that the function signature of the `sample` function exported by `AdvancedHMC.jl` differs from the [`sample`](https://turinglang.org/dev/docs/using-turing/guide#modelling-syntax-explained) function used by `Turing.jl`. We refer to the documentation of `Turing.jl` for more details on the latter.
+
+## Full documentation of APIs in AdvancedHMC.jl
 
 ```@autodocs; canonical=false
 Modules = [AdvancedHMC, AdvancedHMC.Adaptation]
````
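To make the composition concrete, the pieces above can be wired together as in the short sketch below; the target density, step size `0.05`, and sample counts are illustrative assumptions rather than part of this diff:

```julia
using AdvancedHMC, Random

D = 10
ℓπ(θ) = -sum(abs2, θ) / 2   # standard-normal log-density (illustrative)
ℓπ_grad(θ) = (ℓπ(θ), -θ)    # value-gradient tuple

metric = DiagEuclideanMetric(D)
h = Hamiltonian(metric, ℓπ, ℓπ_grad)
integrator = Leapfrog(0.05)  # assumed step size

# Generalised NUTS with multinomial sampling, from the kernel list above
κ = HMCKernel(Trajectory{MultinomialTS}(integrator, GeneralisedNoUTurn()))

mma = MassMatrixAdaptor(metric)         # lowers to WelfordVar for a diagonal metric
ssa = StepSizeAdaptor(0.8, integrator)  # dual averaging, targeting 80% acceptance
adaptor = StanHMCAdaptor(mma, ssa)      # Stan-style windowed adaptation

rng = Random.default_rng()
samples, stats = sample(
    rng, h, κ, randn(rng, D), 2_000, adaptor, 1_000; drop_warmup=true, progress=false
)
```

With `drop_warmup=true`, the `1_000` adaptation draws are excluded from `samples`.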

docs/src/assets/favicon.ico (197 KB)

Binary file not shown.

docs/src/autodiff.md (+5)

```diff
@@ -0,0 +1,5 @@
+# Gradient in AdvancedHMC.jl
+
+AdvancedHMC.jl supports automatic differentiation via [`LogDensityProblemsAD`](https://github.com/tpapp/LogDensityProblemsAD.jl) across various AD backends, and it also allows user-specified gradients. While the default AD backend for AdvancedHMC.jl is ForwardDiff.jl, we can seamlessly switch to other backends, such as Mooncake.jl or Zygote.jl, using ADTypes syntax like `Hamiltonian(metric, ℓπ, AutoZygote())`. A different AD backend can also be plugged in using `Hamiltonian(metric, ℓπ, Zygote)` or `Hamiltonian(metric, ℓπ, Val(:Zygote))`, but we recommend using ADTypes since it gives you more freedom in specifying the AD backend.
+
+To use a user-specified gradient, replace the AD backend with `ℓπ_grad` in the `Hamiltonian` constructor, as in `Hamiltonian(metric, ℓπ, ℓπ_grad)`, where the gradient function `ℓπ_grad` should return a tuple containing both the log-posterior and its gradient, for example `ℓπ_grad(x) = (log_posterior, grad)`.
```
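As a sketch of this user-specified-gradient path, with a standard-normal target chosen purely for illustration:

```julia
using AdvancedHMC

D = 10
metric = DiagEuclideanMetric(D)

# Log-density of a standard multivariate normal, up to an additive constant
ℓπ(θ) = -sum(abs2, θ) / 2

# User-specified gradient: returns (log-density, gradient) as a tuple
ℓπ_grad(θ) = (ℓπ(θ), -θ)

hamiltonian = Hamiltonian(metric, ℓπ, ℓπ_grad)
```

Because the tuple already carries the gradient, no AD backend is loaded or invoked.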

docs/src/changelog.md (+19)

```diff
@@ -0,0 +1,19 @@
+**CHANGELOG**
+
+- [v0.5.0] **Breaking!** Convenience constructors for common samplers changed to:
+
+  + `HMC(leapfrog_stepsize::Real, n_leapfrog::Int)`
+  + `NUTS(target_acceptance::Real)`
+  + `HMCDA(target_acceptance::Real, integration_time::Real)`
+
+- [v0.2.22] Three functions are renamed:
+
+  + `Preconditioner(metric::AbstractMetric)` -> `MassMatrixAdaptor(metric)`
+  + `NesterovDualAveraging(δ, integrator::AbstractIntegrator)` -> `StepSizeAdaptor(δ, integrator)`
+  + `find_good_eps` -> `find_good_stepsize`
+- [v0.2.15] `n_adapts` is no longer needed to construct `StanHMCAdaptor`; the old constructor is deprecated.
+- [v0.2.8] Two Hamiltonian trajectory sampling methods are renamed to avoid a name clash with Distributions.jl:
+
+  + `Multinomial` -> `MultinomialTS`
+  + `Slice` -> `SliceTS`
+- [v0.2.0] The gradient function passed to `Hamiltonian` is now expected to return a value-gradient tuple.
```

docs/src/get_started.md (+203)

````diff
@@ -0,0 +1,203 @@
+# Sampling from a multivariate Gaussian using NUTS
+
+In this section, we demonstrate a minimal example of sampling from a multivariate Gaussian (10-dimensional) using the No-U-Turn Sampler (NUTS). Below we describe the major components of the Hamiltonian system that are essential for sampling using this approach:
+
+- **Metric**: In many sampling problems the sample space is associated with a metric that allows us to measure the distance between any two points, and other similar quantities. In the example in this section, we use a special metric called the **Euclidean Metric**, represented with a `D × D` matrix from which we can compute distances.[^1]
+
+- **Leapfrog integration**: Leapfrog integration is a second-order numerical method for integrating differential equations (in this case, the equations of motion for the relative position of one particle with respect to the other). The order of this integration signifies its rate of convergence. Any algorithm with a finite time step size will have numerical errors, and the order is related to this error. For a second-order algorithm, this error scales as the second power of the time step, hence the name. Higher-order integrators are usually complex to code and have a limited region of convergence; thus they do not allow arbitrarily large time steps. A second-order integrator is suitable for our purpose; hence we opt for the leapfrog integrator. It is called "leapfrog" due to the way the algorithm is written, where the positions and velocities of particles "leap over" each other.[^2]
+- **Kernel for trajectories (static or dynamic)**: Different kernels, which may be static or dynamic, can be used. At each iteration of any variant of the HMC algorithm, there are two main steps: the first step changes the momentum and the second step may change both the position and the momentum of a particle.[^3]
+
+```julia
+using AdvancedHMC, ForwardDiff
+using LogDensityProblems
+using LinearAlgebra
+
+# Define the target distribution using the `LogDensityProblem` interface
+struct LogTargetDensity
+    dim::Int
+end
+LogDensityProblems.logdensity(p::LogTargetDensity, θ) = -sum(abs2, θ) / 2  # standard multivariate normal
+LogDensityProblems.dimension(p::LogTargetDensity) = p.dim
+function LogDensityProblems.capabilities(::Type{LogTargetDensity})
+    return LogDensityProblems.LogDensityOrder{0}()
+end
+
+# Choose parameter dimensionality and initial parameter value
+D = 10;
+initial_θ = rand(D);
+ℓπ = LogTargetDensity(D)
+
+# Set the number of samples to draw and warmup iterations
+n_samples, n_adapts = 2_000, 1_000
+
+# Define a Hamiltonian system
+metric = DiagEuclideanMetric(D)
+hamiltonian = Hamiltonian(metric, ℓπ, ForwardDiff)
+
+# Define a leapfrog solver, with the initial step size chosen heuristically
+initial_ϵ = find_good_stepsize(hamiltonian, initial_θ)
+integrator = Leapfrog(initial_ϵ)
+
+# Define an HMC sampler with the following components
+#   - multinomial sampling scheme,
+#   - generalised No-U-Turn criteria, and
+#   - windowed adaptation for step size and diagonal mass matrix
+kernel = HMCKernel(Trajectory{MultinomialTS}(integrator, GeneralisedNoUTurn()))
+adaptor = StanHMCAdaptor(MassMatrixAdaptor(metric), StepSizeAdaptor(0.8, integrator))
+
+# Run the sampler to draw samples from the specified Gaussian, where
+#   - `samples` will store the samples
+#   - `stats` will store diagnostic statistics for each sample
+samples, stats = sample(
+    hamiltonian, kernel, initial_θ, n_samples, adaptor, n_adapts; progress=true
+)
+```
+
+## Parallel Sampling
+
+AdvancedHMC enables parallel sampling (either distributed or multithreaded) via Julia's [parallel computing functions](https://docs.julialang.org/en/v1/manual/parallel-computing/).
+It also supports vectorized sampling for static HMC.
+
+The example below utilizes the `@threads` macro to sample 4 chains across 4 threads.
+
+```julia
+# Ensure that Julia was launched with an appropriate number of threads
+println(Threads.nthreads())
+
+# Number of chains to sample
+nchains = 4
+
+# Cache to store the chains
+chains = Vector{Any}(undef, nchains)
+
+# The `samples` from each parallel chain are stored in the `chains` vector
+# Adjust the `verbose` flag as needed
+Threads.@threads for i in 1:nchains
+    samples, stats = sample(
+        hamiltonian, kernel, initial_θ, n_samples, adaptor, n_adapts; verbose=false
+    )
+    chains[i] = samples
+end
+```
+
+## Using the `AbstractMCMC` Interface
+
+Users can also use the `AbstractMCMC` interface to sample, which is also used in Turing.jl.
+In order to show how this is done, let us start from our previous example where we defined a `LogTargetDensity`, `ℓπ`.
+
+```julia
+using AbstractMCMC, LogDensityProblemsAD
+
+# Wrap the previous LogTargetDensity as a LogDensityModel
+# where ℓπ::LogTargetDensity
+model = AdvancedHMC.LogDensityModel(LogDensityProblemsAD.ADgradient(Val(:ForwardDiff), ℓπ))
+
+# Wrap the previous sampler as an HMCSampler <: AbstractMCMC.AbstractSampler
+D = 10;
+initial_θ = rand(D);
+n_samples, n_adapts, δ = 1_000, 2_000, 0.8
+sampler = HMCSampler(kernel, metric, adaptor)
+
+# Now sample
+samples = AbstractMCMC.sample(
+    model, sampler, n_adapts + n_samples; n_adapts=n_adapts, initial_params=initial_θ
+)
+```
+
+## Convenience Constructors
+
+In the previous examples, we manually specified the integrator, metric, kernel, and adaptor to build our own sampler. However, in many cases users simply want a standard NUTS sampler, and having to define each of these components by hand is tedious and error-prone. For these reasons, `AdvancedHMC` also provides a series of convenience constructors for standard samplers. We will now show how to use them.
+
+- HMC:
+
+```julia
+# HMC sampler
+# number of leapfrog steps, step size
+n_leapfrog, ϵ = 25, 0.1
+hmc = HMC(ϵ, n_leapfrog)
+```
+
+is equivalent to:
+
+```julia
+metric = DiagEuclideanMetric(D)
+hamiltonian = Hamiltonian(metric, ℓπ, ForwardDiff)
+integrator = Leapfrog(0.1)
+kernel = HMCKernel(Trajectory{EndPointTS}(integrator, FixedNSteps(n_leapfrog)))
+adaptor = NoAdaptation()
+hmc = HMCSampler(kernel, metric, adaptor)
+```
+
+- NUTS:
+
+```julia
+# NUTS sampler
+# target acceptance probability
+δ = 0.8
+nuts = NUTS(δ)
+```
+
+is equivalent to:
+
+```julia
+metric = DiagEuclideanMetric(D)
+hamiltonian = Hamiltonian(metric, ℓπ, ForwardDiff)
+initial_ϵ = find_good_stepsize(hamiltonian, initial_θ)
+integrator = Leapfrog(initial_ϵ)
+kernel = HMCKernel(Trajectory{MultinomialTS}(integrator, GeneralisedNoUTurn()))
+adaptor = StanHMCAdaptor(MassMatrixAdaptor(metric), StepSizeAdaptor(δ, integrator))
+nuts = HMCSampler(kernel, metric, adaptor)
+```
+
+- HMCDA:
+
+```julia
+# HMCDA (dual averaging)
+# target acceptance probability, target trajectory length
+δ, λ = 0.8, 1.0
+hmcda = HMCDA(δ, λ)
+```
+
+is equivalent to:
+
+```julia
+metric = DiagEuclideanMetric(D)
+hamiltonian = Hamiltonian(metric, ℓπ, ForwardDiff)
+initial_ϵ = find_good_stepsize(hamiltonian, initial_θ)
+integrator = Leapfrog(initial_ϵ)
+kernel = HMCKernel(Trajectory{EndPointTS}(integrator, FixedIntegrationTime(λ)))
+adaptor = StepSizeAdaptor(δ, initial_ϵ)
+hmcda = HMCSampler(kernel, metric, adaptor)
+```
+
+Moreover, there's some flexibility in how these samplers can be initialized.
+For example, a user can initialize a NUTS (or HMC or HMCDA) sampler with their own metric and integrator.
+This can be done as follows:
+
+```julia
+nuts = NUTS(δ; metric=:diagonal) # metric = DiagEuclideanMetric(D) (default)
+nuts = NUTS(δ; metric=:unit)     # metric = UnitEuclideanMetric(D)
+nuts = NUTS(δ; metric=:dense)    # metric = DenseEuclideanMetric(D)
+# Provide your own AbstractMetric
+metric = DiagEuclideanMetric(10)
+nuts = NUTS(δ; metric=metric)
+
+nuts = NUTS(δ; integrator=:leapfrog)         # integrator = Leapfrog(ϵ) (default)
+nuts = NUTS(δ; integrator=:jitteredleapfrog) # integrator = JitteredLeapfrog(ϵ, 0.1ϵ)
+nuts = NUTS(δ; integrator=:temperedleapfrog) # integrator = TemperedLeapfrog(ϵ, 1.0)
+
+# Provide your own AbstractIntegrator
+integrator = JitteredLeapfrog(0.1, 0.2)
+nuts = NUTS(δ; integrator=integrator)
+```
+
+## GPU Sampling with CUDA
+
+There is experimental support for running static HMC on the GPU using CUDA.
+To do so, the user needs to have [CUDA.jl](https://github.com/JuliaGPU/CUDA.jl) installed, ensure that the log density of the `Hamiltonian` can be executed on the GPU, and that the initial points are a `CuArray`.
+A small working example can be found at `test/cuda.jl`.
````
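To give a sense of what that setup involves, here is a rough, untested sketch; the target density, dimensions, and step size are illustrative assumptions, and `test/cuda.jl` remains the authoritative example:

```julia
using AdvancedHMC, CUDA

T = Float32
D = 5

# Log-density built from broadcast-friendly operations so it can run on the GPU
ℓπ(θ) = -sum(abs2, θ) / 2
ℓπ_grad(θ) = (ℓπ(θ), -θ)  # explicit gradient, sidestepping AD on the GPU

initial_θ = CUDA.rand(T, D)         # initial point as a CuArray
metric = UnitEuclideanMetric(T, D)  # unit metric; stores only the size
hamiltonian = Hamiltonian(metric, ℓπ, ℓπ_grad)
integrator = Leapfrog(T(0.1))       # assumed fixed step size
kernel = HMCKernel(Trajectory{EndPointTS}(integrator, FixedNSteps(10)))  # static HMC

samples, stats = sample(hamiltonian, kernel, initial_θ, 1_000)
```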
````diff
+
+## Footnotes
+
+[^1]: The Euclidean metric is also known as the mass matrix in the physical perspective. See [Hamiltonian mass matrix](@ref hamiltonian_mm) for available metrics.
+[^2]: About the leapfrog integration scheme: Suppose ${\bf x}$ and ${\bf v}$ are the position and velocity of an individual particle respectively; $i$ and $i+1$ are the indices for time values $t_i$ and $t_{i+1}$ respectively; $dt = t_{i+1} - t_i$ is the time step size (constant and regularly spaced intervals), and ${\bf a}$ is the acceleration induced on a particle by the forces of all other particles. Furthermore, suppose positions are defined at times $t_i, t_{i+1}, t_{i+2}, \dots$, spaced at constant intervals $dt$, the velocities are defined at halfway times in between, denoted by $t_{i-1/2}, t_{i+1/2}, t_{i+3/2}, \dots$, where $t_{i+1} - t_{i+1/2} = t_{i+1/2} - t_i = dt/2$, and the accelerations ${\bf a}$ are defined only on integer times, just like the positions. Then the leapfrog integration scheme is given as: $x_{i} = x_{i-1} + v_{i-1/2} dt; \quad v_{i+1/2} = v_{i-1/2} + a_i dt$. For available integrators refer to [Integrator](@ref integrator).
+[^3]: On kernels: In the classical HMC approach, during the first step, new values for the momentum variables are randomly drawn from their Gaussian distribution, independently of the current values of the position variables. A Metropolis update is performed during the second step, using Hamiltonian dynamics to provide a new state. For available kernels refer to [Kernel](@ref kernel).
````
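The two updates in footnote 2 translate directly into code; below is a minimal standalone sketch (the function and names are illustrative, not part of AdvancedHMC's API):

```julia
# One leapfrog step: positions live at integer times, velocities at half steps.
# `accel(x)` plays the role of a_i; for a potential U with unit mass, accel(x) = -∇U(x).
function leapfrog_step(x, v_half, accel, dt)
    x_next = x + v_half * dt              # x_i     = x_{i-1}   + v_{i-1/2} dt
    v_next = v_half + accel(x_next) * dt  # v_{i+1/2} = v_{i-1/2} + a_i dt
    return x_next, v_next
end

# Example: harmonic oscillator U(x) = x^2 / 2, so accel(x) = -x
x, v = 1.0, 0.0
x, v = leapfrog_step(x, v, x -> -x, 0.1)
```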

docs/src/index.md (-1)

This file was deleted.
