* [Add Gaussian Process submodule](http://pymc-devs.github.io/pymc3/notebooks/GP-introduction.html)

* Much improved variational inference support (see the sketch after this list):

  - [Add Operator Variational Inference (experimental).](http://pymc-devs.github.io/pymc3/notebooks/bayesian_neural_network_opvi-advi.html)

  - [Add Stein Variational Gradient Descent as well as Amortized SVGD (experimental).](https://github.com/pymc-devs/pymc3/pull/2183)

  - [Add pm.Minibatch() to easily specify mini-batches.](http://pymc-devs.github.io/pymc3/notebooks/bayesian_neural_network_opvi-advi.html#Minibatch-ADVI)

  - Added various optimizers, including ADAM.

  - Stopping criterion implemented via callbacks.
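
A minimal sketch of the new interface (illustrative; the model, data, and settings below are not from the release notes): a toy model fit with mini-batch ADVI via `pm.fit()`, using `pm.Minibatch` for the data and a convergence callback as the stopping criterion.

```python
import numpy as np
import pymc3 as pm

data = np.random.randn(10000)                    # synthetic observations
batch = pm.Minibatch(data, batch_size=128)       # 128 points per gradient step

with pm.Model():
    mu = pm.Normal('mu', mu=0., sd=10.)
    sd = pm.HalfNormal('sd', sd=10.)
    # total_size rescales the mini-batch likelihood to the full dataset size
    pm.Normal('obs', mu=mu, sd=sd, observed=batch, total_size=len(data))

    approx = pm.fit(
        n=20000,
        method='advi',
        callbacks=[pm.callbacks.CheckParametersConvergence()],
    )
    trace = approx.sample(1000)                  # draws from the fitted approximation
```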

* `sample()` defaults changed: tuning is enabled for the first 500 samples, which are then discarded from the trace as burn-in.
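
For instance (an illustrative sketch, not from the release notes), the call below draws 1000 samples after the default 500 tuning steps; the tuning length can still be set explicitly via the `tune` argument.

```python
import pymc3 as pm

with pm.Model():
    mu = pm.Normal('mu', mu=0., sd=1.)
    trace = pm.sample(1000)              # runs 500 tuning steps first, then 1000 draws
    trace2 = pm.sample(1000, tune=2000)  # longer tuning phase, still discarded
```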

* MvNormal now supports Cholesky decomposition for increased speed and numerical stability.
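
A minimal sketch (values are illustrative): pass a lower-triangular Cholesky factor through the `chol` keyword instead of the full covariance matrix.

```python
import numpy as np
import pymc3 as pm

cov = np.array([[1.0, 0.5],
                [0.5, 2.0]])
chol = np.linalg.cholesky(cov)       # lower-triangular L with L @ L.T == cov

with pm.Model():
    x = pm.MvNormal('x', mu=np.zeros(2), chol=chol, shape=2)
    trace = pm.sample(500)
```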

* Many optimizations and speed-ups.

* NUTS implementation now matches current Stan implementation.

* [Add live-trace to see samples in real-time.](http://pymc-devs.github.io/pymc3/notebooks/live_sample_plots.html)

* Improved support for theano's floatX setting to enable GPU computations (work in progress).

* [Add Elliptical Slice Sampler.](http://pymc-devs.github.io/pymc3/notebooks/GP-slice-sampling.html)

* [Sampled posteriors can now be turned into priors for Bayesian updating with a new interpolated distribution.](https://github.com/pymc-devs/pymc3/pull/2163)
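
A hedged sketch of the updating workflow (data and parameter choices are made up for illustration): a kernel density estimate of earlier posterior samples becomes the prior of a new model via the `Interpolated` distribution.

```python
import numpy as np
from scipy import stats
import pymc3 as pm

# Stand-ins: posterior samples of `mu` from a previous fit, plus newly observed data.
posterior_samples = np.random.normal(loc=1.0, scale=0.2, size=2000)
new_data = np.random.normal(loc=1.1, scale=1.0, size=50)

x = np.linspace(posterior_samples.min(), posterior_samples.max(), 100)
pdf = stats.gaussian_kde(posterior_samples)(x)   # smoothed density of the old posterior

with pm.Model():
    mu = pm.Interpolated('mu', x_points=x, pdf_points=pdf)  # posterior-as-prior
    pm.Normal('obs', mu=mu, sd=1., observed=new_data)
    trace = pm.sample(1000)
```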

* `Model` can now be inherited from and act as a base class for user-specified models (see pymc3.models.linear).
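
A minimal sketch of the subclassing pattern (class and variable names are hypothetical, loosely following pymc3.models.linear): random variables defined inside `__init__` are registered on the instance, which can then be used as a regular model context.

```python
import numpy as np
import pymc3 as pm

class SimpleLinearModel(pm.Model):
    """Toy linear regression defined as a reusable Model subclass."""
    def __init__(self, x, y, name='', model=None):
        super(SimpleLinearModel, self).__init__(name, model)
        intercept = pm.Normal('intercept', mu=0., sd=10.)
        slope = pm.Normal('slope', mu=0., sd=10.)
        sigma = pm.HalfNormal('sigma', sd=1.)
        pm.Normal('y_obs', mu=intercept + slope * x, sd=sigma, observed=y)

x = np.linspace(0., 1., 50)
y = 1.0 + 2.0 * x + np.random.normal(scale=0.1, size=50)

with SimpleLinearModel(x, y) as model:
    trace = pm.sample(1000)
```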

* GLM models no longer require a left-hand variable.

* Refactored HMC and NUTS for better readability.

* Add support for Python 3.6.