This repository is a collection of notebooks about *Bayesian Machine Learning*. The following links display
some of the notebooks via [nbviewer](https://nbviewer.jupyter.org/) to ensure proper rendering of formulas.
Dependencies are specified in `requirements.txt` files in subdirectories.

- [Bayesian regression with linear basis function models](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/bayesian-linear-regression/bayesian_linear_regression.ipynb).
  Introduction to Bayesian linear regression. Implementation with plain NumPy and scikit-learn. See also the
  [PyMC3 implementation](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/bayesian-linear-regression/bayesian_linear_regression_pymc3.ipynb).
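
  For orientation, a minimal sketch of the conjugate posterior update covered there (illustrative only; `alpha` and `beta` are assumed prior and noise precisions):

  ```python
  import numpy as np

  def posterior(Phi, t, alpha=2.0, beta=25.0):
      """Posterior over weights w for prior N(0, alpha^-1 I) and
      Gaussian noise with precision beta (conjugate model)."""
      S_N_inv = alpha * np.eye(Phi.shape[1]) + beta * Phi.T @ Phi
      S_N = np.linalg.inv(S_N_inv)   # posterior covariance
      m_N = beta * S_N @ Phi.T @ t   # posterior mean
      return m_N, S_N
  ```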

- [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/krasserm/bayesian-machine-learning/blob/dev/gaussian-processes/gaussian_processes.ipynb)
  [Gaussian processes](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/gaussian-processes/gaussian_processes.ipynb?flush_cache=true).
  Introduction to Gaussian processes for regression. Implementation with plain NumPy/SciPy as well as with scikit-learn and GPy.
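
  As a taste of what the notebook implements, a minimal GP posterior under an assumed RBF kernel (a sketch, not the notebook's exact code):

  ```python
  import numpy as np

  def rbf(X1, X2, l=1.0, sigma_f=1.0):
      """Squared exponential kernel between two sets of row vectors."""
      sq = np.sum(X1**2, 1)[:, None] + np.sum(X2**2, 1) - 2 * X1 @ X2.T
      return sigma_f**2 * np.exp(-0.5 / l**2 * sq)

  def gp_posterior(X_s, X_train, y_train, sigma_y=1e-4):
      """GP posterior mean and covariance at test inputs X_s."""
      K = rbf(X_train, X_train) + sigma_y**2 * np.eye(len(X_train))
      K_s = rbf(X_train, X_s)
      L = np.linalg.cholesky(K)  # numerically stable solve via Cholesky
      alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
      mu_s = K_s.T @ alpha
      v = np.linalg.solve(L, K_s)
      cov_s = rbf(X_s, X_s) - v.T @ v
      return mu_s, cov_s
  ```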

- [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/krasserm/bayesian-machine-learning/blob/dev/gaussian-processes/gaussian_processes_classification.ipynb)
  [Gaussian processes for classification](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/gaussian-processes/gaussian_processes_classification.ipynb).
  Introduction to Gaussian processes for classification. Implementation with plain NumPy/SciPy as well as with scikit-learn.

- [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/krasserm/bayesian-machine-learning/blob/dev/gaussian-processes/gaussian_processes_sparse.ipynb)
  [Sparse Gaussian processes](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/gaussian-processes/gaussian_processes_sparse.ipynb).
  Introduction to sparse Gaussian processes using a variational approach. Example implementation with JAX.

- [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/krasserm/bayesian-machine-learning/blob/dev/bayesian-optimization/bayesian_optimization.ipynb)
  [Bayesian optimization](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/bayesian-optimization/bayesian_optimization.ipynb).
  Introduction to Bayesian optimization. Implementation with plain NumPy/SciPy as well as with the libraries scikit-optimize
  and GPyOpt. Hyper-parameter tuning serves as an application example.
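
  The central ingredient of such a loop is an acquisition function; a minimal expected improvement sketch over an assumed GP posterior (illustrative only):

  ```python
  import numpy as np
  from scipy.stats import norm

  def expected_improvement(mu, sigma, f_best, xi=0.01):
      """EI at candidate points, given GP posterior mean mu and
      standard deviation sigma (maximization convention)."""
      sigma = np.maximum(sigma, 1e-9)  # avoid division by zero
      imp = mu - f_best - xi
      z = imp / sigma
      return imp * norm.cdf(z) + sigma * norm.pdf(z)
  ```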

- [Variational inference in Bayesian neural networks](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/bayesian-neural-networks/bayesian_neural_networks.ipynb).
  Demonstrates how to implement a Bayesian neural network and variational inference of its weights. Example implementation
  with Keras.

- [Reliable uncertainty estimates for neural network predictions](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/noise-contrastive-priors/ncp.ipynb).
  Uses noise contrastive priors for Bayesian neural networks to get more reliable uncertainty estimates for out-of-distribution (OOD) data.
  Implemented with TensorFlow 2 and TensorFlow Probability.

- [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/krasserm/bayesian-machine-learning/blob/dev/latent-variable-models/latent_variable_models_part_1.ipynb)
  [Latent variable models, part 1: Gaussian mixture models and the EM algorithm](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/latent-variable-models/latent_variable_models_part_1.ipynb).
  Introduction to the expectation maximization (EM) algorithm and its application to Gaussian mixture models.
  Implementation with plain NumPy/SciPy and scikit-learn. See also the
  [PyMC3 implementation](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/latent-variable-models/latent_variable_models_part_1_pymc3.ipynb).
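
  The scikit-learn side of the comparison fits such a mixture with EM in a few lines (a usage sketch with synthetic data):

  ```python
  import numpy as np
  from sklearn.mixture import GaussianMixture

  # Two well-separated synthetic clusters
  X = np.vstack([np.random.normal(0, 1, (100, 2)),
                 np.random.normal(5, 1, (100, 2))])

  gmm = GaussianMixture(n_components=2).fit(X)  # EM under the hood
  print(gmm.means_)                             # estimated component means
  print(gmm.predict(X[:5]))                     # hard cluster assignments
  ```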

- [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/krasserm/bayesian-machine-learning/blob/dev/latent-variable-models/latent_variable_models_part_2.ipynb)
  [Latent variable models, part 2: Stochastic variational inference and variational autoencoders](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/latent-variable-models/latent_variable_models_part_2.ipynb).
  Introduction to stochastic variational inference with a variational autoencoder as application example. Implementation
  with TensorFlow 2.x.
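
  The step that makes this training scheme work is the reparameterization trick; a framework-agnostic NumPy sketch (illustrative, not the notebook's code):

  ```python
  import numpy as np

  rng = np.random.default_rng(0)

  def sample_latent(mu, log_var):
      """Reparameterization trick: z = mu + sigma * eps moves the
      randomness into eps, so z stays differentiable w.r.t. the
      encoder outputs mu and log_var."""
      eps = rng.standard_normal(np.shape(mu))
      return mu + np.exp(0.5 * log_var) * eps
  ```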

- [Deep feature consistent variational autoencoder](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/autoencoder-applications/variational_autoencoder_dfc.ipynb).