Commit a3479de: Sparse Gaussian processes

1 parent bffc6dc

5 files changed, +51327 -23 lines

README.md (+17 -17)

```diff
@@ -4,46 +4,46 @@
 
 This repository is a collection of notebooks about *Bayesian Machine Learning*. The following links display
 some of the notebooks via [nbviewer](https://nbviewer.jupyter.org/) to ensure a proper rendering of formulas.
+Dependencies are specified in `requirements.txt` files in subdirectories.
 
 - [Bayesian regression with linear basis function models](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/bayesian-linear-regression/bayesian_linear_regression.ipynb).
-  Introduction to Bayesian linear regression. Implementation from scratch with plain NumPy as well as usage of scikit-learn
-  for comparison. See also
-  [PyMC4 implementation](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/bayesian-linear-regression/bayesian_linear_regression_pymc4.ipynb) and
+  Introduction to Bayesian linear regression. Implementation with plain NumPy and scikit-learn. See also
   [PyMC3 implementation](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/bayesian-linear-regression/bayesian_linear_regression_pymc3.ipynb).
 
 - [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/krasserm/bayesian-machine-learning/blob/dev/gaussian-processes/gaussian_processes.ipynb)
   [Gaussian processes](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/gaussian-processes/gaussian_processes.ipynb?flush_cache=true).
-  Introduction to Gaussian processes for regression. Example implementations with plain NumPy/SciPy as well as with libraries
-  scikit-learn and GPy ([requirements.txt](gaussian-processes/requirements.txt)).
+  Introduction to Gaussian processes for regression. Implementation with plain NumPy/SciPy as well as with scikit-learn and GPy.
 
 - [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/krasserm/bayesian-machine-learning/blob/dev/gaussian-processes/gaussian_processes_classification.ipynb)
   [Gaussian processes for classification](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/gaussian-processes/gaussian_processes_classification.ipynb).
-  Introduction to Gaussian processes for classification. Example implementations with plain NumPy/SciPy as well as with
-  scikit-learn ([requirements.txt](gaussian-processes/requirements.txt)).
+  Introduction to Gaussian processes for classification. Implementation with plain NumPy/SciPy as well as with scikit-learn.
+
+- [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/krasserm/bayesian-machine-learning/blob/dev/gaussian-processes/gaussian_processes_sparse.ipynb)
+  [Sparse Gaussian processes](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/gaussian-processes/gaussian_processes_sparse.ipynb).
+  Introduction to sparse Gaussian processes using a variational approach. Example implementation with JAX.
 
 - [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/krasserm/bayesian-machine-learning/blob/dev/bayesian-optimization/bayesian_optimization.ipynb)
   [Bayesian optimization](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/bayesian-optimization/bayesian_optimization.ipynb).
-  Introduction to Bayesian optimization. Example implementations with plain NumPy/SciPy as well as with libraries
-  scikit-optimize and GPyOpt. Hyper-parameter tuning as application example.
+  Introduction to Bayesian optimization. Implementation with plain NumPy/SciPy as well as with libraries scikit-optimize
+  and GPyOpt. Hyper-parameter tuning as application example.
 
 - [Variational inference in Bayesian neural networks](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/bayesian-neural-networks/bayesian_neural_networks.ipynb).
-  Demonstrates how to implement a Bayesian neural network and variational inference of network parameters. Example implementation
-  with Keras ([requirements.txt](bayesian-neural-networks/requirements.txt)). See also
-  [PyMC4 implementation](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/bayesian-neural-networks/bayesian_neural_networks_pymc4.ipynb).
+  Demonstrates how to implement a Bayesian neural network and variational inference of weights. Example implementation
+  with Keras.
 
 - [Reliable uncertainty estimates for neural network predictions](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/noise-contrastive-priors/ncp.ipynb).
-  Uses noise contrastive priors in Bayesian neural networks to get more reliable uncertainty estimates for OOD data.
-  Implemented with Tensorflow 2 and Tensorflow Probability ([requirements.txt](noise-contrastive-priors/requirements.txt)).
+  Uses noise contrastive priors for Bayesian neural networks to get more reliable uncertainty estimates for OOD data.
+  Implemented with Tensorflow 2 and Tensorflow Probability.
 
 - [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/krasserm/bayesian-machine-learning/blob/dev/latent-variable-models/latent_variable_models_part_1.ipynb)
   [Latent variable models, part 1: Gaussian mixture models and the EM algorithm](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/latent-variable-models/latent_variable_models_part_1.ipynb).
-  Introduction to the expectation maximization (EM) algorithm and its application to Gaussian mixture models. Example
-  implementation with plain NumPy/SciPy and scikit-learn for comparison. See also
+  Introduction to the expectation maximization (EM) algorithm and its application to Gaussian mixture models.
+  Implementation with plain NumPy/SciPy and scikit-learn. See also
   [PyMC3 implementation](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/latent-variable-models/latent_variable_models_part_1_pymc3.ipynb).
 
 - [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/krasserm/bayesian-machine-learning/blob/dev/latent-variable-models/latent_variable_models_part_2.ipynb)
   [Latent variable models, part 2: Stochastic variational inference and variational autoencoders](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/latent-variable-models/latent_variable_models_part_2.ipynb).
-  Introduction to stochastic variational inference with variational autoencoder as application example. Implementation
+  Introduction to stochastic variational inference with a variational autoencoder as application example. Implementation
   with Tensorflow 2.x.
 
 - [Deep feature consistent variational autoencoder](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/autoencoder-applications/variational_autoencoder_dfc.ipynb).
```
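The new entry above links a sparse GP notebook implemented with JAX. As a rough sketch of the technique the entry names (a variational approach in the sense of Titsias, 2009), and not the notebook's actual code, the following evaluates the variational lower bound on the log marginal likelihood for sparse GP regression. The `kernel` function, the inducing inputs `X_m`, and all parameter names are assumptions made for this illustration.

```python
import jax.numpy as jnp
from jax import grad
from jax.scipy.linalg import solve_triangular

def kernel(X1, X2, l=1.0, sigma_f=1.0):
    # Isotropic squared exponential kernel (illustrative choice).
    sqdist = jnp.sum(X1**2, 1).reshape(-1, 1) + jnp.sum(X2**2, 1) - 2 * X1 @ X2.T
    return sigma_f**2 * jnp.exp(-0.5 / l**2 * sqdist)

def titsias_bound(X_m, X, y, l, sigma_f, sigma_y):
    # Variational lower bound (Titsias, 2009) on log p(y) for m inducing
    # inputs X_m:
    #   log N(y | 0, Q_nn + sigma_y^2 I) - 1/(2 sigma_y^2) tr(K_nn - Q_nn)
    # with Q_nn = K_nm K_mm^{-1} K_mn, evaluated via Cholesky factors only.
    n, m = X.shape[0], X_m.shape[0]
    K_mm = kernel(X_m, X_m, l, sigma_f) + 1e-6 * jnp.eye(m)  # jitter
    K_mn = kernel(X_m, X, l, sigma_f)

    L = jnp.linalg.cholesky(K_mm)
    A = solve_triangular(L, K_mn, lower=True) / sigma_y      # m x n
    B = jnp.eye(m) + A @ A.T
    L_B = jnp.linalg.cholesky(B)
    c = solve_triangular(L_B, A @ y, lower=True) / sigma_y

    # log N(y | 0, Q_nn + sigma_y^2 I) via the matrix inversion lemma.
    log_marginal = (-0.5 * n * jnp.log(2 * jnp.pi)
                    - jnp.sum(jnp.log(jnp.diag(L_B)))
                    - n * jnp.log(sigma_y)
                    - 0.5 * (y @ y) / sigma_y**2
                    + 0.5 * (c @ c))

    # Trace regularizer: tr(K_nn) = n * sigma_f^2 for this kernel, and
    # tr(Q_nn) = sigma_y^2 * sum(A**2).
    trace_term = -0.5 / sigma_y**2 * (n * sigma_f**2 - sigma_y**2 * jnp.sum(A**2))
    return log_marginal + trace_term

# The bound is an ordinary differentiable JAX function, so inducing inputs and
# hyper-parameters can be optimized jointly by gradient ascent:
elbo_grad = grad(titsias_bound, argnums=0)  # gradient w.r.t. X_m
```

That the bound is differentiable with respect to `X_m` as well as the hyper-parameters is presumably what makes JAX a convenient choice for this notebook.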

gaussian-processes/gaussian_processes.ipynb (+7 -5)

```diff
@@ -400,7 +400,8 @@
 }
 ],
 "source": [
-"from numpy.linalg import cholesky, det, lstsq\n",
+"from numpy.linalg import cholesky, det\n",
+"from scipy.linalg import solve_triangular\n",
 "from scipy.optimize import minimize\n",
 "\n",
 "def nll_fn(X_train, Y_train, noise, naive=True):\n",
@@ -437,14 +438,15 @@
 "        # in http://www.gaussianprocess.org/gpml/chapters/RW2.pdf, Section\n",
 "        # 2.2, Algorithm 2.1.\n",
 "        \n",
-"        def ls(a, b):\n",
-"            return lstsq(a, b, rcond=-1)[0]\n",
-"        \n",
 "        K = kernel(X_train, X_train, l=theta[0], sigma_f=theta[1]) + \\\n",
 "            noise**2 * np.eye(len(X_train))\n",
 "        L = cholesky(K)\n",
+"        \n",
+"        S1 = solve_triangular(L, Y_train, lower=True)\n",
+"        S2 = solve_triangular(L.T, S1, lower=False)\n",
+"        \n",
 "        return np.sum(np.log(np.diagonal(L))) + \\\n",
-"               0.5 * Y_train.dot(ls(L.T, ls(L, Y_train))) + \\\n",
+"               0.5 * Y_train.dot(S2) + \\\n",
 "               0.5 * len(X_train) * np.log(2*np.pi)\n",
 "\n",
 "    if naive:\n",
```
