GPflow is a package for building Gaussian process models in Python, using TensorFlow. It was originally created, and is now managed, by James Hensman and Alexander G. de G. Matthews. The full list of contributors (in alphabetical order) is Rasmus Bonnevie, Alexis Boukouvalas, Ivo Couckuyt, Keisuke Fujii, Zoubin Ghahramani, David J. Harris, James Hensman, Pablo Leon-Villagra, Daniel Marthaler, Alexander G. de G. Matthews, Tom Nickson, Valentine Svensson and Mark van der Wilk. GPflow is an open source project, so if you feel you have some relevant skills and are interested in contributing then please do contact us.

Install

1. Install TensorFlow. Please see the instructions on the main TensorFlow webpage; you will need version 1.0. We find that for many users pip installation is the fastest way to get going.

2. Install the GPflow package. GPflow is a pure Python library for now, so you can just add it to your path (we use python setup.py develop) or try an install with python setup.py install (untested). You can run the tests with python setup.py test.

Version history is documented here.

We also provide a Docker image which can be run using

docker run -it -p 8888:8888 gpflow/gpflow

Code to generate the image can be found here.

What’s the difference between GPy and GPflow?

GPflow has origins in GPy by the GPy contributors, and much of the interface is intentionally similar for continuity (though some parts of the interface may diverge in future). GPflow has a rather different remit from GPy though:

  • GPflow leverages TensorFlow for faster/bigger computation
  • GPflow has much less code than GPy, mostly because all gradient computation is handled by TensorFlow.
  • GPflow focusses on variational inference and MCMC – there is no expectation propagation or Laplace approximation.
  • GPflow does not have any plotting functionality.

What models are implemented?

GPflow has a slew of kernels that can be combined in a straightforward way. See the later section on Using kernels in GPflow. As for inference, the options are currently:
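For instance, kernels can be combined with ordinary Python operators (a minimal sketch, assuming the kernel classes of the GPflow.kernels module, whose first constructor argument is the input dimension):

    import GPflow

    # Sum and product of kernels, combined with the + and * operators
    k = (GPflow.kernels.Matern32(1) + GPflow.kernels.White(1)) * GPflow.kernels.Linear(1)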

Regression

For GP regression with Gaussian noise, it’s possible to marginalize the function values exactly: you’ll find this in GPflow.gpr.GPR. You can do maximum likelihood or MCMC for the covariance function parameters (notebook).
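A minimal sketch of its use (X and Y are numpy arrays of training inputs and targets; optimize and predict_y are the model's maximum-likelihood and prediction methods):

    import numpy as np
    import GPflow

    # Toy 1-D regression data
    X = np.random.rand(20, 1)
    Y = np.sin(12 * X) + 0.1 * np.random.randn(20, 1)

    m = GPflow.gpr.GPR(X, Y, kern=GPflow.kernels.RBF(1))
    m.optimize()                # maximum likelihood for the kernel parameters
    mean, var = m.predict_y(X)  # predictive mean and variance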

It’s also possible to do Sparse GP regression using the GPflow.sgpr.SGPR class. This is based on work by Michalis Titsias [4].
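Its interface mirrors GPR, with an additional matrix Z of initial inducing-point locations (a sketch continuing the example above):

    # M inducing points, here initialized from a subset of the training inputs
    Z = X[::4].copy()
    m = GPflow.sgpr.SGPR(X, Y, kern=GPflow.kernels.RBF(1), Z=Z)
    m.optimize()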

MCMC

For non-Gaussian likelihoods, GPflow has a model that can jointly sample over the function values and the covariance parameters: GPflow.gpmc.GPMC. There’s also a sparse equivalent in GPflow.sgpmc.SGPMC, based on a recent paper [1].
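A sketch of GPMC for binary classification, reusing X from the regression example (the Bernoulli likelihood and the HMC settings epsilon and Lmax are illustrative choices, not recommendations):

    # Binary labels in {0, 1}, one column
    Ybin = (np.random.rand(20, 1) > 0.5).astype(np.float64)
    m = GPflow.gpmc.GPMC(X, Ybin, kern=GPflow.kernels.RBF(1),
                         likelihood=GPflow.likelihoods.Bernoulli())
    # Hamiltonian Monte Carlo over function values and covariance parameters
    samples = m.sample(500, epsilon=0.1, Lmax=20)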

Variational inference

It’s often sufficient to approximate the function values as a Gaussian, for which we follow [2] in GPflow.vgp.VGP. In addition, there is a sparse version based on [3] in GPflow.svgp.SVGP. In the Gaussian likelihood case some of the optimization may be done analytically as discussed in [4] and implemented in GPflow.sgpr.SGPR. All of the sparse methods in GPflow are given a unified theoretical treatment in [5].
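A sketch of sparse variational classification with SVGP, continuing the sketches above (the minibatch_size argument, which enables stochastic optimization over subsets of the data, is an assumption about the constructor):

    # Sparse variational classification; Z holds the inducing inputs
    m = GPflow.svgp.SVGP(X, Ybin, kern=GPflow.kernels.RBF(1),
                         likelihood=GPflow.likelihoods.Bernoulli(),
                         Z=X[::4].copy(), minibatch_size=10)
    m.optimize()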

The following table summarizes the model options in GPflow.

                         | Gaussian likelihood | Non-Gaussian (variational) | Non-Gaussian (MCMC)
    Full covariance      | GPflow.gpr.GPR      | GPflow.vgp.VGP             | GPflow.gpmc.GPMC
    Sparse approximation | GPflow.sgpr.SGPR    | GPflow.svgp.SVGP           | GPflow.sgpmc.SGPMC

A unified view of many of the relevant references, along with some extensions, and an early discussion of GPflow itself, is given in the PhD thesis of Matthews [8].


For visualisation, the GPLVM [6] and Bayesian GPLVM [7] models are implemented in GPflow (notebook).
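A minimal sketch (assuming the GPLVM constructor takes the data matrix and the latent dimensionality):

    # Ydata is a high-dimensional data matrix; the model learns a 2-D latent space
    Ydata = np.random.randn(30, 5)
    m = GPflow.gplvm.GPLVM(Ydata, latent_dim=2)
    m.optimize()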

Contributing

All constructive input is gratefully received. For more information, see the notes for contributors.

Citing GPflow

To cite GPflow, please reference the technical report (arXiv preprint 1610.08733). Sample BibTeX is given below:

@article{GPflow2016,
  author  = {Matthews, Alexander G. de G. and {van der Wilk}, Mark and Nickson, Tom and Fujii, Keisuke and {Boukouvalas}, Alexis and {Le{\'o}n-Villagr{\'a}}, Pablo and Ghahramani, Zoubin and Hensman, James},
  title   = {{GP}flow: A {G}aussian process library using {T}ensor{F}low},
  journal = {arXiv preprint 1610.08733},
  year    = {2016},
  month   = {oct}
}


References

[1] MCMC for Variationally Sparse Gaussian Processes. J Hensman, A G de G Matthews, M Filippone, Z Ghahramani. Advances in Neural Information Processing Systems, 1639-1647, 2015.

[2] The Variational Gaussian Approximation Revisited. M Opper, C Archambeau. Neural Computation 21 (3), 786-792, 2009.

[3] Scalable Variational Gaussian Process Classification. J Hensman, A G de G Matthews, Z Ghahramani. Proceedings of AISTATS 18, 2015.

[4] Variational Learning of Inducing Variables in Sparse Gaussian Processes. M Titsias. Proceedings of AISTATS 12, 2009.

[5] On Sparse Variational Methods and the Kullback-Leibler Divergence between Stochastic Processes. A G de G Matthews, J Hensman, R E Turner, Z Ghahramani. Proceedings of AISTATS 19, 2016.

[6] Gaussian Process Latent Variable Models for Visualisation of High Dimensional Data. N D Lawrence. Advances in Neural Information Processing Systems, 329-336, 2004.

[7] Bayesian Gaussian Process Latent Variable Model. M K Titsias, N D Lawrence. Proceedings of AISTATS, 2010.

[8] Scalable Gaussian Process Inference Using Variational Methods. A G de G Matthews. PhD thesis, University of Cambridge, 2016.


Acknowledgements

James Hensman was supported by an MRC fellowship and Alexander G. de G. Matthews was supported by EPSRC grants EP/I036575/1 and EP/N014162/1.