gpflow

gpflow.Module

class gpflow.Module(*args, **kwargs)[source]

Bases: tensorflow.Module

Attributes
parameters
trainable_parameters

Methods

__call__(*args, **kwargs)

Call self as a function.

Parameters
  • args (Any) –

  • kwargs (Any) –

_repr_html_()[source]

Nice representation of GPflow objects in IPython/Jupyter notebooks

Return type

str

_repr_pretty_(p, cycle)[source]

Nice representation of GPflow objects in the IPython shell

Parameters
  • p (RepresentationPrinter) –

  • cycle (bool) –

Return type

None
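
As an illustrative sketch (the `Scaler` class below is hypothetical, not part of GPflow), a module that stores a `gpflow.Parameter` exposes it through the attributes listed above:

```
import gpflow

# Hypothetical module for illustration only: holds a single GPflow parameter.
class Scaler(gpflow.Module):
    def __init__(self):
        super().__init__()
        self.scale = gpflow.Parameter(1.0)  # trainable by default

    def __call__(self, x):
        return self.scale * x

m = Scaler()
print(m.parameters)            # tuple containing the `scale` Parameter
print(m.trainable_parameters)  # the same tuple here, since `scale` is trainable
```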

gpflow.Parameter

class gpflow.Parameter(value, *, transform=None, prior=None, prior_on=<PriorOn.CONSTRAINED: 'constrained'>, trainable=True, dtype=None, name=None)[source]

Bases: tensorflow_probability.util.TransformedVariable

Attributes
prior_on
trainable

True if this instance is trainable, else False.

transform
unconstrained_variable

Methods

__call__(*args, **kwargs)

Call self as a function.

assign(value[, use_locking, name, read_value])

Assigns a constrained value to the parameter’s underlying unconstrained variable.

log_prior_density()

Log of the prior probability density of the constrained variable.

Parameters
  • value (Union[int, float, Sequence[Any], Tensor, Variable, Parameter]) –

  • transform (Optional[Bijector]) –

  • prior (Optional[Distribution]) –

  • prior_on (Union[str, PriorOn]) –

  • trainable (bool) –

  • dtype (Union[dtype, DType, None]) –

  • name (Optional[str]) –

__init__(value, *, transform=None, prior=None, prior_on=<PriorOn.CONSTRAINED: 'constrained'>, trainable=True, dtype=None, name=None)[source]

A parameter retains both a constrained and an unconstrained representation. If no transform is provided, the two are identical. Operating directly on unconstrained parameters is often inconvenient; a variance, for example, cannot be negative, so we impose a positivity constraint and it is natural to work with the constrained value. A prior can be placed either on the constrained representation (the default) or on the unconstrained one. A short sketch of the two representations follows the parameter list below.

Parameters
  • value (Union[int, float, Sequence[Any], Tensor, Variable, Parameter]) –

  • transform (Optional[Bijector]) –

  • prior (Optional[Distribution]) –

  • prior_on (Union[str, PriorOn]) –

  • trainable (bool) –

  • dtype (Union[dtype, DType, None]) –

  • name (Optional[str]) –
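
A minimal sketch of the two representations, assuming a positivity constraint via a Softplus bijector (the value 2.0 is illustrative):

```
import gpflow
import tensorflow_probability as tfp

# A positive parameter, e.g. a kernel variance.
variance = gpflow.Parameter(2.0, transform=tfp.bijectors.Softplus())

variance.numpy()                         # constrained value: 2.0
variance.unconstrained_variable.numpy()  # unconstrained value: inverse-softplus of 2.0
```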

assign(value, use_locking=False, name=None, read_value=True)[source]

Assigns a constrained value to the parameter. The value is first mapped through the inverse of the parameter’s transform, and the result is stored in the underlying unconstrained variable.

Example:

```
a = Parameter(2.0, transform=tfp.bijectors.Softplus())
b = Parameter(3.0)

a.assign(4.0)               # `a` now holds the constrained value 4.0
a.assign(tf.constant(5.0))  # `a` now holds the constrained value 5.0
a.assign(b)                 # `a` now holds the constrained value of `b`
```

Parameters
  • value (Union[int, float, Sequence[Any], Tensor, Variable, Parameter]) – Constrained tensor-like value.

  • use_locking (bool) – If True, use locking during the assignment.

  • name (Optional[str]) – The name of the operation to be created.

  • read_value (bool) – If True, returns something which evaluates to the new value of the variable; if False, returns the assign op.

Return type

Tensor

log_prior_density()[source]

Log of the prior probability density of the constrained variable.

Return type

Tensor
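
A minimal sketch, assuming a Gamma prior on the constrained value; the prior’s parameters are cast to float64 to match GPflow’s default float type:

```
import numpy as np
import gpflow
import tensorflow_probability as tfp

# Illustrative parameter with a Gamma(2, 3) prior on its constrained value.
lengthscale = gpflow.Parameter(
    1.0,
    transform=tfp.bijectors.Softplus(),
    prior=tfp.distributions.Gamma(np.float64(2.0), np.float64(3.0)),
)
lengthscale.log_prior_density()  # scalar Tensor: log density of the Gamma prior at 1.0
```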

property trainable

True if this instance is trainable, else False.

This attribute cannot be set directly. Use gpflow.set_trainable().

Return type

bool

gpflow.default_float

gpflow.default_float()[source]

Returns the default float type.

gpflow.default_int

gpflow.default_int()[source]

Returns the default integer type.
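
A minimal sketch using both helpers; unless the configuration has been changed, they return float64 and int32 respectively:

```
import tensorflow as tf
import gpflow

x = tf.constant([1.0, 2.0], dtype=gpflow.default_float())  # float64 by default
n = tf.constant([1, 2], dtype=gpflow.default_int())        # int32 by default
```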

gpflow.default_jitter

gpflow.default_jitter()[source]

The jitter is a constant that GPflow adds to the diagonal of matrices to achieve numerical stability when the condition number of the associated matrices is large and the matrices are therefore nearly singular.
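
A minimal sketch of the usual pattern, with an illustrative covariance matrix: add the jitter to the diagonal before a Cholesky factorisation.

```
import tensorflow as tf
import gpflow

A = tf.random.normal((5, 5), dtype=gpflow.default_float())
K = A @ tf.transpose(A)  # symmetric positive semi-definite, possibly ill-conditioned
jitter = gpflow.default_jitter() * tf.eye(5, dtype=gpflow.default_float())
L = tf.linalg.cholesky(K + jitter)
```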

gpflow.set_trainable

gpflow.set_trainable(model, flag)[source]

Set trainable flag for all `tf.Variable`s and `gpflow.Parameter`s in a `tf.Module` or collection of `tf.Module`s.

Parameters
  • model (Union[Module, Iterable[Module]]) –

  • flag (bool) –

Return type

None
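
A minimal sketch, freezing one parameter of an illustrative kernel so that optimisers skip it:

```
import gpflow

kernel = gpflow.kernels.SquaredExponential()
gpflow.set_trainable(kernel.variance, False)

kernel.variance.trainable         # False
len(kernel.trainable_parameters)  # 1: only the lengthscales remain trainable
```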

gpflow.base

gpflow.conditionals

gpflow.config

gpflow.covariances

gpflow.expectations

gpflow.inducing_variables

gpflow.kernels

gpflow.kullback_leiblers

gpflow.likelihoods

gpflow.logdensities

gpflow.mean_functions

Throughout GPflow, by default, latent functions being modelled with Gaussian processes are assumed to have zero mean, f ~ GP(0, k(x,x’)).

In some cases we may wish to model only the deviation from a fixed function with a Gaussian process. For flexibility, this fixed function can itself be input-dependent and parameterised, μ(x; θ), with some unknown parameters θ, resulting in f ~ GP(μ(x;θ), k(x,x’)).

The GPflow MeanFunction class allows this to be done while also learning the parameters θ of the parametric mean function.
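
A hedged sketch of attaching a parametric mean function to a model; the data and the choice of a GPR model with a constant mean are illustrative:

```
import numpy as np
import gpflow

X = np.linspace(0.0, 1.0, 10).reshape(-1, 1)
Y = np.sin(X) + 2.0  # data with a clear offset from zero

model = gpflow.models.GPR(
    (X, Y),
    kernel=gpflow.kernels.SquaredExponential(),
    mean_function=gpflow.mean_functions.Constant(),  # the constant is learned with the other parameters
)
```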

gpflow.models

gpflow.monitor

Provides basic functionality to monitor optimisation runs
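
A hedged sketch of a typical monitoring loop, assuming a writable "logs/" directory for TensorBoard event files; the model, data, and optimiser are illustrative:

```
import numpy as np
import tensorflow as tf
import gpflow
from gpflow.monitor import ModelToTensorBoard, Monitor, MonitorTaskGroup

X = np.random.rand(20, 1)
Y = np.sin(X)
model = gpflow.models.GPR((X, Y), kernel=gpflow.kernels.SquaredExponential())

task = ModelToTensorBoard("logs/", model)  # writes the model's parameters as TensorBoard summaries
monitor = Monitor(MonitorTaskGroup([task], period=5))

opt = tf.optimizers.Adam()
for step in range(100):
    opt.minimize(model.training_loss, model.trainable_variables)
    monitor(step)  # runs the task group on every 5th step
```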

gpflow.optimizers

gpflow.probability_distributions

gpflow.quadrature

gpflow.utilities