gpflow.utilities¶
gpflow.utilities.deepcopy¶
gpflow.utilities.deepcopy(input_module, memo=None)[source]¶

Returns a deep copy of the input tf.Module. To make this possible, first resets the caches stored inside each tfp.bijectors.Bijector, which would otherwise prevent deep-copying the tf.Module.

- Parameters
  input_module (~M) – tf.Module, including keras.Model, keras.layers.Layer and gpflow.Module.
  memo (Optional[Dict[int, Any]]) – passed through to copy.deepcopy (see https://docs.python.org/3/library/copy.html).
- Return type
  ~M
- Returns
  A deep copy of the input object.
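The memo argument behaves exactly as in the standard library's copy.deepcopy: objects already recorded in the memo are reused rather than copied again. A minimal stdlib sketch of that mechanism, with plain Python classes standing in for tf.Module components (the class names here are illustrative, not gpflow API):

```python
import copy

class Kernel:
    def __init__(self, variance):
        self.variance = variance

class Model:
    def __init__(self, kernel):
        self.kernel = kernel

shared = Kernel(variance=1.0)
m = Model(shared)

# Pre-populate the memo so `shared` is NOT copied: the deep copy of the
# model keeps a reference to the original kernel object instead.
memo = {id(shared): shared}
m_copy = copy.deepcopy(m, memo)

assert m_copy is not m            # the model itself is a new object
assert m_copy.kernel is shared    # but the memoized kernel is reused
```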
gpflow.utilities.freeze¶
gpflow.utilities.leaf_components¶
gpflow.utilities.multiple_assign¶
gpflow.utilities.multiple_assign(module, parameters)[source]¶

Takes a dictionary with new values and assigns all of them to the module. Dictionary keys are paths to the tf.Variable or gpflow.Parameter attributes of the input module.

- Parameters
  module (Module) – tf.Module.
  parameters (Dict[str, Tensor]) – a dictionary with keys of the form ".module.path.to.variable" and new value tensors.
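The path-keyed update can be pictured as attribute traversal: split the key on ".", walk down the module's attributes, and assign at the leaf. A stdlib sketch of that idea (a hypothetical helper, not the gpflow implementation, which assigns to real tf.Variable/gpflow.Parameter objects):

```python
class Leaf:
    def __init__(self, value):
        self.value = value

class SubModule:
    def __init__(self):
        self.lengthscale = Leaf(1.0)

class Module:
    def __init__(self):
        self.kernel = SubModule()

def multiple_assign_sketch(module, parameters):
    """Assign each value to the attribute named by its '.path.to.attr' key."""
    for path, value in parameters.items():
        *parents, leaf = path.strip(".").split(".")
        target = module
        for name in parents:          # walk down the attribute path
            target = getattr(target, name)
        getattr(target, leaf).value = value  # update the leaf in place

m = Module()
multiple_assign_sketch(m, {".kernel.lengthscale": 2.5})
assert m.kernel.lengthscale.value == 2.5
```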
gpflow.utilities.parameter_dict¶
gpflow.utilities.parameter_dict(module)[source]¶

Returns a dictionary of parameters (variables) for the tf.Module component. Dictionary keys are relative paths to the attributes to which the parameters (variables) are assigned.

    class SubModule(tf.Module):
        def __init__(self):
            self.parameter = gpflow.Parameter(1.0)
            self.variable = tf.Variable(1.0)

    class Module(tf.Module):
        def __init__(self):
            self.submodule = SubModule()

    m = Module()
    params = parameter_dict(m)
    # {
    #   ".submodule.parameter": <parameter object>,
    #   ".submodule.variable": <variable object>,
    # }

- Parameters
  module (Module) –
- Return type
  Dict[str, Union[Parameter, Variable]]
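The path-keyed dictionary above can be understood as a recursive walk over the module's attributes. A stdlib sketch of that traversal, using floats as stand-ins for Parameter/Variable leaves (the real function inspects tf.Module's variable tracking, not `vars`):

```python
def parameter_dict_sketch(module, prefix=""):
    """Collect 'leaf' attributes (here: floats) keyed by their relative path."""
    out = {}
    for name, attr in vars(module).items():
        path = f"{prefix}.{name}"
        if isinstance(attr, float):       # stand-in for Parameter/Variable
            out[path] = attr
        elif hasattr(attr, "__dict__"):   # recurse into sub-modules
            out.update(parameter_dict_sketch(attr, path))
    return out

class SubModule:
    def __init__(self):
        self.parameter = 1.0
        self.variable = 2.0

class Module:
    def __init__(self):
        self.submodule = SubModule()

params = parameter_dict_sketch(Module())
assert params == {".submodule.parameter": 1.0, ".submodule.variable": 2.0}
```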
gpflow.utilities.positive¶
gpflow.utilities.positive(lower=None, base=None)[source]¶

Returns a positive bijector (a reversible transformation from real to positive numbers).

- Parameters
  lower (Optional[float]) – overrides the default lower bound (if None, defaults to gpflow.config.default_positive_minimum()).
  base (Optional[str]) – overrides the base positive bijector (if None, defaults to gpflow.config.default_positive_bijector()).
- Return type
  Bijector
- Returns
  a bijector instance
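One common base bijector is softplus; with a lower bound, the transform becomes y = softplus(x) + lower. A stdlib sketch of that forward/inverse pair (choosing softplus here is an assumption for illustration; the actual default comes from gpflow.config.default_positive_bijector()):

```python
import math

def positive_sketch(lower=0.0):
    """Return (forward, inverse) functions for y = softplus(x) + lower."""
    def forward(x):
        return math.log1p(math.exp(x)) + lower   # softplus maps R -> (0, inf)
    def inverse(y):
        return math.log(math.expm1(y - lower))   # softplus inverse
    return forward, inverse

fwd, inv = positive_sketch(lower=1e-6)
y = fwd(-3.0)
assert y > 1e-6                      # output respects the lower bound
assert abs(inv(y) - (-3.0)) < 1e-9   # inverse round-trips
```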
gpflow.utilities.print_summary¶
gpflow.utilities.read_values¶
gpflow.utilities.reset_cache_bijectors¶
gpflow.utilities.reset_cache_bijectors(input_module)[source]¶

Recursively finds all tfp.bijectors.Bijector instances inside the components of the tf.Module (using traverse_component) and resets the caches stored inside each one.

- Parameters
  input_module (Module) – tf.Module, including keras.Model, keras.layers.Layer and gpflow.Module.
- Return type
  Module
- Returns
  the same object, but with all bijector caches reset
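The recursive search-and-reset can be sketched with plain Python objects: walk the object graph, and clear the cache of every "bijector" found. This is only a stdlib illustration of the traversal pattern (the Bijector/_cache names below are stand-ins, not the tfp internals):

```python
class Bijector:
    def __init__(self):
        self._cache = {"forward": [1, 2], "inverse": [3]}

class Component:
    def __init__(self):
        self.bijector = Bijector()

class Module:
    def __init__(self):
        self.component = Component()

def reset_cache_sketch(module, seen=None):
    """Recursively clear the _cache of every Bijector reachable from module."""
    seen = set() if seen is None else seen
    if id(module) in seen:            # guard against cycles in the object graph
        return module
    seen.add(id(module))
    if isinstance(module, Bijector):
        module._cache.clear()
    for attr in vars(module).values():
        if hasattr(attr, "__dict__"):
            reset_cache_sketch(attr, seen)
    return module

m = reset_cache_sketch(Module())
assert m.component.bijector._cache == {}
```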
gpflow.utilities.select_dict_parameters_with_prior¶
gpflow.utilities.tabulate_module_summary¶
gpflow.utilities.to_default_int¶
gpflow.utilities.training_loop¶
gpflow.utilities.training_loop(closure, optimizer=None, var_list=None, maxiter=1000.0, compile=False)[source]¶

Simple generic training loop. At each iteration, uses a GradientTape to compute the gradients of a loss function with respect to a set of variables.

- Parameters
  closure (Callable[[], Tensor]) – Callable that constructs a loss function based on the data and the model being trained.
  optimizer (Optional[Optimizer]) – tf.optimizers or tf.keras.optimizers instance that updates the variables by applying the corresponding loss gradients. Defaults to Adam with default settings.
  var_list (Optional[List[Variable]]) – List of model variables to be learnt during training.
  maxiter – Maximum number of iterations.
- Returns
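The closure-plus-variables structure of the loop can be sketched without TensorFlow: repeatedly evaluate the loss closure, compute gradients, and step the variables. Here central finite differences stand in for tf.GradientTape, and one-element lists stand in for mutable tf.Variable objects; everything below is an illustrative assumption, not the gpflow implementation:

```python
def training_loop_sketch(closure, var_list, learning_rate=0.1, maxiter=100):
    """Minimise closure() by gradient descent on the entries of var_list."""
    eps = 1e-6
    for _ in range(maxiter):
        for v in var_list:
            v[0] += eps
            up = closure()
            v[0] -= 2 * eps
            down = closure()
            v[0] += eps                       # restore the variable
            grad = (up - down) / (2 * eps)    # central-difference gradient
            v[0] -= learning_rate * grad      # gradient-descent update

x = [5.0]                                     # mutable stand-in for a Variable
training_loop_sketch(lambda: (x[0] - 2.0) ** 2, [x], maxiter=200)
assert abs(x[0] - 2.0) < 1e-3                 # converges to the minimum at 2
```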