A model is defined in its own package. The model must implement the model.Model interface, consisting of a single method Observe([]float64) float64. In the model's source code:
- Methods on the type implementing model.Model returning a single float64 or nothing are differentiated.
- Within the methods, the following is differentiated:
  - assignments to float64 (including parallel assignments if all values are of type float64);
  - returns of float64;
  - standalone calls to methods on the type implementing model.Model (apparently called for side effects on the model).
- Imported package names and non-dummy identifiers starting with the prefix for generated identifiers (_ by default) are reserved.
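As an illustrative sketch (not the library's actual source), a minimal model of the shape described above might look as follows. The Model interface is reproduced locally here so the snippet is self-contained, and the parameter layout [mu, log sigma] is an arbitrary choice made for this example:

```go
package main

import (
	"fmt"
	"math"
)

// Model mirrors the single-method interface described above
// (a local copy for illustration; the real interface lives in
// the library's model package).
type Model interface {
	Observe(x []float64) float64
}

// Normal is a toy model: the log-likelihood of Data under
// N(mu, sigma), with parameters passed as x = [mu, log sigma].
type Normal struct {
	Data []float64
}

// Observe returns the log-likelihood of the data given the
// parameters; assignments to float64 and the float64 return
// are what the differentiation rules above apply to.
func (m *Normal) Observe(x []float64) float64 {
	mu, sigma := x[0], math.Exp(x[1])
	ll := 0.0
	for _, y := range m.Data {
		d := (y - mu) / sigma
		ll += -0.5*d*d - math.Log(sigma) - 0.5*math.Log(2*math.Pi)
	}
	return ll
}

func main() {
	var m Model = &Normal{Data: []float64{-1, 0, 1}}
	fmt.Println(m.Observe([]float64{0, 0}) < 0) // log-likelihood of real data is negative here
}
```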
Derivatives do not propagate through a function that is not an elemental or a call to a model method. If a derivative is not registered for an elemental, calling the elemental in a differentiated context will cause a run-time error.
If a method Gradient() []float64 is provided for a model type, the model is treated as an 'elemental' model, and the gradient returned by Gradient() is used by inference algorithms. This allows the gradient to be coded by hand instead of relying on automatic differentiation.
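A sketch of such an elemental model follows. Two conventions here are assumptions of this sketch, not confirmed by the text: Gradient returns a []float64 matching the parameter slice passed to Observe, and it differentiates at the parameters most recently seen by Observe:

```go
package main

import "fmt"

// Quadratic is a toy elemental model: Observe returns
// -(x0^2 + x1^2)/2 and Gradient codes the derivative by hand,
// so no automatic differentiation is required.
type Quadratic struct {
	x []float64 // parameters last seen by Observe (sketch convention)
}

func (m *Quadratic) Observe(x []float64) float64 {
	m.x = append(m.x[:0], x...)
	return -(x[0]*x[0] + x[1]*x[1]) / 2
}

// Gradient returns d Observe / d x, hand-coded: (-x0, -x1).
func (m *Quadratic) Gradient() []float64 {
	return []float64{-m.x[0], -m.x[1]}
}

func main() {
	m := &Quadratic{}
	fmt.Println(m.Observe([]float64{1, 2}), m.Gradient())
}
```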
Functions are considered elementals (and must have a registered derivative) if their signature is of either kind

    func (float64, float64*) float64

that is, one or more non-variadic float64 arguments and a float64 return value, or

    func (float64) float64
For example, functions

    func foo(float64, float64, float64) float64
    func bar(float64) float64

are considered elementals, while functions

    func fee(...float64) float64
    func buz(int, float64) float64

are not. Gradients for selected functions from the math package are pre-defined (e.g. Erfc). Auxiliary elemental functions with pre-defined gradients are provided as well.
Distributions are models. Several distributions are provided with the library. In addition to the Observe method, distributions have Logp (single observation) and Logps (multiple observations) methods, which accept distribution parameters and observations as individual arguments rather than in a single slice.
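A sketch of this method set on a normal distribution follows; the name Logp is inferred from Logps, and the exact ordering of parameters and observations inside the Observe slice is an assumption of this example:

```go
package main

import (
	"fmt"
	"math"
)

// Normal is a sketch of a distribution-as-model.
type Normal struct{}

// Logp returns the log density of one observation y under N(mu, sigma),
// taking parameters and observation as individual arguments.
func (Normal) Logp(mu, sigma, y float64) float64 {
	d := (y - mu) / sigma
	return -0.5*d*d - math.Log(sigma) - 0.5*math.Log(2*math.Pi)
}

// Logps sums the log density over multiple observations.
func (n Normal) Logps(mu, sigma float64, y ...float64) float64 {
	ll := 0.0
	for _, yi := range y {
		ll += n.Logp(mu, sigma, yi)
	}
	return ll
}

// Observe unpacks x = [mu, sigma, y...] (layout assumed here) and
// defers to Logps, so the distribution is also a model.
func (n Normal) Observe(x []float64) float64 {
	return n.Logps(x[0], x[1], x[2:]...)
}

func main() {
	n := Normal{}
	fmt.Println(n.Observe([]float64{0, 1, 0.5}) == n.Logp(0, 1, 0.5)) // true
}
```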
deriv is used to differentiate a model; run deriv -h for the command-line syntax and the full list of options. The differentiated model is put into subpackage "ad" of the model's package, with the same name as the original package.
The provided inference algorithms include:
- optimization via gradient ascent methods;
- full posterior inference via Hamiltonian Monte Carlo variants.

DepthAdapter enables adaptation of the NUTS step size with respect to the average tree depth.