A model is defined in its own package. The model must implement interface model.Model. In the model’s source code:

  1. Method Observe of interface model.Model is differentiated.
  2. All methods on the type implementing model.Model are differentiated.
  3. Within the methods, the following is differentiated:
    • assignments to float64 (including parallel assignments if all values are of type float64);
    • returns of float64;
    • standalone calls to methods on the type implementing model.Model (presumably called for their side effects on the model).
  4. Imported package name ad is reserved.
  5. Non-dummy identifiers starting with the prefix for generated identifiers (_ by default) are reserved.

Derivatives propagate only through elementals and calls to model methods; they do not propagate through any other function. If a derivative is not registered for an elemental, calling the elemental in a differentiated context causes a run-time error.


Functions are considered elementals (and must have a registered derivative) if their signature is of the form

        func (float64, float64*) float64

that is, one or more non-variadic float64 arguments and a float64 return value. For example, the function

        func (float64, float64, float64) float64

is considered elemental, while functions

        func (...float64) float64
        func ([]float64) float64
        func (int, float64) float64

are not. Gradients for selected functions from the math package are pre-defined (Sqrt, Exp, Log, Pow, Sin, Cos, Tan). Auxiliary elemental functions with pre-defined gradients are provided as well.


Distributions are models. Several distributions are provided with infergo. In addition to the Observe method, distributions have Logp (single observation) and Logps (multiple observations) methods, which accept distribution parameters and observations as individual arguments rather than packed into a single slice.


Command-line utility deriv is used to differentiate a model. The command-line syntax is:

        deriv path/to/model/package

For example,

        deriv examples/hello/model

Run deriv -h for the full list of command-line options. The differentiated model is put into subpackage “ad” of the model’s package, with the same name as the original package.


For inference, infergo offers both optimization and full posterior inference.


Optimization

An optimizer implements interface infer.Grad. The provided implementations are gradient ascent with momentum and Adam. Both methods can work with stochastic data (e.g. streams or batches).

Full posterior

An MCMC sampler for full posterior inference implements interface infer.MCMC. Interface implementations are HMC and NUTS.

DepthAdapter enables adaptation of the NUTS step size with respect to the average tree depth.