Infergo – Go programs that learn

infergo is a probabilistic programming facility for the Go language. infergo lets you write probabilistic models in almost unrestricted Go and relies on automatic differentiation for optimization and inference. It works anywhere Go does. Hosted on Bitbucket. Licensed under the MIT license.

Example

more examples

Learning parameters of the Normal distribution from observations:

Model

type Model struct {
    Data []float64
}

// x[0] is the mean, x[1] is the log stddev of the distribution
func (m *Model) Observe(x []float64) float64 {
    // Our prior is a unit normal ...
    ll := Normal.Logps(0, 1, x...)
    // ... but the posterior is based on data observations.
    ll += Normal.Logps(x[0], math.Exp(x[1]), m.Data...)
    return ll
}

Inference

// Data
m := &Model{[]float64{
    -0.854, 1.067, -1.220, 0.818, -0.749,
    0.805, 1.443, 1.069, 1.426, 0.308}}

// Parameters
mean, logs := 0., 0.
x := []float64{mean, logs}
	
// Optimization
opt := &infer.Momentum{
    Rate:  0.01,
    Decay: 0.998,
}
for iter := 0; iter != 1000; iter++ {
    opt.Step(m, x)
}
mean, logs = x[0], x[1]

// Posterior
hmc := &infer.HMC{
    Eps: 0.1,
}
samples := make(chan []float64)
hmc.Sample(m, x, samples)
for i := 0; i != 1000; i++ {
    x = <-samples
}
hmc.Stop()
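
The loop above reads the draws one by one and keeps only the last; as an illustrative variation, the samples can be accumulated to report posterior means. A minimal sketch (fmt must be imported):

// Accumulate the draws instead of discarding them, then report the
// posterior means of the mean and the log stddev.
sum := make([]float64, len(x))
for i := 0; i != 1000; i++ {
    x = <-samples
    for j := range x {
        sum[j] += x[j]
    }
}
hmc.Stop()
fmt.Printf("posterior means: mean=%.3f, log stddev=%.3f\n",
    sum[0]/1000, sum[1]/1000)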

Acknowledgements

I owe a debt of gratitude to Frank Wood, who introduced me to probabilistic programming and inspired me to pursue probabilistic programming paradigms and applications. I also want to thank Jan-Willem van de Meent, with whom I had fruitful discussions of the motives, ideas, and implementation choices behind infergo, and whose thoughts and recommendations significantly influenced infergo's design. Finally, I want to thank PUB+, the company I worked for during the early stages of Infergo's development, for supporting my work on Infergo and letting me experiment with applying probabilistic programming to critical decision-making in a production environment.

infergo v1.2.2

Infergo v1.2.2 is out.

Infergo has been made to work with Go 1.25. Accompanying repositories (infergo-studies, gogp) have been updated to depend on this version.

gogp v1.0.1

GoGP v1.0.1 is out. This is the first stable (v1) release of GoGP, a library for Gaussian process regression. GoGP has been used in production for over a year, and has undergone many changes improving performance and robustness.

infergo v1.0.1

Infergo v1.0.1 is out.

This is the first stable (v1) release of Infergo. Infergo has undergone many changes during the past year, and has been used in production for mission-critical computations in the cloud.

gogp v0.1.0

GoGP is out. GoGP is a library for Gaussian process regression in Go and uses Infergo for automatic differentiation and inference.

infergo v0.7.0

Infergo v0.7.0 is out.

This release is the result of improving and extending Infergo alongside the development of GoGP, a library for Gaussian process regression.

What’s new:

  • A model's gradient can now be specified explicitly as a Gradient() method, instead of through automatic differentiation (see the sketch after this list).
  • An elemental may now also be a function that accepts a slice of floats (in addition to functions that accept one or more float scalars as parameters).
  • More kernels in the supplied kernel library.
  • As usual, fixes and performance improvements.
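
A rough sketch of the first item, assuming the Gradient() []float64 convention in which the gradient corresponds to the parameters of the most recent call to Observe; the model below supplies the gradient of a standard-normal log-density by hand:

// GradModel supplies its own gradient instead of relying on
// automatic differentiation (a sketch; the Gradient() signature is
// assumed to match that of differentiated models).
type GradModel struct {
    x []float64 // parameters of the most recent Observe call
}

// Observe returns the standard-normal log-density (up to a constant).
func (m *GradModel) Observe(x []float64) float64 {
    m.x = append(m.x[:0], x...)
    ll := 0.
    for _, xi := range x {
        ll -= 0.5 * xi * xi
    }
    return ll
}

// Gradient returns the gradient of Observe at the stored parameters.
func (m *GradModel) Gradient() []float64 {
    grad := make([]float64, len(m.x))
    for i, xi := range m.x {
        grad[i] = -xi
    }
    return grad
}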

infergo v0.6.1

Infergo v0.6.1 is out.

What’s new:

  • I moved many things around, some in a backward-incompatible way, but this should only affect a minority of users.
  • As a side effect of using Infergo for a rather involved model, I fixed two bugs in the automatic differentiation transformation. The bugs manifested in edge cases I did not even suspect existed.
  • The accompanying repository infergo-studies now contains a new case study — a rewrite of Stan’s LDA example.

infergo v0.5.0

infergo v0.5.0 is out.

What’s new:

  • Multithreading support. Differentiation can be performed concurrently in multiple goroutines, without locking around calls to Observe or Gradient and with little contention.
  • Examples and case studies performing inference in parallel, both with Infergo's own inference algorithms and through integration with Gonum (a sketch follows the list).
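
A minimal sketch of the parallel pattern, assuming a differentiated model m as in the example at the top of the page and that multithreaded differentiation has been enabled; sync must be imported, and the constant and variable names are illustrative:

// Run several independent optimizations in parallel, one goroutine
// per parameter vector; Observe and Gradient are called concurrently
// on the shared model m without explicit locking.
const chains = 4
results := make([][]float64, chains)
var wg sync.WaitGroup
for c := 0; c != chains; c++ {
    wg.Add(1)
    go func(c int) {
        defer wg.Done()
        x := []float64{0., 0.}
        opt := &infer.Momentum{Rate: 0.01, Decay: 0.998}
        for iter := 0; iter != 1000; iter++ {
            opt.Step(m, x)
        }
        results[c] = x
    }(c)
}
wg.Wait()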

The Tale of GoIDs

2.16 And the LORD God commanded the man, saying: ‘Of every tree of the garden thou mayest freely eat;

2.17 but of the tree of the knowledge of good and evil, thou shalt not eat of it; for in the day that thou eatest thereof thou shalt surely die.’

The Book of Genesis

Go gives the programmer introspection into every aspect of the language, and of a running program. But to one thing the programmer does not have access, and it is the goroutine identifier. Because the day the programmers know the goroutine identifier, they create goroutine-local storage through shared access and mutexes, and shall surely die.

infergo v0.3.0

infergo v0.3.0 is out.

What’s new:

  • Only methods returning float64 or nothing are differentiated. This makes it possible to define helper methods on the model, such as one returning the number of parameters, and to call them outside of the differentiated context.
  • infergo models can be optimized using Gonum optimization algorithms, including BFGS and variants. The lr-gonum case study applies L-BFGS to linear regression (see the sketch after this list).
  • Case studies have been extended. New studies include:
    • linear regression, solved using either stochastic gradient descent or BFGS;
    • compilation of infergo models and inference into WebAssembly, running in the browser;
    • integration with Gonum;
    • Neal's funnel, a re-parameterization example borrowed from the Stan documentation.
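
A minimal sketch of the Gonum integration, assuming the infer.FuncGrad adaptor that wraps a model's Observe and Gradient for gonum/optimize, with the model m and data as in the example at the top of the page (fmt, log, and gonum.org/v1/gonum/optimize must be imported):

// Optimize the model with L-BFGS from Gonum.
Func, Grad := infer.FuncGrad(m)
p := optimize.Problem{Func: Func, Grad: Grad}

x := []float64{0., 0.}
result, err := optimize.Minimize(p, x, nil, &optimize.LBFGS{})
if err != nil {
    log.Fatal(err)
}
fmt.Println(result.X)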

infergo v0.2.2

infergo v0.2.2 is out.

What’s new:

  • Constant folding.
  • Automatic import of packages required for short variable declarations (see issue #10).