JuliaCon 2022

GraphPPL.jl: a package for specification of probabilistic models
2022-07-28, Red

We present GraphPPL.jl - a package for user-friendly specification of probabilistic models with variational inference constraints. GraphPPL.jl represents a model as a factor graph and supports the specification of factorization and form constraints on the variational posterior over the latent variables. Together, the packages GraphPPL.jl, ReactiveMP.jl, and Rocket.jl provide a complete reactive programming-based ecosystem for running efficient and customizable variational Bayesian inference.


Background

Bayesian modeling has become a popular framework for important real-time machine learning applications, such as speech recognition and robot navigation. Unfortunately, many useful probabilistic time-series models contain a large number of latent variables, and consequently, real-time Bayesian inference in these models based on Monte Carlo sampling or other black-box methods is not feasible.

Problem statement

Existing packages for automated Bayesian inference in the Julia language ecosystem, such as Turing.jl, Stan.jl, and Soss.jl, support probabilistic model specification through well-designed macro-based meta-languages. These packages assume that inference is executed by black-box variational or sampling-based methods. In principle, for conjugate probabilistic time-series models, message passing-based variational inference by minimization of a constrained Bethe Free Energy yields approximate inference solutions at a much lower computational cost. In this contribution, we develop a user-friendly and comprehensive meta-language for specifying both a probabilistic model and variational inference constraints that balance the accuracy of inference results against computational cost.
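
For orientation, a standard (unconstrained) form of the Bethe free energy over a factor graph with factors f_a and variables s_i is sketched below in LaTeX notation; the precise constrained formulation minimized by the ecosystem is given in [1], and the symbols q_a, q_i (local variational factors) and d_i (the degree of variable s_i) appear here only for illustration.

    F[q] = \sum_{a} \int q_a(\mathbf{s}_a) \ln \frac{q_a(\mathbf{s}_a)}{f_a(\mathbf{s}_a)} \, \mathrm{d}\mathbf{s}_a
           \; - \; \sum_{i} (d_i - 1) \int q_i(s_i) \ln q_i(s_i) \, \mathrm{d}s_i ,
    \quad \text{subject to } q_i(s_i) = \int q_a(\mathbf{s}_a) \, \mathrm{d}\mathbf{s}_{a \setminus i} \text{ for all } a \ni i .

Imposing additional factorization or form constraints on the local factors q_a and marginals q_i is exactly the mechanism by which inference accuracy is traded against computational cost.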

Solution proposal

The GraphPPL.jl package implements a user-friendly specification language for both the model and the inference constraints. GraphPPL.jl exports the @model macro to create a probabilistic model in the form of a factor graph that is compatible with ReactiveMP.jl's reactive message passing-based inference engine. To enable fast and accurate inference, all message update rules default to precomputed analytical solutions. The ReactiveMP.jl package already implements a selection of precomputed rules. If an analytical solution is not available, the GraphPPL.jl package provides ways to tweak, relax, and customize local constraints in selected parts of the factor graph. To simplify this process, the package exports the @constraints macro to specify extra factorization and form constraints on the variational posterior [1]. For advanced use cases, GraphPPL.jl exports the @meta macro, which enables custom modifications of the message passing inference for individual nodes in the factor graph representation of the model. This approach applies local approximation methods only where necessary and thereby allows for efficient variational Bayesian inference (see the sketch below).
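
As a hedged illustration of this workflow, the sketch below specifies a toy Gaussian model with unknown mean m and precision w, together with a mean-field factorization constraint on the variational posterior. It follows the @model/@constraints style of the GraphPPL.jl and ReactiveMP.jl releases from this period; the model name and the exact data-variable and form-constraint syntax are illustrative and may differ between package versions.

    using Rocket, GraphPPL, ReactiveMP

    # Model specification: priors on the mean m and precision w,
    # and n conditionally independent Gaussian observations y[i].
    @model function gaussian_model(n)
        y = datavar(Float64, n)               # placeholders for the observed data
        m ~ NormalMeanVariance(0.0, 100.0)    # vague prior on the mean
        w ~ GammaShapeRate(0.01, 0.01)        # vague prior on the precision
        for i in 1:n
            y[i] ~ NormalMeanPrecision(m, w)  # likelihood for each observation
        end
        return y, m, w
    end

    # Variational constraints: a mean-field factorization of the
    # posterior over m and w, which keeps all message update rules
    # analytically tractable for this model.
    constraints = @constraints begin
        q(m, w) = q(m)q(w)
        # A form constraint (e.g. restricting q(w) to a point mass) could be
        # added here as well; see the package documentation for the exact
        # syntax of form constraints and of the @meta macro.
    end

The resulting model and constraints objects (and, optionally, a @meta specification) are then handed to ReactiveMP.jl's inference routines, which execute reactive message passing on the underlying factor graph.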

Evaluation

Over the past two years, our probabilistic modeling ecosystem, comprising GraphPPL.jl, ReactiveMP.jl, and Rocket.jl, has been battle-tested on many sophisticated models, which has led to several publications in high-ranking journals such as Entropy [1] and Frontiers [2], and at conferences such as MLSP-2021 [3] and ISIT-2021 [4]. The current contribution enables a user-friendly approach to such sophisticated Bayesian modeling problems.

Conclusions

We believe that a user-friendly specification of efficient Bayesian inference solutions for complex models is a key factor in expediting the application of Bayesian methods. We have developed a complete ecosystem for running fast, efficient, and reactive variational Bayesian inference, with a user-friendly specification language for both the probabilistic model and the variational constraints. We are excited to present GraphPPL.jl as part of this ecosystem and to discuss the advantages and drawbacks of our approach.

References

[1] Ismail Senoz, Thijs van de Laar, Dmitry Bagaev, Bert de Vries. Variational Message Passing and Local Constraint Manipulation in Factor Graphs, Entropy, Special Issue on Approximate Bayesian Inference, 2021.

[2] Albert Podusenko, Bart van Erp, Magnus Koudahl, Bert de Vries. AIDA: An Active Inference-Based Design Agent for Audio Processing Algorithms, Frontiers in Signal Processing, 2022.

[3] Albert Podusenko, Bart van Erp, Dmitry Bagaev, Ismail Senoz, Bert de Vries. Message Passing-Based Inference in the Gamma Mixture Model, 2021 IEEE 31st International Workshop on Machine Learning for Signal Processing (MLSP).

[4] Ismail Senoz, Albert Podusenko, Semih Akbayrak, Christoph Mathys, Bert de Vries. The Switching Hierarchical Gaussian Filter, 2021 IEEE International Symposium on Information Theory (ISIT).

My research interests lie in the fields of computer science, machine learning, and probabilistic programming. Currently, I am a PhD candidate in the SPS group of the Electrical Engineering department at Eindhoven University of Technology. I am working on a high-performance message passing-based Bayesian inference package in the Julia programming language. My research project focuses on signal processing and active inference applications, but also aims to expand the scope of possible applications for message passing in general.