JuliaCon 2024

ExponentialFamilyManifolds.jl: Advancing Probabilistic Modeling
07-12, 10:20–10:30 (Europe/Amsterdam), Function (4.1)

ExponentialFamilyManifolds.jl tackles non-conjugate inference in probabilistic models by projecting distributions onto exponential families via Kullback-Leibler divergence minimization. Integrating with Manifolds.jl, Manopt.jl, and RxInfer.jl, it improves scalability and inference efficiency, especially in models that mix conjugate and non-conjugate components. This makes inference with exponential-family priors both efficient and flexible.


Probabilistic modeling (also known as Bayesian modeling) is a vital tool for contemporary data scientists, enabling them to reason accurately about complex real-world processes. A significant computational challenge in this field is the use of non-conjugate priors: while such priors may increase model accuracy in principle, they also complicate the inference process.

For example, in a household consumption model, we may choose to model consumption as a log-linear function of household income. In a Bayesian context, this function is the likelihood, which must be paired with a prior distribution over income. Here, the conjugate prior is a log-normal distribution over income: with a conjugate prior, the posterior belongs to the same distribution family as the prior and can be computed analytically. The Julia package RxInfer.jl, based on message passing in a factor graph, is a fast-executing inference toolbox that excels at models with such conjugate distribution pairs, as in the sketch below.
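To make the conjugate case concrete, here is a minimal sketch of message-passing inference in RxInfer.jl. It uses a simple Normal-Normal conjugate pair rather than the consumption model above; the model name, parameters, and data are illustrative, and the syntax follows the patterns in the RxInfer.jl documentation.

```julia
# Minimal sketch: a conjugate Normal-Normal pair in RxInfer.jl.
# All names and numbers are illustrative.
using RxInfer

@model function mean_estimation(y)
    # Conjugate prior over the unknown mean.
    μ ~ Normal(mean = 0.0, variance = 100.0)
    for i in eachindex(y)
        # Normal likelihood: conjugate with the Normal prior.
        y[i] ~ Normal(mean = μ, variance = 1.0)
    end
end

# Closed-form message passing yields an analytic Normal posterior over μ.
result    = infer(model = mean_estimation(), data = (y = randn(50) .+ 3.0,))
posterior = result.posteriors[:μ]
```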

However, given the high variability of and uncertainty about personal data such as income, a gamma distribution might be a more suitable prior for income. Unfortunately, the gamma prior is non-conjugate with the log-linear likelihood. This leads to a posterior that is not gamma-distributed and, consequently, to a computationally (much) more complex inference problem. In such cases, the general-purpose inference package Turing.jl, based on black-box variational inference and Monte Carlo sampling, will likely be able to handle the inference task (see the sketch below). Still, scalability issues often arise, particularly for large datasets and/or large models. Thus, we face a computational trade-off at the heart of Bayesian modeling: we want both accurate model assumptions and a tractable inference process, but these preferences often conflict.
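For contrast, here is a hedged sketch of the non-conjugate variant in Turing.jl: a gamma prior over income combined with a log-normal (log-linear) consumption likelihood. All parameters are illustrative; NUTS sampling handles the non-conjugacy, but at a sampling cost that grows with data and model size.

```julia
# Sketch: non-conjugate gamma prior with a log-linear (log-normal) likelihood,
# handled by Monte Carlo sampling in Turing.jl. Parameters are illustrative.
using Turing

@model function consumption_model(c)
    income ~ Gamma(2.0, 2.0)                 # non-conjugate prior over income
    for i in eachindex(c)
        c[i] ~ LogNormal(log(income), 0.5)   # log-linear consumption likelihood
    end
end

c_obs = rand(LogNormal(log(3.0), 0.5), 100)  # synthetic consumption data
chain = sample(consumption_model(c_obs), NUTS(), 1_000)
```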

The ExponentialFamilyManifolds.jl (EFM) package addresses this limitation by significantly extending the range of tractable models for message-passing-based inference protocols as implemented by RxInfer.jl. Due to the modularity of message-passing-based inference, EFM-enhanced RxInfer.jl inherits all the efficiencies of analytical closed-form message update rules for conjugate distribution pairs.
EFM tackles local non-conjugate distribution pairs by projecting the product of the likelihood (log-normal in our example) and the non-conjugate prior (gamma) onto a family that preserves conjugacy with the other model components (a gamma in our scenario). Technically, EFM efficiently projects generic distributions onto a specified exponential-family distribution by minimizing the Kullback-Leibler divergence. This optimization is carried out with Manifolds.jl and Manopt.jl, two robust packages from the Julia ecosystem. The sketch below illustrates the underlying idea.
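The following is a minimal, self-contained sketch of the underlying idea only, not the EFM API: it projects the non-conjugate product onto a gamma distribution by minimizing a KL divergence with Manopt.jl, using an ad-hoc reparameterization where EFM would instead supply a proper natural-parameter manifold. All parameters and helper names are illustrative assumptions.

```julia
# Conceptual sketch of KL projection onto an exponential-family member.
# This is NOT the EFM API; names and parameters are illustrative.
using Distributions, Manifolds, Manopt, ForwardDiff, QuadGK

# Unnormalized log-density of the product to approximate: a gamma prior over
# income times one log-normal likelihood factor (observed consumption 1.5).
logtarget(x) = logpdf(Gamma(2.0, 2.0), x) + logpdf(LogNormal(log(x), 0.5), 1.5)

# Map unconstrained coordinates to a valid Gamma(α, θ). EFM instead provides
# genuine natural-parameter manifolds, making this reparameterization unnecessary.
q_from(p) = Gamma(exp(p[1]), exp(p[2]))

# KL(q ‖ target), up to the target's unknown normalization constant.
function kl(p)
    q = q_from(p)
    integrand(x) = pdf(q, x) * (logpdf(q, x) - logtarget(x))
    return first(quadgk(integrand, 1e-8, 100.0))
end

M = Euclidean(2)                             # flat stand-in manifold
f(M, p) = kl(p)
grad_f(M, p) = ForwardDiff.gradient(kl, p)

p_opt = gradient_descent(M, f, grad_f, [0.0, 0.0])
q_opt = q_from(p_opt)   # gamma approximation, conjugate with the rest of the model
```

The design point of the real package is precisely that the constrained natural-parameter space is treated as a Riemannian manifold, so the optimizer never leaves the valid parameter region and no manual reparameterization is needed.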

In short, the proposed EFM-enhanced Bayesian inference approach substantially widens the range of tractable models in probabilistic modeling problems.

See also: GitHub

Mykola Lukashchuk is a PhD candidate at Eindhoven University of Technology.

Bert de Vries received MSc (1986) and PhD (1991) degrees in Electrical Engineering from Eindhoven University of Technology (TU/e) and the University of Florida, respectively. From 1992 to 1999, he worked as a research scientist at Sarnoff Research Center in Princeton (NJ, USA). Since 1999, he has been employed in the hearing aid industry, in both engineering and managerial positions. De Vries was appointed professor in the Signal Processing Systems Group at TU/e in 2012. His research focuses on the development of intelligent autonomous agents that learn from in-situ interactions with their environment. His group aims to use these agents to automate the development of novel signal processing and control algorithms (see biaslab.org), drawing inspiration from diverse fields including computational neuroscience, Bayesian machine learning, and signal processing systems. A current major application area is the personalization of medical signal processing systems such as hearing aid algorithms.
