CompositionalNetworks.jl: a scaling glass-box neural network
07-29, 17:50–18:00 (UTC), Green

Interpretable Compositional Networks (ICNs) are a variant of neural networks that, unlike regular artificial neural networks, give the user interpretable results. An ICN is a glass box producing a composition of functions that scales with the size of the input, which allows the learning phase to run on relatively small spaces.
This presentation covers the different Julia packages and paradigms involved, a set of use cases, current limitations, future developments, and, hopefully, possible collaborations.


The JuliaConstraints GitHub organization was born last fall and aims to foster collaborative packages around the theme of Constraint Programming (CP) in Julia.
As in many fields of optimization, there is often a trade-off between the efficiency and the simplicity of a model. CompositionalNetworks.jl was designed to smooth that trade-off. One could draw a parallel with not having to choose between the speed of C and the simplicity of Python (among others).
An Interpretable Compositional Network (ICN) takes any vector (of arbitrary size) as input and outputs a (non-negative) value corresponding to a user-given metric. For instance, consider an error function network in Constraint Programming: one can choose a Hamming distance metric to evaluate the distance between a configuration of the variables’ values and the closest satisfying configuration. It gives the minimum number of variables whose values must change to reach a solution.
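
As an illustration of this metric (not part of the package; hamming_error is a hypothetical name), the error for AllDifferent can be brute-forced on tiny domains by enumerating every satisfying assignment:

function hamming_error(x::AbstractVector, domain)
    best = length(x)  # upper bound: change every variable
    for candidate in Iterators.product(fill(domain, length(x))...)
        allunique(candidate) || continue  # keep only AllDifferent solutions
        best = min(best, count(i -> x[i] != candidate[i], eachindex(x)))
    end
    return best
end

hamming_error([1, 2, 3, 1], 1:4)  # 1: changing a single value reaches a solution
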
A classic constraint showcasing the modeling power of Constraint Programming is AllDifferent, which ensures that all the variables take different values. One can model a Sudoku puzzle with such constraints alone.
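
For a sense of scale, a 9×9 Sudoku grid reduces to 27 such constraints: one per row, column, and 3×3 block. A plain-Julia sketch of those constraint scopes (names are illustrative):

rows   = [[(r, c) for c in 1:9] for r in 1:9]
cols   = [[(r, c) for r in 1:9] for c in 1:9]
blocks = vec([vec([(3br + i, 3bc + j) for i in 1:3, j in 1:3]) for br in 0:2, bc in 0:2])
scopes = vcat(rows, cols, blocks)  # 27 AllDifferent scopes in total

sudoku_satisfied(grid) = all(scope -> allunique(grid[r, c] for (r, c) in scope), scopes)
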
An ICN, in its most basic form, is composed of four layers: transformation, arithmetic, aggregation, and composition. Weights between the layers are binary, meaning that each neuron (operation) is either connected to or disconnected from each neuron in the adjacent layers. These simple Boolean weights allow a straightforward composition of the operations that make up an ICN and yield a result interpretable by a human. The user can then either verify and use the composition directly, or treat it as inspiration for a handmade composition.
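
A toy model of that structure, purely illustrative (the package’s internals differ, and the arithmetic layer is omitted for brevity): each layer holds a pool of named operations, and Boolean weights pick which operation of each layer takes part in the final composition.

layers = (
    transformation = [:identity => identity,
                      :count_eq => x -> [count(==(v), x) - 1 for v in x]],
    aggregation    = [:sum => sum,
                      :count_positive => v -> count(>(0), v)],
    composition    = [:identity => identity],
)
weights = (transformation = [false, true],  # select :count_eq
           aggregation    = [false, true],  # select :count_positive
           composition    = [true])         # select :identity

# Reading the Boolean weights back as a function composition is immediate.
selected(pool, w) = last(pool[findfirst(w)])
composed = foldr(∘, [selected(layers[k], weights[k]) for k in reverse(keys(layers))])
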
An ICN learning on a small space of 4 variables, each with domain [1, 2, 3, 4], can extract the following function:

icn_all_different(x::AbstractVector) = x |> count_eq_left |> count_positive |> identity

where count_eq_left counts, for each i, the number of elements equal to xi among the elements preceding it, and count_positive counts the number of elements xi > 0. This output is equivalent to the best known handmade error function for the AllDifferent constraint. Furthermore, it scales to vectors of any length.
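
A minimal runnable sketch of this composition, with the two helpers defined as described above (the actual package implements them as layer operations):

count_eq_left(x) = [count(==(x[i]), view(x, 1:i-1)) for i in eachindex(x)]
count_positive(v) = count(>(0), v)

icn_all_different(x::AbstractVector) = x |> count_eq_left |> count_positive |> identity

icn_all_different([1, 2, 3, 4])  # 0: already a solution
icn_all_different([1, 2, 3, 1])  # 1: one variable must change
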

In CompositionalNetworks.jl, we generate the code of the composed function directly. We can even compile it on the fly thanks to Julia’s metaprogramming capabilities. Moreover, we can export the compositions to human-readable language or to other programming languages.
Users can check and modify the function composed by an ICN to adapt or improve the output to their needs and requirements. Of course, the function can also be used directly.
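
A sketch of that idea (not the package’s actual code generator): assemble the composition as a string, then let Julia parse and compile it on the fly. It assumes the helper functions from the sketch above are in scope.

ops = [:count_eq_left, :count_positive, :identity]
src = "icn_generated(x::AbstractVector) = x |> " * join(String.(ops), " |> ")
eval(Meta.parse(src))  # compiles a fresh, fully typed Julia function
println(src)           # the same composition, exported as readable code
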

During this talk, we will cover an out-of-the-box use of CompositionalNetworks.jl along with the different Julian and non-Julian key aspects of the development of this package: among others, the use of other Julia packages as dependencies, such as the genetic algorithm in Evolutionary.jl to learn the Boolean weights of an ICN, or the generation of compositions as either programming code or mathematical language through Julia’s efficient metaprogramming.
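
To give a flavor of the weight-learning step, here is a toy stand-in written in plain Julia rather than through Evolutionary.jl’s API; decode (mapping Boolean weights to a composed function) and loss (scoring it against the target metric) are hypothetical placeholders:

using Random

function toy_ga(n_bits, decode, loss; pop = 32, gens = 100, rng = Random.default_rng())
    population = [bitrand(rng, n_bits) for _ in 1:pop]
    for _ in 1:gens
        sort!(population; by = bits -> loss(decode(bits)))  # best candidates first
        for i in (pop ÷ 2 + 1):pop                          # refill the worst half
            child = copy(population[i - pop ÷ 2])           # clone a survivor
            child[rand(rng, 1:n_bits)] ⊻= true              # flip one random bit
            population[i] = child
        end
    end
    return first(population)  # best Boolean weight vector found
end
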
The versatility of the Julia version of ICNs, combined with metaprogramming, allows much broader practical use cases for any user of ICNs compared to the original C++ version, where modifying the code is a much harder task and metaprogramming is not possible (and usually not recommended for (pre)compiled languages).

While we provide a basic ICN use case as error function networks in Constraint Programming, it is straightforward for the user to provide additional operations, or even layers. The type of functions learned and composed is more versatile than our use case. We hope this package can be useful to, but not limited to, people in the Constraint Programming and Julia communities.
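
Continuing the toy pool representation from the earlier sketch (the package’s actual extension mechanism differs), a user-supplied operation simply slots in next to the built-in candidates:

count_greater(x) = [count(>(v), x) for v in x]  # for each element, count entries greater than it

extended_transformations = [:count_eq_left => count_eq_left,  # from the sketch above
                            :count_greater => count_greater]  # user-defined addition
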

Although our current applications are mainly within some packages of JuliaConstraints, we hope to exchange with the community about other methods to compose functions, apply them to other problems, and improve our understanding of Julia for Interpretable Compositional Networks.

PhD student at the University of Évora