2023-07-28, Room 32-155
In many fields of optimization, there is often a tradeoff between efficiency and the simplicity of the model. ConstraintLearning.jl is an interface to several tools designed to smooth that tradeoff.
- CompositionalNetworks.jl: a scaling glass-box method to learn highly combinatorial functions [JuliaCon 2021]
- QUBOConstraints.jl: a package to automatically learn QUBO matrices from optimization constraints.
Applications focus on Constraint Programming, although they are not limited to it.
In Constraint Programming, a problem can be modeled as simply as
- a set of variables, each with its domain
- a set of predicates over those variables called constraints
- an optional objective
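The three ingredients above can be sketched in a few lines. This is a language-agnostic illustration written in Python, not the API of any of the packages discussed; all names (`domains`, `constraints`, `objective`) are invented for the example.

```python
# Illustrative sketch of a Constraint Programming model: domains,
# predicate constraints, and an optional objective. Names are invented.
from itertools import product

# 1. Variable domains: three variables, each ranging over 1..3
domains = {"x": [1, 2, 3], "y": [1, 2, 3], "z": [1, 2, 3]}

# 2. Constraints as predicates over the variables
constraints = [
    lambda a: len({a["x"], a["y"], a["z"]}) == 3,  # all_different
    lambda a: a["x"] < a["z"],                     # an ordering constraint
]

# 3. Optional objective (here: maximize x + y + z over feasible points)
objective = lambda a: a["x"] + a["y"] + a["z"]

names = list(domains)
feasible = [
    dict(zip(names, values))
    for values in product(*(domains[n] for n in names))
    if all(c(dict(zip(names, values))) for c in constraints)
]
best = max(feasible, key=objective)
print(best)
```

Brute-force enumeration is of course only viable for toy domains; the point is that the model itself needs nothing beyond domains and predicates.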
Efficient solvers, however, often expect richer models in exchange for better performance. For instance, Constraint-Based Local Search (CBLS) solvers see significant speedups when a constraint is encoded as a more refined function (such as an error function) rather than a plain predicate. We designed CompositionalNetworks.jl to learn those functions from simple predicates, effectively removing that modeling burden.
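To see why a refined function helps, compare a boolean predicate with a hand-written error function for `all_different`. This sketch is in Python and only illustrates the target shape of the functions CompositionalNetworks.jl learns; it is not the package's output.

```python
# Predicate vs. error function for the all_different constraint.
from collections import Counter

def all_different_predicate(values):
    """Boolean view: satisfied or not. A local-search solver gets no
    signal about which violated assignment is 'closer' to feasible."""
    return len(set(values)) == len(values)

def all_different_error(values):
    """Refined view: the number of assignments that must change to
    satisfy the constraint. Zero iff the predicate holds; larger means
    'more violated', so the solver can rank neighboring moves."""
    counts = Counter(values)
    return sum(c - 1 for c in counts.values())

print(all_different_predicate([1, 2, 2, 3]))  # False
print(all_different_error([1, 2, 2, 3]))      # 1: fix one duplicate
print(all_different_error([1, 1, 1, 1]))      # 3: three values must change
```

A CBLS solver descending on the error function can distinguish `[1, 2, 2, 3]` from `[1, 1, 1, 1]`, whereas the predicate rates both simply as "violated".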
Similarly, we designed QUBOConstraints.jl so that QUBO matrices are learned from simple predicates. Among other uses, these QUBO encodings can run on QUBO-based solvers and quantum annealing machines.
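What a QUBO encoding of a constraint means can be shown with a small hand-built example: a matrix Q such that the energy x'Qx is minimized exactly on the satisfying binary assignments. This Python sketch builds Q by hand for the "exactly one variable is 1" (one-hot) constraint; QUBOConstraints.jl learns such matrices rather than deriving them manually, and none of these names come from the package.

```python
# Hand-built QUBO for the one-hot constraint over n binary variables.
# The penalty (sum_i x_i - 1)^2 expands, using x_i^2 = x_i, to a constant
# plus x' Q x with Q = -1 on the diagonal and +1 off-diagonal (symmetric).
from itertools import product

n = 3
Q = [[-1 if i == j else 1 for j in range(n)] for i in range(n)]

def energy(x, Q):
    """Evaluate the quadratic form x' Q x for a binary vector x."""
    return sum(Q[i][j] * x[i] * x[j]
               for i in range(len(x)) for j in range(len(x)))

for x in product((0, 1), repeat=n):
    tag = "feasible" if sum(x) == 1 else ""
    print(x, energy(x, Q), tag)
# The one-hot vectors reach the minimum energy -1; every other
# assignment has strictly higher energy, so an annealer that minimizes
# x' Q x settles on satisfying assignments.
```

The learned setting replaces this manual expansion: given samples of a predicate, the package produces a Q whose low-energy states match the predicate's satisfying assignments.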
Finally, ConstraintLearning.jl provides a common interface for both learning techniques. It also allows both packages to contain only minimal data structures and generic solving interfaces, making them easy to include in appropriate solvers.
Researcher at the Internet Initiative Japan (IIJ) research lab, specializing in optimization, networks, data structures, and algorithms.