We present FrankWolfe.jl, a new Julia package implementing several Frank-Wolfe algorithms to optimize differentiable functions with convex constraints.
The Julia optimization ecosystem includes toolboxes for unconstrained optimization on one hand and domain-specific modelling languages for constrained optimization on the other hand.
This package makes it possible to efficiently optimize functions defined as arbitrary Julia code, over constraint sets that are either expressed in closed form through a DSL or given as arbitrary convex sets.
For large-scale and data-intensive optimization, first-order methods are often a favoured choice, motivated by faster iterations and lower memory requirements.
Frank-Wolfe algorithms allow the optimization of a differentiable function over a convex set, solving a linear optimization problem at each iteration to determine a progress direction.
Each of these linear subproblems is typically much cheaper to solve than the quadratic projection subproblems required by projected gradient algorithms.
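To make the iteration concrete, here is a minimal, self-contained Julia sketch of this scheme for a quadratic objective over the probability simplex; the simplex_lmo and frank_wolfe_sketch names are illustrative helpers for this description, not the package's API.

```julia
# Minimal Frank-Wolfe sketch: minimize f(x) = 1/2 * ||x - b||^2
# over the probability simplex {x >= 0, sum(x) = 1}.

# Linear minimization oracle for the probability simplex:
# argmin_{v in simplex} <g, v> is the vertex e_i with i = argmin_i g_i.
function simplex_lmo(g)
    v = zeros(length(g))
    v[argmin(g)] = 1.0
    return v
end

function frank_wolfe_sketch(grad, lmo, x0; iterations=1000)
    x = copy(x0)
    for t in 1:iterations
        v = lmo(grad(x))                 # linear subproblem: best vertex for this gradient
        gamma = 2 / (t + 2)              # classic open-loop step size
        x = (1 - gamma) * x + gamma * v  # convex update, the iterate stays feasible
    end
    return x
end

b = [0.3, 0.9, -0.2, 0.5]
x = frank_wolfe_sketch(x -> x - b, simplex_lmo, simplex_lmo(zero(b)))
```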
The talk will present the package and how it fills a previously unaddressed spot in the Julia optimization landscape, comparing it with DSL-based approaches such as JuMP, Convex.jl, and StructuredOptimization.jl, and with other smooth optimization frameworks such as Optim.jl and JuliaSmoothOptimizers.
After a quick overview of the algorithm, we will cover some interesting properties it exhibits on specific optimization problems, in particular the sparsity of solutions preserved throughout the whole optimization process.
Sparsity here means that the iterates are a convex combination of a small number of extreme points of the feasible set, which can result in low-rank matrices, sparse arrays, or other specific structures depending on the feasible set.
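As a rough illustration of this property, the convex decomposition can be tracked explicitly in the sketch above (still with hypothetical helper names rather than the package's API): after t iterations the iterate is a combination of at most t + 1 vertices, which over the probability simplex means at most t + 1 nonzero entries, however large the ambient dimension.

```julia
# Same illustrative simplex LMO as in the earlier sketch.
simplex_lmo(g) = (v = zeros(length(g)); v[argmin(g)] = 1.0; v)

# Track the convex decomposition explicitly: after t iterations the iterate
# is a combination of at most t + 1 extreme points of the feasible set.
function frank_wolfe_active_set(grad, lmo, x0; iterations=50)
    vertices = [copy(x0)]                # extreme points used so far
    weights  = [1.0]                     # their convex weights
    x = copy(x0)
    for t in 1:iterations
        v = lmo(grad(x))
        gamma = 2 / (t + 2)
        weights .*= (1 - gamma)          # shrink existing weights
        push!(vertices, v)
        push!(weights, gamma)
        x = (1 - gamma) * x + gamma * v
    end
    return x, vertices, weights
end

b = randn(10_000)
x, vertices, weights = frank_wolfe_active_set(x -> x - b, simplex_lmo, simplex_lmo(zero(b)))
count(!iszero, x)                        # at most 51 nonzeros out of 10_000 entries
```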
In the last part of the talk, we will cover some insights gained from the development of the package on building generic algorithms, in particular on handling vertices that are only assumed to live in a vector space, not necessarily of finite dimension.
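As a sketch of what this genericity buys, assuming only the illustrative frank_wolfe_sketch helper from the first code example above: the same generic loop runs unchanged on matrix iterates once the linear minimization oracle returns structured extreme points, here a rank-one matrix for a nuclear-norm ball.

```julia
using LinearAlgebra

# LMO over a nuclear-norm ball: the extreme point minimizing <G, V>
# is a rank-one matrix built from the top singular pair of the gradient G.
function nuclear_norm_lmo(G::AbstractMatrix; radius=1.0)
    F = svd(G)
    u, v = F.U[:, 1], F.V[:, 1]
    return -radius * u * v'
end

# The generic loop only relies on vector-space operations (+ and scalar *),
# so it accepts matrix iterates without modification.
B = randn(30, 30)
X0 = nuclear_norm_lmo(zeros(30, 30))
X = frank_wolfe_sketch(X -> X - B, nuclear_norm_lmo, X0; iterations=200)
```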
Mathieu is a researcher in computational mathematics working at the Zuse Institute Berlin. His interests span mixed-integer and convex optimization, with applications in engineering and statistics.