Guillaume Dalle
Postdoctoral researcher at EPFL (Switzerland) in the INDY, IdePHICS and SPOC labs.
Currently working on graph machine learning.
Active member of the JuliaGraphs and JuliaDiff organizations.
Personal website: https://gdalle.github.io/
Sessions
Hidden Markov Models (HMMs) are a very useful statistical framework, but existing implementations often limit user creativity. With Julia, we can have it all: custom probability distributions, arbitrary types, numerical stability, automatic differentiation, exogenous controls... without giving up on performance!
This talk will introduce the package HiddenMarkovModels.jl and its main features, but also share useful lessons about leveraging abstraction in scientific software.
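For a taste of what this looks like in practice, here is a minimal sketch assuming the documented high-level API of HiddenMarkovModels.jl (the HMM constructor, rand, logdensityof, viterbi and baum_welch); the names of returned fields may differ between package versions:

```julia
using HiddenMarkovModels, Distributions

# Two hidden states with Gaussian emissions (any Distributions.jl object works)
init = [0.6, 0.4]                     # initial state probabilities
trans = [0.7 0.3; 0.2 0.8]            # transition matrix
dists = [Normal(-1.0), Normal(+1.0)]  # one emission distribution per state
hmm = HMM(init, trans, dists)

# Simulate a sequence, then run the classical HMM algorithms on it
sim = rand(hmm, 100)                       # assumed to return (; state_seq, obs_seq)
logdensityof(hmm, sim.obs_seq)             # loglikelihood via the forward algorithm
viterbi(hmm, sim.obs_seq)                  # most likely hidden state sequence
hmm_est, _ = baum_welch(hmm, sim.obs_seq)  # parameter estimation with EM
```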
Automatic Differentiation (AD) is the art of letting your computer work out derivatives for you. The Julia ecosystem provides many different AD tools, which can be overwhelming. This talk will give everyone the necessary information to answer two simple questions:
- As a developer, how do I make my functions differentiable?
- As a user, how do I differentiate through other people's functions?
Video: https://www.youtube.com/live/ZKt0tiG5ajw?t=19747s
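To make the two questions concrete, here is a small sketch on the user side; ForwardDiff and Zygote are picked purely as illustrative forward-mode and reverse-mode backends, while the talk surveys the broader ecosystem:

```julia
using ForwardDiff, Zygote

f(x) = sum(abs2, x) / 2          # a plain Julia function, nothing AD-specific
x = [1.0, 2.0, 3.0]

# Forward mode: propagates dual numbers alongside the primal computation
g_fwd = ForwardDiff.gradient(f, x)

# Reverse mode: records the computation and propagates adjoints backwards
g_rev = Zygote.gradient(f, x)[1]

g_fwd ≈ x && g_rev ≈ x           # both recover ∇f(x) = x
```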
Interfaces.jl is an attempt to solve Julia's lack of explicit interface declarations. It provides macros to define and implement interfaces concisely, and functions to test and check that they are implemented correctly.
Its subpackage BaseInterfaces.jl defines some common Base Julia interfaces using Interfaces.jl.
This talk will cover lessons learned while writing and using the package, and hopefully inspire some discussion on the future of interfaces in Julia more broadly.
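For readers unfamiliar with the package, the intended declare/implement/test workflow looks roughly like the sketch below; the exact macro arguments and condition formats are assumptions based on the package README and may not match the released API exactly:

```julia
using Interfaces

abstract type Animal end
function age end

# Declare an interface: a name, the type it constrains, and named conditions
@interface AnimalInterface Animal (
    mandatory = (
        age = "every animal has a non-negative age" => x -> age(x) >= 0,
    ),
    optional = (;),
) "An interface that animal types are expected to satisfy."

# Implement it for a concrete type, providing test objects for the conditions
struct Duck <: Animal
    age::Int
end
age(d::Duck) = d.age

@implements AnimalInterface Duck [Duck(1), Duck(2)]

# Check that the declared conditions actually hold for Duck
Interfaces.test(AnimalInterface, Duck)
```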