2025-10-02 – Robert Faure Amphitheater
Language: English
Recurrent neural networks (RNNs) are deep learning (DL) models for sequential data, with many variants developed over the last two decades. However, core DL libraries typically offer only a limited number of cell implementations. In this talk, I introduce RecurrentLayers.jl and LuxRecurrentLayers.jl, two libraries that extend Flux.jl and Lux.jl with over 30 additional RNN cells each, enabling broader experimentation and research.
Recurrent neural networks (RNNs) are widely used machine learning (ML) models for sequential data. By maintaining an internal state through recurrent connections, RNNs retain a dynamic memory of past time steps and can capture temporal dependencies. Over the years, many RNN variants have been proposed to improve learning dynamics, stability, and expressiveness. As a result, the landscape of recurrent models has grown increasingly rich and diverse.
However, these models are often scattered across one-off implementations in different frameworks and repositories, making comparison and experimentation difficult. The rise of transformers has further relegated RNNs to a secondary role, leading to reduced support and limited availability of diverse architectures.
As a solution, I introduce RecurrentLayers.jl and LuxRecurrentLayers.jl, two libraries that expand the RNN offerings of Flux.jl and Lux.jl. Each library provides over 30 recurrent cells as well as modular components for building custom architectures. Core RNN enhancements, such as multiplicative integration and independent recurrence, are implemented in a reusable way.
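To give a flavor of the intended workflow, here is a hedged sketch of how such a cell might slot into a standard Flux.jl model. The layer name `MGU` (minimal gated unit) and its constructor style are assumptions made for illustration; consult the package documentation for the actual exported cells and their signatures.

```julia
# Hypothetical sketch: a recurrent layer from RecurrentLayers.jl used as a
# drop-in replacement for Flux's built-in recurrent layers.
using Flux
using RecurrentLayers  # assumed import; provides the extra cells

input_size, hidden_size, output_size = 4, 16, 2

model = Chain(
    MGU(input_size => hidden_size),     # assumed name, Flux-style `in => out` constructor
    Dense(hidden_size => output_size),  # standard Flux readout layer
)

x = rand(Float32, input_size)  # one input sample; sequence handling follows Flux conventions
y = model(x)                   # forward pass through the recurrent stack
```

Because the cells follow the host framework's layer conventions, swapping one recurrent variant for another amounts to changing a single constructor, which is what makes broad architecture comparisons practical.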
In this talk, I will present an overview of the packages, demonstrate how they integrate with the existing deep learning ecosystem in Julia, and highlight the range of features they offer for research and experimentation. Finally, I will showcase how the libraries facilitate easy cross-pollination of ideas from different research papers, enabling the rapid development of novel and customized RNN architectures.
I’m a postdoc at the Max Planck Institute for the Physics of Complex Systems in Dresden, with research interests in machine learning, chaos, and nonlinear time series. My current work focuses on using ML to forecast chaotic dynamics.
I’ve been a Julia user since 2019 and have contributed to the ecosystem with packages like ReservoirComputing.jl and CellularAutomata.jl.
You can find out more about my work on my website.