JuliaCon 2022 (Times are UTC)

Lux.jl: Explicit Parameterization of Neural Networks in Julia
07/28, 13:00–13:10 (UTC), Red

Julia already has well-established neural network frameworks such as Flux and Knet. However, certain design elements of these frameworks -- coupling models with their parameters, and relying on internal mutation -- make them less compiler- and user-friendly. Changing the respective frameworks to address these problems would be too disruptive for their users. To address these challenges, we designed Lux, a new neural network framework.


Lux is a neural network framework built entirely from pure functions, making it both compiler- and automatic-differentiation-friendly. Relying on a straightforward pure-function API means there are no reference issues to debug, compilers can optimize the code as much as possible, and the framework is compatible with Symbolics, XLA, etc. without any tricks.
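
Concretely, the explicit-parameterization style described above looks roughly like the following minimal sketch. The layer sizes and data here are illustrative, and exact signatures may differ between Lux versions:

    using Lux, Random

    # The model is a pure description of the architecture; it holds no
    # parameters or state of its own.
    model = Chain(Dense(2 => 16, tanh), Dense(16 => 1))

    rng = Random.default_rng()
    # Parameters and state are created explicitly, separate from the model.
    ps, st = Lux.setup(rng, model)

    x = rand(rng, Float32, 2, 8)
    # The forward pass is a pure function of (x, ps, st); updated state is
    # returned rather than mutated in place.
    y, st_new = model(x, ps, st)

Because nothing is mutated, the same model can be re-applied to transformed parameters (for example, by an AD system or Symbolics) without any tricks.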

Repository: https://github.com/avik-pal/ExplicitFluxLayers.jl/