JuliaCon 2022 (Times are UTC)

Simple Chains: Fast CPU Neural Networks
07-28, 17:40–17:50 (UTC), Blue

SimpleChains is an open source pure-Julia machine learning library developed by PumasAI and JuliaComputing in collaboration with Roche and the University of Maryland, Baltimore.
It is specialized for relatively small models and NeuralODEs, attaining best-in-class performance on these problems. The performance advantage remains significant when scaling to tens of thousands of parameters: on CPU it is still more than 5x faster than Flux or PyTorch, and at these sizes it even outperforms GPU implementations.

SimpleChains is a pure-Julia library that is simple in two ways:
1. All kernels are simple loops (it leverages LoopVectorization.jl for performance).
2. It only supports simple (feedforward) neural networks.

It additionally manages memory manually, and currently relies on hand-written pullback definitions.
In combination, these allow it to train an MNIST example 50x faster than Flux on a 10980XE.
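To make the design concrete, here is a minimal sketch of defining and differentiating a small feedforward network with SimpleChains. It is based on the library's documented API (`SimpleChain`, `TurboDense`, `init_params`, `add_loss`, `valgrad!`); exact names and signatures may differ across versions, and `xtrain`/`ytrain` are assumed to be your data and labels.

```julia
using SimpleChains

# A small feedforward network: 784 inputs -> 32 tanh units -> 10 logits.
# `static(784)` fixes the input size at compile time, which lets
# SimpleChains preallocate all memory and specialize its loop kernels.
model = SimpleChain(
    static(784),
    TurboDense(tanh, 32),
    TurboDense(identity, 10),
)

p = SimpleChains.init_params(model)   # flat parameter vector

# Attach a loss (here cross-entropy on integer labels `ytrain`),
# then evaluate the loss and write the gradient into `g` in one pass
# using the library's hand-written pullbacks.
loss = SimpleChains.add_loss(model, LogitCrossEntropyLoss(ytrain))
g = similar(p)
SimpleChains.valgrad!(g, loss, xtrain, p)
```

Note how the parameters live in a single flat vector rather than per-layer arrays; combined with the statically known input size, this is what enables the manual memory management mentioned above.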

This talk will focus on introducing the library, showing off a few examples, and explaining some of the "why" behind its performance.

Chris Elrod is a frequent commenter on the Julia Discourse, Slack, and Zulip, as well as a contributor to the ecosystem, known in particular for LoopVectorization.jl and JuliaSIMD.
