JuliaCon 2020

WaspNet.jl, a Julian Spiking Neural Network Simulator

We present WaspNet.jl, a framework for spiking neural network (SNN) simulations in computational neuroscience and machine learning. WaspNet.jl provides efficient implementations of SNN primitives and common neuronal models. The framework is agnostic to the choice of neuron model, supporting both spiking and non-spiking neurons and allowing for novel hybrid topologies. We exhibit GPU acceleration with CuArrays.jl and demonstrate the capabilities of WaspNet.jl with a variety of example experiments.


In computational neuroscience, spiking neural network (SNN) simulations are ubiquitous. SNNs can be used to reproduce biological behavior and understand brain function, to explore neural networks as dynamical systems, or to sparsely encode and solve machine learning problems. Julia's inherent computational efficiency, flexible type system, and rich metaprogramming ecosystem leave it well positioned to act as a preferred tool in this domain. Our contribution, WaspNet.jl, pairs Julia with the problem of clock-driven SNN simulation.

WaspNet.jl is a lightweight neural network (NN) framework meant for fast, transparent simulations of NNs. The framework provides three levels of abstraction: the neuron; the layer, or population; and the network.

Neurons are treated as black boxes relating an input signal to an output signal; for a spiking neuron, this means simulating a differential equation, although non-spiking neuronal models are also supported. Layers, or populations, are collections of neurons that orchestrate the simulation of their constituent neurons and route inputs to their destinations. Networks comprise collections of populations and map signals between them.
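To make the neuron-level abstraction concrete, the following is a minimal, self-contained sketch of how a leaky integrate-and-fire (LIF) neuron can be stepped with forward Euler. The type, field names, and `advance` function are illustrative placeholders invented for this example, not WaspNet.jl's actual interface.

```julia
# A minimal leaky integrate-and-fire (LIF) neuron stepped with forward Euler.
# All names here (SimpleLIF, advance) are illustrative, not WaspNet.jl's API.
struct SimpleLIF{T<:Real}
    τ::T       # membrane time constant
    v_rest::T  # resting potential
    v_th::T    # spiking threshold
    v_reset::T # post-spike reset potential
    R::T       # membrane resistance
end

# Advance the membrane potential v by one step of size dt under input current I.
# Returns the new potential and a Bool indicating whether the neuron spiked.
function advance(n::SimpleLIF, v, I, dt)
    v += dt / n.τ * (-(v - n.v_rest) + n.R * I)
    spiked = v >= n.v_th
    return (spiked ? n.v_reset : v), spiked
end

# Drive the neuron with a constant suprathreshold current and record spikes.
neuron = SimpleLIF(20.0, 0.0, 1.0, 0.0, 1.0)
spikes = let v = 0.0, out = Bool[]
    for _ in 1:1000            # 1000 steps of dt = 0.1
        v, s = advance(neuron, v, 1.2, 0.1)
        push!(out, s)
    end
    out
end
```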

Networks and layers are agnostic to the underlying neuronal models and provide the primary utility of WaspNet.jl. By leveraging Julia’s type system, we can write generic code for networks and layers that guarantees low overhead for simulations, allowing users to focus their efforts on neuronal models and network topologies.
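As an illustration of this genericity, consider the hypothetical sketch below; the `DemoLayer` type and `update_layer!` function are invented for exposition (reusing the `SimpleLIF` neuron and `advance` function from the previous sketch) and are not WaspNet.jl's implementation. The layer code never mentions a concrete neuron model, yet Julia compiles a specialized, low-overhead method for each neuron type it encounters.

```julia
# Hypothetical model-agnostic layer code via multiple dispatch; names are
# invented for exposition and are not WaspNet.jl's actual types.
struct DemoLayer{N, T<:Real}
    neurons::Vector{N} # any neuron type implementing `advance`
    state::Vector{T}   # per-neuron internal state (e.g. membrane potential)
    W::Matrix{T}       # input weight matrix
end

# Generic layer update: weight the incoming signal, then delegate each neuron's
# dynamics to whatever `advance` method its concrete type defines.
function update_layer!(layer::DemoLayer, input, dt)
    drive = layer.W * input
    outputs = similar(layer.state)
    for i in eachindex(layer.neurons)
        layer.state[i], outputs[i] = advance(layer.neurons[i], layer.state[i], drive[i], dt)
    end
    return outputs
end

# Works unchanged for any neuron type defining `advance`, e.g. SimpleLIF above:
layer = DemoLayer([SimpleLIF(20.0, 0.0, 1.0, 0.0, 1.0) for _ in 1:4],
                  zeros(4), randn(4, 2))
out = update_layer!(layer, [0.5, 0.5], 0.1)
```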

The structure of WaspNet.jl and the temporal sparseness of SNN simulation lend themselves well to parallelization. If no GPU is available, parallelizing the computations of different layers across multiple cores is a natural extension; if a GPU is available, WaspNet.jl integrates easily with CuArrays.jl to further accelerate calculations.
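As a sketch of the GPU path (this assumes a CUDA-capable device, and the matrix sizes are arbitrary), the dense weight products at the heart of each layer update can be offloaded by converting weights and signals to device arrays with CuArrays.jl's `cu`:

```julia
using CuArrays  # the CUDA array package for Julia at the time (since merged into CUDA.jl)

# Move a layer's weight matrix and an input signal to the GPU; the dense
# matrix-vector product then runs on the device.
W = randn(Float32, 1024, 1024)  # layer weight matrix (Float32 suits GPUs well)
x = rand(Float32, 1024)         # incoming signal

W_gpu = cu(W)                   # copy to device memory
x_gpu = cu(x)
drive = W_gpu * x_gpu           # computed on the GPU, stays in device memory
drive_host = Array(drive)       # copy back to the host only when needed
```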

To showcase the capabilities of WaspNet.jl, we exhibit benchmarks and NN experiments performed with the library. These experiments demonstrate the flexibility of WaspNet.jl in integrating distinct neuron types (spiking and non-spiking), topologies (feed-forward and recurrent), and acceleration techniques, all from within the Julia language.

As of this writing, there is no actively developed, well-documented, fully featured SNN simulator for Julia. By developing WaspNet.jl, we hope to make a meaningful contribution to the Julia, machine learning, and computational neuroscience communities.