JuliaCon 2020 (times are in UTC)

Climate models in 16bit: Arithmetic and algorithmic challenges
07-31, 13:20–13:30 (UTC), Purple Track

Powered by Julia’s type flexibility, various posit and float arithmetics are tested in ShallowWaters.jl to explore how climate models can be accelerated on modern computing architectures with 16-bit arithmetic, using either deterministic rounding or StochasticRounding.jl. Algorithmic bottlenecks at low precision are identified, and information theory is used to find the best number format for a given algorithm, which led to the development of Sonums.jl – a number format that learns from data.
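
As a minimal sketch of stochastic rounding (illustrative only, not code from the talk, and assuming StochasticRounding.jl provides a Float16sr type whose arithmetic rounds stochastically), the example below shows the exact-in-expectation property: an increment too small to survive deterministic Float16 rounding is recovered on average.

    using StochasticRounding   # assumed to export Float16sr
    using Statistics

    x = Float16sr(1.0)
    h = Float16sr(1f-4)        # smaller than half of eps(Float16(1.0)) ≈ 0.00098

    # Deterministic Float16 drops the increment every time:
    Float16(1.0) + Float16(1f-4)              # == Float16(1.0)

    # Stochastic rounding recovers it in expectation over many additions:
    mean(Float64(x + h) for _ in 1:100_000)   # ≈ 1.0001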


The need for high-precision calculations with 64-bit floating-point numbers in weather and climate models has been questioned. Lower-precision numbers can accelerate simulations and are increasingly supported by modern computing architectures. Posit numbers, a recently proposed alternative to floating-point numbers, claim smaller arithmetic rounding errors in many applications. As a standardized posit processor does not exist yet, we emulate posit arithmetic with SoftPosit.jl on a conventional processor. Julia’s type flexibility makes it easy to test the benefits of posits compared to floats at 16 bit in the Lorenz system and in ShallowWaters.jl. We show that forecasts based on posits are clearly more accurate than those based on floats. Mixing 16-bit arithmetic with 32-bit arithmetic for critical computations strongly reduces errors and is promising for present-day float-based hardware. Reduced-precision communication of boundary values at 16 or 8 bits, encoded as floats or posits, introduces negligible errors, presenting a perspective for reduced data communication within a computer cluster. Stochastic rounding modes, which are exact in expectation, are found to improve simulations at 16 bit and to mimic uncertainties. Algorithmic bottlenecks at low precision are identified using Sherlogs.jl to facilitate the transition towards 16-bit arithmetic. We analyse algorithms from an information theory perspective to find the best number format for a given application. This approach led to the development of Sonums.jl, a number format that is optimal once trained on data to minimize the rounding error. The results promote the potential of 16-bit formats for at least parts of complex weather and climate models, where rounding errors would be entirely masked by initial-condition, model or discretization errors.
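
A minimal sketch of how Julia’s type flexibility enables such experiments (assuming SoftPosit.jl exports a Posit16 type with standard arithmetic; this is illustrative and not the talk’s code): a single Euler step of the Lorenz system written generically over the number format, so the same code runs with floats or emulated posits.

    using SoftPosit   # assumed to provide the Posit16 type

    # One Euler step of the Lorenz system, generic over the number type T.
    function lorenz_step(x::T, y::T, z::T;
                         σ=T(10.0), ρ=T(28.0), β=T(8/3), dt=T(0.01)) where T
        dx = σ*(y - x)
        dy = x*(ρ - z) - y
        dz = x*y - β*z
        return x + dt*dx, y + dt*dy, z + dt*dz
    end

    # The same code runs in 64-bit floats, 16-bit floats or emulated 16-bit posits:
    lorenz_step(1.0, 2.0, 3.0)
    lorenz_step(Float16(1.0), Float16(2.0), Float16(3.0))
    lorenz_step(Posit16(1.0), Posit16(2.0), Posit16(3.0))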

Co-authored by
- Peter Dueben, ECMWF, Reading, UK
- Tim Palmer, University of Oxford, UK
