JuliaCon Local Paris 2025

Kernels without borders: Parallel programming with KernelAbstractions.jl
Language: English

Modern computing relies on parallelism, from GPUs accelerating AI workloads to multi-core CPUs in every laptop. But writing code that harnesses this power across different hardware remains challenging, as each vendor ships its own programming model. In this talk, we'll explore how KernelAbstractions.jl brings GPU-style programming to Julia, allowing you to write parallel kernels once and run them on CPUs and GPUs alike.
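
To give a flavour of the programming model (a minimal sketch, not code taken from the talk itself): a kernel is written once with KernelAbstractions.jl's @kernel and @index macros, then launched on a concrete backend. The vadd! kernel below is an illustrative example of my own; the macros, the CPU() backend, get_backend, and synchronize are part of the package's documented API, and GPU backends such as CUDA.jl's CUDABackend() can be swapped in with device arrays.

    using KernelAbstractions

    # Element-wise addition written as a portable kernel.
    @kernel function vadd!(c, a, b)
        i = @index(Global)            # global linear index of this work-item
        @inbounds c[i] = a[i] + b[i]
    end

    a = rand(Float32, 1024)
    b = rand(Float32, 1024)
    c = similar(a)

    backend = get_backend(c)          # CPU() for a plain Array
    vadd!(backend)(c, a, b; ndrange = length(c))
    KernelAbstractions.synchronize(backend)

Running the same kernel on a GPU only requires allocating the arrays with the corresponding array type (e.g. CuArray), since the backend is derived from the arrays themselves.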


Tim Besard is a software engineer at JuliaHub, where he leads the development of GPU support for the Julia programming language. He holds a Ph.D. in computer science engineering from Ghent University, Belgium, and has been a key contributor to Julia's GPU ecosystem since 2014. Tim maintains several foundational GPU packages including CUDA.jl, GPUArrays.jl, GPUCompiler.jl, and LLVM.jl, which together form the backbone of GPU computing in Julia.