JuliaCon 2024

Randomized Diagonal Estimation in Julia
07-12, 15:40–15:50 (Europe/Amsterdam), Else (1.3)

Implicit diagonal estimation is a long-standing problem concerned with approximating the diagonal of a matrix (or matrix function) that can only be accessed through matrix-vector products. We present RandomizedDiagonalEstimation.jl, which implements existing and novel randomized methods for accomplishing this task in a scalable way. The practical applications of the provided algorithms extend across diverse domains such as network science, materials science, optimization, and machine learning.


RandomizedDiagonalEstimation.jl provides the first comprehensive toolkit of methods for implicit diagonal estimation. It features established algorithms for approximating the diagonal of large matrices when one only has access to matrix-vector products. This problem is of interest in various fields of application; examples include the estimation of subgraph centralities in network science, the approximation of charge densities in materials science, and efficient second-order optimization methods in machine learning. The package includes the Girard-Hutchinson estimator and Diag++, recently proposed by Baston and Nakatsukasa. In addition, we propose and implement novel extensions tailored to the diagonal of matrix functions, such as the matrix exponential and the matrix inverse, using polynomial interpolation and Krylov methods. The package integrates well with the Julia ecosystem and makes it easy to add further algorithms in the future.
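To illustrate the matrix-vector-product access model these methods assume, the following is a minimal sketch of the classical Girard-Hutchinson diagonal estimator in plain Julia. It is meant only as an illustration of the underlying idea; the function name girard_hutchinson_diag and its arguments are hypothetical and do not reflect the package's actual API.

using LinearAlgebra, Random

# Sketch of the Girard-Hutchinson diagonal estimator: the matrix is
# accessed only through the closure matvec(v) = A*v, and the diagonal
# is approximated from m random Rademacher probe vectors.
function girard_hutchinson_diag(matvec, n; m = 100, rng = Random.default_rng())
    num = zeros(n)                    # accumulates v .* (A*v)
    den = zeros(n)                    # accumulates v .* v
    for _ in 1:m
        v = rand(rng, [-1.0, 1.0], n) # Rademacher probe vector
        num .+= v .* matvec(v)
        den .+= v .* v
    end
    return num ./ den                 # elementwise estimate of diag(A)
end

# Usage: approximate the diagonal of a symmetric test matrix while only
# touching it through matrix-vector products.
A = let B = randn(1000, 1000); (B + B') / 2 end
d_est = girard_hutchinson_diag(v -> A * v, size(A, 1); m = 500)
@show norm(d_est - diag(A)) / norm(diag(A))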

See also: Slides RandomizedDiagonalEstimation.jl (8.3 MB)

I am an ELLIS doctoral student at the University of Tübingen and the Bosch Center for Artificial Intelligence. The work on randomized diagonal estimation presented at JuliaCon 2024 was carried out in the course of my master's thesis at TU Munich and KTH Stockholm (supervised by Claudio Mayrink Verdun, Felix Krahmer, and Elias Jarlebring).