Differentiable Programming Tensor Networks
2019-07-25, Room 349

A package for einsum, together with differentiable tensor network algorithms built on top of it. This talk discusses why we need automatically differentiable tensor networks and how to achieve this goal.


A tensor network is a contraction of tensors; it has wide applications in physics, big data, and machine learning. It can be used to represent a quantum wave-function ansatz, compress data, and model probability distributions. Supporting automatic differentiation brings hope to many difficult problems, such as optimizing projected entangled pair states. We provide a unified, dispatchable interface for einsum as the middleware connecting tensor contraction libraries and tensor network applications, together with thorough support for back-propagation through linear algebra functions. Based on these supporting libraries, we demonstrate several tensor network algorithms as examples.
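The core idea can be sketched in a few lines with an off-the-shelf autodiff framework. The sketch below uses JAX to differentiate through an einsum contraction and through a linear algebra routine (SVD); it illustrates the concept only and does not use the package's own interface, so all names here are assumptions.

```python
import jax
import jax.numpy as jnp

# Toy tensor network: a ring of three matrices, contracted to a scalar.
# The scalar plays the role of a physical quantity (e.g. a free energy);
# a differentiable tensor network library must deliver its gradient with
# respect to every tensor. All names here are illustrative.

def contract(a, b, c):
    return jnp.einsum('ij,jk,ki->', a, b, c)  # trace of a @ b @ c

key = jax.random.PRNGKey(0)
a, b, c = (jax.random.normal(k, (4, 4)) for k in jax.random.split(key, 3))

# Reverse-mode AD propagates gradients through the einsum contraction.
value, grads = jax.value_and_grad(contract, argnums=(0, 1, 2))(a, b, c)
print(value, grads[0].shape)  # scalar value and a (4, 4) gradient tensor

# Back-propagation through a linear algebra function: the gradient of the
# sum of singular values (the nuclear norm) with respect to the matrix.
nuclear_norm = lambda m: jnp.sum(jnp.linalg.svd(m, compute_uv=False))
g = jax.grad(nuclear_norm)(a)
```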

References:
* Differentiable Programming Tensor Networks, Hai-Jun Liao, Jin-Guo Liu, Lei Wang, and Tao Xiang [under preparation]
* Unsupervised Generative Modeling Using Matrix Product States, Zhao-Yu Han, Jun Wang, Heng Fan, Lei Wang, and Pan Zhang, Phys. Rev. X 8, 031012 (2018)


Co-authors

Andreas Peter