Automatic Differentiation for Sparse Tensors (Virtual)
Sparse tensors are prevalent in many data-intensive applications. However, existing automatic differentiation (AD) frameworks are tailored to dense tensors, which makes it challenging to compute gradients through sparse tensor operations efficiently: irregular sparsity patterns can incur substantial memory and computational overheads. We propose a novel framework that enables efficient AD of sparse tensors. The key aspects of our work include a compilation pipeline leveraging two intermediate DSLs with AD-agnostic domain-specific optimizations, followed by efficient C++ code generation. Our framework outperforms state-of-the-art alternatives across a variety of synthetic and real-world sparse tensor datasets.
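As a rough illustration of the problem the abstract describes, the sketch below (a minimal hand-written C++ example, not the framework's actual generated code; the CSR struct and function names are illustrative) differentiates one sparse operation, a CSR matrix-vector product, in reverse mode. It shows the property the talk targets: the backward pass can reuse the sparsity structure, so the gradient costs O(nnz) rather than O(rows × cols), which a dense-tensor AD framework would not exploit.

```cpp
#include <cstddef>
#include <vector>

// CSR sparse matrix: row pointers, column indices, nonzero values.
// (Illustrative layout; not the paper's internal representation.)
struct CSR {
    std::vector<std::size_t> rowptr; // size = rows + 1
    std::vector<std::size_t> col;    // size = nnz
    std::vector<double> val;         // size = nnz
};

// Forward pass: y = A * x, touching only the stored nonzeros.
std::vector<double> spmv(const CSR& A, const std::vector<double>& x) {
    std::vector<double> y(A.rowptr.size() - 1, 0.0);
    for (std::size_t i = 0; i + 1 < A.rowptr.size(); ++i)
        for (std::size_t k = A.rowptr[i]; k < A.rowptr[i + 1]; ++k)
            y[i] += A.val[k] * x[A.col[k]];
    return y;
}

// Reverse-mode backward pass: given the upstream gradient dL/dy,
// accumulate dL/dx = A^T * (dL/dy). The same CSR traversal is reused,
// so gradient computation stays proportional to nnz(A).
std::vector<double> spmv_grad_x(const CSR& A, const std::vector<double>& dy,
                                std::size_t ncols) {
    std::vector<double> dx(ncols, 0.0);
    for (std::size_t i = 0; i + 1 < A.rowptr.size(); ++i)
        for (std::size_t k = A.rowptr[i]; k < A.rowptr[i + 1]; ++k)
            dx[A.col[k]] += A.val[k] * dy[i];
    return dx;
}
```

A naive dense-AD treatment would instead materialize A as a rows × cols array and differentiate every entry, which is exactly the memory and compute overhead the abstract attributes to irregular sparsity patterns.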
Sun 18 Jun (times shown in Eastern Time, US & Canada)
11:20 - 12:30 | CTSTA session

11:20 | 15m Talk  | Accelerating Sparse Matrix Computations with Code Specialization | Maryam Mehri Dehnavi (University of Toronto)
11:35 | 15m Talk  | A General Distributed Framework for Contraction of a Sparse Tensor with a Tensor Network | Raghavendra Kanakagiri (University of Illinois Urbana-Champaign)
11:50 | 15m Talk  | Automatic Differentiation for Sparse Tensors (Virtual) | Amir Shaikhha (University of Edinburgh)
12:05 | 15m Talk  | Compiler Support for Structured Data | Saman Amarasinghe (Massachusetts Institute of Technology)
12:20 | 10m Panel | Discussion