Sun 18 Jun 2023 09:20 - 09:35 at Magnolia 4 - CTSTA: Session 1

Data layout is becoming a central ingredient in achieving high-performance code. A layout organized according to its desired access order can significantly reduce data movement, but the computation must then be modified to match the layout's order. We present dlcomp, a code generator for tensor contraction computations. It takes as input a tensor computation in Einstein notation and a data layout description in the sparse polyhedral format. Using a combination of polyhedral scanning and synthesis, it generates code that iterates over the layout of one tensor and matches each nonzero to the corresponding element(s) in the other tensors using an optimized find operation. We briefly discuss how this approach can be generalized beyond sparse tensors.
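To make the iterate-and-find structure concrete, here is a hypothetical sketch of the kind of code such a generator might emit for the contraction c(i) = A(i,j) * b(j): it walks the nonzeros of A in its stored (coordinate) layout and locates each matching entry of b with a binary-search find. The names, layouts, and the `find` helper are illustrative assumptions, not dlcomp's actual output or API.

```python
from bisect import bisect_left
from collections import defaultdict

# A: list of (i, j, value) nonzeros in coordinate layout (illustrative).
A_nnz = [(0, 1, 2.0), (1, 0, 3.0), (1, 2, 4.0)]
# b: sorted coordinate array plus aligned values (illustrative).
b_crd = [1, 2]        # sorted nonzero coordinates of b
b_val = [10.0, 5.0]   # values aligned with b_crd

def find(crd, j):
    """Binary-search find: index of coordinate j in crd, or None if absent."""
    p = bisect_left(crd, j)
    return p if p < len(crd) and crd[p] == j else None

# Iterate over the layout of one tensor (A) and match each nonzero
# against the other tensor (b) with the find operation.
c = defaultdict(float)
for i, j, a in A_nnz:
    p = find(b_crd, j)
    if p is not None:
        c[i] += a * b_val[p]

print(dict(c))  # -> {0: 20.0, 1: 20.0}
```

The key point the sketch illustrates is that the loop structure follows one tensor's layout, while accesses into the other tensors are resolved by a search whose cost depends on their layouts, which is where layout-aware optimization of the find operation pays off.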

Sun 18 Jun

Displayed time zone: Eastern Time (US & Canada)

09:00 - 11:00
09:00 (5m)  Day opening: Introduction
            Fredrik Kjolstad, Stanford University
09:05 (15m) Talk: Software and Hardware for Sparse ML
            Fredrik Kjolstad, Stanford University
09:20 (15m) Talk: Integrating Data Layout into Compilers and Code Generators
            Mary Hall, University of Utah
09:35 (15m) Talk: Tackling the challenges of high-performance graph analytics at compiler level
            Gokcen Kestor, Pacific Northwest National Laboratory
09:50 (10m) Panel: Discussion
10:00 (5m)  Break / Social
10:05 (15m) Talk: Challenges and Opportunities for Sparse Compilers in LLM
            Zihao Ye, University of Washington
10:20 (15m) Talk: The Sparse Abstract Machine
            Olivia Hsu, Stanford University
10:35 (15m) Talk: TeAAL: A Declarative Framework for Modeling Sparse Tensor Accelerators
            Nandeeka Nayak, University of Illinois at Urbana-Champaign
10:50 (10m) Panel: Discussion