Presentation

SanQus: Staleness and Quantization-Aware Full-Graph Decentralized Training in GNNs
Description
Graph neural networks (GNNs) have demonstrated significant success in modeling graphs; however, they struggle to scale efficiently to large graphs. To address this, we propose the SanQus system, which advances our previous work, Sancus. SanQus reduces expensive communication among distributed workers through staleness- and quantization-aware broadcasting: it manages embedding staleness, skips unnecessary broadcasts, and treats decentralized GNN processing as sequential matrix operations. To further reduce communication, SanQus caches historical embeddings and quantizes the broadcasts it does perform. Theoretically, SanQus enjoys bounded approximation error and an optimal convergence rate. Extensive experiments on large graphs with common GNN models show that SanQus reduces communication by up to 86% and triples throughput without sacrificing accuracy, outperforming state-of-the-art systems.
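The abstract describes the mechanism only at a high level. The sketch below is a minimal, hypothetical Python/NumPy illustration of the two ideas it names: skipping a broadcast while a cached historical embedding is still fresh and close to the current one, and quantizing the embeddings that are actually sent. All names (StalenessAwareBroadcaster, maybe_broadcast, quantize_int8) and the specific thresholds are assumptions for illustration, not taken from the SanQus implementation.

```python
from typing import Optional

import numpy as np

# Hypothetical knobs; the actual staleness bound and skip rule used by
# SanQus are not given in the abstract.
STALENESS_BOUND = 5   # max rounds a cached embedding may be reused
TOLERANCE = 0.1       # relative drift below which a broadcast is skipped


def quantize_int8(x: np.ndarray) -> bytes:
    """Uniform 8-bit quantization: a float32 scale followed by int8 values."""
    scale = float(np.abs(x).max()) / 127.0
    if scale == 0.0:
        scale = 1.0
    q = np.round(x / scale).astype(np.int8)
    return np.float32(scale).tobytes() + q.tobytes()


class StalenessAwareBroadcaster:
    """Per-worker cache of the last embedding actually broadcast."""

    def __init__(self) -> None:
        self.cached: Optional[np.ndarray] = None
        self.age = 0  # rounds the cached embedding has been reused

    def maybe_broadcast(self, embedding: np.ndarray) -> Optional[bytes]:
        # Reuse the cached (stale) embedding while it is fresh enough
        # and has not drifted too far from the current embedding.
        if self.cached is not None and self.age < STALENESS_BOUND:
            drift = np.linalg.norm(embedding - self.cached) / (
                np.linalg.norm(self.cached) + 1e-12
            )
            if drift < TOLERANCE:
                self.age += 1
                return None  # skip: peers keep using their cached copy

        # Otherwise refresh the cache and send a quantized payload.
        self.cached = embedding.copy()
        self.age = 0
        return quantize_int8(embedding)
```

A worker would call maybe_broadcast once per training round; a None result means no bytes go on the wire and peers keep aggregating with the historical embedding, which is where the claimed communication savings come from.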
Event Type
ACM Student Research Competition: Graduate Poster
ACM Student Research Competition: Undergraduate Poster
Doctoral Showcase
Posters
Time
Tuesday, 19 November 2024, 12pm - 5pm EST
Location
B302-B305
Registration Categories
TP
XO/EX