
ASGO: Adaptive Structured Gradient Optimization

Abstract

Training deep neural networks is a structured optimization problem, because the parameters are naturally represented by matrices and tensors rather than by vectors. Under this structural representation, it has been widely observed that gradients are low-rank and Hessians are approximately block diagonal. These structured properties are crucial for designing efficient optimization algorithms, but are not utilized by many current popular optimizers like Adam. In this paper, we present a novel optimization algorithm ASGO that capitalizes on these properties by employing a preconditioner that is adaptively updated using structured gradients. By a fine-grained theoretical analysis, ASGO is proven to achieve superior convergence rates compared to existing structured gradient methods. Based on this convergence theory, we further demonstrate that ASGO can benefit from low-rank gradients and block diagonal Hessians. We also discuss practical modifications of ASGO and empirically verify ASGO's effectiveness on language model tasks. Code is available at https://github.com/infinity-stars/ASGO.
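The abstract's central mechanism, a preconditioner accumulated from matrix-shaped gradients rather than from a flattened parameter vector, can be illustrated with a minimal NumPy sketch. This is a hedged illustration in that spirit, not the authors' method: the function name `asgo_like_step`, the hyperparameters, and the eigendecomposition-based inverse square root are all hypothetical choices of this sketch; the actual implementation lives in the linked repository.

```python
import numpy as np

def asgo_like_step(W, G, V, lr=1e-2, eps=1e-8):
    """One hypothetical preconditioned step on a matrix parameter W (m x n).

    Accumulates the structured statistic V += G @ G.T (m x m) and
    preconditions the gradient with the inverse square root of V, applied
    on one side only. All names here are illustrative, not the paper's API.
    """
    V += G @ G.T                           # adapt preconditioner from the full gradient matrix
    d, Q = np.linalg.eigh(V)               # eigendecomposition of the symmetric PSD statistic
    inv_root = Q @ np.diag(1.0 / (np.sqrt(np.maximum(d, 0.0)) + eps)) @ Q.T
    W -= lr * inv_root @ G                 # one-sided preconditioned update
    return W, V

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    W = rng.standard_normal((4, 8))
    V = np.zeros((4, 4))
    for _ in range(10):
        G = rng.standard_normal((4, 8)) * 0.1   # stand-in gradient
        W, V = asgo_like_step(W, G, V)
```

Note the cost argument such structured preconditioners rely on: for an m x n parameter, V is only m x m, whereas a full-matrix preconditioner over the flattened vector would be mn x mn, which is why matrix-aware statistics can exploit low-rank gradient structure at modest cost.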
Authors (7)
Kang An
Yuxing Liu
Rui Pan
Yi Ren
Shiqian Ma
Donald Goldfarb
+1 more
Submitted: March 26, 2025
arXiv Category: cs.LG

Key Contributions

- Proposes ASGO, an optimizer whose preconditioner is adaptively updated from structured (matrix-form) gradients rather than flattened vectors.
- Proves, via a fine-grained theoretical analysis, convergence rates superior to existing structured gradient methods.
- Shows that ASGO provably benefits from low-rank gradients and approximately block diagonal Hessians.
- Discusses practical modifications of ASGO and empirically verifies its effectiveness on language model tasks.