[Submitted on 6 Nov 2025]
OrthoAdapt: Practical Gradient Orthogonalization for Transformer Optimization
Abstract: We present OrthoAdapt, a computationally efficient optimizer that combines adaptive learning rates with partial gradient orthogonalization. Through systematic evaluation on a 134M parameter transformer trained on FineWeb, OrthoAdapt achieves a statistically significant improvement over AdamW (4.821 ± 0.012 vs 4.927 ± 0.011, p < 0.01) with only 5% additional compute overhead. The method's simplicity and robustness make it suitable for production environments where small, reliable improvements are valued.
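The abstract does not specify how the partial orthogonalization is combined with the adaptive step, so the following is only a minimal sketch of the general idea. It assumes a Newton-Schulz iteration for orthogonalizing a gradient matrix (a standard SVD-free approach) and a hypothetical mixing coefficient `alpha` blending the orthogonalized gradient with an Adam-style adaptive direction; none of these specifics are taken from the paper.

```python
import numpy as np

def newton_schulz_orthogonalize(g, steps=5):
    # Approximate the nearest orthogonal matrix to g via the cubic
    # Newton-Schulz iteration X <- 1.5 X - 0.5 (X X^T) X, which pushes
    # all singular values toward 1 without computing an SVD.
    x = g / (np.linalg.norm(g) + 1e-7)  # Frobenius norm bounds the spectral norm
    for _ in range(steps):
        x = 1.5 * x - 0.5 * (x @ x.T) @ x
    return x

def orthoadapt_step(param, grad, m, v, lr=1e-3, beta1=0.9, beta2=0.999,
                    eps=1e-8, alpha=0.5):
    # Hypothetical "partial orthogonalization" update: blend an AdamW-style
    # adaptive direction with the orthogonalized gradient. alpha controls
    # the mix (alpha=0 reduces to plain Adam-style steps).
    m = beta1 * m + (1 - beta1) * grad          # first-moment estimate
    v = beta2 * v + (1 - beta2) * grad ** 2     # second-moment estimate
    adaptive = m / (np.sqrt(v) + eps)
    ortho = newton_schulz_orthogonalize(grad)
    update = (1 - alpha) * adaptive + alpha * ortho
    return param - lr * update, m, v
```

The roughly 5% overhead quoted in the abstract is plausible for a scheme like this, since each Newton-Schulz step costs a few matrix multiplies on top of the usual element-wise Adam bookkeeping, but the actual mechanism may differ.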
Submission history
[v1] Thu, 6 Nov 2025 07:46 UTC