[Submitted on 6 Nov 2025]
GeoAdam: Geometric Adaptive Momentum for Transformer Optimization
Abstract: We present GeoAdam, a novel optimizer that combines layer-specific adaptation with geometric orthogonalization for transformer language models. On the FineWeb benchmark with a 134M-parameter Qwen architecture, GeoAdam achieves a validation loss of 3.924, a 20.3% improvement over AdamW (4.9266) and competitive with state-of-the-art methods. Our key innovations are: (1) layer-wise adaptive learning rates based on parameter roles, (2) a geometric orthogonalization procedure for attention layers, and (3) warmup scheduling. Ablation studies show that the orthogonalization procedure contributes a 0.3-point reduction in validation loss and layer-specific adaptation a further 0.5 points. The method keeps memory overhead moderate (39.5 GB vs. AdamW's 31.5 GB, roughly 25% more) while converging faster than AdamW.
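The abstract gives no pseudocode for the update rule, but the three ingredients it names can be illustrated. Below is a minimal sketch, assuming Adam-style first and second moments, a Newton-Schulz iteration for the orthogonalization step (a common choice, e.g. in Muon, but not confirmed by this abstract), and per-group multipliers for the layer-wise rates. The names `GeoAdamSketch`, `role_scale`, and `orthogonalize` are hypothetical, not from the paper.

```python
import torch

def newton_schulz_orthogonalize(G, steps=5, eps=1e-7):
    # Approximately orthogonalize a 2D update matrix via Newton-Schulz
    # iteration. This specific procedure is an assumption; the abstract
    # does not describe the orthogonalization it uses.
    X = G / (G.norm() + eps)            # scale so the iteration converges
    a, b, c = 3.4445, -4.7750, 2.0315   # quintic coefficients (as in Muon)
    transposed = X.shape[0] > X.shape[1]
    if transposed:
        X = X.T                         # keep the Gram matrix small
    for _ in range(steps):
        A = X @ X.T
        X = a * X + (b * A + c * (A @ A)) @ X
    return X.T if transposed else X

class GeoAdamSketch(torch.optim.Optimizer):
    """Hypothetical reconstruction of the update described in the abstract:
    Adam-style moments, a layer-role learning-rate multiplier, and
    orthogonalized updates for attention weight matrices."""

    def __init__(self, params, lr=1e-3, betas=(0.9, 0.95), eps=1e-8,
                 weight_decay=0.01):
        defaults = dict(lr=lr, betas=betas, eps=eps,
                        weight_decay=weight_decay,
                        role_scale=1.0,        # layer-wise multiplier (assumed)
                        orthogonalize=False)   # True for attention matrices (assumed)
        super().__init__(params, defaults)

    @torch.no_grad()
    def step(self):
        for group in self.param_groups:
            beta1, beta2 = group["betas"]
            for p in group["params"]:
                if p.grad is None:
                    continue
                state = self.state[p]
                if not state:
                    state["step"] = 0
                    state["m"] = torch.zeros_like(p)
                    state["v"] = torch.zeros_like(p)
                state["step"] += 1
                m, v, t = state["m"], state["v"], state["step"]
                # Standard Adam moment estimates with bias correction.
                m.mul_(beta1).add_(p.grad, alpha=1 - beta1)
                v.mul_(beta2).addcmul_(p.grad, p.grad, value=1 - beta2)
                m_hat = m / (1 - beta1 ** t)
                v_hat = v / (1 - beta2 ** t)
                update = m_hat / (v_hat.sqrt() + group["eps"])
                # Orthogonalize updates for flagged 2D weight matrices.
                if group["orthogonalize"] and p.ndim == 2:
                    update = newton_schulz_orthogonalize(update)
                # Decoupled weight decay, then role-scaled step.
                p.mul_(1 - group["lr"] * group["weight_decay"])
                p.add_(update, alpha=-group["lr"] * group["role_scale"])
```

In this reading, parameter groups would be built by tagging attention weight matrices with `orthogonalize=True` and assigning `role_scale` per parameter role, while the warmup schedule would modulate `lr` externally via a standard LR scheduler.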
Submission history
[v1] Thu, 6 Nov 2025 06:42 UTC