[Submitted on 6 Nov 2025]
AttentiveLayerAdam: Analysis of Orthogonal Constraints in Transformer Optimization
Abstract: This paper investigates orthogonalization constraints in transformer optimization through AttentiveLayerAdam, a modified Adam optimizer with layer-specific learning rates and orthogonalization of attention weights. The method showed progressive improvement across ablations but ultimately underperformed the AdamW baseline, reaching a final validation loss of 9.853 versus 4.927 for AdamW, while incurring 23% higher wall-clock cost per step. We analyze this computational overhead and compare against recent orthogonal optimization approaches. The results demonstrate that stable orthogonal constraints are feasible, but suggest they require further refinement to compete with standard optimizers.
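The abstract names two ingredients: per-layer learning rates and orthogonalization of attention weights. The paper's actual implementation is not given here, so the following is only a minimal sketch of how those two ideas are commonly combined, assuming a PyTorch model whose transformer blocks are stored in `model.blocks` and whose attention projection parameters contain "attn" in their names; the blend factor `ortho_strength` and the depth-based learning-rate schedule are illustrative assumptions, not the authors' settings.

```python
# Sketch (not the authors' code): layer-wise Adam learning rates plus a soft
# orthogonalization step applied to attention projection matrices after each update.
import torch


def build_layerwise_adam(model, base_lr=1e-3, layer_decay=0.9):
    """Give each transformer block its own learning rate, decayed toward earlier layers."""
    num_blocks = len(model.blocks)  # assumes blocks are exposed as model.blocks
    groups = [
        {
            "params": block.parameters(),
            "lr": base_lr * (layer_decay ** (num_blocks - depth - 1)),
        }
        for depth, block in enumerate(model.blocks)
    ]
    return torch.optim.Adam(groups)


@torch.no_grad()
def orthogonalize_attention(model, ortho_strength=0.1):
    """Nudge 2-D attention weights toward their nearest orthogonal matrix (via SVD)."""
    for name, param in model.named_parameters():
        if "attn" in name and param.ndim == 2:  # assumed naming convention
            u, _, vh = torch.linalg.svd(param, full_matrices=False)
            nearest_orthogonal = u @ vh          # polar factor: closest orthogonal matrix
            param.lerp_(nearest_orthogonal, ortho_strength)  # soft projection, not a hard constraint
```

In a training loop this would be used as `optimizer.step()` followed by `orthogonalize_attention(model)`; the per-step SVD over every attention matrix is one plausible source of the roughly 23% overhead reported relative to AdamW.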
Submission history
[v1] Thu, 6 Nov 2025 09:16 UTC