[Submitted on 2 Nov 2025]
Revisiting GEGLU: An Empirical Analysis of Gated Feedforward Variants in Transformers
Abstract: This paper presents a systematic evaluation of the Gaussian Error Gated Linear Unit (GEGLU) activation in transformer feedforward networks. Through controlled experiments on the FineWeb benchmark, we demonstrate that GEGLU achieves lower validation perplexity than the standard SwiGLU baseline while maintaining identical computational complexity. Our analysis includes ablation studies across model sizes and a comprehensive comparison with recent feedforward variants.
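For context, the sketch below illustrates the two gated feedforward variants compared in the abstract, following the formulation GEGLU(x) = GELU(xW) ⊙ xV from Shazeer (2020), with SwiGLU using a SiLU gate instead. This is an illustrative PyTorch sketch, not code from the paper; the class names, dimensions, and bias-free projections are assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class GEGLUFeedForward(nn.Module):
    """Gated feedforward block: GELU(x W) * (x V), then project back to d_model.

    Illustrative sketch only; layer names and sizes are assumptions,
    not taken from the paper.
    """

    def __init__(self, d_model: int, d_ff: int):
        super().__init__()
        # One projection produces both the gate and the value halves.
        self.w_in = nn.Linear(d_model, 2 * d_ff, bias=False)
        self.w_out = nn.Linear(d_ff, d_model, bias=False)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        gate, value = self.w_in(x).chunk(2, dim=-1)
        return self.w_out(F.gelu(gate) * value)


class SwiGLUFeedForward(nn.Module):
    """Same structure with a SiLU (Swish) gate; parameter count is identical."""

    def __init__(self, d_model: int, d_ff: int):
        super().__init__()
        self.w_in = nn.Linear(d_model, 2 * d_ff, bias=False)
        self.w_out = nn.Linear(d_ff, d_model, bias=False)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        gate, value = self.w_in(x).chunk(2, dim=-1)
        return self.w_out(F.silu(gate) * value)


if __name__ == "__main__":
    # The two variants differ only in the gating nonlinearity, so they match
    # in parameter count and FLOPs, consistent with the abstract's claim of
    # identical computational complexity.
    x = torch.randn(2, 16, 512)
    for ffn in (GEGLUFeedForward(512, 1365), SwiGLUFeedForward(512, 1365)):
        params = sum(p.numel() for p in ffn.parameters())
        print(type(ffn).__name__, tuple(ffn(x).shape), params)
```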
Submission history
[v1] Sun, 2 Nov 2025 04:24 UTC