[Submitted on 2 Nov 2025]
Systematic Evaluation of Feedforward Network Variants in Transformer Architectures
Abstract: This paper presents a systematic evaluation of feedforward network variants in transformer architectures, with particular focus on modifications to the gated activation mechanism. We conduct controlled experiments comparing four variants against the SwiGLU baseline: (1) polynomial-enhanced GEGLU, (2) normalized GEGLU, (3) scaled residual GEGLU, and (4) pure GEGLU. All experiments use a 134M-parameter transformer trained on the FineWeb dataset with fixed hyperparameters (learning rate 6e-4, batch size 256, 100K steps). While our best variant achieved a marginal 0.6% improvement in validation loss (4.898 vs. 4.927), most modifications degraded performance. We discuss implications for architectural innovation and identify key limitations of our study, including the need for multi-run statistical validation and broader exploration of the design space.
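To make the compared mechanisms concrete, the following is a minimal PyTorch sketch of the gated feedforward block underlying the SwiGLU baseline and the pure GEGLU variant: both compute (act(x W_in) * (x W_gate)) W_out and differ only in the activation (SiLU vs. GELU). The class and parameter names (GatedFFN, d_model, d_ff) are illustrative assumptions, not the authors' implementation, and the other three variants (polynomial-enhanced, normalized, scaled residual) are not shown since the abstract does not specify their exact form.

import torch
import torch.nn as nn
import torch.nn.functional as F

class GatedFFN(nn.Module):
    """Gated feedforward block: out = (act(x W_in) * (x W_gate)) W_out.

    Hypothetical sketch; names and the bias-free choice are assumptions.
    """
    def __init__(self, d_model: int, d_ff: int, activation=F.silu):
        super().__init__()
        self.w_in = nn.Linear(d_model, d_ff, bias=False)    # activated branch
        self.w_gate = nn.Linear(d_model, d_ff, bias=False)  # linear gate branch
        self.w_out = nn.Linear(d_ff, d_model, bias=False)   # projection back to d_model
        self.activation = activation

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Elementwise product of the activated branch and the linear gate,
        # followed by the output projection.
        return self.w_out(self.activation(self.w_in(x)) * self.w_gate(x))

# SwiGLU baseline (SiLU/Swish activation) vs. pure GEGLU (GELU activation);
# the hidden width 2048 is an arbitrary example, not the paper's setting.
swiglu = GatedFFN(d_model=768, d_ff=2048, activation=F.silu)
geglu = GatedFFN(d_model=768, d_ff=2048, activation=F.gelu)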
Submission history
[v1] Sun, 2 Nov 2025 17:59 UTC