[Submitted on 1 Nov 2025]
Adaptive Gated Feedforward Networks with Learnable Expansion
Abstract: We introduce a modified feedforward network architecture for transformers that incorporates a learnable gating expansion factor and an intermediate transformation. While maintaining the simplicity of standard feedforward networks, our approach introduces two key modifications: (1) a learnable expansion factor for the gating mechanism that adapts during training, and (2) an intermediate transformation with a fixed residual connection. Experiments on language modeling demonstrate that our approach achieves lower perplexity (4.864) than the standard SwiGLU baseline (4.9266) while maintaining similar computational efficiency. We provide ablation studies showing the contribution of each component and analyze the training dynamics.
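The two modifications described in the abstract can be sketched as a SwiGLU-style block. This is a minimal illustration under stated assumptions, not the paper's exact implementation: the "learnable expansion factor" is modeled here as a learnable scalar `alpha` scaling the gate branch (the abstract does not specify its precise form), and the intermediate transformation `w_mid` is added back through a fixed, non-learnable residual connection. All module and parameter names are hypothetical.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class AdaptiveGatedFFN(nn.Module):
    """Sketch of a gated FFN with (1) a learnable gating expansion factor
    and (2) an intermediate transform with a fixed residual connection.
    Details beyond the abstract are assumptions, not the paper's design."""

    def __init__(self, d_model: int, d_hidden: int):
        super().__init__()
        self.w_gate = nn.Linear(d_model, d_hidden, bias=False)
        self.w_up = nn.Linear(d_model, d_hidden, bias=False)
        # Intermediate transformation applied in the hidden space.
        self.w_mid = nn.Linear(d_hidden, d_hidden, bias=False)
        self.w_down = nn.Linear(d_hidden, d_model, bias=False)
        # Assumed form of the "learnable expansion factor": a trainable
        # scalar that rescales the gate branch and adapts during training.
        self.alpha = nn.Parameter(torch.ones(1))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # SwiGLU-style gating, with the gate pre-activation scaled by alpha.
        h = F.silu(self.alpha * self.w_gate(x)) * self.w_up(x)
        # Intermediate transformation with a fixed (identity) residual.
        h = h + self.w_mid(h)
        return self.w_down(h)

ffn = AdaptiveGatedFFN(d_model=64, d_hidden=256)
y = ffn(torch.randn(2, 10, 64))
print(y.shape)  # torch.Size([2, 10, 64])
```

With `alpha` fixed at 1 and `w_mid` removed, the block reduces to a standard SwiGLU feedforward layer, which matches the baseline the abstract compares against.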
Submission history
[v1] Sat, 1 Nov 2025 15:42 UTC