[Submitted on 26 Oct 2025]
Adaptive Activation Mixing for Transformer Feedforward Networks
Abstract: We analyze adaptive activation mixing in transformer feedforward networks, combining GELU and SiLU activations through a learned gating mechanism. Our approach reaches a validation loss of 5.108 versus 4.927 for a SwiGLU baseline; while it does not surpass SwiGLU, it demonstrates the feasibility of dynamic activation selection with minimal parameter overhead.
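For concreteness, the gating idea described in the abstract might look like the following minimal sketch, assuming a per-hidden-unit sigmoid gate that convexly mixes GELU and SiLU inside a standard feedforward block. The class name, the gate parameterization, and all dimensions are illustrative assumptions, not details taken from the paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class AdaptiveActivationFFN(nn.Module):
    """Feedforward block whose activation is a learned mix of GELU and SiLU.

    Hypothetical sketch: one gating logit per hidden unit adds only d_ff
    extra parameters, consistent with a "minimal parameter overhead" claim.
    """

    def __init__(self, d_model: int, d_ff: int):
        super().__init__()
        self.up = nn.Linear(d_model, d_ff)
        self.down = nn.Linear(d_ff, d_model)
        # Sigmoid of these logits gives mixing weights in [0, 1].
        self.gate_logits = nn.Parameter(torch.zeros(d_ff))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        h = self.up(x)
        g = torch.sigmoid(self.gate_logits)        # per-unit mix weight
        h = g * F.gelu(h) + (1.0 - g) * F.silu(h)  # adaptive activation mix
        return self.down(h)

# Usage: drop-in replacement for a standard transformer FFN sublayer.
ffn = AdaptiveActivationFFN(d_model=512, d_ff=2048)
y = ffn(torch.randn(2, 16, 512))  # (batch, seq, d_model)
print(y.shape)                    # torch.Size([2, 16, 512])
```

Initializing the gate logits at zero starts each unit at an even 50/50 blend of the two activations, letting training push individual units toward whichever nonlinearity helps; other parameterizations (a scalar gate, or an input-dependent gate) would also fit the abstract's description.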
Submission history
[v1] Sun, 26 Oct 2025 01:25 UTC