Rethinking Softmax: Self-Attention with Polynomial Activations