Toward Manifest Relationality in Transformers via Symmetry Reduction

Authors: J. François, L. Ravera

Year: 2026

cs.LG, cs.NE, hep-th, stat.ML


Abstract

Transformer models contain substantial internal redundancy arising from coordinate-dependent representations and continuous symmetries, in model space and in head space, respectively. While recent approaches address this by explicitly breaking symmetry, we propose a complementary framework based on symmetry reduction. We reformulate representations, attention mechanisms, and optimization dynamics in terms of invariant relational quantities, eliminating redundant degrees of freedom by construction. This perspective yields architectures that operate directly on relational structures, providing a principled geometric framework for reducing parameter redundancy and analyzing optimization.
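To make the notion of invariant relational quantities concrete, here is a minimal illustrative sketch, not the paper's actual construction: the Gram matrix of pairwise inner products between token representations is unchanged when every representation is rotated by the same orthogonal matrix, so working with such relational quantities eliminates that redundant, coordinate-dependent degree of freedom by construction.

```python
import numpy as np

rng = np.random.default_rng(0)

# Token representations: n tokens, each a d-dimensional vector.
n, d = 5, 8
X = rng.standard_normal((n, d))

# A random orthogonal matrix Q, standing in for a continuous symmetry
# of model space: rotating every representation by Q changes the
# coordinates but not the relations between tokens.
Q, _ = np.linalg.qr(rng.standard_normal((d, d)))

# The Gram matrix of pairwise inner products is a relational,
# invariant quantity: G(XQ) == G(X) for any orthogonal Q.
G = X @ X.T
G_rotated = (X @ Q) @ (X @ Q).T

print(np.allclose(G, G_rotated))  # True
```

Since attention scores are themselves built from inner products of (transformed) token representations, this toy invariance hints at why a relational reformulation can absorb such symmetries rather than break them explicitly.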
