Georg Bökman: Making practical use of equivariance in neural networks

Time: Wed 2025-04-30 10.15

Location: KTH 3418, Lindstedtsvägen 25 and Zoom

Video link: Zoom meeting ID 632 2469 3290

Respondent: Nils Quaetaert

Supervisor: Kathlén Kohn

Abstract.

Given two representations of a group G, we consider neural networks that are equivariant with respect to these representations. Equivariant neural networks are commonly motivated by their adherence to a symmetry of the task at hand, and by the fact that this adherence can lead to improved performance; for example, rotation- and reflection-invariant networks often outperform non-invariant networks on the classification of cell images. In this talk, we highlight two further motivations for using equivariant networks. First, equivariant networks that map to a non-trivial representation of G give oriented output. We explain the usefulness of oriented output in an image matching task with upright images, and demonstrate that training ordinary neural networks on this task yields (approximately) rotation-equivariant networks without any data augmentation. Second, equivariant linear maps can be block-diagonalized by Schur’s lemma, yielding computational benefits over arbitrary linear maps. We present recent work that exploits these benefits, producing networks that are faster than state-of-the-art ordinary architectures (ViTs, ConvNets and MLPs) while outperforming them on image classification.
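
For concreteness, a network f is equivariant with respect to representations ρ_in and ρ_out of G when f(ρ_in(g) x) = ρ_out(g) f(x) for all g in G and all inputs x; invariance is the special case where ρ_out is the trivial representation.

The block-diagonalization result is easiest to see for the cyclic group C_n acting by circular shifts: every shift-equivariant linear map is a circulant matrix, the discrete Fourier transform diagonalizes all circulants simultaneously (the irreducible representations of C_n are one-dimensional, so Schur's lemma gives 1x1 blocks), and the equivariant matrix-vector product then costs O(n log n) via the FFT instead of O(n^2). The following NumPy sketch checks these facts numerically; it illustrates the general principle only and is not the construction used in the thesis.

import numpy as np

n = 8
rng = np.random.default_rng(0)

# A shift-equivariant linear map on R^n is a circulant matrix:
# row i is a single kernel circularly shifted by i.
kernel = rng.standard_normal(n)
C = np.stack([np.roll(kernel, i) for i in range(n)])

# Equivariance: shifting the input commutes with applying C.
x = rng.standard_normal(n)
shift = lambda v: np.roll(v, 1)
assert np.allclose(C @ shift(x), shift(C @ x))

# Schur's lemma in action: conjugating by the DFT matrix
# block-diagonalizes C, here into 1x1 blocks because the
# irreducible representations of the cyclic group are 1-dimensional.
F = np.fft.fft(np.eye(n))
D = F @ C @ np.linalg.inv(F)
assert np.allclose(D - np.diag(np.diag(D)), 0, atol=1e-8)

# Computational benefit: the same matrix-vector product via the FFT,
# O(n log n) instead of the dense O(n^2) product.
y = np.fft.ifft(np.fft.fft(x) * np.conj(np.fft.fft(kernel))).real
assert np.allclose(y, C @ x)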