Existing equivariant neural networks for continuous groups require discretization or explicit group representations. All of these approaches need detailed knowledge of the group parametrization and cannot learn entirely new symmetries. In this work, we propose to work with the Lie algebra (infinitesimal generators) instead of the Lie group. Our model, the Lie algebra convolutional network (L-conv), can learn potential symmetries and does not require discretization of the group. We show that L-conv can serve as a building block to construct any group-equivariant architecture. We discuss how CNNs and Graph Convolutional Networks are related to, and can be expressed as, L-conv with appropriate groups. We also derive the MSE loss for a single L-conv layer and find a deep relation to Lagrangians used in physics, with some of the physics aiding in defining generalization and symmetries in the loss landscape. Conversely, L-conv could be used to propose more general equivariant ansätze for scientific machine learning.
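To make the Lie-algebra viewpoint concrete, here is a minimal NumPy sketch (not the paper's implementation): the matrix `L` below is the infinitesimal generator of 2D rotations, playing the role that a *learned* generator would play in L-conv. Group elements are recovered by exponentiating the generator, so no discretization of the group is needed.

```python
import numpy as np

# Infinitesimal generator of 2D rotations (a basis element of the
# Lie algebra so(2)). In L-conv such a matrix would be learned.
L = np.array([[0.0, -1.0],
              [1.0,  0.0]])

def expm(A, terms=30):
    """Matrix exponential via a truncated power series (sketch only)."""
    out = np.eye(A.shape[0])
    term = np.eye(A.shape[0])
    for k in range(1, terms):
        term = term @ A / k
        out = out + term
    return out

theta = 0.7
R = expm(theta * L)  # exponentiating the algebra element gives a group element

# The result is the continuous rotation by theta -- no group discretization.
expected = np.array([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]])
print(np.allclose(R, expected))  # True
```

Acting with `1 + eps * L` for small `eps` gives the first-order (infinitesimal) transformation that an L-conv layer uses in place of full group elements.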

**Publication:**

Dehmamy, Nima, Robin Walters, Yanchen Liu, Dashun Wang, and Rose Yu. "Automatic Symmetry Discovery with Lie Algebra Convolutional Network." In *Thirty-Fifth Conference on Neural Information Processing Systems*. 2021.