
Ekaterina Filimoshina presented a talk at the International Joint Conference on Neural Networks (Rome, Italy)

Ekaterina Filimoshina, a research intern at the Laboratory for Geometric Algebra and Applications, took part in the annual International Joint Conference on Neural Networks (IJCNN) 2025. This year the conference was held at the Pontifical Gregorian University in Rome, Italy, from June 30 to July 5. More than 2,350 participants registered for the conference, more than 1,250 of whom attended in person. A total of 5,526 papers were submitted to IJCNN 2025, with an acceptance rate of 38%. Keynote speakers included Samy Bengio (Apple), Francesca Rossi (IBM Research), Sepp Hochreiter (JKU Linz, NXAI GmbH), Ruiqi Gao (Google DeepMind), Andrzej Cichocki (Polish Academy of Sciences), and others.

Conference website: https://2025.ijcnn.org/
Program: https://2025.ijcnn.org/program/program

Ekaterina gave a talk in the special session 'Complex- and Hypercomplex-Valued Neural Networks' (special session website). These special sessions have become a traditional part of IJCNN: 14 of them have been organized since 2006. The organizers of the special session were Marcos Eduardo Valle (Universidade Estadual de Campinas), Sven Buchholz (Technische Hochschule Brandenburg), Eckhard Hitzer (International Christian University), João Papa (São Paulo State University), and Akira Hirose (The University of Tokyo).

Talk: Ekaterina Filimoshina, Dmitry Shirokov, 'Equivariant Neural Networks with Geometric Algebras: A New Approach', July 4, 2025.

Abstract: This work is devoted to the construction and implementation of new equivariant neural networks based on geometric (Clifford) algebras. We propose, implement, test, and compare with competitors a new architecture of equivariant neural networks, which we call Generalized Lipschitz Group Equivariant Neural Networks (GLGENN). These networks are equivariant to all pseudo-orthogonal transformations, including rotations. We introduce generalized Lipschitz groups and prove for the first time that the following mappings in geometric algebras are equivariant with respect to these groups and to the pseudo-orthogonal and complex orthogonal groups: projections onto subspaces determined by grade involution and reversion, and polynomials of geometric algebra elements. Leveraging these equivariant mappings, we design generalized geometric product and linear layers. GLGENN demonstrate superior performance in benchmark equivariant regression tasks, outperforming competitors while using fewer optimizable parameters. Due to the relatively small number of parameters in the architecture, GLGENN are less prone to overfitting. GLGENN have promising applications in natural science and computer vision, where tasks inherently involve equivariance to pseudo-orthogonal transformations.
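The equivariance property at the heart of the abstract can be illustrated on a toy scale. The sketch below (our own minimal example, not the GLGENN implementation) builds the geometric algebra Cl(2,0) by hand, with basis {1, e1, e2, e12}, and checks numerically that grade involution commutes with the adjoint action x ↦ R x R~ of a unit rotor R — a special case of the equivariance of grade-involution projections mentioned in the abstract. All function names here are illustrative.

```python
import math

def gp(a, b):
    """Geometric product in Cl(2,0), coefficients ordered (1, e1, e2, e12).
    Signature: e1*e1 = e2*e2 = 1, so e12*e12 = -1."""
    a0, a1, a2, a3 = a
    b0, b1, b2, b3 = b
    return (
        a0*b0 + a1*b1 + a2*b2 - a3*b3,   # scalar part
        a0*b1 + a1*b0 - a2*b3 + a3*b2,   # e1 part
        a0*b2 + a2*b0 + a1*b3 - a3*b1,   # e2 part
        a0*b3 + a3*b0 + a1*b2 - a2*b1,   # e12 part
    )

def grade_involution(x):
    """Negate the odd-grade (vector) components."""
    return (x[0], -x[1], -x[2], x[3])

def reversion(x):
    """Reverse the order of basis vectors: negates grade 2 in Cl(2,0)."""
    return (x[0], x[1], x[2], -x[3])

def adjoint(r, x):
    """Adjoint action x -> R x R~ of a unit rotor R (here R~ = R^{-1})."""
    return gp(gp(r, x), reversion(r))

theta = 0.7
rotor = (math.cos(theta), 0.0, 0.0, math.sin(theta))  # rotation by 2*theta
x = (0.5, 1.0, -2.0, 0.3)  # arbitrary multivector

# Equivariance check: involution(R x R~) == R involution(x) R~,
# which holds because the rotor R is an even element.
lhs = grade_involution(adjoint(rotor, x))
rhs = adjoint(rotor, grade_involution(x))
assert all(abs(l - r) < 1e-12 for l, r in zip(lhs, rhs))
```

Because the rotor is even, grade involution leaves it fixed, so the involution passes through the sandwich product unchanged; the same argument is what makes such projections usable as equivariant layers.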

Following the conference, the paper will be published in the IEEE conference proceedings.

We would like to express our gratitude to the HSE Center for Student Academic Development for supporting Ekaterina's trip to the IJCNN 2025 conference!