A Complete Beginner's Guide To G-Invariant and G-Equivariant Neural Networks (02/08/2024) - Part 2

  • Published: 19 Oct 2024
  • Talk: A Complete Beginner's Guide To G-Invariant and G-Equivariant Neural Networks - Bruno Ribeiro (Purdue University)
    This tutorial provides an integrated view of the key concepts behind set and graph learning methods. It focuses on the principles and methods needed to design and deploy novel set and graph learning models, emphasizing the relationship between traditional statistical models, invariant theory, and the algorithmic challenges of deploying such models in real-world applications. The tutorial has both theoretical and coding components and assumes some familiarity with Python, linear algebra, probability theory, and statistical machine learning.
    Invariant Representations and Group Transformations: Here, we familiarize participants with the notions of invariant representations and group transformations. G-invariant and G-equivariant neural networks embody the design principles of state-of-the-art architectures in deep learning. They enable neural networks to learn and recognize patterns that are invariant or equivariant under different group transformations: a G-invariant network produces the same output for inputs that are related by a group transformation, while a G-equivariant network produces outputs that are related by the same group transformation as the inputs. These networks are the foundation for neural architectures across a range of applications, including object recognition, graph tasks, point cloud tasks, and natural language processing.
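    As a concrete illustration of the two definitions above, here is a minimal NumPy sketch (an assumed example, not code from the tutorial) for the group of permutations acting on the rows of a small set of feature vectors; the functions f_invariant and f_equivariant are illustrative placeholders.

```python
# Minimal sketch (assumed example, not from the tutorial): invariance vs.
# equivariance for the permutation group acting on a set of feature vectors.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(5, 3))      # a set of 5 elements with 3 features each
perm = rng.permutation(5)        # a group element g: a permutation of the rows

def f_invariant(X):
    # Sum pooling: f(g . X) == f(X) for every permutation g.
    return X.sum(axis=0)

def f_equivariant(X):
    # Element-wise map: f(g . X) == g . f(X); permuting rows commutes with f.
    return np.tanh(X * 2.0 + 1.0)

assert np.allclose(f_invariant(X[perm]), f_invariant(X))            # invariant
assert np.allclose(f_equivariant(X[perm]), f_equivariant(X)[perm])  # equivariant
```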
    Set Representation Learning: This part of the tutorial introduces participants to set representation learning, i.e., neural networks whose outputs are invariant to permutations of the input, including the use of permutation-equivariant layers, Lie groups, Lie algebras, and SE- and SO-invariant and equivariant layers. The section covers the derivation and implementation of popular architectures such as Deep Sets and Janossy Pooling. Participants will also learn about the use of set representation learning for tasks such as point cloud classification and molecular property prediction. This section provides a foundation for understanding the principles and applications of set representation learning across domains.
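    The Deep Sets construction mentioned above can be sketched in a few lines. The snippet below is an illustrative NumPy version, not the tutorial's implementation: a per-element encoder phi, permutation-invariant sum pooling, and a set-level map rho, with arbitrary placeholder weights.

```python
# Deep Sets-style permutation-invariant encoder: rho(sum_i phi(x_i)).
# Illustrative sketch only; weight shapes and values are placeholders.
import numpy as np

rng = np.random.default_rng(0)
W_phi = rng.normal(size=(3, 8))      # per-element encoder phi
W_rho = rng.normal(size=(8, 2))      # set-level decoder rho

def deep_sets(X):
    H = np.maximum(X @ W_phi, 0.0)   # phi applied independently to each element
    z = H.sum(axis=0)                # permutation-invariant sum pooling
    return z @ W_rho                 # rho on the pooled representation

X = rng.normal(size=(5, 3))          # a set of 5 elements with 3 features each
perm = rng.permutation(5)
assert np.allclose(deep_sets(X[perm]), deep_sets(X))   # invariant to input order
```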
    Graph Neural Networks and Graph Representation Learning: This section of the tutorial covers graph neural networks and representations that are G-equivariant under joint permutations of a graph's nodes, including the use of graph convolutions and pooling operations in graph neural networks. It also covers structural and positional node embeddings and their relation to G-equivariance and matrix factorization, providing a solid foundation for understanding the principles and applications of graph neural networks and G-equivariant representations across domains.
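    To make G-equivariance under joint permutations concrete, here is a minimal NumPy sketch of a single graph-convolution layer (illustrative only, not the tutorial's code); relabeling the nodes permutes the adjacency matrix and feature rows jointly, and the layer's output rows are permuted the same way.

```python
# Single graph-convolution layer, H' = relu(A_hat @ X @ W), where A_hat is a
# row-normalized adjacency with self-loops. Illustrative sketch only.
import numpy as np

rng = np.random.default_rng(0)
n, d = 6, 4
A = (rng.random((n, n)) < 0.3).astype(float)
A = np.maximum(A, A.T)                      # undirected adjacency matrix
X = rng.normal(size=(n, d))                 # node features
W = rng.normal(size=(d, d))                 # layer weights (placeholder values)

def gcn_layer(A, X, W):
    A_hat = A + np.eye(len(A))              # add self-loops
    D_inv = np.diag(1.0 / A_hat.sum(axis=1))
    return np.maximum(D_inv @ A_hat @ X @ W, 0.0)

perm = rng.permutation(n)
P = np.eye(n)[perm]                         # permutation matrix for relabeling
H = gcn_layer(A, X, W)
H_perm = gcn_layer(P @ A @ P.T, P @ X, W)   # same graph with relabeled nodes
assert np.allclose(H_perm, P @ H)           # joint-permutation equivariance
```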
