Graph Neural Networks (GNN) using Pytorch Geometric | Stanford University

  • Published: 8 Sep 2024
  • This is the Graph Neural Networks: Hands-on Session from the Stanford 2019 Fall CS224W course.
    In this tutorial, we will explore the implementation of graph neural networks and investigate what representations these networks learn. Along the way, we'll see how PyTorch Geometric and TensorBoardX can help us with constructing and training graph models.
    The PyTorch Geometric tutorial part starts at 0:33:30
    Details on:
    * Graph Convolutional Neural Networks (GCN)
    * Custom Convolutional Model
    * Message passing
    * Aggregation functions
    * Update
    * Graph Pooling
    The Google Colab Link: colab.research...
    The course website Link: web.stanford.ed...
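The description lists message passing, aggregation, and update as the core ideas. As a rough illustration of that pattern (independent of PyTorch Geometric), here is a minimal pure-Python sketch of one message-passing layer; the toy graph, features, and the mean/average choices are made up for illustration:

```python
# Toy sketch of one message-passing layer: each node gathers its
# neighbours' feature vectors, aggregates them (here: mean), and
# updates its own feature (here: average with the aggregate).
graph = {0: [1, 2], 1: [0], 2: [0]}                      # adjacency list
feats = {0: [1.0, 0.0], 1: [0.0, 1.0], 2: [2.0, 2.0]}    # node features

def mp_layer(graph, feats):
    new_feats = {}
    for node, neighbours in graph.items():
        # message + aggregation: mean over neighbour features
        msgs = [feats[n] for n in neighbours]
        agg = [sum(col) / len(msgs) for col in zip(*msgs)]
        # update: mix the node's own feature with the aggregate
        new_feats[node] = [(s + a) / 2 for s, a in zip(feats[node], agg)]
    return new_feats

out = mp_layer(graph, feats)
print(out[0])  # node 0 mixes its own feature with the mean of nodes 1 and 2
```

Real GNN layers replace the mean and the update with learned, differentiable functions; this only shows the data flow.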

Comments • 32

  • @xiaoyumo7403
    @xiaoyumo7403 4 years ago +94

    PyG starts from 33:33

    • @jianchen2550
      @jianchen2550 9 months ago

      You are my hero. Love you

  • @alirezasamadi5804
    @alirezasamadi5804 2 years ago +10

    AMAZING ... tutorials usually either start at a very basic level and leave you high and dry when they reach the actual point, or start at a point you have no idea about... this tutorial is amazing

  • @SRV902
    @SRV902 4 years ago +19

    CrossEntropyLoss already applies log-softmax behind the scenes. On top of that, F.softmax is applied at the end of the model's forward pass, which is not needed if nn.CrossEntropyLoss is used. This is before PyTorch Geometric is introduced.

    • @SteveSperandeo
      @SteveSperandeo 9 months ago +2

      Not only is it not needed; applying softmax twice will break your model.
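The point in this thread can be demonstrated without PyTorch at all. Below is a pure-Python stand-in for what nn.CrossEntropyLoss computes (log-softmax plus negative log-likelihood, applied internally to raw logits); the logit values are made up for illustration:

```python
import math

def softmax(xs):
    exps = [math.exp(x) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def cross_entropy(logits, target):
    # nn.CrossEntropyLoss-style: log-softmax is applied internally,
    # so the input is expected to be raw logits.
    return -math.log(softmax(logits)[target])

logits = [4.0, 1.0, 0.0]
loss_correct = cross_entropy(logits, 0)          # pass raw logits
loss_double = cross_entropy(softmax(logits), 0)  # softmax applied twice
print(loss_correct, loss_double)
```

Running the softmax twice squashes the inputs into [0, 1] first, so the class probabilities flatten out and the loss saturates, which is why training degrades.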

  • @williamashbee
    @williamashbee 3 years ago +6

    I'm unworthy of this presentation. Good job.

  • @m.khanmohamadi9815
    @m.khanmohamadi9815 1 year ago

    Thank you very much. It was a very good GNN tutorial.

  • @user-tz2pc6wy7f
    @user-tz2pc6wy7f 4 years ago +3

    Really good tutorial.

  • @hiro_happysky12
    @hiro_happysky12 1 year ago +1

    Thanks for this amazing tutorial!! It was really helpful for me ☺

  • @jinchenghuang3755
    @jinchenghuang3755 3 years ago +2

    There is something to simplify: nn.CrossEntropyLoss(x, label) = F.nll_loss(F.log_softmax(x, dim=1), label)
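The identity above can be checked numerically without PyTorch. This is a pure-Python sketch: `cross_entropy` is written from the direct definition (negative log of the target's softmax probability), and it matches NLL applied to log-softmax; the logits are made up:

```python
import math

def log_softmax(xs):
    m = max(xs)  # subtract the max for numerical stability
    log_sum = m + math.log(sum(math.exp(x - m) for x in xs))
    return [x - log_sum for x in xs]

def nll_loss(log_probs, target):
    return -log_probs[target]

def cross_entropy(logits, target):
    # direct definition: -log of the target's softmax probability
    probs = [math.exp(x) for x in logits]
    return -math.log(probs[target] / sum(probs))

logits = [2.0, -1.0, 0.5]
a = cross_entropy(logits, 2)
b = nll_loss(log_softmax(logits), 2)
print(a, b)  # the two values agree
```

This is exactly why the model in the video should output raw logits when nn.CrossEntropyLoss is used.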

  • @joemeyer9655
    @joemeyer9655 3 months ago

    Nice!

  • @sumitkumar-el3kc
    @sumitkumar-el3kc 4 years ago +7

    Can anyone please tell me the prerequisites for starting with GNNs? I'm new to neural networks. I have some experience in ML, but neural networks are still new to me.

    • @spartacusche
      @spartacusche 2 years ago +2

      You can take the Deep Learning course on Coursera, or Stanford's CS229.

  • @stevegabrial1106
    @stevegabrial1106 3 years ago +3

    Hi,
    Please upload more ML and DL videos from Stanford. Thanks.

  • @TechVizTheDataScienceGuy
    @TechVizTheDataScienceGuy 4 years ago +1

    👍

  • @dam1an096
    @dam1an096 2 years ago

    gg, good tutorial ✌

  • @CXL601
    @CXL601 4 years ago

    model = pyg_nn.GAE(Encoder(dataset.num_features, channels)).to(dev)
    model.split_edges(data)
    --- gets an error saying "'GAE' object has no attribute 'split_edges'"
    Just checked the documentation; it is true that the latest version of the GAE class doesn't have a 'split_edges' method.
    So a random split?

    • @CXL601
      @CXL601 4 years ago +2

      oh, it is negative sampling
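For readers hitting the same missing-method error: the idea behind negative sampling for link prediction can be sketched without PyTorch Geometric. This is a library-free illustration, not the GAE implementation; the function name, graph, and parameters are made up:

```python
import random

def negative_sampling(edges, num_nodes, num_samples, seed=0):
    """Sample node pairs that are NOT edges, to serve as negative
    examples when training a link predictor such as a graph
    autoencoder."""
    existing = set(edges) | {(v, u) for u, v in edges}
    rng = random.Random(seed)
    negatives = set()
    while len(negatives) < num_samples:
        u, v = rng.randrange(num_nodes), rng.randrange(num_nodes)
        if u != v and (u, v) not in existing:
            negatives.add((u, v))
    return sorted(negatives)

edges = [(0, 1), (1, 2), (2, 3)]   # a 4-node path graph
negs = negative_sampling(edges, num_nodes=4, num_samples=3)
print(negs)  # non-edges such as (0, 2) or (0, 3)
```

In current PyTorch Geometric the edge split and negative sampling are provided by the library itself rather than by a GAE method, which is presumably why the attribute disappeared.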

  • @prajwol_poudel
    @prajwol_poudel 2 years ago +1

    Hey, do you have more coding tutorials from CS224W like these?
    If anyone knows where I can find more coding playlists, please share.

  • @jz7327
    @jz7327 3 years ago

    I'm wondering why the pooling layer is necessary for graph-level tasks. Can't we just use some linear layers to predict a property that corresponds to the whole graph? Can somebody help me with this? Ty!

    • @jz7327
      @jz7327 3 years ago +2

      I think it's due to the dimensions? For graph-level tasks we want the whole graph represented by a single vector, so the pooling transforms the node-embedding matrix into a vector?

    • @laurasnow7822
      @laurasnow7822 1 year ago +2

      I'm like two years late to this question, but the node-embedding matrix has a different dimension for each graph size, so we can't train a neural network on it directly. We could train a sequential neural network, but we don't want to get different results based on different node orderings. The most naive approach would be to just take the sum or average of all node embeddings and use that as the graph embedding. It might be enough in some cases.
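The naive averaging approach mentioned in this thread (global mean pooling) can be shown in a few lines of pure Python; the two toy graphs and their embeddings are made up for illustration:

```python
def global_mean_pool(node_embeddings):
    """Collapse a variable-size set of node embeddings into one
    fixed-size graph embedding by averaging each dimension."""
    n = len(node_embeddings)
    return [sum(col) / n for col in zip(*node_embeddings)]

g_small = [[1.0, 2.0], [3.0, 4.0]]                           # 2-node graph
g_large = [[0.0, 0.0], [1.0, 1.0], [2.0, 2.0], [3.0, 3.0]]   # 4-node graph

# Both graphs map to a 2-dimensional vector, so a single classifier
# head works for graphs of any size, and node order doesn't matter.
print(global_mean_pool(g_small))
print(global_mean_pool(g_large))
```

This is the dimension argument from the replies made concrete: pooling is what turns a (num_nodes × features) matrix into one fixed-length, permutation-invariant vector.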

  • @brandomiranda6703
    @brandomiranda6703 3 years ago

    What is the best GNN library for PyTorch as of now (2021)?

    • @mhadnanali
      @mhadnanali 3 years ago

      I have the same question.

  • @user-uq7ri1pz2c
    @user-uq7ri1pz2c 3 years ago

    Hey, what about the other seminar tapes from CS224W?

    • @lindseyai4843
      @lindseyai4843  3 years ago +3

      You can find them here:
      snap.stanford.edu/class/cs224w-videos-2019/

    • @aidasadeghi2147
      @aidasadeghi2147 10 months ago

      Hey, the link is private. Is there a public one? @lindseyai4843

    • @pablobanchero3812
      @pablobanchero3812 6 months ago

      @lindseyai4843 Can you tell us the username and password?

  • @vincentedeh3926
    @vincentedeh3926 3 years ago

    How do I download the code used in the presentation?

  • @tongliu5755
    @tongliu5755 4 years ago +4

    Bookmarking this.

  • @expectopatronum2784
    @expectopatronum2784 3 months ago