*My takeaways:*
1. Background 0:48
2. Graph neural networks (GNN) and neural message passing 6:35
- Gated GNN 26:35
- Graph convolutional networks 29:27
3. Expressing GGNNs as matrix operations 33:36
4. GNN application examples 41:25
5. Other models as special cases of GNNs 47:53
6. ML in practice 49:28
Can you help with this: what is an MLP in "you multiply it with a single layer MLP" @23:29?
@@shawnz9833 Multilayer perceptron
@@leixun Cool, thank you mate!
@@shawnz9833 You are welcome mate, I am doing some deep learning research as well. You are welcome to check out our research talks on my YouTube channel.
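For anyone else landing on this thread: a single-layer MLP is presumably just one learned affine map followed by a nonlinearity. A minimal PyTorch sketch, with a made-up dimension of 8 (not a value from the talk):

```python
import torch
import torch.nn as nn

# A "single-layer MLP": one linear (affine) layer plus a nonlinearity.
single_layer_mlp = nn.Sequential(nn.Linear(8, 8), nn.ReLU())

out = single_layer_mlp(torch.rand(3, 8))   # e.g. 3 node states in, 3 new states out
```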
"Spherical Cow" - funniest analogy yet for a Neural Net layers. Great talk
Don't know why people are criticizing this video and the audience. Great introduction to graph neural networks!
So GNNs are basically something like calculating word embeddings in NLP. We have a dataset describing the relationships between pairs of words (nodes), and we want a vector representation that reflects how often they co-occur (the weight of the edge between the nodes), i.e., how related the two words are. Once we have such vectors, we can build a vanilla, recurrent, or convolutional neural net to learn a mapping between the vectors and the output we desire.
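A minimal sketch of that intuition, with a made-up 3-word graph and random initial vectors (everything here is hypothetical, just to illustrate one aggregation round):

```python
import numpy as np

# One round of neighborhood aggregation: each word's new vector mixes its own
# state with a co-occurrence-weighted sum of its neighbours' states.
A = np.array([[0.0, 0.9, 0.1],    # toy symmetric co-occurrence weights
              [0.9, 0.0, 0.5],
              [0.1, 0.5, 0.0]])
H = np.random.rand(3, 4)          # 3 words (nodes), 4-dim initial embeddings

H_new = np.tanh(H + A @ H)        # updated embeddings a downstream net could consume
```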
At 34:11, the product of matrix A and matrix N should be [ b + c ; c ; 0 ].
Actually, what we use here are the incoming edges (see 14:55), but it's true that the slide is confusing about that.
It seems that the mistake is in the graph adjacency matrix, because the result vector is correct given the drawing of the graph.
That is a mistake in the slide: A should be transposed to describe the incoming edges instead of the outgoing ones.
It's using Einstein notation, not the normal one that we use.
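A tiny NumPy check of the transpose point, on a hypothetical toy graph with edges 2→1, 3→1 and 3→2, chosen so that aggregating over incoming edges reproduces the [ b + c ; c ; 0 ] vector from the first comment above (this may not be the exact graph on the slide):

```python
import numpy as np

# Convention: A[i, j] = 1 iff there is a directed edge i -> j.
A = np.array([[0, 0, 0],
              [1, 0, 0],
              [1, 1, 0]], dtype=float)

a, b, c = 1.0, 2.0, 3.0
N = np.array([[a], [b], [c]])    # one state per node

print(A @ N)      # [[0], [a], [a + b]] -- sums over *outgoing* edges
print(A.T @ N)    # [[b + c], [c], [0]] -- sums over *incoming* edges, as argued above
```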
There are many mistakes and confusing statements in this presentation; no wonder the audience keeps asking questions. Not a good talk at all...
At 16:51, I think he meant "for each node connected to n" (instead of n_j), because from the expression we take all nodes n_j connected to n in order to calculate the new state h_t^n of node n.
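In symbols, a hedged reading of the update being described, where f and g stand for whatever learned update and message functions the talk defines:

```latex
h^{n}_{t+1} \;=\; f\!\Big( h^{n}_{t},\ \sum_{n_j \in \mathcal{N}(n)} g\big( h^{n_j}_{t} \big) \Big)
```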
Actually, I read the original research paper on this network, which established a new field in artificial intelligence. I found slight differences between the speaker's words and the paper, and then it became clear to me that he was speaking about this neural network in general terms. Overall, the audience's behavior was unsatisfactory, and the many interruptions left the speaker confused and disorganized. The last thing I would like to say is that this field originated in a research paper written in 2008 and published in 2009. The researchers described it as a network combining the features and strengths of RNNs with ideas from Markov models, wrapped in the concepts of graph theory.
Great talk! The audience questions were helpful, but I felt there were a bit too many of them, to the point that they negatively affected the flow of the talk.
Awesome talk! The MSR audience asked quite a few questions, which are actually helpful, e.g., what GNNs are, how they work and update, why they are created and designed this way, etc.
The presenter's explanations and use of high-level diagrams were phenomenal. Questions from the audience definitely messed up the flow of the presentation quite a bit, though.
Great Introduction!
This is why you have the Q&A at the end of the presentation.
I don't understand why a GRU is used. The input to a GRU is a (Nodes x Features) matrix, so where is the temporal dimension?
He explains it using time progression, which caused some confusion for the audience and for me.
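My understanding, sketched in PyTorch with made-up sizes: there is no sequence axis at all; the propagation step plays the role of time, and the nodes form the batch dimension of the GRU cell.

```python
import torch
import torch.nn as nn

num_nodes, D = 4, 8                      # hypothetical sizes, not from the talk
cell = nn.GRUCell(input_size=D, hidden_size=D)

h = torch.zeros(num_nodes, D)            # node states h_t, one row per node
A = (torch.rand(num_nodes, num_nodes) < 0.3).float()  # random toy adjacency

for t in range(3):                       # message-passing steps act as "time"
    messages = A.t() @ h                 # aggregate states over incoming edges
    h = cell(messages, h)                # GRU update; nodes are the batch dim
```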
Wow, what an excellent presentation, from someone with an ML background. Explains the basics a bit but also covers deep concepts. Super clear graphics! Seriously whoever made the graphics for this can I hire you to do my slide graphics? And thought it was very cool that the lecture attendees were bold enough to ask so many questions! Wish people asked more questions during my lectures+talks.
29:35, about GCNs: he said you multiply the sum of the messages with your own state, but in the equation it is a sum. I didn't get which one is correct.
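For what it's worth, in the standard GCN layer as I understand it (hedging on the talk's exact notation), your own state enters through the sum via a self-loop, and the only multiplication is by the shared weight matrix:

```latex
H^{(t+1)} = \sigma\!\left( \hat{A}\, H^{(t)}\, W \right),
\qquad \hat{A} = A + I \ \text{(often degree-normalized)}
```

So both readings can be reconciled: the aggregation over neighbors (plus yourself) is a sum, and the "multiply" is the linear transform W.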
Excellent explanation
Setting aside that the audience questions were irritating (to put it mildly), bombarding the speaker during his intro with questions that could reasonably be expected to be answered over the course of a 1-hour talk, why would the speaker give a talk on one of the most advanced neural network architectures to an audience without any machine learning background?
You are right. I was expecting to pick up the GNN concepts quickly, but the audience keeps asking irritating questions, so I constantly have to hit the skip button.
I agree. I do see the point of giving this lecture to an audience without previous exposure to ML, if the purpose is to attract them to the subject, but in that case there should have been another video of the same lecture without so much interruption. It would take extra time and effort, but for people who are trying to learn GNNs effectively and have some basic ML knowledge, these questions are very annoying and hinder the learning experience.
The audience's questions are absolutely valid. If they bother you, there are plenty of other videos without an audience that you could watch.
@@robbat1209 This comment is absolutely valid. If it bothers you, there are plenty of other comments that you could read.
This talk is from 4 years ago. It was one of the only sources about GNNs, from before GenAI video summaries allowed viewers like myself to comfortably skip ahead without fear of missing important information.
@@MobileComputing 🤣🤣 true, my bad. The questions weren’t too horrible tho
Let the speaker talk!
Amazing talk!
OMG! For a second I thought he looked like the CEO of Google and was wondering to myself: why would the CEO of Google do a presentation about Neural Networks AT MICROSOFT!!
The lady at 40:00 was right... I was also really confused; all the matrix operations seemed to be invalid unless swapped... lol, what kind of inverted conventions are these...
35:40 I think the dimensionality of M should be (num_nodes x D), unless D==M.
EDIT: from what follows, it should be M = HE, and D can be different from M.
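Spelling out the shapes in my reading of the slide (with the talk's symbols):

```latex
H \in \mathbb{R}^{\text{num\_nodes} \times D}, \qquad
E \in \mathbb{R}^{D \times M} \qquad\Longrightarrow\qquad
HE \in \mathbb{R}^{\text{num\_nodes} \times M}
```

so the message dimension M is free to differ from the state dimension D, which is presumably also the M asked about at 36:53 below.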
Discussion on the actual topic starts at ~6:40.
I am pretty sure this is a great talk, but unfortunately all the questions in between disturb the flow a lot (also because most of them are hard to understand acoustically).
36:53 what is M in the shape (num_nodes by M)?
Very good presentation, but it is very difficult to follow with all the interrupting questions
So a GNN is just message passing on a graph, or did I miss something? This has been around since way back, hasn't it?
There is no intuitive explanation, but it is quite informative.
The talk is very interesting; however, the interruptions from the audience are quite disturbing.
The notation is incomplete or incorrect in so many places in the presentation that it was hard to follow.
And for edge classification?
Can we propagate messages depending on the edge features? For example, if the distance from node n to m is greater than their distance to p, can we propagate the message to p first and then to m?
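Standard GNNs update all nodes synchronously, so I don't think the propagation is literally ordered, but you can get a similar effect by conditioning the message on the edge feature. A hedged sketch (the node names, distances, and decay function are all made up):

```python
import numpy as np

def message_weight(distance, scale=1.0):
    # Nearer edges contribute more strongly in each synchronous step,
    # approximating "closer nodes are heard first".
    return np.exp(-distance / scale)

h = {'n': np.ones(4), 'p': np.zeros(4), 'm': np.zeros(4)}   # toy node states
dist = {('n', 'p'): 1.0, ('n', 'm'): 3.0}                   # toy edge features

for (src, dst), d in dist.items():
    h[dst] = h[dst] + message_weight(d) * h[src]
```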
dat cough frequency suspiciously high. Distance thyself socially sir.
Don't worry, it's from November 2019
@@maloxi1472 COVID was already spreading then, right? I hope he's ok... wherever he is now...
He is maybe THE patient zero
bless him
@@danielliu9616 oh shii
Excellent introduction, thanks a lot!
Are those actual MS employees in the crowd? They are worse than first-year CS students.
Is there any way to get access to the slides? Great talk! Thanks.
34:46: it is NOT A * N, it is N' * A...
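If I'm reading the conventions right, the N' * A form here and the (A^T) * N form suggested elsewhere in the comments differ only by an overall transpose, i.e. by whether node states live in rows or columns:

```latex
N^{\top} A = \left( A^{\top} N \right)^{\top}
```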
Need more tutorials on GNNs.
Where can we get the slide deck please?
Can you share the slides please. I like them.
Great presentation. But I have to point out something: I have no idea why you would use Einstein notation instead of simple matrix multiplication. It raises unnecessary confusion and it's not related to GNNs.
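For anyone thrown by the notation, the two are equivalent; a quick NumPy sanity check with made-up shapes:

```python
import numpy as np

# Summing over the repeated index j in A_ij H_jk is exactly matrix multiplication.
A = np.random.rand(3, 3)
H = np.random.rand(3, 5)

assert np.allclose(np.einsum('ij,jk->ik', A, H), A @ H)
```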
I am a researcher. The video contains beautiful concepts; I like it very much :) especially the binary classification part. I am so excited about these concepts!
Honestly, some of the audience members who raised questions have quite big egos and no idea what they are talking about.
What is the dimension M for msg_to_be_sent and received_messages, etc.? I get that D is the dimension of the node representation, N the num_nodes, etc.
At 35:51, the adjacency matrix A should also depend on the edge type k, imo.
OK, the presenter confirmed this shortly after...
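Right. One hedged way to write the per-edge-type version, with A_k the adjacency matrix restricted to edges of type k and E_k the corresponding message weights (consistent with the H and E shapes discussed above):

```latex
M \;=\; \sum_{k} A_k^{\top} \, H \, E_k
```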
Is this a 2016 talk?
OH MY GOD that audience
I had to stop watching because of them
I wanted to believe it wasn’t true.
could you please post the ppt here? Thanks
I found the slides for everyone who asked: miltos.allamanis.com/files/slides/2020gnn.pdf
(I don't know if I'm allowed to post the link here; if not, sorry about that, I'll delete my comment. Just let me know.)
You can let it remain here :)
Great session!
Horror crowd. This is something I see in every Microsoft talk.
Too many questions; just wait for the speaker, please.
In formulating their questions they re-explained what was going on an order of magnitude better than the speaker. That's kinda sad.
@@pharofx5884 I just watched the version from 2 years ago. It's only 18 minutes long and almost identical in content, yet much clearer. Really sad to see.
It would be great if you could also publish the slides!
Slides from the presenter's website: miltos.allamanis.com/files/slides/2020gnn.pdf
@@mayankgolhar8761 thanks
Is that SPJ I hear in the audience at 9:18?
How to turn off questions?
You can tell this lecture was recorded at the height of COVID from the constant coughing of the audience (and the speaker).
Seems several people were not healthy
good
Where are the inputs and outputs?
Clearly you would've been one of the people asking foolish questions they could have answered using Google.
34:22 - is it vector-matrix multiplication? If so, the result is wrong, I guess.
@edit: Matrix A should have ones under the diagonal, not above; then the result is as presented.
It should be (A^T)*N
Some good person should take this video and remove all the awful questions from the audience
Is this related to bread baking?
At least it saves time in doing strategy.
RIP ears. WTF with the coughing. At least use some compression on the vocal audio, omg.
People are not happy about the many questions. However, I'm kinda sad that he doesn't restate the questions before answering :( like, why?
I hear Simon Peyton Jones in the audience
Easily recognizable indeed!
That's why software engineers should not teach: you assume everything is a design/modeling detail when in reality these details are part of the mathematics behind the models. And I seriously miss the old days when professors taught with chalk and a board.
Wondering how many people had covid in that recording...
I don't understand the praise in the comment section. I actually found it kind of sloppy, with typos, but the audience and the questions are really great.
I found the lecture’s atmosphere dull and depressing. It seems that the lecturer was forced to give the lecture!
Unfortunately many good researchers can’t present their work well.
Great video but annoying audience.
People need to remember they're watching a free video on YouTube... it's not your advanced ML private tutoring session...
👍
The audience ruined this presentation. I have never felt worse for a presenter.
Horrible audience, great talk!
the interruptions are so annoying...
Kind of confusing for me. And the audience is very annoying.
So that's machine learning! Haha, lol
An aromatic ring is not a "single bond" next to a "double bond." The bonds are in a single resonance state. Treating them with graph theory is not supported by current models.
that audience was pretty annoying tbh
The coughing totally ruined the presentation.
Are GNNs patented? Does anyone know if using a particular ANN construct can be subject to litigation?
++
The audience needs to take a freaking ML 101 class before asking stupid questions
Appalling talk! It shows why coders are terrible at public speaking and often fail to explain things transparently. Before explaining how message passing is done in an end-to-end learning architecture, he jumped to Gated GNNs, leaving the impression that the GRU is an essential part of GNNs. This is one of the reasons there were so many questions and so much confusion surrounding his lecture... What is h_t? "Well, this is not something that changes"... seriously, Microsoft!
awful introduction
Word salad. A hopeless mess of talking at and around a topic without actually touching it.
Take it down. Tell the guy to try again.
Slides can be found at: miltos.allamanis.com/files/slides/2020gnn.pdf