Graph based neural network
Graph neural networks (GNNs) can be built up from the basics of message passing, arriving at the equation behind a popular architecture known as the graph convolutional network, or GCN, and a basic understanding of how that equation produces node embeddings for machine learning tasks. The historical development of graph neural networks is actually very interesting, because they were developed somewhat independently in the world of chemistry. The basic idea of two-step message passing is this: given a target node, you first look at its neighbors (nodes b, c, and d in the running example) and gather from them whatever information you consider relevant to your problem.
- Aggregate = pass information (the “message”) from a target node’s neighbors to the target node
- Update = update each node's features based on the "message" to form an embedded representation
Generally, this proceeds node-by-node, where:
- H = node features/embeddings
- K = number of hops
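In equation form, the two steps compose as follows (a generic sketch using the standard message passing notation; AGGREGATE and UPDATE are the conventional placeholder names, not defined above):

$$
h_v^{(k)} = \text{UPDATE}\Big( h_v^{(k-1)},\; \text{AGGREGATE}\big(\{\, h_u^{(k-1)} : u \in \mathcal{N}(v) \,\}\big) \Big), \qquad h_v^{(0)} = x_v
$$

where $\mathcal{N}(v)$ is the set of $v$'s immediate neighbors; after $K$ rounds, each node's embedding reflects information from up to $K$ hops away.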
Read more:: An Introduction to Graph Neural Networks (Basics and DeepWalks)
The aggregate function operates over sets, so it must be permutation invariant (or permutation equivariant). Neural message passing uses neural networks for the update and aggregate functions.
A simple aggregate function is a sum over all of the neighboring nodes' representations, multiplied by a weight matrix. The update then takes a separate weighting of the target node's own representation and, because this is a neural network, adds some non-linearity via an activation function. The result becomes the target node's updated representation, and this is a very intuitive idea.
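As a concrete illustration, here is a minimal NumPy sketch of that update (the names `message_passing_layer`, `W_self`, and `W_neigh`, and the choice of tanh as the activation, are illustrative assumptions, not fixed by the text):

```python
import numpy as np

def message_passing_layer(A, H, W_self, W_neigh):
    """One round of basic neural message passing (a sketch).

    A       : (n, n) binary adjacency matrix
    H       : (n, d_in) node representations from the previous hop
    W_self  : (d_in, d_out) weights on each node's own representation
    W_neigh : (d_in, d_out) weights on the aggregated neighbor sum
    """
    neigh_sum = A @ H  # aggregate: row v of A selects v's neighbors, so this sums them
    return np.tanh(H @ W_self + neigh_sum @ W_neigh)  # update: weight both terms, add non-linearity
```

Because `A @ H` computes the neighbor sum for every node at once, and summation does not depend on the order of neighbors, this aggregate is permutation invariant as required.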
Each node's updated value is therefore some weighting of its previous value combined with a weighting of its neighbors' values. The sum over neighbors is only one choice of aggregate: you could take the maximum instead, or concatenate the neighbor representations together and multiply by some weighting matrix. The neural message passing algorithm above can also be simplified a little, which matters because it is somewhat awkward to treat the target node and the neighboring nodes asymmetrically, with an entire weight matrix dedicated to the target node and a separate weight matrix for the neighbors.
Neural message passing can be simplified to be efficient and generalizable by sharing parameters.
Collapse W_self and W_neigh into a single W by adding self-loops to the adjacency matrix A. This reduces message passing to relatively simple matrix multiplication.
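Continuing the earlier sketch, the self-loop trick looks like this (again a minimal illustration under the same assumptions, not a reference implementation):

```python
import numpy as np

def simplified_layer(A, H, W):
    # Adding the identity puts a self-loop on every node, folding the target
    # node into its own neighborhood, so one shared W replaces W_self and W_neigh.
    A_hat = A + np.eye(A.shape[0])
    return np.tanh(A_hat @ H @ W)
```

Stacking K such layers, each with its own W, gives each node an embedding informed by its K-hop neighborhood.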
Graph based neural networks can also be approached from a 'convolution' perspective:
Here you generalize a convolution on an image to an operation that works on a graph, or graph-structured data. Looking at the original GNN algorithm, which (I believe) was based on the message passing idea, the equations look very similar:
Original GNN (Merkwirth, 2005 + Scarselli et al., 2009)
The graph convolutional network came at this from a convolutional perspective, and really the only small difference is in how you define the adjacency matrix: you simply normalize by the number of nodes in the neighborhood of each target node.
GCN (Kipf + Welling, 2016)
Normalizes by # of nodes in neighborhood
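Concretely, the Kipf + Welling paper normalizes the self-looped adjacency matrix symmetrically by node degree, which is a slightly stronger form of "dividing by the number of neighbors." A minimal sketch, with tanh again standing in for an arbitrary activation:

```python
import numpy as np

def gcn_layer(A, H, W):
    A_hat = A + np.eye(A.shape[0])          # add self-loops
    deg = A_hat.sum(axis=1)                 # neighborhood size (degree) per node
    D_inv_sqrt = np.diag(1.0 / np.sqrt(deg))
    # Symmetric normalization D^{-1/2} A_hat D^{-1/2}, as in Kipf + Welling (2016)
    return np.tanh(D_inv_sqrt @ A_hat @ D_inv_sqrt @ H @ W)
```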
The development and application of graph based neural network models is a new and active area of research:
At the major machine learning conferences, and even in some engineering journals, you will now see people taking graph neural network models, applying them to new and novel problems, and doing better than previous models have ever done. I think that is a fascinating thing: if you are in research, or considering a PhD or a master's that involves research, I would highly suggest looking into graph neural networks and seeing what new things you can do with them. A short historical sketch follows.
In 2018 the graph attention network model was proposed, one of the first models to use attention as a graph convolutional operation. You don't have to worry about the mathematical details here, but the model learns, end to end, the weighting between every node and its neighboring nodes. In that way it is fundamentally different from a graph convolutional network, where you stack layers and multiply by a weight matrix that is fixed per layer rather than learned per node pair. The paper is very interesting, and you should all look at it.
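For reference only, the learned weighting in the graph attention network paper (Veličković et al., 2018) takes roughly this form, where $W$ is a shared weight matrix, $a$ a learned attention vector, and $\|$ denotes concatenation:

$$
\alpha_{ij} = \operatorname{softmax}_j\!\Big( \text{LeakyReLU}\big( a^{\top} [\, W h_i \,\|\, W h_j \,] \big) \Big), \qquad
h_i' = \sigma\Big( \sum_{j \in \mathcal{N}(i)} \alpha_{ij}\, W h_j \Big)
$$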
Simple Graph Convolution (SGC) followed in 2019; check these papers out if you want to learn more about graph neural networks. For a deeper treatment, William Hamilton literally wrote the book on GNNs, and a lot of the material here is based on chapters from that book.
Read more:: WNet segmentation for unsupervised images