Temporal Graph Network, or TGN, is a framework for deep learning on dynamic graphs represented as sequences of timed events. The memory (state) of the model at time $t$ consists of a vector $\mathbf{s}_i(t)$ for each node $i$ the model has seen so far. A node's memory is updated after each event involving it (e.g. an interaction with another node or a node-wise change), and its purpose is to represent the node's history in a compressed form. Thanks to this memory module, TGNs can memorize long-term dependencies for each node in the graph. When a new node is encountered, its memory is initialized as the zero vector; it is then updated for every event involving the node, even after the model has finished training.
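A minimal sketch of the memory behavior described above: zero-vector initialization for unseen nodes and a per-event state update. All names (`TGNMemory`, `update`, the toy update rule) are illustrative assumptions, not the authors' implementation; a real TGN uses a learned message function and a learnable recurrent update (e.g. a GRU).

```python
import numpy as np

class TGNMemory:
    """Illustrative sketch of a TGN-style memory module (not the paper's code)."""

    def __init__(self, dim):
        self.dim = dim
        self.state = {}  # node id -> memory vector s_i(t)
        # Toy update parameters; a real TGN learns these end-to-end.
        rng = np.random.default_rng(0)
        self.W = rng.standard_normal((dim, 2 * dim)) * 0.1

    def get(self, node):
        # A newly encountered node starts with a zero memory vector.
        return self.state.setdefault(node, np.zeros(self.dim))

    def update(self, node, message):
        # Compress the new event into the node's running state.
        s = self.get(node)
        self.state[node] = np.tanh(self.W @ np.concatenate([s, message]))

mem = TGNMemory(dim=4)
print(mem.get(7))          # zero vector: node 7 has not been seen before
mem.update(7, np.ones(4))  # an event involving node 7 updates its memory
print(mem.state[7])        # compressed state, no longer all zeros
```

Note that `update` can keep being called at inference time: the memory is state, not a trained parameter, so it continues to evolve after training ends.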
Source: Temporal Graph Networks for Deep Learning on Dynamic Graphs
| Task | Papers | Share |
|---|---|---|
| Link Prediction | 3 | 15.00% |
| Recommendation Systems | 2 | 10.00% |
| Dynamic Link Prediction | 2 | 10.00% |
| Node Classification | 2 | 10.00% |
| Anomaly Detection | 1 | 5.00% |
| Fraud Detection | 1 | 5.00% |
| Graph Anomaly Detection | 1 | 5.00% |
| Graph Embedding | 1 | 5.00% |
| Graph Representation Learning | 1 | 5.00% |