---
size_categories:
- 10B<n<100B
tags:
- croissant
---
# Towards Neural Scaling Laws for Foundation Models on Temporal Graphs

This repository provides the implementation of the TGS foundation model benchmark and includes links to temporal networks suitable for foundation model training. TGS introduces a training process for foundation models using various real-world temporal networks, enabling prediction on previously unseen networks.

## Overview

Temporal graph learning focuses on predicting future interactions from evolving network data. Our study addresses whether it is possible to predict the evolution of an unseen network within the same domain using observed temporal graphs. We introduce the Temporal Graph Scaling (TGS) dataset, comprising 84 ERC20 token transaction networks collected from 2017 to 2023. To evaluate transferability, we pre-train Temporal Graph Neural Networks (TGNNs) on up to 64 token transaction networks and assess their performance on 20 unseen token types. Our findings reveal that the neural scaling law observed in NLP and computer vision also applies to temporal graph learning: pre-training on more networks with more parameters improves downstream performance. This is the first empirical demonstration of temporal graph transferability. Notably, the largest pre-trained model surpasses fine-tuned TGNNs on unseen test networks, marking a significant step towards building foundation models for temporal graphs. The code and datasets are publicly available.
![](https://github.com/benjaminnNgo/ScalingTGNs/blob/main/pic/htgn-log2-all-v3-1.png)

*TGS foundation model performance on unseen networks*

### Dataset

All extracted transaction networks required for foundation model training can be downloaded [here](https://zenodo.org/doi/10.5281/zenodo.11455827).
The TGS dataset's metadata, in the standard ML Croissant format, is also available [here](https://huggingface.co/datasets/ntgbaoo/Temporal_Graph_Scaling_TGS_Benchmark).

The TGS dataset extraction includes:
(1) Token Extraction: extracting the token transaction network from our P2P Ethereum live node.

(2) Discretizing: creating weekly snapshots for the Discretized Temporal Directed Graph (DTDG) setting.

(3) Labeling: assigning labels based on network growth; increasing trends are labeled one, decreasing trends are labeled zero.
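The discretizing and labeling steps can be sketched roughly as below. This is a minimal illustration using pandas; the `timestamp` column name, the transaction-count growth criterion, and the function name are assumptions for the sketch, not the repository's actual schema or code.

```python
import pandas as pd

def weekly_snapshots_with_labels(transactions):
    """Bucket transactions into weekly DTDG snapshots and label each week.

    `transactions` is assumed to be a DataFrame with a datetime `timestamp`
    column and one row per token transfer (illustrative schema).
    """
    # Discretize: one snapshot per calendar week.
    weekly = transactions.groupby(pd.Grouper(key="timestamp", freq="W")).size()
    snapshots = weekly.rename("num_transactions").reset_index()
    # Label: 1 if the network grew versus the previous week, else 0.
    growth = snapshots["num_transactions"].diff()
    snapshots["label"] = (growth > 0).astype(int)
    return snapshots
```

Here network growth is proxied by the weekly transaction count; the actual pipeline may use a different growth measure.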
![](https://github.com/benjaminnNgo/ScalingTGNs/blob/main/pic/Data_Processing_V1.png)

*TGS dataset extraction*

### Benchmark Implementation

TGS transaction networks are divided randomly into train and test sets. The train set is used to train foundation models of different sizes; the trained models are then evaluated on the test set.
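The random split can be sketched as follows. This is an illustrative snippet, not the repository's code: the placeholder network names, the fixed seed, and the 64/20 train/test counts simply mirror the setup described above.

```python
import random

def split_networks(network_names, num_test=20, seed=0):
    """Randomly partition token networks into train and test sets."""
    rng = random.Random(seed)        # fixed seed for a reproducible split
    shuffled = list(network_names)
    rng.shuffle(shuffled)
    return shuffled[num_test:], shuffled[:num_test]

# Example: 84 token networks, 20 held out as unseen test networks.
train, test = split_networks([f"token_{i}" for i in range(84)])
```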
![](https://github.com/benjaminnNgo/ScalingTGNs/blob/main/pic/Foundation_training_vf.png)

*TGS foundation model training overview*

### Prerequisites

- Python 3.6+
- Libraries listed in `installed_packages.txt`