# GaAN: Gated Attention Networks for Learning on Large and Spatiotemporal Graphs

[GaAN](https://arxiv.org/abs/1803.07294) is a neural network architecture for machine learning on graphs. It introduces a gated attention mechanism that controls the importance of each attention head with a convolutional sub-network. Based on PGL, we reproduce the GaAN algorithm and train the model on [ogbn-proteins](https://ogb.stanford.edu/docs/nodeprop/#ogbn-proteins).
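
At a high level, each GaAN layer runs multi-head dot-product attention over a node's neighbors and then scales each head's output by a learned gate computed from the center node together with max- and mean-pooled neighbor summaries. The snippet below is a minimal NumPy sketch of this aggregation for a single node; the function name `gaan_aggregate` and the projection matrices `Wa`, `Wv`, `Wm`, `Wg`, `Wo` are illustrative assumptions and do not mirror the repo's PGL implementation.

```python
# Minimal NumPy sketch of GaAN's gated multi-head aggregation for one node.
# Shapes and weight names are illustrative, not the repo's actual code.
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def gaan_aggregate(x_i, z_neigh, Wa, Wv, Wm, Wg, Wo, heads):
    """x_i: (d,) center-node feature; z_neigh: (n, d) neighbor features."""
    outs = []
    for k in range(heads):
        q = Wa[k] @ x_i                   # query (hidden_size_a,)
        keys = z_neigh @ Wa[k].T          # keys  (n, hidden_size_a)
        attn = softmax(keys @ q)          # attention weights over neighbors
        vals = z_neigh @ Wv[k].T          # values (n, hidden_size_v)
        outs.append(attn @ vals)          # attended value for head k
    # Gates: one scalar per head, from the center feature plus
    # max-pooled (projected by Wm) and mean-pooled neighbor summaries.
    pooled = np.concatenate([x_i, (z_neigh @ Wm.T).max(axis=0), z_neigh.mean(axis=0)])
    gates = 1.0 / (1.0 + np.exp(-(Wg @ pooled)))       # sigmoid, shape (heads,)
    gated = np.concatenate([g * o for g, o in zip(gates, outs)])
    return Wo @ np.concatenate([x_i, gated])           # final output projection

# Toy usage with random weights: d=16 input features, 4 heads.
rng = np.random.default_rng(0)
d, K, da, dv, dm, do = 16, 4, 8, 8, 8, 16
x_i, z = rng.normal(size=d), rng.normal(size=(5, d))
Wa = rng.normal(size=(K, da, d)); Wv = rng.normal(size=(K, dv, d))
Wm = rng.normal(size=(dm, d)); Wg = rng.normal(size=(K, 2 * d + dm))
Wo = rng.normal(size=(do, d + K * dv))
print(gaan_aggregate(x_i, z, Wa, Wv, Wm, Wg, Wo, K).shape)   # (16,)
```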

## Datasets
The ogbn-proteins dataset will be downloaded into the `./dataset` directory automatically.
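
If you want to pre-fetch the data or inspect it outside of training, the sketch below uses the plain OGB loader. `train.py` may wrap the dataset with a PGL-specific loader, so treat this only as a way to see where the files land and what the raw graph looks like.

```python
# Hedged sketch: fetching ogbn-proteins with the plain OGB loader.
from ogb.nodeproppred import NodePropPredDataset

dataset = NodePropPredDataset(name="ogbn-proteins", root="./dataset")
graph, labels = dataset[0]               # single-graph dataset
split_idx = dataset.get_idx_split()      # train/valid/test node indices
print(graph["edge_index"].shape, labels.shape, len(split_idx["train"]))
```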

## Dependencies
- paddlepaddle
- pgl
- ogb

## How to run
```bash
python train.py --lr 1e-2 --rc 0 --batch_size 1024 --epochs 100
``` 
### Hyperparameters
The flags below are passed to `train.py`; a sketch of how they can be wired together is shown after the list.
- use_gpu: whether to train on GPU
- mini_data: use a small subset of the data to test the code
- epochs: number of training epochs
- lr: learning rate
- rc: regularization coefficient
- log_path: path of the log file
- batch_size: batch size
- heads: number of attention heads
- hidden_size_a: size of the query and key vectors
- hidden_size_v: size of the value vectors
- hidden_size_m: size of the projection space used to compute the gates
- hidden_size_o: output size of the GaAN layer
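
The following is a hedged sketch of how these flags could be declared with `argparse`. The defaults are illustrative assumptions; consult `train.py` for the values and flag types the repo actually uses.

```python
# Illustrative argparse wiring for the hyperparameters listed above.
# Defaults are assumptions; boolean flags are shown as store_true for brevity,
# while the actual script may expect explicit 0/1 values.
import argparse

parser = argparse.ArgumentParser(description="GaAN on ogbn-proteins")
parser.add_argument("--use_gpu", action="store_true", help="train on GPU")
parser.add_argument("--mini_data", action="store_true", help="small subset for smoke tests")
parser.add_argument("--epochs", type=int, default=100)
parser.add_argument("--lr", type=float, default=1e-2)
parser.add_argument("--rc", type=float, default=0.0, help="regularization coefficient")
parser.add_argument("--log_path", type=str, default="./log")
parser.add_argument("--batch_size", type=int, default=1024)
parser.add_argument("--heads", type=int, default=8)
parser.add_argument("--hidden_size_a", type=int, default=24)
parser.add_argument("--hidden_size_v", type=int, default=32)
parser.add_argument("--hidden_size_m", type=int, default=64)
parser.add_argument("--hidden_size_o", type=int, default=128)
args = parser.parse_args()
```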

## Performance
We train the model for 100 epochs and report the mean and standard deviation of the **ROC-AUC** on the test set.

|Dataset|Mean ROC-AUC|Std|
|-|-|-|
|ogbn-proteins|0.7786|0.0048|
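
For reference, the score in the table can be computed from model outputs with OGB's official evaluator. The sketch below uses random placeholder arrays of shape `(num_nodes, 112)` (ogbn-proteins has 112 binary tasks) purely to show the API; real evaluation would pass the model's predicted scores for the test nodes.

```python
# Hedged sketch: computing ROC-AUC with OGB's official evaluator.
# y_true / y_pred are random placeholders, not real predictions.
import numpy as np
from ogb.nodeoproppred import Evaluator  # noqa: typo guard below
from ogb.nodeproppred import Evaluator

rng = np.random.default_rng(0)
evaluator = Evaluator(name="ogbn-proteins")
y_true = rng.integers(0, 2, size=(200, 112))   # placeholder labels
y_pred = rng.random(size=(200, 112))           # placeholder scores
result = evaluator.eval({"y_true": y_true, "y_pred": y_pred})
print(result["rocauc"])
```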