DIGRAM

This repository provides the PyTorch implementation of "Every Hop Etched in Memory: Tokenized Graph Mamba Meets Directed Graph Learning" (DIGRAM).

(Figure: DIGRAM workflow overview)

Requirements

  • Environment:
    • Python == 3.12.3
    • NVIDIA RTX 4090 with CUDA 12.1
  • Package dependencies:
    • PyTorch == 2.3.0
    • PyTorch Geometric == 2.6.1
    • PyTorch Geometric Signed Directed == 1.0.1
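
A minimal installation sketch, assuming pip and the official CUDA 12.1 PyTorch wheel index (the install route is not specified here, so adjust to your setup):

# PyTorch 2.3.0 built against CUDA 12.1 (the wheel index URL is an assumption; adapt to your CUDA version)
pip install torch==2.3.0 --index-url https://download.pytorch.org/whl/cu121

# PyTorch Geometric and the signed/directed extension
pip install torch_geometric==2.6.1 torch-geometric-signed-directed==1.0.1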

How to run

The commands and hyperparameter settings are provided in the run.sh script. The datasets are downloaded automatically from the internet during execution; make sure a folder named data exists in the repository root before running.
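
For example, a minimal sketch for preparing the data directory and launching all experiments (assuming run.sh is executed from the repository root):

# create the data directory expected at runtime, then run every command listed in run.sh
mkdir -p data
bash run.sh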

We evaluate the performance of DIGRAM on two types of downstream tasks.

  • Node classification (NC): inferring the labels of nodes. (train/val/test = 60%/20%/20%)
  • Link prediction: three subtasks are defined according to the edge splitting strategy. (train/val/test = 80%/5%/15%)
    • Existence prediction (EP), which determines the likelihood of an edge existing between two nodes.
    • Direction prediction (DP), which identifies the orientation of a unidirectional edge.
    • Three-class link prediction (3C), which classifies an ordered node pair as positive, reverse, or non-existent.

## Task: Node Classification (NC)

# cora_ml
python3 node_classification.py --dataset cora_ml --q 0 --lr 0.01 --verbose

# citeseer
python3 node_classification.py --dataset citeseer --q 0.1 --lr 0.001 --verbose

# wikics
python3 node_classification.py --dataset wikics --q 0 --lr 0.001 --verbose

# pubmed
python3 node_classification.py --dataset pubmed --q 0.15 --hidden_channels 32 --lr 0.001 --verbose

# -------------------
## Task: Link Existence Prediction (EP)

# cora_ml
python3 link_prediction.py --dataset cora_ml --q 0 --lr 0.01 --task existence --verbose

# citeseer
python3 link_prediction.py --dataset citeseer --q 0 --lr 0.001 --task existence --verbose

# wikics
python3 link_prediction.py --dataset wikics --q 0.1 --lr 0.001 --task existence --verbose

# pubmed
python3 link_prediction.py --dataset pubmed --q 0.15 --lr 0.01 --task existence --verbose

# -------------------
## Task: Link Direction Prediction (DP)

# cora_ml
python3 link_prediction.py --dataset cora_ml --q 0.25 --lr 0.01 --weight_decay 0.5 --task direction --verbose

# citeseer
python3 link_prediction.py --dataset citeseer --q 0.2 --lr 0.01 --weight_decay 0.5 --task direction --verbose

# wikics
python3 link_prediction.py --dataset wikics --q 0.2 --lr 0.01 --weight_decay 0.5 --task direction --verbose

# pubmed
python3 link_prediction.py --dataset pubmed --q 0.25 --lr 0.01 --weight_decay 0.5 --task direction --verbose

# -------------------
## Task: Three-class Link Prediction (3C)

# cora_ml
python3 link_prediction.py --dataset cora_ml --q 0.05 --lr 0.01 --weight_decay 0.25 --task three_class_digraph --verbose

# citeseer
python3 link_prediction.py --dataset citeseer --q 0.05 --lr 0.01 --weight_decay 0.5 --task three_class_digraph --verbose

# wikics
python3 link_prediction.py --dataset wikics --q 0.15 --lr 0.01 --weight_decay 0.25 --task three_class_digraph --verbose

# pubmed
python3 link_prediction.py --dataset pubmed --q 0.2 --lr 0.01 --weight_decay 0.25 --task three_class_digraph --verbose

Results

Experiments are conducted on four publicly available digraph datasets: Cora-ML, CiteSeer, WikiCS, and PubMed.

  • Performance of Node Classification

(Figure: node classification results)

  • Performance of Link Prediction

(Figure: link prediction results)
