
structlearning/mxnet


Clique Number Estimation via Differentiable Functions of Adjacency Matrix Permutations

Environment

All models and code were run on Python 3.10.12 with CUDA 12.1. Install the required libraries:

python3 -m venv mxnet_venv
source mxnet_venv/bin/activate
pip install -r requirements.txt

# Additionally, install DGL to run some baselines
pip install dgl -f https://data.dgl.ai/wheels/torch-2.3/cu121/repo.html
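To confirm the install resolved correctly, a quick sanity check can be run inside the activated venv. This is a convenience sketch, not a script shipped with the repo; it assumes requirements.txt installed a CUDA-enabled PyTorch build.

```shell
# Print the interpreter version the venv wraps (the repo was tested on 3.10.12),
# then confirm torch and dgl import cleanly (falls back to a message if not yet installed).
python3 -c 'import sys; print("python", sys.version.split()[0])'
python3 -c 'import torch, dgl; print("torch", torch.__version__, "| cuda:", torch.cuda.is_available(), "| dgl", dgl.__version__)' || echo "torch/dgl not installed yet"
```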

Data

Download the data from this link and place it in the base directory.

Evaluation

We provide checkpoints for all models at this link. Download the checkpoints zip, unzip it, and run the command below to place the checkpoint directories in the base folder (i.e., abl_weights, best_weights, our_weights, and overall_best_models should all be present in the base directory).

mv checkpoints/* .
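To verify the move succeeded, a small check (a sketch; the directory names are taken from the list above) can loop over the expected folders:

```shell
# Report which of the expected checkpoint directories are present in the base folder
for d in abl_weights best_weights our_weights overall_best_models; do
  [ -d "$d" ] && echo "ok: $d" || echo "missing: $d"
done
```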

All the scripts to evaluate the models on the test data have been added to test_scripts/.

Evaluation of our models

To evaluate the three variants of our model, which differ in their stopping criteria: MxNet (MSS), MxNet (SubMatch), and MxNet (Composite), run:

./test_scripts/our.sh

This evaluates all three variants on all datasets. To run the models trained for the ablation study (MSS loss only or SubMatch loss only), run

./test_scripts/our_abl.sh

Evaluation of baselines

The following code runs all decoder-based baselines on all datasets for 1, 4, and 8 samples.

./test_scripts/difusco.sh # evaluate the two variants of DIFUSCO
./test_scripts/egn.sh # evaluate the Erdos Goes Neural model
./test_scripts/gfnet.sh # evaluate the GFlowNets model
./test_scripts/scat.sh # evaluate the Scattering GCN model

The following code evaluates all non-decoder-based baselines on all datasets:

./test_scripts/nsfe.sh # evaluate the SFE, NSFE, REINFORCE, and Straight-Through models

Evaluation of heuristics

The following code runs the three heuristics (node degree, PageRank, and clustering coefficient) on four decoder variants (Erdos, Scattering, GFNET, and DIFUSCO decoders) for 1, 4, and 8 samples.

./test_scripts/nnb.sh # Evaluate heuristics on all decoders
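Since every evaluation script lives under test_scripts/, the full evaluation suite can also be replayed in one pass with a simple loop. This is a convenience sketch, not a script shipped with the repo:

```shell
# Run every evaluation script in test_scripts/ in sequence;
# the [ -f ] guard skips the literal glob pattern if the directory is empty or absent.
for s in test_scripts/*.sh; do
  [ -f "$s" ] || continue
  echo "=== running $s ==="
  bash "$s"
done
```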

Training

As with the evaluation scripts, all training scripts are provided in training_scripts/.

Training of our models

To train the three variants of our model, which differ in their stopping criteria: MxNet (MSS), MxNet (SubMatch), and MxNet (Composite), run:

./training_scripts/our.sh

To train the ablation models:

./training_scripts/our_abl_mss.sh # train the models with MSS loss only
./training_scripts/our_abl_submatch.sh # train the models with SubMatch loss only

Training of baselines

The following code trains the baselines:

./training_scripts/difusco.sh # train the two variants of DIFUSCO
./training_scripts/egn.sh # train the Erdos Goes Neural model
./training_scripts/gfnet.sh # train the GFlowNets model
./training_scripts/scat.sh # train the Scattering GCN model

The following code trains all non-decoder-based baselines on all datasets:

./training_scripts/nsfe.sh # train the SFE, NSFE, REINFORCE, and Straight-Through models
