pyHGT: Unable to reproduce OGBN-MAG results
Hi HGT authors,
I am not able to reproduce your OGB leaderboard results. I followed your instructions and ran your latest code (commit 9c2182f) 10 times, getting an average test accuracy of 0.4883 with a standard deviation of 0.0053.
The test accuracies of the 10 runs are: 0.4852, 0.4790, 0.4935, 0.4906, 0.4960, 0.4911, 0.4912, 0.4861, 0.4889, 0.4817
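For reference, the mean and standard deviation above can be recomputed from these ten numbers with Python's standard library (the 0.0053 figure corresponds to the sample standard deviation, `statistics.stdev`):

```python
import statistics

# Test accuracies from the 10 runs reported above.
accs = [0.4852, 0.4790, 0.4935, 0.4906, 0.4960,
        0.4911, 0.4912, 0.4861, 0.4889, 0.4817]

mean = statistics.mean(accs)   # average test accuracy
std = statistics.stdev(accs)   # sample standard deviation (ddof = 1)

print(f"{mean:.4f} +/- {std:.4f}")  # prints 0.4883 +/- 0.0053
```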
I was using ogb version 1.2.1, and I made sure evaluation uses variance_reduce for better performance. The commands I used to run your code are the following:
python3 preprocess_ogbn_mag.py --output_dir OGB_MAG.pk
for ((run=0; run<10; run=run+1))
do
    dir_name=model_save_${run}
    python3 train_ogbn_mag.py --n_hid 512 --n_layer 4 --n_heads 8 \
        --data_dir ./OGB_MAG.pk --model_dir ${dir_name} \
        --prev_norm --last_norm --use_RTE --conv_name hgt
    python3 eval_ogbn_mag.py --n_hid 512 --n_layer 4 --n_heads 8 \
        --data_dir ./OGB_MAG.pk --model_dir ${dir_name} \
        --prev_norm --last_norm --use_RTE --conv_name hgt
done
Could you let me know if there is anything I missed?
Thanks! @acbull
About this issue
- State: closed
- Created 4 years ago
- Comments: 18 (7 by maintainers)
Hi, I have the same problem. Following your advice, I ran the commands above 10 times; the test accuracies with variance_reduce (VR) evaluation are: 0.491, 0.485, 0.486, 0.482, 0.488, 0.487, 0.486, 0.487, 0.488, 0.485
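The mean and spread of these ten runs can be checked the same way with the standard library:

```python
import statistics

# Test accuracies from the 10 runs in this comment.
accs = [0.491, 0.485, 0.486, 0.482, 0.488,
        0.487, 0.486, 0.487, 0.488, 0.485]

print(f"{statistics.mean(accs):.4f} +/- {statistics.stdev(accs):.4f}")
# prints 0.4865 +/- 0.0024
```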
The training log of the best run is here: https://drive.google.com/file/d/10lUs1AXJOKTlvQVZHJHBedwQSlN3lF0d/view?usp=sharing
Is there anything I can do to reproduce your result? Thanks.