
CoCoNut-Artifact

Artifact repository for "CoCoNuT: Combining Context-Aware Neural Translation Models using Ensemble for Program Repair."

If you use CoCoNuT for academic purposes, please cite the following publication:

@inproceedings{lutellier2020coconut,
  title={CoCoNuT: Combining Context-Aware Neural Translation Models using Ensemble for Program Repair},
  author={Lutellier, Thibaud and Pham, Viet Hung and Pang, Lawrence and Li, Yitong and Wei, Moshi and Tan, Lin},
  booktitle={Proceedings of the 28th ACM SIGSOFT International Symposium on Software Testing and Analysis},
  year={2020}
}

The Results directory contains the correct patches generated by CoCoNuT.

The fairseq-context directory contains a modified version of fairseq that additionally implements the context-aware architecture.

To use it, you can follow https://fairseq.readthedocs.io/en/latest/models.html#module-fairseq.models.fconv. To train the new context-aware model, add the "--use-context" argument and select "fconv_context" as the architecture.

For example:

python $fairseq_dir/train.py \
    --use-context --fp16 --save-dir $trg_dir/model --arch fconv_context \
    --distributed-world-size 1 \
    --encoder-embed-dim 250 --decoder-embed-dim 250 --decoder-out-embed-dim 250 \
    --encoder-layers '[(256,3)] * 7' --decoder-layers '[(256,3)] * 7' \
    --dropout 0.2 --clip-norm 0.1 --lr 0.25 --min-lr 1e-4 --momentum 0.99 \
    --max-epoch 1 --batch-size 32 \
    $trg_dir/bin
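The $trg_dir/bin argument above is a binarized dataset produced by fairseq's preprocessing step, and the checkpoints saved under $trg_dir/model can then be used to generate candidate patches. The commands below are a minimal sketch assuming the stock preprocess.py and generate.py scripts shipped with fairseq; the file prefixes, beam/nbest values, and output path are illustrative, and the context input may require additional fork-specific options (see fairseq-context).

# Binarize tokenized source (buggy line) / target (fixed line) pairs.
# Standard fairseq preprocessing; how the context input is handled may
# differ in fairseq-context, so treat this as a sketch, not the exact
# artifact pipeline.
python $fairseq_dir/preprocess.py --source-lang src --target-lang trg \
    --trainpref $data_dir/train --validpref $data_dir/valid --testpref $data_dir/test \
    --destdir $trg_dir/bin

# Rank candidate patches with beam search from the trained checkpoint.
# The context-aware model may also need --use-context here, depending on the fork.
python $fairseq_dir/generate.py $trg_dir/bin \
    --path $trg_dir/model/checkpoint_best.pt \
    --beam 100 --nbest 100 --batch-size 32 > $trg_dir/candidates.out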
