modit-team/MODIT

MODIT: on Multi-mOdal learning of eDITing source code.

This is the official code and data repository for MODIT.

Requirements:

  1. python 3.6
  2. pytorch 1.5.1
  3. Cuda compilation tools, release 10.1, V10.1.243

For Modit:

  1. fairseq==0.9.0
  2. Apex

For Baselines:

  1. transformers==2.6.0
  2. tokenizers
  3. tree-sitter

Auxiliary packages are listed in requirements.txt. Please make sure all of these packages are installed before running any of the scripts below.
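The pinned dependencies above might be installed roughly as follows. This is an illustrative sketch, not part of the repository: it assumes pip with Python 3.6 active, and the exact PyTorch 1.5.1 wheel for your CUDA 10.1 setup may require a platform-specific index.

```shell
# Illustrative install sequence; version pins come from the
# Requirements lists above. Adjust the PyTorch wheel for your
# CUDA 10.1 / platform combination.
pip install torch==1.5.1
pip install fairseq==0.9.0                               # for MODIT
pip install transformers==2.6.0 tokenizers tree-sitter   # for baselines
pip install -r requirements.txt                          # auxiliary packages
# Apex has no simple pip package; it is built from source:
# https://github.com/NVIDIA/apex
```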

Setup

Run setup.sh to setup the environment and download pre-processed dataset and models.

Experiments

Run scripts/run-all-experiments.sh to run the experiments reported in the paper.
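Putting the two steps above together, a full reproduction run might look like this. Only the two script paths come from this README; everything else (shell, working directory at the repository root) is an assumption.

```shell
# Sketch of an end-to-end reproduction, assuming a Unix shell
# at the repository root and the scripts named in this README.
bash setup.sh                        # set up environment, fetch data and models
bash scripts/run-all-experiments.sh  # run all experiments reported in the paper
```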

Figure: Different multi-encoder architectures.

Results

RQ1

How does MODIT perform in predicting the correct patch?

Figure: RQ1 results.

RQ2

What is the contribution of each input modality to MODIT's performance?

Figure: RQ2 results.

RQ3

What is the best strategy to encode input modalities?

Figure: RQ3 results.

Acknowledgement

A large portion of the code in this repository is borrowed from the PLBART, CodeXGLUE, and CodeBERT repositories. We cordially thank the authors of these repositories for open-sourcing their work.
