Implementation of "Automatic Source Code Summarization with Extended Tree-LSTM"

sh1doy/summarization_tf

Attention-based Tree-to-Sequence Code Summarization Model

A TensorFlow Eager Execution implementation of "Automatic Source Code Summarization with Extended Tree-LSTM" (Shido et al., 2019), including:

  • Multi-way Tree-LSTM model (Ours)
  • Child-sum Tree-LSTM model
  • N-ary Tree-LSTM model
  • DeepCom (Hu et al.)
  • CODE-NN (Iyer et al.)
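For intuition about what these encoders compute, the Child-Sum Tree-LSTM variant (Tai et al., 2015) can be sketched in plain NumPy. This is an illustrative sketch only, not the repository's actual TensorFlow code; all names and dimensions below are assumptions.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def child_sum_tree_lstm_cell(x, child_h, child_c, W, U, b):
    """One Child-Sum Tree-LSTM step (Tai et al., 2015) for a single AST node.

    x:        (d_in,)  input vector for this node
    child_h:  (k, d)   hidden states of its k children
    child_c:  (k, d)   cell states of its k children
    W, U, b:  dicts keyed by gate name 'i', 'f', 'o', 'u' holding
              (d, d_in) input weights, (d, d) recurrent weights, (d,) biases
    """
    h_sum = child_h.sum(axis=0)                        # aggregate child hidden states
    i = sigmoid(W['i'] @ x + U['i'] @ h_sum + b['i'])  # input gate
    o = sigmoid(W['o'] @ x + U['o'] @ h_sum + b['o'])  # output gate
    u = np.tanh(W['u'] @ x + U['u'] @ h_sum + b['u'])  # candidate update
    # One forget gate per child, computed from that child's own hidden state
    f = sigmoid(child_h @ U['f'].T + W['f'] @ x + b['f'])
    c = i * u + (f * child_c).sum(axis=0)              # new cell state
    h = o * np.tanh(c)                                 # new hidden state
    return h, c
```

For a leaf node, `child_h` and `child_c` are empty `(0, d)` arrays and the sums contribute zeros. The N-ary and Multi-way variants differ mainly in how child states enter the gates (per-position weights rather than a plain sum).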

Dataset

  1. Download the raw dataset from https://github.com/xing-hu/DeepCom
  2. Parse it with parser.jar

Usage

  1. Prepare the tree-structured data with dataset.py
    • Run $ python dataset.py [dir]
  2. Train and evaluate the model with train.py
    • See $ python train.py -h for the available options
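The exact on-disk format produced by dataset.py is not documented here, but any tree-to-sequence encoder must process the parsed AST bottom-up: a node's state can only be computed once all of its children's states exist. A hypothetical post-order traversal (node labels and the toy `combine` function below are illustrative assumptions, not the repository's code):

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    label: str                      # hypothetical AST node label
    children: list = field(default_factory=list)

def encode(node, combine):
    """Post-order (bottom-up) pass: encode every child first,
    then combine the results at the parent. `combine` stands in
    for any Tree-LSTM cell."""
    child_states = [encode(c, combine) for c in node.children]
    return combine(node.label, child_states)

# Toy combine: a node's "state" is 1 + sum of child states,
# so encode() returns the number of nodes in the tree.
tree = Node('MethodDecl', [Node('Param'), Node('Block', [Node('Return')])])
size = encode(tree, lambda lab, ch: 1 + sum(ch))  # size == 4
```

Replacing the toy `combine` with a Tree-LSTM cell yields the encoder states that the attention-based decoder then consumes.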
