Temporal Hierarchies in Sequence to Sequence for Sentence Correction (IJCNN 2018)


gcunhase/SentenceCorrection-MTGRU-Seq2Seq


About

TensorFlow code for the paper "Temporal Hierarchies in Sequence to Sequence for Sentence Correction" (IEEE IJCNN 2018).

Comparison of GRU, LSTM, RNN, and MTGRU models on the English sentence-correction task.

Check the wiki page for more information.

Contents

  • Requirements
  • How to Use
  • Results
  • How to Cite

Requirements

  • Python 2.7, NLTK, progressbar2
  • CUDA 8.0
  • CuDNN v5.0
  • Tensorflow 1.0.1
sudo apt-get install cuda-8-0
cd /tmp/tensorflow-pkg/; wget http://storage.googleapis.com/tensorflow/linux/gpu/tensorflow_gpu-1.0.1-cp27-none-linux_x86_64.whl
pip install --ignore-installed --upgrade tensorflow_gpu-1.0.1-cp27-none-linux_x86_64.whl
pip install -r requirements.txt

How to Use

1. Dataset

  • WMT'15 with POS-Tagging (Third Dataset Solution)
  • Download, pre-processing, and more: Dataset README
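The Dataset README above covers the actual download and preprocessing steps. Purely as an illustration of how a sentence-correction corpus pairs noisy sources with clean targets (this is not the repository's actual pipeline; the corruption probabilities and operations are invented for the sketch):

```python
import random

def corrupt(tokens, rng, p_drop=0.15, p_swap=0.15):
    """Build a noisy source sentence from a clean target by randomly
    dropping tokens and swapping adjacent pairs (illustrative only)."""
    out = [t for t in tokens if rng.random() > p_drop or len(tokens) <= 2]
    for i in range(len(out) - 1):
        if rng.random() < p_swap:
            out[i], out[i + 1] = out[i + 1], out[i]  # swap neighbors
    return out

rng = random.Random(42)  # seeded for reproducibility
clean = "the quick brown fox jumps over the lazy dog".split()
pair = (corrupt(clean, rng), clean)  # (noisy source, clean target)
print(pair[0])
```

A real pipeline would apply corruption consistent with the error types the model is meant to fix; see the Dataset README for the actual procedure.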

2. Training

  • Follow the setup in the wiki for MTGRUCell and MultiMTRNNCell

  • In the ./translate/ folder, run the following script for the GRU model:

    python translate_earlyStopping.py --train_dir=trainGRU --checkpoint_filename=checkpoint_perplexities_gru.txt --checkpoint_filename_best=checkpoint_perplexities_gru_best.txt
    
  • Note: Pre-trained 3-layer models are currently too large to upload

  • Arguments

    | Argument | Type | Description |
    | --- | --- | --- |
    | --use_rnn | Boolean | Use RNN |
    | --use_lstm | Boolean | Use LSTM |
    | --use_mtgru | Boolean | Use MTGRU |
    | --train_dir | String | Directory to save model checkpoints |
    | --checkpoint_filename | String | Filename to save the model checkpoint |
    | --checkpoint_filename_best | String | Filename to save the best model checkpoint |

    Check translate_earlyStopping.py for more arguments

    Example MTGRU: python translate_earlyStopping.py --use_mtgru=True --train_dir=trainMTGRU --checkpoint_filename=checkpoint_perplexities_mtgru.txt --checkpoint_filename_best=checkpoint_perplexities_mtgru_best.txt
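The MTGRUCell referenced above extends a GRU with a per-layer timescale constant. A minimal NumPy sketch of the multiple-timescale idea, assuming the common formulation h_t = (1 − 1/τ)·h_{t−1} + (1/τ)·h̃_t, where h̃_t is the ordinary GRU update (the weights, τ value, and dimensions here are illustrative, not the repository's exact implementation):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def mtgru_step(x, h_prev, params, tau):
    """One multiple-timescale GRU step.
    tau >= 1 slows the hidden-state update; tau = 1 recovers a plain GRU."""
    Wz, Uz, Wr, Ur, Wh, Uh = params
    z = sigmoid(x @ Wz + h_prev @ Uz)             # update gate
    r = sigmoid(x @ Wr + h_prev @ Ur)             # reset gate
    h_tilde = np.tanh(x @ Wh + (r * h_prev) @ Uh) # candidate state
    h_gru = (1.0 - z) * h_prev + z * h_tilde      # ordinary GRU update
    # Mix with the previous state according to the timescale constant
    return (1.0 - 1.0 / tau) * h_prev + (1.0 / tau) * h_gru

rng = np.random.default_rng(0)
n_in, n_hid = 4, 8
# Alternate input-to-hidden and hidden-to-hidden weight matrices
params = [rng.standard_normal((n_in, n_hid)) * 0.1 if i % 2 == 0
          else rng.standard_normal((n_hid, n_hid)) * 0.1 for i in range(6)]
h = np.zeros(n_hid)
for _ in range(5):  # run a few steps on random input
    h = mtgru_step(rng.standard_normal(n_in), h, params, tau=4.0)
print(h.shape)  # (8,)
```

A larger τ makes a layer evolve more slowly, which is how the paper builds a temporal hierarchy across layers; the repository's MTGRUCell implements this inside the TensorFlow 1.0 RNN-cell interface.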

3. Testing

In the ./translate/ folder:

python translate_earlyStopping.py --auto_decode

Use the same parameters that were used during training of the model (e.g., --use_mtgru=True --train_dir=trainMTGRU for an MTGRU model)

4. Evaluation

  • Clone nlp-metrics into ./evaluation/ for use in tester_allSentencesOneFile.py

    Change the import path if necessary

  • Run: ./evaluation/scores.sh

    Change the paths to your generated and target text files if needed

  • Plot train and test perplexity curves (MATLAB): ./evaluation/graphs.m
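The nlp-metrics scripts above handle the actual scoring. For reference, BLEU (the metric reported under Results) combines modified n-gram precisions with a brevity penalty; a minimal single-reference, sentence-level sketch (the real evaluation likely uses corpus-level BLEU, possibly with smoothing):

```python
import math
from collections import Counter

def ngrams(tokens, n):
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

def sentence_bleu(candidate, reference, max_n=4):
    """Single-reference BLEU: geometric mean of modified n-gram
    precisions (clipped by reference counts) times a brevity penalty."""
    precisions = []
    for n in range(1, max_n + 1):
        cand, ref = ngrams(candidate, n), ngrams(reference, n)
        overlap = sum(min(c, ref[g]) for g, c in cand.items())
        precisions.append(overlap / max(sum(cand.values()), 1))
    if min(precisions) == 0:
        return 0.0  # unsmoothed: any zero precision zeroes the score
    log_avg = sum(math.log(p) for p in precisions) / max_n
    bp = min(1.0, math.exp(1 - len(reference) / len(candidate)))
    return bp * math.exp(log_avg)

ref = "the cat sat on the mat".split()
print(sentence_bleu(ref, ref))  # identical sentences score 1.0
print(sentence_bleu("the cat sat".split(), ref) < 1.0)  # True
```

Sentence-level BLEU without smoothing is harsh on short outputs (a single missing 4-gram zeroes the score), which is why corpus-level aggregation is the standard choice for reporting.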

Results

Results for the 3-layer models (see the figures in the repository):

  • BLEU scores
  • Generated sentences

How to Cite

If you use this code, please cite it as:

@inproceedings{sergio2018temporal,
  title={Temporal Hierarchies in Sequence to Sequence for Sentence Correction},
  author={Sergio, Gwenaelle Cunha and Moirangthem, Dennis Singh and Lee, Minho},
  booktitle={2018 International Joint Conference on Neural Networks (IJCNN)},
  pages={1--7},
  year={2018},
  organization={IEEE}
}

Acknowledgement

The Multiple Timescale code is based on Singh's work.

The translate code is based on work done by mouuff.